Feb 02 10:31:13 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 02 10:31:13 crc restorecon[4739]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 02 10:31:13 crc restorecon[4739]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc 
restorecon[4739]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:13 crc 
restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 02 
10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 
crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 
10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 02 10:31:13 crc 
restorecon[4739]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc 
restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 02 10:31:13 crc restorecon[4739]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc 
restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc 
restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:13 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 02 10:31:14 crc restorecon[4739]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:14 crc restorecon[4739]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:14 crc 
restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:14 crc restorecon[4739]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 02 10:31:14 crc restorecon[4739]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 02 10:31:14 crc restorecon[4739]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 02 10:31:14 crc kubenswrapper[4909]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 02 10:31:14 crc kubenswrapper[4909]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 02 10:31:14 crc kubenswrapper[4909]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 02 10:31:14 crc kubenswrapper[4909]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 02 10:31:14 crc kubenswrapper[4909]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 02 10:31:14 crc kubenswrapper[4909]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.784040 4909 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.788902 4909 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.788927 4909 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.788934 4909 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.788940 4909 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.788945 4909 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.788950 4909 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.788956 4909 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.788963 4909 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.788970 4909 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.788976 4909 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.788980 4909 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.788984 4909 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.788994 4909 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.788998 4909 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789002 4909 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789005 4909 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789009 4909 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789013 4909 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789017 4909 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789021 4909 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789024 4909 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789028 4909 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789032 4909 feature_gate.go:330] unrecognized feature gate: 
MultiArchInstallAWS Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789037 4909 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789040 4909 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789044 4909 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789048 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789052 4909 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789055 4909 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789059 4909 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789064 4909 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789069 4909 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789074 4909 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789078 4909 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789082 4909 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789087 4909 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789092 4909 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789096 4909 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789100 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789104 4909 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789107 4909 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789111 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789114 4909 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789118 4909 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789122 4909 feature_gate.go:330] unrecognized feature gate: Example
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789125 4909 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789129 4909 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789132 4909 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789135 4909 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789139 4909 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789143 4909 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789146 4909 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789149 4909 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789153 4909 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789156 4909 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789160 4909 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789163 4909 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789167 4909 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789172 4909 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789176 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789179 4909 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789183 4909 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789187 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789190 4909 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789194 4909 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789197 4909 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789201 4909 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789204 4909 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789208 4909 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789212 4909 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.789216 4909 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789308 4909 flags.go:64] FLAG: --address="0.0.0.0"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789318 4909 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789325 4909 flags.go:64] FLAG: --anonymous-auth="true"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789332 4909 flags.go:64] FLAG: --application-metrics-count-limit="100"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789339 4909 flags.go:64] FLAG: --authentication-token-webhook="false"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789343 4909 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789349 4909 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789355 4909 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789360 4909 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789364 4909 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789369 4909 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789373 4909 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789377 4909 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789382 4909 flags.go:64] FLAG: --cgroup-root=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789386 4909 flags.go:64] FLAG: --cgroups-per-qos="true"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789390 4909 flags.go:64] FLAG: --client-ca-file=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789394 4909 flags.go:64] FLAG: --cloud-config=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789398 4909 flags.go:64] FLAG: --cloud-provider=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789402 4909 flags.go:64] FLAG: --cluster-dns="[]"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789408 4909 flags.go:64] FLAG: --cluster-domain=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789413 4909 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789417 4909 flags.go:64] FLAG: --config-dir=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789421 4909 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789426 4909 flags.go:64] FLAG: --container-log-max-files="5"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789432 4909 flags.go:64] FLAG: --container-log-max-size="10Mi"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789438 4909 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789442 4909 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789446 4909 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789476 4909 flags.go:64] FLAG: --contention-profiling="false"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789482 4909 flags.go:64] FLAG: --cpu-cfs-quota="true"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789488 4909 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789494 4909 flags.go:64] FLAG: --cpu-manager-policy="none"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789498 4909 flags.go:64] FLAG: --cpu-manager-policy-options=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789504 4909 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789508 4909 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789513 4909 flags.go:64] FLAG: --enable-debugging-handlers="true"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789518 4909 flags.go:64] FLAG: --enable-load-reader="false"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789523 4909 flags.go:64] FLAG: --enable-server="true"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789527 4909 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789532 4909 flags.go:64] FLAG: --event-burst="100"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789536 4909 flags.go:64] FLAG: --event-qps="50"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789540 4909 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789545 4909 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789549 4909 flags.go:64] FLAG: --eviction-hard=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789554 4909 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789559 4909 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789563 4909 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789568 4909 flags.go:64] FLAG: --eviction-soft=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789572 4909 flags.go:64] FLAG: --eviction-soft-grace-period=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789576 4909 flags.go:64] FLAG: --exit-on-lock-contention="false"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789580 4909 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789584 4909 flags.go:64] FLAG: --experimental-mounter-path=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789590 4909 flags.go:64] FLAG: --fail-cgroupv1="false"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789594 4909 flags.go:64] FLAG: --fail-swap-on="true"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789598 4909 flags.go:64] FLAG: --feature-gates=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789604 4909 flags.go:64] FLAG: --file-check-frequency="20s"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789608 4909 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789613 4909 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789617 4909 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789621 4909 flags.go:64] FLAG: --healthz-port="10248"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789626 4909 flags.go:64] FLAG: --help="false"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789630 4909 flags.go:64] FLAG: --hostname-override=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789635 4909 flags.go:64] FLAG: --housekeeping-interval="10s"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789639 4909 flags.go:64] FLAG: --http-check-frequency="20s"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789644 4909 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789648 4909 flags.go:64] FLAG: --image-credential-provider-config=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789652 4909 flags.go:64] FLAG: --image-gc-high-threshold="85"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789656 4909 flags.go:64] FLAG: --image-gc-low-threshold="80"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789661 4909 flags.go:64] FLAG: --image-service-endpoint=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789666 4909 flags.go:64] FLAG: --kernel-memcg-notification="false"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789671 4909 flags.go:64] FLAG: --kube-api-burst="100"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789676 4909 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789682 4909 flags.go:64] FLAG: --kube-api-qps="50"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789688 4909 flags.go:64] FLAG: --kube-reserved=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789693 4909 flags.go:64] FLAG: --kube-reserved-cgroup=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789698 4909 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789702 4909 flags.go:64] FLAG: --kubelet-cgroups=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789706 4909 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789711 4909 flags.go:64] FLAG: --lock-file=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789715 4909 flags.go:64] FLAG: --log-cadvisor-usage="false"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789719 4909 flags.go:64] FLAG: --log-flush-frequency="5s"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789723 4909 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789730 4909 flags.go:64] FLAG: --log-json-split-stream="false"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789734 4909 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789739 4909 flags.go:64] FLAG: --log-text-split-stream="false"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789743 4909 flags.go:64] FLAG: --logging-format="text"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789748 4909 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789752 4909 flags.go:64] FLAG: --make-iptables-util-chains="true"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789756 4909 flags.go:64] FLAG: --manifest-url=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789760 4909 flags.go:64] FLAG: --manifest-url-header=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789766 4909 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789770 4909 flags.go:64] FLAG: --max-open-files="1000000"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789786 4909 flags.go:64] FLAG: --max-pods="110"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789790 4909 flags.go:64] FLAG: --maximum-dead-containers="-1"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789795 4909 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789799 4909 flags.go:64] FLAG: --memory-manager-policy="None"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789803 4909 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789823 4909 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789827 4909 flags.go:64] FLAG: --node-ip="192.168.126.11"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789832 4909 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789844 4909 flags.go:64] FLAG: --node-status-max-images="50"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789848 4909 flags.go:64] FLAG: --node-status-update-frequency="10s"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789853 4909 flags.go:64] FLAG: --oom-score-adj="-999"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789857 4909 flags.go:64] FLAG: --pod-cidr=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789861 4909 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789870 4909 flags.go:64] FLAG: --pod-manifest-path=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789874 4909 flags.go:64] FLAG: --pod-max-pids="-1"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789879 4909 flags.go:64] FLAG: --pods-per-core="0"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789883 4909 flags.go:64] FLAG: --port="10250"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789888 4909 flags.go:64] FLAG: --protect-kernel-defaults="false"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789892 4909 flags.go:64] FLAG: --provider-id=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789897 4909 flags.go:64] FLAG: --qos-reserved=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789901 4909 flags.go:64] FLAG: --read-only-port="10255"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789905 4909 flags.go:64] FLAG: --register-node="true"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789909 4909 flags.go:64] FLAG: --register-schedulable="true"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789914 4909 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789922 4909 flags.go:64] FLAG: --registry-burst="10"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789926 4909 flags.go:64] FLAG: --registry-qps="5"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789930 4909 flags.go:64] FLAG: --reserved-cpus=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789934 4909 flags.go:64] FLAG: --reserved-memory=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789940 4909 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789944 4909 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789948 4909 flags.go:64] FLAG: --rotate-certificates="false"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789953 4909 flags.go:64] FLAG: --rotate-server-certificates="false"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789958 4909 flags.go:64] FLAG: --runonce="false"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789963 4909 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789969 4909 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789974 4909 flags.go:64] FLAG: --seccomp-default="false"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789979 4909 flags.go:64] FLAG: --serialize-image-pulls="true"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789985 4909 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789990 4909 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.789995 4909 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.790001 4909 flags.go:64] FLAG: --storage-driver-password="root"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.790006 4909 flags.go:64] FLAG: --storage-driver-secure="false"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.790010 4909 flags.go:64] FLAG: --storage-driver-table="stats"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.790017 4909 flags.go:64] FLAG: --storage-driver-user="root"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.790023 4909 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.790029 4909 flags.go:64] FLAG: --sync-frequency="1m0s"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.790036 4909 flags.go:64] FLAG: --system-cgroups=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.790042 4909 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.790051 4909 flags.go:64] FLAG: --system-reserved-cgroup=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.790057 4909 flags.go:64] FLAG: --tls-cert-file=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.790063 4909 flags.go:64] FLAG: --tls-cipher-suites="[]"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.790071 4909 flags.go:64] FLAG: --tls-min-version=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.790076 4909 flags.go:64] FLAG: --tls-private-key-file=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.790082 4909 flags.go:64] FLAG: --topology-manager-policy="none"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.790088 4909 flags.go:64] FLAG: --topology-manager-policy-options=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.790093 4909 flags.go:64] FLAG: --topology-manager-scope="container"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.790099 4909 flags.go:64] FLAG: --v="2"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.790107 4909 flags.go:64] FLAG: --version="false"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.790114 4909 flags.go:64] FLAG: --vmodule=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.790120 4909 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.790126 4909 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790259 4909 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790266 4909 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790271 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790276 4909 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790280 4909 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790283 4909 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790289 4909 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790293 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790297 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790301 4909 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790304 4909 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790308 4909 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790312 4909 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790316 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790320 4909 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790324 4909 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790329 4909 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790333 4909 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790338 4909 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790342 4909 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790347 4909 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790351 4909 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790356 4909 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790361 4909 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790367 4909 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790372 4909 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790378 4909 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790384 4909 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790390 4909 feature_gate.go:330] unrecognized feature gate: Example
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790395 4909 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790400 4909 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790405 4909 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790410 4909 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790415 4909 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790421 4909 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790425 4909 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790430 4909 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790436 4909 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790442 4909 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790449 4909 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790454 4909 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790458 4909 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790464 4909 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790469 4909 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790474 4909 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790479 4909 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790484 4909 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790488 4909 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790493 4909 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790500 4909 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790506 4909 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790512 4909 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790518 4909 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790523 4909 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790528 4909 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790533 4909 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790537 4909 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790542 4909 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790547 4909 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790552 4909 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790557 4909 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790562 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790567 4909 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790571 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790576 4909 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790582 4909 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790587 4909 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790592 4909 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790598 4909 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790602 4909 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.790607 4909 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.790621 4909 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.800972 4909 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.801013 4909 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801074 4909 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801087 4909 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801093 4909 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801098 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801103 4909 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801108 4909 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801112 4909 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801117 4909 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801121 4909 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801125 4909 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801129 4909 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801134 4909 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801140 4909 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801148 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801154 4909 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801159 4909 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801164 4909 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801171 4909 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801176 4909 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801182 4909 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801186 4909 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801191 4909 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801195 4909 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801199 4909 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801203 4909 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801207 4909 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801212 4909 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801216 4909 feature_gate.go:330] 
unrecognized feature gate: InsightsOnDemandDataGather Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801221 4909 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801225 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801229 4909 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801235 4909 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801242 4909 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801246 4909 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801254 4909 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801259 4909 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801264 4909 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801268 4909 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801273 4909 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801278 4909 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801282 4909 feature_gate.go:330] unrecognized feature gate: Example Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801286 4909 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801291 4909 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801295 4909 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801299 4909 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801304 4909 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801308 4909 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801312 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801317 4909 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801322 4909 
feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801326 4909 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801331 4909 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801335 4909 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801339 4909 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801343 4909 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801347 4909 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801351 4909 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801355 4909 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801359 4909 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801363 4909 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801367 4909 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801371 4909 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801375 4909 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801379 4909 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 02 10:31:14 
crc kubenswrapper[4909]: W0202 10:31:14.801383 4909 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801387 4909 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801390 4909 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801395 4909 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801399 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801403 4909 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801407 4909 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.801414 4909 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801560 4909 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801569 4909 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801574 4909 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 
10:31:14.801578 4909 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801585 4909 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801589 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801594 4909 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801598 4909 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801602 4909 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801607 4909 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801611 4909 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801616 4909 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801623 4909 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801628 4909 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801632 4909 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801636 4909 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801641 4909 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801647 4909 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801652 4909 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801657 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801661 4909 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801666 4909 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801670 4909 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801674 4909 feature_gate.go:330] unrecognized feature gate: Example Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801678 4909 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801682 4909 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801686 4909 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801690 4909 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801695 4909 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801699 4909 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801703 4909 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801707 4909 feature_gate.go:330] 
unrecognized feature gate: BareMetalLoadBalancer Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801711 4909 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801715 4909 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801719 4909 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801723 4909 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801728 4909 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801732 4909 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801736 4909 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801740 4909 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801745 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801749 4909 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801753 4909 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801757 4909 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801763 4909 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801767 4909 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 02 
10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801774 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801779 4909 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801784 4909 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801790 4909 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801795 4909 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801800 4909 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801804 4909 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801824 4909 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801830 4909 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801836 4909 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801840 4909 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801847 4909 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801853 4909 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801859 4909 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801863 4909 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801868 4909 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801874 4909 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801880 4909 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801884 4909 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801889 4909 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801894 4909 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801898 4909 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801902 4909 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801906 4909 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.801911 4909 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.801918 4909 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true 
DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.802157 4909 server.go:940] "Client rotation is on, will bootstrap in background" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.806277 4909 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.806353 4909 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.807699 4909 server.go:997] "Starting client certificate rotation" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.807737 4909 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.807927 4909 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-16 15:02:19.412000467 +0000 UTC Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.808057 4909 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.838219 4909 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 02 10:31:14 crc kubenswrapper[4909]: E0202 10:31:14.840120 4909 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create 
certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.840528 4909 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.854958 4909 log.go:25] "Validated CRI v1 runtime API" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.900495 4909 log.go:25] "Validated CRI v1 image API" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.902956 4909 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.910278 4909 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-02-10-26-22-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.910308 4909 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.925758 4909 manager.go:217] Machine: {Timestamp:2026-02-02 10:31:14.923521319 +0000 UTC m=+0.669622074 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 
AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:632b64ae-2264-4464-afbe-4696d2c7d3e4 BootID:81efcdbd-597c-45e5-a1b0-0f6832442cdd Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:aa:ed:a3 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:aa:ed:a3 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:a5:5c:cb Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:59:3a:f1 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:76:5a:28 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:19:e9:44 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:dd:09:e8 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:8a:8a:8f:2d:60:22 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:6e:c1:48:ef:40:6a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} 
{Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: 
DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.925983 4909 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.926197 4909 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.926586 4909 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.926761 4909 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.926796 4909 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.927108 4909 topology_manager.go:138] "Creating topology manager with none policy"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.927121 4909 container_manager_linux.go:303] "Creating device plugin manager"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.927728 4909 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.928264 4909 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.929841 4909 state_mem.go:36] "Initialized new in-memory state store"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.929922 4909 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.935138 4909 kubelet.go:418] "Attempting to sync node with API server"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.935168 4909 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.935318 4909 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.935334 4909 kubelet.go:324] "Adding apiserver pod source"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.935348 4909 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.941524 4909 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.943738 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused
Feb 02 10:31:14 crc kubenswrapper[4909]: E0202 10:31:14.944321 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError"
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.943744 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused
Feb 02 10:31:14 crc kubenswrapper[4909]: E0202 10:31:14.944865 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.946434 4909 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.951433 4909 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.954281 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.954317 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.954327 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.954337 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.954350 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.954359 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.954368 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.954382 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.954394 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.954404 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.954419 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.954427 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.956277 4909 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.957185 4909 server.go:1280] "Started kubelet"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.957304 4909 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.957480 4909 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.958206 4909 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.958389 4909 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused
Feb 02 10:31:14 crc systemd[1]: Started Kubernetes Kubelet.
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.960108 4909 server.go:460] "Adding debug handlers to kubelet server"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.960284 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.960326 4909 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.960353 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 15:03:00.103857354 +0000 UTC
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.960519 4909 volume_manager.go:287] "The desired_state_of_world populator starts"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.960529 4909 volume_manager.go:289] "Starting Kubelet Volume Manager"
Feb 02 10:31:14 crc kubenswrapper[4909]: E0202 10:31:14.960643 4909 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.960668 4909 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Feb 02 10:31:14 crc kubenswrapper[4909]: W0202 10:31:14.961348 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused
Feb 02 10:31:14 crc kubenswrapper[4909]: E0202 10:31:14.962055 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError"
Feb 02 10:31:14 crc kubenswrapper[4909]: E0202 10:31:14.962033 4909 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="200ms"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.963093 4909 factory.go:55] Registering systemd factory
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.963123 4909 factory.go:221] Registration of the systemd container factory successfully
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.967012 4909 factory.go:153] Registering CRI-O factory
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.967037 4909 factory.go:221] Registration of the crio container factory successfully
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.967103 4909 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.967138 4909 factory.go:103] Registering Raw factory
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.967155 4909 manager.go:1196] Started watching for new ooms in manager
Feb 02 10:31:14 crc kubenswrapper[4909]: E0202 10:31:14.966116 4909 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.30:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189067568b37b7a3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-02 10:31:14.957137827 +0000 UTC m=+0.703238562,LastTimestamp:2026-02-02 10:31:14.957137827 +0000 UTC m=+0.703238562,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.967792 4909 manager.go:319] Starting recovery of all containers
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.980480 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.980583 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.980604 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.980615 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.980633 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.980646 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.980660 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.980678 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.980697 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.980714 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.980725 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.980755 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.980769 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.980793 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.980825 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.980838 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.980853 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.980864 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.980877 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.980887 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.980897 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.980909 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.980918 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.980933 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.980983 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.981000 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.981025 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.981056 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.981077 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.981095 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.981109 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.981127 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.981141 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.981180 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.981199 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.981219 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.981241 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.981255 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.981276 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.981289 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.981302 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.981319 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.981334 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.981362 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.981377 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.981873 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.981943 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.981961 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.981979 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.981994 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.982009 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.982058 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.982362 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.982459 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.982477 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.982494 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.982509 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.982524 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.982572 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.982587 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.982604 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.982643 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.982652 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.982692 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.982707 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.982751 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.982760 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.982769 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.982788 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.982834 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.982872 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.982881 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.982891 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.982903 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.982913 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.982925 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.982934 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.982989 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.983014 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.983025 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.983105 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.983130 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.983198 4909 reconstruct.go:130] "Volume is marked as uncertain and added
into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.983311 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.983333 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.983372 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.983447 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.983534 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.983703 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.983747 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.983837 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.983871 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984038 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984145 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984171 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984208 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984225 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984251 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984280 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984296 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984314 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984329 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984354 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984369 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984402 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984424 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984458 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984486 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984506 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984532 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984550 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984566 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984587 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984603 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984621 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984632 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984685 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984699 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984714 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" 
seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984730 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984742 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984761 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984772 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984785 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.984796 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 
10:31:14.984848 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.993204 4909 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.993333 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.993359 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.993373 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.993432 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" 
seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.993446 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.993485 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.993505 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.993518 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.993557 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.993608 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.993647 4909 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.993661 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.993674 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.993694 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.993746 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.993763 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.993824 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.993840 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.993862 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.993906 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.993922 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.993937 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.993983 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994006 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994019 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994058 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994072 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994138 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994163 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994177 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994212 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994224 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994243 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994254 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994283 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994305 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994321 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994334 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994374 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994399 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994410 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994447 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994461 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994472 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994518 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994539 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994551 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" 
Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994629 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994647 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994686 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994704 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994720 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994732 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994743 4909 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994853 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994866 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994878 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994907 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994918 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994929 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994941 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994952 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994981 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.994993 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.995004 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.995016 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.995029 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.995059 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.995077 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.995092 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.995108 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.995149 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.995161 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.995187 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.995232 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.995251 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.995264 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.995274 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.995284 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.995941 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.995998 4909 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.996024 4909 reconstruct.go:97] "Volume reconstruction finished" Feb 02 10:31:14 crc kubenswrapper[4909]: I0202 10:31:14.996040 4909 reconciler.go:26] "Reconciler: start to sync state" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.000373 4909 manager.go:324] Recovery completed Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.011796 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.012769 4909 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.014441 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.014491 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.014506 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.014897 4909 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.015006 4909 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.015205 4909 kubelet.go:2335] "Starting kubelet main sync loop" Feb 02 10:31:15 crc kubenswrapper[4909]: E0202 10:31:15.015324 4909 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 02 10:31:15 crc kubenswrapper[4909]: W0202 10:31:15.015884 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Feb 02 10:31:15 crc kubenswrapper[4909]: E0202 10:31:15.017625 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.017880 4909 cpu_manager.go:225] "Starting CPU 
manager" policy="none" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.017972 4909 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.018056 4909 state_mem.go:36] "Initialized new in-memory state store" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.035323 4909 policy_none.go:49] "None policy: Start" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.036639 4909 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.036686 4909 state_mem.go:35] "Initializing new in-memory state store" Feb 02 10:31:15 crc kubenswrapper[4909]: E0202 10:31:15.060767 4909 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.090274 4909 manager.go:334] "Starting Device Plugin manager" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.090376 4909 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.090387 4909 server.go:79] "Starting device plugin registration server" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.090745 4909 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.090758 4909 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.090980 4909 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.091095 4909 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.091111 4909 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 02 10:31:15 
crc kubenswrapper[4909]: E0202 10:31:15.095996 4909 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.115528 4909 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.115650 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.116652 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.116692 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.116705 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.116892 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.117017 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.117058 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.117938 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.117974 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.117989 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.118125 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.118200 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.118221 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.119018 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.119049 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.119060 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.119060 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.119155 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.119171 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.119062 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.119199 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.119291 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.119505 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:15 crc 
kubenswrapper[4909]: I0202 10:31:15.119690 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.119730 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.120362 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.120395 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.120407 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.120512 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.120533 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.120547 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.120516 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.120614 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.120643 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.121572 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.121573 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.121607 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.121697 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.121707 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.121735 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.121821 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.121842 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.122412 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.122428 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.122437 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:15 crc kubenswrapper[4909]: E0202 10:31:15.162642 4909 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="400ms" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.191257 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.193194 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.193239 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.193250 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.193278 4909 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 02 10:31:15 crc kubenswrapper[4909]: E0202 10:31:15.193768 4909 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.30:6443: connect: connection refused" node="crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.197715 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.197772 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.197821 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.197844 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.197866 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.197883 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.197899 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.197915 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.197931 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.197949 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.197998 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.198042 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.198065 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.198088 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.198126 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") 
" pod="openshift-etcd/etcd-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.299581 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.299995 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.300106 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.300226 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.300320 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.300169 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.300070 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.300330 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.300512 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.300544 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.299849 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.300604 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.300637 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.300674 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.300626 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.300703 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.300704 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.300730 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.300730 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.300734 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.300786 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.300790 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.300765 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.300870 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.300920 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.300958 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.300988 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.301028 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.301102 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.301036 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.398119 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.399430 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.399478 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.399489 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.399516 4909 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 02 10:31:15 crc kubenswrapper[4909]: E0202 10:31:15.400031 4909 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial 
tcp 38.102.83.30:6443: connect: connection refused" node="crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.471186 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.478655 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.499756 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.520015 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.524996 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:31:15 crc kubenswrapper[4909]: W0202 10:31:15.551543 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-44c1939370344b94aa684f5cfebbdf85586590568577ee1f20f848d9b7bef947 WatchSource:0}: Error finding container 44c1939370344b94aa684f5cfebbdf85586590568577ee1f20f848d9b7bef947: Status 404 returned error can't find the container with id 44c1939370344b94aa684f5cfebbdf85586590568577ee1f20f848d9b7bef947 Feb 02 10:31:15 crc kubenswrapper[4909]: W0202 10:31:15.552504 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-6211fb1fe50f8bb2f46d814e399efff3761076b8751a36f6ed02476a1f820225 WatchSource:0}: Error finding container 6211fb1fe50f8bb2f46d814e399efff3761076b8751a36f6ed02476a1f820225: Status 404 returned 
error can't find the container with id 6211fb1fe50f8bb2f46d814e399efff3761076b8751a36f6ed02476a1f820225 Feb 02 10:31:15 crc kubenswrapper[4909]: W0202 10:31:15.555386 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-4196738100c89e8d9aa49360bd2273d782c3d957ffffadfd4b9cefd0548ccb07 WatchSource:0}: Error finding container 4196738100c89e8d9aa49360bd2273d782c3d957ffffadfd4b9cefd0548ccb07: Status 404 returned error can't find the container with id 4196738100c89e8d9aa49360bd2273d782c3d957ffffadfd4b9cefd0548ccb07 Feb 02 10:31:15 crc kubenswrapper[4909]: W0202 10:31:15.559340 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-376a2095b7982feb45c0a8bca41b981dab5734851a05f5cea477678248d570b8 WatchSource:0}: Error finding container 376a2095b7982feb45c0a8bca41b981dab5734851a05f5cea477678248d570b8: Status 404 returned error can't find the container with id 376a2095b7982feb45c0a8bca41b981dab5734851a05f5cea477678248d570b8 Feb 02 10:31:15 crc kubenswrapper[4909]: W0202 10:31:15.561113 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-aa44a75728bbdc404f6afd548455a2006e56aeaafb039769c81452c8e168c636 WatchSource:0}: Error finding container aa44a75728bbdc404f6afd548455a2006e56aeaafb039769c81452c8e168c636: Status 404 returned error can't find the container with id aa44a75728bbdc404f6afd548455a2006e56aeaafb039769c81452c8e168c636 Feb 02 10:31:15 crc kubenswrapper[4909]: E0202 10:31:15.563144 4909 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: 
connect: connection refused" interval="800ms" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.800744 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.802482 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.802519 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.802531 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.802554 4909 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 02 10:31:15 crc kubenswrapper[4909]: E0202 10:31:15.803067 4909 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.30:6443: connect: connection refused" node="crc" Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.959638 4909 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Feb 02 10:31:15 crc kubenswrapper[4909]: I0202 10:31:15.960727 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 16:10:37.389334206 +0000 UTC Feb 02 10:31:16 crc kubenswrapper[4909]: I0202 10:31:16.022161 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4196738100c89e8d9aa49360bd2273d782c3d957ffffadfd4b9cefd0548ccb07"} Feb 02 10:31:16 crc 
kubenswrapper[4909]: I0202 10:31:16.023130 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"44c1939370344b94aa684f5cfebbdf85586590568577ee1f20f848d9b7bef947"} Feb 02 10:31:16 crc kubenswrapper[4909]: I0202 10:31:16.024019 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"6211fb1fe50f8bb2f46d814e399efff3761076b8751a36f6ed02476a1f820225"} Feb 02 10:31:16 crc kubenswrapper[4909]: I0202 10:31:16.025730 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"aa44a75728bbdc404f6afd548455a2006e56aeaafb039769c81452c8e168c636"} Feb 02 10:31:16 crc kubenswrapper[4909]: I0202 10:31:16.027405 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"376a2095b7982feb45c0a8bca41b981dab5734851a05f5cea477678248d570b8"} Feb 02 10:31:16 crc kubenswrapper[4909]: W0202 10:31:16.030221 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Feb 02 10:31:16 crc kubenswrapper[4909]: E0202 10:31:16.030300 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" 
logger="UnhandledError" Feb 02 10:31:16 crc kubenswrapper[4909]: W0202 10:31:16.287218 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Feb 02 10:31:16 crc kubenswrapper[4909]: E0202 10:31:16.287333 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Feb 02 10:31:16 crc kubenswrapper[4909]: E0202 10:31:16.363839 4909 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="1.6s" Feb 02 10:31:16 crc kubenswrapper[4909]: W0202 10:31:16.391451 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Feb 02 10:31:16 crc kubenswrapper[4909]: E0202 10:31:16.391616 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Feb 02 10:31:16 crc kubenswrapper[4909]: W0202 10:31:16.499783 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Feb 02 10:31:16 crc kubenswrapper[4909]: E0202 10:31:16.499887 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Feb 02 10:31:16 crc kubenswrapper[4909]: I0202 10:31:16.603960 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:16 crc kubenswrapper[4909]: I0202 10:31:16.605229 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:16 crc kubenswrapper[4909]: I0202 10:31:16.605269 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:16 crc kubenswrapper[4909]: I0202 10:31:16.605283 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:16 crc kubenswrapper[4909]: I0202 10:31:16.605313 4909 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 02 10:31:16 crc kubenswrapper[4909]: E0202 10:31:16.605861 4909 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.30:6443: connect: connection refused" node="crc" Feb 02 10:31:16 crc kubenswrapper[4909]: I0202 10:31:16.946576 4909 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 02 10:31:16 crc kubenswrapper[4909]: E0202 10:31:16.948767 4909 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a 
signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Feb 02 10:31:16 crc kubenswrapper[4909]: I0202 10:31:16.959783 4909 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Feb 02 10:31:16 crc kubenswrapper[4909]: I0202 10:31:16.960823 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 03:21:16.950196972 +0000 UTC Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.032053 4909 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4" exitCode=0 Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.032112 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4"} Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.032179 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.033221 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.033260 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.033273 4909 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.034536 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e"} Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.034566 4909 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e" exitCode=0 Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.034602 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.034727 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.035599 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.035629 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.035640 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.035637 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.035708 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.035725 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:17 crc 
kubenswrapper[4909]: I0202 10:31:17.036475 4909 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="1a006a78dd9fb665fafd3fe22a317f03bc40561a9cac9152cda8dbd0dfd9e63e" exitCode=0 Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.036545 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"1a006a78dd9fb665fafd3fe22a317f03bc40561a9cac9152cda8dbd0dfd9e63e"} Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.036570 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.038116 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.038147 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.038159 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.038474 4909 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="77a45f7f6bf17cb3340a56769fb18deaca777960a0582243179c0151c8782dc3" exitCode=0 Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.038500 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"77a45f7f6bf17cb3340a56769fb18deaca777960a0582243179c0151c8782dc3"} Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.038528 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:17 crc 
kubenswrapper[4909]: I0202 10:31:17.039143 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.039164 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.039174 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.041257 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad"} Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.041288 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179"} Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.041304 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e"} Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.041313 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a"} Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.041338 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.042275 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.042313 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.042329 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:17 crc kubenswrapper[4909]: W0202 10:31:17.638227 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Feb 02 10:31:17 crc kubenswrapper[4909]: E0202 10:31:17.638339 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.959978 4909 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Feb 02 10:31:17 crc kubenswrapper[4909]: I0202 10:31:17.960905 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 07:07:22.278432499 +0000 UTC Feb 02 10:31:17 crc kubenswrapper[4909]: E0202 10:31:17.965051 4909 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="3.2s" Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.047490 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc"} Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.047651 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880"} Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.047663 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a"} Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.047671 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe"} Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.050122 4909 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d" exitCode=0 Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.050193 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d"} 
Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.050329 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.052079 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.052110 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.052121 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.054027 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"914344068a69d53d58993ab87cb55aec7988a2005eac19a9cb695b23fc428220"} Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.054055 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e198de5cf24f9e12e21a62f987167eb62dbb00c2b44eca84d46d23ebac89ef72"} Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.054072 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6dbf1daea0619549e3956c4aaea233cc63e03e43690110003f1c3eee9f61fa07"} Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.054102 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.055087 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.055130 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.055140 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.057593 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e7c5a7ef3ac7b5bb69e5783d19d53ba17ebe05a10903ba260faa7c15290f5fea"} Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.057639 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.057718 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.058317 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.058341 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.058350 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.058926 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.058951 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.058959 4909 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.206189 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.207714 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.207750 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.207759 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.207782 4909 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 02 10:31:18 crc kubenswrapper[4909]: E0202 10:31:18.208276 4909 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.30:6443: connect: connection refused" node="crc" Feb 02 10:31:18 crc kubenswrapper[4909]: I0202 10:31:18.961157 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 18:47:25.72974172 +0000 UTC Feb 02 10:31:19 crc kubenswrapper[4909]: I0202 10:31:19.061440 4909 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f" exitCode=0 Feb 02 10:31:19 crc kubenswrapper[4909]: I0202 10:31:19.061485 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f"} Feb 02 10:31:19 crc kubenswrapper[4909]: I0202 10:31:19.061530 4909 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:19 crc kubenswrapper[4909]: I0202 10:31:19.062308 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:19 crc kubenswrapper[4909]: I0202 10:31:19.062344 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:19 crc kubenswrapper[4909]: I0202 10:31:19.062356 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:19 crc kubenswrapper[4909]: I0202 10:31:19.068852 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e2a9dc57fe7e369d66cd5152e7d90152f908c3c09f726725480ed13fa7493945"} Feb 02 10:31:19 crc kubenswrapper[4909]: I0202 10:31:19.068902 4909 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 10:31:19 crc kubenswrapper[4909]: I0202 10:31:19.068951 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:19 crc kubenswrapper[4909]: I0202 10:31:19.068986 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:19 crc kubenswrapper[4909]: I0202 10:31:19.069574 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:19 crc kubenswrapper[4909]: I0202 10:31:19.069787 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:19 crc kubenswrapper[4909]: I0202 10:31:19.069825 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:19 crc kubenswrapper[4909]: I0202 10:31:19.069838 4909 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:19 crc kubenswrapper[4909]: I0202 10:31:19.070036 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:19 crc kubenswrapper[4909]: I0202 10:31:19.070063 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:19 crc kubenswrapper[4909]: I0202 10:31:19.070074 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:19 crc kubenswrapper[4909]: I0202 10:31:19.070528 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:19 crc kubenswrapper[4909]: I0202 10:31:19.070549 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:19 crc kubenswrapper[4909]: I0202 10:31:19.070557 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:19 crc kubenswrapper[4909]: I0202 10:31:19.811132 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 02 10:31:19 crc kubenswrapper[4909]: I0202 10:31:19.961465 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 13:44:26.530165654 +0000 UTC Feb 02 10:31:20 crc kubenswrapper[4909]: I0202 10:31:20.075032 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b"} Feb 02 10:31:20 crc kubenswrapper[4909]: I0202 10:31:20.075082 4909 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Feb 02 10:31:20 crc kubenswrapper[4909]: I0202 10:31:20.075115 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f"} Feb 02 10:31:20 crc kubenswrapper[4909]: I0202 10:31:20.075146 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe"} Feb 02 10:31:20 crc kubenswrapper[4909]: I0202 10:31:20.075083 4909 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 10:31:20 crc kubenswrapper[4909]: I0202 10:31:20.075205 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:20 crc kubenswrapper[4909]: I0202 10:31:20.075167 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f"} Feb 02 10:31:20 crc kubenswrapper[4909]: I0202 10:31:20.075272 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f"} Feb 02 10:31:20 crc kubenswrapper[4909]: I0202 10:31:20.075310 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:20 crc kubenswrapper[4909]: I0202 10:31:20.075942 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:20 crc kubenswrapper[4909]: I0202 10:31:20.076006 4909 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:20 crc kubenswrapper[4909]: I0202 10:31:20.076016 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:20 crc kubenswrapper[4909]: I0202 10:31:20.076199 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:20 crc kubenswrapper[4909]: I0202 10:31:20.076295 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:20 crc kubenswrapper[4909]: I0202 10:31:20.076335 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:20 crc kubenswrapper[4909]: I0202 10:31:20.077076 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:20 crc kubenswrapper[4909]: I0202 10:31:20.077125 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:20 crc kubenswrapper[4909]: I0202 10:31:20.077144 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:20 crc kubenswrapper[4909]: I0202 10:31:20.962348 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 11:29:36.392309325 +0000 UTC Feb 02 10:31:21 crc kubenswrapper[4909]: I0202 10:31:21.077672 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:21 crc kubenswrapper[4909]: I0202 10:31:21.079150 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:21 crc kubenswrapper[4909]: I0202 10:31:21.079238 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 02 10:31:21 crc kubenswrapper[4909]: I0202 10:31:21.079258 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:21 crc kubenswrapper[4909]: I0202 10:31:21.216886 4909 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 02 10:31:21 crc kubenswrapper[4909]: I0202 10:31:21.271631 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:31:21 crc kubenswrapper[4909]: I0202 10:31:21.271882 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:21 crc kubenswrapper[4909]: I0202 10:31:21.273093 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:21 crc kubenswrapper[4909]: I0202 10:31:21.273146 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:21 crc kubenswrapper[4909]: I0202 10:31:21.273161 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:21 crc kubenswrapper[4909]: I0202 10:31:21.315786 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:31:21 crc kubenswrapper[4909]: I0202 10:31:21.316071 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:21 crc kubenswrapper[4909]: I0202 10:31:21.317545 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:21 crc kubenswrapper[4909]: I0202 10:31:21.317584 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:21 crc kubenswrapper[4909]: I0202 10:31:21.317593 
4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:21 crc kubenswrapper[4909]: I0202 10:31:21.409107 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:21 crc kubenswrapper[4909]: I0202 10:31:21.410199 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:21 crc kubenswrapper[4909]: I0202 10:31:21.410256 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:21 crc kubenswrapper[4909]: I0202 10:31:21.410269 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:21 crc kubenswrapper[4909]: I0202 10:31:21.410298 4909 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 02 10:31:21 crc kubenswrapper[4909]: I0202 10:31:21.528764 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:31:21 crc kubenswrapper[4909]: I0202 10:31:21.536089 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:31:21 crc kubenswrapper[4909]: I0202 10:31:21.802979 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:31:21 crc kubenswrapper[4909]: I0202 10:31:21.962936 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 00:25:46.020767353 +0000 UTC Feb 02 10:31:22 crc kubenswrapper[4909]: I0202 10:31:22.081931 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:22 crc kubenswrapper[4909]: I0202 
10:31:22.082978 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:22 crc kubenswrapper[4909]: I0202 10:31:22.083027 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:22 crc kubenswrapper[4909]: I0202 10:31:22.083039 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:22 crc kubenswrapper[4909]: I0202 10:31:22.331857 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:31:22 crc kubenswrapper[4909]: I0202 10:31:22.332080 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:22 crc kubenswrapper[4909]: I0202 10:31:22.333236 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:22 crc kubenswrapper[4909]: I0202 10:31:22.333267 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:22 crc kubenswrapper[4909]: I0202 10:31:22.333278 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:22 crc kubenswrapper[4909]: I0202 10:31:22.854587 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:31:22 crc kubenswrapper[4909]: I0202 10:31:22.963636 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 15:52:04.463120796 +0000 UTC Feb 02 10:31:23 crc kubenswrapper[4909]: I0202 10:31:23.084146 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:23 crc kubenswrapper[4909]: I0202 10:31:23.084146 4909 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:23 crc kubenswrapper[4909]: I0202 10:31:23.085037 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:23 crc kubenswrapper[4909]: I0202 10:31:23.085066 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:23 crc kubenswrapper[4909]: I0202 10:31:23.085074 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:23 crc kubenswrapper[4909]: I0202 10:31:23.086101 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:23 crc kubenswrapper[4909]: I0202 10:31:23.086126 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:23 crc kubenswrapper[4909]: I0202 10:31:23.086134 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:23 crc kubenswrapper[4909]: I0202 10:31:23.143794 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 02 10:31:23 crc kubenswrapper[4909]: I0202 10:31:23.144445 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:23 crc kubenswrapper[4909]: I0202 10:31:23.145771 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:23 crc kubenswrapper[4909]: I0202 10:31:23.145856 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:23 crc kubenswrapper[4909]: I0202 10:31:23.145871 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:23 crc 
kubenswrapper[4909]: I0202 10:31:23.507779 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:31:23 crc kubenswrapper[4909]: I0202 10:31:23.964583 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 21:44:37.812664599 +0000 UTC Feb 02 10:31:24 crc kubenswrapper[4909]: I0202 10:31:24.086210 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:24 crc kubenswrapper[4909]: I0202 10:31:24.087199 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:24 crc kubenswrapper[4909]: I0202 10:31:24.087231 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:24 crc kubenswrapper[4909]: I0202 10:31:24.087243 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:24 crc kubenswrapper[4909]: I0202 10:31:24.965630 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 20:50:15.045858093 +0000 UTC Feb 02 10:31:25 crc kubenswrapper[4909]: E0202 10:31:25.096169 4909 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 02 10:31:25 crc kubenswrapper[4909]: I0202 10:31:25.621697 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 02 10:31:25 crc kubenswrapper[4909]: I0202 10:31:25.621899 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:25 crc kubenswrapper[4909]: I0202 10:31:25.622998 4909 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:25 crc kubenswrapper[4909]: I0202 10:31:25.623052 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:25 crc kubenswrapper[4909]: I0202 10:31:25.623061 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:25 crc kubenswrapper[4909]: I0202 10:31:25.965903 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 14:49:21.677849242 +0000 UTC Feb 02 10:31:26 crc kubenswrapper[4909]: I0202 10:31:26.508071 4909 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 02 10:31:26 crc kubenswrapper[4909]: I0202 10:31:26.508147 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 02 10:31:26 crc kubenswrapper[4909]: I0202 10:31:26.967089 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 14:00:22.762159084 +0000 UTC Feb 02 10:31:27 crc kubenswrapper[4909]: I0202 10:31:27.967642 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 05:38:03.991486043 +0000 UTC Feb 02 10:31:28 crc kubenswrapper[4909]: W0202 
10:31:28.814971 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 02 10:31:28 crc kubenswrapper[4909]: I0202 10:31:28.815059 4909 trace.go:236] Trace[1271274289]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Feb-2026 10:31:18.814) (total time: 10000ms): Feb 02 10:31:28 crc kubenswrapper[4909]: Trace[1271274289]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (10:31:28.814) Feb 02 10:31:28 crc kubenswrapper[4909]: Trace[1271274289]: [10.000780758s] [10.000780758s] END Feb 02 10:31:28 crc kubenswrapper[4909]: E0202 10:31:28.815079 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 02 10:31:28 crc kubenswrapper[4909]: W0202 10:31:28.815997 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 02 10:31:28 crc kubenswrapper[4909]: I0202 10:31:28.816077 4909 trace.go:236] Trace[914739780]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Feb-2026 10:31:18.814) (total time: 10001ms): Feb 02 10:31:28 crc kubenswrapper[4909]: Trace[914739780]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (10:31:28.815) Feb 02 10:31:28 crc 
kubenswrapper[4909]: Trace[914739780]: [10.001354511s] [10.001354511s] END Feb 02 10:31:28 crc kubenswrapper[4909]: E0202 10:31:28.816097 4909 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 02 10:31:28 crc kubenswrapper[4909]: I0202 10:31:28.960160 4909 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Feb 02 10:31:28 crc kubenswrapper[4909]: I0202 10:31:28.968626 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 14:24:19.196826472 +0000 UTC Feb 02 10:31:28 crc kubenswrapper[4909]: W0202 10:31:28.974961 4909 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 02 10:31:28 crc kubenswrapper[4909]: I0202 10:31:28.975069 4909 trace.go:236] Trace[1205478089]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Feb-2026 10:31:18.974) (total time: 10000ms): Feb 02 10:31:28 crc kubenswrapper[4909]: Trace[1205478089]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (10:31:28.974) Feb 02 10:31:28 crc kubenswrapper[4909]: Trace[1205478089]: [10.000829029s] [10.000829029s] END Feb 02 10:31:28 crc kubenswrapper[4909]: E0202 10:31:28.975098 4909 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 02 10:31:29 crc kubenswrapper[4909]: I0202 10:31:29.099958 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 02 10:31:29 crc kubenswrapper[4909]: I0202 10:31:29.101830 4909 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e2a9dc57fe7e369d66cd5152e7d90152f908c3c09f726725480ed13fa7493945" exitCode=255 Feb 02 10:31:29 crc kubenswrapper[4909]: I0202 10:31:29.101843 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e2a9dc57fe7e369d66cd5152e7d90152f908c3c09f726725480ed13fa7493945"} Feb 02 10:31:29 crc kubenswrapper[4909]: I0202 10:31:29.102026 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:29 crc kubenswrapper[4909]: I0202 10:31:29.102771 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:29 crc kubenswrapper[4909]: I0202 10:31:29.102811 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:29 crc kubenswrapper[4909]: I0202 10:31:29.102832 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:29 crc kubenswrapper[4909]: I0202 10:31:29.103356 4909 scope.go:117] "RemoveContainer" containerID="e2a9dc57fe7e369d66cd5152e7d90152f908c3c09f726725480ed13fa7493945" Feb 02 10:31:29 crc kubenswrapper[4909]: I0202 
10:31:29.414779 4909 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 02 10:31:29 crc kubenswrapper[4909]: I0202 10:31:29.414828 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 02 10:31:29 crc kubenswrapper[4909]: I0202 10:31:29.421651 4909 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 02 10:31:29 crc kubenswrapper[4909]: I0202 10:31:29.421709 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 02 10:31:29 crc kubenswrapper[4909]: I0202 10:31:29.969266 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 22:59:30.487364835 +0000 UTC Feb 02 10:31:30 crc kubenswrapper[4909]: I0202 10:31:30.107025 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 02 
10:31:30 crc kubenswrapper[4909]: I0202 10:31:30.109328 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a"} Feb 02 10:31:30 crc kubenswrapper[4909]: I0202 10:31:30.109549 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:30 crc kubenswrapper[4909]: I0202 10:31:30.110506 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:30 crc kubenswrapper[4909]: I0202 10:31:30.110539 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:30 crc kubenswrapper[4909]: I0202 10:31:30.110549 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:30 crc kubenswrapper[4909]: I0202 10:31:30.970666 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 06:47:40.151688244 +0000 UTC Feb 02 10:31:31 crc kubenswrapper[4909]: I0202 10:31:31.276576 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:31:31 crc kubenswrapper[4909]: I0202 10:31:31.276780 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:31 crc kubenswrapper[4909]: I0202 10:31:31.278072 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:31 crc kubenswrapper[4909]: I0202 10:31:31.278104 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:31 crc kubenswrapper[4909]: I0202 
10:31:31.278116 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:31 crc kubenswrapper[4909]: I0202 10:31:31.316646 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:31:31 crc kubenswrapper[4909]: I0202 10:31:31.317071 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:31 crc kubenswrapper[4909]: I0202 10:31:31.318133 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:31 crc kubenswrapper[4909]: I0202 10:31:31.318235 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:31 crc kubenswrapper[4909]: I0202 10:31:31.318254 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:31 crc kubenswrapper[4909]: I0202 10:31:31.970859 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 11:55:12.245782999 +0000 UTC Feb 02 10:31:32 crc kubenswrapper[4909]: I0202 10:31:32.311367 4909 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 02 10:31:32 crc kubenswrapper[4909]: I0202 10:31:32.339701 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:31:32 crc kubenswrapper[4909]: I0202 10:31:32.339914 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:32 crc kubenswrapper[4909]: I0202 10:31:32.341340 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:32 crc kubenswrapper[4909]: I0202 10:31:32.341372 4909 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:32 crc kubenswrapper[4909]: I0202 10:31:32.341383 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:32 crc kubenswrapper[4909]: I0202 10:31:32.347502 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:31:32 crc kubenswrapper[4909]: I0202 10:31:32.796921 4909 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 02 10:31:32 crc kubenswrapper[4909]: I0202 10:31:32.972529 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 20:37:24.533887603 +0000 UTC Feb 02 10:31:33 crc kubenswrapper[4909]: I0202 10:31:33.116049 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:33 crc kubenswrapper[4909]: I0202 10:31:33.119638 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:33 crc kubenswrapper[4909]: I0202 10:31:33.119700 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:33 crc kubenswrapper[4909]: I0202 10:31:33.119711 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:33 crc kubenswrapper[4909]: I0202 10:31:33.973007 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 07:48:26.228833461 +0000 UTC Feb 02 10:31:34 crc kubenswrapper[4909]: E0202 10:31:34.413141 4909 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Feb 02 10:31:34 crc kubenswrapper[4909]: E0202 10:31:34.425081 4909 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.425924 4909 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.426073 4909 trace.go:236] Trace[18046945]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Feb-2026 10:31:22.960) (total time: 11465ms): Feb 02 10:31:34 crc kubenswrapper[4909]: Trace[18046945]: ---"Objects listed" error: 11465ms (10:31:34.425) Feb 02 10:31:34 crc kubenswrapper[4909]: Trace[18046945]: [11.465311477s] [11.465311477s] END Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.426107 4909 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.428850 4909 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.444876 4909 csr.go:261] certificate signing request csr-cxnjl is approved, waiting to be issued Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.535083 4909 csr.go:257] certificate signing request csr-cxnjl is issued Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.565498 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.574102 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.807827 4909 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 02 10:31:34 crc kubenswrapper[4909]: E0202 10:31:34.808015 4909 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/events\": read tcp 38.102.83.30:42962->38.102.83.30:6443: use of closed network connection" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18906756af099eeb openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-02 10:31:15.558096619 +0000 UTC m=+1.304197354,LastTimestamp:2026-02-02 10:31:15.558096619 +0000 UTC m=+1.304197354,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 02 10:31:34 crc kubenswrapper[4909]: W0202 10:31:34.808110 4909 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Node ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.947793 4909 apiserver.go:52] "Watching apiserver" Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.952308 4909 reflector.go:368] Caches populated for *v1.Pod from 
pkg/kubelet/config/apiserver.go:66 Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.952588 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.952902 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.952948 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:31:34 crc kubenswrapper[4909]: E0202 10:31:34.952993 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.953312 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:31:34 crc kubenswrapper[4909]: E0202 10:31:34.953350 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.953533 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.953648 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.954133 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:31:34 crc kubenswrapper[4909]: E0202 10:31:34.954182 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.956281 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.956440 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.956495 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.956520 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.956587 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.956642 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.956669 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.956858 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.957106 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.962669 4909 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 02 10:31:34 crc kubenswrapper[4909]: 
I0202 10:31:34.973710 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 05:11:57.19529074 +0000 UTC Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.976006 4909 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 02 10:31:34 crc kubenswrapper[4909]: I0202 10:31:34.982218 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.002197 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.018951 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029037 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029094 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029123 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029150 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029173 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029198 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029219 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029241 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") 
" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029265 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029291 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029316 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029337 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029358 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029380 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029401 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029421 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029439 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029461 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029481 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029502 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029526 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029531 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029535 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029543 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029560 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029550 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029638 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029663 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029683 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029701 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029720 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029742 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029743 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029765 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029790 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029816 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029862 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029887 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029909 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029931 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029956 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029979 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030004 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030030 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 
02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030057 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030119 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030210 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030293 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030318 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030340 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030361 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030383 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030404 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030427 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030452 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 
02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030474 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030495 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030516 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030536 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030559 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030581 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" 
(UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030633 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030657 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030682 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030706 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030730 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 
10:31:35.030754 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030776 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030800 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030862 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030883 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030904 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030892 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030924 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031173 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031209 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031240 4909 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031268 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031298 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031331 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031357 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031383 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031412 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031440 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031468 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031492 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031516 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031543 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031568 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031593 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031620 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031647 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031674 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 02 10:31:35 crc 
kubenswrapper[4909]: I0202 10:31:35.031700 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031725 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031749 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031774 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031796 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031825 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031872 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031896 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031924 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031949 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031975 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 02 10:31:35 crc 
kubenswrapper[4909]: I0202 10:31:35.032000 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032025 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032051 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032075 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032100 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032123 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032149 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032175 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032200 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032225 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032249 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032276 4909 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032301 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032323 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032355 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032378 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032404 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032427 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032454 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032480 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032503 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032526 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032607 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032635 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032662 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032686 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032712 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032740 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032763 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032789 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032815 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032861 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032891 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032918 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032947 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.032974 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033002 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033031 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033060 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 02 10:31:35 crc 
kubenswrapper[4909]: I0202 10:31:35.033085 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033108 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033131 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033157 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033187 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033209 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: 
\"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033234 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033258 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033283 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033307 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033334 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 02 
10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033364 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033388 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033416 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033439 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033533 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033564 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" 
(UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033588 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033611 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033638 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033666 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033694 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 02 10:31:35 crc 
kubenswrapper[4909]: I0202 10:31:35.033718 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033745 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033770 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033796 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033824 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033868 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033892 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033917 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033942 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033966 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.033992 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.034020 4909 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.034044 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.034070 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.034094 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.034119 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.034144 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.034169 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.034194 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.034219 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.034245 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.035100 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 02 10:31:35 crc 
kubenswrapper[4909]: I0202 10:31:35.035135 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.035161 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.035186 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.035210 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.035237 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.035300 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod 
\"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.035325 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.035352 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.035380 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.035410 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.035435 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.035461 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.035485 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.035513 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.035574 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.035609 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.035645 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.035674 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.035702 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.035732 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.035760 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.035790 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.035848 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.029977 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030066 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030078 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030173 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030282 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030259 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030383 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030592 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030592 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.036414 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030676 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030815 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030865 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030882 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.030924 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031102 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031259 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031319 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031383 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031515 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.031581 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.034148 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.035625 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.036164 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.036511 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.036551 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.036678 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.036776 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.037145 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.037313 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.037402 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.037437 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.037473 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.037483 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.037502 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.037539 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.037743 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.037798 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.037811 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.038014 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.038059 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.038130 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.038185 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.038292 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.038466 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.038476 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.038599 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.038733 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.038771 4909 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.038790 4909 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.038816 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath 
\"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.038875 4909 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.038894 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.038913 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.038929 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.038842 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.039397 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). 
InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.039212 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.039105 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.039145 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.039164 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.039586 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.039790 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.039851 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.039843 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.040014 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.040083 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.040564 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.040868 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.040872 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.041563 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.041900 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.042025 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.042487 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.042508 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: E0202 10:31:35.042667 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:31:35.542644923 +0000 UTC m=+21.288745658 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.042859 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.043024 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.043301 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.043545 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.043882 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.043180 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.043213 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.044102 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.044097 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.044211 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.044402 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.044631 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.044745 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.044772 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.045204 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.045533 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.045705 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.045766 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.045873 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.045900 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.045918 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.046400 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.046643 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.046409 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.046724 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.046758 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.046857 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.047176 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.047299 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.047435 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.047907 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.047927 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.047975 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.048073 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.048391 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.048400 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.048549 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.046759 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.048961 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.049019 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.049224 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.049235 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.049339 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.049434 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.049602 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.049602 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.049903 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.049917 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.050024 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.050196 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.052052 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.052054 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.052124 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.052248 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.052304 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.050282 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.052544 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.052880 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: E0202 10:31:35.053002 4909 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.053373 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: E0202 10:31:35.053374 4909 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:31:35 crc kubenswrapper[4909]: E0202 10:31:35.055988 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:31:35.555964988 +0000 UTC m=+21.302065723 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:31:35 crc kubenswrapper[4909]: E0202 10:31:35.056033 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:31:35.55602416 +0000 UTC m=+21.302124885 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.056155 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.057090 4909 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.057331 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.057389 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:31:35 crc kubenswrapper[4909]: E0202 10:31:35.057722 4909 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:31:35 crc kubenswrapper[4909]: E0202 10:31:35.057757 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:31:35 crc kubenswrapper[4909]: E0202 10:31:35.057772 4909 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:31:35 crc kubenswrapper[4909]: E0202 10:31:35.057859 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 10:31:35.557844972 +0000 UTC m=+21.303945707 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.058013 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.058387 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.058630 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.058858 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.059096 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.059174 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.059211 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.060324 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.060787 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.062192 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.062552 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.063783 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.068461 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.069960 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.070555 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.070859 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.071110 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: E0202 10:31:35.071135 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:31:35 crc kubenswrapper[4909]: E0202 10:31:35.071155 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:31:35 crc kubenswrapper[4909]: E0202 10:31:35.071167 4909 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:31:35 crc kubenswrapper[4909]: E0202 10:31:35.071216 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 10:31:35.571199308 +0000 UTC m=+21.317300043 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.071636 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.072885 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.075878 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.078814 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.081031 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.081988 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.082095 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.082998 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.083319 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.083475 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.083848 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.088227 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.089285 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.089505 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.089578 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.090294 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.090561 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.090766 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.090987 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.091181 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.091404 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.091497 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.089746 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.093511 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.093984 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.095065 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.097232 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.097255 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.097943 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.105193 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.105196 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.105368 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.110750 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.111128 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.114190 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.115695 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.115856 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.116090 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.117897 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.118260 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.118669 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.121426 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.123782 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.129232 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.129322 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.129366 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"
cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-
operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.129486 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.129505 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.129691 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.129770 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.129823 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.130096 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.130155 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.130162 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.130359 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.130526 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.130555 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.130421 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.130670 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139395 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139436 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139514 4909 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139525 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139534 4909 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139544 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139553 4909 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139561 4909 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139569 4909 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139578 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139587 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139595 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139603 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139612 4909 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139620 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139629 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139637 4909 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139645 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139655 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139666 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139676 4909 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139685 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139695 4909 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139707 4909 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139715 4909 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139773 4909 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139783 4909 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139791 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 02 
10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139800 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139812 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139832 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139841 4909 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139850 4909 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139859 4909 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139868 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139878 4909 reconciler_common.go:293] 
"Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139887 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139895 4909 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139904 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139912 4909 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139922 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139931 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139943 4909 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139967 4909 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139976 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139984 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.139998 4909 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140007 4909 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140015 4909 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140024 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node 
\"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140034 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140043 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140052 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140061 4909 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140069 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140077 4909 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140085 4909 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140094 4909 
reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140102 4909 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140110 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140120 4909 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140129 4909 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140137 4909 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140146 4909 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140154 4909 reconciler_common.go:293] "Volume detached for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140163 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140178 4909 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140186 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140193 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140211 4909 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140219 4909 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140228 4909 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140236 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140244 4909 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140253 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140262 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140270 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140278 4909 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140287 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140295 4909 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140303 4909 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140312 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140321 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140329 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140337 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140345 4909 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath 
\"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140354 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140362 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140370 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140378 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140386 4909 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140395 4909 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140405 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140413 4909 
reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140422 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140430 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140438 4909 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140446 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140455 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140464 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140472 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140481 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140490 4909 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140499 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140508 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140516 4909 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140525 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140534 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on 
node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140542 4909 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140550 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140558 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140567 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140575 4909 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140583 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140591 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140599 4909 reconciler_common.go:293] "Volume detached for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140606 4909 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140615 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140623 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140631 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140640 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140649 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140657 4909 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") 
on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140665 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140676 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: E0202 10:31:35.140674 4909 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141060 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.140688 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141123 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141139 4909 reconciler_common.go:293] "Volume detached for 
volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141152 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141184 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141194 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141204 4909 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141213 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141223 4909 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141231 4909 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141261 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141270 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141278 4909 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141287 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141295 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141304 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141312 4909 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: 
I0202 10:31:35.141341 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141350 4909 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141358 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141367 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141379 4909 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141386 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141394 4909 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141421 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: 
\"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141457 4909 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141467 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141476 4909 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141485 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141493 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141519 4909 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141527 4909 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") 
on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141536 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141544 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141552 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141560 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141568 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141577 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141629 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 
10:31:35.141638 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141646 4909 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141654 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141663 4909 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141671 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141679 4909 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141707 4909 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141716 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141724 4909 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141735 4909 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141743 4909 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141752 4909 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141760 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141786 4909 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141797 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141827 4909 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141836 4909 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141855 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141864 4909 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141872 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141886 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141916 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141925 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.141933 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.143760 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.150588 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.161749 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.172482 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.185517 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.196321 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.207390 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.218107 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.230028 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.242236 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.269403 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.279950 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 02 10:31:35 crc kubenswrapper[4909]: W0202 10:31:35.291082 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-859e0a5e0043f6b96ac6a7b967c6d1052c900e6b1553082ee083ae10e794ff98 WatchSource:0}: Error finding container 859e0a5e0043f6b96ac6a7b967c6d1052c900e6b1553082ee083ae10e794ff98: Status 404 returned error can't find the container with id 859e0a5e0043f6b96ac6a7b967c6d1052c900e6b1553082ee083ae10e794ff98 Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.291117 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.536888 4909 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-02 10:26:34 +0000 UTC, rotation deadline is 2026-10-30 17:26:56.045744433 +0000 UTC Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.537284 4909 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6486h55m20.508464782s for next certificate rotation Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.544292 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:31:35 crc kubenswrapper[4909]: E0202 10:31:35.544489 4909 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:31:36.544461798 +0000 UTC m=+22.290562583 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.645640 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.645672 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.645695 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.645714 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:31:35 crc kubenswrapper[4909]: E0202 10:31:35.645795 4909 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:31:35 crc kubenswrapper[4909]: E0202 10:31:35.645808 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:31:35 crc kubenswrapper[4909]: E0202 10:31:35.645840 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:31:35 crc kubenswrapper[4909]: E0202 10:31:35.645844 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:31:35 crc kubenswrapper[4909]: E0202 10:31:35.645850 4909 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:31:35 crc kubenswrapper[4909]: E0202 10:31:35.645839 4909 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: 
object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:31:35 crc kubenswrapper[4909]: E0202 10:31:35.645883 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:31:35 crc kubenswrapper[4909]: E0202 10:31:35.645895 4909 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:31:35 crc kubenswrapper[4909]: E0202 10:31:35.645883 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:31:36.645865806 +0000 UTC m=+22.391966551 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:31:35 crc kubenswrapper[4909]: E0202 10:31:35.645938 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 10:31:36.645926988 +0000 UTC m=+22.392027723 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:31:35 crc kubenswrapper[4909]: E0202 10:31:35.645950 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:31:36.645944859 +0000 UTC m=+22.392045594 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:31:35 crc kubenswrapper[4909]: E0202 10:31:35.645959 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 10:31:36.645954859 +0000 UTC m=+22.392055594 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.646523 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.656014 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9d
a410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.657445 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.663387 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.671663 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.680407 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.689632 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.702702 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.712437 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.722133 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.730447 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.738850 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.748805 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.757333 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.760564 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-f49tk"] Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.760897 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-f49tk" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.762145 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.762416 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.763257 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.765490 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.777338 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.783790 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.790541 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.798007 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.809079 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.817889 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.827502 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.836135 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.845631 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.847847 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxwr2\" (UniqueName: \"kubernetes.io/projected/f7ca81ab-b06b-4e03-879a-fb5546436e54-kube-api-access-zxwr2\") pod \"node-resolver-f49tk\" (UID: \"f7ca81ab-b06b-4e03-879a-fb5546436e54\") " pod="openshift-dns/node-resolver-f49tk" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.847894 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f7ca81ab-b06b-4e03-879a-fb5546436e54-hosts-file\") pod \"node-resolver-f49tk\" (UID: \"f7ca81ab-b06b-4e03-879a-fb5546436e54\") " pod="openshift-dns/node-resolver-f49tk" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.854677 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.948660 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxwr2\" (UniqueName: \"kubernetes.io/projected/f7ca81ab-b06b-4e03-879a-fb5546436e54-kube-api-access-zxwr2\") pod \"node-resolver-f49tk\" (UID: \"f7ca81ab-b06b-4e03-879a-fb5546436e54\") " pod="openshift-dns/node-resolver-f49tk" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.948712 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f7ca81ab-b06b-4e03-879a-fb5546436e54-hosts-file\") pod \"node-resolver-f49tk\" (UID: \"f7ca81ab-b06b-4e03-879a-fb5546436e54\") " pod="openshift-dns/node-resolver-f49tk" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.948771 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/f7ca81ab-b06b-4e03-879a-fb5546436e54-hosts-file\") pod \"node-resolver-f49tk\" (UID: \"f7ca81ab-b06b-4e03-879a-fb5546436e54\") " pod="openshift-dns/node-resolver-f49tk" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.965101 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxwr2\" (UniqueName: \"kubernetes.io/projected/f7ca81ab-b06b-4e03-879a-fb5546436e54-kube-api-access-zxwr2\") pod \"node-resolver-f49tk\" (UID: \"f7ca81ab-b06b-4e03-879a-fb5546436e54\") " pod="openshift-dns/node-resolver-f49tk" Feb 02 10:31:35 crc kubenswrapper[4909]: I0202 10:31:35.974175 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 00:15:51.447334372 +0000 UTC Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.071261 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-f49tk" Feb 02 10:31:36 crc kubenswrapper[4909]: W0202 10:31:36.081135 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7ca81ab_b06b_4e03_879a_fb5546436e54.slice/crio-0e094a73385fd3ccad3e4ee44f92e911903ad1f8ff6e6aaedebd84c0318a4fe0 WatchSource:0}: Error finding container 0e094a73385fd3ccad3e4ee44f92e911903ad1f8ff6e6aaedebd84c0318a4fe0: Status 404 returned error can't find the container with id 0e094a73385fd3ccad3e4ee44f92e911903ad1f8ff6e6aaedebd84c0318a4fe0 Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.134544 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-ftn2z"] Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.134881 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.136200 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.137290 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.138198 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-6t82h"] Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.138391 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.138721 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-qnbvb"] Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.138921 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.138941 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6t82h" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.139770 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.139849 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.139955 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.140076 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-775zr"] Feb 02 10:31:36 crc kubenswrapper[4909]: W0202 10:31:36.140682 4909 reflector.go:561] object-"openshift-multus"/"multus-daemon-config": failed to list *v1.ConfigMap: configmaps "multus-daemon-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Feb 02 10:31:36 crc kubenswrapper[4909]: W0202 10:31:36.140692 4909 reflector.go:561] object-"openshift-multus"/"default-dockercfg-2q5b6": failed to list *v1.Secret: secrets "default-dockercfg-2q5b6" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Feb 02 10:31:36 crc kubenswrapper[4909]: E0202 10:31:36.140723 4909 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"default-dockercfg-2q5b6\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"default-dockercfg-2q5b6\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' 
and this object" logger="UnhandledError" Feb 02 10:31:36 crc kubenswrapper[4909]: E0202 10:31:36.140716 4909 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"multus-daemon-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"multus-daemon-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.140951 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.141176 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.141549 4909 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a" exitCode=255 Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.141618 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a"} Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.141648 4909 scope.go:117] "RemoveContainer" containerID="e2a9dc57fe7e369d66cd5152e7d90152f908c3c09f726725480ed13fa7493945" Feb 02 10:31:36 crc kubenswrapper[4909]: W0202 10:31:36.141974 4909 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert": failed to list *v1.Secret: secrets "ovn-node-metrics-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and 
this object Feb 02 10:31:36 crc kubenswrapper[4909]: E0202 10:31:36.141993 4909 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-node-metrics-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.142163 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.142619 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.143417 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.144343 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 02 10:31:36 crc kubenswrapper[4909]: W0202 10:31:36.151022 4909 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovnkube-config": failed to list *v1.ConfigMap: configmaps "ovnkube-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Feb 02 10:31:36 crc kubenswrapper[4909]: E0202 10:31:36.151069 4909 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"ovnkube-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": 
no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 02 10:31:36 crc kubenswrapper[4909]: W0202 10:31:36.151135 4909 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl": failed to list *v1.Secret: secrets "ovn-kubernetes-node-dockercfg-pwtwl" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Feb 02 10:31:36 crc kubenswrapper[4909]: E0202 10:31:36.151147 4909 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pwtwl\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-kubernetes-node-dockercfg-pwtwl\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 02 10:31:36 crc kubenswrapper[4909]: W0202 10:31:36.154852 4909 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovnkube-script-lib": failed to list *v1.ConfigMap: configmaps "ovnkube-script-lib" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Feb 02 10:31:36 crc kubenswrapper[4909]: W0202 10:31:36.154914 4909 reflector.go:561] object-"openshift-ovn-kubernetes"/"env-overrides": failed to list *v1.ConfigMap: configmaps "env-overrides" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Feb 02 10:31:36 crc kubenswrapper[4909]: E0202 10:31:36.154926 4909 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\": Failed to watch *v1.ConfigMap: failed to list 
*v1.ConfigMap: configmaps \"ovnkube-script-lib\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 02 10:31:36 crc kubenswrapper[4909]: W0202 10:31:36.154952 4909 reflector.go:561] object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Feb 02 10:31:36 crc kubenswrapper[4909]: E0202 10:31:36.154963 4909 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"env-overrides\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"env-overrides\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 02 10:31:36 crc kubenswrapper[4909]: E0202 10:31:36.154983 4909 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 02 10:31:36 crc kubenswrapper[4909]: W0202 10:31:36.154924 4909 reflector.go:561] object-"openshift-ovn-kubernetes"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship 
found between node 'crc' and this object Feb 02 10:31:36 crc kubenswrapper[4909]: E0202 10:31:36.155013 4909 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.155034 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.155213 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-f49tk" event={"ID":"f7ca81ab-b06b-4e03-879a-fb5546436e54","Type":"ContainerStarted","Data":"0e094a73385fd3ccad3e4ee44f92e911903ad1f8ff6e6aaedebd84c0318a4fe0"} Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.156806 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"dc4bd3e29f4fdde54a9d6be4ab7d3b794778f349216b27b5aac99917fb83b205"} Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.165213 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60"} Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.165267 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012"} Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.165280 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"859e0a5e0043f6b96ac6a7b967c6d1052c900e6b1553082ee083ae10e794ff98"} Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.171153 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4"} Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.171191 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"328b0b9a547692d28ca11436bb3bdcd10d5c59cea5a9dfd9c4d31af8e2fb1133"} Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.175128 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:36Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:36 crc kubenswrapper[4909]: E0202 10:31:36.180698 4909 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.193747 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:36Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.211432 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:36Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.236273 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:36Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.252086 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-slash\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.252134 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-etc-openvswitch\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.252156 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-cni-netd\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.252262 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7de68b6c-f308-498c-95a3-27c9caf44f4f-mcd-auth-proxy-config\") pod \"machine-config-daemon-ftn2z\" (UID: \"7de68b6c-f308-498c-95a3-27c9caf44f4f\") " pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.252341 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-host-run-k8s-cni-cncf-io\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.252384 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-host-var-lib-kubelet\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.252418 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-node-log\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.252439 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-cni-bin\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.252549 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmlrc\" (UniqueName: \"kubernetes.io/projected/7de68b6c-f308-498c-95a3-27c9caf44f4f-kube-api-access-dmlrc\") pod \"machine-config-daemon-ftn2z\" (UID: \"7de68b6c-f308-498c-95a3-27c9caf44f4f\") " pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.252589 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-systemd-units\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.252613 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-run-systemd\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.252685 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-os-release\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.252722 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-hostroot\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.252746 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l8pk\" (UniqueName: \"kubernetes.io/projected/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-kube-api-access-7l8pk\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.252770 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-4tkj6\" (UniqueName: \"kubernetes.io/projected/ca5084bc-8bd1-4964-9a52-384222fc8374-kube-api-access-4tkj6\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.252792 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-run-openvswitch\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.252833 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.252858 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-multus-daemon-config\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.252878 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-kubelet\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.252899 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-cni-binary-copy\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.252919 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9776f741-d318-4076-b337-a496344f1d2d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6t82h\" (UID: \"9776f741-d318-4076-b337-a496344f1d2d\") " pod="openshift-multus/multus-additional-cni-plugins-6t82h" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.252946 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhk9b\" (UniqueName: \"kubernetes.io/projected/9776f741-d318-4076-b337-a496344f1d2d-kube-api-access-mhk9b\") pod \"multus-additional-cni-plugins-6t82h\" (UID: \"9776f741-d318-4076-b337-a496344f1d2d\") " pod="openshift-multus/multus-additional-cni-plugins-6t82h" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.252975 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-multus-socket-dir-parent\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.252995 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-var-lib-openvswitch\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.253015 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-run-ovn-kubernetes\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.253036 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-host-run-netns\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.253055 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-host-var-lib-cni-bin\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.253071 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-etc-kubernetes\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.253090 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-log-socket\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.253104 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ca5084bc-8bd1-4964-9a52-384222fc8374-env-overrides\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.253127 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-multus-cni-dir\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.253142 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-multus-conf-dir\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.253155 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-host-run-multus-certs\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.253169 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9776f741-d318-4076-b337-a496344f1d2d-cnibin\") pod \"multus-additional-cni-plugins-6t82h\" (UID: \"9776f741-d318-4076-b337-a496344f1d2d\") " 
pod="openshift-multus/multus-additional-cni-plugins-6t82h" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.253181 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9776f741-d318-4076-b337-a496344f1d2d-os-release\") pod \"multus-additional-cni-plugins-6t82h\" (UID: \"9776f741-d318-4076-b337-a496344f1d2d\") " pod="openshift-multus/multus-additional-cni-plugins-6t82h" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.253194 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-run-netns\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.253210 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7de68b6c-f308-498c-95a3-27c9caf44f4f-rootfs\") pod \"machine-config-daemon-ftn2z\" (UID: \"7de68b6c-f308-498c-95a3-27c9caf44f4f\") " pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.253229 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7de68b6c-f308-498c-95a3-27c9caf44f4f-proxy-tls\") pod \"machine-config-daemon-ftn2z\" (UID: \"7de68b6c-f308-498c-95a3-27c9caf44f4f\") " pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.253250 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-run-ovn\") pod 
\"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.253270 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ca5084bc-8bd1-4964-9a52-384222fc8374-ovn-node-metrics-cert\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.253293 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ca5084bc-8bd1-4964-9a52-384222fc8374-ovnkube-script-lib\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.253313 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9776f741-d318-4076-b337-a496344f1d2d-system-cni-dir\") pod \"multus-additional-cni-plugins-6t82h\" (UID: \"9776f741-d318-4076-b337-a496344f1d2d\") " pod="openshift-multus/multus-additional-cni-plugins-6t82h" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.253343 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-system-cni-dir\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.253363 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-cnibin\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.253383 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-host-var-lib-cni-multus\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.253438 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ca5084bc-8bd1-4964-9a52-384222fc8374-ovnkube-config\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.253473 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9776f741-d318-4076-b337-a496344f1d2d-cni-binary-copy\") pod \"multus-additional-cni-plugins-6t82h\" (UID: \"9776f741-d318-4076-b337-a496344f1d2d\") " pod="openshift-multus/multus-additional-cni-plugins-6t82h" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.253515 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9776f741-d318-4076-b337-a496344f1d2d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6t82h\" (UID: \"9776f741-d318-4076-b337-a496344f1d2d\") " pod="openshift-multus/multus-additional-cni-plugins-6t82h" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.255701 4909 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-f49tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:36Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.267805 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:36Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.279608 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:36Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.290697 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.290781 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:36Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.291195 4909 scope.go:117] "RemoveContainer" containerID="98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a" Feb 02 10:31:36 crc kubenswrapper[4909]: E0202 10:31:36.291442 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.302676 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:36Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.312282 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:36Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.322708 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:36Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.333444 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:36Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.343226 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:36Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354162 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-multus-daemon-config\") pod \"multus-qnbvb\" (UID: 
\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354199 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-kubelet\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354216 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9776f741-d318-4076-b337-a496344f1d2d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6t82h\" (UID: \"9776f741-d318-4076-b337-a496344f1d2d\") " pod="openshift-multus/multus-additional-cni-plugins-6t82h" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354232 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhk9b\" (UniqueName: \"kubernetes.io/projected/9776f741-d318-4076-b337-a496344f1d2d-kube-api-access-mhk9b\") pod \"multus-additional-cni-plugins-6t82h\" (UID: \"9776f741-d318-4076-b337-a496344f1d2d\") " pod="openshift-multus/multus-additional-cni-plugins-6t82h" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354249 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-cni-binary-copy\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354266 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-multus-socket-dir-parent\") pod \"multus-qnbvb\" (UID: 
\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354282 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-var-lib-openvswitch\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354298 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-run-ovn-kubernetes\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354311 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-log-socket\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354326 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ca5084bc-8bd1-4964-9a52-384222fc8374-env-overrides\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354342 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-host-run-netns\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" 
Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354359 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-host-var-lib-cni-bin\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354377 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-etc-kubernetes\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354393 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9776f741-d318-4076-b337-a496344f1d2d-cnibin\") pod \"multus-additional-cni-plugins-6t82h\" (UID: \"9776f741-d318-4076-b337-a496344f1d2d\") " pod="openshift-multus/multus-additional-cni-plugins-6t82h" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354407 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9776f741-d318-4076-b337-a496344f1d2d-os-release\") pod \"multus-additional-cni-plugins-6t82h\" (UID: \"9776f741-d318-4076-b337-a496344f1d2d\") " pod="openshift-multus/multus-additional-cni-plugins-6t82h" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354423 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-multus-cni-dir\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354437 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-multus-conf-dir\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354453 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-host-run-multus-certs\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354468 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7de68b6c-f308-498c-95a3-27c9caf44f4f-rootfs\") pod \"machine-config-daemon-ftn2z\" (UID: \"7de68b6c-f308-498c-95a3-27c9caf44f4f\") " pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354483 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7de68b6c-f308-498c-95a3-27c9caf44f4f-proxy-tls\") pod \"machine-config-daemon-ftn2z\" (UID: \"7de68b6c-f308-498c-95a3-27c9caf44f4f\") " pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354497 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-run-netns\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354523 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-run-ovn\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354538 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ca5084bc-8bd1-4964-9a52-384222fc8374-ovn-node-metrics-cert\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354554 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ca5084bc-8bd1-4964-9a52-384222fc8374-ovnkube-script-lib\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354569 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9776f741-d318-4076-b337-a496344f1d2d-system-cni-dir\") pod \"multus-additional-cni-plugins-6t82h\" (UID: \"9776f741-d318-4076-b337-a496344f1d2d\") " pod="openshift-multus/multus-additional-cni-plugins-6t82h" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354583 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ca5084bc-8bd1-4964-9a52-384222fc8374-ovnkube-config\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354576 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-run-ovn-kubernetes\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354598 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9776f741-d318-4076-b337-a496344f1d2d-cni-binary-copy\") pod \"multus-additional-cni-plugins-6t82h\" (UID: \"9776f741-d318-4076-b337-a496344f1d2d\") " pod="openshift-multus/multus-additional-cni-plugins-6t82h" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354661 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9776f741-d318-4076-b337-a496344f1d2d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6t82h\" (UID: \"9776f741-d318-4076-b337-a496344f1d2d\") " pod="openshift-multus/multus-additional-cni-plugins-6t82h" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354702 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-system-cni-dir\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354721 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-cnibin\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354742 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-host-var-lib-cni-multus\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354762 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-slash\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354780 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-etc-openvswitch\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354801 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-cni-netd\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354858 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-host-var-lib-kubelet\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354875 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-node-log\") pod \"ovnkube-node-775zr\" (UID: 
\"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354892 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-cni-bin\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354914 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7de68b6c-f308-498c-95a3-27c9caf44f4f-mcd-auth-proxy-config\") pod \"machine-config-daemon-ftn2z\" (UID: \"7de68b6c-f308-498c-95a3-27c9caf44f4f\") " pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354932 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-host-run-k8s-cni-cncf-io\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354970 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmlrc\" (UniqueName: \"kubernetes.io/projected/7de68b6c-f308-498c-95a3-27c9caf44f4f-kube-api-access-dmlrc\") pod \"machine-config-daemon-ftn2z\" (UID: \"7de68b6c-f308-498c-95a3-27c9caf44f4f\") " pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.354991 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-systemd-units\") pod \"ovnkube-node-775zr\" (UID: 
\"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.355010 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-run-systemd\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.355175 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l8pk\" (UniqueName: \"kubernetes.io/projected/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-kube-api-access-7l8pk\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.355202 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tkj6\" (UniqueName: \"kubernetes.io/projected/ca5084bc-8bd1-4964-9a52-384222fc8374-kube-api-access-4tkj6\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.355305 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-os-release\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.355314 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9776f741-d318-4076-b337-a496344f1d2d-cni-binary-copy\") pod \"multus-additional-cni-plugins-6t82h\" (UID: \"9776f741-d318-4076-b337-a496344f1d2d\") " 
pod="openshift-multus/multus-additional-cni-plugins-6t82h" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.355328 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-hostroot\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.355348 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-run-openvswitch\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.355376 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.355383 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-log-socket\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.355464 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9776f741-d318-4076-b337-a496344f1d2d-cnibin\") pod \"multus-additional-cni-plugins-6t82h\" (UID: \"9776f741-d318-4076-b337-a496344f1d2d\") " pod="openshift-multus/multus-additional-cni-plugins-6t82h" Feb 02 
10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.355483 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9776f741-d318-4076-b337-a496344f1d2d-os-release\") pod \"multus-additional-cni-plugins-6t82h\" (UID: \"9776f741-d318-4076-b337-a496344f1d2d\") " pod="openshift-multus/multus-additional-cni-plugins-6t82h" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.356025 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7de68b6c-f308-498c-95a3-27c9caf44f4f-rootfs\") pod \"machine-config-daemon-ftn2z\" (UID: \"7de68b6c-f308-498c-95a3-27c9caf44f4f\") " pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.356060 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-multus-conf-dir\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.356086 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:36Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.356119 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-host-run-multus-certs\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.356167 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-run-netns\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.356272 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-run-ovn\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.356292 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-multus-socket-dir-parent\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.356311 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-cnibin\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.355963 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-multus-cni-dir\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " 
pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.356602 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-slash\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.356599 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-systemd-units\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.356630 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.356652 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-run-openvswitch\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.356653 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9776f741-d318-4076-b337-a496344f1d2d-system-cni-dir\") pod \"multus-additional-cni-plugins-6t82h\" (UID: \"9776f741-d318-4076-b337-a496344f1d2d\") " pod="openshift-multus/multus-additional-cni-plugins-6t82h" Feb 02 
10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.356683 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-host-var-lib-cni-multus\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.356692 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-host-run-netns\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.356713 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-var-lib-openvswitch\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.356726 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-host-var-lib-cni-bin\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.356749 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-etc-kubernetes\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.356750 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-system-cni-dir\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.356754 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9776f741-d318-4076-b337-a496344f1d2d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6t82h\" (UID: \"9776f741-d318-4076-b337-a496344f1d2d\") " pod="openshift-multus/multus-additional-cni-plugins-6t82h" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.356766 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-etc-openvswitch\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.356801 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-cni-netd\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.356799 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-host-run-k8s-cni-cncf-io\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.356856 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-host-var-lib-kubelet\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.356997 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-node-log\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.357024 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-cni-bin\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.357037 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-hostroot\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.357056 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-run-systemd\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.357058 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-kubelet\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.357230 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-os-release\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.357515 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7de68b6c-f308-498c-95a3-27c9caf44f4f-mcd-auth-proxy-config\") pod \"machine-config-daemon-ftn2z\" (UID: \"7de68b6c-f308-498c-95a3-27c9caf44f4f\") " pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.357745 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9776f741-d318-4076-b337-a496344f1d2d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6t82h\" (UID: \"9776f741-d318-4076-b337-a496344f1d2d\") " pod="openshift-multus/multus-additional-cni-plugins-6t82h" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.357877 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-cni-binary-copy\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.363324 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7de68b6c-f308-498c-95a3-27c9caf44f4f-proxy-tls\") pod \"machine-config-daemon-ftn2z\" (UID: \"7de68b6c-f308-498c-95a3-27c9caf44f4f\") " pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 10:31:36 crc 
kubenswrapper[4909]: I0202 10:31:36.371760 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a9dc57fe7e369d66cd5152e7d90152f908c3c09f726725480ed13fa7493945\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:28Z\\\",\\\"message\\\":\\\"W0202 10:31:18.178934 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0202 10:31:18.179237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1770028278 cert, and key in /tmp/serving-cert-1239714854/serving-signer.crt, /tmp/serving-cert-1239714854/serving-signer.key\\\\nI0202 10:31:18.393157 1 observer_polling.go:159] Starting file observer\\\\nW0202 10:31:18.397461 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 10:31:18.397710 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:18.398259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1239714854/tls.crt::/tmp/serving-cert-1239714854/tls.key\\\\\\\"\\\\nF0202 10:31:28.836889 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] 
\\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:36Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.372650 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhk9b\" (UniqueName: \"kubernetes.io/projected/9776f741-d318-4076-b337-a496344f1d2d-kube-api-access-mhk9b\") pod \"multus-additional-cni-plugins-6t82h\" (UID: \"9776f741-d318-4076-b337-a496344f1d2d\") " pod="openshift-multus/multus-additional-cni-plugins-6t82h" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.373293 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l8pk\" (UniqueName: \"kubernetes.io/projected/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-kube-api-access-7l8pk\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.377903 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmlrc\" (UniqueName: \"kubernetes.io/projected/7de68b6c-f308-498c-95a3-27c9caf44f4f-kube-api-access-dmlrc\") pod \"machine-config-daemon-ftn2z\" (UID: \"7de68b6c-f308-498c-95a3-27c9caf44f4f\") " pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.383252 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:36Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.395688 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:36Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.412686 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:36Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.429878 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:36Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.449146 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.462225 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:36Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.473740 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6t82h" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.486837 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\"
,\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"ima
geID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reaso
n\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:36Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:36 crc kubenswrapper[4909]: W0202 10:31:36.495305 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9776f741_d318_4076_b337_a496344f1d2d.slice/crio-342bfd7db22d76dbb352b6e9da972e32f79f9c239ca16d568b47235a0af94d83 WatchSource:0}: Error finding container 342bfd7db22d76dbb352b6e9da972e32f79f9c239ca16d568b47235a0af94d83: Status 404 returned error can't find the container with id 342bfd7db22d76dbb352b6e9da972e32f79f9c239ca16d568b47235a0af94d83 Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.501109 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:36Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.516323 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:36Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.531618 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:36Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.557101 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:31:36 crc kubenswrapper[4909]: E0202 10:31:36.557289 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:31:38.557264462 +0000 UTC m=+24.303365197 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.658404 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.658478 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.658539 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.658558 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:31:36 crc kubenswrapper[4909]: E0202 10:31:36.658609 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:31:36 crc kubenswrapper[4909]: E0202 10:31:36.658636 4909 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:31:36 crc kubenswrapper[4909]: E0202 10:31:36.658647 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:31:36 crc kubenswrapper[4909]: E0202 10:31:36.658666 4909 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:31:36 crc kubenswrapper[4909]: E0202 10:31:36.658695 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:31:38.658678382 +0000 UTC m=+24.404779117 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:31:36 crc kubenswrapper[4909]: E0202 10:31:36.658695 4909 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:31:36 crc kubenswrapper[4909]: E0202 10:31:36.658704 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:31:36 crc kubenswrapper[4909]: E0202 10:31:36.658724 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 10:31:38.658703553 +0000 UTC m=+24.404804288 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:31:36 crc kubenswrapper[4909]: E0202 10:31:36.658729 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:31:36 crc kubenswrapper[4909]: E0202 10:31:36.658745 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:31:38.658738964 +0000 UTC m=+24.404839699 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:31:36 crc kubenswrapper[4909]: E0202 10:31:36.658746 4909 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:31:36 crc kubenswrapper[4909]: E0202 10:31:36.658779 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 10:31:38.658770825 +0000 UTC m=+24.404871560 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.974581 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 10:41:56.245634744 +0000 UTC Feb 02 10:31:36 crc kubenswrapper[4909]: I0202 10:31:36.982002 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.016406 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.016458 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.016502 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:31:37 crc kubenswrapper[4909]: E0202 10:31:37.016526 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:31:37 crc kubenswrapper[4909]: E0202 10:31:37.016595 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:31:37 crc kubenswrapper[4909]: E0202 10:31:37.016673 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.020131 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.020697 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.021508 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.022101 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" 
path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.022643 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.023198 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.023801 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.024401 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.025054 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.025630 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.026264 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.027150 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.027765 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.028411 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.029043 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.029557 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.030993 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.031436 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.032169 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.033414 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.034022 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.034763 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.036006 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.036725 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.037207 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.038367 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.039133 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.039668 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.040313 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.040922 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.041383 4909 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.041777 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.044717 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.045446 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.045911 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.047644 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.048295 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.049390 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.050100 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.051113 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.051554 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.052578 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.053383 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.054540 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.055095 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.056156 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.056800 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.058029 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.058501 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.059441 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.060059 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.060664 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.061806 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.062392 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.175230 4909 generic.go:334] "Generic (PLEG): container finished" podID="9776f741-d318-4076-b337-a496344f1d2d" containerID="ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb" exitCode=0 Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.175304 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" event={"ID":"9776f741-d318-4076-b337-a496344f1d2d","Type":"ContainerDied","Data":"ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb"} Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.175336 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" event={"ID":"9776f741-d318-4076-b337-a496344f1d2d","Type":"ContainerStarted","Data":"342bfd7db22d76dbb352b6e9da972e32f79f9c239ca16d568b47235a0af94d83"} Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.178135 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-f49tk" event={"ID":"f7ca81ab-b06b-4e03-879a-fb5546436e54","Type":"ContainerStarted","Data":"848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b"} Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.179768 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af"} Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.179835 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"57158c7f1bb82761a2d89c6d387d7bf2743b574ed97a86684b55ed0855c7f013"} Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.179850 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"6ede642c1d632695d35434e30f9c3925a1365bf3991bf840ee65e5c508f78b50"} Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.182094 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.188370 4909 scope.go:117] "RemoveContainer" containerID="98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a" Feb 02 10:31:37 crc kubenswrapper[4909]: E0202 10:31:37.188736 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.195657 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:37Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.213633 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:37Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.234233 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:37Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.249996 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2a9dc57fe7e369d66cd5152e7d90152f908c3c09f726725480ed13fa7493945\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:28Z\\\",\\\"message\\\":\\\"W0202 10:31:18.178934 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0202 10:31:18.179237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1770028278 cert, and key in /tmp/serving-cert-1239714854/serving-signer.crt, /tmp/serving-cert-1239714854/serving-signer.key\\\\nI0202 10:31:18.393157 1 observer_polling.go:159] Starting file observer\\\\nW0202 10:31:18.397461 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0202 10:31:18.397710 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:18.398259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1239714854/tls.crt::/tmp/serving-cert-1239714854/tls.key\\\\\\\"\\\\nF0202 10:31:28.836889 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] 
\\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:37Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.262989 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:37Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.276876 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:37Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.291643 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:37Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.311955 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:37Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.329647 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:
16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:37Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.340536 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:37Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:37 crc kubenswrapper[4909]: E0202 10:31:37.357258 4909 configmap.go:193] Couldn't get configMap openshift-ovn-kubernetes/ovnkube-config: failed to sync configmap cache: timed out waiting for the condition Feb 02 10:31:37 crc kubenswrapper[4909]: E0202 10:31:37.357340 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca5084bc-8bd1-4964-9a52-384222fc8374-ovnkube-config podName:ca5084bc-8bd1-4964-9a52-384222fc8374 nodeName:}" failed. No retries permitted until 2026-02-02 10:31:37.857316112 +0000 UTC m=+23.603416847 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ovnkube-config" (UniqueName: "kubernetes.io/configmap/ca5084bc-8bd1-4964-9a52-384222fc8374-ovnkube-config") pod "ovnkube-node-775zr" (UID: "ca5084bc-8bd1-4964-9a52-384222fc8374") : failed to sync configmap cache: timed out waiting for the condition Feb 02 10:31:37 crc kubenswrapper[4909]: E0202 10:31:37.357371 4909 configmap.go:193] Couldn't get configMap openshift-ovn-kubernetes/env-overrides: failed to sync configmap cache: timed out waiting for the condition Feb 02 10:31:37 crc kubenswrapper[4909]: E0202 10:31:37.357390 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca5084bc-8bd1-4964-9a52-384222fc8374-env-overrides podName:ca5084bc-8bd1-4964-9a52-384222fc8374 nodeName:}" failed. No retries permitted until 2026-02-02 10:31:37.857384424 +0000 UTC m=+23.603485159 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "env-overrides" (UniqueName: "kubernetes.io/configmap/ca5084bc-8bd1-4964-9a52-384222fc8374-env-overrides") pod "ovnkube-node-775zr" (UID: "ca5084bc-8bd1-4964-9a52-384222fc8374") : failed to sync configmap cache: timed out waiting for the condition Feb 02 10:31:37 crc kubenswrapper[4909]: E0202 10:31:37.357403 4909 configmap.go:193] Couldn't get configMap openshift-multus/multus-daemon-config: failed to sync configmap cache: timed out waiting for the condition Feb 02 10:31:37 crc kubenswrapper[4909]: E0202 10:31:37.357422 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-multus-daemon-config podName:bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af nodeName:}" failed. No retries permitted until 2026-02-02 10:31:37.857416975 +0000 UTC m=+23.603517710 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "multus-daemon-config" (UniqueName: "kubernetes.io/configmap/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-multus-daemon-config") pod "multus-qnbvb" (UID: "bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af") : failed to sync configmap cache: timed out waiting for the condition Feb 02 10:31:37 crc kubenswrapper[4909]: E0202 10:31:37.357444 4909 secret.go:188] Couldn't get secret openshift-ovn-kubernetes/ovn-node-metrics-cert: failed to sync secret cache: timed out waiting for the condition Feb 02 10:31:37 crc kubenswrapper[4909]: E0202 10:31:37.357462 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca5084bc-8bd1-4964-9a52-384222fc8374-ovn-node-metrics-cert podName:ca5084bc-8bd1-4964-9a52-384222fc8374 nodeName:}" failed. No retries permitted until 2026-02-02 10:31:37.857456896 +0000 UTC m=+23.603557631 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "ovn-node-metrics-cert" (UniqueName: "kubernetes.io/secret/ca5084bc-8bd1-4964-9a52-384222fc8374-ovn-node-metrics-cert") pod "ovnkube-node-775zr" (UID: "ca5084bc-8bd1-4964-9a52-384222fc8374") : failed to sync secret cache: timed out waiting for the condition Feb 02 10:31:37 crc kubenswrapper[4909]: E0202 10:31:37.357476 4909 configmap.go:193] Couldn't get configMap openshift-ovn-kubernetes/ovnkube-script-lib: failed to sync configmap cache: timed out waiting for the condition Feb 02 10:31:37 crc kubenswrapper[4909]: E0202 10:31:37.357492 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca5084bc-8bd1-4964-9a52-384222fc8374-ovnkube-script-lib podName:ca5084bc-8bd1-4964-9a52-384222fc8374 nodeName:}" failed. No retries permitted until 2026-02-02 10:31:37.857488087 +0000 UTC m=+23.603588822 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "ovnkube-script-lib" (UniqueName: "kubernetes.io/configmap/ca5084bc-8bd1-4964-9a52-384222fc8374-ovnkube-script-lib") pod "ovnkube-node-775zr" (UID: "ca5084bc-8bd1-4964-9a52-384222fc8374") : failed to sync configmap cache: timed out waiting for the condition Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.358291 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:37Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.370300 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:37Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.385108 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:37Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.397939 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:37Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.410785 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:37Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.423802 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.424570 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:37Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.433506 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 02 10:31:37 
crc kubenswrapper[4909]: I0202 10:31:37.436423 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:37Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.456787 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tkj6\" (UniqueName: \"kubernetes.io/projected/ca5084bc-8bd1-4964-9a52-384222fc8374-kube-api-access-4tkj6\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.458598 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:37Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.471001 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:37Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.488032 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:37Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.498323 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:37Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.513786 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:37Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.534862 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:37Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.547550 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:
16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:37Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.558567 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:37Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.559791 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.566548 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:37Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.576923 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:37Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.610021 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.637042 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:37Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.651213 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.669967 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 
10:31:37.695763 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.718894 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.871109 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-multus-daemon-config\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.871160 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ca5084bc-8bd1-4964-9a52-384222fc8374-env-overrides\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.871186 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ca5084bc-8bd1-4964-9a52-384222fc8374-ovn-node-metrics-cert\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.871207 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ca5084bc-8bd1-4964-9a52-384222fc8374-ovnkube-script-lib\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.871227 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ca5084bc-8bd1-4964-9a52-384222fc8374-ovnkube-config\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.871771 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ca5084bc-8bd1-4964-9a52-384222fc8374-env-overrides\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.871922 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af-multus-daemon-config\") pod \"multus-qnbvb\" (UID: \"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\") " pod="openshift-multus/multus-qnbvb" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.871949 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ca5084bc-8bd1-4964-9a52-384222fc8374-ovnkube-config\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.872335 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ca5084bc-8bd1-4964-9a52-384222fc8374-ovnkube-script-lib\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.875184 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/ca5084bc-8bd1-4964-9a52-384222fc8374-ovn-node-metrics-cert\") pod \"ovnkube-node-775zr\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.964878 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qnbvb" Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.975256 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 08:24:58.770527118 +0000 UTC Feb 02 10:31:37 crc kubenswrapper[4909]: W0202 10:31:37.977687 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcf9f2a3_7bc1_4fb0_a51f_1855bf92b3af.slice/crio-da505cb3566821f6b77c41d7286c636eddb453b5779e6c5517c65bf63b3f95bc WatchSource:0}: Error finding container da505cb3566821f6b77c41d7286c636eddb453b5779e6c5517c65bf63b3f95bc: Status 404 returned error can't find the container with id da505cb3566821f6b77c41d7286c636eddb453b5779e6c5517c65bf63b3f95bc Feb 02 10:31:37 crc kubenswrapper[4909]: I0202 10:31:37.981981 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.191975 4909 generic.go:334] "Generic (PLEG): container finished" podID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerID="08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae" exitCode=0 Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.192153 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" event={"ID":"ca5084bc-8bd1-4964-9a52-384222fc8374","Type":"ContainerDied","Data":"08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae"} Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.192402 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" event={"ID":"ca5084bc-8bd1-4964-9a52-384222fc8374","Type":"ContainerStarted","Data":"35583ab6ade1e2b1d5fd17a8639ef2ac48ee7c4d5bc253dd5ba9f97f41e8182d"} Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.194340 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qnbvb" event={"ID":"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af","Type":"ContainerStarted","Data":"6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb"} Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.194376 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qnbvb" event={"ID":"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af","Type":"ContainerStarted","Data":"da505cb3566821f6b77c41d7286c636eddb453b5779e6c5517c65bf63b3f95bc"} Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.196031 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c"} Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.198244 4909 generic.go:334] "Generic 
(PLEG): container finished" podID="9776f741-d318-4076-b337-a496344f1d2d" containerID="e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113" exitCode=0 Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.198306 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" event={"ID":"9776f741-d318-4076-b337-a496344f1d2d","Type":"ContainerDied","Data":"e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113"} Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.211392 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:38Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.234341 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:38Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.250298 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:
16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:38Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.262201 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:38Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.272134 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:38Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.286498 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:38Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.301072 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:38Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.312275 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:38Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.325234 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:38Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.338019 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:38Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.356030 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:38Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.370283 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570
a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:38Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.385362 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:38Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.398691 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:38Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.409366 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T10:31:38Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.421687 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T10:31:38Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.438602 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:38Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.456321 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:38Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.477643 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:38Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.519933 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:38Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.559711 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:38Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.578453 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:31:38 crc kubenswrapper[4909]: E0202 10:31:38.578646 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:31:42.57861233 +0000 UTC m=+28.324713085 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.608301 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:38Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.641220 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:
16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:38Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:38 crc kubenswrapper[4909]: E0202 10:31:38.679353 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 
10:31:38 crc kubenswrapper[4909]: E0202 10:31:38.679387 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:31:38 crc kubenswrapper[4909]: E0202 10:31:38.679401 4909 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:31:38 crc kubenswrapper[4909]: E0202 10:31:38.679447 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 10:31:42.679433662 +0000 UTC m=+28.425534397 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.679480 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.679504 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" 
(UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.679535 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.679557 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:31:38 crc kubenswrapper[4909]: E0202 10:31:38.679619 4909 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:31:38 crc kubenswrapper[4909]: E0202 10:31:38.679641 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:31:42.679635038 +0000 UTC m=+28.425735773 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:31:38 crc kubenswrapper[4909]: E0202 10:31:38.679706 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:31:38 crc kubenswrapper[4909]: E0202 10:31:38.679717 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:31:38 crc kubenswrapper[4909]: E0202 10:31:38.679725 4909 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:31:38 crc kubenswrapper[4909]: E0202 10:31:38.679744 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 10:31:42.679738011 +0000 UTC m=+28.425838746 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:31:38 crc kubenswrapper[4909]: E0202 10:31:38.679794 4909 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:31:38 crc kubenswrapper[4909]: E0202 10:31:38.679839 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:31:42.679833484 +0000 UTC m=+28.425934219 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.680151 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574ed97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-02T10:31:38Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.719696 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:38Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.757606 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:38Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.798389 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:38Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.841086 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:38Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:38 crc kubenswrapper[4909]: I0202 10:31:38.975979 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 13:28:08.560610699 +0000 UTC Feb 02 10:31:39 crc kubenswrapper[4909]: I0202 10:31:39.015888 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:31:39 crc kubenswrapper[4909]: I0202 10:31:39.016786 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:31:39 crc kubenswrapper[4909]: E0202 10:31:39.016827 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:31:39 crc kubenswrapper[4909]: I0202 10:31:39.016912 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:31:39 crc kubenswrapper[4909]: E0202 10:31:39.017045 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:31:39 crc kubenswrapper[4909]: E0202 10:31:39.017147 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:31:39 crc kubenswrapper[4909]: I0202 10:31:39.205143 4909 generic.go:334] "Generic (PLEG): container finished" podID="9776f741-d318-4076-b337-a496344f1d2d" containerID="e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f" exitCode=0 Feb 02 10:31:39 crc kubenswrapper[4909]: I0202 10:31:39.205212 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" event={"ID":"9776f741-d318-4076-b337-a496344f1d2d","Type":"ContainerDied","Data":"e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f"} Feb 02 10:31:39 crc kubenswrapper[4909]: I0202 10:31:39.212088 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" event={"ID":"ca5084bc-8bd1-4964-9a52-384222fc8374","Type":"ContainerStarted","Data":"034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f"} Feb 02 10:31:39 crc kubenswrapper[4909]: 
I0202 10:31:39.212219 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" event={"ID":"ca5084bc-8bd1-4964-9a52-384222fc8374","Type":"ContainerStarted","Data":"56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85"} Feb 02 10:31:39 crc kubenswrapper[4909]: I0202 10:31:39.212281 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" event={"ID":"ca5084bc-8bd1-4964-9a52-384222fc8374","Type":"ContainerStarted","Data":"2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177"} Feb 02 10:31:39 crc kubenswrapper[4909]: I0202 10:31:39.212348 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" event={"ID":"ca5084bc-8bd1-4964-9a52-384222fc8374","Type":"ContainerStarted","Data":"2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983"} Feb 02 10:31:39 crc kubenswrapper[4909]: I0202 10:31:39.212404 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" event={"ID":"ca5084bc-8bd1-4964-9a52-384222fc8374","Type":"ContainerStarted","Data":"55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2"} Feb 02 10:31:39 crc kubenswrapper[4909]: I0202 10:31:39.212458 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" event={"ID":"ca5084bc-8bd1-4964-9a52-384222fc8374","Type":"ContainerStarted","Data":"1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b"} Feb 02 10:31:39 crc kubenswrapper[4909]: I0202 10:31:39.221124 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:39 crc kubenswrapper[4909]: I0202 
10:31:39.243979 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:39 crc kubenswrapper[4909]: I0202 10:31:39.258301 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:39 crc kubenswrapper[4909]: I0202 10:31:39.269634 4909 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574ed97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:39 crc kubenswrapper[4909]: I0202 10:31:39.279670 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:39 crc kubenswrapper[4909]: I0202 10:31:39.291172 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:39 crc kubenswrapper[4909]: I0202 10:31:39.304107 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:39 crc kubenswrapper[4909]: I0202 10:31:39.319314 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T10:31:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:39 crc kubenswrapper[4909]: I0202 10:31:39.334950 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:39 crc kubenswrapper[4909]: I0202 10:31:39.348004 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:39 crc kubenswrapper[4909]: I0202 10:31:39.367470 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:39 crc kubenswrapper[4909]: I0202 10:31:39.383767 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:39 crc kubenswrapper[4909]: I0202 10:31:39.397158 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:39 crc kubenswrapper[4909]: I0202 10:31:39.411325 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:39Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:39 crc kubenswrapper[4909]: I0202 10:31:39.976900 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 08:02:11.014108396 +0000 UTC Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.215752 4909 generic.go:334] "Generic (PLEG): container finished" podID="9776f741-d318-4076-b337-a496344f1d2d" containerID="ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d" exitCode=0 Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.215839 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" event={"ID":"9776f741-d318-4076-b337-a496344f1d2d","Type":"ContainerDied","Data":"ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d"} Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.227708 4909 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:40Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.257714 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:40Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.271825 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:40Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.284455 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:40Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.296832 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:40Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.308637 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:40Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.323664 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:40Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.343996 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:40Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.359515 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:
16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:40Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.372790 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:40Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.394852 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:40Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.416628 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:40Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.428011 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:40Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.438730 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:40Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.825566 4909 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.828693 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.828725 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.828735 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.828874 4909 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.835397 4909 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.835684 4909 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.836645 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.836675 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.836684 4909 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.836697 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.836708 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:40Z","lastTransitionTime":"2026-02-02T10:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:40 crc kubenswrapper[4909]: E0202 10:31:40.852780 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:40Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.857230 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.857268 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.857278 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.857292 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.857303 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:40Z","lastTransitionTime":"2026-02-02T10:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:40 crc kubenswrapper[4909]: E0202 10:31:40.872024 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:40Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.875544 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.875576 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.875588 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.875604 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.875617 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:40Z","lastTransitionTime":"2026-02-02T10:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:40 crc kubenswrapper[4909]: E0202 10:31:40.887097 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:40Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.890882 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.890939 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.890949 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.890970 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.890984 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:40Z","lastTransitionTime":"2026-02-02T10:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:40 crc kubenswrapper[4909]: E0202 10:31:40.903538 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[...],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:40Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.906890 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.906947 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.906957 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.906974 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.906984 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:40Z","lastTransitionTime":"2026-02-02T10:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:40 crc kubenswrapper[4909]: E0202 10:31:40.919330 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[...],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:40Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:40 crc kubenswrapper[4909]: E0202 10:31:40.919448 4909 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.921169 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.921199 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.921208 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.921225 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.921235 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:40Z","lastTransitionTime":"2026-02-02T10:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:40 crc kubenswrapper[4909]: I0202 10:31:40.978036 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 19:57:46.019184994 +0000 UTC Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.015615 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.015638 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.015686 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:31:41 crc kubenswrapper[4909]: E0202 10:31:41.015728 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:31:41 crc kubenswrapper[4909]: E0202 10:31:41.015783 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:31:41 crc kubenswrapper[4909]: E0202 10:31:41.015915 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.022798 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.022846 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.022857 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.022869 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.022877 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:41Z","lastTransitionTime":"2026-02-02T10:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.124835 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.124869 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.124877 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.124889 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.124898 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:41Z","lastTransitionTime":"2026-02-02T10:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.222515 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" event={"ID":"ca5084bc-8bd1-4964-9a52-384222fc8374","Type":"ContainerStarted","Data":"311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700"} Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.225021 4909 generic.go:334] "Generic (PLEG): container finished" podID="9776f741-d318-4076-b337-a496344f1d2d" containerID="804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198" exitCode=0 Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.225089 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" event={"ID":"9776f741-d318-4076-b337-a496344f1d2d","Type":"ContainerDied","Data":"804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198"} Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.226301 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.226332 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.226342 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.226355 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.226366 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:41Z","lastTransitionTime":"2026-02-02T10:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.251250 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:41Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.266521 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:41Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.279078 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:41Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.295764 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:41Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.310183 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:41Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.324024 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:41Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.328023 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.328060 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.328072 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.328094 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.328106 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:41Z","lastTransitionTime":"2026-02-02T10:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.335012 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:41Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.347342 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3551
2335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:41Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.361238 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570
a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:41Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.373175 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:41Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.385339 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:41Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.397469 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:41Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.410056 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:41Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.425129 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:41Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.430488 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.430522 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.430531 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.430547 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.430558 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:41Z","lastTransitionTime":"2026-02-02T10:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.533150 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.533182 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.533191 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.533206 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.533217 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:41Z","lastTransitionTime":"2026-02-02T10:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.635640 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.635671 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.635679 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.635691 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.635701 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:41Z","lastTransitionTime":"2026-02-02T10:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.738084 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.738124 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.738134 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.738148 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.738159 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:41Z","lastTransitionTime":"2026-02-02T10:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.840635 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.840668 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.840676 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.840688 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.840697 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:41Z","lastTransitionTime":"2026-02-02T10:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.879876 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.880461 4909 scope.go:117] "RemoveContainer" containerID="98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a" Feb 02 10:31:41 crc kubenswrapper[4909]: E0202 10:31:41.880685 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.942963 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.943014 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.943025 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.943042 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.943053 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:41Z","lastTransitionTime":"2026-02-02T10:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:41 crc kubenswrapper[4909]: I0202 10:31:41.978230 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 11:48:56.103418951 +0000 UTC Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.046082 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.046125 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.046137 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.046159 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.046171 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:42Z","lastTransitionTime":"2026-02-02T10:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.149108 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.149182 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.149201 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.149257 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.149290 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:42Z","lastTransitionTime":"2026-02-02T10:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.231158 4909 generic.go:334] "Generic (PLEG): container finished" podID="9776f741-d318-4076-b337-a496344f1d2d" containerID="1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b" exitCode=0 Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.231205 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" event={"ID":"9776f741-d318-4076-b337-a496344f1d2d","Type":"ContainerDied","Data":"1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b"} Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.247512 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:42Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.251488 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.251600 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.251617 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.251640 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.251661 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:42Z","lastTransitionTime":"2026-02-02T10:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.261362 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:42Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.272592 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:42Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.287397 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:42Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.302544 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:42Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.314768 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:42Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.326139 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:42Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.339422 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:42Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.351371 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:42Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.353470 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:42 crc 
kubenswrapper[4909]: I0202 10:31:42.353505 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.353514 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.353527 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.353538 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:42Z","lastTransitionTime":"2026-02-02T10:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.367057 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:42Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.386401 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:42Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.398989 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:
16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:42Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.410039 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:42Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.424017 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:42Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.455446 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.455483 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.455491 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.455503 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.455511 4909 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:42Z","lastTransitionTime":"2026-02-02T10:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.558280 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.558319 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.558328 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.558343 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.558351 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:42Z","lastTransitionTime":"2026-02-02T10:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.619167 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:31:42 crc kubenswrapper[4909]: E0202 10:31:42.619296 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:31:50.619279415 +0000 UTC m=+36.365380150 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.663145 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.663188 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.663197 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.663212 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:42 crc kubenswrapper[4909]: 
I0202 10:31:42.663222 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:42Z","lastTransitionTime":"2026-02-02T10:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.719714 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.719750 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.719781 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.719798 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:31:42 crc kubenswrapper[4909]: E0202 10:31:42.719879 4909 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:31:42 crc kubenswrapper[4909]: E0202 10:31:42.719943 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:31:42 crc kubenswrapper[4909]: E0202 10:31:42.719961 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:31:42 crc kubenswrapper[4909]: E0202 10:31:42.720002 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:31:42 crc kubenswrapper[4909]: E0202 10:31:42.720014 4909 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:31:42 crc kubenswrapper[4909]: E0202 10:31:42.719965 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:31:50.719917591 +0000 UTC m=+36.466018316 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:31:42 crc kubenswrapper[4909]: E0202 10:31:42.720082 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 10:31:50.720067306 +0000 UTC m=+36.466168041 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:31:42 crc kubenswrapper[4909]: E0202 10:31:42.719977 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:31:42 crc kubenswrapper[4909]: E0202 10:31:42.720096 4909 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:31:42 crc kubenswrapper[4909]: E0202 10:31:42.720117 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 
nodeName:}" failed. No retries permitted until 2026-02-02 10:31:50.720111797 +0000 UTC m=+36.466212532 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:31:42 crc kubenswrapper[4909]: E0202 10:31:42.719959 4909 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:31:42 crc kubenswrapper[4909]: E0202 10:31:42.720145 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:31:50.720140458 +0000 UTC m=+36.466241183 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.768126 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.768173 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.768182 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.768196 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.768207 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:42Z","lastTransitionTime":"2026-02-02T10:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.870459 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.870542 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.870555 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.870569 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.870580 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:42Z","lastTransitionTime":"2026-02-02T10:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.972953 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.972999 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.973011 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.973029 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.973041 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:42Z","lastTransitionTime":"2026-02-02T10:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:42 crc kubenswrapper[4909]: I0202 10:31:42.979089 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 05:51:05.312298918 +0000 UTC Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.016542 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.016580 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.016588 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:31:43 crc kubenswrapper[4909]: E0202 10:31:43.016692 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:31:43 crc kubenswrapper[4909]: E0202 10:31:43.017025 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:31:43 crc kubenswrapper[4909]: E0202 10:31:43.017113 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.075070 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.075109 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.075117 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.075132 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.075144 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:43Z","lastTransitionTime":"2026-02-02T10:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.177450 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.177484 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.177493 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.177507 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.177517 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:43Z","lastTransitionTime":"2026-02-02T10:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.237370 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" event={"ID":"9776f741-d318-4076-b337-a496344f1d2d","Type":"ContainerStarted","Data":"eeecae276ae8ad277f91928e326e0d0c165cffccbe19127c37881e72b694b4dc"} Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.262146 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.274516 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.279397 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.279442 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.279455 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.279473 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.279485 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:43Z","lastTransitionTime":"2026-02-02T10:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.288108 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.300990 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.313464 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.325673 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.341883 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.353582 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:
16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.365321 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.379077 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeecae276ae8ad277f91928e326e0d0c165cffccbe19127c37881e72b694b4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef336
6b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.381218 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.381246 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.381255 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.381268 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.381276 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:43Z","lastTransitionTime":"2026-02-02T10:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.393198 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.404849 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.415257 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.427760 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:43Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.482975 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.483015 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.483024 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.483039 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.483049 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:43Z","lastTransitionTime":"2026-02-02T10:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.585040 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.585075 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.585083 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.585097 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.585106 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:43Z","lastTransitionTime":"2026-02-02T10:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.687253 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.687285 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.687293 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.687305 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.687313 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:43Z","lastTransitionTime":"2026-02-02T10:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.788946 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.788979 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.788991 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.789008 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.789020 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:43Z","lastTransitionTime":"2026-02-02T10:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.891514 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.891551 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.891561 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.891575 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.891587 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:43Z","lastTransitionTime":"2026-02-02T10:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.979769 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 02:59:58.358950023 +0000 UTC Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.993433 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.993483 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.993495 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.993513 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:43 crc kubenswrapper[4909]: I0202 10:31:43.993523 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:43Z","lastTransitionTime":"2026-02-02T10:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.095491 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.095539 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.095550 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.095566 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.095577 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:44Z","lastTransitionTime":"2026-02-02T10:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.198563 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.198617 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.198628 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.198653 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.198665 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:44Z","lastTransitionTime":"2026-02-02T10:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.246533 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" event={"ID":"ca5084bc-8bd1-4964-9a52-384222fc8374","Type":"ContainerStarted","Data":"028426e559d59a825e657b4dd78ff5b936b39bfebb6bea30b2e88b224f4b7c6c"} Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.246961 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.259311 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-
02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.270497 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.271258 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.282384 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.294571 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.300744 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 
10:31:44.300791 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.300823 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.300842 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.300856 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:44Z","lastTransitionTime":"2026-02-02T10:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.308504 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.329684 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://028426e559d59a825e657b4dd78ff5b936b39bfebb6bea30b2e88b224f4b7c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.342936 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.358104 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.371483 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.386975 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeecae276ae8ad277f91928e326e0d0c165cffccbe19127c37881e72b694b4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef336
6b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.402681 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.402726 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.402740 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.402757 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.402768 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:44Z","lastTransitionTime":"2026-02-02T10:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.404569 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.418785 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.431097 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.442522 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.455665 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.466969 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.478102 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.500288 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.504932 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.504977 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.504991 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.505007 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.505019 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:44Z","lastTransitionTime":"2026-02-02T10:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.536718 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.548059 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.558730 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.572640 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.584092 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.601200 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://028426e559d59a825e657b4dd78ff5b936b39bfebb6bea30b2e88b224f4b7c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.607087 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.607122 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.607131 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.607144 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.607152 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:44Z","lastTransitionTime":"2026-02-02T10:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.621891 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.632065 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.641919 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.655414 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeecae276ae8ad277f91928e326e0d0c165cffccbe19127c37881e72b694b4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef336
6b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:44Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.708722 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.708762 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.708773 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.708788 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.708797 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:44Z","lastTransitionTime":"2026-02-02T10:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.811624 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.811683 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.811696 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.811715 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.811730 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:44Z","lastTransitionTime":"2026-02-02T10:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.914897 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.914960 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.914979 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.915018 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.915051 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:44Z","lastTransitionTime":"2026-02-02T10:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.980608 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 14:34:05.577498489 +0000 UTC Feb 02 10:31:44 crc kubenswrapper[4909]: I0202 10:31:44.987123 4909 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.015573 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.015640 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.015650 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:31:45 crc kubenswrapper[4909]: E0202 10:31:45.015828 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:31:45 crc kubenswrapper[4909]: E0202 10:31:45.015931 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:31:45 crc kubenswrapper[4909]: E0202 10:31:45.016243 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.017648 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.017688 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.017701 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.017719 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.017732 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:45Z","lastTransitionTime":"2026-02-02T10:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.032440 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.044246 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.057368 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.079041 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://028426e559d59a825e657b4dd78ff5b936b39bfebb6bea30b2e88b224f4b7c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.094144 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.110472 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.120711 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.120754 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.120764 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.120781 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.120791 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:45Z","lastTransitionTime":"2026-02-02T10:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.124885 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574ed97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.146311 4909 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeecae276ae8ad277f91928e326e0d0c165cffccbe19127c37881e72b694b4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.163976 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.176039 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.186986 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.196658 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.208730 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.218532 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T10:31:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.223983 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.224038 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.224056 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.224084 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.224163 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:45Z","lastTransitionTime":"2026-02-02T10:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.249711 4909 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.250214 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.273578 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.284643 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.296741 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.307441 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.320711 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.326331 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.326374 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.326384 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.326399 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.326411 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:45Z","lastTransitionTime":"2026-02-02T10:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.332632 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.343704 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.361021 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://028426e559d59a825e657b4dd78ff5b936b39bfebb6bea30b2e88b224f4b7c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.377716 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.388386 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.399306 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.411606 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeecae276ae8ad277f91928e326e0d0c165cffccbe19127c37881e72b694b4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef336
6b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.422102 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.429001 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.429043 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.429053 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.429072 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.429083 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:45Z","lastTransitionTime":"2026-02-02T10:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.431659 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.439745 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.531092 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.531129 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.531144 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.531161 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.531171 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:45Z","lastTransitionTime":"2026-02-02T10:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.634346 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.634414 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.634436 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.634465 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.634485 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:45Z","lastTransitionTime":"2026-02-02T10:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.736206 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.736243 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.736254 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.736269 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.736280 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:45Z","lastTransitionTime":"2026-02-02T10:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.886602 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.886640 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.886653 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.886669 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.886679 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:45Z","lastTransitionTime":"2026-02-02T10:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.981415 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 07:44:46.856081791 +0000 UTC Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.989059 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.989090 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.989098 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.989112 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:45 crc kubenswrapper[4909]: I0202 10:31:45.989122 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:45Z","lastTransitionTime":"2026-02-02T10:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.091395 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.091439 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.091455 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.091470 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.091480 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:46Z","lastTransitionTime":"2026-02-02T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.194209 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.194249 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.194260 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.194276 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.194286 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:46Z","lastTransitionTime":"2026-02-02T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.252075 4909 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.296325 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.296361 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.296370 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.296383 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.296392 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:46Z","lastTransitionTime":"2026-02-02T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.398105 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.398158 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.398171 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.398187 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.398200 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:46Z","lastTransitionTime":"2026-02-02T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.500409 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.500440 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.500448 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.500461 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.500469 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:46Z","lastTransitionTime":"2026-02-02T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.602799 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.602895 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.602909 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.602934 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.602950 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:46Z","lastTransitionTime":"2026-02-02T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.706173 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.706241 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.706251 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.706268 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.706277 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:46Z","lastTransitionTime":"2026-02-02T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.808330 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.808374 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.808382 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.808396 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.808404 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:46Z","lastTransitionTime":"2026-02-02T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.911100 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.911130 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.911139 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.911152 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.911160 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:46Z","lastTransitionTime":"2026-02-02T10:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:46 crc kubenswrapper[4909]: I0202 10:31:46.982421 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 06:17:21.368976452 +0000 UTC Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.013134 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.013166 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.013174 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.013187 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.013196 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:47Z","lastTransitionTime":"2026-02-02T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.015581 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.015648 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:31:47 crc kubenswrapper[4909]: E0202 10:31:47.015675 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:31:47 crc kubenswrapper[4909]: E0202 10:31:47.015757 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.015587 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:31:47 crc kubenswrapper[4909]: E0202 10:31:47.015986 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.115982 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.116016 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.116024 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.116037 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.116045 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:47Z","lastTransitionTime":"2026-02-02T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.218574 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.218941 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.219162 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.219533 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.219671 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:47Z","lastTransitionTime":"2026-02-02T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.256209 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-775zr_ca5084bc-8bd1-4964-9a52-384222fc8374/ovnkube-controller/0.log" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.258399 4909 generic.go:334] "Generic (PLEG): container finished" podID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerID="028426e559d59a825e657b4dd78ff5b936b39bfebb6bea30b2e88b224f4b7c6c" exitCode=1 Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.258440 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" event={"ID":"ca5084bc-8bd1-4964-9a52-384222fc8374","Type":"ContainerDied","Data":"028426e559d59a825e657b4dd78ff5b936b39bfebb6bea30b2e88b224f4b7c6c"} Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.259142 4909 scope.go:117] "RemoveContainer" containerID="028426e559d59a825e657b4dd78ff5b936b39bfebb6bea30b2e88b224f4b7c6c" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.285131 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.296730 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:
16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.307417 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.321621 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.321647 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.321654 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:47 crc 
kubenswrapper[4909]: I0202 10:31:47.321669 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.321678 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:47Z","lastTransitionTime":"2026-02-02T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.321630 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeecae276ae8ad277f91928e326e0d0c165cffccbe19127c37881e72b694b4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f
0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.334129 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.347246 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.357250 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.374375 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.392472 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metric
s-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-contro
ller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://028426e559d59a825e657b4dd78ff5b936b39bfebb6bea30b2e88b224f4b7c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://028426e559d59a825e657b4dd78ff5b936b39bfebb6bea30b2e88b224f4b7c6c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:31:46Z\\\",\\\"message\\\":\\\"pi/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:31:46.283400 6232 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:31:46.283949 6232 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:31:46.283981 6232 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:31:46.284005 6232 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 10:31:46.284010 6232 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0202 10:31:46.284020 6232 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0202 10:31:46.284021 6232 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:31:46.284031 6232 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0202 10:31:46.284032 6232 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0202 10:31:46.284039 6232 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0202 10:31:46.284040 6232 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 10:31:46.284048 6232 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 10:31:46.284081 6232 factory.go:656] Stopping watch factory\\\\nI0202 10:31:46.284093 6232 ovnkube.go:599] Stopped ovnkube\\\\nI0202 
10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.411017 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"contain
erID\\\":\\\"cri-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.426052 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.427523 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.427554 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.427566 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.427580 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.427590 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:47Z","lastTransitionTime":"2026-02-02T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.442776 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.454293 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.465879 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:47Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.529752 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:47 crc 
kubenswrapper[4909]: I0202 10:31:47.529793 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.529822 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.529838 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.529850 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:47Z","lastTransitionTime":"2026-02-02T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.632350 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.632390 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.632401 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.632415 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.632425 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:47Z","lastTransitionTime":"2026-02-02T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.734242 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.734277 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.734290 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.734305 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.734316 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:47Z","lastTransitionTime":"2026-02-02T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.836520 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.836774 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.836871 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.836958 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.837045 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:47Z","lastTransitionTime":"2026-02-02T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.939494 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.939526 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.939535 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.939549 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.939558 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:47Z","lastTransitionTime":"2026-02-02T10:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:47 crc kubenswrapper[4909]: I0202 10:31:47.983017 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 09:15:13.327229986 +0000 UTC Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.042070 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.042115 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.042126 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.042143 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.042155 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:48Z","lastTransitionTime":"2026-02-02T10:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.144542 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.144768 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.144843 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.144905 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.144989 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:48Z","lastTransitionTime":"2026-02-02T10:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.157994 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl"] Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.158412 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.159954 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.160080 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.179240 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.190454 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.200784 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.203341 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/47146704-6957-41b2-ae8b-866b5fb08c3e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-nfrsl\" (UID: \"47146704-6957-41b2-ae8b-866b5fb08c3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.203471 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/47146704-6957-41b2-ae8b-866b5fb08c3e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-nfrsl\" (UID: \"47146704-6957-41b2-ae8b-866b5fb08c3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.203647 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz5vc\" (UniqueName: \"kubernetes.io/projected/47146704-6957-41b2-ae8b-866b5fb08c3e-kube-api-access-lz5vc\") pod \"ovnkube-control-plane-749d76644c-nfrsl\" (UID: \"47146704-6957-41b2-ae8b-866b5fb08c3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.203782 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/47146704-6957-41b2-ae8b-866b5fb08c3e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-nfrsl\" (UID: \"47146704-6957-41b2-ae8b-866b5fb08c3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.218051 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeecae276ae8ad277f91928e326e0d0c165cffccbe19127c37881e72b694b4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef336
6b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.230142 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.242613 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.246484 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.246525 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.246534 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 
10:31:48.246548 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.246556 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:48Z","lastTransitionTime":"2026-02-02T10:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.253335 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888c
f2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.262980 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-775zr_ca5084bc-8bd1-4964-9a52-384222fc8374/ovnkube-controller/1.log" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.263527 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-775zr_ca5084bc-8bd1-4964-9a52-384222fc8374/ovnkube-controller/0.log" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.265874 4909 generic.go:334] "Generic (PLEG): container finished" podID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerID="1294eb236717378e869465b7abdb8373067a0db8b57e4511dbe4637356b7377a" exitCode=1 
Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.265912 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" event={"ID":"ca5084bc-8bd1-4964-9a52-384222fc8374","Type":"ContainerDied","Data":"1294eb236717378e869465b7abdb8373067a0db8b57e4511dbe4637356b7377a"} Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.265946 4909 scope.go:117] "RemoveContainer" containerID="028426e559d59a825e657b4dd78ff5b936b39bfebb6bea30b2e88b224f4b7c6c" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.266739 4909 scope.go:117] "RemoveContainer" containerID="1294eb236717378e869465b7abdb8373067a0db8b57e4511dbe4637356b7377a" Feb 02 10:31:48 crc kubenswrapper[4909]: E0202 10:31:48.266911 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-775zr_openshift-ovn-kubernetes(ca5084bc-8bd1-4964-9a52-384222fc8374)\"" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.270612 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.283661 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.295038 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.304648 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/47146704-6957-41b2-ae8b-866b5fb08c3e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-nfrsl\" (UID: \"47146704-6957-41b2-ae8b-866b5fb08c3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.304737 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/47146704-6957-41b2-ae8b-866b5fb08c3e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-nfrsl\" (UID: \"47146704-6957-41b2-ae8b-866b5fb08c3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.304756 4909 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/47146704-6957-41b2-ae8b-866b5fb08c3e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-nfrsl\" (UID: \"47146704-6957-41b2-ae8b-866b5fb08c3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.304777 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz5vc\" (UniqueName: \"kubernetes.io/projected/47146704-6957-41b2-ae8b-866b5fb08c3e-kube-api-access-lz5vc\") pod \"ovnkube-control-plane-749d76644c-nfrsl\" (UID: \"47146704-6957-41b2-ae8b-866b5fb08c3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.305572 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/47146704-6957-41b2-ae8b-866b5fb08c3e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-nfrsl\" (UID: \"47146704-6957-41b2-ae8b-866b5fb08c3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.305583 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/47146704-6957-41b2-ae8b-866b5fb08c3e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-nfrsl\" (UID: \"47146704-6957-41b2-ae8b-866b5fb08c3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.307727 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.309481 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/47146704-6957-41b2-ae8b-866b5fb08c3e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-nfrsl\" (UID: \"47146704-6957-41b2-ae8b-866b5fb08c3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.318176 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.321134 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz5vc\" (UniqueName: \"kubernetes.io/projected/47146704-6957-41b2-ae8b-866b5fb08c3e-kube-api-access-lz5vc\") pod \"ovnkube-control-plane-749d76644c-nfrsl\" (UID: \"47146704-6957-41b2-ae8b-866b5fb08c3e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.330323 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.345637 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://028426e559d59a825e657b4dd78ff5b936b39bfebb6bea30b2e88b224f4b7c6c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://028426e559d59a825e657b4dd78ff5b936b39bfebb6bea30b2e88b224f4b7c6c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:31:46Z\\\",\\\"message\\\":\\\"pi/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:31:46.283400 6232 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:31:46.283949 6232 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0202 10:31:46.283981 6232 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:31:46.284005 6232 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 10:31:46.284010 6232 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0202 10:31:46.284020 6232 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0202 10:31:46.284021 6232 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:31:46.284031 6232 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0202 10:31:46.284032 6232 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0202 10:31:46.284039 6232 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0202 10:31:46.284040 6232 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 10:31:46.284048 6232 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 10:31:46.284081 6232 factory.go:656] Stopping watch factory\\\\nI0202 10:31:46.284093 6232 ovnkube.go:599] Stopped ovnkube\\\\nI0202 
10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.349642 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.349697 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.349711 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.349732 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.349750 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:48Z","lastTransitionTime":"2026-02-02T10:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.358990 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47146704-6957-41b2-ae8b-866b5fb08c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfrsl\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.375268 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1294eb236717378e869465b7abdb8373067a0db8b57e4511dbe4637356b7377a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://028426e559d59a825e657b4dd78ff5b936b39bfebb6bea30b2e88b224f4b7c6c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:31:46Z\\\",\\\"message\\\":\\\"pi/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:31:46.283400 6232 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:31:46.283949 6232 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI0202 10:31:46.283981 6232 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:31:46.284005 6232 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 10:31:46.284010 6232 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0202 10:31:46.284020 6232 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0202 10:31:46.284021 6232 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:31:46.284031 6232 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0202 10:31:46.284032 6232 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0202 10:31:46.284039 6232 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0202 10:31:46.284040 6232 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 10:31:46.284048 6232 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 10:31:46.284081 6232 factory.go:656] Stopping watch factory\\\\nI0202 10:31:46.284093 6232 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1294eb236717378e869465b7abdb8373067a0db8b57e4511dbe4637356b7377a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"message\\\":\\\"cy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0202 10:31:48.071143 6356 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0202 10:31:48.071186 6356 lb_config.go:1031] Cluster endpoints for openshift-authentication/oauth-openshift for network=default are: map[]\\\\nF0202 10:31:48.071202 6356 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: fa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\
":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.385053 4909 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47146704-6957-41b2-ae8b-866b5fb08c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfrsl\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.397120 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.407946 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.418343 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.428167 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.439489 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.452090 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:48 crc 
kubenswrapper[4909]: I0202 10:31:48.452124 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.452132 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.452146 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.452155 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:48Z","lastTransitionTime":"2026-02-02T10:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.458302 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.470102 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.470861 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.480561 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.499570 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeecae276ae8ad277f91928e326e0d0c165cffccbe19127c37881e72b694b4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef336
6b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.512083 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.524156 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.534874 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.545866 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:48Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.554439 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.554474 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.554483 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.554496 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.554506 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:48Z","lastTransitionTime":"2026-02-02T10:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.656308 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.656349 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.656358 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.656374 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.656383 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:48Z","lastTransitionTime":"2026-02-02T10:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.758939 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.758995 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.759010 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.759032 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.759044 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:48Z","lastTransitionTime":"2026-02-02T10:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.861771 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.861838 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.861852 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.861873 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.861885 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:48Z","lastTransitionTime":"2026-02-02T10:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.964783 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.964846 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.964862 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.964882 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.964896 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:48Z","lastTransitionTime":"2026-02-02T10:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:48 crc kubenswrapper[4909]: I0202 10:31:48.984793 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 18:07:23.676634035 +0000 UTC Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.016431 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:31:49 crc kubenswrapper[4909]: E0202 10:31:49.016571 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.016651 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.016696 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:31:49 crc kubenswrapper[4909]: E0202 10:31:49.016976 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:31:49 crc kubenswrapper[4909]: E0202 10:31:49.017061 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.068386 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.068430 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.068440 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.068457 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.068469 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:49Z","lastTransitionTime":"2026-02-02T10:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.171761 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.171805 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.171833 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.171849 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.171859 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:49Z","lastTransitionTime":"2026-02-02T10:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.266565 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-ccs5q"] Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.267151 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ccs5q" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.268320 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-2v5vw"] Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.268784 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:31:49 crc kubenswrapper[4909]: E0202 10:31:49.268857 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.270310 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.270347 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.270477 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.271506 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.278258 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-775zr_ca5084bc-8bd1-4964-9a52-384222fc8374/ovnkube-controller/1.log" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.281516 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.281571 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.281584 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.281601 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.281612 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:49Z","lastTransitionTime":"2026-02-02T10:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.284557 4909 scope.go:117] "RemoveContainer" containerID="1294eb236717378e869465b7abdb8373067a0db8b57e4511dbe4637356b7377a" Feb 02 10:31:49 crc kubenswrapper[4909]: E0202 10:31:49.284744 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-775zr_openshift-ovn-kubernetes(ca5084bc-8bd1-4964-9a52-384222fc8374)\"" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.286331 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" event={"ID":"47146704-6957-41b2-ae8b-866b5fb08c3e","Type":"ContainerStarted","Data":"34f2669f75956a8f492a92214254f6614fa8a0da8feb9ecc763e776e16315b31"} Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.286390 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" 
event={"ID":"47146704-6957-41b2-ae8b-866b5fb08c3e","Type":"ContainerStarted","Data":"ddd562b5e61d91cbf79d43ade2df052504ba7682eac59fe8f6636146537372e3"} Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.286407 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" event={"ID":"47146704-6957-41b2-ae8b-866b5fb08c3e","Type":"ContainerStarted","Data":"086c57507720267856721c4ccb7248f9ee9839df4eade2f9e624daefcc34e142"} Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.296296 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuse
s\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.312315 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh52q\" (UniqueName: \"kubernetes.io/projected/de56cfec-f410-4c75-b58b-3f82cdc1c603-kube-api-access-wh52q\") pod \"node-ca-ccs5q\" (UID: \"de56cfec-f410-4c75-b58b-3f82cdc1c603\") " pod="openshift-image-registry/node-ca-ccs5q" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.312441 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/de56cfec-f410-4c75-b58b-3f82cdc1c603-serviceca\") pod \"node-ca-ccs5q\" 
(UID: \"de56cfec-f410-4c75-b58b-3f82cdc1c603\") " pod="openshift-image-registry/node-ca-ccs5q" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.312535 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj9t6\" (UniqueName: \"kubernetes.io/projected/0f457793-f4e0-4417-ae91-4455722372c1-kube-api-access-qj9t6\") pod \"network-metrics-daemon-2v5vw\" (UID: \"0f457793-f4e0-4417-ae91-4455722372c1\") " pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.312606 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f457793-f4e0-4417-ae91-4455722372c1-metrics-certs\") pod \"network-metrics-daemon-2v5vw\" (UID: \"0f457793-f4e0-4417-ae91-4455722372c1\") " pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.312652 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de56cfec-f410-4c75-b58b-3f82cdc1c603-host\") pod \"node-ca-ccs5q\" (UID: \"de56cfec-f410-4c75-b58b-3f82cdc1c603\") " pod="openshift-image-registry/node-ca-ccs5q" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.314139 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.328210 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.341519 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeecae276ae8ad277f91928e326e0d0c165cffccbe19127c37881e72b694b4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef336
6b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.353198 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.363483 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.373398 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.384104 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.384192 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.384206 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.384224 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.384237 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:49Z","lastTransitionTime":"2026-02-02T10:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.385594 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ccs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de56cfec-f410-4c75-b58b-3f82cdc1c603\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wh52q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ccs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.398834 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.413049 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de56cfec-f410-4c75-b58b-3f82cdc1c603-host\") pod \"node-ca-ccs5q\" (UID: \"de56cfec-f410-4c75-b58b-3f82cdc1c603\") " pod="openshift-image-registry/node-ca-ccs5q" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.413291 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh52q\" (UniqueName: \"kubernetes.io/projected/de56cfec-f410-4c75-b58b-3f82cdc1c603-kube-api-access-wh52q\") pod \"node-ca-ccs5q\" (UID: \"de56cfec-f410-4c75-b58b-3f82cdc1c603\") " pod="openshift-image-registry/node-ca-ccs5q" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.413385 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/de56cfec-f410-4c75-b58b-3f82cdc1c603-serviceca\") pod \"node-ca-ccs5q\" (UID: \"de56cfec-f410-4c75-b58b-3f82cdc1c603\") " pod="openshift-image-registry/node-ca-ccs5q" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.413491 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj9t6\" (UniqueName: \"kubernetes.io/projected/0f457793-f4e0-4417-ae91-4455722372c1-kube-api-access-qj9t6\") pod \"network-metrics-daemon-2v5vw\" (UID: \"0f457793-f4e0-4417-ae91-4455722372c1\") " pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.413670 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f457793-f4e0-4417-ae91-4455722372c1-metrics-certs\") pod \"network-metrics-daemon-2v5vw\" (UID: \"0f457793-f4e0-4417-ae91-4455722372c1\") " 
pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.413275 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.413301 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de56cfec-f410-4c75-b58b-3f82cdc1c603-host\") pod \"node-ca-ccs5q\" (UID: \"de56cfec-f410-4c75-b58b-3f82cdc1c603\") " pod="openshift-image-registry/node-ca-ccs5q" Feb 02 10:31:49 crc kubenswrapper[4909]: E0202 10:31:49.413768 4909 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:31:49 crc kubenswrapper[4909]: E0202 10:31:49.414180 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f457793-f4e0-4417-ae91-4455722372c1-metrics-certs podName:0f457793-f4e0-4417-ae91-4455722372c1 nodeName:}" failed. No retries permitted until 2026-02-02 10:31:49.914162507 +0000 UTC m=+35.660263242 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f457793-f4e0-4417-ae91-4455722372c1-metrics-certs") pod "network-metrics-daemon-2v5vw" (UID: "0f457793-f4e0-4417-ae91-4455722372c1") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.415766 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/de56cfec-f410-4c75-b58b-3f82cdc1c603-serviceca\") pod \"node-ca-ccs5q\" (UID: \"de56cfec-f410-4c75-b58b-3f82cdc1c603\") " pod="openshift-image-registry/node-ca-ccs5q" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.425913 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.434926 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh52q\" (UniqueName: \"kubernetes.io/projected/de56cfec-f410-4c75-b58b-3f82cdc1c603-kube-api-access-wh52q\") pod \"node-ca-ccs5q\" (UID: \"de56cfec-f410-4c75-b58b-3f82cdc1c603\") " pod="openshift-image-registry/node-ca-ccs5q" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.435986 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj9t6\" 
(UniqueName: \"kubernetes.io/projected/0f457793-f4e0-4417-ae91-4455722372c1-kube-api-access-qj9t6\") pod \"network-metrics-daemon-2v5vw\" (UID: \"0f457793-f4e0-4417-ae91-4455722372c1\") " pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.439626 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.452509 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.464063 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.480577 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1294eb236717378e869465b7abdb8373067a0db8b57e4511dbe4637356b7377a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://028426e559d59a825e657b4dd78ff5b936b39bfebb6bea30b2e88b224f4b7c6c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:31:46Z\\\",\\\"message\\\":\\\"pi/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:31:46.283400 6232 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:31:46.283949 6232 handler.go:190] Sending *v1.EgressIP event handler 8 for 
removal\\\\nI0202 10:31:46.283981 6232 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0202 10:31:46.284005 6232 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0202 10:31:46.284010 6232 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0202 10:31:46.284020 6232 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0202 10:31:46.284021 6232 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0202 10:31:46.284031 6232 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0202 10:31:46.284032 6232 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0202 10:31:46.284039 6232 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0202 10:31:46.284040 6232 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0202 10:31:46.284048 6232 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0202 10:31:46.284081 6232 factory.go:656] Stopping watch factory\\\\nI0202 10:31:46.284093 6232 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1294eb236717378e869465b7abdb8373067a0db8b57e4511dbe4637356b7377a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"message\\\":\\\"cy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0202 10:31:48.071143 6356 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0202 10:31:48.071186 6356 lb_config.go:1031] Cluster endpoints for openshift-authentication/oauth-openshift for network=default are: map[]\\\\nF0202 10:31:48.071202 6356 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: fa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\
":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.486348 4909 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.486390 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.486402 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.486421 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.486433 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:49Z","lastTransitionTime":"2026-02-02T10:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.490501 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47146704-6957-41b2-ae8b-866b5fb08c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfrsl\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.503975 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.515729 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.527024 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.543185 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1294eb236717378e869465b7abdb8373067a0db8b57e4511dbe4637356b7377a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1294eb236717378e869465b7abdb8373067a0db8b57e4511dbe4637356b7377a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"message\\\":\\\"cy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0202 10:31:48.071143 6356 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} 
options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0202 10:31:48.071186 6356 lb_config.go:1031] Cluster endpoints for openshift-authentication/oauth-openshift for network=default are: map[]\\\\nF0202 10:31:48.071202 6356 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: fa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-775zr_openshift-ovn-kubernetes(ca5084bc-8bd1-4964-9a52-384222fc8374)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9e
b2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.553015 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47146704-6957-41b2-ae8b-866b5fb08c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd562b5e61d91cbf79d43ade2df052504ba7682eac59fe8f6636146537372e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34f2669f75956a8f492a92214254f6614fa8a
0da8feb9ecc763e776e16315b31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.562573 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v5vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f457793-f4e0-4417-ae91-4455722372c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v5vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc 
kubenswrapper[4909]: I0202 10:31:49.574507 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.579591 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ccs5q" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.589220 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.589256 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.589270 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.589288 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.589300 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:49Z","lastTransitionTime":"2026-02-02T10:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:49 crc kubenswrapper[4909]: W0202 10:31:49.590920 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde56cfec_f410_4c75_b58b_3f82cdc1c603.slice/crio-b4ddc5e8b8c3a1382ce049796faa602939c61d969895f8f880a88e912911c4a5 WatchSource:0}: Error finding container b4ddc5e8b8c3a1382ce049796faa602939c61d969895f8f880a88e912911c4a5: Status 404 returned error can't find the container with id b4ddc5e8b8c3a1382ce049796faa602939c61d969895f8f880a88e912911c4a5 Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.592287 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.609565 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.624447 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeecae276ae8ad277f91928e326e0d0c165cffccbe19127c37881e72b694b4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef336
6b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.643184 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a
9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.655268 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.665869 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.674726 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.685027 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ccs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de56cfec-f410-4c75-b58b-3f82cdc1c603\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wh52q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ccs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.691878 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.691920 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.691934 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 
10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.691951 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.692258 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:49Z","lastTransitionTime":"2026-02-02T10:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.700267 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operat
or\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.715531 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T10:31:49Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.794366 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.794407 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.794418 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.794433 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.794457 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:49Z","lastTransitionTime":"2026-02-02T10:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.897109 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.897156 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.897165 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.897181 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.897192 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:49Z","lastTransitionTime":"2026-02-02T10:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.917803 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f457793-f4e0-4417-ae91-4455722372c1-metrics-certs\") pod \"network-metrics-daemon-2v5vw\" (UID: \"0f457793-f4e0-4417-ae91-4455722372c1\") " pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:31:49 crc kubenswrapper[4909]: E0202 10:31:49.917962 4909 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:31:49 crc kubenswrapper[4909]: E0202 10:31:49.918054 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f457793-f4e0-4417-ae91-4455722372c1-metrics-certs podName:0f457793-f4e0-4417-ae91-4455722372c1 nodeName:}" failed. No retries permitted until 2026-02-02 10:31:50.91803678 +0000 UTC m=+36.664137525 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f457793-f4e0-4417-ae91-4455722372c1-metrics-certs") pod "network-metrics-daemon-2v5vw" (UID: "0f457793-f4e0-4417-ae91-4455722372c1") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.985635 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 05:52:19.04950891 +0000 UTC Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.999433 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.999474 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.999485 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.999500 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:49 crc kubenswrapper[4909]: I0202 10:31:49.999512 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:49Z","lastTransitionTime":"2026-02-02T10:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.102329 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.102367 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.102379 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.102396 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.102407 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:50Z","lastTransitionTime":"2026-02-02T10:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.205387 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.205421 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.205431 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.205447 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.205455 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:50Z","lastTransitionTime":"2026-02-02T10:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.290330 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ccs5q" event={"ID":"de56cfec-f410-4c75-b58b-3f82cdc1c603","Type":"ContainerStarted","Data":"1c1e2ee3f45e60a381848b4131a6f150071339ab9f36e7ee2b7279cb626509a8"} Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.290392 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ccs5q" event={"ID":"de56cfec-f410-4c75-b58b-3f82cdc1c603","Type":"ContainerStarted","Data":"b4ddc5e8b8c3a1382ce049796faa602939c61d969895f8f880a88e912911c4a5"} Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.301632 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.307380 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.307413 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.307422 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.307436 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.307445 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:50Z","lastTransitionTime":"2026-02-02T10:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.313161 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v5vw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f457793-f4e0-4417-ae91-4455722372c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v5vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:50 crc 
kubenswrapper[4909]: I0202 10:31:50.326240 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.337883 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.350246 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.378706 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.400106 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.409852 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:50 crc 
kubenswrapper[4909]: I0202 10:31:50.409885 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.409895 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.409909 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.409920 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:50Z","lastTransitionTime":"2026-02-02T10:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.427036 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1294eb236717378e869465b7abdb8373067a0db8b57e4511dbe4637356b7377a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1294eb236717378e869465b7abdb8373067a0db8b57e4511dbe4637356b7377a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"message\\\":\\\"cy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus
{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0202 10:31:48.071143 6356 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0202 10:31:48.071186 6356 lb_config.go:1031] Cluster endpoints for openshift-authentication/oauth-openshift for network=default are: map[]\\\\nF0202 10:31:48.071202 6356 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: fa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-775zr_openshift-ovn-kubernetes(ca5084bc-8bd1-4964-9a52-384222fc8374)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9e
b2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.437490 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47146704-6957-41b2-ae8b-866b5fb08c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd562b5e61d91cbf79d43ade2df052504ba7682eac59fe8f6636146537372e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34f2669f75956a8f492a92214254f6614fa8a
0da8feb9ecc763e776e16315b31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.454453 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.465244 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:
16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.474672 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.486867 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeecae276ae8ad277f91928e326e0d0c165cffccbe19127c37881e72b694b4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef336
6b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.490865 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.491535 4909 scope.go:117] "RemoveContainer" containerID="1294eb236717378e869465b7abdb8373067a0db8b57e4511dbe4637356b7377a" Feb 02 10:31:50 crc kubenswrapper[4909]: E0202 10:31:50.491747 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-775zr_openshift-ovn-kubernetes(ca5084bc-8bd1-4964-9a52-384222fc8374)\"" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.497362 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.506531 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.512164 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.512204 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.512213 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.512226 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.512237 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:50Z","lastTransitionTime":"2026-02-02T10:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.515243 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.525762 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ccs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de56cfec-f410-4c75-b58b-3f82cdc1c603\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1e2ee3f45e60a3
81848b4131a6f150071339ab9f36e7ee2b7279cb626509a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wh52q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ccs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:50Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.614592 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.614623 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.614632 4909 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.614646 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.614654 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:50Z","lastTransitionTime":"2026-02-02T10:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.625263 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:31:50 crc kubenswrapper[4909]: E0202 10:31:50.625339 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:32:06.625323761 +0000 UTC m=+52.371424496 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.717175 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.717221 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.717237 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.717262 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.717272 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:50Z","lastTransitionTime":"2026-02-02T10:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.726739 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.726880 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.726933 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.726965 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:31:50 crc kubenswrapper[4909]: E0202 10:31:50.726936 4909 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:31:50 crc kubenswrapper[4909]: E0202 10:31:50.727101 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:06.7270752 +0000 UTC m=+52.473176115 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:31:50 crc kubenswrapper[4909]: E0202 10:31:50.727004 4909 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:31:50 crc kubenswrapper[4909]: E0202 10:31:50.727119 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:31:50 crc kubenswrapper[4909]: E0202 10:31:50.727161 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:31:50 crc kubenswrapper[4909]: E0202 10:31:50.727165 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:06.727147442 +0000 UTC m=+52.473248167 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:31:50 crc kubenswrapper[4909]: E0202 10:31:50.727166 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:31:50 crc kubenswrapper[4909]: E0202 10:31:50.727196 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:31:50 crc kubenswrapper[4909]: E0202 10:31:50.727213 4909 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:31:50 crc kubenswrapper[4909]: E0202 10:31:50.727176 4909 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:31:50 crc kubenswrapper[4909]: E0202 10:31:50.727290 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:06.727264605 +0000 UTC m=+52.473365530 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:31:50 crc kubenswrapper[4909]: E0202 10:31:50.727337 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:06.727315007 +0000 UTC m=+52.473415932 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.819703 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.819759 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.819783 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.819801 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.819831 4909 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:50Z","lastTransitionTime":"2026-02-02T10:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.922430 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.922464 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.922473 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.922484 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.922492 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:50Z","lastTransitionTime":"2026-02-02T10:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.928174 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f457793-f4e0-4417-ae91-4455722372c1-metrics-certs\") pod \"network-metrics-daemon-2v5vw\" (UID: \"0f457793-f4e0-4417-ae91-4455722372c1\") " pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:31:50 crc kubenswrapper[4909]: E0202 10:31:50.928373 4909 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:31:50 crc kubenswrapper[4909]: E0202 10:31:50.928466 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f457793-f4e0-4417-ae91-4455722372c1-metrics-certs podName:0f457793-f4e0-4417-ae91-4455722372c1 nodeName:}" failed. No retries permitted until 2026-02-02 10:31:52.928444046 +0000 UTC m=+38.674544881 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f457793-f4e0-4417-ae91-4455722372c1-metrics-certs") pod "network-metrics-daemon-2v5vw" (UID: "0f457793-f4e0-4417-ae91-4455722372c1") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:31:50 crc kubenswrapper[4909]: I0202 10:31:50.986028 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 05:59:56.449149835 +0000 UTC Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.016486 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.016537 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.016575 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:31:51 crc kubenswrapper[4909]: E0202 10:31:51.017073 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.016647 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:31:51 crc kubenswrapper[4909]: E0202 10:31:51.017138 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:31:51 crc kubenswrapper[4909]: E0202 10:31:51.016845 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:31:51 crc kubenswrapper[4909]: E0202 10:31:51.017306 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.024727 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.024759 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.024773 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.024790 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.024801 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:51Z","lastTransitionTime":"2026-02-02T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.128973 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.129026 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.129044 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.129069 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.129089 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:51Z","lastTransitionTime":"2026-02-02T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.232429 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.232479 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.232492 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.232516 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.232530 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:51Z","lastTransitionTime":"2026-02-02T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.263255 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.263296 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.263305 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.263319 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.263332 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:51Z","lastTransitionTime":"2026-02-02T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:51 crc kubenswrapper[4909]: E0202 10:31:51.276495 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:51Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.280770 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.280903 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.280927 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.280960 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.280997 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:51Z","lastTransitionTime":"2026-02-02T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:51 crc kubenswrapper[4909]: E0202 10:31:51.302365 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:51Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.306143 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.306183 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.306195 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.306212 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.306226 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:51Z","lastTransitionTime":"2026-02-02T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:51 crc kubenswrapper[4909]: E0202 10:31:51.318922 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:51Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.323276 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.323637 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.323881 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.324062 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.324205 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:51Z","lastTransitionTime":"2026-02-02T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:51 crc kubenswrapper[4909]: E0202 10:31:51.339320 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:51Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.344926 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.344968 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.344977 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.344991 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.345000 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:51Z","lastTransitionTime":"2026-02-02T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:51 crc kubenswrapper[4909]: E0202 10:31:51.360750 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:51Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:51 crc kubenswrapper[4909]: E0202 10:31:51.360883 4909 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.362761 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.362848 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.362868 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.362902 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.362919 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:51Z","lastTransitionTime":"2026-02-02T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.466264 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.466360 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.466375 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.466402 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.466412 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:51Z","lastTransitionTime":"2026-02-02T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.569782 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.569861 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.569872 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.569902 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.569917 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:51Z","lastTransitionTime":"2026-02-02T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.673264 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.673348 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.673384 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.673415 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.673437 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:51Z","lastTransitionTime":"2026-02-02T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.776360 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.776407 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.776422 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.776446 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.776560 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:51Z","lastTransitionTime":"2026-02-02T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.879316 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.879349 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.879360 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.879375 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.879386 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:51Z","lastTransitionTime":"2026-02-02T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.982587 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.982642 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.982655 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.982677 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.982693 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:51Z","lastTransitionTime":"2026-02-02T10:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:51 crc kubenswrapper[4909]: I0202 10:31:51.986864 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 09:31:42.560726327 +0000 UTC Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.084536 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.084688 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.084705 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.084732 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.084752 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:52Z","lastTransitionTime":"2026-02-02T10:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.187476 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.187519 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.187528 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.187545 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.187556 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:52Z","lastTransitionTime":"2026-02-02T10:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.290569 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.290623 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.290635 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.290656 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.290669 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:52Z","lastTransitionTime":"2026-02-02T10:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.393515 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.393592 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.393616 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.393643 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.393668 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:52Z","lastTransitionTime":"2026-02-02T10:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.497158 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.497237 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.497260 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.497290 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.497312 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:52Z","lastTransitionTime":"2026-02-02T10:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.599951 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.599988 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.599996 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.600008 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.600018 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:52Z","lastTransitionTime":"2026-02-02T10:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.702358 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.702489 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.702498 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.702515 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.702527 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:52Z","lastTransitionTime":"2026-02-02T10:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.805654 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.805701 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.805712 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.805730 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.805743 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:52Z","lastTransitionTime":"2026-02-02T10:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.909620 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.909663 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.909675 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.909689 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.909701 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:52Z","lastTransitionTime":"2026-02-02T10:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.952346 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f457793-f4e0-4417-ae91-4455722372c1-metrics-certs\") pod \"network-metrics-daemon-2v5vw\" (UID: \"0f457793-f4e0-4417-ae91-4455722372c1\") " pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:31:52 crc kubenswrapper[4909]: E0202 10:31:52.952631 4909 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:31:52 crc kubenswrapper[4909]: E0202 10:31:52.952776 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f457793-f4e0-4417-ae91-4455722372c1-metrics-certs podName:0f457793-f4e0-4417-ae91-4455722372c1 nodeName:}" failed. No retries permitted until 2026-02-02 10:31:56.952742679 +0000 UTC m=+42.698843614 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f457793-f4e0-4417-ae91-4455722372c1-metrics-certs") pod "network-metrics-daemon-2v5vw" (UID: "0f457793-f4e0-4417-ae91-4455722372c1") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:31:52 crc kubenswrapper[4909]: I0202 10:31:52.987904 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 10:17:35.548294537 +0000 UTC Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.011883 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.011935 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.011955 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.011970 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.011979 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:53Z","lastTransitionTime":"2026-02-02T10:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.016399 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.016445 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:31:53 crc kubenswrapper[4909]: E0202 10:31:53.016502 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.016410 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:31:53 crc kubenswrapper[4909]: E0202 10:31:53.016621 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.016715 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:31:53 crc kubenswrapper[4909]: E0202 10:31:53.016777 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:31:53 crc kubenswrapper[4909]: E0202 10:31:53.016974 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.114956 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.115016 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.115029 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.115051 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.115063 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:53Z","lastTransitionTime":"2026-02-02T10:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.218030 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.218080 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.218091 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.218108 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.218121 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:53Z","lastTransitionTime":"2026-02-02T10:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.320677 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.320717 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.320729 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.320744 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.320756 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:53Z","lastTransitionTime":"2026-02-02T10:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.423332 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.423373 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.423388 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.423403 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.423412 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:53Z","lastTransitionTime":"2026-02-02T10:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.526166 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.526213 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.526221 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.526237 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.526247 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:53Z","lastTransitionTime":"2026-02-02T10:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.629199 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.629442 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.629525 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.629646 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.629728 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:53Z","lastTransitionTime":"2026-02-02T10:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.732728 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.732760 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.732768 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.732781 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.732790 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:53Z","lastTransitionTime":"2026-02-02T10:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.835421 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.835454 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.835462 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.835476 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.835484 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:53Z","lastTransitionTime":"2026-02-02T10:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.938474 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.938513 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.938526 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.938541 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.938550 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:53Z","lastTransitionTime":"2026-02-02T10:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:53 crc kubenswrapper[4909]: I0202 10:31:53.988527 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 12:54:03.545644697 +0000 UTC Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.040583 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.040619 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.040627 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.040641 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.040650 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:54Z","lastTransitionTime":"2026-02-02T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.143215 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.143249 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.143270 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.143294 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.143306 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:54Z","lastTransitionTime":"2026-02-02T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.245487 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.245523 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.245532 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.245546 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.245554 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:54Z","lastTransitionTime":"2026-02-02T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.347388 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.347460 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.347472 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.347487 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.347498 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:54Z","lastTransitionTime":"2026-02-02T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.449542 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.449581 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.449590 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.449603 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.449613 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:54Z","lastTransitionTime":"2026-02-02T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.551997 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.552031 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.552039 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.552054 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.552063 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:54Z","lastTransitionTime":"2026-02-02T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.654047 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.654090 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.654101 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.654119 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.654129 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:54Z","lastTransitionTime":"2026-02-02T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.756908 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.756974 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.756984 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.757002 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.757014 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:54Z","lastTransitionTime":"2026-02-02T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.859422 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.859459 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.859470 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.859487 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.859495 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:54Z","lastTransitionTime":"2026-02-02T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.961541 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.961574 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.961582 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.961595 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.961603 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:54Z","lastTransitionTime":"2026-02-02T10:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:54 crc kubenswrapper[4909]: I0202 10:31:54.989562 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 03:10:22.740751546 +0000 UTC Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.016994 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.017076 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.017075 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:31:55 crc kubenswrapper[4909]: E0202 10:31:55.017217 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.017290 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:31:55 crc kubenswrapper[4909]: E0202 10:31:55.017435 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:31:55 crc kubenswrapper[4909]: E0202 10:31:55.017499 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:31:55 crc kubenswrapper[4909]: E0202 10:31:55.017544 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.028726 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.037689 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.046525 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ccs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de56cfec-f410-4c75-b58b-3f82cdc1c603\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1e2ee3f45e60a381848b4131a6f150071339ab9f36e7ee2b7279cb626509a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wh52q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ccs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.058767 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.063143 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.063169 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.063177 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.063191 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.063199 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:55Z","lastTransitionTime":"2026-02-02T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.070792 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.082340 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-conf
ig\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.097167 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.108636 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.129315 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1294eb236717378e869465b7abdb8373067a0db8b57e4511dbe4637356b7377a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1294eb236717378e869465b7abdb8373067a0db8b57e4511dbe4637356b7377a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"message\\\":\\\"cy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0202 10:31:48.071143 6356 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} 
options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0202 10:31:48.071186 6356 lb_config.go:1031] Cluster endpoints for openshift-authentication/oauth-openshift for network=default are: map[]\\\\nF0202 10:31:48.071202 6356 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: fa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-775zr_openshift-ovn-kubernetes(ca5084bc-8bd1-4964-9a52-384222fc8374)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9e
b2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.142475 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47146704-6957-41b2-ae8b-866b5fb08c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd562b5e61d91cbf79d43ade2df052504ba7682eac59fe8f6636146537372e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34f2669f75956a8f492a92214254f6614fa8a
0da8feb9ecc763e776e16315b31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.152167 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v5vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f457793-f4e0-4417-ae91-4455722372c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v5vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:55 crc 
kubenswrapper[4909]: I0202 10:31:55.162768 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.165200 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.165243 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.165253 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.165270 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.165282 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:55Z","lastTransitionTime":"2026-02-02T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.172834 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.183023 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.197723 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeecae276ae8ad277f91928e326e0d0c165cffccbe19127c37881e72b694b4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef336
6b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.214750 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a
9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.227238 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:55Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.267408 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.267659 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.267728 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.267796 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.267879 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:55Z","lastTransitionTime":"2026-02-02T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.370230 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.370281 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.370292 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.370304 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.370313 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:55Z","lastTransitionTime":"2026-02-02T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.472754 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.473081 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.473172 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.473277 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.473365 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:55Z","lastTransitionTime":"2026-02-02T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.575951 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.576190 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.576268 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.576379 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.576526 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:55Z","lastTransitionTime":"2026-02-02T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.678908 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.679120 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.679178 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.679234 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.679287 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:55Z","lastTransitionTime":"2026-02-02T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.781638 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.781936 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.782028 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.782131 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.782193 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:55Z","lastTransitionTime":"2026-02-02T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.884252 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.884284 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.884292 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.884502 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.884516 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:55Z","lastTransitionTime":"2026-02-02T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.987465 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.987746 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.987972 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.988064 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.988316 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:55Z","lastTransitionTime":"2026-02-02T10:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:55 crc kubenswrapper[4909]: I0202 10:31:55.990598 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 20:29:54.205865807 +0000 UTC Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.016461 4909 scope.go:117] "RemoveContainer" containerID="98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.091506 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.091541 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.091549 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.091565 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.091591 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:56Z","lastTransitionTime":"2026-02-02T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.194667 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.195129 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.195142 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.195161 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.195174 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:56Z","lastTransitionTime":"2026-02-02T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.297443 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.297494 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.297506 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.297522 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.297533 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:56Z","lastTransitionTime":"2026-02-02T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.310022 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.311392 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"19ef42c996e51c799b8d1a2ab415c176e6cad28f2363892ec7d72678faaf5630"} Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.311744 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.324793 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ba
a38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.337249 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.346680 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.355950 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ccs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de56cfec-f410-4c75-b58b-3f82cdc1c603\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1e2ee3f45e60a381848b4131a6f150071339ab9f36e7ee2b7279cb626509a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wh52q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ccs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.367100 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T10:31:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.378788 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ef42c996e51c799b8d1a2ab415c176e6cad28f2363892ec7d72678faaf5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.391025 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.402505 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.402546 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.402557 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:56 crc 
kubenswrapper[4909]: I0202 10:31:56.402572 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.402582 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:56Z","lastTransitionTime":"2026-02-02T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.405584 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 
10:31:56.418162 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.430551 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.449714 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1294eb236717378e869465b7abdb8373067a0db8b57e4511dbe4637356b7377a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1294eb236717378e869465b7abdb8373067a0db8b57e4511dbe4637356b7377a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"message\\\":\\\"cy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0202 10:31:48.071143 6356 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} 
options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0202 10:31:48.071186 6356 lb_config.go:1031] Cluster endpoints for openshift-authentication/oauth-openshift for network=default are: map[]\\\\nF0202 10:31:48.071202 6356 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: fa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-775zr_openshift-ovn-kubernetes(ca5084bc-8bd1-4964-9a52-384222fc8374)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9e
b2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.460058 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47146704-6957-41b2-ae8b-866b5fb08c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd562b5e61d91cbf79d43ade2df052504ba7682eac59fe8f6636146537372e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34f2669f75956a8f492a92214254f6614fa8a
0da8feb9ecc763e776e16315b31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.469343 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v5vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f457793-f4e0-4417-ae91-4455722372c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v5vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:56 crc 
kubenswrapper[4909]: I0202 10:31:56.485851 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.497071 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.504931 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.504964 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.504972 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.504988 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.504997 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:56Z","lastTransitionTime":"2026-02-02T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.507739 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc
/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574ed97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.520924 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeecae276ae8ad277f91928e326e0d0c165cffccbe19127c37881e72b694b4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef336
6b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:31:56Z is after 2025-08-24T17:21:41Z" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.607288 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.607328 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.607336 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.607350 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.607363 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:56Z","lastTransitionTime":"2026-02-02T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.710851 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.710888 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.710898 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.710920 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.710931 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:56Z","lastTransitionTime":"2026-02-02T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.813183 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.813229 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.813239 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.813255 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.813272 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:56Z","lastTransitionTime":"2026-02-02T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.916618 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.916673 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.916687 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.916717 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.916730 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:56Z","lastTransitionTime":"2026-02-02T10:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.990723 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 19:28:22.824994427 +0000 UTC Feb 02 10:31:56 crc kubenswrapper[4909]: I0202 10:31:56.991765 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f457793-f4e0-4417-ae91-4455722372c1-metrics-certs\") pod \"network-metrics-daemon-2v5vw\" (UID: \"0f457793-f4e0-4417-ae91-4455722372c1\") " pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:31:56 crc kubenswrapper[4909]: E0202 10:31:56.991898 4909 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:31:56 crc kubenswrapper[4909]: E0202 10:31:56.991955 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f457793-f4e0-4417-ae91-4455722372c1-metrics-certs podName:0f457793-f4e0-4417-ae91-4455722372c1 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:04.991940123 +0000 UTC m=+50.738040858 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f457793-f4e0-4417-ae91-4455722372c1-metrics-certs") pod "network-metrics-daemon-2v5vw" (UID: "0f457793-f4e0-4417-ae91-4455722372c1") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.015800 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.015866 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.015945 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:31:57 crc kubenswrapper[4909]: E0202 10:31:57.015939 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:31:57 crc kubenswrapper[4909]: E0202 10:31:57.016032 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:31:57 crc kubenswrapper[4909]: E0202 10:31:57.016095 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.016215 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:31:57 crc kubenswrapper[4909]: E0202 10:31:57.016270 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.019636 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.019659 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.019669 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.019683 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.019692 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:57Z","lastTransitionTime":"2026-02-02T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.122303 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.122334 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.122342 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.122357 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.122367 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:57Z","lastTransitionTime":"2026-02-02T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.224278 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.224361 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.224381 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.224406 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.224423 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:57Z","lastTransitionTime":"2026-02-02T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.326121 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.326157 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.326168 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.326183 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.326193 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:57Z","lastTransitionTime":"2026-02-02T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.428134 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.428172 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.428182 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.428199 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.428209 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:57Z","lastTransitionTime":"2026-02-02T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.530367 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.530404 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.530416 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.530434 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.530449 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:57Z","lastTransitionTime":"2026-02-02T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.632704 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.632774 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.632798 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.632865 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.632889 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:57Z","lastTransitionTime":"2026-02-02T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.735088 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.735120 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.735132 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.735146 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.735157 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:57Z","lastTransitionTime":"2026-02-02T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.837690 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.837741 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.837756 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.837779 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.837799 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:57Z","lastTransitionTime":"2026-02-02T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.940649 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.940704 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.940721 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.940744 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.940776 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:57Z","lastTransitionTime":"2026-02-02T10:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:57 crc kubenswrapper[4909]: I0202 10:31:57.991789 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 16:18:35.397001482 +0000 UTC Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.043677 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.043723 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.043733 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.043746 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.043755 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:58Z","lastTransitionTime":"2026-02-02T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.145537 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.145575 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.145584 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.145598 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.145608 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:58Z","lastTransitionTime":"2026-02-02T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.247755 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.247790 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.247798 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.247830 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.247841 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:58Z","lastTransitionTime":"2026-02-02T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.349702 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.349737 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.349745 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.349757 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.349765 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:58Z","lastTransitionTime":"2026-02-02T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.452318 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.452361 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.452370 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.452383 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.452392 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:58Z","lastTransitionTime":"2026-02-02T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.554622 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.554689 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.554712 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.554741 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.554765 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:58Z","lastTransitionTime":"2026-02-02T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.657474 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.657513 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.657557 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.657578 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.657592 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:58Z","lastTransitionTime":"2026-02-02T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.760195 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.760225 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.760234 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.760248 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.760257 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:58Z","lastTransitionTime":"2026-02-02T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.862231 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.862275 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.862287 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.862306 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.862315 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:58Z","lastTransitionTime":"2026-02-02T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.964945 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.964997 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.965006 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.965020 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.965029 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:58Z","lastTransitionTime":"2026-02-02T10:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:31:58 crc kubenswrapper[4909]: I0202 10:31:58.992243 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 14:40:10.75891464 +0000 UTC Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.015985 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.016046 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:31:59 crc kubenswrapper[4909]: E0202 10:31:59.016103 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.016114 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.016158 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:31:59 crc kubenswrapper[4909]: E0202 10:31:59.016178 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:31:59 crc kubenswrapper[4909]: E0202 10:31:59.016246 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:31:59 crc kubenswrapper[4909]: E0202 10:31:59.016447 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.067775 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.067829 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.067840 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.067856 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.067868 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:59Z","lastTransitionTime":"2026-02-02T10:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.173742 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.173792 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.173823 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.173844 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.173857 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:59Z","lastTransitionTime":"2026-02-02T10:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.276372 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.276412 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.276420 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.276435 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.276446 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:59Z","lastTransitionTime":"2026-02-02T10:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.379290 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.379325 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.379335 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.379354 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.379364 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:59Z","lastTransitionTime":"2026-02-02T10:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.481975 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.482005 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.482016 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.482030 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.482040 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:59Z","lastTransitionTime":"2026-02-02T10:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.584243 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.584286 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.584297 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.584313 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.584324 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:59Z","lastTransitionTime":"2026-02-02T10:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.686738 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.686780 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.686790 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.686826 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.686835 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:59Z","lastTransitionTime":"2026-02-02T10:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.789510 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.789565 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.789580 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.789599 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.789612 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:59Z","lastTransitionTime":"2026-02-02T10:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.892531 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.892568 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.892578 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.892592 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.892602 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:59Z","lastTransitionTime":"2026-02-02T10:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.993140 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 17:33:12.7768319 +0000 UTC Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.995206 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.995267 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.995285 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.995313 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:31:59 crc kubenswrapper[4909]: I0202 10:31:59.995330 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:31:59Z","lastTransitionTime":"2026-02-02T10:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.097748 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.097777 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.097785 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.097802 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.097833 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:00Z","lastTransitionTime":"2026-02-02T10:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.200588 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.200645 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.200660 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.200686 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.200699 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:00Z","lastTransitionTime":"2026-02-02T10:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.303433 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.303496 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.303510 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.303532 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.303544 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:00Z","lastTransitionTime":"2026-02-02T10:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.406633 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.406668 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.406678 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.406692 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.406701 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:00Z","lastTransitionTime":"2026-02-02T10:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.509127 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.509165 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.509216 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.509234 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.509246 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:00Z","lastTransitionTime":"2026-02-02T10:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.611923 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.611964 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.611977 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.611995 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.612006 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:00Z","lastTransitionTime":"2026-02-02T10:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.714803 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.714856 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.714865 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.714878 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.714887 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:00Z","lastTransitionTime":"2026-02-02T10:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.817282 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.817326 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.817340 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.817355 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.817366 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:00Z","lastTransitionTime":"2026-02-02T10:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.919620 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.919664 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.919676 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.919691 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.919702 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:00Z","lastTransitionTime":"2026-02-02T10:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:00 crc kubenswrapper[4909]: I0202 10:32:00.993874 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 03:28:10.38063635 +0000 UTC Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.016446 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.016500 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.016563 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:01 crc kubenswrapper[4909]: E0202 10:32:01.016565 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.016463 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:01 crc kubenswrapper[4909]: E0202 10:32:01.016663 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:01 crc kubenswrapper[4909]: E0202 10:32:01.016735 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:01 crc kubenswrapper[4909]: E0202 10:32:01.016853 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.021479 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.021540 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.021551 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.021564 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.021574 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:01Z","lastTransitionTime":"2026-02-02T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.123785 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.123836 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.123845 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.123859 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.123868 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:01Z","lastTransitionTime":"2026-02-02T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.226199 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.226232 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.226242 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.226254 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.226263 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:01Z","lastTransitionTime":"2026-02-02T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.328820 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.328868 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.328882 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.328896 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.328907 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:01Z","lastTransitionTime":"2026-02-02T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.431110 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.431153 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.431162 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.431178 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.431189 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:01Z","lastTransitionTime":"2026-02-02T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.535245 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.535285 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.535294 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.535311 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.535319 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:01Z","lastTransitionTime":"2026-02-02T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.628276 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.628315 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.628325 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.628340 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.628349 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:01Z","lastTransitionTime":"2026-02-02T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:01 crc kubenswrapper[4909]: E0202 10:32:01.644666 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:01Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.648387 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.648438 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.648451 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.648471 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.648483 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:01Z","lastTransitionTime":"2026-02-02T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:01 crc kubenswrapper[4909]: E0202 10:32:01.663844 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:01Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.667868 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.667927 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.667938 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.667952 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.667962 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:01Z","lastTransitionTime":"2026-02-02T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:01 crc kubenswrapper[4909]: E0202 10:32:01.682225 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:01Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.686344 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.686376 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.686388 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.686406 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.686417 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:01Z","lastTransitionTime":"2026-02-02T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:01 crc kubenswrapper[4909]: E0202 10:32:01.700280 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{…}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:01Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.703788 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.703849 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.703860 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.703876 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.703886 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:01Z","lastTransitionTime":"2026-02-02T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:01 crc kubenswrapper[4909]: E0202 10:32:01.715513 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{…}
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:01Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:01 crc kubenswrapper[4909]: E0202 10:32:01.715677 4909 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.717481 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.717530 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.717545 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.717565 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.717580 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:01Z","lastTransitionTime":"2026-02-02T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.820618 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.820702 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.820722 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.820746 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.820763 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:01Z","lastTransitionTime":"2026-02-02T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.923166 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.923221 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.923231 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.923248 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.923257 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:01Z","lastTransitionTime":"2026-02-02T10:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:01 crc kubenswrapper[4909]: I0202 10:32:01.994587 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 00:22:39.540789281 +0000 UTC Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.026001 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.026055 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.026065 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.026081 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.026093 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:02Z","lastTransitionTime":"2026-02-02T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.128092 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.128131 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.128140 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.128155 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.128166 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:02Z","lastTransitionTime":"2026-02-02T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.230354 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.230401 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.230412 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.230428 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.230439 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:02Z","lastTransitionTime":"2026-02-02T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.332594 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.332626 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.332637 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.332653 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.332664 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:02Z","lastTransitionTime":"2026-02-02T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.435004 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.435038 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.435048 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.435062 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.435073 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:02Z","lastTransitionTime":"2026-02-02T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.537860 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.537894 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.537904 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.537919 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.537929 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:02Z","lastTransitionTime":"2026-02-02T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.641226 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.641284 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.641297 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.641313 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.641331 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:02Z","lastTransitionTime":"2026-02-02T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.743932 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.743970 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.743980 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.743995 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.744005 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:02Z","lastTransitionTime":"2026-02-02T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.845869 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.845912 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.845921 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.845937 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.845947 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:02Z","lastTransitionTime":"2026-02-02T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.948819 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.948860 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.948876 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.948893 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.948903 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:02Z","lastTransitionTime":"2026-02-02T10:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:02 crc kubenswrapper[4909]: I0202 10:32:02.995478 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 19:51:29.201492135 +0000 UTC Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.015828 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.015904 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.015978 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.016073 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:03 crc kubenswrapper[4909]: E0202 10:32:03.016066 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:32:03 crc kubenswrapper[4909]: E0202 10:32:03.016146 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:03 crc kubenswrapper[4909]: E0202 10:32:03.016256 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:03 crc kubenswrapper[4909]: E0202 10:32:03.016685 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.016932 4909 scope.go:117] "RemoveContainer" containerID="1294eb236717378e869465b7abdb8373067a0db8b57e4511dbe4637356b7377a" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.051387 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.051424 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.051433 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.051447 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.051456 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:03Z","lastTransitionTime":"2026-02-02T10:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.154403 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.154474 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.154486 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.154504 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.154516 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:03Z","lastTransitionTime":"2026-02-02T10:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.256575 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.256614 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.256625 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.256652 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.256665 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:03Z","lastTransitionTime":"2026-02-02T10:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.333952 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-775zr_ca5084bc-8bd1-4964-9a52-384222fc8374/ovnkube-controller/1.log" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.336591 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" event={"ID":"ca5084bc-8bd1-4964-9a52-384222fc8374","Type":"ContainerStarted","Data":"7e6627db959538eb07584bdf91828172567965a95f6b3d025163ca03b390f8d0"} Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.337706 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.350648 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ef42c996e51c799b8d1a2ab415c176e6cad28f2363892ec7d72678faaf5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.358219 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.358253 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.358263 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.358277 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.358287 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:03Z","lastTransitionTime":"2026-02-02T10:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.363517 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.373770 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.383271 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.395653 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.420975 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6627db959538eb07584bdf91828172567965a95f6b3d025163ca03b390f8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1294eb236717378e869465b7abdb8373067a0db8b57e4511dbe4637356b7377a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"message\\\":\\\"cy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0202 10:31:48.071143 6356 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} 
options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0202 10:31:48.071186 6356 lb_config.go:1031] Cluster endpoints for openshift-authentication/oauth-openshift for network=default are: map[]\\\\nF0202 10:31:48.071202 6356 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: 
fa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.433386 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47146704-6957-41b2-ae8b-866b5fb08c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd562b5e61d91cbf79d43ade2df052504ba7682eac59fe8f6636146537372e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34f2669f75956a8f492a92214254f6614fa8a
0da8feb9ecc763e776e16315b31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.446679 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v5vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f457793-f4e0-4417-ae91-4455722372c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v5vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:03 crc 
kubenswrapper[4909]: I0202 10:32:03.460328 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.460362 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.460372 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.460386 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.460397 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:03Z","lastTransitionTime":"2026-02-02T10:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.469281 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.485168 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.497044 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.511253 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeecae276ae8ad277f91928e326e0d0c165cffccbe19127c37881e72b694b4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef336
6b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.522999 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.533845 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.541928 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.554474 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ccs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de56cfec-f410-4c75-b58b-3f82cdc1c603\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1e2ee3f45e60a381848b4131a6f150071339ab9f36e7ee2b7279cb626509a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wh52q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ccs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.562489 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.562548 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.562557 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.562570 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.562578 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:03Z","lastTransitionTime":"2026-02-02T10:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.569793 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:03Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.664925 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.664967 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.664979 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.664994 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.665006 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:03Z","lastTransitionTime":"2026-02-02T10:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.766996 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.767037 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.767049 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.767065 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.767073 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:03Z","lastTransitionTime":"2026-02-02T10:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.869614 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.869655 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.869663 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.869679 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.869690 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:03Z","lastTransitionTime":"2026-02-02T10:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.972178 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.972208 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.972215 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.972228 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.972237 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:03Z","lastTransitionTime":"2026-02-02T10:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:03 crc kubenswrapper[4909]: I0202 10:32:03.995788 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 13:34:36.173099192 +0000 UTC Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.074125 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.074166 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.074176 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.074195 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.074207 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:04Z","lastTransitionTime":"2026-02-02T10:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.176524 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.176564 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.176571 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.176585 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.176594 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:04Z","lastTransitionTime":"2026-02-02T10:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.279431 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.279499 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.279516 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.279539 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.279560 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:04Z","lastTransitionTime":"2026-02-02T10:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.340923 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-775zr_ca5084bc-8bd1-4964-9a52-384222fc8374/ovnkube-controller/2.log" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.341593 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-775zr_ca5084bc-8bd1-4964-9a52-384222fc8374/ovnkube-controller/1.log" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.344601 4909 generic.go:334] "Generic (PLEG): container finished" podID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerID="7e6627db959538eb07584bdf91828172567965a95f6b3d025163ca03b390f8d0" exitCode=1 Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.344659 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" event={"ID":"ca5084bc-8bd1-4964-9a52-384222fc8374","Type":"ContainerDied","Data":"7e6627db959538eb07584bdf91828172567965a95f6b3d025163ca03b390f8d0"} Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.344701 4909 scope.go:117] "RemoveContainer" containerID="1294eb236717378e869465b7abdb8373067a0db8b57e4511dbe4637356b7377a" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.345777 4909 scope.go:117] "RemoveContainer" containerID="7e6627db959538eb07584bdf91828172567965a95f6b3d025163ca03b390f8d0" Feb 02 10:32:04 crc kubenswrapper[4909]: E0202 10:32:04.346118 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-775zr_openshift-ovn-kubernetes(ca5084bc-8bd1-4964-9a52-384222fc8374)\"" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.360795 4909 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47146704-6957-41b2-ae8b-866b5fb08c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd562b5e61d91cbf79d43ade2df052504ba7682eac59fe8f6636146537372e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34f2669f75956a8f492a92214254f6614fa8a0da8feb9ecc763e776e16315b31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.374911 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v5vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f457793-f4e0-4417-ae91-4455722372c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v5vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:04 crc 
kubenswrapper[4909]: I0202 10:32:04.382240 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.382307 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.382327 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.382356 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.382378 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:04Z","lastTransitionTime":"2026-02-02T10:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.391570 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ef42c996e51c799b8d1a2ab415c176e6cad28f2363892ec7d72678faaf5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.405759 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.423715 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.435263 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.453664 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.475060 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6627db959538eb07584bdf91828172567965a95f6b3d025163ca03b390f8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1294eb236717378e869465b7abdb8373067a0db8b57e4511dbe4637356b7377a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"message\\\":\\\"cy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0202 10:31:48.071143 6356 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} 
options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0202 10:31:48.071186 6356 lb_config.go:1031] Cluster endpoints for openshift-authentication/oauth-openshift for network=default are: map[]\\\\nF0202 10:31:48.071202 6356 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: fa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e6627db959538eb07584bdf91828172567965a95f6b3d025163ca03b390f8d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"message\\\":\\\":311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:03.798010 6635 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:03.798185 6635 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0202 10:32:03.798470 6635 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 10:32:03.798832 6635 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:32:03.798875 6635 factory.go:656] Stopping watch factory\\\\nI0202 10:32:03.798892 6635 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:32:03.827138 6635 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0202 10:32:03.827168 6635 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0202 10:32:03.827227 6635 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10:32:03.827249 6635 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0202 10:32:03.827324 6635 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"h
ost-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\
\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.484680 
4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.484729 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.484741 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.484757 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.484775 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:04Z","lastTransitionTime":"2026-02-02T10:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.497853 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.515689 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.529773 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.543526 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeecae276ae8ad277f91928e326e0d0c165cffccbe19127c37881e72b694b4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef336
6b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.560321 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.573216 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.584603 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.587249 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.587272 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.587281 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.587295 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.587304 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:04Z","lastTransitionTime":"2026-02-02T10:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.597294 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ccs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de56cfec-f410-4c75-b58b-3f82cdc1c603\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1e2ee3f45e60a381848b4131a6f150071339ab9f36e7ee2b7279cb626509a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wh52q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ccs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.609983 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:04Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.689882 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.689928 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.689939 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.689957 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.689973 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:04Z","lastTransitionTime":"2026-02-02T10:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.792428 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.792488 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.792504 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.792525 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.792540 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:04Z","lastTransitionTime":"2026-02-02T10:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.894745 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.894776 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.894784 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.894798 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.894822 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:04Z","lastTransitionTime":"2026-02-02T10:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.995951 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 08:35:21.943223252 +0000 UTC Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.997760 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.997795 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.997804 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.997831 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:04 crc kubenswrapper[4909]: I0202 10:32:04.997840 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:04Z","lastTransitionTime":"2026-02-02T10:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.015510 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.015754 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.015893 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:05 crc kubenswrapper[4909]: E0202 10:32:05.015888 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.015941 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:05 crc kubenswrapper[4909]: E0202 10:32:05.016034 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:05 crc kubenswrapper[4909]: E0202 10:32:05.016179 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:05 crc kubenswrapper[4909]: E0202 10:32:05.016299 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.034799 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ccs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de56cfec-f410-4c75-b58b-3f82cdc1c603\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1e2ee3f45e60a381848b4131a6f150071339ab9f36e7ee2b7279cb626509a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wh52q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ccs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.050372 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.063146 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.068598 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f457793-f4e0-4417-ae91-4455722372c1-metrics-certs\") pod \"network-metrics-daemon-2v5vw\" (UID: \"0f457793-f4e0-4417-ae91-4455722372c1\") " pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:05 crc kubenswrapper[4909]: E0202 10:32:05.068724 4909 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:32:05 crc kubenswrapper[4909]: E0202 10:32:05.068781 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f457793-f4e0-4417-ae91-4455722372c1-metrics-certs podName:0f457793-f4e0-4417-ae91-4455722372c1 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:21.068759453 +0000 UTC m=+66.814860188 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f457793-f4e0-4417-ae91-4455722372c1-metrics-certs") pod "network-metrics-daemon-2v5vw" (UID: "0f457793-f4e0-4417-ae91-4455722372c1") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.073792 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.083319 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.097905 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.099606 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.099631 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.099639 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.099651 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.099660 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:05Z","lastTransitionTime":"2026-02-02T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.119554 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6627db959538eb07584bdf91828172567965a95f6b3d025163ca03b390f8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1294eb236717378e869465b7abdb8373067a0db8b57e4511dbe4637356b7377a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"message\\\":\\\"cy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0202 10:31:48.071143 6356 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} 
options:{GoMap:map[iface-id-ver:9d751cbb-f2e2-430d-9754-c882a5e924a5 requested-chassis:crc]} port_security:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {960d98b2-dc64-4e93-a4b6-9b19847af71e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0202 10:31:48.071186 6356 lb_config.go:1031] Cluster endpoints for openshift-authentication/oauth-openshift for network=default are: map[]\\\\nF0202 10:31:48.071202 6356 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: fa\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e6627db959538eb07584bdf91828172567965a95f6b3d025163ca03b390f8d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"message\\\":\\\":311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:03.798010 6635 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:03.798185 6635 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0202 10:32:03.798470 6635 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 10:32:03.798832 6635 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:32:03.798875 6635 factory.go:656] Stopping watch factory\\\\nI0202 10:32:03.798892 6635 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:32:03.827138 6635 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0202 10:32:03.827168 6635 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0202 10:32:03.827227 6635 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10:32:03.827249 6635 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0202 10:32:03.827324 6635 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"h
ost-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\
\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.132537 
4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47146704-6957-41b2-ae8b-866b5fb08c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd562b5e61d91cbf79d43ade2df052504ba7682eac59fe8f6636146537372e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34f2669f75956a8f492a92214254f6614fa8a0da8feb9ecc763e776e16315b31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.146435 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v5vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f457793-f4e0-4417-ae91-4455722372c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v5vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc 
kubenswrapper[4909]: I0202 10:32:05.161752 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ef42c996e51c799b8d1a2ab415c176e6cad28f2363892ec7d72678faaf5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.175020 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.187481 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.200703 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.201723 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 
10:32:05.201769 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.201779 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.201795 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.201821 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:05Z","lastTransitionTime":"2026-02-02T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.253075 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.273500 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:
16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.285710 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.301206 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeecae276ae8ad277f91928e326e0d0c165cffccbe19127c37881e72b694b4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef336
6b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.303615 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.303649 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.303659 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.303676 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.303684 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:05Z","lastTransitionTime":"2026-02-02T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.349991 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-775zr_ca5084bc-8bd1-4964-9a52-384222fc8374/ovnkube-controller/2.log" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.354191 4909 scope.go:117] "RemoveContainer" containerID="7e6627db959538eb07584bdf91828172567965a95f6b3d025163ca03b390f8d0" Feb 02 10:32:05 crc kubenswrapper[4909]: E0202 10:32:05.354356 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-775zr_openshift-ovn-kubernetes(ca5084bc-8bd1-4964-9a52-384222fc8374)\"" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.365970 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.378107 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ccs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de56cfec-f410-4c75-b58b-3f82cdc1c603\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1e2ee3f45e60a381848b4131a6f150071339ab9f36e7ee2b7279cb626509a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wh52q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ccs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.391852 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.404381 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.406348 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.406403 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.406416 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.406442 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.406455 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:05Z","lastTransitionTime":"2026-02-02T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.417946 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.430057 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.442160 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.459281 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6627db959538eb07584bdf91828172567965a95f6b3d025163ca03b390f8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e6627db959538eb07584bdf91828172567965a95f6b3d025163ca03b390f8d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"message\\\":\\\":311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:03.798010 6635 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:03.798185 6635 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0202 10:32:03.798470 6635 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 10:32:03.798832 6635 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:32:03.798875 6635 factory.go:656] Stopping watch factory\\\\nI0202 10:32:03.798892 6635 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:32:03.827138 6635 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0202 10:32:03.827168 6635 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0202 10:32:03.827227 6635 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10:32:03.827249 6635 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0202 10:32:03.827324 6635 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-775zr_openshift-ovn-kubernetes(ca5084bc-8bd1-4964-9a52-384222fc8374)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9e
b2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.470055 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47146704-6957-41b2-ae8b-866b5fb08c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd562b5e61d91cbf79d43ade2df052504ba7682eac59fe8f6636146537372e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34f2669f75956a8f492a92214254f6614fa8a
0da8feb9ecc763e776e16315b31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.478942 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v5vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f457793-f4e0-4417-ae91-4455722372c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v5vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc 
kubenswrapper[4909]: I0202 10:32:05.490726 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ef42c996e51c799b8d1a2ab415c176e6cad28f2363892ec7d72678faaf5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.500660 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.508498 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.508563 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.508574 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:05 crc 
kubenswrapper[4909]: I0202 10:32:05.508590 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.508601 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:05Z","lastTransitionTime":"2026-02-02T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.514282 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 
10:32:05.528274 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeecae276ae8ad277f91928e326e0d0c165cffccbe19127c37881e72b694b4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{
\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff
68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"
cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.546958 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":
\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141
b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4911
7b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.558621 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.568772 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:05Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.610670 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.610706 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.610718 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:05 crc 
kubenswrapper[4909]: I0202 10:32:05.610733 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.610742 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:05Z","lastTransitionTime":"2026-02-02T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.713209 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.713241 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.713250 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.713262 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.713270 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:05Z","lastTransitionTime":"2026-02-02T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.815645 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.815682 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.815690 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.815705 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.815713 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:05Z","lastTransitionTime":"2026-02-02T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.917970 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.917993 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.918020 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.918034 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.918042 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:05Z","lastTransitionTime":"2026-02-02T10:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:05 crc kubenswrapper[4909]: I0202 10:32:05.996220 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 23:48:30.572310589 +0000 UTC Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.020020 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.020055 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.020069 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.020084 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.020095 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:06Z","lastTransitionTime":"2026-02-02T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.122789 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.122890 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.122904 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.122918 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.122930 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:06Z","lastTransitionTime":"2026-02-02T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.224871 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.224900 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.224908 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.224922 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.224932 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:06Z","lastTransitionTime":"2026-02-02T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.326954 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.326992 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.327002 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.327016 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.327043 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:06Z","lastTransitionTime":"2026-02-02T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.429374 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.429409 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.429418 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.429430 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.429439 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:06Z","lastTransitionTime":"2026-02-02T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.532387 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.532475 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.532495 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.532527 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.532549 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:06Z","lastTransitionTime":"2026-02-02T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.635096 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.635144 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.635153 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.635167 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.635176 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:06Z","lastTransitionTime":"2026-02-02T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.683341 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:32:06 crc kubenswrapper[4909]: E0202 10:32:06.683501 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-02 10:32:38.683474555 +0000 UTC m=+84.429575290 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.737738 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.737776 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.737786 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.737801 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.737827 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:06Z","lastTransitionTime":"2026-02-02T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.784794 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.784861 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.784912 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.784934 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:06 crc kubenswrapper[4909]: E0202 10:32:06.784970 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Feb 02 10:32:06 crc kubenswrapper[4909]: E0202 10:32:06.785007 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:32:06 crc kubenswrapper[4909]: E0202 10:32:06.785006 4909 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:32:06 crc kubenswrapper[4909]: E0202 10:32:06.785019 4909 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:06 crc kubenswrapper[4909]: E0202 10:32:06.785055 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:38.785041538 +0000 UTC m=+84.531142273 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:32:06 crc kubenswrapper[4909]: E0202 10:32:06.785076 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:38.785060719 +0000 UTC m=+84.531161454 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 02 10:32:06 crc kubenswrapper[4909]: E0202 10:32:06.785085 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 02 10:32:06 crc kubenswrapper[4909]: E0202 10:32:06.785117 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 02 10:32:06 crc kubenswrapper[4909]: E0202 10:32:06.785129 4909 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 02 10:32:06 crc kubenswrapper[4909]: E0202 10:32:06.785159 4909 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 02 10:32:06 crc kubenswrapper[4909]: E0202 10:32:06.785190 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:38.785170302 +0000 UTC m=+84.531271037 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 02 10:32:06 crc kubenswrapper[4909]: E0202 10:32:06.785274 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:38.785250124 +0000 UTC m=+84.531350849 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.839968 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.840017 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.840028 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.840042 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.840055 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:06Z","lastTransitionTime":"2026-02-02T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.942632 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.942669 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.942678 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.942692 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.942701 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:06Z","lastTransitionTime":"2026-02-02T10:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:32:06 crc kubenswrapper[4909]: I0202 10:32:06.996610 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 18:07:24.651230022 +0000 UTC
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.016025 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.016088 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.016136 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.016189 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 10:32:07 crc kubenswrapper[4909]: E0202 10:32:07.016197 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1"
Feb 02 10:32:07 crc kubenswrapper[4909]: E0202 10:32:07.016248 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 10:32:07 crc kubenswrapper[4909]: E0202 10:32:07.016288 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 02 10:32:07 crc kubenswrapper[4909]: E0202 10:32:07.016340 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.044395 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.044431 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.044440 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.044454 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.044464 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:07Z","lastTransitionTime":"2026-02-02T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.146858 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.146891 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.146902 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.146919 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.146930 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:07Z","lastTransitionTime":"2026-02-02T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.248752 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.248788 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.248798 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.248830 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.248842 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:07Z","lastTransitionTime":"2026-02-02T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.350349 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.350391 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.350400 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.350414 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.350426 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:07Z","lastTransitionTime":"2026-02-02T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.451947 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.451988 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.451997 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.452010 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.452020 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:07Z","lastTransitionTime":"2026-02-02T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.554555 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.554607 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.554618 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.554631 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.554640 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:07Z","lastTransitionTime":"2026-02-02T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.657375 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.657422 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.657435 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.657453 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.657465 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:07Z","lastTransitionTime":"2026-02-02T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.759722 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.759753 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.759761 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.759772 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.759781 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:07Z","lastTransitionTime":"2026-02-02T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.862512 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.862543 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.862551 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.862565 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.862574 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:07Z","lastTransitionTime":"2026-02-02T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.964116 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.964149 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.964160 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.964175 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.964218 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:07Z","lastTransitionTime":"2026-02-02T10:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:32:07 crc kubenswrapper[4909]: I0202 10:32:07.997009 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 00:18:17.667870601 +0000 UTC
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.066850 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.066938 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.066986 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.067020 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.067039 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:08Z","lastTransitionTime":"2026-02-02T10:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.169713 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.169743 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.169751 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.169767 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.169777 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:08Z","lastTransitionTime":"2026-02-02T10:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.271775 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.271843 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.271856 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.271871 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.271880 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:08Z","lastTransitionTime":"2026-02-02T10:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.373849 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.373887 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.373899 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.373915 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.373925 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:08Z","lastTransitionTime":"2026-02-02T10:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.475667 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.475718 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.475728 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.475742 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.475751 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:08Z","lastTransitionTime":"2026-02-02T10:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.578376 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.578426 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.578443 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.578456 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.578465 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:08Z","lastTransitionTime":"2026-02-02T10:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.680465 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.680536 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.680560 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.680589 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.680611 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:08Z","lastTransitionTime":"2026-02-02T10:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.782717 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.782774 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.782790 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.782945 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.782983 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:08Z","lastTransitionTime":"2026-02-02T10:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.885132 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.885194 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.885209 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.885238 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.885253 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:08Z","lastTransitionTime":"2026-02-02T10:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.987384 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.987427 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.987438 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.987453 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.987464 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:08Z","lastTransitionTime":"2026-02-02T10:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:32:08 crc kubenswrapper[4909]: I0202 10:32:08.997919 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 16:30:26.489797797 +0000 UTC
Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.016491 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.016519 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.016551 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw"
Feb 02 10:32:09 crc kubenswrapper[4909]: E0202 10:32:09.016651 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.016866 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 02 10:32:09 crc kubenswrapper[4909]: E0202 10:32:09.016967 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1"
Feb 02 10:32:09 crc kubenswrapper[4909]: E0202 10:32:09.017108 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 02 10:32:09 crc kubenswrapper[4909]: E0202 10:32:09.017167 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.089213 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.089422 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.089514 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.089594 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.089675 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:09Z","lastTransitionTime":"2026-02-02T10:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.192641 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.192685 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.192693 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.192710 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.192721 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:09Z","lastTransitionTime":"2026-02-02T10:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.296048 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.296115 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.296126 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.296149 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.296161 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:09Z","lastTransitionTime":"2026-02-02T10:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.399261 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.399862 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.399972 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.400128 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.400232 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:09Z","lastTransitionTime":"2026-02-02T10:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.503089 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.503145 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.503162 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.503184 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.503202 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:09Z","lastTransitionTime":"2026-02-02T10:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.605650 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.605689 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.605697 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.605713 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.605722 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:09Z","lastTransitionTime":"2026-02-02T10:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.709197 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.709246 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.709261 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.709278 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.709289 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:09Z","lastTransitionTime":"2026-02-02T10:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.811708 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.811751 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.811846 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.811862 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.811871 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:09Z","lastTransitionTime":"2026-02-02T10:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.816345 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.826046 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.829515 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node
-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.840205 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ccs5q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de56cfec-f410-4c75-b58b-3f82cdc1c603\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1e2ee3f45e60a381848b4131a6f150071339ab9f36e7ee2b7279cb626509a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wh52q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ccs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.851871 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.863044 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.874791 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T10:32:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.886266 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.898777 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.914596 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:09 crc 
kubenswrapper[4909]: I0202 10:32:09.914625 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.914633 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.914648 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.914656 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:09Z","lastTransitionTime":"2026-02-02T10:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.916946 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6627db959538eb07584bdf91828172567965a95f6b3d025163ca03b390f8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e6627db959538eb07584bdf91828172567965a95f6b3d025163ca03b390f8d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"message\\\":\\\":311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:03.798010 6635 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:03.798185 6635 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0202 10:32:03.798470 6635 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 10:32:03.798832 6635 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:32:03.798875 6635 factory.go:656] Stopping watch factory\\\\nI0202 10:32:03.798892 6635 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:32:03.827138 6635 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0202 10:32:03.827168 6635 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0202 10:32:03.827227 6635 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10:32:03.827249 6635 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0202 10:32:03.827324 6635 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-775zr_openshift-ovn-kubernetes(ca5084bc-8bd1-4964-9a52-384222fc8374)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9e
b2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.927477 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47146704-6957-41b2-ae8b-866b5fb08c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd562b5e61d91cbf79d43ade2df052504ba7682eac59fe8f6636146537372e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34f2669f75956a8f492a92214254f6614fa8a
0da8feb9ecc763e776e16315b31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.935980 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v5vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f457793-f4e0-4417-ae91-4455722372c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v5vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:09 crc 
kubenswrapper[4909]: I0202 10:32:09.948278 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ef42c996e51c799b8d1a2ab415c176e6cad28f2363892ec7d72678faaf5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.958595 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.968302 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.981193 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeecae276ae8ad277f91928e326e0d0c165cffccbe19127c37881e72b694b4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef336
6b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:09 crc kubenswrapper[4909]: I0202 10:32:09.998877 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 08:02:52.67810159 +0000 UTC Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:09.999872 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:09Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.011375 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.017773 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.018016 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.018028 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.018043 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.018055 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:10Z","lastTransitionTime":"2026-02-02T10:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.023173 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc
/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574ed97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:10Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.121282 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 
10:32:10.121330 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.121342 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.121359 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.121376 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:10Z","lastTransitionTime":"2026-02-02T10:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.224028 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.224062 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.224070 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.224082 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.224090 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:10Z","lastTransitionTime":"2026-02-02T10:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.326479 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.326541 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.326564 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.326590 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.326610 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:10Z","lastTransitionTime":"2026-02-02T10:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.429037 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.429081 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.429092 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.429109 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.429121 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:10Z","lastTransitionTime":"2026-02-02T10:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.531845 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.531875 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.531883 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.531897 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.531907 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:10Z","lastTransitionTime":"2026-02-02T10:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.634749 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.634795 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.634823 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.634841 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.634853 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:10Z","lastTransitionTime":"2026-02-02T10:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.737483 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.737525 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.737551 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.737575 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.737595 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:10Z","lastTransitionTime":"2026-02-02T10:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.840406 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.840458 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.840467 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.840482 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.840492 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:10Z","lastTransitionTime":"2026-02-02T10:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.943272 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.943313 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.943320 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.943332 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.943341 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:10Z","lastTransitionTime":"2026-02-02T10:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:10 crc kubenswrapper[4909]: I0202 10:32:10.999186 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 08:02:47.155150272 +0000 UTC Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.015647 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.015710 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.015721 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.015665 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:11 crc kubenswrapper[4909]: E0202 10:32:11.015836 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:32:11 crc kubenswrapper[4909]: E0202 10:32:11.015887 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:11 crc kubenswrapper[4909]: E0202 10:32:11.015997 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:11 crc kubenswrapper[4909]: E0202 10:32:11.016074 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.045685 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.045736 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.045749 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.045771 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.045784 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:11Z","lastTransitionTime":"2026-02-02T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.148769 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.148857 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.148878 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.148905 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.148923 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:11Z","lastTransitionTime":"2026-02-02T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.252838 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.252884 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.252901 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.252942 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.252959 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:11Z","lastTransitionTime":"2026-02-02T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.321284 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.337998 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:11Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.356176 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.356225 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.356239 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.356260 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.356272 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:11Z","lastTransitionTime":"2026-02-02T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.358436 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:11Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.376362 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:11Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.392685 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:11Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.407565 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:11Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.426427 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6627db959538eb07584bdf91828172567965a95f6b3d025163ca03b390f8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e6627db959538eb07584bdf91828172567965a95f6b3d025163ca03b390f8d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"message\\\":\\\":311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:03.798010 6635 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:03.798185 6635 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0202 10:32:03.798470 6635 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 10:32:03.798832 6635 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:32:03.798875 6635 factory.go:656] Stopping watch factory\\\\nI0202 10:32:03.798892 6635 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:32:03.827138 6635 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0202 10:32:03.827168 6635 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0202 10:32:03.827227 6635 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10:32:03.827249 6635 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0202 10:32:03.827324 6635 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-775zr_openshift-ovn-kubernetes(ca5084bc-8bd1-4964-9a52-384222fc8374)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9e
b2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:11Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.438074 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47146704-6957-41b2-ae8b-866b5fb08c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd562b5e61d91cbf79d43ade2df052504ba7682eac59fe8f6636146537372e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34f2669f75956a8f492a92214254f6614fa8a
0da8feb9ecc763e776e16315b31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:11Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.450542 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v5vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f457793-f4e0-4417-ae91-4455722372c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v5vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:11Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:11 crc 
kubenswrapper[4909]: I0202 10:32:11.458795 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.458839 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.458848 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.458860 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.458871 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:11Z","lastTransitionTime":"2026-02-02T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.463795 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ef42c996e51c799b8d1a2ab415c176e6cad28f2363892ec7d72678faaf5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:11Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.478543 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:11Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.491552 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:11Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.507212 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeecae276ae8ad277f91928e326e0d0c165cffccbe19127c37881e72b694b4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef336
6b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:11Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.527128 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a
9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:11Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.541146 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:11Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.554206 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:11Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.561386 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.561432 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.561441 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.561458 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.561468 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:11Z","lastTransitionTime":"2026-02-02T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.566694 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:11Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.577675 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ccs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de56cfec-f410-4c75-b58b-3f82cdc1c603\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1e2ee3f45e60a3
81848b4131a6f150071339ab9f36e7ee2b7279cb626509a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wh52q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ccs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:11Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.587886 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71855549-8a60-4feb-bc35-677f5606d9c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dbf1daea0619549e3956c4aaea233cc63e03e43690110003f1c3eee9f61fa07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e198de5cf24f9e12e21a62f987167eb62dbb00c2b44eca84d46d23ebac89ef72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://914344068a69d53d58993ab87cb55aec7988a2005eac19a9cb695b23fc428220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a006a78dd9fb665fafd3fe22a317f03bc40561a9cac9152cda8dbd0dfd9e63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1a006a78dd9fb665fafd3fe22a317f03bc40561a9cac9152cda8dbd0dfd9e63e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:11Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.664248 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.664318 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.664330 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.664348 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.664361 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:11Z","lastTransitionTime":"2026-02-02T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.766700 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.766732 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.766740 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.766753 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.766762 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:11Z","lastTransitionTime":"2026-02-02T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.869643 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.869676 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.869685 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.869698 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.869706 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:11Z","lastTransitionTime":"2026-02-02T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.971878 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.971920 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.971928 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.971948 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.971967 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:11Z","lastTransitionTime":"2026-02-02T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:11 crc kubenswrapper[4909]: I0202 10:32:11.999489 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 20:10:48.538605533 +0000 UTC Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:11.999636 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:11.999666 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:11.999677 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:11.999693 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:11.999703 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:11Z","lastTransitionTime":"2026-02-02T10:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:12 crc kubenswrapper[4909]: E0202 10:32:12.011681 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:12Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.014776 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.014819 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.014839 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.014857 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.014868 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:12Z","lastTransitionTime":"2026-02-02T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:12 crc kubenswrapper[4909]: E0202 10:32:12.029759 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:12Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.033058 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.033083 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.033094 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.033108 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.033120 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:12Z","lastTransitionTime":"2026-02-02T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:12 crc kubenswrapper[4909]: E0202 10:32:12.050272 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:12Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.053954 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.053976 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.053984 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.053996 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.054005 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:12Z","lastTransitionTime":"2026-02-02T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:12 crc kubenswrapper[4909]: E0202 10:32:12.072007 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:12Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.075474 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.075511 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.075519 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.075533 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.075543 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:12Z","lastTransitionTime":"2026-02-02T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:12 crc kubenswrapper[4909]: E0202 10:32:12.089529 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:12Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:12 crc kubenswrapper[4909]: E0202 10:32:12.089641 4909 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.091530 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.091562 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.091577 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.091596 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.091609 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:12Z","lastTransitionTime":"2026-02-02T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.194111 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.194139 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.194148 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.194160 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.194169 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:12Z","lastTransitionTime":"2026-02-02T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.296009 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.296048 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.296061 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.296083 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.296094 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:12Z","lastTransitionTime":"2026-02-02T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.398572 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.398662 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.398694 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.398743 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.398770 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:12Z","lastTransitionTime":"2026-02-02T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.501682 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.501751 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.501770 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.501800 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.501853 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:12Z","lastTransitionTime":"2026-02-02T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.605477 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.605558 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.605576 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.605604 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.605624 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:12Z","lastTransitionTime":"2026-02-02T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.709918 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.710013 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.710034 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.710062 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.710085 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:12Z","lastTransitionTime":"2026-02-02T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.813156 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.813209 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.813218 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.813232 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.813260 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:12Z","lastTransitionTime":"2026-02-02T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.916727 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.916765 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.916777 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.916794 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:12 crc kubenswrapper[4909]: I0202 10:32:12.916844 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:12Z","lastTransitionTime":"2026-02-02T10:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.000341 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 13:55:53.852197396 +0000 UTC Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.016646 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.016726 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.016646 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:13 crc kubenswrapper[4909]: E0202 10:32:13.016897 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.016918 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:13 crc kubenswrapper[4909]: E0202 10:32:13.017051 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:13 crc kubenswrapper[4909]: E0202 10:32:13.017227 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:13 crc kubenswrapper[4909]: E0202 10:32:13.017629 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.019878 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.019922 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.019939 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.019963 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.019977 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:13Z","lastTransitionTime":"2026-02-02T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.123107 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.123188 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.123201 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.123247 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.123263 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:13Z","lastTransitionTime":"2026-02-02T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.226604 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.226653 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.226664 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.226687 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.226706 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:13Z","lastTransitionTime":"2026-02-02T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.329976 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.330036 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.330047 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.330089 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.330106 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:13Z","lastTransitionTime":"2026-02-02T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.433315 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.433400 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.433440 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.433483 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.433513 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:13Z","lastTransitionTime":"2026-02-02T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.537423 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.537483 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.537498 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.537522 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.537538 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:13Z","lastTransitionTime":"2026-02-02T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.641207 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.641262 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.641275 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.641298 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.641315 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:13Z","lastTransitionTime":"2026-02-02T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.744461 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.744504 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.744514 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.744531 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.744542 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:13Z","lastTransitionTime":"2026-02-02T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.847930 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.847974 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.847988 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.848007 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.848019 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:13Z","lastTransitionTime":"2026-02-02T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.950710 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.951002 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.951048 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.951072 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:13 crc kubenswrapper[4909]: I0202 10:32:13.951090 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:13Z","lastTransitionTime":"2026-02-02T10:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.001355 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 05:01:00.348174675 +0000 UTC Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.053649 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.053690 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.053702 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.053721 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.053732 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:14Z","lastTransitionTime":"2026-02-02T10:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.157431 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.158164 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.158227 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.158279 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.158312 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:14Z","lastTransitionTime":"2026-02-02T10:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.261653 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.261688 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.261696 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.261733 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.261744 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:14Z","lastTransitionTime":"2026-02-02T10:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.366016 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.366109 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.366145 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.366178 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.366202 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:14Z","lastTransitionTime":"2026-02-02T10:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.474900 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.475101 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.475130 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.475173 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.475194 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:14Z","lastTransitionTime":"2026-02-02T10:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.578961 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.579041 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.579080 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.579119 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.579148 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:14Z","lastTransitionTime":"2026-02-02T10:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.683257 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.683328 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.683345 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.683382 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.683419 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:14Z","lastTransitionTime":"2026-02-02T10:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.786634 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.786694 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.786705 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.786722 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.786734 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:14Z","lastTransitionTime":"2026-02-02T10:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.890399 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.890470 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.890488 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.890517 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.890537 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:14Z","lastTransitionTime":"2026-02-02T10:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.993335 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.993383 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.993395 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.993418 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:14 crc kubenswrapper[4909]: I0202 10:32:14.993431 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:14Z","lastTransitionTime":"2026-02-02T10:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.001498 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 12:34:46.741631093 +0000 UTC Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.015992 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.015991 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:15 crc kubenswrapper[4909]: E0202 10:32:15.016311 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:15 crc kubenswrapper[4909]: E0202 10:32:15.016433 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.016489 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:15 crc kubenswrapper[4909]: E0202 10:32:15.016620 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.016622 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:15 crc kubenswrapper[4909]: E0202 10:32:15.016935 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.053111 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9b
e8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.16
8.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54
b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.075551 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.095846 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.097194 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.097249 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.097259 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:15 crc 
kubenswrapper[4909]: I0202 10:32:15.097280 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.097294 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:15Z","lastTransitionTime":"2026-02-02T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.120419 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeecae276ae8ad277f91928e326e0d0c165cffccbe19127c37881e72b694b4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f
0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.140679 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71855549-8a60-4feb-bc35-677f5606d9c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dbf1daea0619549e3956c4aaea233cc63e03e43690110003f1c3eee9f61fa07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e198de5cf24f9e12e21a62f987167eb62dbb00c2b44eca84d46d23ebac89ef72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://914344068a69d53d58993ab87cb55aec7988a2005eac19a9cb695b23fc428220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a006a78dd9fb665fafd3fe22a317f03bc40561a9cac9152cda8dbd0dfd9e63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1a006a78dd9fb665fafd3fe22a317f03bc40561a9cac9152cda8dbd0dfd9e63e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.159655 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.182960 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.198568 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.200037 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.200094 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.200104 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.200122 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.200135 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:15Z","lastTransitionTime":"2026-02-02T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.214482 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ccs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de56cfec-f410-4c75-b58b-3f82cdc1c603\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1e2ee3f45e60a381848b4131a6f150071339ab9f36e7ee2b7279cb626509a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wh52q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ccs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.234184 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.258216 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6627db959538eb07584bdf91828172567965a95f6b3d025163ca03b390f8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e6627db959538eb07584bdf91828172567965a95f6b3d025163ca03b390f8d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"message\\\":\\\":311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:03.798010 6635 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:03.798185 6635 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0202 10:32:03.798470 6635 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 10:32:03.798832 6635 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:32:03.798875 6635 factory.go:656] Stopping watch factory\\\\nI0202 10:32:03.798892 6635 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:32:03.827138 6635 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0202 10:32:03.827168 6635 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0202 10:32:03.827227 6635 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10:32:03.827249 6635 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0202 10:32:03.827324 6635 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-775zr_openshift-ovn-kubernetes(ca5084bc-8bd1-4964-9a52-384222fc8374)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9e
b2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.274841 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47146704-6957-41b2-ae8b-866b5fb08c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd562b5e61d91cbf79d43ade2df052504ba7682eac59fe8f6636146537372e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34f2669f75956a8f492a92214254f6614fa8a
0da8feb9ecc763e776e16315b31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.290467 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v5vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f457793-f4e0-4417-ae91-4455722372c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v5vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:15 crc 
kubenswrapper[4909]: I0202 10:32:15.302926 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.303051 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.303224 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.303344 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.303436 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:15Z","lastTransitionTime":"2026-02-02T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.314052 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ef42c996e51c799b8d1a2ab415c176e6cad28f2363892ec7d72678faaf5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.329236 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.344659 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.358614 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.375668 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:15Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.407863 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:15 crc 
kubenswrapper[4909]: I0202 10:32:15.407923 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.407942 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.407970 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.407988 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:15Z","lastTransitionTime":"2026-02-02T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.511140 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.511247 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.511279 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.511318 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.511344 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:15Z","lastTransitionTime":"2026-02-02T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.614124 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.614485 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.614495 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.614509 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.614517 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:15Z","lastTransitionTime":"2026-02-02T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.720504 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.720562 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.720573 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.720591 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.720607 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:15Z","lastTransitionTime":"2026-02-02T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.823448 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.823504 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.823515 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.823533 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.823546 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:15Z","lastTransitionTime":"2026-02-02T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.926579 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.926646 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.926660 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.926683 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:15 crc kubenswrapper[4909]: I0202 10:32:15.926705 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:15Z","lastTransitionTime":"2026-02-02T10:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.002506 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 11:24:09.387030605 +0000 UTC Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.029541 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.029598 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.029610 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.029622 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.029631 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:16Z","lastTransitionTime":"2026-02-02T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.132549 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.132598 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.132618 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.132642 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.132660 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:16Z","lastTransitionTime":"2026-02-02T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.234960 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.235022 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.235034 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.235061 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.235074 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:16Z","lastTransitionTime":"2026-02-02T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.337533 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.337589 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.337601 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.337623 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.337634 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:16Z","lastTransitionTime":"2026-02-02T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.440648 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.440719 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.440732 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.440755 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.440768 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:16Z","lastTransitionTime":"2026-02-02T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.544257 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.544322 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.544343 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.544371 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.544395 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:16Z","lastTransitionTime":"2026-02-02T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.648183 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.648352 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.648392 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.648481 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.648506 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:16Z","lastTransitionTime":"2026-02-02T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.751357 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.751392 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.751401 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.751415 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.751424 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:16Z","lastTransitionTime":"2026-02-02T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.854873 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.854921 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.854933 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.854949 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.854959 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:16Z","lastTransitionTime":"2026-02-02T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.958462 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.958500 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.958509 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.958526 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:16 crc kubenswrapper[4909]: I0202 10:32:16.958538 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:16Z","lastTransitionTime":"2026-02-02T10:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.004141 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 17:51:58.644158143 +0000 UTC Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.016452 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:17 crc kubenswrapper[4909]: E0202 10:32:17.016584 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.016996 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:17 crc kubenswrapper[4909]: E0202 10:32:17.017064 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.017117 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:17 crc kubenswrapper[4909]: E0202 10:32:17.017171 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.017215 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:17 crc kubenswrapper[4909]: E0202 10:32:17.017271 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.060527 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.060589 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.060602 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.060620 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.060631 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:17Z","lastTransitionTime":"2026-02-02T10:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.162340 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.162373 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.162382 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.162396 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.162405 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:17Z","lastTransitionTime":"2026-02-02T10:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.265237 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.265565 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.265735 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.265935 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.266094 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:17Z","lastTransitionTime":"2026-02-02T10:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.368132 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.368186 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.368202 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.368225 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.368242 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:17Z","lastTransitionTime":"2026-02-02T10:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.470886 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.470927 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.470938 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.470955 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.470967 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:17Z","lastTransitionTime":"2026-02-02T10:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.574121 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.574152 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.574161 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.574176 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.574185 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:17Z","lastTransitionTime":"2026-02-02T10:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.676081 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.676303 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.676371 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.676436 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.676503 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:17Z","lastTransitionTime":"2026-02-02T10:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.779143 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.779450 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.779561 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.779702 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.779866 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:17Z","lastTransitionTime":"2026-02-02T10:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.882377 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.882696 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.882855 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.883016 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.883133 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:17Z","lastTransitionTime":"2026-02-02T10:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.985544 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.985577 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.985587 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.985604 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:17 crc kubenswrapper[4909]: I0202 10:32:17.985619 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:17Z","lastTransitionTime":"2026-02-02T10:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.004312 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 03:58:06.783698922 +0000 UTC Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.016252 4909 scope.go:117] "RemoveContainer" containerID="7e6627db959538eb07584bdf91828172567965a95f6b3d025163ca03b390f8d0" Feb 02 10:32:18 crc kubenswrapper[4909]: E0202 10:32:18.016437 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-775zr_openshift-ovn-kubernetes(ca5084bc-8bd1-4964-9a52-384222fc8374)\"" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.087661 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.087693 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.087701 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.087735 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.087745 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:18Z","lastTransitionTime":"2026-02-02T10:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.190669 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.190722 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.190746 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.190766 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.190781 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:18Z","lastTransitionTime":"2026-02-02T10:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.293428 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.293463 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.293959 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.294017 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.294030 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:18Z","lastTransitionTime":"2026-02-02T10:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.397109 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.397158 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.397173 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.397195 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.397209 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:18Z","lastTransitionTime":"2026-02-02T10:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.499868 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.499914 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.499925 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.499940 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.499951 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:18Z","lastTransitionTime":"2026-02-02T10:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.602690 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.602731 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.602743 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.602759 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.602772 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:18Z","lastTransitionTime":"2026-02-02T10:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.705426 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.705454 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.705461 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.705475 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.705483 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:18Z","lastTransitionTime":"2026-02-02T10:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.807688 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.807732 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.807747 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.807767 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.807782 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:18Z","lastTransitionTime":"2026-02-02T10:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.911085 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.911132 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.911139 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.911155 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:18 crc kubenswrapper[4909]: I0202 10:32:18.911170 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:18Z","lastTransitionTime":"2026-02-02T10:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.005315 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 19:22:39.383196718 +0000 UTC Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.013936 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.013974 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.013986 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.014003 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.014016 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:19Z","lastTransitionTime":"2026-02-02T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.015557 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.015560 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:19 crc kubenswrapper[4909]: E0202 10:32:19.015931 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.015593 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:19 crc kubenswrapper[4909]: E0202 10:32:19.015940 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.015626 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:19 crc kubenswrapper[4909]: E0202 10:32:19.016293 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:32:19 crc kubenswrapper[4909]: E0202 10:32:19.016145 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.116665 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.116698 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.116707 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.116749 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.116764 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:19Z","lastTransitionTime":"2026-02-02T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.218696 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.218735 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.218747 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.218770 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.218782 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:19Z","lastTransitionTime":"2026-02-02T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.321249 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.321288 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.321299 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.321314 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.321326 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:19Z","lastTransitionTime":"2026-02-02T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.423794 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.423855 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.423864 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.423878 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.423889 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:19Z","lastTransitionTime":"2026-02-02T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.526500 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.526539 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.526551 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.526569 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.526581 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:19Z","lastTransitionTime":"2026-02-02T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.628789 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.628852 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.628860 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.628875 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.628889 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:19Z","lastTransitionTime":"2026-02-02T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.730654 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.730686 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.730694 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.730722 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.730731 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:19Z","lastTransitionTime":"2026-02-02T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.832376 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.832412 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.832421 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.832435 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.832446 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:19Z","lastTransitionTime":"2026-02-02T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.934937 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.934984 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.934996 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.935013 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:19 crc kubenswrapper[4909]: I0202 10:32:19.935028 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:19Z","lastTransitionTime":"2026-02-02T10:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.006013 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 20:26:24.556283303 +0000 UTC Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.037658 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.037700 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.037719 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.037740 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.037752 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:20Z","lastTransitionTime":"2026-02-02T10:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.140399 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.140438 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.140446 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.140459 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.140468 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:20Z","lastTransitionTime":"2026-02-02T10:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.246988 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.247022 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.247031 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.247044 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.247055 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:20Z","lastTransitionTime":"2026-02-02T10:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.349561 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.349597 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.349608 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.349624 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.349636 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:20Z","lastTransitionTime":"2026-02-02T10:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.452142 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.452184 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.452196 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.452213 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.452224 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:20Z","lastTransitionTime":"2026-02-02T10:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.554237 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.554277 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.554289 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.554309 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.554321 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:20Z","lastTransitionTime":"2026-02-02T10:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.656432 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.656473 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.656485 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.656503 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.656515 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:20Z","lastTransitionTime":"2026-02-02T10:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.758668 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.758704 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.758713 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.758726 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.758735 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:20Z","lastTransitionTime":"2026-02-02T10:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.860838 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.860874 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.860882 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.860897 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.860906 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:20Z","lastTransitionTime":"2026-02-02T10:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.964504 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.964550 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.964559 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.964581 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:20 crc kubenswrapper[4909]: I0202 10:32:20.964593 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:20Z","lastTransitionTime":"2026-02-02T10:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.006108 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 01:23:03.892207473 +0000 UTC Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.015552 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.015637 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.015640 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:21 crc kubenswrapper[4909]: E0202 10:32:21.015710 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.015762 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:21 crc kubenswrapper[4909]: E0202 10:32:21.015929 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:21 crc kubenswrapper[4909]: E0202 10:32:21.016052 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:21 crc kubenswrapper[4909]: E0202 10:32:21.016177 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.066624 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.066670 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.066684 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.066701 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.066713 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:21Z","lastTransitionTime":"2026-02-02T10:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.134428 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f457793-f4e0-4417-ae91-4455722372c1-metrics-certs\") pod \"network-metrics-daemon-2v5vw\" (UID: \"0f457793-f4e0-4417-ae91-4455722372c1\") " pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:21 crc kubenswrapper[4909]: E0202 10:32:21.134572 4909 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:32:21 crc kubenswrapper[4909]: E0202 10:32:21.134616 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f457793-f4e0-4417-ae91-4455722372c1-metrics-certs podName:0f457793-f4e0-4417-ae91-4455722372c1 nodeName:}" failed. No retries permitted until 2026-02-02 10:32:53.134602778 +0000 UTC m=+98.880703513 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f457793-f4e0-4417-ae91-4455722372c1-metrics-certs") pod "network-metrics-daemon-2v5vw" (UID: "0f457793-f4e0-4417-ae91-4455722372c1") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.169713 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.169753 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.169761 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.169777 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.169787 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:21Z","lastTransitionTime":"2026-02-02T10:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.271563 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.271607 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.271619 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.271633 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.271646 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:21Z","lastTransitionTime":"2026-02-02T10:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.373910 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.373954 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.373967 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.373983 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.373994 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:21Z","lastTransitionTime":"2026-02-02T10:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.476244 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.476276 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.476286 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.476299 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.476309 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:21Z","lastTransitionTime":"2026-02-02T10:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.578858 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.578898 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.578912 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.578928 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.578940 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:21Z","lastTransitionTime":"2026-02-02T10:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.682172 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.682208 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.682220 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.682240 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.682252 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:21Z","lastTransitionTime":"2026-02-02T10:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.784348 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.784370 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.784378 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.784390 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.784398 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:21Z","lastTransitionTime":"2026-02-02T10:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.885870 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.885919 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.885927 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.885944 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.885955 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:21Z","lastTransitionTime":"2026-02-02T10:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.988261 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.988295 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.988304 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.988318 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:21 crc kubenswrapper[4909]: I0202 10:32:21.988327 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:21Z","lastTransitionTime":"2026-02-02T10:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.006951 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 11:07:49.517145537 +0000 UTC Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.090164 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.090199 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.090210 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.090227 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.090238 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:22Z","lastTransitionTime":"2026-02-02T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.192821 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.192849 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.192858 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.192872 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.192882 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:22Z","lastTransitionTime":"2026-02-02T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.295136 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.295221 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.295244 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.295271 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.295290 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:22Z","lastTransitionTime":"2026-02-02T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:22 crc kubenswrapper[4909]: E0202 10:32:22.309437 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.317540 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.317592 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.317614 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.317631 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.317643 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:22Z","lastTransitionTime":"2026-02-02T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:22 crc kubenswrapper[4909]: E0202 10:32:22.330610 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.334080 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.334105 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.334112 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.334125 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.334134 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:22Z","lastTransitionTime":"2026-02-02T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:22 crc kubenswrapper[4909]: E0202 10:32:22.346664 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.350110 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.350149 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.350166 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.350183 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.350195 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:22Z","lastTransitionTime":"2026-02-02T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:22 crc kubenswrapper[4909]: E0202 10:32:22.360569 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.363580 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.363608 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.363617 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.363631 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.363640 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:22Z","lastTransitionTime":"2026-02-02T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:22 crc kubenswrapper[4909]: E0202 10:32:22.374259 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:22Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:22 crc kubenswrapper[4909]: E0202 10:32:22.374373 4909 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.375637 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.375710 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.375737 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.375772 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.375782 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:22Z","lastTransitionTime":"2026-02-02T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.478728 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.478768 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.478777 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.478792 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.478803 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:22Z","lastTransitionTime":"2026-02-02T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.581325 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.581361 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.581371 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.581401 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.581409 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:22Z","lastTransitionTime":"2026-02-02T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.684102 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.684157 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.684168 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.684183 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.684194 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:22Z","lastTransitionTime":"2026-02-02T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.785839 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.785881 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.785891 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.785908 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.785921 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:22Z","lastTransitionTime":"2026-02-02T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.887515 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.887548 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.887561 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.887576 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.887586 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:22Z","lastTransitionTime":"2026-02-02T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.989866 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.989915 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.989928 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.989946 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:22 crc kubenswrapper[4909]: I0202 10:32:22.989958 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:22Z","lastTransitionTime":"2026-02-02T10:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.007107 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 09:09:35.159415483 +0000 UTC Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.016478 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:23 crc kubenswrapper[4909]: E0202 10:32:23.016583 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.016653 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.016885 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.017058 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:23 crc kubenswrapper[4909]: E0202 10:32:23.017277 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:23 crc kubenswrapper[4909]: E0202 10:32:23.017214 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:23 crc kubenswrapper[4909]: E0202 10:32:23.017049 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.092156 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.092187 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.092196 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.092210 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.092220 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:23Z","lastTransitionTime":"2026-02-02T10:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.194143 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.194417 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.194619 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.194702 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.194779 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:23Z","lastTransitionTime":"2026-02-02T10:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.297662 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.297694 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.297705 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.297722 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.297733 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:23Z","lastTransitionTime":"2026-02-02T10:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.400389 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.400426 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.400434 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.400448 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.400457 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:23Z","lastTransitionTime":"2026-02-02T10:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.502924 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.502957 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.502966 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.502980 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.502989 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:23Z","lastTransitionTime":"2026-02-02T10:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.605378 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.605601 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.605669 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.605732 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.605799 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:23Z","lastTransitionTime":"2026-02-02T10:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.708496 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.708530 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.708538 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.708550 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.708558 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:23Z","lastTransitionTime":"2026-02-02T10:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.810596 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.810645 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.810655 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.810674 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.810683 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:23Z","lastTransitionTime":"2026-02-02T10:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.913050 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.913098 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.913113 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.913132 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:23 crc kubenswrapper[4909]: I0202 10:32:23.913145 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:23Z","lastTransitionTime":"2026-02-02T10:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.007858 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 18:31:19.389641207 +0000 UTC Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.019634 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.019683 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.019695 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.019710 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.019721 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:24Z","lastTransitionTime":"2026-02-02T10:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.121829 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.121870 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.121882 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.121899 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.121911 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:24Z","lastTransitionTime":"2026-02-02T10:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.224165 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.224444 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.224513 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.224577 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.224639 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:24Z","lastTransitionTime":"2026-02-02T10:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.327121 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.327163 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.327172 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.327187 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.327197 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:24Z","lastTransitionTime":"2026-02-02T10:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.429192 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.429226 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.429235 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.429250 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.429260 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:24Z","lastTransitionTime":"2026-02-02T10:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.531845 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.531891 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.531902 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.531919 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.531930 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:24Z","lastTransitionTime":"2026-02-02T10:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.633600 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.633632 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.633640 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.633652 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.633661 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:24Z","lastTransitionTime":"2026-02-02T10:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.735429 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.735470 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.735481 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.735499 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.735510 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:24Z","lastTransitionTime":"2026-02-02T10:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.837190 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.837229 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.837239 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.837255 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.837267 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:24Z","lastTransitionTime":"2026-02-02T10:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.939892 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.939927 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.939946 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.939965 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:24 crc kubenswrapper[4909]: I0202 10:32:24.939977 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:24Z","lastTransitionTime":"2026-02-02T10:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.008763 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 21:48:25.791361696 +0000 UTC Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.016073 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.016204 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.016221 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.016434 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:25 crc kubenswrapper[4909]: E0202 10:32:25.016442 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:32:25 crc kubenswrapper[4909]: E0202 10:32:25.016522 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:25 crc kubenswrapper[4909]: E0202 10:32:25.016555 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:25 crc kubenswrapper[4909]: E0202 10:32:25.016656 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.036066 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb
68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\
\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.042373 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.042402 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.042410 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.042425 4909 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.042435 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:25Z","lastTransitionTime":"2026-02-02T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.048690 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.059871 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.072280 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeecae276ae8ad277f91928e326e0d0c165cffccbe19127c37881e72b694b4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef336
6b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.081980 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71855549-8a60-4feb-bc35-677f5606d9c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dbf1daea0619549e3956c4aaea233cc63e03e43690110003f1c3eee9f61fa07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e198de5cf24f9e12e21a62f987167eb62dbb00c2b44eca84d46d23ebac89ef72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://914344068a69d53d58993ab87cb55aec7988a2005eac19a9cb695b23fc428220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a006a78dd9fb665fafd3fe22
a317f03bc40561a9cac9152cda8dbd0dfd9e63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a006a78dd9fb665fafd3fe22a317f03bc40561a9cac9152cda8dbd0dfd9e63e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.093829 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.103177 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.111023 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.120371 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ccs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de56cfec-f410-4c75-b58b-3f82cdc1c603\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1e2ee3f45e60a381848b4131a6f150071339ab9f36e7ee2b7279cb626509a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wh52q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ccs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.129907 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.140606 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\"
:\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ef42c996e51c799b8d1a2ab415c176e6cad28f2363892ec7d72678faaf5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiser
ver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 
10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.144131 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.144184 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.144198 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.144215 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.144226 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:25Z","lastTransitionTime":"2026-02-02T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.151324 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.161919 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.172469 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.182159 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.200133 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6627db959538eb07584bdf91828172567965a95f6b3d025163ca03b390f8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e6627db959538eb07584bdf91828172567965a95f6b3d025163ca03b390f8d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"message\\\":\\\":311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:03.798010 6635 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:03.798185 6635 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0202 10:32:03.798470 6635 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 10:32:03.798832 6635 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:32:03.798875 6635 factory.go:656] Stopping watch factory\\\\nI0202 10:32:03.798892 6635 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:32:03.827138 6635 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0202 10:32:03.827168 6635 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0202 10:32:03.827227 6635 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10:32:03.827249 6635 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0202 10:32:03.827324 6635 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-775zr_openshift-ovn-kubernetes(ca5084bc-8bd1-4964-9a52-384222fc8374)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9e
b2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.217161 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47146704-6957-41b2-ae8b-866b5fb08c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd562b5e61d91cbf79d43ade2df052504ba7682eac59fe8f6636146537372e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34f2669f75956a8f492a92214254f6614fa8a
0da8feb9ecc763e776e16315b31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.226477 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v5vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f457793-f4e0-4417-ae91-4455722372c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v5vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc 
kubenswrapper[4909]: I0202 10:32:25.245881 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.245910 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.245919 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.245936 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.245945 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:25Z","lastTransitionTime":"2026-02-02T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.347797 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.347857 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.347867 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.347881 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.347890 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:25Z","lastTransitionTime":"2026-02-02T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.417025 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qnbvb_bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af/kube-multus/0.log" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.417077 4909 generic.go:334] "Generic (PLEG): container finished" podID="bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af" containerID="6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb" exitCode=1 Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.417108 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qnbvb" event={"ID":"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af","Type":"ContainerDied","Data":"6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb"} Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.417442 4909 scope.go:117] "RemoveContainer" containerID="6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.428968 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.442513 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.450034 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 
10:32:25.450076 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.450088 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.450104 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.450115 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:25Z","lastTransitionTime":"2026-02-02T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.455409 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"message\\\":\\\"2026-02-02T10:31:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2bc5e691-b4e6-49f0-a3d2-5c18a26f8b9b\\\\n2026-02-02T10:31:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2bc5e691-b4e6-49f0-a3d2-5c18a26f8b9b to /host/opt/cni/bin/\\\\n2026-02-02T10:31:39Z [verbose] multus-daemon started\\\\n2026-02-02T10:31:39Z [verbose] Readiness Indicator file check\\\\n2026-02-02T10:32:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.472896 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6627db959538eb07584bdf91828172567965a95f6b3d025163ca03b390f8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e6627db959538eb07584bdf91828172567965a95f6b3d025163ca03b390f8d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"message\\\":\\\":311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:03.798010 6635 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:03.798185 6635 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0202 10:32:03.798470 6635 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 10:32:03.798832 6635 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:32:03.798875 6635 factory.go:656] Stopping watch factory\\\\nI0202 10:32:03.798892 6635 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:32:03.827138 6635 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0202 10:32:03.827168 6635 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0202 10:32:03.827227 6635 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10:32:03.827249 6635 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0202 10:32:03.827324 6635 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-775zr_openshift-ovn-kubernetes(ca5084bc-8bd1-4964-9a52-384222fc8374)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9e
b2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.483504 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47146704-6957-41b2-ae8b-866b5fb08c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd562b5e61d91cbf79d43ade2df052504ba7682eac59fe8f6636146537372e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34f2669f75956a8f492a92214254f6614fa8a
0da8feb9ecc763e776e16315b31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.493589 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v5vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f457793-f4e0-4417-ae91-4455722372c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v5vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc 
kubenswrapper[4909]: I0202 10:32:25.506450 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b05
4909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ef42c996e51c799b8d1a2ab415c176e6cad28f2363892ec7d72678faaf5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 
10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.516588 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.530021 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.545649 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeecae276ae8ad277f91928e326e0d0c165cffccbe19127c37881e72b694b4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef336
6b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.552138 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.552194 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.552205 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.552223 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.552234 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:25Z","lastTransitionTime":"2026-02-02T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.565184 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.576593 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.588236 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.598571 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.609212 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ccs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de56cfec-f410-4c75-b58b-3f82cdc1c603\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1e2ee3f45e60a381848b4131a6f150071339ab9f36e7ee2b7279cb626509a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wh52q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ccs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.621122 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71855549-8a60-4feb-bc35-677f5606d9c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dbf1daea0619549e3956c4aaea233cc63e03e43690110003f1c3eee9f61fa07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e198de5cf24f9e12e21a62f987167eb62dbb00c2b44eca84d46d23ebac89ef72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://914344068a69d53d58993ab87cb55aec7988a2005eac19a9cb695b23fc428220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a006a78dd9fb665fafd3fe22a317f03bc40561a9cac9152cda8dbd0dfd9e63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1a006a78dd9fb665fafd3fe22a317f03bc40561a9cac9152cda8dbd0dfd9e63e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.635993 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.647919 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:25Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.654599 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.654646 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.654658 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.654682 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.654697 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:25Z","lastTransitionTime":"2026-02-02T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.756281 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.756312 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.756320 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.756332 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.756340 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:25Z","lastTransitionTime":"2026-02-02T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.858884 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.858933 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.858943 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.858964 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.858977 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:25Z","lastTransitionTime":"2026-02-02T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.961482 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.961527 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.961537 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.961559 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:25 crc kubenswrapper[4909]: I0202 10:32:25.961569 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:25Z","lastTransitionTime":"2026-02-02T10:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.009133 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 15:59:27.890806053 +0000 UTC Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.064030 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.064087 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.064099 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.064116 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.064126 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:26Z","lastTransitionTime":"2026-02-02T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.166541 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.166591 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.166602 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.166621 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.166631 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:26Z","lastTransitionTime":"2026-02-02T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.269541 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.269626 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.269637 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.269653 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.269663 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:26Z","lastTransitionTime":"2026-02-02T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.372376 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.372415 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.372425 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.372442 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.372456 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:26Z","lastTransitionTime":"2026-02-02T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.421673 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qnbvb_bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af/kube-multus/0.log" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.421725 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qnbvb" event={"ID":"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af","Type":"ContainerStarted","Data":"8fcd8c7992c7e8e9185f32fb8196e0709891ff501292ae2ae4dd5b85d017fa76"} Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.434113 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.446194 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.457359 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.469886 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fcd8c7992c7e8e9185f32fb8196e0709891ff501292ae2ae4dd5b85d017fa76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"message\\\":\\\"2026-02-02T10:31:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2bc5e691-b4e6-49f0-a3d2-5c18a26f8b9b\\\\n2026-02-02T10:31:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2bc5e691-b4e6-49f0-a3d2-5c18a26f8b9b to /host/opt/cni/bin/\\\\n2026-02-02T10:31:39Z [verbose] multus-daemon started\\\\n2026-02-02T10:31:39Z [verbose] 
Readiness Indicator file check\\\\n2026-02-02T10:32:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.474360 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.474402 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.474415 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.474432 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.474441 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:26Z","lastTransitionTime":"2026-02-02T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.489047 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e6627db959538eb07584bdf91828172567965a95f6b3d025163ca03b390f8d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e6627db959538eb07584bdf91828172567965a95f6b3d025163ca03b390f8d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"message\\\":\\\":311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:03.798010 6635 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:03.798185 6635 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0202 10:32:03.798470 6635 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 10:32:03.798832 6635 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:32:03.798875 6635 factory.go:656] Stopping watch factory\\\\nI0202 10:32:03.798892 6635 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:32:03.827138 6635 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0202 10:32:03.827168 6635 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0202 10:32:03.827227 6635 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10:32:03.827249 6635 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0202 10:32:03.827324 6635 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-775zr_openshift-ovn-kubernetes(ca5084bc-8bd1-4964-9a52-384222fc8374)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9e
b2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.501179 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47146704-6957-41b2-ae8b-866b5fb08c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd562b5e61d91cbf79d43ade2df052504ba7682eac59fe8f6636146537372e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34f2669f75956a8f492a92214254f6614fa8a
0da8feb9ecc763e776e16315b31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.511995 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v5vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f457793-f4e0-4417-ae91-4455722372c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v5vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc 
kubenswrapper[4909]: I0202 10:32:26.527056 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b05
4909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ef42c996e51c799b8d1a2ab415c176e6cad28f2363892ec7d72678faaf5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 
10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.538362 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.548408 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.561471 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeecae276ae8ad277f91928e326e0d0c165cffccbe19127c37881e72b694b4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef336
6b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.577254 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.577297 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.577308 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.577325 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.577335 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:26Z","lastTransitionTime":"2026-02-02T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.579069 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.592036 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.604744 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.614846 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.626168 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ccs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de56cfec-f410-4c75-b58b-3f82cdc1c603\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1e2ee3f45e60a381848b4131a6f150071339ab9f36e7ee2b7279cb626509a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wh52q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ccs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.636912 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71855549-8a60-4feb-bc35-677f5606d9c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dbf1daea0619549e3956c4aaea233cc63e03e43690110003f1c3eee9f61fa07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e198de5cf24f9e12e21a62f987167eb62dbb00c2b44eca84d46d23ebac89ef72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://914344068a69d53d58993ab87cb55aec7988a2005eac19a9cb695b23fc428220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a006a78dd9fb665fafd3fe22a317f03bc40561a9cac9152cda8dbd0dfd9e63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1a006a78dd9fb665fafd3fe22a317f03bc40561a9cac9152cda8dbd0dfd9e63e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.647015 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T10:32:26Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.679723 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.679784 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.679795 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.679824 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.679838 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:26Z","lastTransitionTime":"2026-02-02T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.781669 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.781738 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.781751 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.781768 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.781801 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:26Z","lastTransitionTime":"2026-02-02T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.884648 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.884698 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.884744 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.884761 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.884772 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:26Z","lastTransitionTime":"2026-02-02T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.987425 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.987462 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.987470 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.987483 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:26 crc kubenswrapper[4909]: I0202 10:32:26.987495 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:26Z","lastTransitionTime":"2026-02-02T10:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.009911 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 08:28:59.684079709 +0000 UTC Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.016282 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:27 crc kubenswrapper[4909]: E0202 10:32:27.016417 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.016446 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.016504 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.016297 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:27 crc kubenswrapper[4909]: E0202 10:32:27.016547 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:27 crc kubenswrapper[4909]: E0202 10:32:27.016610 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:27 crc kubenswrapper[4909]: E0202 10:32:27.016686 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.089445 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.089485 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.089496 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.089512 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.089524 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:27Z","lastTransitionTime":"2026-02-02T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.191498 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.191525 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.191532 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.191545 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.191555 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:27Z","lastTransitionTime":"2026-02-02T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.293975 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.294011 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.294021 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.294036 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.294045 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:27Z","lastTransitionTime":"2026-02-02T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.395767 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.395816 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.395826 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.395840 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.395849 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:27Z","lastTransitionTime":"2026-02-02T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.498003 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.498255 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.498329 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.498399 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.498455 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:27Z","lastTransitionTime":"2026-02-02T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.601004 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.601229 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.601302 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.601364 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.601432 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:27Z","lastTransitionTime":"2026-02-02T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.703850 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.703901 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.703910 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.703923 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.703932 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:27Z","lastTransitionTime":"2026-02-02T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.806790 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.806844 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.806857 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.806871 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.806882 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:27Z","lastTransitionTime":"2026-02-02T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.909669 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.909708 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.909721 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.909737 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:27 crc kubenswrapper[4909]: I0202 10:32:27.909746 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:27Z","lastTransitionTime":"2026-02-02T10:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.010973 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 02:53:29.987899448 +0000 UTC Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.012903 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.012941 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.012953 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.012969 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.013014 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:28Z","lastTransitionTime":"2026-02-02T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.115219 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.115256 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.115268 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.115283 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.115295 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:28Z","lastTransitionTime":"2026-02-02T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.218016 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.218058 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.218069 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.218086 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.218098 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:28Z","lastTransitionTime":"2026-02-02T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.320544 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.320573 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.320581 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.320595 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.320613 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:28Z","lastTransitionTime":"2026-02-02T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.423088 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.423134 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.423142 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.423156 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.423166 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:28Z","lastTransitionTime":"2026-02-02T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.525056 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.525095 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.525106 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.525150 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.525159 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:28Z","lastTransitionTime":"2026-02-02T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.627835 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.627892 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.627904 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.627922 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.627933 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:28Z","lastTransitionTime":"2026-02-02T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.730007 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.730047 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.730060 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.730077 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.730089 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:28Z","lastTransitionTime":"2026-02-02T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.832270 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.832313 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.832326 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.832342 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.832357 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:28Z","lastTransitionTime":"2026-02-02T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.934653 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.934691 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.934700 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.934713 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:28 crc kubenswrapper[4909]: I0202 10:32:28.934721 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:28Z","lastTransitionTime":"2026-02-02T10:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.011466 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 21:37:22.070495096 +0000 UTC Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.015738 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:29 crc kubenswrapper[4909]: E0202 10:32:29.015865 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.015893 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.015905 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.015931 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:29 crc kubenswrapper[4909]: E0202 10:32:29.015971 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:29 crc kubenswrapper[4909]: E0202 10:32:29.016049 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:29 crc kubenswrapper[4909]: E0202 10:32:29.016108 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.036562 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.036599 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.036608 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.036622 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.036632 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:29Z","lastTransitionTime":"2026-02-02T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.139085 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.139140 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.139152 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.139168 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.139179 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:29Z","lastTransitionTime":"2026-02-02T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.241517 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.241550 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.241560 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.241579 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.241590 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:29Z","lastTransitionTime":"2026-02-02T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.343897 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.343934 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.343945 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.343961 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.343972 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:29Z","lastTransitionTime":"2026-02-02T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.449468 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.449997 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.450019 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.450037 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.450050 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:29Z","lastTransitionTime":"2026-02-02T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.552534 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.552827 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.552935 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.553032 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.553103 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:29Z","lastTransitionTime":"2026-02-02T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.655772 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.655831 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.655843 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.655857 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.655867 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:29Z","lastTransitionTime":"2026-02-02T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.758721 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.758762 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.758773 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.758871 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.758895 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:29Z","lastTransitionTime":"2026-02-02T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.860984 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.861023 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.861031 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.861049 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.861060 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:29Z","lastTransitionTime":"2026-02-02T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.963116 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.963150 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.963162 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.963178 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:29 crc kubenswrapper[4909]: I0202 10:32:29.963188 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:29Z","lastTransitionTime":"2026-02-02T10:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.011576 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 22:29:13.337094417 +0000 UTC Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.065089 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.065433 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.065556 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.065701 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.065794 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:30Z","lastTransitionTime":"2026-02-02T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.168483 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.168543 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.168557 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.168572 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.168585 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:30Z","lastTransitionTime":"2026-02-02T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.270657 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.270697 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.270706 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.270722 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.270731 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:30Z","lastTransitionTime":"2026-02-02T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.373324 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.373372 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.373383 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.373399 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.373408 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:30Z","lastTransitionTime":"2026-02-02T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.475636 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.475673 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.475682 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.475696 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.475706 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:30Z","lastTransitionTime":"2026-02-02T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.578418 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.578498 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.578514 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.578615 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.578632 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:30Z","lastTransitionTime":"2026-02-02T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.680582 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.680630 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.680641 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.680660 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.680671 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:30Z","lastTransitionTime":"2026-02-02T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.783186 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.783223 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.783232 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.783265 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.783274 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:30Z","lastTransitionTime":"2026-02-02T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.885541 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.885598 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.885610 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.885626 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.885638 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:30Z","lastTransitionTime":"2026-02-02T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.988289 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.988335 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.988345 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.988361 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:30 crc kubenswrapper[4909]: I0202 10:32:30.988372 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:30Z","lastTransitionTime":"2026-02-02T10:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.012233 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 19:32:10.034450352 +0000 UTC Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.015573 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.015609 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.015689 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:31 crc kubenswrapper[4909]: E0202 10:32:31.015878 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.015950 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:31 crc kubenswrapper[4909]: E0202 10:32:31.015992 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:31 crc kubenswrapper[4909]: E0202 10:32:31.016092 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:31 crc kubenswrapper[4909]: E0202 10:32:31.016185 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.091479 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.091536 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.091548 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.091571 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.091585 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:31Z","lastTransitionTime":"2026-02-02T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.193592 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.193632 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.193642 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.193655 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.193665 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:31Z","lastTransitionTime":"2026-02-02T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.296887 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.296930 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.296938 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.296956 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.296965 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:31Z","lastTransitionTime":"2026-02-02T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.399667 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.399737 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.399749 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.399778 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.399796 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:31Z","lastTransitionTime":"2026-02-02T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.503435 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.503504 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.503519 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.503549 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.503581 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:31Z","lastTransitionTime":"2026-02-02T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.606730 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.606791 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.606823 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.606844 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.606857 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:31Z","lastTransitionTime":"2026-02-02T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.709955 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.710311 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.710361 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.710387 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.710403 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:31Z","lastTransitionTime":"2026-02-02T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.813093 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.813151 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.813164 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.813184 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.813197 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:31Z","lastTransitionTime":"2026-02-02T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.916220 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.916270 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.916279 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.916293 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:31 crc kubenswrapper[4909]: I0202 10:32:31.916301 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:31Z","lastTransitionTime":"2026-02-02T10:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.012775 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 10:25:21.655719306 +0000 UTC Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.018012 4909 scope.go:117] "RemoveContainer" containerID="7e6627db959538eb07584bdf91828172567965a95f6b3d025163ca03b390f8d0" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.019870 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.019939 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.019954 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.019973 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.020017 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:32Z","lastTransitionTime":"2026-02-02T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.124025 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.124078 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.124088 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.124106 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.124117 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:32Z","lastTransitionTime":"2026-02-02T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.227203 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.227247 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.227256 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.227272 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.227282 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:32Z","lastTransitionTime":"2026-02-02T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.330092 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.330142 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.330150 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.330163 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.330178 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:32Z","lastTransitionTime":"2026-02-02T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.432607 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.432642 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.432649 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.432663 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.432672 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:32Z","lastTransitionTime":"2026-02-02T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.441158 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-775zr_ca5084bc-8bd1-4964-9a52-384222fc8374/ovnkube-controller/2.log" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.447497 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" event={"ID":"ca5084bc-8bd1-4964-9a52-384222fc8374","Type":"ContainerStarted","Data":"f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f"} Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.448000 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.460855 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.471956 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.483968 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fcd8c7992c7e8e9185f32fb8196e0709891ff501292ae2ae4dd5b85d017fa76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"message\\\":\\\"2026-02-02T10:31:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2bc5e691-b4e6-49f0-a3d2-5c18a26f8b9b\\\\n2026-02-02T10:31:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2bc5e691-b4e6-49f0-a3d2-5c18a26f8b9b to /host/opt/cni/bin/\\\\n2026-02-02T10:31:39Z [verbose] multus-daemon started\\\\n2026-02-02T10:31:39Z [verbose] 
Readiness Indicator file check\\\\n2026-02-02T10:32:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.501495 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e6627db959538eb07584bdf91828172567965a95f6b3d025163ca03b390f8d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"message\\\":\\\":311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:03.798010 6635 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:03.798185 6635 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0202 10:32:03.798470 6635 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 10:32:03.798832 6635 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:32:03.798875 6635 factory.go:656] Stopping watch factory\\\\nI0202 10:32:03.798892 6635 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:32:03.827138 6635 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0202 10:32:03.827168 6635 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0202 10:32:03.827227 6635 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10:32:03.827249 6635 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0202 10:32:03.827324 6635 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.512624 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47146704-6957-41b2-ae8b-866b5fb08c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd562b5e61d91cbf79d43ade2df052504ba7682eac59fe8f6636146537372e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34f2669f75956a8f492a92214254f6614fa8a
0da8feb9ecc763e776e16315b31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.524239 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v5vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f457793-f4e0-4417-ae91-4455722372c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v5vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc 
kubenswrapper[4909]: I0202 10:32:32.534542 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.534577 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.534585 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.534600 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.534609 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:32Z","lastTransitionTime":"2026-02-02T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.538066 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ef42c996e51c799b8d1a2ab415c176e6cad28f2363892ec7d72678faaf5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.549869 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.560607 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.573444 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeecae276ae8ad277f91928e326e0d0c165cffccbe19127c37881e72b694b4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef336
6b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.578268 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.578310 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.578321 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.578337 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.578348 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:32Z","lastTransitionTime":"2026-02-02T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:32 crc kubenswrapper[4909]: E0202 10:32:32.590288 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.593546 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.593596 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.593612 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.593630 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.593642 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:32Z","lastTransitionTime":"2026-02-02T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.596781 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4909]: E0202 10:32:32.606451 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.609846 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.609878 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.609892 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.609907 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.609917 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:32Z","lastTransitionTime":"2026-02-02T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.611405 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4909]: E0202 10:32:32.623803 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.626278 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.627346 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.627388 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.627403 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.627421 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.627434 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:32Z","lastTransitionTime":"2026-02-02T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.637065 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4909]: E0202 10:32:32.639292 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.642184 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.642227 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.642236 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.642250 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.642258 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:32Z","lastTransitionTime":"2026-02-02T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.648248 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ccs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de56cfec-f410-4c75-b58b-3f82cdc1c603\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1e2ee3f45e60a381848b4131a6f150071339ab9f36e7ee2b7279cb626509a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wh52q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ccs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4909]: E0202 10:32:32.652930 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4909]: E0202 10:32:32.653032 4909 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.654544 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.654573 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.654583 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.654661 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.654683 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:32Z","lastTransitionTime":"2026-02-02T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.681006 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71855549-8a60-4feb-bc35-677f5606d9c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dbf1daea0619549e3956c4aaea233cc63e03e43690110003f1c3eee9f61fa07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e198de5cf24f9e12e21a62f987167e
b62dbb00c2b44eca84d46d23ebac89ef72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://914344068a69d53d58993ab87cb55aec7988a2005eac19a9cb695b23fc428220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a006a78dd9fb665fafd3fe22a317f03bc40561a9cac9152cda8dbd0dfd9e63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a006a78dd9fb665fafd3fe22a317f03bc40561a9cac9152cda8dbd0dfd9e63e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.692834 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.708537 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:32Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.756309 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.756368 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.756383 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.756398 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.756406 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:32Z","lastTransitionTime":"2026-02-02T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.860066 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.860721 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.860744 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.860761 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.860772 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:32Z","lastTransitionTime":"2026-02-02T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.962845 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.962893 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.962904 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.962920 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:32 crc kubenswrapper[4909]: I0202 10:32:32.962930 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:32Z","lastTransitionTime":"2026-02-02T10:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.013105 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 18:15:35.692381286 +0000 UTC Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.016508 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.016582 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.016536 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.016536 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:33 crc kubenswrapper[4909]: E0202 10:32:33.016697 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:33 crc kubenswrapper[4909]: E0202 10:32:33.016942 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:33 crc kubenswrapper[4909]: E0202 10:32:33.016958 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:33 crc kubenswrapper[4909]: E0202 10:32:33.017018 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.064633 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.064703 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.064715 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.064730 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.064742 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:33Z","lastTransitionTime":"2026-02-02T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.166978 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.167012 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.167022 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.167037 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.167047 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:33Z","lastTransitionTime":"2026-02-02T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.268839 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.268877 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.268887 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.268902 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.268912 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:33Z","lastTransitionTime":"2026-02-02T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.370990 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.371022 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.371031 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.371046 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.371058 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:33Z","lastTransitionTime":"2026-02-02T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.453415 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-775zr_ca5084bc-8bd1-4964-9a52-384222fc8374/ovnkube-controller/3.log" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.454392 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-775zr_ca5084bc-8bd1-4964-9a52-384222fc8374/ovnkube-controller/2.log" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.457948 4909 generic.go:334] "Generic (PLEG): container finished" podID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerID="f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f" exitCode=1 Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.458026 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" event={"ID":"ca5084bc-8bd1-4964-9a52-384222fc8374","Type":"ContainerDied","Data":"f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f"} Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.458160 4909 scope.go:117] "RemoveContainer" containerID="7e6627db959538eb07584bdf91828172567965a95f6b3d025163ca03b390f8d0" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.458683 4909 scope.go:117] "RemoveContainer" containerID="f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f" Feb 02 10:32:33 crc kubenswrapper[4909]: E0202 10:32:33.458982 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-775zr_openshift-ovn-kubernetes(ca5084bc-8bd1-4964-9a52-384222fc8374)\"" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.473418 4909 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.473493 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.473519 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.473550 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.473574 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:33Z","lastTransitionTime":"2026-02-02T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.478192 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.490645 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.500822 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.513629 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeecae276ae8ad277f91928e326e0d0c165cffccbe19127c37881e72b694b4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef336
6b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.523303 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71855549-8a60-4feb-bc35-677f5606d9c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dbf1daea0619549e3956c4aaea233cc63e03e43690110003f1c3eee9f61fa07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e198de5cf24f9e12e21a62f987167eb62dbb00c2b44eca84d46d23ebac89ef72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://914344068a69d53d58993ab87cb55aec7988a2005eac19a9cb695b23fc428220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a006a78dd9fb665fafd3fe22
a317f03bc40561a9cac9152cda8dbd0dfd9e63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a006a78dd9fb665fafd3fe22a317f03bc40561a9cac9152cda8dbd0dfd9e63e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.532889 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.542544 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.550593 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.558946 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ccs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de56cfec-f410-4c75-b58b-3f82cdc1c603\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1e2ee3f45e60a381848b4131a6f150071339ab9f36e7ee2b7279cb626509a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wh52q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ccs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.569729 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.576595 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.576636 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.576647 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.576663 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.576676 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:33Z","lastTransitionTime":"2026-02-02T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.587568 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e6627db959538eb07584bdf91828172567965a95f6b3d025163ca03b390f8d0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:03Z\\\",\\\"message\\\":\\\":311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:03.798010 6635 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:03.798185 6635 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0202 10:32:03.798470 6635 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 10:32:03.798832 6635 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:32:03.798875 6635 factory.go:656] Stopping watch factory\\\\nI0202 10:32:03.798892 6635 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:32:03.827138 6635 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0202 10:32:03.827168 6635 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0202 10:32:03.827227 6635 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10:32:03.827249 6635 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0202 10:32:03.827324 6635 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\"rmers/externalversions/factory.go:140\\\\nI0202 10:32:32.762440 7036 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 10:32:32.762712 7036 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:32.762862 7036 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy 
(0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:32.762950 7036 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 10:32:32.763404 7036 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:32:32.763462 7036 factory.go:656] Stopping watch factory\\\\nI0202 10:32:32.763476 7036 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:32:32.810509 7036 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0202 10:32:32.810541 7036 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0202 10:32:32.810617 7036 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10:32:32.810644 7036 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0202 10:32:32.810751 7036 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"m
ountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.599344 4909 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47146704-6957-41b2-ae8b-866b5fb08c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd562b5e61d91cbf79d43ade2df052504ba7682eac59fe8f6636146537372e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-
access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34f2669f75956a8f492a92214254f6614fa8a0da8feb9ecc763e776e16315b31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.610866 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v5vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f457793-f4e0-4417-ae91-4455722372c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v5vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc 
kubenswrapper[4909]: I0202 10:32:33.624833 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b05
4909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ef42c996e51c799b8d1a2ab415c176e6cad28f2363892ec7d72678faaf5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 
10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.640509 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.652410 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.663445 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.675450 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fcd8c7992c7e8e9185f32fb8196e0709891ff501292ae2ae4dd5b85d017fa76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"message\\\":\\\"2026-02-02T10:31:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2bc5e691-b4e6-49f0-a3d2-5c18a26f8b9b\\\\n2026-02-02T10:31:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2bc5e691-b4e6-49f0-a3d2-5c18a26f8b9b to /host/opt/cni/bin/\\\\n2026-02-02T10:31:39Z [verbose] multus-daemon started\\\\n2026-02-02T10:31:39Z [verbose] 
Readiness Indicator file check\\\\n2026-02-02T10:32:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:33Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.678971 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.679049 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.679061 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.679078 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.679100 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:33Z","lastTransitionTime":"2026-02-02T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.781457 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.781499 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.781526 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.781541 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.781549 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:33Z","lastTransitionTime":"2026-02-02T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.884229 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.884261 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.884269 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.884282 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.884291 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:33Z","lastTransitionTime":"2026-02-02T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.987529 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.987581 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.987596 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.987617 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:33 crc kubenswrapper[4909]: I0202 10:32:33.987629 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:33Z","lastTransitionTime":"2026-02-02T10:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.014017 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 12:31:05.45236334 +0000 UTC Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.091066 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.091153 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.091169 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.091192 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.091204 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:34Z","lastTransitionTime":"2026-02-02T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.197537 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.197598 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.197613 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.197632 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.197649 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:34Z","lastTransitionTime":"2026-02-02T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.299871 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.299906 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.299917 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.299932 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.299941 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:34Z","lastTransitionTime":"2026-02-02T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.402684 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.403017 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.403080 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.403156 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.403492 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:34Z","lastTransitionTime":"2026-02-02T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.461868 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-775zr_ca5084bc-8bd1-4964-9a52-384222fc8374/ovnkube-controller/3.log" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.466118 4909 scope.go:117] "RemoveContainer" containerID="f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f" Feb 02 10:32:34 crc kubenswrapper[4909]: E0202 10:32:34.466281 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-775zr_openshift-ovn-kubernetes(ca5084bc-8bd1-4964-9a52-384222fc8374)\"" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.477781 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.488356 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.501205 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ccs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de56cfec-f410-4c75-b58b-3f82cdc1c603\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1e2ee3f45e60a381848b4131a6f150071339ab9f36e7ee2b7279cb626509a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wh52q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ccs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.505648 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.505691 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.505702 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.505719 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.505729 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:34Z","lastTransitionTime":"2026-02-02T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.514367 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71855549-8a60-4feb-bc35-677f5606d9c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dbf1daea0619549e3956c4aaea233cc63e03e43690110003f1c3eee9f61fa07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e198de5cf24f9e12e21a62f987167eb62dbb00c2b44eca84d46d23ebac89ef72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://914344068a69d53d58993ab87cb55aec7988a2005eac19a9cb695b23fc428220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a006a78dd9fb665fafd3fe22a317f03bc40561a9cac9152cda8dbd0dfd9e63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a006a78dd9fb665fafd3fe22a317f03bc40561a9cac9152cda8dbd0dfd9e63e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.525378 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.535526 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.547464 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\
\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.557013 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.566976 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fcd8c7992c7e8e9185f32fb8196e0709891ff501292ae2ae4dd5b85d017fa76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"message\\\":\\\"2026-02-02T10:31:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2bc5e691-b4e6-49f0-a3d2-5c18a26f8b9b\\\\n2026-02-02T10:31:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2bc5e691-b4e6-49f0-a3d2-5c18a26f8b9b to /host/opt/cni/bin/\\\\n2026-02-02T10:31:39Z [verbose] multus-daemon started\\\\n2026-02-02T10:31:39Z [verbose] 
Readiness Indicator file check\\\\n2026-02-02T10:32:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.582764 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\"rmers/externalversions/factory.go:140\\\\nI0202 10:32:32.762440 7036 reflector.go:311] 
Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 10:32:32.762712 7036 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:32.762862 7036 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:32.762950 7036 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 10:32:32.763404 7036 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:32:32.763462 7036 factory.go:656] Stopping watch factory\\\\nI0202 10:32:32.763476 7036 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:32:32.810509 7036 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0202 10:32:32.810541 7036 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0202 10:32:32.810617 7036 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10:32:32.810644 7036 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0202 10:32:32.810751 7036 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-775zr_openshift-ovn-kubernetes(ca5084bc-8bd1-4964-9a52-384222fc8374)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9e
b2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.593067 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47146704-6957-41b2-ae8b-866b5fb08c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd562b5e61d91cbf79d43ade2df052504ba7682eac59fe8f6636146537372e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34f2669f75956a8f492a92214254f6614fa8a
0da8feb9ecc763e776e16315b31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.602387 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v5vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f457793-f4e0-4417-ae91-4455722372c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v5vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc 
kubenswrapper[4909]: I0202 10:32:34.608186 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.608448 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.608531 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.608603 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.608679 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:34Z","lastTransitionTime":"2026-02-02T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.615924 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ef42c996e51c799b8d1a2ab415c176e6cad28f2363892ec7d72678faaf5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.627171 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.637694 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.653115 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeecae276ae8ad277f91928e326e0d0c165cffccbe19127c37881e72b694b4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef336
6b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.673153 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a
9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.684410 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:34Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.711377 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.711421 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.711432 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.711450 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.711461 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:34Z","lastTransitionTime":"2026-02-02T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.814240 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.814496 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.814615 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.814715 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.814960 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:34Z","lastTransitionTime":"2026-02-02T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.916798 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.916862 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.916878 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.916902 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:34 crc kubenswrapper[4909]: I0202 10:32:34.916915 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:34Z","lastTransitionTime":"2026-02-02T10:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.015084 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 10:56:09.728032691 +0000 UTC Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.015457 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.015679 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.015989 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:35 crc kubenswrapper[4909]: E0202 10:32:35.015980 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.016014 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:35 crc kubenswrapper[4909]: E0202 10:32:35.016098 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:32:35 crc kubenswrapper[4909]: E0202 10:32:35.016275 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:35 crc kubenswrapper[4909]: E0202 10:32:35.016391 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.018877 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.018938 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.018960 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.018988 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.019007 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:35Z","lastTransitionTime":"2026-02-02T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.120637 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.120680 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.120691 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.120707 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.120719 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:35Z","lastTransitionTime":"2026-02-02T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.198138 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.219527 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.230309 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.230376 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.230390 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.230415 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.230435 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:35Z","lastTransitionTime":"2026-02-02T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.246013 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.259535 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.273180 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fcd8c7992c7e8e9185f32fb8196e0709891ff501292ae2ae4dd5b85d017fa76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"message\\\":\\\"2026-02-02T10:31:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2bc5e691-b4e6-49f0-a3d2-5c18a26f8b9b\\\\n2026-02-02T10:31:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2bc5e691-b4e6-49f0-a3d2-5c18a26f8b9b to /host/opt/cni/bin/\\\\n2026-02-02T10:31:39Z [verbose] multus-daemon started\\\\n2026-02-02T10:31:39Z [verbose] 
Readiness Indicator file check\\\\n2026-02-02T10:32:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.292477 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\"rmers/externalversions/factory.go:140\\\\nI0202 10:32:32.762440 7036 reflector.go:311] 
Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 10:32:32.762712 7036 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:32.762862 7036 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:32.762950 7036 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 10:32:32.763404 7036 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:32:32.763462 7036 factory.go:656] Stopping watch factory\\\\nI0202 10:32:32.763476 7036 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:32:32.810509 7036 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0202 10:32:32.810541 7036 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0202 10:32:32.810617 7036 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10:32:32.810644 7036 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0202 10:32:32.810751 7036 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-775zr_openshift-ovn-kubernetes(ca5084bc-8bd1-4964-9a52-384222fc8374)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9e
b2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.306627 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47146704-6957-41b2-ae8b-866b5fb08c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd562b5e61d91cbf79d43ade2df052504ba7682eac59fe8f6636146537372e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34f2669f75956a8f492a92214254f6614fa8a
0da8feb9ecc763e776e16315b31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.317625 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v5vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f457793-f4e0-4417-ae91-4455722372c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v5vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc 
kubenswrapper[4909]: I0202 10:32:35.329867 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b05
4909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ef42c996e51c799b8d1a2ab415c176e6cad28f2363892ec7d72678faaf5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 
10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.332848 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.332882 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.332895 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.332912 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.332926 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:35Z","lastTransitionTime":"2026-02-02T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.343593 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.357846 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.374998 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeecae276ae8ad277f91928e326e0d0c165cffccbe19127c37881e72b694b4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef336
6b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.405863 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a
9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.423486 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.435203 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.435270 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.435284 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.435309 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.435322 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:35Z","lastTransitionTime":"2026-02-02T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.437755 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.449445 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.460696 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ccs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de56cfec-f410-4c75-b58b-3f82cdc1c603\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1e2ee3f45e60a381848b4131a6f150071339ab9f36e7ee2b7279cb626509a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wh52q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ccs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.476441 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71855549-8a60-4feb-bc35-677f5606d9c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dbf1daea0619549e3956c4aaea233cc63e03e43690110003f1c3eee9f61fa07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e198de5cf24f9e12e21a62f987167eb62dbb00c2b44eca84d46d23ebac89ef72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://914344068a69d53d58993ab87cb55aec7988a2005eac19a9cb695b23fc428220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a006a78dd9fb665fafd3fe22a317f03bc40561a9cac9152cda8dbd0dfd9e63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1a006a78dd9fb665fafd3fe22a317f03bc40561a9cac9152cda8dbd0dfd9e63e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:35Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.538502 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.538585 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.538602 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.538627 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.538641 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:35Z","lastTransitionTime":"2026-02-02T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.641832 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.641947 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.641962 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.641979 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.641989 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:35Z","lastTransitionTime":"2026-02-02T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.744449 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.744501 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.744515 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.744537 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.744553 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:35Z","lastTransitionTime":"2026-02-02T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.847306 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.847355 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.847367 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.847387 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.847400 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:35Z","lastTransitionTime":"2026-02-02T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.949470 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.949540 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.949558 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.949588 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:35 crc kubenswrapper[4909]: I0202 10:32:35.949623 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:35Z","lastTransitionTime":"2026-02-02T10:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.015970 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 18:46:51.042133855 +0000 UTC Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.052385 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.052455 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.052472 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.052493 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.052507 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:36Z","lastTransitionTime":"2026-02-02T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.155525 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.155573 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.155585 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.155604 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.155616 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:36Z","lastTransitionTime":"2026-02-02T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.257406 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.257440 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.257449 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.257463 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.257480 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:36Z","lastTransitionTime":"2026-02-02T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.360296 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.360359 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.360377 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.360400 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.360418 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:36Z","lastTransitionTime":"2026-02-02T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.462901 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.462956 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.462967 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.462984 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.462996 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:36Z","lastTransitionTime":"2026-02-02T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.565374 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.565424 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.565437 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.565455 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.565467 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:36Z","lastTransitionTime":"2026-02-02T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.667985 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.668023 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.668030 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.668047 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.668056 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:36Z","lastTransitionTime":"2026-02-02T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.772219 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.772563 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.772578 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.772598 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.772612 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:36Z","lastTransitionTime":"2026-02-02T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.875385 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.875423 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.875432 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.875445 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.875455 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:36Z","lastTransitionTime":"2026-02-02T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.977764 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.977820 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.977832 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.977849 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:36 crc kubenswrapper[4909]: I0202 10:32:36.977860 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:36Z","lastTransitionTime":"2026-02-02T10:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.015840 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.015866 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.015877 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.015840 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:37 crc kubenswrapper[4909]: E0202 10:32:37.015976 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:37 crc kubenswrapper[4909]: E0202 10:32:37.016041 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:37 crc kubenswrapper[4909]: E0202 10:32:37.016094 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:37 crc kubenswrapper[4909]: E0202 10:32:37.016140 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.016125 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 18:19:29.035146361 +0000 UTC Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.079840 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.079877 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.079890 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.079908 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.079920 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:37Z","lastTransitionTime":"2026-02-02T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.182405 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.182432 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.182440 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.182453 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.182462 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:37Z","lastTransitionTime":"2026-02-02T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.285129 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.285162 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.285171 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.285186 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.285195 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:37Z","lastTransitionTime":"2026-02-02T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.388769 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.388835 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.388855 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.388871 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.388884 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:37Z","lastTransitionTime":"2026-02-02T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.491710 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.491758 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.491770 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.491868 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.491896 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:37Z","lastTransitionTime":"2026-02-02T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.594232 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.594271 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.594281 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.594300 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.594312 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:37Z","lastTransitionTime":"2026-02-02T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.696379 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.696418 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.696429 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.696445 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.696456 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:37Z","lastTransitionTime":"2026-02-02T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.798833 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.798894 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.798912 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.798933 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.798948 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:37Z","lastTransitionTime":"2026-02-02T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.901940 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.901974 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.901985 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.902000 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:37 crc kubenswrapper[4909]: I0202 10:32:37.902010 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:37Z","lastTransitionTime":"2026-02-02T10:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.004015 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.004056 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.004065 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.004080 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.004089 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:38Z","lastTransitionTime":"2026-02-02T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.016462 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 19:00:05.617907534 +0000 UTC Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.027744 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.106087 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.106124 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.106135 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.106149 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.106160 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:38Z","lastTransitionTime":"2026-02-02T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.208637 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.208675 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.208687 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.208705 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.208717 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:38Z","lastTransitionTime":"2026-02-02T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.310466 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.310510 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.310522 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.310536 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.310547 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:38Z","lastTransitionTime":"2026-02-02T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.413721 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.413781 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.413793 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.413840 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.413854 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:38Z","lastTransitionTime":"2026-02-02T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.516019 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.516054 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.516063 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.516077 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.516086 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:38Z","lastTransitionTime":"2026-02-02T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.617779 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.617835 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.617846 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.617860 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.617869 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:38Z","lastTransitionTime":"2026-02-02T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.717723 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:32:38 crc kubenswrapper[4909]: E0202 10:32:38.718004 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-02 10:33:42.717973235 +0000 UTC m=+148.464073980 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.719556 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.719590 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.719599 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.719613 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.719621 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:38Z","lastTransitionTime":"2026-02-02T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.818657 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.818718 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.818764 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.818800 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:38 crc kubenswrapper[4909]: E0202 10:32:38.818838 4909 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:32:38 crc kubenswrapper[4909]: E0202 10:32:38.818890 4909 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:32:38 crc kubenswrapper[4909]: E0202 10:32:38.818954 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:33:42.818933052 +0000 UTC m=+148.565033807 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 02 10:32:38 crc kubenswrapper[4909]: E0202 10:32:38.818955 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:32:38 crc kubenswrapper[4909]: E0202 10:32:38.818982 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-02 10:33:42.818969033 +0000 UTC m=+148.565069788 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 02 10:32:38 crc kubenswrapper[4909]: E0202 10:32:38.818984 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:32:38 crc kubenswrapper[4909]: E0202 10:32:38.819002 4909 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:38 crc kubenswrapper[4909]: E0202 10:32:38.818953 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 02 10:32:38 crc kubenswrapper[4909]: E0202 10:32:38.819047 4909 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 02 10:32:38 crc kubenswrapper[4909]: E0202 10:32:38.819084 4909 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:38 crc kubenswrapper[4909]: E0202 10:32:38.819052 4909 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-02 10:33:42.819033135 +0000 UTC m=+148.565133920 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:38 crc kubenswrapper[4909]: E0202 10:32:38.819133 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-02 10:33:42.819121798 +0000 UTC m=+148.565222593 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.821963 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.822005 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.822016 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.822030 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.822039 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:38Z","lastTransitionTime":"2026-02-02T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.924834 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.924866 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.924876 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.924889 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:38 crc kubenswrapper[4909]: I0202 10:32:38.924897 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:38Z","lastTransitionTime":"2026-02-02T10:32:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.015473 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.015514 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.015527 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:39 crc kubenswrapper[4909]: E0202 10:32:39.015604 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.015680 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:39 crc kubenswrapper[4909]: E0202 10:32:39.015704 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:39 crc kubenswrapper[4909]: E0202 10:32:39.016234 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:39 crc kubenswrapper[4909]: E0202 10:32:39.016383 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.018793 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 07:18:05.917339358 +0000 UTC Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.026395 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.026437 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.026446 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.026459 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.026467 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:39Z","lastTransitionTime":"2026-02-02T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.129187 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.129227 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.129236 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.129252 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.129263 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:39Z","lastTransitionTime":"2026-02-02T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.231347 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.231384 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.231395 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.231410 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.231421 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:39Z","lastTransitionTime":"2026-02-02T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.333741 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.333774 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.333784 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.333800 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.333829 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:39Z","lastTransitionTime":"2026-02-02T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.436015 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.436047 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.436056 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.436071 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.436083 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:39Z","lastTransitionTime":"2026-02-02T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.538668 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.538705 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.538717 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.538734 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.538743 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:39Z","lastTransitionTime":"2026-02-02T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.641144 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.641181 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.641196 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.641213 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.641225 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:39Z","lastTransitionTime":"2026-02-02T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.744374 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.744461 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.744476 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.744500 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.744517 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:39Z","lastTransitionTime":"2026-02-02T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.847031 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.847108 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.847120 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.847142 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.847157 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:39Z","lastTransitionTime":"2026-02-02T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.950489 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.950535 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.950544 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.950560 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:39 crc kubenswrapper[4909]: I0202 10:32:39.950570 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:39Z","lastTransitionTime":"2026-02-02T10:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.019258 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 11:49:26.428067406 +0000 UTC Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.052696 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.052768 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.052780 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.052848 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.052866 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:40Z","lastTransitionTime":"2026-02-02T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.155748 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.155787 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.155821 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.155841 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.155853 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:40Z","lastTransitionTime":"2026-02-02T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.257723 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.257764 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.257774 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.257786 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.257794 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:40Z","lastTransitionTime":"2026-02-02T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.360304 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.360341 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.360350 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.360363 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.360374 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:40Z","lastTransitionTime":"2026-02-02T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.462356 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.462387 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.462395 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.462408 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.462418 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:40Z","lastTransitionTime":"2026-02-02T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.564666 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.564939 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.565068 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.565159 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.565237 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:40Z","lastTransitionTime":"2026-02-02T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.667326 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.667370 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.667382 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.667398 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.667410 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:40Z","lastTransitionTime":"2026-02-02T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.770479 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.770632 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.770656 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.770689 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.770715 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:40Z","lastTransitionTime":"2026-02-02T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.873489 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.873526 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.873536 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.873550 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.873561 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:40Z","lastTransitionTime":"2026-02-02T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.976040 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.976080 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.976091 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.976103 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:40 crc kubenswrapper[4909]: I0202 10:32:40.976114 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:40Z","lastTransitionTime":"2026-02-02T10:32:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.015497 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.015545 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.015573 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:41 crc kubenswrapper[4909]: E0202 10:32:41.015667 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.015687 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:41 crc kubenswrapper[4909]: E0202 10:32:41.015827 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:41 crc kubenswrapper[4909]: E0202 10:32:41.015874 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:32:41 crc kubenswrapper[4909]: E0202 10:32:41.015936 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.020101 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 18:49:25.426136022 +0000 UTC Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.077965 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.078014 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.078022 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.078037 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.078050 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:41Z","lastTransitionTime":"2026-02-02T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.179775 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.180127 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.180139 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.180152 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.180161 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:41Z","lastTransitionTime":"2026-02-02T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.282308 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.282357 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.282368 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.282389 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.282402 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:41Z","lastTransitionTime":"2026-02-02T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.384345 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.384379 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.384388 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.384406 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.384417 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:41Z","lastTransitionTime":"2026-02-02T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.486492 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.486521 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.486529 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.486542 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.486550 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:41Z","lastTransitionTime":"2026-02-02T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.588953 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.589000 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.589016 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.589068 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.589080 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:41Z","lastTransitionTime":"2026-02-02T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.691575 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.691619 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.691627 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.691640 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.691649 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:41Z","lastTransitionTime":"2026-02-02T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.793980 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.794040 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.794048 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.794063 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.794074 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:41Z","lastTransitionTime":"2026-02-02T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.895783 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.895829 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.895840 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.895858 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.895871 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:41Z","lastTransitionTime":"2026-02-02T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.998323 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.998360 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.998370 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.998387 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:41 crc kubenswrapper[4909]: I0202 10:32:41.998396 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:41Z","lastTransitionTime":"2026-02-02T10:32:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.020230 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 07:03:03.542525467 +0000 UTC Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.100606 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.100881 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.100980 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.101082 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.101160 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:42Z","lastTransitionTime":"2026-02-02T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.203919 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.203955 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.203964 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.203980 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.203989 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:42Z","lastTransitionTime":"2026-02-02T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.306093 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.306116 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.306125 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.306138 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.306147 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:42Z","lastTransitionTime":"2026-02-02T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.407951 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.408178 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.408248 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.408359 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.408425 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:42Z","lastTransitionTime":"2026-02-02T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.510797 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.511094 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.511176 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.511249 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.511315 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:42Z","lastTransitionTime":"2026-02-02T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.613969 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.614004 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.614011 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.614025 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.614036 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:42Z","lastTransitionTime":"2026-02-02T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.716079 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.716307 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.716394 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.716458 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.716524 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:42Z","lastTransitionTime":"2026-02-02T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.818898 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.818936 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.818944 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.818959 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.818968 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:42Z","lastTransitionTime":"2026-02-02T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.857347 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.857390 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.857403 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.857420 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.857431 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:42Z","lastTransitionTime":"2026-02-02T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:42 crc kubenswrapper[4909]: E0202 10:32:42.869915 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:42Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.873108 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.873145 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.873156 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.873173 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.873183 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:42Z","lastTransitionTime":"2026-02-02T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:42 crc kubenswrapper[4909]: E0202 10:32:42.884157 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:42Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.886880 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.886910 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.886918 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.886936 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.886946 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:42Z","lastTransitionTime":"2026-02-02T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:42 crc kubenswrapper[4909]: E0202 10:32:42.896411 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:42Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.899292 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.899325 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.899335 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.899348 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.899357 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:42Z","lastTransitionTime":"2026-02-02T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:42 crc kubenswrapper[4909]: E0202 10:32:42.909793 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:42Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.912432 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.912459 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.912468 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.912483 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.912493 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:42Z","lastTransitionTime":"2026-02-02T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:42 crc kubenswrapper[4909]: E0202 10:32:42.922547 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:42Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:42 crc kubenswrapper[4909]: E0202 10:32:42.922651 4909 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.924110 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.924144 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.924155 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.924171 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:42 crc kubenswrapper[4909]: I0202 10:32:42.924183 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:42Z","lastTransitionTime":"2026-02-02T10:32:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.016306 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.016397 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.016688 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.016965 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:43 crc kubenswrapper[4909]: E0202 10:32:43.017075 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:43 crc kubenswrapper[4909]: E0202 10:32:43.016961 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:32:43 crc kubenswrapper[4909]: E0202 10:32:43.017117 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:43 crc kubenswrapper[4909]: E0202 10:32:43.017296 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.021106 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 16:36:41.534515221 +0000 UTC Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.026883 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.026944 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.026967 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.026993 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.027015 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:43Z","lastTransitionTime":"2026-02-02T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.129191 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.129231 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.129244 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.129263 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.129274 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:43Z","lastTransitionTime":"2026-02-02T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.231935 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.232010 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.232023 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.232040 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.232052 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:43Z","lastTransitionTime":"2026-02-02T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.334549 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.334579 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.334588 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.334600 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.334609 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:43Z","lastTransitionTime":"2026-02-02T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.440249 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.440304 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.440316 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.440334 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.440345 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:43Z","lastTransitionTime":"2026-02-02T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.542656 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.542891 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.542988 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.543051 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.543123 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:43Z","lastTransitionTime":"2026-02-02T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.645680 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.645707 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.645715 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.645729 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.645737 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:43Z","lastTransitionTime":"2026-02-02T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.748151 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.748178 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.748187 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.748205 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.748216 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:43Z","lastTransitionTime":"2026-02-02T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.850541 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.850587 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.850598 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.850616 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.850628 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:43Z","lastTransitionTime":"2026-02-02T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.953047 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.953080 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.953088 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.953101 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:43 crc kubenswrapper[4909]: I0202 10:32:43.953110 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:43Z","lastTransitionTime":"2026-02-02T10:32:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.021512 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 15:16:28.245634847 +0000 UTC Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.055304 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.055537 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.055624 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.055703 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.055793 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:44Z","lastTransitionTime":"2026-02-02T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.158783 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.159041 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.159112 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.159213 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.159278 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:44Z","lastTransitionTime":"2026-02-02T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.261745 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.261776 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.261784 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.261796 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.261804 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:44Z","lastTransitionTime":"2026-02-02T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.364317 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.364581 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.364689 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.364765 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.364871 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:44Z","lastTransitionTime":"2026-02-02T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.467397 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.467682 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.467767 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.467925 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.468035 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:44Z","lastTransitionTime":"2026-02-02T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.570775 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.570987 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.571065 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.571162 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.571243 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:44Z","lastTransitionTime":"2026-02-02T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.673642 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.673679 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.673690 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.673730 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.673744 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:44Z","lastTransitionTime":"2026-02-02T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.776389 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.776425 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.776432 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.776446 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.776456 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:44Z","lastTransitionTime":"2026-02-02T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.878699 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.878746 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.878756 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.878769 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.878777 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:44Z","lastTransitionTime":"2026-02-02T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.981422 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.981464 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.981475 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.981492 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:44 crc kubenswrapper[4909]: I0202 10:32:44.981501 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:44Z","lastTransitionTime":"2026-02-02T10:32:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.016014 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.016057 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.016057 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:45 crc kubenswrapper[4909]: E0202 10:32:45.016144 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.016199 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:45 crc kubenswrapper[4909]: E0202 10:32:45.016376 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:45 crc kubenswrapper[4909]: E0202 10:32:45.016397 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:45 crc kubenswrapper[4909]: E0202 10:32:45.016462 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.022601 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 23:50:08.575055622 +0000 UTC Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.026568 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6058abf2-62e8-4b7d-bc24-61a116ace130\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e7c5a7ef3ac7b5bb69e5783d19d53ba17ebe05a10903ba260faa7c15290f5fea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77a45f7f6bf17cb3340a56769fb18deaca777960a0582243179c0151c8782dc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77a45f7f6bf17cb3340a56769fb18deaca777960a0582243179c0151c8782dc3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.036970 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://44d31dca2d78ab1ec99c31ced6e70acef53c3525c51597f3f45f0b50a5d9421c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-02T10:32:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.054280 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ca5084bc-8bd1-4964-9a52-384222fc8374\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:32Z\\\",\\\"message\\\":\\\"rmers/externalversions/factory.go:140\\\\nI0202 10:32:32.762440 7036 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 10:32:32.762712 7036 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:32.762862 7036 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0202 10:32:32.762950 7036 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0202 10:32:32.763404 7036 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0202 10:32:32.763462 7036 factory.go:656] Stopping watch factory\\\\nI0202 10:32:32.763476 7036 handler.go:208] Removed *v1.Node event handler 2\\\\nI0202 10:32:32.810509 7036 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0202 10:32:32.810541 7036 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0202 10:32:32.810617 7036 ovnkube.go:599] Stopped ovnkube\\\\nI0202 10:32:32.810644 7036 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0202 10:32:32.810751 7036 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:32:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-775zr_openshift-ovn-kubernetes(ca5084bc-8bd1-4964-9a52-384222fc8374)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08f1f0ecf873b4da9e
b2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4tkj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-775zr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.064928 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47146704-6957-41b2-ae8b-866b5fb08c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddd562b5e61d91cbf79d43ade2df052504ba7682eac59fe8f6636146537372e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://34f2669f75956a8f492a92214254f6614fa8a
0da8feb9ecc763e776e16315b31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lz5vc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nfrsl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.076474 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-2v5vw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0f457793-f4e0-4417-ae91-4455722372c1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qj9t6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-2v5vw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:45 crc 
kubenswrapper[4909]: I0202 10:32:45.083423 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.083459 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.083470 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.083487 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.083498 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:45Z","lastTransitionTime":"2026-02-02T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.090024 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e9f3ab5-f3f2-495a-8f51-ee432c06f828\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://19ef42c996e51c799b8d1a2ab415c176e6cad28f2363892ec7d72678faaf5630\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"message\\\":\\\"le observer\\\\nW0202 10:31:35.224062 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0202 10:31:35.224171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0202 10:31:35.225011 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1626940810/tls.crt::/tmp/serving-cert-1626940810/tls.key\\\\\\\"\\\\nI0202 10:31:35.481307 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0202 10:31:35.490652 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0202 10:31:35.490685 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0202 10:31:35.490713 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0202 10:31:35.490720 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0202 10:31:35.505681 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0202 10:31:35.505712 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505718 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0202 10:31:35.505724 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0202 10:31:35.505727 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0202 10:31:35.505730 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0202 10:31:35.505733 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0202 10:31:35.505950 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0202 10:31:35.507296 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.103173 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.113525 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd86db6f0426caec5d47a7dd0cb0ee0a556da3b4e3a474a604bbd248202bfe60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://660a693f6e4ce8e7502cf93b2398f8ae3bce9bf452cd9a33721bd81393d1d012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.125202 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.139628 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qnbvb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fcd8c7992c7e8e9185f32fb8196e0709891ff501292ae2ae4dd5b85d017fa76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-02T10:32:24Z\\\",\\\"message\\\":\\\"2026-02-02T10:31:39+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2bc5e691-b4e6-49f0-a3d2-5c18a26f8b9b\\\\n2026-02-02T10:31:39+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2bc5e691-b4e6-49f0-a3d2-5c18a26f8b9b to /host/opt/cni/bin/\\\\n2026-02-02T10:31:39Z [verbose] multus-daemon started\\\\n2026-02-02T10:31:39Z [verbose] 
Readiness Indicator file check\\\\n2026-02-02T10:32:24Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7l8pk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qnbvb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.159131 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c0f1d85-1639-4002-b3ce-331641a03755\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffeb68098be275eb73c9738dcbd02aa9b6cc2e5b1ba44db8d8b34d21b778a05f\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b994006dd6c161d31f6c347a1ed86b333e11192bb62ca10fac40bf1f7b5b69fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f220bf8cbd6a60862323bd48c6394ef7217fe3426da940519bcd9ba4d10298f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a
0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4de0547048169527a9945faa5873668fcf04f255d78e70a7796b22d50a77d37b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1612c8faa08b5dc10996fcfdb5dcb8259dc9c329b23729c080533c1a58fd2b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resour
ce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://915e2141b5558606ef6862267b5913731aba7b49befc2cd683f95278125e6b0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0f532c8aee3910388ac067fc1a8ff3df245cb0c073dcc686d013752e90ddc51d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}}},{\\\"containerID\\\":
\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7536be38661a17325e120fbe89fbadb922427285bc6ca2e92edb3fa38dbc4e2f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.169382 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7cbcebbd-bcfc-4eb7-af88-0d98e72ade33\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d6b1b5d97dc1769fd4c9876a5bd98b74f6e6184d4523689d63c895119b0cd5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c6a5d0e226fc7929d7f082ba52a932b90844c0b31b06d1178aac6cfea0179\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4716281221e0943ad3cc3dc7402bbda8cf6e0f8ec395605f3313c6146fbd7dad\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-02T10:31:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.178076 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7de68b6c-f308-498c-95a3-27c9caf44f4f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://476a195f3edf064ea4da184742a34926c91f757b2a90cc1bec6bd04e2ef3f5af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574e
d97a86684b55ed0855c7f013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dmlrc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ftn2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.185357 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.185398 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.185409 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:45 crc 
kubenswrapper[4909]: I0202 10:32:45.185425 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.185435 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:45Z","lastTransitionTime":"2026-02-02T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.191875 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6t82h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9776f741-d318-4076-b337-a496344f1d2d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eeecae276ae8ad277f91928e326e0d0c165cffccbe19127c37881e72b694b4dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad18d33e4d07d432dd299a39cbcf568b072ca93ca1f541e2f11ed7a420430ceb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9a94020c250f338e043ffa849bcc86f99db9303ca091107a055cbff68683113\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e624e7779b04e74feffb2d6c9e061e7ee1aef532bb901d85f778bdd560a7f05f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef3366b2ff53305804e72b63693765201d2ffb8136d20d276bcfe4e6458b9c2d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://804dab70cad5290685fb97227458ee2f4d6581b7a9ed2d6e15a66df5ed6fa198\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f
0e3d81daeec4243161a66b3eb54a5538d2c89039f09eca465c334674ae503b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhk9b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6t82h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.205617 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71855549-8a60-4feb-bc35-677f5606d9c0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6dbf1daea0619549e3956c4aaea233cc63e03e43690110003f1c3eee9f61fa07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e198de5cf24f9e12e21a62f987167eb62dbb00c2b44eca84d46d23ebac89ef72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://914344068a69d53d58993ab87cb55aec7988a2005eac19a9cb695b23fc428220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a006a78dd9fb665fafd3fe22a317f03bc40561a9cac9152cda8dbd0dfd9e63e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1a006a78dd9fb665fafd3fe22a317f03bc40561a9cac9152cda8dbd0dfd9e63e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-02T10:31:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-02T10:31:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.217834 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:36Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://669086bd04f83ab61b0d81edde20d81f21584b7a4e0d0fa38bdac8fdaa8858c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.229000 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:34Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.237939 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-f49tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7ca81ab-b06b-4e03-879a-fb5546436e54\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://848d854e2e5a0913153f4d56d1d5d022b0381e404eb060571ab49ec6bfd74c9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxwr2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:35Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-f49tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.247494 4909 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-ccs5q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de56cfec-f410-4c75-b58b-3f82cdc1c603\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-02T10:31:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c1e2ee3f45e60a381848b4131a6f150071339ab9f36e7ee2b7279cb626509a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-02T10:31:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wh52q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-02T10:31:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-ccs5q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:45Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.287285 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.287321 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.287332 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.287346 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.287357 4909 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:45Z","lastTransitionTime":"2026-02-02T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.390474 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.390688 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.390791 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.390883 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.390965 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:45Z","lastTransitionTime":"2026-02-02T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.493618 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.493665 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.493675 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.493693 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.493704 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:45Z","lastTransitionTime":"2026-02-02T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.596341 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.596659 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.596878 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.597031 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.597186 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:45Z","lastTransitionTime":"2026-02-02T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.699509 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.699534 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.699542 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.699553 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.699562 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:45Z","lastTransitionTime":"2026-02-02T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.801488 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.801522 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.801555 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.801571 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.801582 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:45Z","lastTransitionTime":"2026-02-02T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.904203 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.904245 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.904257 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.904277 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:45 crc kubenswrapper[4909]: I0202 10:32:45.904291 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:45Z","lastTransitionTime":"2026-02-02T10:32:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.007242 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.007308 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.007327 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.007352 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.007370 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:46Z","lastTransitionTime":"2026-02-02T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.023269 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 14:01:05.639371253 +0000 UTC Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.109833 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.109869 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.109881 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.109900 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.109909 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:46Z","lastTransitionTime":"2026-02-02T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.212404 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.212451 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.212465 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.212482 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.212493 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:46Z","lastTransitionTime":"2026-02-02T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.314481 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.314712 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.314798 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.314954 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.315059 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:46Z","lastTransitionTime":"2026-02-02T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.417754 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.417789 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.417800 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.417828 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.417838 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:46Z","lastTransitionTime":"2026-02-02T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.520377 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.520401 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.520412 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.520428 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.520438 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:46Z","lastTransitionTime":"2026-02-02T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.622454 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.622495 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.622506 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.622521 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.622531 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:46Z","lastTransitionTime":"2026-02-02T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.725416 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.725492 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.725502 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.725516 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.725524 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:46Z","lastTransitionTime":"2026-02-02T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.827555 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.827592 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.827603 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.827619 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.827631 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:46Z","lastTransitionTime":"2026-02-02T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.929486 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.929533 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.929546 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.929562 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:46 crc kubenswrapper[4909]: I0202 10:32:46.929571 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:46Z","lastTransitionTime":"2026-02-02T10:32:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.015966 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.015967 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:47 crc kubenswrapper[4909]: E0202 10:32:47.016089 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.015986 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.015962 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:47 crc kubenswrapper[4909]: E0202 10:32:47.016174 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:47 crc kubenswrapper[4909]: E0202 10:32:47.016303 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:32:47 crc kubenswrapper[4909]: E0202 10:32:47.016383 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.023554 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 12:01:52.893055353 +0000 UTC Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.032047 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.032076 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.032087 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.032101 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.032112 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:47Z","lastTransitionTime":"2026-02-02T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.134930 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.135103 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.135115 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.135128 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.135138 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:47Z","lastTransitionTime":"2026-02-02T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.236771 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.236830 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.236841 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.236866 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.236881 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:47Z","lastTransitionTime":"2026-02-02T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.338680 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.338738 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.338756 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.338778 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.338802 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:47Z","lastTransitionTime":"2026-02-02T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.442101 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.442336 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.442398 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.442474 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.442547 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:47Z","lastTransitionTime":"2026-02-02T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.544458 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.544493 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.544532 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.544550 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.544561 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:47Z","lastTransitionTime":"2026-02-02T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.647131 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.647172 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.647184 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.647201 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.647214 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:47Z","lastTransitionTime":"2026-02-02T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.749872 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.749924 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.749937 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.749954 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.749966 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:47Z","lastTransitionTime":"2026-02-02T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.853096 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.853142 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.853153 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.853169 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.853180 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:47Z","lastTransitionTime":"2026-02-02T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.955463 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.955495 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.955503 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.955516 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:47 crc kubenswrapper[4909]: I0202 10:32:47.955526 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:47Z","lastTransitionTime":"2026-02-02T10:32:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.023884 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 16:50:08.094070439 +0000 UTC Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.057708 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.057738 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.057755 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.057771 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.057782 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:48Z","lastTransitionTime":"2026-02-02T10:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.159472 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.159503 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.159523 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.159539 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.159550 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:48Z","lastTransitionTime":"2026-02-02T10:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.261954 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.261993 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.262004 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.262023 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.262034 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:48Z","lastTransitionTime":"2026-02-02T10:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.364251 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.364367 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.364385 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.364417 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.364435 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:48Z","lastTransitionTime":"2026-02-02T10:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.467741 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.467793 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.467822 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.467840 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.467853 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:48Z","lastTransitionTime":"2026-02-02T10:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.570473 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.570528 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.570537 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.570552 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.570563 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:48Z","lastTransitionTime":"2026-02-02T10:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.674306 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.674364 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.674377 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.674397 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.674413 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:48Z","lastTransitionTime":"2026-02-02T10:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.777906 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.777962 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.777978 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.778003 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.778020 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:48Z","lastTransitionTime":"2026-02-02T10:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.882045 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.882119 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.882145 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.882186 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.882210 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:48Z","lastTransitionTime":"2026-02-02T10:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.984928 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.984969 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.984978 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.984992 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:48 crc kubenswrapper[4909]: I0202 10:32:48.985002 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:48Z","lastTransitionTime":"2026-02-02T10:32:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.016679 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.016713 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.016748 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:49 crc kubenswrapper[4909]: E0202 10:32:49.017197 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.017212 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:49 crc kubenswrapper[4909]: E0202 10:32:49.017422 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:49 crc kubenswrapper[4909]: E0202 10:32:49.017414 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:32:49 crc kubenswrapper[4909]: E0202 10:32:49.017512 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.018399 4909 scope.go:117] "RemoveContainer" containerID="f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f" Feb 02 10:32:49 crc kubenswrapper[4909]: E0202 10:32:49.018596 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-775zr_openshift-ovn-kubernetes(ca5084bc-8bd1-4964-9a52-384222fc8374)\"" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.024389 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 05:51:33.08858574 +0000 UTC Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.087608 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.087640 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.087649 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:49 crc 
kubenswrapper[4909]: I0202 10:32:49.087665 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.087676 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:49Z","lastTransitionTime":"2026-02-02T10:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.192659 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.192716 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.192730 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.192750 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.192772 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:49Z","lastTransitionTime":"2026-02-02T10:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.295456 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.295508 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.295516 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.295535 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.295548 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:49Z","lastTransitionTime":"2026-02-02T10:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.398325 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.398377 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.398390 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.398413 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.398429 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:49Z","lastTransitionTime":"2026-02-02T10:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.503537 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.503711 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.503785 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.503907 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.503983 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:49Z","lastTransitionTime":"2026-02-02T10:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.608401 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.608474 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.608632 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.608668 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.608688 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:49Z","lastTransitionTime":"2026-02-02T10:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.711189 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.711231 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.711254 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.711278 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.711293 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:49Z","lastTransitionTime":"2026-02-02T10:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.812745 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.812777 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.812785 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.812798 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.812823 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:49Z","lastTransitionTime":"2026-02-02T10:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.916132 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.916625 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.916642 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.916670 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:49 crc kubenswrapper[4909]: I0202 10:32:49.916688 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:49Z","lastTransitionTime":"2026-02-02T10:32:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.020249 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.020321 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.020339 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.020371 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.020389 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:50Z","lastTransitionTime":"2026-02-02T10:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.025409 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 06:16:56.643899978 +0000 UTC Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.122995 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.123049 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.123065 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.123085 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.123099 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:50Z","lastTransitionTime":"2026-02-02T10:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.225760 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.225855 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.225876 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.225896 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.225930 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:50Z","lastTransitionTime":"2026-02-02T10:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.328787 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.328869 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.328881 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.328897 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.328911 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:50Z","lastTransitionTime":"2026-02-02T10:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.432138 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.432170 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.432179 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.432197 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.432207 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:50Z","lastTransitionTime":"2026-02-02T10:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.534690 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.534766 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.534787 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.534840 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.534857 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:50Z","lastTransitionTime":"2026-02-02T10:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.638209 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.638264 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.638276 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.638298 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.638311 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:50Z","lastTransitionTime":"2026-02-02T10:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.741798 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.741862 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.741874 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.741890 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.741902 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:50Z","lastTransitionTime":"2026-02-02T10:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.844499 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.844546 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.844557 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.844574 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.844586 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:50Z","lastTransitionTime":"2026-02-02T10:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.947253 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.947308 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.947321 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.947345 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:50 crc kubenswrapper[4909]: I0202 10:32:50.947363 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:50Z","lastTransitionTime":"2026-02-02T10:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.015800 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.015865 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:51 crc kubenswrapper[4909]: E0202 10:32:51.015955 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.015824 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.015842 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:51 crc kubenswrapper[4909]: E0202 10:32:51.016096 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:51 crc kubenswrapper[4909]: E0202 10:32:51.016227 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:51 crc kubenswrapper[4909]: E0202 10:32:51.016315 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.026529 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 10:01:29.86052643 +0000 UTC Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.049448 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.049482 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.049490 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.049503 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.049513 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:51Z","lastTransitionTime":"2026-02-02T10:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.151479 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.151513 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.151523 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.151575 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.151587 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:51Z","lastTransitionTime":"2026-02-02T10:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.254284 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.254356 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.254366 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.254381 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.254392 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:51Z","lastTransitionTime":"2026-02-02T10:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.356559 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.356587 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.356594 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.356607 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.356616 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:51Z","lastTransitionTime":"2026-02-02T10:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.458717 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.458765 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.458780 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.458798 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.458843 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:51Z","lastTransitionTime":"2026-02-02T10:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.561319 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.561571 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.561640 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.561734 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.561842 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:51Z","lastTransitionTime":"2026-02-02T10:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.665077 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.665377 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.665536 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.665616 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.665689 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:51Z","lastTransitionTime":"2026-02-02T10:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.769692 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.769734 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.769746 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.769766 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.769784 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:51Z","lastTransitionTime":"2026-02-02T10:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.871569 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.871611 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.871621 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.871635 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.871644 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:51Z","lastTransitionTime":"2026-02-02T10:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.973853 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.973890 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.973898 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.973911 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:51 crc kubenswrapper[4909]: I0202 10:32:51.973919 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:51Z","lastTransitionTime":"2026-02-02T10:32:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.027621 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 16:45:45.920064615 +0000 UTC Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.075838 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.075865 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.075876 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.075891 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.075903 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:52Z","lastTransitionTime":"2026-02-02T10:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.178176 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.178210 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.178221 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.178237 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.178249 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:52Z","lastTransitionTime":"2026-02-02T10:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.280716 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.280798 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.280876 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.280966 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.281400 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:52Z","lastTransitionTime":"2026-02-02T10:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.384167 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.384203 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.384213 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.384229 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.384241 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:52Z","lastTransitionTime":"2026-02-02T10:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.486416 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.486448 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.486456 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.486468 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.486477 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:52Z","lastTransitionTime":"2026-02-02T10:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.588717 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.588752 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.588762 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.588777 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.588787 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:52Z","lastTransitionTime":"2026-02-02T10:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.691245 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.691294 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.691311 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.691334 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.691350 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:52Z","lastTransitionTime":"2026-02-02T10:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.793534 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.793568 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.793576 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.793589 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.793598 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:52Z","lastTransitionTime":"2026-02-02T10:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.895890 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.895935 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.895944 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.895959 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.895969 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:52Z","lastTransitionTime":"2026-02-02T10:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.998396 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.998440 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.998449 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.998464 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:52 crc kubenswrapper[4909]: I0202 10:32:52.998474 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:52Z","lastTransitionTime":"2026-02-02T10:32:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.015955 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.015992 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.016013 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.016270 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:53 crc kubenswrapper[4909]: E0202 10:32:53.016249 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:53 crc kubenswrapper[4909]: E0202 10:32:53.016373 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:53 crc kubenswrapper[4909]: E0202 10:32:53.016457 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:32:53 crc kubenswrapper[4909]: E0202 10:32:53.016519 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.028739 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 01:24:27.263883727 +0000 UTC Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.100706 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.100751 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.100766 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.100786 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.100800 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:53Z","lastTransitionTime":"2026-02-02T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.164192 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f457793-f4e0-4417-ae91-4455722372c1-metrics-certs\") pod \"network-metrics-daemon-2v5vw\" (UID: \"0f457793-f4e0-4417-ae91-4455722372c1\") " pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:53 crc kubenswrapper[4909]: E0202 10:32:53.164293 4909 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:32:53 crc kubenswrapper[4909]: E0202 10:32:53.164342 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f457793-f4e0-4417-ae91-4455722372c1-metrics-certs podName:0f457793-f4e0-4417-ae91-4455722372c1 nodeName:}" failed. No retries permitted until 2026-02-02 10:33:57.164328875 +0000 UTC m=+162.910429610 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f457793-f4e0-4417-ae91-4455722372c1-metrics-certs") pod "network-metrics-daemon-2v5vw" (UID: "0f457793-f4e0-4417-ae91-4455722372c1") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.202773 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.202825 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.202835 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.202849 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.202858 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:53Z","lastTransitionTime":"2026-02-02T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.217886 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.217914 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.217923 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.217936 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.217945 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:53Z","lastTransitionTime":"2026-02-02T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:53 crc kubenswrapper[4909]: E0202 10:32:53.231842 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.235357 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.235396 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.235407 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.235423 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.235434 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:53Z","lastTransitionTime":"2026-02-02T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:53 crc kubenswrapper[4909]: E0202 10:32:53.247241 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.250486 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.250569 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.250585 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.250603 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.250613 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:53Z","lastTransitionTime":"2026-02-02T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:53 crc kubenswrapper[4909]: E0202 10:32:53.261995 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [... status patch payload identical to previous attempt; duplicate omitted ...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.265635 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.265661 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.265670 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.265682 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.265691 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:53Z","lastTransitionTime":"2026-02-02T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:53 crc kubenswrapper[4909]: E0202 10:32:53.278197 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [... status patch payload identical to previous attempt; duplicate omitted ...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.281888 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.281922 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.281937 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.281956 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.281971 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:53Z","lastTransitionTime":"2026-02-02T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:53 crc kubenswrapper[4909]: E0202 10:32:53.294150 4909 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-02T10:32:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"81efcdbd-597c-45e5-a1b0-0f6832442cdd\\\",\\\"systemUUID\\\":\\\"632b64ae-2264-4464-afbe-4696d2c7d3e4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-02T10:32:53Z is after 2025-08-24T17:21:41Z" Feb 02 10:32:53 crc kubenswrapper[4909]: E0202 10:32:53.294273 4909 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.305081 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.305124 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.305134 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.305149 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.305160 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:53Z","lastTransitionTime":"2026-02-02T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.406948 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.406988 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.406999 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.407012 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.407022 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:53Z","lastTransitionTime":"2026-02-02T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.509398 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.509430 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.509438 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.509450 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.509460 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:53Z","lastTransitionTime":"2026-02-02T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.614845 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.615150 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.615167 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.615182 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.615208 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:53Z","lastTransitionTime":"2026-02-02T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.717478 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.717580 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.717597 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.717619 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.717655 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:53Z","lastTransitionTime":"2026-02-02T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.819631 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.819669 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.819677 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.819694 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.819710 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:53Z","lastTransitionTime":"2026-02-02T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.922080 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.922121 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.922131 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.922146 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:53 crc kubenswrapper[4909]: I0202 10:32:53.922155 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:53Z","lastTransitionTime":"2026-02-02T10:32:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.024757 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.024787 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.024795 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.024826 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.024835 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:54Z","lastTransitionTime":"2026-02-02T10:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.029036 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 11:12:57.059372102 +0000 UTC Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.126956 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.126980 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.126988 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.127001 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.127009 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:54Z","lastTransitionTime":"2026-02-02T10:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.229165 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.229228 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.229239 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.229261 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.229275 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:54Z","lastTransitionTime":"2026-02-02T10:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.331583 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.331618 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.331630 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.331644 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.331655 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:54Z","lastTransitionTime":"2026-02-02T10:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.433740 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.433778 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.433793 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.433827 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.433838 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:54Z","lastTransitionTime":"2026-02-02T10:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.535535 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.535566 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.535576 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.535591 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.535601 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:54Z","lastTransitionTime":"2026-02-02T10:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.637560 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.637602 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.637612 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.637626 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.637637 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:54Z","lastTransitionTime":"2026-02-02T10:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.739951 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.739993 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.740004 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.740021 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.740032 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:54Z","lastTransitionTime":"2026-02-02T10:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.841659 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.841694 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.841704 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.841717 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.841726 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:54Z","lastTransitionTime":"2026-02-02T10:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.943926 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.943998 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.944053 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.944086 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:54 crc kubenswrapper[4909]: I0202 10:32:54.944109 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:54Z","lastTransitionTime":"2026-02-02T10:32:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.016016 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:55 crc kubenswrapper[4909]: E0202 10:32:55.016268 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.016565 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.016645 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.016753 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:55 crc kubenswrapper[4909]: E0202 10:32:55.016914 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:55 crc kubenswrapper[4909]: E0202 10:32:55.017003 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:55 crc kubenswrapper[4909]: E0202 10:32:55.017131 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.029884 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 22:14:22.591171025 +0000 UTC Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.046177 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.046232 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.046245 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.046262 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.046273 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:55Z","lastTransitionTime":"2026-02-02T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.058089 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=46.05806712 podStartE2EDuration="46.05806712s" podCreationTimestamp="2026-02-02 10:32:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:32:55.057865934 +0000 UTC m=+100.803966659" watchObservedRunningTime="2026-02-02 10:32:55.05806712 +0000 UTC m=+100.804167855" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.058249 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ccs5q" podStartSLOduration=79.058244355 podStartE2EDuration="1m19.058244355s" podCreationTimestamp="2026-02-02 10:31:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:32:55.046487696 +0000 UTC m=+100.792588441" watchObservedRunningTime="2026-02-02 10:32:55.058244355 +0000 UTC m=+100.804345090" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.106882 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-f49tk" podStartSLOduration=80.106860375 podStartE2EDuration="1m20.106860375s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:32:55.097424373 +0000 UTC m=+100.843525108" watchObservedRunningTime="2026-02-02 10:32:55.106860375 +0000 UTC m=+100.852961120" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.107366 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=17.107358689 
podStartE2EDuration="17.107358689s" podCreationTimestamp="2026-02-02 10:32:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:32:55.107319548 +0000 UTC m=+100.853420293" watchObservedRunningTime="2026-02-02 10:32:55.107358689 +0000 UTC m=+100.853459424" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.151386 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.151470 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.151484 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.151510 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.151526 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:55Z","lastTransitionTime":"2026-02-02T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.165169 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-qnbvb" podStartSLOduration=80.165144643 podStartE2EDuration="1m20.165144643s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:32:55.135440708 +0000 UTC m=+100.881541443" watchObservedRunningTime="2026-02-02 10:32:55.165144643 +0000 UTC m=+100.911245388" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.177999 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nfrsl" podStartSLOduration=79.177974743 podStartE2EDuration="1m19.177974743s" podCreationTimestamp="2026-02-02 10:31:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:32:55.177714645 +0000 UTC m=+100.923815390" watchObservedRunningTime="2026-02-02 10:32:55.177974743 +0000 UTC m=+100.924075478" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.225294 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=79.225275885 podStartE2EDuration="1m19.225275885s" podCreationTimestamp="2026-02-02 10:31:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:32:55.210037446 +0000 UTC m=+100.956138181" watchObservedRunningTime="2026-02-02 10:32:55.225275885 +0000 UTC m=+100.971376620" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.254121 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 
10:32:55.254155 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.254169 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.254189 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.254200 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:55Z","lastTransitionTime":"2026-02-02T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.275704 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=80.275686887 podStartE2EDuration="1m20.275686887s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:32:55.274938995 +0000 UTC m=+101.021039730" watchObservedRunningTime="2026-02-02 10:32:55.275686887 +0000 UTC m=+101.021787622" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.310748 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=81.310726076 podStartE2EDuration="1m21.310726076s" podCreationTimestamp="2026-02-02 10:31:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 
10:32:55.297526886 +0000 UTC m=+101.043627621" watchObservedRunningTime="2026-02-02 10:32:55.310726076 +0000 UTC m=+101.056826811" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.310958 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podStartSLOduration=80.310952422 podStartE2EDuration="1m20.310952422s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:32:55.310349355 +0000 UTC m=+101.056450090" watchObservedRunningTime="2026-02-02 10:32:55.310952422 +0000 UTC m=+101.057053157" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.329701 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6t82h" podStartSLOduration=80.329675501 podStartE2EDuration="1m20.329675501s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:32:55.329259849 +0000 UTC m=+101.075360584" watchObservedRunningTime="2026-02-02 10:32:55.329675501 +0000 UTC m=+101.075776236" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.357302 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.357683 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.357775 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.357878 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 
10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.357972 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:55Z","lastTransitionTime":"2026-02-02T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.460703 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.460755 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.460765 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.460784 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.460800 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:55Z","lastTransitionTime":"2026-02-02T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.565153 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.565228 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.565251 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.565278 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.565301 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:55Z","lastTransitionTime":"2026-02-02T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.668800 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.668873 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.668885 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.668903 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.668917 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:55Z","lastTransitionTime":"2026-02-02T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.772836 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.772877 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.772887 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.772911 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.772922 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:55Z","lastTransitionTime":"2026-02-02T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.875868 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.875911 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.875920 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.875936 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.875946 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:55Z","lastTransitionTime":"2026-02-02T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.978240 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.978277 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.978288 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.978306 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:55 crc kubenswrapper[4909]: I0202 10:32:55.978314 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:55Z","lastTransitionTime":"2026-02-02T10:32:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.030193 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 21:02:19.29317291 +0000 UTC Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.080032 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.080064 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.080072 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.080085 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.080094 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:56Z","lastTransitionTime":"2026-02-02T10:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.182564 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.182633 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.182643 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.182661 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.182672 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:56Z","lastTransitionTime":"2026-02-02T10:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.285297 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.285348 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.285358 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.285369 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.285378 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:56Z","lastTransitionTime":"2026-02-02T10:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.387489 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.387532 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.387545 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.387560 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.387570 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:56Z","lastTransitionTime":"2026-02-02T10:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.489485 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.489520 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.489532 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.489551 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.489563 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:56Z","lastTransitionTime":"2026-02-02T10:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.592164 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.592231 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.592244 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.592263 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.592277 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:56Z","lastTransitionTime":"2026-02-02T10:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.694375 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.694420 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.694432 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.694447 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.694458 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:56Z","lastTransitionTime":"2026-02-02T10:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.797552 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.797615 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.797625 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.797639 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.797652 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:56Z","lastTransitionTime":"2026-02-02T10:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.900371 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.900422 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.900432 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.900445 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:56 crc kubenswrapper[4909]: I0202 10:32:56.900455 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:56Z","lastTransitionTime":"2026-02-02T10:32:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.002681 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.002716 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.002723 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.002735 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.002744 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:57Z","lastTransitionTime":"2026-02-02T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.016149 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.016167 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:57 crc kubenswrapper[4909]: E0202 10:32:57.016269 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.016295 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:57 crc kubenswrapper[4909]: E0202 10:32:57.016358 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.016402 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:57 crc kubenswrapper[4909]: E0202 10:32:57.016490 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:57 crc kubenswrapper[4909]: E0202 10:32:57.016483 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.031257 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 04:45:40.81144197 +0000 UTC Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.105349 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.105384 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.105392 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.105407 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.105417 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:57Z","lastTransitionTime":"2026-02-02T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.207547 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.207580 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.207592 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.207606 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.207650 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:57Z","lastTransitionTime":"2026-02-02T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.309875 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.309907 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.309916 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.309927 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.309937 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:57Z","lastTransitionTime":"2026-02-02T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.411536 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.411576 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.411586 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.411598 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.411607 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:57Z","lastTransitionTime":"2026-02-02T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.514444 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.514481 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.514492 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.514507 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.514518 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:57Z","lastTransitionTime":"2026-02-02T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.617053 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.617089 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.617101 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.617117 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.617127 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:57Z","lastTransitionTime":"2026-02-02T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.719436 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.719509 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.719527 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.719561 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.719582 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:57Z","lastTransitionTime":"2026-02-02T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.821919 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.821951 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.821960 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.821973 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.821982 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:57Z","lastTransitionTime":"2026-02-02T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.924948 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.924992 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.925000 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.925015 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:57 crc kubenswrapper[4909]: I0202 10:32:57.925024 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:57Z","lastTransitionTime":"2026-02-02T10:32:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.027658 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.027730 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.027746 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.027769 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.027783 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:58Z","lastTransitionTime":"2026-02-02T10:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.031750 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 05:37:24.958823101 +0000 UTC Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.131054 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.131108 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.131122 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.131143 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.131158 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:58Z","lastTransitionTime":"2026-02-02T10:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.234965 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.235017 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.235029 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.235047 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.235059 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:58Z","lastTransitionTime":"2026-02-02T10:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.338078 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.338142 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.338160 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.338193 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.338214 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:58Z","lastTransitionTime":"2026-02-02T10:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.442210 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.442306 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.442318 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.442337 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.442349 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:58Z","lastTransitionTime":"2026-02-02T10:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.545162 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.545204 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.545216 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.545240 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.545257 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:58Z","lastTransitionTime":"2026-02-02T10:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.647347 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.647384 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.647394 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.647411 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.647421 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:58Z","lastTransitionTime":"2026-02-02T10:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.750314 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.750373 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.750388 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.750409 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.750424 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:58Z","lastTransitionTime":"2026-02-02T10:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.852237 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.852277 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.852289 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.852305 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.852317 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:58Z","lastTransitionTime":"2026-02-02T10:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.954533 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.954573 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.954582 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.954595 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:58 crc kubenswrapper[4909]: I0202 10:32:58.954603 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:58Z","lastTransitionTime":"2026-02-02T10:32:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.016065 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.016175 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.016115 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.016095 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:32:59 crc kubenswrapper[4909]: E0202 10:32:59.016252 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:32:59 crc kubenswrapper[4909]: E0202 10:32:59.016359 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:32:59 crc kubenswrapper[4909]: E0202 10:32:59.016402 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:32:59 crc kubenswrapper[4909]: E0202 10:32:59.016567 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.032765 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 23:49:29.988957483 +0000 UTC Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.056689 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.056729 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.056738 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.056754 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.056764 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:59Z","lastTransitionTime":"2026-02-02T10:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.159220 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.159258 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.159270 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.159289 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.159305 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:59Z","lastTransitionTime":"2026-02-02T10:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.261863 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.261915 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.261928 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.261945 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.261954 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:59Z","lastTransitionTime":"2026-02-02T10:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.364645 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.364692 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.364701 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.364714 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.364723 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:59Z","lastTransitionTime":"2026-02-02T10:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.466798 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.466872 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.466883 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.466896 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.466905 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:59Z","lastTransitionTime":"2026-02-02T10:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.568865 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.568908 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.568918 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.568935 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.568946 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:59Z","lastTransitionTime":"2026-02-02T10:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.671703 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.671751 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.671769 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.671791 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.671896 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:59Z","lastTransitionTime":"2026-02-02T10:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.774487 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.774524 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.774532 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.774546 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.774558 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:59Z","lastTransitionTime":"2026-02-02T10:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.877158 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.877203 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.877219 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.877239 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.877253 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:59Z","lastTransitionTime":"2026-02-02T10:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.980053 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.980088 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.980096 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.980111 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:32:59 crc kubenswrapper[4909]: I0202 10:32:59.980121 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:32:59Z","lastTransitionTime":"2026-02-02T10:32:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.033252 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 14:27:52.720695235 +0000 UTC Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.082679 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.082719 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.082727 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.082741 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.082752 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:00Z","lastTransitionTime":"2026-02-02T10:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.184283 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.184322 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.184337 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.184358 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.184373 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:00Z","lastTransitionTime":"2026-02-02T10:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.287081 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.287116 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.287125 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.287139 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.287149 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:00Z","lastTransitionTime":"2026-02-02T10:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.389788 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.389875 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.389892 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.389917 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.389935 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:00Z","lastTransitionTime":"2026-02-02T10:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.492183 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.492223 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.492233 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.492248 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.492258 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:00Z","lastTransitionTime":"2026-02-02T10:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.594678 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.594709 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.594719 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.594733 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.594744 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:00Z","lastTransitionTime":"2026-02-02T10:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.698052 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.698111 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.698129 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.698157 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.698173 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:00Z","lastTransitionTime":"2026-02-02T10:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.801360 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.801419 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.801435 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.801450 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.801461 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:00Z","lastTransitionTime":"2026-02-02T10:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.905054 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.905101 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.905113 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.905134 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:00 crc kubenswrapper[4909]: I0202 10:33:00.905148 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:00Z","lastTransitionTime":"2026-02-02T10:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.008476 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.008538 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.008554 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.008579 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.008591 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:01Z","lastTransitionTime":"2026-02-02T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.015771 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:33:01 crc kubenswrapper[4909]: E0202 10:33:01.015947 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.016344 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.016389 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.016502 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:01 crc kubenswrapper[4909]: E0202 10:33:01.016652 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:01 crc kubenswrapper[4909]: E0202 10:33:01.016832 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:01 crc kubenswrapper[4909]: E0202 10:33:01.016889 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.017137 4909 scope.go:117] "RemoveContainer" containerID="f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f" Feb 02 10:33:01 crc kubenswrapper[4909]: E0202 10:33:01.017445 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-775zr_openshift-ovn-kubernetes(ca5084bc-8bd1-4964-9a52-384222fc8374)\"" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.033728 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 02:58:33.815215131 +0000 UTC Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.111750 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.111844 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.111862 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.111892 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.111915 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:01Z","lastTransitionTime":"2026-02-02T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.214232 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.214283 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.214296 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.214318 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.214331 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:01Z","lastTransitionTime":"2026-02-02T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.316937 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.317031 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.317040 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.317057 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.317068 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:01Z","lastTransitionTime":"2026-02-02T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.421861 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.421919 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.421932 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.421952 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.421965 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:01Z","lastTransitionTime":"2026-02-02T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.524654 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.524705 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.524714 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.524730 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.524743 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:01Z","lastTransitionTime":"2026-02-02T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.628233 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.628278 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.628289 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.628304 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.628315 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:01Z","lastTransitionTime":"2026-02-02T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.731169 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.731206 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.731221 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.731242 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.731259 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:01Z","lastTransitionTime":"2026-02-02T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.834391 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.834433 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.834443 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.834458 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.834470 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:01Z","lastTransitionTime":"2026-02-02T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.936304 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.936360 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.936372 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.936387 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:01 crc kubenswrapper[4909]: I0202 10:33:01.936399 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:01Z","lastTransitionTime":"2026-02-02T10:33:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.034683 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 11:27:56.107376457 +0000 UTC Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.038697 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.038748 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.038759 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.038771 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.038780 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:02Z","lastTransitionTime":"2026-02-02T10:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.141635 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.141678 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.141687 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.141699 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.141708 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:02Z","lastTransitionTime":"2026-02-02T10:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.244470 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.244585 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.244601 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.244618 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.244629 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:02Z","lastTransitionTime":"2026-02-02T10:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.346513 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.346549 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.346558 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.346572 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.346581 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:02Z","lastTransitionTime":"2026-02-02T10:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.449178 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.449215 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.449224 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.449240 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.449250 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:02Z","lastTransitionTime":"2026-02-02T10:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.552206 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.552275 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.552286 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.552305 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.552317 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:02Z","lastTransitionTime":"2026-02-02T10:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.655225 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.655271 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.655281 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.655297 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.655309 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:02Z","lastTransitionTime":"2026-02-02T10:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.758681 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.758743 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.758762 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.758786 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.758805 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:02Z","lastTransitionTime":"2026-02-02T10:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.861100 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.861427 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.861492 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.861566 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.861635 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:02Z","lastTransitionTime":"2026-02-02T10:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.963791 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.963848 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.963860 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.963877 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:02 crc kubenswrapper[4909]: I0202 10:33:02.963891 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:02Z","lastTransitionTime":"2026-02-02T10:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.016333 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:33:03 crc kubenswrapper[4909]: E0202 10:33:03.016491 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.016742 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.016778 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.016838 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:03 crc kubenswrapper[4909]: E0202 10:33:03.016895 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:03 crc kubenswrapper[4909]: E0202 10:33:03.016991 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:03 crc kubenswrapper[4909]: E0202 10:33:03.017067 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.035844 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 18:55:50.840038317 +0000 UTC Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.066677 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.066722 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.066732 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.066746 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.066757 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:03Z","lastTransitionTime":"2026-02-02T10:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.169181 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.169220 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.169231 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.169248 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.169259 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:03Z","lastTransitionTime":"2026-02-02T10:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.271110 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.271186 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.271203 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.271230 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.271249 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:03Z","lastTransitionTime":"2026-02-02T10:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.372975 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.373009 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.373017 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.373031 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.373039 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:03Z","lastTransitionTime":"2026-02-02T10:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.416476 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.416527 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.416538 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.416553 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.416562 4909 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-02T10:33:03Z","lastTransitionTime":"2026-02-02T10:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.458105 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-hd6gh"] Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.458794 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hd6gh" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.460655 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.460974 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.461136 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.462547 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.573032 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2dabafc7-c904-4127-8b41-91a18e04eda3-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hd6gh\" (UID: \"2dabafc7-c904-4127-8b41-91a18e04eda3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hd6gh" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.573081 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2dabafc7-c904-4127-8b41-91a18e04eda3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hd6gh\" (UID: \"2dabafc7-c904-4127-8b41-91a18e04eda3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hd6gh" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.573104 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2dabafc7-c904-4127-8b41-91a18e04eda3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hd6gh\" (UID: \"2dabafc7-c904-4127-8b41-91a18e04eda3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hd6gh" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.573261 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2dabafc7-c904-4127-8b41-91a18e04eda3-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hd6gh\" (UID: \"2dabafc7-c904-4127-8b41-91a18e04eda3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hd6gh" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.573375 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2dabafc7-c904-4127-8b41-91a18e04eda3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hd6gh\" (UID: \"2dabafc7-c904-4127-8b41-91a18e04eda3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hd6gh" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.674005 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2dabafc7-c904-4127-8b41-91a18e04eda3-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hd6gh\" (UID: \"2dabafc7-c904-4127-8b41-91a18e04eda3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hd6gh" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.674050 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2dabafc7-c904-4127-8b41-91a18e04eda3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hd6gh\" (UID: \"2dabafc7-c904-4127-8b41-91a18e04eda3\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hd6gh" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.674090 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dabafc7-c904-4127-8b41-91a18e04eda3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hd6gh\" (UID: \"2dabafc7-c904-4127-8b41-91a18e04eda3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hd6gh" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.674112 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2dabafc7-c904-4127-8b41-91a18e04eda3-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hd6gh\" (UID: \"2dabafc7-c904-4127-8b41-91a18e04eda3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hd6gh" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.674140 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/2dabafc7-c904-4127-8b41-91a18e04eda3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hd6gh\" (UID: \"2dabafc7-c904-4127-8b41-91a18e04eda3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hd6gh" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.674141 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/2dabafc7-c904-4127-8b41-91a18e04eda3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hd6gh\" (UID: \"2dabafc7-c904-4127-8b41-91a18e04eda3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hd6gh" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.674220 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/2dabafc7-c904-4127-8b41-91a18e04eda3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hd6gh\" (UID: \"2dabafc7-c904-4127-8b41-91a18e04eda3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hd6gh" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.674945 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2dabafc7-c904-4127-8b41-91a18e04eda3-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hd6gh\" (UID: \"2dabafc7-c904-4127-8b41-91a18e04eda3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hd6gh" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.680489 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dabafc7-c904-4127-8b41-91a18e04eda3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hd6gh\" (UID: \"2dabafc7-c904-4127-8b41-91a18e04eda3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hd6gh" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.690406 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2dabafc7-c904-4127-8b41-91a18e04eda3-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hd6gh\" (UID: \"2dabafc7-c904-4127-8b41-91a18e04eda3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hd6gh" Feb 02 10:33:03 crc kubenswrapper[4909]: I0202 10:33:03.774457 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hd6gh" Feb 02 10:33:04 crc kubenswrapper[4909]: I0202 10:33:04.036469 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 10:40:58.46687582 +0000 UTC Feb 02 10:33:04 crc kubenswrapper[4909]: I0202 10:33:04.036741 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 02 10:33:04 crc kubenswrapper[4909]: I0202 10:33:04.045152 4909 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 02 10:33:04 crc kubenswrapper[4909]: I0202 10:33:04.549945 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hd6gh" event={"ID":"2dabafc7-c904-4127-8b41-91a18e04eda3","Type":"ContainerStarted","Data":"64d82144c2ca03fb22e9206a5a19717bff30a17fb872d441a7ea66c281fa2ef6"} Feb 02 10:33:04 crc kubenswrapper[4909]: I0202 10:33:04.549992 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hd6gh" event={"ID":"2dabafc7-c904-4127-8b41-91a18e04eda3","Type":"ContainerStarted","Data":"43a2cf227437d21d163cf1d9b73060118a891cbc7af5b7300d5717762dddb85b"} Feb 02 10:33:04 crc kubenswrapper[4909]: I0202 10:33:04.563492 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hd6gh" podStartSLOduration=89.563469866 podStartE2EDuration="1m29.563469866s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:04.561901371 +0000 UTC m=+110.308002106" watchObservedRunningTime="2026-02-02 10:33:04.563469866 +0000 UTC m=+110.309570621" Feb 02 10:33:05 crc 
kubenswrapper[4909]: I0202 10:33:05.016155 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:33:05 crc kubenswrapper[4909]: I0202 10:33:05.016155 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:05 crc kubenswrapper[4909]: I0202 10:33:05.016205 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:05 crc kubenswrapper[4909]: I0202 10:33:05.016243 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:05 crc kubenswrapper[4909]: E0202 10:33:05.017170 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:33:05 crc kubenswrapper[4909]: E0202 10:33:05.017299 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:05 crc kubenswrapper[4909]: E0202 10:33:05.017360 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:05 crc kubenswrapper[4909]: E0202 10:33:05.017401 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:07 crc kubenswrapper[4909]: I0202 10:33:07.015881 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:33:07 crc kubenswrapper[4909]: I0202 10:33:07.015933 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:07 crc kubenswrapper[4909]: E0202 10:33:07.016011 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:33:07 crc kubenswrapper[4909]: I0202 10:33:07.016028 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:07 crc kubenswrapper[4909]: E0202 10:33:07.016113 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:07 crc kubenswrapper[4909]: E0202 10:33:07.016178 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:07 crc kubenswrapper[4909]: I0202 10:33:07.016218 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:07 crc kubenswrapper[4909]: E0202 10:33:07.016258 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:09 crc kubenswrapper[4909]: I0202 10:33:09.016274 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:09 crc kubenswrapper[4909]: E0202 10:33:09.016407 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:09 crc kubenswrapper[4909]: I0202 10:33:09.016597 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:09 crc kubenswrapper[4909]: E0202 10:33:09.016663 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:09 crc kubenswrapper[4909]: I0202 10:33:09.016837 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:09 crc kubenswrapper[4909]: E0202 10:33:09.016898 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:09 crc kubenswrapper[4909]: I0202 10:33:09.017018 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:33:09 crc kubenswrapper[4909]: E0202 10:33:09.017095 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:33:11 crc kubenswrapper[4909]: I0202 10:33:11.016288 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:33:11 crc kubenswrapper[4909]: I0202 10:33:11.016391 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:11 crc kubenswrapper[4909]: I0202 10:33:11.016613 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:11 crc kubenswrapper[4909]: E0202 10:33:11.016686 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:33:11 crc kubenswrapper[4909]: E0202 10:33:11.016718 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:11 crc kubenswrapper[4909]: E0202 10:33:11.016777 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:11 crc kubenswrapper[4909]: I0202 10:33:11.016450 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:11 crc kubenswrapper[4909]: E0202 10:33:11.016909 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:11 crc kubenswrapper[4909]: I0202 10:33:11.571186 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qnbvb_bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af/kube-multus/1.log" Feb 02 10:33:11 crc kubenswrapper[4909]: I0202 10:33:11.571733 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qnbvb_bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af/kube-multus/0.log" Feb 02 10:33:11 crc kubenswrapper[4909]: I0202 10:33:11.571782 4909 generic.go:334] "Generic (PLEG): container finished" podID="bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af" containerID="8fcd8c7992c7e8e9185f32fb8196e0709891ff501292ae2ae4dd5b85d017fa76" exitCode=1 Feb 02 10:33:11 crc kubenswrapper[4909]: I0202 10:33:11.571826 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qnbvb" event={"ID":"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af","Type":"ContainerDied","Data":"8fcd8c7992c7e8e9185f32fb8196e0709891ff501292ae2ae4dd5b85d017fa76"} Feb 02 10:33:11 crc kubenswrapper[4909]: I0202 10:33:11.571858 4909 scope.go:117] "RemoveContainer" containerID="6b193c5e9d953b8a8c9340dcffe804109ff2d31f856a786a9fe5f947af4435fb" Feb 02 10:33:11 crc kubenswrapper[4909]: I0202 10:33:11.572193 4909 scope.go:117] "RemoveContainer" containerID="8fcd8c7992c7e8e9185f32fb8196e0709891ff501292ae2ae4dd5b85d017fa76" Feb 02 10:33:11 crc kubenswrapper[4909]: E0202 10:33:11.573948 4909 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-qnbvb_openshift-multus(bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af)\"" pod="openshift-multus/multus-qnbvb" podUID="bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af" Feb 02 10:33:12 crc kubenswrapper[4909]: I0202 10:33:12.575612 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qnbvb_bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af/kube-multus/1.log" Feb 02 10:33:13 crc kubenswrapper[4909]: I0202 10:33:13.015933 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:13 crc kubenswrapper[4909]: I0202 10:33:13.015973 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:13 crc kubenswrapper[4909]: I0202 10:33:13.015973 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:33:13 crc kubenswrapper[4909]: E0202 10:33:13.016044 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:13 crc kubenswrapper[4909]: I0202 10:33:13.016102 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:13 crc kubenswrapper[4909]: E0202 10:33:13.016345 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:33:13 crc kubenswrapper[4909]: E0202 10:33:13.016501 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:13 crc kubenswrapper[4909]: E0202 10:33:13.016580 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:14 crc kubenswrapper[4909]: I0202 10:33:14.016653 4909 scope.go:117] "RemoveContainer" containerID="f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f" Feb 02 10:33:14 crc kubenswrapper[4909]: I0202 10:33:14.583550 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-775zr_ca5084bc-8bd1-4964-9a52-384222fc8374/ovnkube-controller/3.log" Feb 02 10:33:14 crc kubenswrapper[4909]: I0202 10:33:14.586639 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" event={"ID":"ca5084bc-8bd1-4964-9a52-384222fc8374","Type":"ContainerStarted","Data":"28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57"} Feb 02 10:33:14 crc kubenswrapper[4909]: I0202 10:33:14.587072 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:33:14 crc kubenswrapper[4909]: I0202 10:33:14.618840 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" podStartSLOduration=99.618765235 podStartE2EDuration="1m39.618765235s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:14.616015226 +0000 UTC m=+120.362115961" watchObservedRunningTime="2026-02-02 10:33:14.618765235 +0000 UTC m=+120.364865970" Feb 02 10:33:14 crc kubenswrapper[4909]: E0202 10:33:14.968864 4909 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 02 10:33:15 crc kubenswrapper[4909]: I0202 10:33:15.001545 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2v5vw"] Feb 02 10:33:15 crc kubenswrapper[4909]: I0202 10:33:15.001666 4909 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:33:15 crc kubenswrapper[4909]: E0202 10:33:15.001762 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:33:15 crc kubenswrapper[4909]: I0202 10:33:15.016468 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:15 crc kubenswrapper[4909]: E0202 10:33:15.016567 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:15 crc kubenswrapper[4909]: I0202 10:33:15.018253 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:15 crc kubenswrapper[4909]: E0202 10:33:15.018370 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:15 crc kubenswrapper[4909]: I0202 10:33:15.018441 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:15 crc kubenswrapper[4909]: E0202 10:33:15.018493 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:15 crc kubenswrapper[4909]: E0202 10:33:15.124584 4909 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 02 10:33:17 crc kubenswrapper[4909]: I0202 10:33:17.016001 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:33:17 crc kubenswrapper[4909]: I0202 10:33:17.016204 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:17 crc kubenswrapper[4909]: I0202 10:33:17.016098 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:17 crc kubenswrapper[4909]: I0202 10:33:17.016063 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:17 crc kubenswrapper[4909]: E0202 10:33:17.016383 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:33:17 crc kubenswrapper[4909]: E0202 10:33:17.016606 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:17 crc kubenswrapper[4909]: E0202 10:33:17.016718 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:17 crc kubenswrapper[4909]: E0202 10:33:17.016798 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:19 crc kubenswrapper[4909]: I0202 10:33:19.016124 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:19 crc kubenswrapper[4909]: I0202 10:33:19.016157 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:19 crc kubenswrapper[4909]: I0202 10:33:19.016188 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:33:19 crc kubenswrapper[4909]: E0202 10:33:19.016231 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:19 crc kubenswrapper[4909]: E0202 10:33:19.016357 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:33:19 crc kubenswrapper[4909]: I0202 10:33:19.016463 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:19 crc kubenswrapper[4909]: E0202 10:33:19.016518 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:19 crc kubenswrapper[4909]: E0202 10:33:19.016662 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:20 crc kubenswrapper[4909]: E0202 10:33:20.125964 4909 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 02 10:33:20 crc kubenswrapper[4909]: I0202 10:33:20.515563 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:33:21 crc kubenswrapper[4909]: I0202 10:33:21.015794 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:33:21 crc kubenswrapper[4909]: I0202 10:33:21.015850 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:21 crc kubenswrapper[4909]: E0202 10:33:21.015966 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:33:21 crc kubenswrapper[4909]: I0202 10:33:21.016028 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:21 crc kubenswrapper[4909]: I0202 10:33:21.016030 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:21 crc kubenswrapper[4909]: E0202 10:33:21.016116 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:21 crc kubenswrapper[4909]: E0202 10:33:21.016154 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:21 crc kubenswrapper[4909]: E0202 10:33:21.016200 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:23 crc kubenswrapper[4909]: I0202 10:33:23.016383 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:33:23 crc kubenswrapper[4909]: I0202 10:33:23.016510 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:23 crc kubenswrapper[4909]: E0202 10:33:23.016575 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:33:23 crc kubenswrapper[4909]: E0202 10:33:23.016661 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:23 crc kubenswrapper[4909]: I0202 10:33:23.017032 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:23 crc kubenswrapper[4909]: I0202 10:33:23.017098 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:23 crc kubenswrapper[4909]: E0202 10:33:23.017123 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:23 crc kubenswrapper[4909]: E0202 10:33:23.017252 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:24 crc kubenswrapper[4909]: I0202 10:33:24.018120 4909 scope.go:117] "RemoveContainer" containerID="8fcd8c7992c7e8e9185f32fb8196e0709891ff501292ae2ae4dd5b85d017fa76" Feb 02 10:33:24 crc kubenswrapper[4909]: I0202 10:33:24.616865 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qnbvb_bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af/kube-multus/1.log" Feb 02 10:33:24 crc kubenswrapper[4909]: I0202 10:33:24.616917 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qnbvb" event={"ID":"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af","Type":"ContainerStarted","Data":"8d522d4b79e66f21f5dd0d4fc865f589c28370d17f77d11076d1291b57cee4aa"} Feb 02 10:33:25 crc kubenswrapper[4909]: I0202 10:33:25.016213 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:25 crc kubenswrapper[4909]: I0202 10:33:25.016321 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:33:25 crc kubenswrapper[4909]: E0202 10:33:25.017560 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:25 crc kubenswrapper[4909]: I0202 10:33:25.017575 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:25 crc kubenswrapper[4909]: I0202 10:33:25.017607 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:25 crc kubenswrapper[4909]: E0202 10:33:25.017696 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:33:25 crc kubenswrapper[4909]: E0202 10:33:25.017936 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:25 crc kubenswrapper[4909]: E0202 10:33:25.018038 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:25 crc kubenswrapper[4909]: E0202 10:33:25.126773 4909 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Feb 02 10:33:27 crc kubenswrapper[4909]: I0202 10:33:27.016090 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:33:27 crc kubenswrapper[4909]: I0202 10:33:27.016096 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:27 crc kubenswrapper[4909]: I0202 10:33:27.016222 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:27 crc kubenswrapper[4909]: E0202 10:33:27.016730 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:27 crc kubenswrapper[4909]: E0202 10:33:27.016528 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:33:27 crc kubenswrapper[4909]: E0202 10:33:27.016775 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:27 crc kubenswrapper[4909]: I0202 10:33:27.016306 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:27 crc kubenswrapper[4909]: E0202 10:33:27.016932 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:29 crc kubenswrapper[4909]: I0202 10:33:29.015778 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:29 crc kubenswrapper[4909]: I0202 10:33:29.015862 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:33:29 crc kubenswrapper[4909]: I0202 10:33:29.015875 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:29 crc kubenswrapper[4909]: E0202 10:33:29.016135 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 02 10:33:29 crc kubenswrapper[4909]: E0202 10:33:29.016201 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2v5vw" podUID="0f457793-f4e0-4417-ae91-4455722372c1" Feb 02 10:33:29 crc kubenswrapper[4909]: I0202 10:33:29.015900 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:29 crc kubenswrapper[4909]: E0202 10:33:29.016319 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 02 10:33:29 crc kubenswrapper[4909]: E0202 10:33:29.016239 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 02 10:33:31 crc kubenswrapper[4909]: I0202 10:33:31.016141 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:33:31 crc kubenswrapper[4909]: I0202 10:33:31.016228 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:31 crc kubenswrapper[4909]: I0202 10:33:31.016299 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:31 crc kubenswrapper[4909]: I0202 10:33:31.016398 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:31 crc kubenswrapper[4909]: I0202 10:33:31.018858 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 02 10:33:31 crc kubenswrapper[4909]: I0202 10:33:31.019284 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 02 10:33:31 crc kubenswrapper[4909]: I0202 10:33:31.019792 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 02 10:33:31 crc kubenswrapper[4909]: I0202 10:33:31.020282 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 02 10:33:31 crc kubenswrapper[4909]: I0202 10:33:31.021357 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 02 10:33:31 crc kubenswrapper[4909]: I0202 10:33:31.028467 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.172136 4909 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 02 10:33:34 crc 
kubenswrapper[4909]: I0202 10:33:34.199326 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-knpc2"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.200072 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-knpc2" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.200271 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ghmnw"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.200759 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-ghmnw" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.203740 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.204048 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.204146 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.204661 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.204857 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.206274 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mwjvx"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.206471 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 02 10:33:34 crc kubenswrapper[4909]: 
I0202 10:33:34.206773 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.207632 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-gxpkn"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.207737 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.208072 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-gxpkn" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.208460 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.209059 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ddclx"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.209551 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.209138 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.209246 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.209465 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.209494 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.209643 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.210610 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f8wcs"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.211006 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f8wcs" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.212268 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rqcgf"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.212755 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rqcgf" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.213237 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-26rcr"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.226414 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.226796 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.227141 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.227349 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.227386 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.227364 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.227478 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.227487 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.227397 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" 
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.227619 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.227368 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.227623 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.230344 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.230674 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.230958 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.231004 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.231074 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.231254 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.231270 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.231925 4909 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.251955 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.251996 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.252313 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.252465 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-26rcr" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.253225 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-trqkm"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.252483 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.252539 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.253593 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-thqss"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.253903 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-thqss" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.254199 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.254443 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-trqkm" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.252556 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.255165 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.255420 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.252571 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.253110 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.253720 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.253726 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.253782 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.255956 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.267481 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.267586 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.270887 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ljw49"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.271326 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-64hf5"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.271415 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ljw49" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.271458 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.271597 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.271882 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.272099 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ddclx"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.272148 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-64hf5" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.272698 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qqbpq"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.273401 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qqbpq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.273565 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.273671 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mstlw"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.274300 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mstlw" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.275620 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.275894 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jcpbk"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.276257 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jcpbk" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.280658 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6wjkz"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.281438 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6wjkz" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.286487 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.287914 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.288124 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.288276 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.288344 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.288389 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.288437 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.288557 4909 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.288642 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.288722 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.288846 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.288921 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.288976 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.289015 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.289077 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.297540 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.297964 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.298271 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.298377 4909 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.298398 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.298582 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.298600 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.298754 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.288989 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.299576 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.299961 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.300335 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.300602 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.300722 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 02 10:33:34 crc 
kubenswrapper[4909]: I0202 10:33:34.300947 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.301082 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.301307 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.301651 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.301892 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.302114 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.304037 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.329131 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.331660 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.331931 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.332111 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 02 
10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.332374 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.332519 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.332728 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.332827 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-slspb"]
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.333306 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nt7q4"]
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.333602 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-26rcr"]
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.333699 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nt7q4"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.334062 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-slspb"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.345243 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.345467 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.345857 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.346204 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.347703 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.347959 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpxbr"]
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.348319 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.348471 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.348574 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ftbpn"]
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.348594 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.349462 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ftbpn"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.349537 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpxbr"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.349903 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.350111 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.350406 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-v6jr9"]
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.354697 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.358143 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-rtcqn"]
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.358425 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6jr9"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.358905 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4cdjr"]
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.359469 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-gxpkn"]
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.359571 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hx79g"]
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.360376 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-gprz6"]
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.359516 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-rtcqn"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.359710 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4cdjr"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.361026 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hx79g"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.361491 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2xztg"]
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.361878 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptpkp"]
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.362283 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdmsw"]
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.362592 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdmsw"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.362934 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2m2wr"]
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.363142 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-gprz6"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.363244 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2xztg"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.363286 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptpkp"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.364123 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2m2wr"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.365536 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-tsvwl"]
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.366351 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tsvwl"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.366415 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-h8sfq"]
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.367677 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.367754 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h8sfq"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.374339 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500470-c2btb"]
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.375543 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-c2btb"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.376297 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kw2jt"]
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.378371 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gvxk8"]
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.378701 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7rkdz"]
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.378875 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gvxk8"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.378988 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kw2jt"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.379405 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kddv4"]
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.380006 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kddv4"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.380143 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7rkdz"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.381950 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.382090 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-knpc2"]
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.382112 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mwjvx"]
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.394493 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ghmnw"]
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.394552 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-thqss"]
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.396880 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9gvvq"]
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.400868 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-etcd-client\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.400916 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgs2z\" (UniqueName: \"kubernetes.io/projected/e650bcf6-f84c-4322-86ac-6df17841176d-kube-api-access-lgs2z\") pod \"machine-api-operator-5694c8668f-ghmnw\" (UID: \"e650bcf6-f84c-4322-86ac-6df17841176d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ghmnw"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.400920 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6wjkz"]
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.400943 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bec6950a-558b-4df9-a47b-128b7a3e4edb-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-f8wcs\" (UID: \"bec6950a-558b-4df9-a47b-128b7a3e4edb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f8wcs"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401026 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba5776ad-fa9b-4f21-9dca-40958f01e293-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-q62sq\" (UID: \"ba5776ad-fa9b-4f21-9dca-40958f01e293\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401055 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzcnm\" (UniqueName: \"kubernetes.io/projected/d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e-kube-api-access-xzcnm\") pod \"route-controller-manager-6576b87f9c-5mlmj\" (UID: \"d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401082 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6514ff15-d150-4c13-b96b-cf885b71504a-serving-cert\") pod \"console-operator-58897d9998-64hf5\" (UID: \"6514ff15-d150-4c13-b96b-cf885b71504a\") " pod="openshift-console-operator/console-operator-58897d9998-64hf5"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401082 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9gvvq"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401122 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-etcd-serving-ca\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401162 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-config\") pod \"controller-manager-879f6c89f-mwjvx\" (UID: \"29fc305d-0c7f-4ef9-8ae2-6534374de1ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401211 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-client-ca\") pod \"controller-manager-879f6c89f-mwjvx\" (UID: \"29fc305d-0c7f-4ef9-8ae2-6534374de1ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401241 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa665eb8-a288-4806-9192-fc30b30868db-config\") pod \"etcd-operator-b45778765-qqbpq\" (UID: \"aa665eb8-a288-4806-9192-fc30b30868db\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qqbpq"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401266 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401338 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5srtt\" (UniqueName: \"kubernetes.io/projected/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-kube-api-access-5srtt\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401387 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ba5776ad-fa9b-4f21-9dca-40958f01e293-audit-policies\") pod \"apiserver-7bbb656c7d-q62sq\" (UID: \"ba5776ad-fa9b-4f21-9dca-40958f01e293\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401410 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6514ff15-d150-4c13-b96b-cf885b71504a-config\") pod \"console-operator-58897d9998-64hf5\" (UID: \"6514ff15-d150-4c13-b96b-cf885b71504a\") " pod="openshift-console-operator/console-operator-58897d9998-64hf5"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401446 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-serving-cert\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401474 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-encryption-config\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401499 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a2b2e114-73ec-4cc9-9c62-a57b8f7aebb2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jcpbk\" (UID: \"a2b2e114-73ec-4cc9-9c62-a57b8f7aebb2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jcpbk"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401520 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/369d9bc5-f128-4daf-afcb-3e1cdf7da3fb-config\") pod \"kube-controller-manager-operator-78b949d7b-6wjkz\" (UID: \"369d9bc5-f128-4daf-afcb-3e1cdf7da3fb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6wjkz"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401549 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e6595d49-3b53-44fc-a253-a252a53333a2-console-oauth-config\") pod \"console-f9d7485db-thqss\" (UID: \"e6595d49-3b53-44fc-a253-a252a53333a2\") " pod="openshift-console/console-f9d7485db-thqss"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401573 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401597 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb6lm\" (UniqueName: \"kubernetes.io/projected/349306e1-fe46-449a-96c5-0b1e13f29733-kube-api-access-kb6lm\") pod \"cluster-samples-operator-665b6dd947-rqcgf\" (UID: \"349306e1-fe46-449a-96c5-0b1e13f29733\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rqcgf"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401625 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87f3258e-a4f7-47bb-b448-aaa85f41c9d3-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mstlw\" (UID: \"87f3258e-a4f7-47bb-b448-aaa85f41c9d3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mstlw"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401651 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401700 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ba5776ad-fa9b-4f21-9dca-40958f01e293-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-q62sq\" (UID: \"ba5776ad-fa9b-4f21-9dca-40958f01e293\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401727 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba5776ad-fa9b-4f21-9dca-40958f01e293-serving-cert\") pod \"apiserver-7bbb656c7d-q62sq\" (UID: \"ba5776ad-fa9b-4f21-9dca-40958f01e293\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401746 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/112f9f29-0947-4a25-adde-b74961a6001b-config\") pod \"authentication-operator-69f744f599-gxpkn\" (UID: \"112f9f29-0947-4a25-adde-b74961a6001b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gxpkn"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401789 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2b2e114-73ec-4cc9-9c62-a57b8f7aebb2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jcpbk\" (UID: \"a2b2e114-73ec-4cc9-9c62-a57b8f7aebb2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jcpbk"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401856 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2b2e114-73ec-4cc9-9c62-a57b8f7aebb2-config\") pod \"kube-apiserver-operator-766d6c64bb-jcpbk\" (UID: \"a2b2e114-73ec-4cc9-9c62-a57b8f7aebb2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jcpbk"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401888 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401910 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4txr\" (UniqueName: \"kubernetes.io/projected/87f3258e-a4f7-47bb-b448-aaa85f41c9d3-kube-api-access-p4txr\") pod \"openshift-controller-manager-operator-756b6f6bc6-mstlw\" (UID: \"87f3258e-a4f7-47bb-b448-aaa85f41c9d3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mstlw"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401937 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ba5776ad-fa9b-4f21-9dca-40958f01e293-encryption-config\") pod \"apiserver-7bbb656c7d-q62sq\" (UID: \"ba5776ad-fa9b-4f21-9dca-40958f01e293\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401956 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.401984 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e-client-ca\") pod \"route-controller-manager-6576b87f9c-5mlmj\" (UID: \"d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.402001 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6514ff15-d150-4c13-b96b-cf885b71504a-trusted-ca\") pod \"console-operator-58897d9998-64hf5\" (UID: \"6514ff15-d150-4c13-b96b-cf885b71504a\") " pod="openshift-console-operator/console-operator-58897d9998-64hf5"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.402033 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-audit-dir\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.402056 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e650bcf6-f84c-4322-86ac-6df17841176d-config\") pod \"machine-api-operator-5694c8668f-ghmnw\" (UID: \"e650bcf6-f84c-4322-86ac-6df17841176d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ghmnw"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.402075 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8ef937a5-1513-4ca5-b719-5418118d0987-machine-approver-tls\") pod \"machine-approver-56656f9798-trqkm\" (UID: \"8ef937a5-1513-4ca5-b719-5418118d0987\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-trqkm"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.402097 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa665eb8-a288-4806-9192-fc30b30868db-serving-cert\") pod \"etcd-operator-b45778765-qqbpq\" (UID: \"aa665eb8-a288-4806-9192-fc30b30868db\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qqbpq"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.402117 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bec6950a-558b-4df9-a47b-128b7a3e4edb-config\") pod \"openshift-apiserver-operator-796bbdcf4f-f8wcs\" (UID: \"bec6950a-558b-4df9-a47b-128b7a3e4edb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f8wcs"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.402158 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2czx4\" (UniqueName: \"kubernetes.io/projected/112f9f29-0947-4a25-adde-b74961a6001b-kube-api-access-2czx4\") pod \"authentication-operator-69f744f599-gxpkn\" (UID: \"112f9f29-0947-4a25-adde-b74961a6001b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gxpkn"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.402182 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.402374 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.402401 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.402440 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-node-pullsecrets\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.402462 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/175ecff3-6a2a-4076-a012-7eee503357f9-available-featuregates\") pod \"openshift-config-operator-7777fb866f-26rcr\" (UID: \"175ecff3-6a2a-4076-a012-7eee503357f9\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-26rcr"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.402479 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mwjvx\" (UID: \"29fc305d-0c7f-4ef9-8ae2-6534374de1ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.402505 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.402539 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ba5776ad-fa9b-4f21-9dca-40958f01e293-etcd-client\") pod \"apiserver-7bbb656c7d-q62sq\" (UID: \"ba5776ad-fa9b-4f21-9dca-40958f01e293\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.402565 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njf5q\" (UniqueName: \"kubernetes.io/projected/ba5776ad-fa9b-4f21-9dca-40958f01e293-kube-api-access-njf5q\") pod \"apiserver-7bbb656c7d-q62sq\" (UID: \"ba5776ad-fa9b-4f21-9dca-40958f01e293\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.402589 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.402611 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ef937a5-1513-4ca5-b719-5418118d0987-config\") pod \"machine-approver-56656f9798-trqkm\" (UID: \"8ef937a5-1513-4ca5-b719-5418118d0987\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-trqkm"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.402663 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69s26\" (UniqueName: \"kubernetes.io/projected/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-kube-api-access-69s26\") pod \"controller-manager-879f6c89f-mwjvx\" (UID: \"29fc305d-0c7f-4ef9-8ae2-6534374de1ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.402742 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/349306e1-fe46-449a-96c5-0b1e13f29733-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-rqcgf\" (UID: \"349306e1-fe46-449a-96c5-0b1e13f29733\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rqcgf"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.402878 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6595d49-3b53-44fc-a253-a252a53333a2-trusted-ca-bundle\") pod \"console-f9d7485db-thqss\" (UID: \"e6595d49-3b53-44fc-a253-a252a53333a2\") " pod="openshift-console/console-f9d7485db-thqss"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.402931 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/369d9bc5-f128-4daf-afcb-3e1cdf7da3fb-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-6wjkz\" (UID: \"369d9bc5-f128-4daf-afcb-3e1cdf7da3fb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6wjkz"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.402964 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-audit\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.402991 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/175ecff3-6a2a-4076-a012-7eee503357f9-serving-cert\") pod \"openshift-config-operator-7777fb866f-26rcr\" (UID: \"175ecff3-6a2a-4076-a012-7eee503357f9\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-26rcr"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.403013 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8ef937a5-1513-4ca5-b719-5418118d0987-auth-proxy-config\") pod \"machine-approver-56656f9798-trqkm\" (UID: \"8ef937a5-1513-4ca5-b719-5418118d0987\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-trqkm"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.403041 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/112f9f29-0947-4a25-adde-b74961a6001b-serving-cert\") pod \"authentication-operator-69f744f599-gxpkn\" (UID: \"112f9f29-0947-4a25-adde-b74961a6001b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gxpkn"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.403103 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/35861719-3b6b-4572-8761-bb9c8bfce573-audit-dir\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.403165 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-serving-cert\") pod \"controller-manager-879f6c89f-mwjvx\" (UID: \"29fc305d-0c7f-4ef9-8ae2-6534374de1ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.403203 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhnd8\" (UniqueName: \"kubernetes.io/projected/0e581142-928a-4e07-888c-362d0ae7b49f-kube-api-access-qhnd8\") pod \"dns-operator-744455d44c-ljw49\" (UID: \"0e581142-928a-4e07-888c-362d0ae7b49f\") " pod="openshift-dns-operator/dns-operator-744455d44c-ljw49"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.403265 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e650bcf6-f84c-4322-86ac-6df17841176d-images\") pod \"machine-api-operator-5694c8668f-ghmnw\" (UID: \"e650bcf6-f84c-4322-86ac-6df17841176d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ghmnw"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.403321 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e650bcf6-f84c-4322-86ac-6df17841176d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ghmnw\" (UID: \"e650bcf6-f84c-4322-86ac-6df17841176d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ghmnw"
Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.403342 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName:
\"kubernetes.io/secret/aa665eb8-a288-4806-9192-fc30b30868db-etcd-client\") pod \"etcd-operator-b45778765-qqbpq\" (UID: \"aa665eb8-a288-4806-9192-fc30b30868db\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qqbpq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.403378 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ljh2\" (UniqueName: \"kubernetes.io/projected/aa665eb8-a288-4806-9192-fc30b30868db-kube-api-access-9ljh2\") pod \"etcd-operator-b45778765-qqbpq\" (UID: \"aa665eb8-a288-4806-9192-fc30b30868db\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qqbpq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.403394 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/369d9bc5-f128-4daf-afcb-3e1cdf7da3fb-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-6wjkz\" (UID: \"369d9bc5-f128-4daf-afcb-3e1cdf7da3fb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6wjkz" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.403415 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.403460 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e6595d49-3b53-44fc-a253-a252a53333a2-console-config\") pod \"console-f9d7485db-thqss\" (UID: \"e6595d49-3b53-44fc-a253-a252a53333a2\") " 
pod="openshift-console/console-f9d7485db-thqss" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.403482 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e6595d49-3b53-44fc-a253-a252a53333a2-service-ca\") pod \"console-f9d7485db-thqss\" (UID: \"e6595d49-3b53-44fc-a253-a252a53333a2\") " pod="openshift-console/console-f9d7485db-thqss" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.403508 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0e581142-928a-4e07-888c-362d0ae7b49f-metrics-tls\") pod \"dns-operator-744455d44c-ljw49\" (UID: \"0e581142-928a-4e07-888c-362d0ae7b49f\") " pod="openshift-dns-operator/dns-operator-744455d44c-ljw49" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.403598 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-image-import-ca\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.403623 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8869\" (UniqueName: \"kubernetes.io/projected/175ecff3-6a2a-4076-a012-7eee503357f9-kube-api-access-c8869\") pod \"openshift-config-operator-7777fb866f-26rcr\" (UID: \"175ecff3-6a2a-4076-a012-7eee503357f9\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-26rcr" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.403644 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/ba5776ad-fa9b-4f21-9dca-40958f01e293-audit-dir\") pod \"apiserver-7bbb656c7d-q62sq\" (UID: \"ba5776ad-fa9b-4f21-9dca-40958f01e293\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.403667 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/aa665eb8-a288-4806-9192-fc30b30868db-etcd-service-ca\") pod \"etcd-operator-b45778765-qqbpq\" (UID: \"aa665eb8-a288-4806-9192-fc30b30868db\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qqbpq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.403708 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/112f9f29-0947-4a25-adde-b74961a6001b-service-ca-bundle\") pod \"authentication-operator-69f744f599-gxpkn\" (UID: \"112f9f29-0947-4a25-adde-b74961a6001b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gxpkn" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.403758 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.403848 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87f3258e-a4f7-47bb-b448-aaa85f41c9d3-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mstlw\" (UID: \"87f3258e-a4f7-47bb-b448-aaa85f41c9d3\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mstlw" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.403879 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e-config\") pod \"route-controller-manager-6576b87f9c-5mlmj\" (UID: \"d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.403914 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e-serving-cert\") pod \"route-controller-manager-6576b87f9c-5mlmj\" (UID: \"d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.403961 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b9wk\" (UniqueName: \"kubernetes.io/projected/e6595d49-3b53-44fc-a253-a252a53333a2-kube-api-access-8b9wk\") pod \"console-f9d7485db-thqss\" (UID: \"e6595d49-3b53-44fc-a253-a252a53333a2\") " pod="openshift-console/console-f9d7485db-thqss" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.404002 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/aa665eb8-a288-4806-9192-fc30b30868db-etcd-ca\") pod \"etcd-operator-b45778765-qqbpq\" (UID: \"aa665eb8-a288-4806-9192-fc30b30868db\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qqbpq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.404029 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/112f9f29-0947-4a25-adde-b74961a6001b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-gxpkn\" (UID: \"112f9f29-0947-4a25-adde-b74961a6001b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gxpkn" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.404225 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wstr\" (UniqueName: \"kubernetes.io/projected/35861719-3b6b-4572-8761-bb9c8bfce573-kube-api-access-4wstr\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.404257 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8l56\" (UniqueName: \"kubernetes.io/projected/6514ff15-d150-4c13-b96b-cf885b71504a-kube-api-access-s8l56\") pod \"console-operator-58897d9998-64hf5\" (UID: \"6514ff15-d150-4c13-b96b-cf885b71504a\") " pod="openshift-console-operator/console-operator-58897d9998-64hf5" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.404323 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-config\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.404362 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/35861719-3b6b-4572-8761-bb9c8bfce573-audit-policies\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.404407 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e6595d49-3b53-44fc-a253-a252a53333a2-console-serving-cert\") pod \"console-f9d7485db-thqss\" (UID: \"e6595d49-3b53-44fc-a253-a252a53333a2\") " pod="openshift-console/console-f9d7485db-thqss" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.404433 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e6595d49-3b53-44fc-a253-a252a53333a2-oauth-serving-cert\") pod \"console-f9d7485db-thqss\" (UID: \"e6595d49-3b53-44fc-a253-a252a53333a2\") " pod="openshift-console/console-f9d7485db-thqss" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.405120 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-trusted-ca-bundle\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.405430 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nt7q4"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.405519 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdghg\" (UniqueName: \"kubernetes.io/projected/8ef937a5-1513-4ca5-b719-5418118d0987-kube-api-access-kdghg\") pod \"machine-approver-56656f9798-trqkm\" (UID: \"8ef937a5-1513-4ca5-b719-5418118d0987\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-trqkm" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 
10:33:34.405567 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmnjt\" (UniqueName: \"kubernetes.io/projected/bec6950a-558b-4df9-a47b-128b7a3e4edb-kube-api-access-cmnjt\") pod \"openshift-apiserver-operator-796bbdcf4f-f8wcs\" (UID: \"bec6950a-558b-4df9-a47b-128b7a3e4edb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f8wcs" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.408434 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-64hf5"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.412051 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mstlw"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.413866 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.416652 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-slspb"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.418677 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.420190 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f8wcs"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.422145 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.424441 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4cdjr"] Feb 02 10:33:34 crc 
kubenswrapper[4909]: I0202 10:33:34.427854 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jcpbk"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.429626 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ljw49"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.430759 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpxbr"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.431731 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qqbpq"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.432708 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ftbpn"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.433753 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptpkp"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.434724 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-v6jr9"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.435988 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2xztg"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.437348 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rqcgf"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.438122 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-h8sfq"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.439082 4909 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-tsvwl"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.440016 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdmsw"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.441800 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hx79g"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.443440 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-gprz6"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.443533 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.445073 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kddv4"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.445892 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kw2jt"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.447320 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2m2wr"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.448894 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gvxk8"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.449775 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-9sjgx"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.451609 4909 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-dns/dns-default-jp9hd"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.451732 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-9sjgx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.452707 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500470-c2btb"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.452965 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jp9hd" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.453680 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9gvvq"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.455156 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-9sjgx"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.456630 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7rkdz"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.457876 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jp9hd"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.459209 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-f59sb"] Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.459692 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-f59sb" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.461947 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.483074 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.502367 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.506101 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5srtt\" (UniqueName: \"kubernetes.io/projected/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-kube-api-access-5srtt\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.506146 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ba5776ad-fa9b-4f21-9dca-40958f01e293-audit-policies\") pod \"apiserver-7bbb656c7d-q62sq\" (UID: \"ba5776ad-fa9b-4f21-9dca-40958f01e293\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.506179 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6514ff15-d150-4c13-b96b-cf885b71504a-config\") pod \"console-operator-58897d9998-64hf5\" (UID: \"6514ff15-d150-4c13-b96b-cf885b71504a\") " pod="openshift-console-operator/console-operator-58897d9998-64hf5" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.506213 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-serving-cert\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.506235 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-encryption-config\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.506263 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a2b2e114-73ec-4cc9-9c62-a57b8f7aebb2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jcpbk\" (UID: \"a2b2e114-73ec-4cc9-9c62-a57b8f7aebb2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jcpbk" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.506296 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/369d9bc5-f128-4daf-afcb-3e1cdf7da3fb-config\") pod \"kube-controller-manager-operator-78b949d7b-6wjkz\" (UID: \"369d9bc5-f128-4daf-afcb-3e1cdf7da3fb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6wjkz" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.506327 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqt59\" (UniqueName: \"kubernetes.io/projected/b85dc0d9-6b5b-4254-bd4b-a99712b8a1fb-kube-api-access-mqt59\") pod \"downloads-7954f5f757-gprz6\" (UID: \"b85dc0d9-6b5b-4254-bd4b-a99712b8a1fb\") " 
pod="openshift-console/downloads-7954f5f757-gprz6" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.506356 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e6595d49-3b53-44fc-a253-a252a53333a2-console-oauth-config\") pod \"console-f9d7485db-thqss\" (UID: \"e6595d49-3b53-44fc-a253-a252a53333a2\") " pod="openshift-console/console-f9d7485db-thqss" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.506379 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.506401 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3a292656-f231-47aa-9ad8-f47d92cafb32-images\") pod \"machine-config-operator-74547568cd-h8sfq\" (UID: \"3a292656-f231-47aa-9ad8-f47d92cafb32\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h8sfq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.506426 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb6lm\" (UniqueName: \"kubernetes.io/projected/349306e1-fe46-449a-96c5-0b1e13f29733-kube-api-access-kb6lm\") pod \"cluster-samples-operator-665b6dd947-rqcgf\" (UID: \"349306e1-fe46-449a-96c5-0b1e13f29733\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rqcgf" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.506449 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/87f3258e-a4f7-47bb-b448-aaa85f41c9d3-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mstlw\" (UID: \"87f3258e-a4f7-47bb-b448-aaa85f41c9d3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mstlw" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.506475 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.506503 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ba5776ad-fa9b-4f21-9dca-40958f01e293-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-q62sq\" (UID: \"ba5776ad-fa9b-4f21-9dca-40958f01e293\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.507391 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87f3258e-a4f7-47bb-b448-aaa85f41c9d3-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-mstlw\" (UID: \"87f3258e-a4f7-47bb-b448-aaa85f41c9d3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mstlw" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.507611 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6514ff15-d150-4c13-b96b-cf885b71504a-config\") pod \"console-operator-58897d9998-64hf5\" (UID: \"6514ff15-d150-4c13-b96b-cf885b71504a\") " pod="openshift-console-operator/console-operator-58897d9998-64hf5" Feb 02 10:33:34 crc kubenswrapper[4909]: 
I0202 10:33:34.507737 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ba5776ad-fa9b-4f21-9dca-40958f01e293-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-q62sq\" (UID: \"ba5776ad-fa9b-4f21-9dca-40958f01e293\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.507792 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ba5776ad-fa9b-4f21-9dca-40958f01e293-audit-policies\") pod \"apiserver-7bbb656c7d-q62sq\" (UID: \"ba5776ad-fa9b-4f21-9dca-40958f01e293\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.509202 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba5776ad-fa9b-4f21-9dca-40958f01e293-serving-cert\") pod \"apiserver-7bbb656c7d-q62sq\" (UID: \"ba5776ad-fa9b-4f21-9dca-40958f01e293\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.509341 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/112f9f29-0947-4a25-adde-b74961a6001b-config\") pod \"authentication-operator-69f744f599-gxpkn\" (UID: \"112f9f29-0947-4a25-adde-b74961a6001b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gxpkn" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.510008 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/112f9f29-0947-4a25-adde-b74961a6001b-config\") pod \"authentication-operator-69f744f599-gxpkn\" (UID: \"112f9f29-0947-4a25-adde-b74961a6001b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gxpkn" Feb 02 10:33:34 crc 
kubenswrapper[4909]: I0202 10:33:34.510053 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2b2e114-73ec-4cc9-9c62-a57b8f7aebb2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jcpbk\" (UID: \"a2b2e114-73ec-4cc9-9c62-a57b8f7aebb2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jcpbk" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.510092 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2b2e114-73ec-4cc9-9c62-a57b8f7aebb2-config\") pod \"kube-apiserver-operator-766d6c64bb-jcpbk\" (UID: \"a2b2e114-73ec-4cc9-9c62-a57b8f7aebb2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jcpbk" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.510125 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4txr\" (UniqueName: \"kubernetes.io/projected/87f3258e-a4f7-47bb-b448-aaa85f41c9d3-kube-api-access-p4txr\") pod \"openshift-controller-manager-operator-756b6f6bc6-mstlw\" (UID: \"87f3258e-a4f7-47bb-b448-aaa85f41c9d3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mstlw" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.510253 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ba5776ad-fa9b-4f21-9dca-40958f01e293-encryption-config\") pod \"apiserver-7bbb656c7d-q62sq\" (UID: \"ba5776ad-fa9b-4f21-9dca-40958f01e293\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.510404 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.510755 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e-client-ca\") pod \"route-controller-manager-6576b87f9c-5mlmj\" (UID: \"d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.511638 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e-client-ca\") pod \"route-controller-manager-6576b87f9c-5mlmj\" (UID: \"d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.511660 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.511706 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6514ff15-d150-4c13-b96b-cf885b71504a-trusted-ca\") pod \"console-operator-58897d9998-64hf5\" (UID: \"6514ff15-d150-4c13-b96b-cf885b71504a\") " pod="openshift-console-operator/console-operator-58897d9998-64hf5" Feb 02 10:33:34 crc 
kubenswrapper[4909]: I0202 10:33:34.511748 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-audit-dir\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.511762 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.511773 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e650bcf6-f84c-4322-86ac-6df17841176d-config\") pod \"machine-api-operator-5694c8668f-ghmnw\" (UID: \"e650bcf6-f84c-4322-86ac-6df17841176d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ghmnw" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.511816 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8ef937a5-1513-4ca5-b719-5418118d0987-machine-approver-tls\") pod \"machine-approver-56656f9798-trqkm\" (UID: \"8ef937a5-1513-4ca5-b719-5418118d0987\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-trqkm" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.511829 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-audit-dir\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " 
pod="openshift-apiserver/apiserver-76f77b778f-knpc2" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.511844 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa665eb8-a288-4806-9192-fc30b30868db-serving-cert\") pod \"etcd-operator-b45778765-qqbpq\" (UID: \"aa665eb8-a288-4806-9192-fc30b30868db\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qqbpq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.511876 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f318b6bc-f681-4794-bab8-059ccf270229-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2xztg\" (UID: \"f318b6bc-f681-4794-bab8-059ccf270229\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2xztg" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.511901 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bec6950a-558b-4df9-a47b-128b7a3e4edb-config\") pod \"openshift-apiserver-operator-796bbdcf4f-f8wcs\" (UID: \"bec6950a-558b-4df9-a47b-128b7a3e4edb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f8wcs" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.511927 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2czx4\" (UniqueName: \"kubernetes.io/projected/112f9f29-0947-4a25-adde-b74961a6001b-kube-api-access-2czx4\") pod \"authentication-operator-69f744f599-gxpkn\" (UID: \"112f9f29-0947-4a25-adde-b74961a6001b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gxpkn" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.511955 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.511979 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.511994 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e6595d49-3b53-44fc-a253-a252a53333a2-console-oauth-config\") pod \"console-f9d7485db-thqss\" (UID: \"e6595d49-3b53-44fc-a253-a252a53333a2\") " pod="openshift-console/console-f9d7485db-thqss" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512004 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512033 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f25z5\" (UniqueName: \"kubernetes.io/projected/06cb86a1-a324-4ba8-b508-1ddcc20b4025-kube-api-access-f25z5\") pod \"ingress-operator-5b745b69d9-v6jr9\" (UID: \"06cb86a1-a324-4ba8-b508-1ddcc20b4025\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6jr9" Feb 02 10:33:34 crc 
kubenswrapper[4909]: I0202 10:33:34.512127 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-node-pullsecrets\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512155 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/175ecff3-6a2a-4076-a012-7eee503357f9-available-featuregates\") pod \"openshift-config-operator-7777fb866f-26rcr\" (UID: \"175ecff3-6a2a-4076-a012-7eee503357f9\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-26rcr" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512179 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mwjvx\" (UID: \"29fc305d-0c7f-4ef9-8ae2-6534374de1ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512206 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512230 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ba5776ad-fa9b-4f21-9dca-40958f01e293-etcd-client\") pod \"apiserver-7bbb656c7d-q62sq\" (UID: 
\"ba5776ad-fa9b-4f21-9dca-40958f01e293\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512252 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njf5q\" (UniqueName: \"kubernetes.io/projected/ba5776ad-fa9b-4f21-9dca-40958f01e293-kube-api-access-njf5q\") pod \"apiserver-7bbb656c7d-q62sq\" (UID: \"ba5776ad-fa9b-4f21-9dca-40958f01e293\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512275 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512302 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/166ca18b-e724-4858-b168-c5f607ed9def-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-kpxbr\" (UID: \"166ca18b-e724-4858-b168-c5f607ed9def\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpxbr" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512329 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3a292656-f231-47aa-9ad8-f47d92cafb32-auth-proxy-config\") pod \"machine-config-operator-74547568cd-h8sfq\" (UID: \"3a292656-f231-47aa-9ad8-f47d92cafb32\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h8sfq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512360 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ef937a5-1513-4ca5-b719-5418118d0987-config\") pod \"machine-approver-56656f9798-trqkm\" (UID: \"8ef937a5-1513-4ca5-b719-5418118d0987\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-trqkm" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512385 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69s26\" (UniqueName: \"kubernetes.io/projected/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-kube-api-access-69s26\") pod \"controller-manager-879f6c89f-mwjvx\" (UID: \"29fc305d-0c7f-4ef9-8ae2-6534374de1ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512412 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/166ca18b-e724-4858-b168-c5f607ed9def-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-kpxbr\" (UID: \"166ca18b-e724-4858-b168-c5f607ed9def\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpxbr" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512442 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/349306e1-fe46-449a-96c5-0b1e13f29733-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-rqcgf\" (UID: \"349306e1-fe46-449a-96c5-0b1e13f29733\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rqcgf" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512465 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6595d49-3b53-44fc-a253-a252a53333a2-trusted-ca-bundle\") pod \"console-f9d7485db-thqss\" (UID: 
\"e6595d49-3b53-44fc-a253-a252a53333a2\") " pod="openshift-console/console-f9d7485db-thqss" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512489 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/369d9bc5-f128-4daf-afcb-3e1cdf7da3fb-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-6wjkz\" (UID: \"369d9bc5-f128-4daf-afcb-3e1cdf7da3fb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6wjkz" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512511 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-audit\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512532 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/175ecff3-6a2a-4076-a012-7eee503357f9-serving-cert\") pod \"openshift-config-operator-7777fb866f-26rcr\" (UID: \"175ecff3-6a2a-4076-a012-7eee503357f9\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-26rcr" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512552 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8ef937a5-1513-4ca5-b719-5418118d0987-auth-proxy-config\") pod \"machine-approver-56656f9798-trqkm\" (UID: \"8ef937a5-1513-4ca5-b719-5418118d0987\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-trqkm" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512573 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/112f9f29-0947-4a25-adde-b74961a6001b-serving-cert\") pod \"authentication-operator-69f744f599-gxpkn\" (UID: \"112f9f29-0947-4a25-adde-b74961a6001b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gxpkn" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512592 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/35861719-3b6b-4572-8761-bb9c8bfce573-audit-dir\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512646 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-serving-cert\") pod \"controller-manager-879f6c89f-mwjvx\" (UID: \"29fc305d-0c7f-4ef9-8ae2-6534374de1ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512668 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhnd8\" (UniqueName: \"kubernetes.io/projected/0e581142-928a-4e07-888c-362d0ae7b49f-kube-api-access-qhnd8\") pod \"dns-operator-744455d44c-ljw49\" (UID: \"0e581142-928a-4e07-888c-362d0ae7b49f\") " pod="openshift-dns-operator/dns-operator-744455d44c-ljw49" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512689 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/06cb86a1-a324-4ba8-b508-1ddcc20b4025-trusted-ca\") pod \"ingress-operator-5b745b69d9-v6jr9\" (UID: \"06cb86a1-a324-4ba8-b508-1ddcc20b4025\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6jr9" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512711 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e650bcf6-f84c-4322-86ac-6df17841176d-images\") pod \"machine-api-operator-5694c8668f-ghmnw\" (UID: \"e650bcf6-f84c-4322-86ac-6df17841176d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ghmnw" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512733 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e650bcf6-f84c-4322-86ac-6df17841176d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ghmnw\" (UID: \"e650bcf6-f84c-4322-86ac-6df17841176d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ghmnw" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512753 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/aa665eb8-a288-4806-9192-fc30b30868db-etcd-client\") pod \"etcd-operator-b45778765-qqbpq\" (UID: \"aa665eb8-a288-4806-9192-fc30b30868db\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qqbpq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512773 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ljh2\" (UniqueName: \"kubernetes.io/projected/aa665eb8-a288-4806-9192-fc30b30868db-kube-api-access-9ljh2\") pod \"etcd-operator-b45778765-qqbpq\" (UID: \"aa665eb8-a288-4806-9192-fc30b30868db\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qqbpq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512793 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/369d9bc5-f128-4daf-afcb-3e1cdf7da3fb-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-6wjkz\" (UID: \"369d9bc5-f128-4daf-afcb-3e1cdf7da3fb\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6wjkz" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512832 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512868 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e6595d49-3b53-44fc-a253-a252a53333a2-console-config\") pod \"console-f9d7485db-thqss\" (UID: \"e6595d49-3b53-44fc-a253-a252a53333a2\") " pod="openshift-console/console-f9d7485db-thqss" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512889 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e6595d49-3b53-44fc-a253-a252a53333a2-service-ca\") pod \"console-f9d7485db-thqss\" (UID: \"e6595d49-3b53-44fc-a253-a252a53333a2\") " pod="openshift-console/console-f9d7485db-thqss" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512908 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0e581142-928a-4e07-888c-362d0ae7b49f-metrics-tls\") pod \"dns-operator-744455d44c-ljw49\" (UID: \"0e581142-928a-4e07-888c-362d0ae7b49f\") " pod="openshift-dns-operator/dns-operator-744455d44c-ljw49" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512932 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-image-import-ca\") pod 
\"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512953 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8869\" (UniqueName: \"kubernetes.io/projected/175ecff3-6a2a-4076-a012-7eee503357f9-kube-api-access-c8869\") pod \"openshift-config-operator-7777fb866f-26rcr\" (UID: \"175ecff3-6a2a-4076-a012-7eee503357f9\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-26rcr" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512974 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ba5776ad-fa9b-4f21-9dca-40958f01e293-audit-dir\") pod \"apiserver-7bbb656c7d-q62sq\" (UID: \"ba5776ad-fa9b-4f21-9dca-40958f01e293\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.512994 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/aa665eb8-a288-4806-9192-fc30b30868db-etcd-service-ca\") pod \"etcd-operator-b45778765-qqbpq\" (UID: \"aa665eb8-a288-4806-9192-fc30b30868db\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qqbpq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513015 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/112f9f29-0947-4a25-adde-b74961a6001b-service-ca-bundle\") pod \"authentication-operator-69f744f599-gxpkn\" (UID: \"112f9f29-0947-4a25-adde-b74961a6001b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gxpkn" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513038 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513061 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/06cb86a1-a324-4ba8-b508-1ddcc20b4025-metrics-tls\") pod \"ingress-operator-5b745b69d9-v6jr9\" (UID: \"06cb86a1-a324-4ba8-b508-1ddcc20b4025\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6jr9" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513085 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87f3258e-a4f7-47bb-b448-aaa85f41c9d3-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mstlw\" (UID: \"87f3258e-a4f7-47bb-b448-aaa85f41c9d3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mstlw" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513107 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e-config\") pod \"route-controller-manager-6576b87f9c-5mlmj\" (UID: \"d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513134 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e-serving-cert\") pod \"route-controller-manager-6576b87f9c-5mlmj\" (UID: \"d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513170 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b9wk\" (UniqueName: \"kubernetes.io/projected/e6595d49-3b53-44fc-a253-a252a53333a2-kube-api-access-8b9wk\") pod \"console-f9d7485db-thqss\" (UID: \"e6595d49-3b53-44fc-a253-a252a53333a2\") " pod="openshift-console/console-f9d7485db-thqss" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513193 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/aa665eb8-a288-4806-9192-fc30b30868db-etcd-ca\") pod \"etcd-operator-b45778765-qqbpq\" (UID: \"aa665eb8-a288-4806-9192-fc30b30868db\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qqbpq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513297 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/112f9f29-0947-4a25-adde-b74961a6001b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-gxpkn\" (UID: \"112f9f29-0947-4a25-adde-b74961a6001b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gxpkn" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513337 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wstr\" (UniqueName: \"kubernetes.io/projected/35861719-3b6b-4572-8761-bb9c8bfce573-kube-api-access-4wstr\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513360 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8l56\" (UniqueName: 
\"kubernetes.io/projected/6514ff15-d150-4c13-b96b-cf885b71504a-kube-api-access-s8l56\") pod \"console-operator-58897d9998-64hf5\" (UID: \"6514ff15-d150-4c13-b96b-cf885b71504a\") " pod="openshift-console-operator/console-operator-58897d9998-64hf5" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513385 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slhpd\" (UniqueName: \"kubernetes.io/projected/166ca18b-e724-4858-b168-c5f607ed9def-kube-api-access-slhpd\") pod \"cluster-image-registry-operator-dc59b4c8b-kpxbr\" (UID: \"166ca18b-e724-4858-b168-c5f607ed9def\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpxbr" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513407 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a292656-f231-47aa-9ad8-f47d92cafb32-proxy-tls\") pod \"machine-config-operator-74547568cd-h8sfq\" (UID: \"3a292656-f231-47aa-9ad8-f47d92cafb32\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h8sfq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513438 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-config\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513460 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/35861719-3b6b-4572-8761-bb9c8bfce573-audit-policies\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: 
I0202 10:33:34.513483 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e6595d49-3b53-44fc-a253-a252a53333a2-console-serving-cert\") pod \"console-f9d7485db-thqss\" (UID: \"e6595d49-3b53-44fc-a253-a252a53333a2\") " pod="openshift-console/console-f9d7485db-thqss" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513507 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e6595d49-3b53-44fc-a253-a252a53333a2-oauth-serving-cert\") pod \"console-f9d7485db-thqss\" (UID: \"e6595d49-3b53-44fc-a253-a252a53333a2\") " pod="openshift-console/console-f9d7485db-thqss" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513531 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-trusted-ca-bundle\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513555 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdghg\" (UniqueName: \"kubernetes.io/projected/8ef937a5-1513-4ca5-b719-5418118d0987-kube-api-access-kdghg\") pod \"machine-approver-56656f9798-trqkm\" (UID: \"8ef937a5-1513-4ca5-b719-5418118d0987\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-trqkm" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513576 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmnjt\" (UniqueName: \"kubernetes.io/projected/bec6950a-558b-4df9-a47b-128b7a3e4edb-kube-api-access-cmnjt\") pod \"openshift-apiserver-operator-796bbdcf4f-f8wcs\" (UID: \"bec6950a-558b-4df9-a47b-128b7a3e4edb\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f8wcs" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513626 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-etcd-client\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513649 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgs2z\" (UniqueName: \"kubernetes.io/projected/e650bcf6-f84c-4322-86ac-6df17841176d-kube-api-access-lgs2z\") pod \"machine-api-operator-5694c8668f-ghmnw\" (UID: \"e650bcf6-f84c-4322-86ac-6df17841176d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ghmnw" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513672 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dnbx\" (UniqueName: \"kubernetes.io/projected/3a292656-f231-47aa-9ad8-f47d92cafb32-kube-api-access-5dnbx\") pod \"machine-config-operator-74547568cd-h8sfq\" (UID: \"3a292656-f231-47aa-9ad8-f47d92cafb32\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h8sfq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513696 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bec6950a-558b-4df9-a47b-128b7a3e4edb-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-f8wcs\" (UID: \"bec6950a-558b-4df9-a47b-128b7a3e4edb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f8wcs" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513702 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513729 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba5776ad-fa9b-4f21-9dca-40958f01e293-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-q62sq\" (UID: \"ba5776ad-fa9b-4f21-9dca-40958f01e293\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513753 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7ssq\" (UniqueName: \"kubernetes.io/projected/f318b6bc-f681-4794-bab8-059ccf270229-kube-api-access-r7ssq\") pod \"olm-operator-6b444d44fb-2xztg\" (UID: \"f318b6bc-f681-4794-bab8-059ccf270229\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2xztg" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513774 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzcnm\" (UniqueName: \"kubernetes.io/projected/d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e-kube-api-access-xzcnm\") pod \"route-controller-manager-6576b87f9c-5mlmj\" (UID: \"d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513797 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6514ff15-d150-4c13-b96b-cf885b71504a-serving-cert\") pod \"console-operator-58897d9998-64hf5\" (UID: \"6514ff15-d150-4c13-b96b-cf885b71504a\") " pod="openshift-console-operator/console-operator-58897d9998-64hf5" Feb 02 10:33:34 crc 
kubenswrapper[4909]: I0202 10:33:34.513831 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/166ca18b-e724-4858-b168-c5f607ed9def-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-kpxbr\" (UID: \"166ca18b-e724-4858-b168-c5f607ed9def\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpxbr" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513856 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-etcd-serving-ca\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513880 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-config\") pod \"controller-manager-879f6c89f-mwjvx\" (UID: \"29fc305d-0c7f-4ef9-8ae2-6534374de1ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513901 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-client-ca\") pod \"controller-manager-879f6c89f-mwjvx\" (UID: \"29fc305d-0c7f-4ef9-8ae2-6534374de1ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513920 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa665eb8-a288-4806-9192-fc30b30868db-config\") pod \"etcd-operator-b45778765-qqbpq\" (UID: \"aa665eb8-a288-4806-9192-fc30b30868db\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-qqbpq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513920 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e650bcf6-f84c-4322-86ac-6df17841176d-config\") pod \"machine-api-operator-5694c8668f-ghmnw\" (UID: \"e650bcf6-f84c-4322-86ac-6df17841176d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ghmnw" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513940 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513970 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/06cb86a1-a324-4ba8-b508-1ddcc20b4025-bound-sa-token\") pod \"ingress-operator-5b745b69d9-v6jr9\" (UID: \"06cb86a1-a324-4ba8-b508-1ddcc20b4025\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6jr9" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513978 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6514ff15-d150-4c13-b96b-cf885b71504a-trusted-ca\") pod \"console-operator-58897d9998-64hf5\" (UID: \"6514ff15-d150-4c13-b96b-cf885b71504a\") " pod="openshift-console-operator/console-operator-58897d9998-64hf5" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.513991 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f318b6bc-f681-4794-bab8-059ccf270229-srv-cert\") pod 
\"olm-operator-6b444d44fb-2xztg\" (UID: \"f318b6bc-f681-4794-bab8-059ccf270229\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2xztg" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.514163 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bec6950a-558b-4df9-a47b-128b7a3e4edb-config\") pod \"openshift-apiserver-operator-796bbdcf4f-f8wcs\" (UID: \"bec6950a-558b-4df9-a47b-128b7a3e4edb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f8wcs" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.514421 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa665eb8-a288-4806-9192-fc30b30868db-serving-cert\") pod \"etcd-operator-b45778765-qqbpq\" (UID: \"aa665eb8-a288-4806-9192-fc30b30868db\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qqbpq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.514423 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8ef937a5-1513-4ca5-b719-5418118d0987-machine-approver-tls\") pod \"machine-approver-56656f9798-trqkm\" (UID: \"8ef937a5-1513-4ca5-b719-5418118d0987\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-trqkm" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.514749 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.515117 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/e6595d49-3b53-44fc-a253-a252a53333a2-oauth-serving-cert\") pod \"console-f9d7485db-thqss\" (UID: \"e6595d49-3b53-44fc-a253-a252a53333a2\") " pod="openshift-console/console-f9d7485db-thqss" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.515119 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-image-import-ca\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.515313 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-serving-cert\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.515683 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/112f9f29-0947-4a25-adde-b74961a6001b-service-ca-bundle\") pod \"authentication-operator-69f744f599-gxpkn\" (UID: \"112f9f29-0947-4a25-adde-b74961a6001b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gxpkn" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.515972 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-client-ca\") pod \"controller-manager-879f6c89f-mwjvx\" (UID: \"29fc305d-0c7f-4ef9-8ae2-6534374de1ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.516138 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-trusted-ca-bundle\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.516213 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ba5776ad-fa9b-4f21-9dca-40958f01e293-audit-dir\") pod \"apiserver-7bbb656c7d-q62sq\" (UID: \"ba5776ad-fa9b-4f21-9dca-40958f01e293\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.516243 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/aa665eb8-a288-4806-9192-fc30b30868db-etcd-service-ca\") pod \"etcd-operator-b45778765-qqbpq\" (UID: \"aa665eb8-a288-4806-9192-fc30b30868db\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qqbpq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.516606 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ba5776ad-fa9b-4f21-9dca-40958f01e293-encryption-config\") pod \"apiserver-7bbb656c7d-q62sq\" (UID: \"ba5776ad-fa9b-4f21-9dca-40958f01e293\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.516644 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/112f9f29-0947-4a25-adde-b74961a6001b-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-gxpkn\" (UID: \"112f9f29-0947-4a25-adde-b74961a6001b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gxpkn" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.516944 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/ba5776ad-fa9b-4f21-9dca-40958f01e293-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-q62sq\" (UID: \"ba5776ad-fa9b-4f21-9dca-40958f01e293\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.516953 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/175ecff3-6a2a-4076-a012-7eee503357f9-available-featuregates\") pod \"openshift-config-operator-7777fb866f-26rcr\" (UID: \"175ecff3-6a2a-4076-a012-7eee503357f9\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-26rcr" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.517617 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-config\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.518025 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-etcd-client\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.518174 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba5776ad-fa9b-4f21-9dca-40958f01e293-serving-cert\") pod \"apiserver-7bbb656c7d-q62sq\" (UID: \"ba5776ad-fa9b-4f21-9dca-40958f01e293\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.518315 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-encryption-config\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.518317 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.518365 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.518400 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-node-pullsecrets\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.518711 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/35861719-3b6b-4572-8761-bb9c8bfce573-audit-dir\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.518939 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e6595d49-3b53-44fc-a253-a252a53333a2-service-ca\") pod \"console-f9d7485db-thqss\" (UID: \"e6595d49-3b53-44fc-a253-a252a53333a2\") " pod="openshift-console/console-f9d7485db-thqss" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.519244 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-etcd-serving-ca\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.519496 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa665eb8-a288-4806-9192-fc30b30868db-config\") pod \"etcd-operator-b45778765-qqbpq\" (UID: \"aa665eb8-a288-4806-9192-fc30b30868db\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qqbpq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.519600 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mwjvx\" (UID: \"29fc305d-0c7f-4ef9-8ae2-6534374de1ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.519702 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.519838 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/35861719-3b6b-4572-8761-bb9c8bfce573-audit-policies\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.520096 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-audit\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.520117 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e-serving-cert\") pod \"route-controller-manager-6576b87f9c-5mlmj\" (UID: \"d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.520175 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e650bcf6-f84c-4322-86ac-6df17841176d-images\") pod \"machine-api-operator-5694c8668f-ghmnw\" (UID: \"e650bcf6-f84c-4322-86ac-6df17841176d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ghmnw" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.520176 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8ef937a5-1513-4ca5-b719-5418118d0987-auth-proxy-config\") pod \"machine-approver-56656f9798-trqkm\" (UID: \"8ef937a5-1513-4ca5-b719-5418118d0987\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-trqkm" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 
10:33:34.520653 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ef937a5-1513-4ca5-b719-5418118d0987-config\") pod \"machine-approver-56656f9798-trqkm\" (UID: \"8ef937a5-1513-4ca5-b719-5418118d0987\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-trqkm" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.520907 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e6595d49-3b53-44fc-a253-a252a53333a2-console-config\") pod \"console-f9d7485db-thqss\" (UID: \"e6595d49-3b53-44fc-a253-a252a53333a2\") " pod="openshift-console/console-f9d7485db-thqss" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.521370 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-config\") pod \"controller-manager-879f6c89f-mwjvx\" (UID: \"29fc305d-0c7f-4ef9-8ae2-6534374de1ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.521480 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6595d49-3b53-44fc-a253-a252a53333a2-trusted-ca-bundle\") pod \"console-f9d7485db-thqss\" (UID: \"e6595d49-3b53-44fc-a253-a252a53333a2\") " pod="openshift-console/console-f9d7485db-thqss" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.521485 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e-config\") pod \"route-controller-manager-6576b87f9c-5mlmj\" (UID: \"d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.521798 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bec6950a-558b-4df9-a47b-128b7a3e4edb-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-f8wcs\" (UID: \"bec6950a-558b-4df9-a47b-128b7a3e4edb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f8wcs" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.521886 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6514ff15-d150-4c13-b96b-cf885b71504a-serving-cert\") pod \"console-operator-58897d9998-64hf5\" (UID: \"6514ff15-d150-4c13-b96b-cf885b71504a\") " pod="openshift-console-operator/console-operator-58897d9998-64hf5" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.522753 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0e581142-928a-4e07-888c-362d0ae7b49f-metrics-tls\") pod \"dns-operator-744455d44c-ljw49\" (UID: \"0e581142-928a-4e07-888c-362d0ae7b49f\") " pod="openshift-dns-operator/dns-operator-744455d44c-ljw49" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.522917 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87f3258e-a4f7-47bb-b448-aaa85f41c9d3-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-mstlw\" (UID: \"87f3258e-a4f7-47bb-b448-aaa85f41c9d3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mstlw" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.523424 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.524123 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/aa665eb8-a288-4806-9192-fc30b30868db-etcd-client\") pod \"etcd-operator-b45778765-qqbpq\" (UID: \"aa665eb8-a288-4806-9192-fc30b30868db\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qqbpq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.524128 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e650bcf6-f84c-4322-86ac-6df17841176d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ghmnw\" (UID: \"e650bcf6-f84c-4322-86ac-6df17841176d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ghmnw" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.524201 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e6595d49-3b53-44fc-a253-a252a53333a2-console-serving-cert\") pod \"console-f9d7485db-thqss\" (UID: \"e6595d49-3b53-44fc-a253-a252a53333a2\") " pod="openshift-console/console-f9d7485db-thqss" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.524149 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-serving-cert\") pod \"controller-manager-879f6c89f-mwjvx\" (UID: \"29fc305d-0c7f-4ef9-8ae2-6534374de1ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.524907 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.525440 4909 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.525578 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/349306e1-fe46-449a-96c5-0b1e13f29733-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-rqcgf\" (UID: \"349306e1-fe46-449a-96c5-0b1e13f29733\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rqcgf" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.526336 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ba5776ad-fa9b-4f21-9dca-40958f01e293-etcd-client\") pod \"apiserver-7bbb656c7d-q62sq\" (UID: \"ba5776ad-fa9b-4f21-9dca-40958f01e293\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.526651 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/175ecff3-6a2a-4076-a012-7eee503357f9-serving-cert\") pod \"openshift-config-operator-7777fb866f-26rcr\" (UID: \"175ecff3-6a2a-4076-a012-7eee503357f9\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-26rcr" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.526716 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 
02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.526764 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/112f9f29-0947-4a25-adde-b74961a6001b-serving-cert\") pod \"authentication-operator-69f744f599-gxpkn\" (UID: \"112f9f29-0947-4a25-adde-b74961a6001b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gxpkn" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.528254 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.531949 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2b2e114-73ec-4cc9-9c62-a57b8f7aebb2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jcpbk\" (UID: \"a2b2e114-73ec-4cc9-9c62-a57b8f7aebb2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jcpbk" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.542787 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.551429 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2b2e114-73ec-4cc9-9c62-a57b8f7aebb2-config\") pod \"kube-apiserver-operator-766d6c64bb-jcpbk\" (UID: \"a2b2e114-73ec-4cc9-9c62-a57b8f7aebb2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jcpbk" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.562285 4909 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.570594 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/aa665eb8-a288-4806-9192-fc30b30868db-etcd-ca\") pod \"etcd-operator-b45778765-qqbpq\" (UID: \"aa665eb8-a288-4806-9192-fc30b30868db\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qqbpq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.583131 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.602388 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.607946 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/369d9bc5-f128-4daf-afcb-3e1cdf7da3fb-config\") pod \"kube-controller-manager-operator-78b949d7b-6wjkz\" (UID: \"369d9bc5-f128-4daf-afcb-3e1cdf7da3fb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6wjkz" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.615698 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slhpd\" (UniqueName: \"kubernetes.io/projected/166ca18b-e724-4858-b168-c5f607ed9def-kube-api-access-slhpd\") pod \"cluster-image-registry-operator-dc59b4c8b-kpxbr\" (UID: \"166ca18b-e724-4858-b168-c5f607ed9def\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpxbr" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.615751 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/3a292656-f231-47aa-9ad8-f47d92cafb32-proxy-tls\") pod \"machine-config-operator-74547568cd-h8sfq\" (UID: \"3a292656-f231-47aa-9ad8-f47d92cafb32\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h8sfq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.615826 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dnbx\" (UniqueName: \"kubernetes.io/projected/3a292656-f231-47aa-9ad8-f47d92cafb32-kube-api-access-5dnbx\") pod \"machine-config-operator-74547568cd-h8sfq\" (UID: \"3a292656-f231-47aa-9ad8-f47d92cafb32\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h8sfq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.615848 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7ssq\" (UniqueName: \"kubernetes.io/projected/f318b6bc-f681-4794-bab8-059ccf270229-kube-api-access-r7ssq\") pod \"olm-operator-6b444d44fb-2xztg\" (UID: \"f318b6bc-f681-4794-bab8-059ccf270229\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2xztg" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.615867 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/166ca18b-e724-4858-b168-c5f607ed9def-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-kpxbr\" (UID: \"166ca18b-e724-4858-b168-c5f607ed9def\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpxbr" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.615892 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f318b6bc-f681-4794-bab8-059ccf270229-srv-cert\") pod \"olm-operator-6b444d44fb-2xztg\" (UID: \"f318b6bc-f681-4794-bab8-059ccf270229\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2xztg" Feb 02 10:33:34 
crc kubenswrapper[4909]: I0202 10:33:34.615908 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/06cb86a1-a324-4ba8-b508-1ddcc20b4025-bound-sa-token\") pod \"ingress-operator-5b745b69d9-v6jr9\" (UID: \"06cb86a1-a324-4ba8-b508-1ddcc20b4025\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6jr9" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.615949 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqt59\" (UniqueName: \"kubernetes.io/projected/b85dc0d9-6b5b-4254-bd4b-a99712b8a1fb-kube-api-access-mqt59\") pod \"downloads-7954f5f757-gprz6\" (UID: \"b85dc0d9-6b5b-4254-bd4b-a99712b8a1fb\") " pod="openshift-console/downloads-7954f5f757-gprz6" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.615967 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3a292656-f231-47aa-9ad8-f47d92cafb32-images\") pod \"machine-config-operator-74547568cd-h8sfq\" (UID: \"3a292656-f231-47aa-9ad8-f47d92cafb32\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h8sfq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.616026 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f318b6bc-f681-4794-bab8-059ccf270229-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2xztg\" (UID: \"f318b6bc-f681-4794-bab8-059ccf270229\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2xztg" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.616065 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f25z5\" (UniqueName: \"kubernetes.io/projected/06cb86a1-a324-4ba8-b508-1ddcc20b4025-kube-api-access-f25z5\") pod \"ingress-operator-5b745b69d9-v6jr9\" (UID: 
\"06cb86a1-a324-4ba8-b508-1ddcc20b4025\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6jr9" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.616105 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/166ca18b-e724-4858-b168-c5f607ed9def-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-kpxbr\" (UID: \"166ca18b-e724-4858-b168-c5f607ed9def\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpxbr" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.616127 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3a292656-f231-47aa-9ad8-f47d92cafb32-auth-proxy-config\") pod \"machine-config-operator-74547568cd-h8sfq\" (UID: \"3a292656-f231-47aa-9ad8-f47d92cafb32\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h8sfq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.616152 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/166ca18b-e724-4858-b168-c5f607ed9def-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-kpxbr\" (UID: \"166ca18b-e724-4858-b168-c5f607ed9def\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpxbr" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.616191 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/06cb86a1-a324-4ba8-b508-1ddcc20b4025-trusted-ca\") pod \"ingress-operator-5b745b69d9-v6jr9\" (UID: \"06cb86a1-a324-4ba8-b508-1ddcc20b4025\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6jr9" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.616257 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/06cb86a1-a324-4ba8-b508-1ddcc20b4025-metrics-tls\") pod \"ingress-operator-5b745b69d9-v6jr9\" (UID: \"06cb86a1-a324-4ba8-b508-1ddcc20b4025\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6jr9" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.617294 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3a292656-f231-47aa-9ad8-f47d92cafb32-auth-proxy-config\") pod \"machine-config-operator-74547568cd-h8sfq\" (UID: \"3a292656-f231-47aa-9ad8-f47d92cafb32\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h8sfq" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.622346 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.642526 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.651484 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/369d9bc5-f128-4daf-afcb-3e1cdf7da3fb-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-6wjkz\" (UID: \"369d9bc5-f128-4daf-afcb-3e1cdf7da3fb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6wjkz" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.663290 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.687146 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 
10:33:34.703543 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.722121 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.742861 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.767007 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.777454 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/166ca18b-e724-4858-b168-c5f607ed9def-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-kpxbr\" (UID: \"166ca18b-e724-4858-b168-c5f607ed9def\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpxbr" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.782217 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.803196 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.822276 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.862541 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.881791 4909 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.902913 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.922135 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.942009 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.962025 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.971970 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/166ca18b-e724-4858-b168-c5f607ed9def-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-kpxbr\" (UID: \"166ca18b-e724-4858-b168-c5f607ed9def\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpxbr" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.988643 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 02 10:33:34 crc kubenswrapper[4909]: I0202 10:33:34.999099 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/06cb86a1-a324-4ba8-b508-1ddcc20b4025-trusted-ca\") pod \"ingress-operator-5b745b69d9-v6jr9\" (UID: \"06cb86a1-a324-4ba8-b508-1ddcc20b4025\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6jr9" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.001763 4909 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.023859 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.030091 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/06cb86a1-a324-4ba8-b508-1ddcc20b4025-metrics-tls\") pod \"ingress-operator-5b745b69d9-v6jr9\" (UID: \"06cb86a1-a324-4ba8-b508-1ddcc20b4025\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6jr9" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.043335 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.063571 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.083054 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.103512 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.122859 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.142747 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.162654 4909 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.182139 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.204434 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.222324 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.244891 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.263451 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.282580 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.303217 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.322778 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.342852 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.351002 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f318b6bc-f681-4794-bab8-059ccf270229-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-2xztg\" (UID: \"f318b6bc-f681-4794-bab8-059ccf270229\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2xztg" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.364539 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.381379 4909 request.go:700] Waited for 1.018001471s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.384225 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.402916 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.423056 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.442683 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.462001 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.482534 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.490584 4909 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f318b6bc-f681-4794-bab8-059ccf270229-srv-cert\") pod \"olm-operator-6b444d44fb-2xztg\" (UID: \"f318b6bc-f681-4794-bab8-059ccf270229\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2xztg" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.503766 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.523248 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.553951 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.562590 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.583411 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.602980 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 02 10:33:35 crc kubenswrapper[4909]: E0202 10:33:35.616298 4909 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/machine-config-operator-images: failed to sync configmap cache: timed out waiting for the condition Feb 02 10:33:35 crc kubenswrapper[4909]: E0202 10:33:35.616509 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3a292656-f231-47aa-9ad8-f47d92cafb32-images podName:3a292656-f231-47aa-9ad8-f47d92cafb32 nodeName:}" failed. 
No retries permitted until 2026-02-02 10:33:36.116484357 +0000 UTC m=+141.862585092 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/3a292656-f231-47aa-9ad8-f47d92cafb32-images") pod "machine-config-operator-74547568cd-h8sfq" (UID: "3a292656-f231-47aa-9ad8-f47d92cafb32") : failed to sync configmap cache: timed out waiting for the condition Feb 02 10:33:35 crc kubenswrapper[4909]: E0202 10:33:35.616523 4909 secret.go:188] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition Feb 02 10:33:35 crc kubenswrapper[4909]: E0202 10:33:35.616714 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a292656-f231-47aa-9ad8-f47d92cafb32-proxy-tls podName:3a292656-f231-47aa-9ad8-f47d92cafb32 nodeName:}" failed. No retries permitted until 2026-02-02 10:33:36.116704923 +0000 UTC m=+141.862805658 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/3a292656-f231-47aa-9ad8-f47d92cafb32-proxy-tls") pod "machine-config-operator-74547568cd-h8sfq" (UID: "3a292656-f231-47aa-9ad8-f47d92cafb32") : failed to sync secret cache: timed out waiting for the condition Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.621772 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.643669 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.662879 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.683792 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.702509 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.722174 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.744374 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.763751 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.782395 4909 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.803252 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.822603 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.842325 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.863035 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.882405 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.902160 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.922653 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.943425 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.962518 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 02 10:33:35 crc kubenswrapper[4909]: I0202 10:33:35.982409 4909 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.003181 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.022456 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.043217 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.082123 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.102750 4909 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.122469 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.139107 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a292656-f231-47aa-9ad8-f47d92cafb32-proxy-tls\") pod \"machine-config-operator-74547568cd-h8sfq\" (UID: \"3a292656-f231-47aa-9ad8-f47d92cafb32\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h8sfq" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.139230 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3a292656-f231-47aa-9ad8-f47d92cafb32-images\") pod \"machine-config-operator-74547568cd-h8sfq\" (UID: \"3a292656-f231-47aa-9ad8-f47d92cafb32\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h8sfq" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.140007 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3a292656-f231-47aa-9ad8-f47d92cafb32-images\") pod \"machine-config-operator-74547568cd-h8sfq\" (UID: \"3a292656-f231-47aa-9ad8-f47d92cafb32\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h8sfq" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.143209 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.144150 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3a292656-f231-47aa-9ad8-f47d92cafb32-proxy-tls\") pod \"machine-config-operator-74547568cd-h8sfq\" (UID: \"3a292656-f231-47aa-9ad8-f47d92cafb32\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h8sfq" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.162644 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.182882 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.203087 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.222707 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.242934 4909 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.283511 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5srtt\" (UniqueName: \"kubernetes.io/projected/2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2-kube-api-access-5srtt\") pod \"apiserver-76f77b778f-knpc2\" (UID: \"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2\") " pod="openshift-apiserver/apiserver-76f77b778f-knpc2" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.295167 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a2b2e114-73ec-4cc9-9c62-a57b8f7aebb2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jcpbk\" (UID: \"a2b2e114-73ec-4cc9-9c62-a57b8f7aebb2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jcpbk" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.320735 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb6lm\" (UniqueName: \"kubernetes.io/projected/349306e1-fe46-449a-96c5-0b1e13f29733-kube-api-access-kb6lm\") pod \"cluster-samples-operator-665b6dd947-rqcgf\" (UID: \"349306e1-fe46-449a-96c5-0b1e13f29733\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rqcgf" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.334620 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4txr\" (UniqueName: \"kubernetes.io/projected/87f3258e-a4f7-47bb-b448-aaa85f41c9d3-kube-api-access-p4txr\") pod \"openshift-controller-manager-operator-756b6f6bc6-mstlw\" (UID: \"87f3258e-a4f7-47bb-b448-aaa85f41c9d3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mstlw" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.354985 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-knpc2" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.355675 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2czx4\" (UniqueName: \"kubernetes.io/projected/112f9f29-0947-4a25-adde-b74961a6001b-kube-api-access-2czx4\") pod \"authentication-operator-69f744f599-gxpkn\" (UID: \"112f9f29-0947-4a25-adde-b74961a6001b\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-gxpkn" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.378277 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b9wk\" (UniqueName: \"kubernetes.io/projected/e6595d49-3b53-44fc-a253-a252a53333a2-kube-api-access-8b9wk\") pod \"console-f9d7485db-thqss\" (UID: \"e6595d49-3b53-44fc-a253-a252a53333a2\") " pod="openshift-console/console-f9d7485db-thqss" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.381999 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-gxpkn" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.395957 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdghg\" (UniqueName: \"kubernetes.io/projected/8ef937a5-1513-4ca5-b719-5418118d0987-kube-api-access-kdghg\") pod \"machine-approver-56656f9798-trqkm\" (UID: \"8ef937a5-1513-4ca5-b719-5418118d0987\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-trqkm" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.400858 4909 request.go:700] Waited for 1.885089068s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver-operator/serviceaccounts/openshift-apiserver-operator/token Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.416062 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmnjt\" (UniqueName: \"kubernetes.io/projected/bec6950a-558b-4df9-a47b-128b7a3e4edb-kube-api-access-cmnjt\") pod \"openshift-apiserver-operator-796bbdcf4f-f8wcs\" (UID: \"bec6950a-558b-4df9-a47b-128b7a3e4edb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f8wcs" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.444226 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8869\" (UniqueName: \"kubernetes.io/projected/175ecff3-6a2a-4076-a012-7eee503357f9-kube-api-access-c8869\") pod \"openshift-config-operator-7777fb866f-26rcr\" (UID: \"175ecff3-6a2a-4076-a012-7eee503357f9\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-26rcr" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.461497 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f8wcs" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.470470 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzcnm\" (UniqueName: \"kubernetes.io/projected/d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e-kube-api-access-xzcnm\") pod \"route-controller-manager-6576b87f9c-5mlmj\" (UID: \"d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.478795 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8l56\" (UniqueName: \"kubernetes.io/projected/6514ff15-d150-4c13-b96b-cf885b71504a-kube-api-access-s8l56\") pod \"console-operator-58897d9998-64hf5\" (UID: \"6514ff15-d150-4c13-b96b-cf885b71504a\") " pod="openshift-console-operator/console-operator-58897d9998-64hf5" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.478969 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rqcgf" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.495005 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-26rcr" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.497738 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/369d9bc5-f128-4daf-afcb-3e1cdf7da3fb-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-6wjkz\" (UID: \"369d9bc5-f128-4daf-afcb-3e1cdf7da3fb\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6wjkz" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.499515 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-thqss" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.515701 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.523746 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njf5q\" (UniqueName: \"kubernetes.io/projected/ba5776ad-fa9b-4f21-9dca-40958f01e293-kube-api-access-njf5q\") pod \"apiserver-7bbb656c7d-q62sq\" (UID: \"ba5776ad-fa9b-4f21-9dca-40958f01e293\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.523788 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-trqkm" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.527715 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.540380 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgs2z\" (UniqueName: \"kubernetes.io/projected/e650bcf6-f84c-4322-86ac-6df17841176d-kube-api-access-lgs2z\") pod \"machine-api-operator-5694c8668f-ghmnw\" (UID: \"e650bcf6-f84c-4322-86ac-6df17841176d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ghmnw" Feb 02 10:33:36 crc kubenswrapper[4909]: W0202 10:33:36.546114 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ef937a5_1513_4ca5_b719_5418118d0987.slice/crio-0c3c8a364e48942871a4f788b1c6db83af01edeafd074f207bb20e75d16ceae5 WatchSource:0}: Error finding container 0c3c8a364e48942871a4f788b1c6db83af01edeafd074f207bb20e75d16ceae5: Status 404 returned error can't find the container with id 0c3c8a364e48942871a4f788b1c6db83af01edeafd074f207bb20e75d16ceae5 Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.549061 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-64hf5" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.551026 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-knpc2"] Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.557367 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhnd8\" (UniqueName: \"kubernetes.io/projected/0e581142-928a-4e07-888c-362d0ae7b49f-kube-api-access-qhnd8\") pod \"dns-operator-744455d44c-ljw49\" (UID: \"0e581142-928a-4e07-888c-362d0ae7b49f\") " pod="openshift-dns-operator/dns-operator-744455d44c-ljw49" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.560867 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mstlw" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.564679 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-gxpkn"] Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.567289 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jcpbk" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.578056 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ljh2\" (UniqueName: \"kubernetes.io/projected/aa665eb8-a288-4806-9192-fc30b30868db-kube-api-access-9ljh2\") pod \"etcd-operator-b45778765-qqbpq\" (UID: \"aa665eb8-a288-4806-9192-fc30b30868db\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qqbpq" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.596912 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wstr\" (UniqueName: \"kubernetes.io/projected/35861719-3b6b-4572-8761-bb9c8bfce573-kube-api-access-4wstr\") pod \"oauth-openshift-558db77b4-ddclx\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") " pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.618723 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69s26\" (UniqueName: \"kubernetes.io/projected/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-kube-api-access-69s26\") pod \"controller-manager-879f6c89f-mwjvx\" (UID: \"29fc305d-0c7f-4ef9-8ae2-6534374de1ea\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.644730 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dnbx\" (UniqueName: 
\"kubernetes.io/projected/3a292656-f231-47aa-9ad8-f47d92cafb32-kube-api-access-5dnbx\") pod \"machine-config-operator-74547568cd-h8sfq\" (UID: \"3a292656-f231-47aa-9ad8-f47d92cafb32\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h8sfq" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.659916 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-ghmnw" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.664507 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/06cb86a1-a324-4ba8-b508-1ddcc20b4025-bound-sa-token\") pod \"ingress-operator-5b745b69d9-v6jr9\" (UID: \"06cb86a1-a324-4ba8-b508-1ddcc20b4025\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6jr9" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.669138 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx" Feb 02 10:33:36 crc kubenswrapper[4909]: W0202 10:33:36.680419 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod112f9f29_0947_4a25_adde_b74961a6001b.slice/crio-577a949010dda0966c3504e46e2d83b45d077dc7253fb0df7445f07d3b02ec4a WatchSource:0}: Error finding container 577a949010dda0966c3504e46e2d83b45d077dc7253fb0df7445f07d3b02ec4a: Status 404 returned error can't find the container with id 577a949010dda0966c3504e46e2d83b45d077dc7253fb0df7445f07d3b02ec4a Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.690223 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7ssq\" (UniqueName: \"kubernetes.io/projected/f318b6bc-f681-4794-bab8-059ccf270229-kube-api-access-r7ssq\") pod \"olm-operator-6b444d44fb-2xztg\" (UID: \"f318b6bc-f681-4794-bab8-059ccf270229\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2xztg" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.702911 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-knpc2" event={"ID":"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2","Type":"ContainerStarted","Data":"49a71ca17d0d2689a7883a802e5de64e4e7f42cc92dd6f9e5bc19542f36d7bb3"} Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.703150 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/166ca18b-e724-4858-b168-c5f607ed9def-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-kpxbr\" (UID: \"166ca18b-e724-4858-b168-c5f607ed9def\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpxbr" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.704403 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.705104 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-trqkm" event={"ID":"8ef937a5-1513-4ca5-b719-5418118d0987","Type":"ContainerStarted","Data":"0c3c8a364e48942871a4f788b1c6db83af01edeafd074f207bb20e75d16ceae5"} Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.720980 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqt59\" (UniqueName: \"kubernetes.io/projected/b85dc0d9-6b5b-4254-bd4b-a99712b8a1fb-kube-api-access-mqt59\") pod \"downloads-7954f5f757-gprz6\" (UID: \"b85dc0d9-6b5b-4254-bd4b-a99712b8a1fb\") " pod="openshift-console/downloads-7954f5f757-gprz6" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.725793 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2xztg" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.743969 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6wjkz" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.751952 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h8sfq" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.757586 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f25z5\" (UniqueName: \"kubernetes.io/projected/06cb86a1-a324-4ba8-b508-1ddcc20b4025-kube-api-access-f25z5\") pod \"ingress-operator-5b745b69d9-v6jr9\" (UID: \"06cb86a1-a324-4ba8-b508-1ddcc20b4025\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6jr9" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.758718 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slhpd\" (UniqueName: \"kubernetes.io/projected/166ca18b-e724-4858-b168-c5f607ed9def-kube-api-access-slhpd\") pod \"cluster-image-registry-operator-dc59b4c8b-kpxbr\" (UID: \"166ca18b-e724-4858-b168-c5f607ed9def\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpxbr" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.838517 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ljw49" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.844954 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-26rcr"] Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.853784 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-ca-trust-extracted\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.853864 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsdpv\" (UniqueName: \"kubernetes.io/projected/a39a1e38-d1f2-4cda-a088-3a22446407cc-kube-api-access-gsdpv\") pod \"multus-admission-controller-857f4d67dd-2m2wr\" (UID: \"a39a1e38-d1f2-4cda-a088-3a22446407cc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2m2wr" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.853896 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rnmq\" (UniqueName: \"kubernetes.io/projected/ec5d8d6b-4bf8-442a-813d-7312fe78ab8a-kube-api-access-8rnmq\") pod \"ingress-canary-9gvvq\" (UID: \"ec5d8d6b-4bf8-442a-813d-7312fe78ab8a\") " pod="openshift-ingress-canary/ingress-canary-9gvvq" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.853938 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5zq2\" (UniqueName: \"kubernetes.io/projected/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-kube-api-access-g5zq2\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.853958 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/bac9c408-8a21-4db2-b450-214c285e45c4-tmpfs\") pod \"packageserver-d55dfcdfc-gvxk8\" (UID: \"bac9c408-8a21-4db2-b450-214c285e45c4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gvxk8" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.853993 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwdjf\" (UniqueName: \"kubernetes.io/projected/6c932fa7-7181-4aac-bf6c-8a6d56f92ece-kube-api-access-nwdjf\") pod \"control-plane-machine-set-operator-78cbb6b69f-ptpkp\" (UID: \"6c932fa7-7181-4aac-bf6c-8a6d56f92ece\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptpkp" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.854037 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d4d413d-3b6e-436a-8e63-7c17ddb1cf86-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-kddv4\" (UID: \"7d4d413d-3b6e-436a-8e63-7c17ddb1cf86\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kddv4" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.854597 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d8bb8059-30d1-4fca-9f9e-c4dac4b0854f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ftbpn\" (UID: \"d8bb8059-30d1-4fca-9f9e-c4dac4b0854f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ftbpn" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.855207 
4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bac9c408-8a21-4db2-b450-214c285e45c4-apiservice-cert\") pod \"packageserver-d55dfcdfc-gvxk8\" (UID: \"bac9c408-8a21-4db2-b450-214c285e45c4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gvxk8" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.855239 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvx9m\" (UniqueName: \"kubernetes.io/projected/7d4d413d-3b6e-436a-8e63-7c17ddb1cf86-kube-api-access-mvx9m\") pod \"package-server-manager-789f6589d5-kddv4\" (UID: \"7d4d413d-3b6e-436a-8e63-7c17ddb1cf86\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kddv4" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.855276 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec5d8d6b-4bf8-442a-813d-7312fe78ab8a-cert\") pod \"ingress-canary-9gvvq\" (UID: \"ec5d8d6b-4bf8-442a-813d-7312fe78ab8a\") " pod="openshift-ingress-canary/ingress-canary-9gvvq" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.856424 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b35c3a0c-04bf-4603-997b-09b0a4976d67-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kw2jt\" (UID: \"b35c3a0c-04bf-4603-997b-09b0a4976d67\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kw2jt" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.856470 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl498\" (UniqueName: 
\"kubernetes.io/projected/6e592fee-28c7-49ef-b1e8-203a72d633b7-kube-api-access-pl498\") pod \"machine-config-controller-84d6567774-hx79g\" (UID: \"6e592fee-28c7-49ef-b1e8-203a72d633b7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hx79g" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.856881 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b575e8ec-7b85-4647-b0af-03274d67afc8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nt7q4\" (UID: \"b575e8ec-7b85-4647-b0af-03274d67afc8\") " pod="openshift-marketplace/marketplace-operator-79b997595-nt7q4" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.856932 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/839bd402-e470-4226-aa0c-d8d295773a5b-srv-cert\") pod \"catalog-operator-68c6474976-rdmsw\" (UID: \"839bd402-e470-4226-aa0c-d8d295773a5b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdmsw" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.856963 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-registry-certificates\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.856984 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6e592fee-28c7-49ef-b1e8-203a72d633b7-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hx79g\" (UID: \"6e592fee-28c7-49ef-b1e8-203a72d633b7\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hx79g" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.857005 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b35c3a0c-04bf-4603-997b-09b0a4976d67-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kw2jt\" (UID: \"b35c3a0c-04bf-4603-997b-09b0a4976d67\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kw2jt" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.857024 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-registry-tls\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.857085 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpnhz\" (UniqueName: \"kubernetes.io/projected/925654f4-6239-4a63-b5d2-9482939d83a8-kube-api-access-hpnhz\") pod \"service-ca-operator-777779d784-tsvwl\" (UID: \"925654f4-6239-4a63-b5d2-9482939d83a8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tsvwl" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.857126 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/925654f4-6239-4a63-b5d2-9482939d83a8-serving-cert\") pod \"service-ca-operator-777779d784-tsvwl\" (UID: \"925654f4-6239-4a63-b5d2-9482939d83a8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tsvwl" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.857162 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnsvr\" (UniqueName: \"kubernetes.io/projected/b35c3a0c-04bf-4603-997b-09b0a4976d67-kube-api-access-jnsvr\") pod \"kube-storage-version-migrator-operator-b67b599dd-kw2jt\" (UID: \"b35c3a0c-04bf-4603-997b-09b0a4976d67\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kw2jt" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.857181 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-trusted-ca\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.857199 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/97e0804e-f2d3-45b2-bd8e-c5df2099dea5-signing-cabundle\") pod \"service-ca-9c57cc56f-7rkdz\" (UID: \"97e0804e-f2d3-45b2-bd8e-c5df2099dea5\") " pod="openshift-service-ca/service-ca-9c57cc56f-7rkdz" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.857221 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0544feb-caff-4805-8c58-bae454503fa0-service-ca-bundle\") pod \"router-default-5444994796-rtcqn\" (UID: \"b0544feb-caff-4805-8c58-bae454503fa0\") " pod="openshift-ingress/router-default-5444994796-rtcqn" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.857262 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndv2f\" (UniqueName: \"kubernetes.io/projected/78b7f343-023b-4ed9-bbb0-7b43c7d7a7d0-kube-api-access-ndv2f\") pod 
\"migrator-59844c95c7-4cdjr\" (UID: \"78b7f343-023b-4ed9-bbb0-7b43c7d7a7d0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4cdjr" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.857309 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgxl7\" (UniqueName: \"kubernetes.io/projected/50e06f1d-ae35-4177-afd0-55fc1112f0a7-kube-api-access-vgxl7\") pod \"collect-profiles-29500470-c2btb\" (UID: \"50e06f1d-ae35-4177-afd0-55fc1112f0a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-c2btb" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.857622 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7w9d\" (UniqueName: \"kubernetes.io/projected/839bd402-e470-4226-aa0c-d8d295773a5b-kube-api-access-n7w9d\") pod \"catalog-operator-68c6474976-rdmsw\" (UID: \"839bd402-e470-4226-aa0c-d8d295773a5b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdmsw" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.857694 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bngzv\" (UniqueName: \"kubernetes.io/projected/b0544feb-caff-4805-8c58-bae454503fa0-kube-api-access-bngzv\") pod \"router-default-5444994796-rtcqn\" (UID: \"b0544feb-caff-4805-8c58-bae454503fa0\") " pod="openshift-ingress/router-default-5444994796-rtcqn" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.857715 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8bb8059-30d1-4fca-9f9e-c4dac4b0854f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ftbpn\" (UID: \"d8bb8059-30d1-4fca-9f9e-c4dac4b0854f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ftbpn" Feb 02 10:33:36 crc 
kubenswrapper[4909]: I0202 10:33:36.857760 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/50e06f1d-ae35-4177-afd0-55fc1112f0a7-config-volume\") pod \"collect-profiles-29500470-c2btb\" (UID: \"50e06f1d-ae35-4177-afd0-55fc1112f0a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-c2btb" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.857802 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/50e06f1d-ae35-4177-afd0-55fc1112f0a7-secret-volume\") pod \"collect-profiles-29500470-c2btb\" (UID: \"50e06f1d-ae35-4177-afd0-55fc1112f0a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-c2btb" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.857841 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/97e0804e-f2d3-45b2-bd8e-c5df2099dea5-signing-key\") pod \"service-ca-9c57cc56f-7rkdz\" (UID: \"97e0804e-f2d3-45b2-bd8e-c5df2099dea5\") " pod="openshift-service-ca/service-ca-9c57cc56f-7rkdz" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.857913 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8bb8059-30d1-4fca-9f9e-c4dac4b0854f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ftbpn\" (UID: \"d8bb8059-30d1-4fca-9f9e-c4dac4b0854f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ftbpn" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.857933 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shgb5\" (UniqueName: 
\"kubernetes.io/projected/97e0804e-f2d3-45b2-bd8e-c5df2099dea5-kube-api-access-shgb5\") pod \"service-ca-9c57cc56f-7rkdz\" (UID: \"97e0804e-f2d3-45b2-bd8e-c5df2099dea5\") " pod="openshift-service-ca/service-ca-9c57cc56f-7rkdz" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.857975 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.857997 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c932fa7-7181-4aac-bf6c-8a6d56f92ece-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ptpkp\" (UID: \"6c932fa7-7181-4aac-bf6c-8a6d56f92ece\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptpkp" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.858025 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/839bd402-e470-4226-aa0c-d8d295773a5b-profile-collector-cert\") pod \"catalog-operator-68c6474976-rdmsw\" (UID: \"839bd402-e470-4226-aa0c-d8d295773a5b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdmsw" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.858043 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b0544feb-caff-4805-8c58-bae454503fa0-default-certificate\") pod \"router-default-5444994796-rtcqn\" (UID: 
\"b0544feb-caff-4805-8c58-bae454503fa0\") " pod="openshift-ingress/router-default-5444994796-rtcqn" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.858065 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kflps\" (UniqueName: \"kubernetes.io/projected/bac9c408-8a21-4db2-b450-214c285e45c4-kube-api-access-kflps\") pod \"packageserver-d55dfcdfc-gvxk8\" (UID: \"bac9c408-8a21-4db2-b450-214c285e45c4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gvxk8" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.858088 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a39a1e38-d1f2-4cda-a088-3a22446407cc-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2m2wr\" (UID: \"a39a1e38-d1f2-4cda-a088-3a22446407cc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2m2wr" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.858119 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nk27\" (UniqueName: \"kubernetes.io/projected/b575e8ec-7b85-4647-b0af-03274d67afc8-kube-api-access-4nk27\") pod \"marketplace-operator-79b997595-nt7q4\" (UID: \"b575e8ec-7b85-4647-b0af-03274d67afc8\") " pod="openshift-marketplace/marketplace-operator-79b997595-nt7q4" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.858139 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b0544feb-caff-4805-8c58-bae454503fa0-stats-auth\") pod \"router-default-5444994796-rtcqn\" (UID: \"b0544feb-caff-4805-8c58-bae454503fa0\") " pod="openshift-ingress/router-default-5444994796-rtcqn" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.858165 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0544feb-caff-4805-8c58-bae454503fa0-metrics-certs\") pod \"router-default-5444994796-rtcqn\" (UID: \"b0544feb-caff-4805-8c58-bae454503fa0\") " pod="openshift-ingress/router-default-5444994796-rtcqn" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.858225 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b575e8ec-7b85-4647-b0af-03274d67afc8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nt7q4\" (UID: \"b575e8ec-7b85-4647-b0af-03274d67afc8\") " pod="openshift-marketplace/marketplace-operator-79b997595-nt7q4" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.858246 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e592fee-28c7-49ef-b1e8-203a72d633b7-proxy-tls\") pod \"machine-config-controller-84d6567774-hx79g\" (UID: \"6e592fee-28c7-49ef-b1e8-203a72d633b7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hx79g" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.858265 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bac9c408-8a21-4db2-b450-214c285e45c4-webhook-cert\") pod \"packageserver-d55dfcdfc-gvxk8\" (UID: \"bac9c408-8a21-4db2-b450-214c285e45c4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gvxk8" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.858286 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-installation-pull-secrets\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.858304 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/925654f4-6239-4a63-b5d2-9482939d83a8-config\") pod \"service-ca-operator-777779d784-tsvwl\" (UID: \"925654f4-6239-4a63-b5d2-9482939d83a8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tsvwl" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.858338 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-bound-sa-token\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:36 crc kubenswrapper[4909]: E0202 10:33:36.861363 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:37.361346123 +0000 UTC m=+143.107446938 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.878047 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qqbpq" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.915555 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpxbr" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.928652 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rqcgf"] Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.960548 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.960774 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5zq2\" (UniqueName: \"kubernetes.io/projected/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-kube-api-access-g5zq2\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:36 crc kubenswrapper[4909]: E0202 10:33:36.960830 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:37.460790531 +0000 UTC m=+143.206891266 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.960865 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/bac9c408-8a21-4db2-b450-214c285e45c4-tmpfs\") pod \"packageserver-d55dfcdfc-gvxk8\" (UID: \"bac9c408-8a21-4db2-b450-214c285e45c4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gvxk8" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.960920 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwdjf\" (UniqueName: \"kubernetes.io/projected/6c932fa7-7181-4aac-bf6c-8a6d56f92ece-kube-api-access-nwdjf\") pod \"control-plane-machine-set-operator-78cbb6b69f-ptpkp\" (UID: \"6c932fa7-7181-4aac-bf6c-8a6d56f92ece\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptpkp" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.960940 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d4d413d-3b6e-436a-8e63-7c17ddb1cf86-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-kddv4\" (UID: \"7d4d413d-3b6e-436a-8e63-7c17ddb1cf86\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kddv4" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.960978 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/d8bb8059-30d1-4fca-9f9e-c4dac4b0854f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ftbpn\" (UID: \"d8bb8059-30d1-4fca-9f9e-c4dac4b0854f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ftbpn" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.960997 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h468c\" (UniqueName: \"kubernetes.io/projected/a8d139ed-0b63-4cfa-9b66-0fe970d40006-kube-api-access-h468c\") pod \"csi-hostpathplugin-9sjgx\" (UID: \"a8d139ed-0b63-4cfa-9b66-0fe970d40006\") " pod="hostpath-provisioner/csi-hostpathplugin-9sjgx" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961023 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bac9c408-8a21-4db2-b450-214c285e45c4-apiservice-cert\") pod \"packageserver-d55dfcdfc-gvxk8\" (UID: \"bac9c408-8a21-4db2-b450-214c285e45c4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gvxk8" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961041 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvx9m\" (UniqueName: \"kubernetes.io/projected/7d4d413d-3b6e-436a-8e63-7c17ddb1cf86-kube-api-access-mvx9m\") pod \"package-server-manager-789f6589d5-kddv4\" (UID: \"7d4d413d-3b6e-436a-8e63-7c17ddb1cf86\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kddv4" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961078 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec5d8d6b-4bf8-442a-813d-7312fe78ab8a-cert\") pod \"ingress-canary-9gvvq\" (UID: \"ec5d8d6b-4bf8-442a-813d-7312fe78ab8a\") " pod="openshift-ingress-canary/ingress-canary-9gvvq" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 
10:33:36.961118 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b35c3a0c-04bf-4603-997b-09b0a4976d67-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kw2jt\" (UID: \"b35c3a0c-04bf-4603-997b-09b0a4976d67\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kw2jt" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961137 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl498\" (UniqueName: \"kubernetes.io/projected/6e592fee-28c7-49ef-b1e8-203a72d633b7-kube-api-access-pl498\") pod \"machine-config-controller-84d6567774-hx79g\" (UID: \"6e592fee-28c7-49ef-b1e8-203a72d633b7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hx79g" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961158 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a8d139ed-0b63-4cfa-9b66-0fe970d40006-socket-dir\") pod \"csi-hostpathplugin-9sjgx\" (UID: \"a8d139ed-0b63-4cfa-9b66-0fe970d40006\") " pod="hostpath-provisioner/csi-hostpathplugin-9sjgx" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961192 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b575e8ec-7b85-4647-b0af-03274d67afc8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nt7q4\" (UID: \"b575e8ec-7b85-4647-b0af-03274d67afc8\") " pod="openshift-marketplace/marketplace-operator-79b997595-nt7q4" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961226 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/839bd402-e470-4226-aa0c-d8d295773a5b-srv-cert\") pod \"catalog-operator-68c6474976-rdmsw\" (UID: 
\"839bd402-e470-4226-aa0c-d8d295773a5b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdmsw" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961253 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-registry-certificates\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961268 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6e592fee-28c7-49ef-b1e8-203a72d633b7-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hx79g\" (UID: \"6e592fee-28c7-49ef-b1e8-203a72d633b7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hx79g" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961284 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b35c3a0c-04bf-4603-997b-09b0a4976d67-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kw2jt\" (UID: \"b35c3a0c-04bf-4603-997b-09b0a4976d67\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kw2jt" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961301 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-registry-tls\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961317 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b6be365-e876-4c20-9105-67da9ad35291-config-volume\") pod \"dns-default-jp9hd\" (UID: \"2b6be365-e876-4c20-9105-67da9ad35291\") " pod="openshift-dns/dns-default-jp9hd" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961388 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpnhz\" (UniqueName: \"kubernetes.io/projected/925654f4-6239-4a63-b5d2-9482939d83a8-kube-api-access-hpnhz\") pod \"service-ca-operator-777779d784-tsvwl\" (UID: \"925654f4-6239-4a63-b5d2-9482939d83a8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tsvwl" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961415 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/925654f4-6239-4a63-b5d2-9482939d83a8-serving-cert\") pod \"service-ca-operator-777779d784-tsvwl\" (UID: \"925654f4-6239-4a63-b5d2-9482939d83a8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tsvwl" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961432 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7110f507-8731-4643-9322-a43cd0aef174-certs\") pod \"machine-config-server-f59sb\" (UID: \"7110f507-8731-4643-9322-a43cd0aef174\") " pod="openshift-machine-config-operator/machine-config-server-f59sb" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961448 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgp5l\" (UniqueName: \"kubernetes.io/projected/2b6be365-e876-4c20-9105-67da9ad35291-kube-api-access-fgp5l\") pod \"dns-default-jp9hd\" (UID: \"2b6be365-e876-4c20-9105-67da9ad35291\") " pod="openshift-dns/dns-default-jp9hd" Feb 02 10:33:36 crc 
kubenswrapper[4909]: I0202 10:33:36.961484 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnsvr\" (UniqueName: \"kubernetes.io/projected/b35c3a0c-04bf-4603-997b-09b0a4976d67-kube-api-access-jnsvr\") pod \"kube-storage-version-migrator-operator-b67b599dd-kw2jt\" (UID: \"b35c3a0c-04bf-4603-997b-09b0a4976d67\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kw2jt" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961514 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-trusted-ca\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961539 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/97e0804e-f2d3-45b2-bd8e-c5df2099dea5-signing-cabundle\") pod \"service-ca-9c57cc56f-7rkdz\" (UID: \"97e0804e-f2d3-45b2-bd8e-c5df2099dea5\") " pod="openshift-service-ca/service-ca-9c57cc56f-7rkdz" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961557 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0544feb-caff-4805-8c58-bae454503fa0-service-ca-bundle\") pod \"router-default-5444994796-rtcqn\" (UID: \"b0544feb-caff-4805-8c58-bae454503fa0\") " pod="openshift-ingress/router-default-5444994796-rtcqn" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961590 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndv2f\" (UniqueName: \"kubernetes.io/projected/78b7f343-023b-4ed9-bbb0-7b43c7d7a7d0-kube-api-access-ndv2f\") pod \"migrator-59844c95c7-4cdjr\" (UID: 
\"78b7f343-023b-4ed9-bbb0-7b43c7d7a7d0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4cdjr" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961606 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a8d139ed-0b63-4cfa-9b66-0fe970d40006-mountpoint-dir\") pod \"csi-hostpathplugin-9sjgx\" (UID: \"a8d139ed-0b63-4cfa-9b66-0fe970d40006\") " pod="hostpath-provisioner/csi-hostpathplugin-9sjgx" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961619 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/bac9c408-8a21-4db2-b450-214c285e45c4-tmpfs\") pod \"packageserver-d55dfcdfc-gvxk8\" (UID: \"bac9c408-8a21-4db2-b450-214c285e45c4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gvxk8" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961634 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgxl7\" (UniqueName: \"kubernetes.io/projected/50e06f1d-ae35-4177-afd0-55fc1112f0a7-kube-api-access-vgxl7\") pod \"collect-profiles-29500470-c2btb\" (UID: \"50e06f1d-ae35-4177-afd0-55fc1112f0a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-c2btb" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961671 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a8d139ed-0b63-4cfa-9b66-0fe970d40006-plugins-dir\") pod \"csi-hostpathplugin-9sjgx\" (UID: \"a8d139ed-0b63-4cfa-9b66-0fe970d40006\") " pod="hostpath-provisioner/csi-hostpathplugin-9sjgx" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961725 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7w9d\" (UniqueName: 
\"kubernetes.io/projected/839bd402-e470-4226-aa0c-d8d295773a5b-kube-api-access-n7w9d\") pod \"catalog-operator-68c6474976-rdmsw\" (UID: \"839bd402-e470-4226-aa0c-d8d295773a5b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdmsw" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961782 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a8d139ed-0b63-4cfa-9b66-0fe970d40006-registration-dir\") pod \"csi-hostpathplugin-9sjgx\" (UID: \"a8d139ed-0b63-4cfa-9b66-0fe970d40006\") " pod="hostpath-provisioner/csi-hostpathplugin-9sjgx" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961845 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bngzv\" (UniqueName: \"kubernetes.io/projected/b0544feb-caff-4805-8c58-bae454503fa0-kube-api-access-bngzv\") pod \"router-default-5444994796-rtcqn\" (UID: \"b0544feb-caff-4805-8c58-bae454503fa0\") " pod="openshift-ingress/router-default-5444994796-rtcqn" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961868 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8bb8059-30d1-4fca-9f9e-c4dac4b0854f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ftbpn\" (UID: \"d8bb8059-30d1-4fca-9f9e-c4dac4b0854f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ftbpn" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961907 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/50e06f1d-ae35-4177-afd0-55fc1112f0a7-config-volume\") pod \"collect-profiles-29500470-c2btb\" (UID: \"50e06f1d-ae35-4177-afd0-55fc1112f0a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-c2btb" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 
10:33:36.961956 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/50e06f1d-ae35-4177-afd0-55fc1112f0a7-secret-volume\") pod \"collect-profiles-29500470-c2btb\" (UID: \"50e06f1d-ae35-4177-afd0-55fc1112f0a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-c2btb" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.961979 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/97e0804e-f2d3-45b2-bd8e-c5df2099dea5-signing-key\") pod \"service-ca-9c57cc56f-7rkdz\" (UID: \"97e0804e-f2d3-45b2-bd8e-c5df2099dea5\") " pod="openshift-service-ca/service-ca-9c57cc56f-7rkdz" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.962051 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8bb8059-30d1-4fca-9f9e-c4dac4b0854f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ftbpn\" (UID: \"d8bb8059-30d1-4fca-9f9e-c4dac4b0854f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ftbpn" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.962088 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shgb5\" (UniqueName: \"kubernetes.io/projected/97e0804e-f2d3-45b2-bd8e-c5df2099dea5-kube-api-access-shgb5\") pod \"service-ca-9c57cc56f-7rkdz\" (UID: \"97e0804e-f2d3-45b2-bd8e-c5df2099dea5\") " pod="openshift-service-ca/service-ca-9c57cc56f-7rkdz" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.962120 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7110f507-8731-4643-9322-a43cd0aef174-node-bootstrap-token\") pod \"machine-config-server-f59sb\" (UID: \"7110f507-8731-4643-9322-a43cd0aef174\") " 
pod="openshift-machine-config-operator/machine-config-server-f59sb" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.962148 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.962172 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c932fa7-7181-4aac-bf6c-8a6d56f92ece-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ptpkp\" (UID: \"6c932fa7-7181-4aac-bf6c-8a6d56f92ece\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptpkp" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.962201 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/839bd402-e470-4226-aa0c-d8d295773a5b-profile-collector-cert\") pod \"catalog-operator-68c6474976-rdmsw\" (UID: \"839bd402-e470-4226-aa0c-d8d295773a5b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdmsw" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.962225 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b0544feb-caff-4805-8c58-bae454503fa0-default-certificate\") pod \"router-default-5444994796-rtcqn\" (UID: \"b0544feb-caff-4805-8c58-bae454503fa0\") " pod="openshift-ingress/router-default-5444994796-rtcqn" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.962247 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2b6be365-e876-4c20-9105-67da9ad35291-metrics-tls\") pod \"dns-default-jp9hd\" (UID: \"2b6be365-e876-4c20-9105-67da9ad35291\") " pod="openshift-dns/dns-default-jp9hd" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.962270 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kflps\" (UniqueName: \"kubernetes.io/projected/bac9c408-8a21-4db2-b450-214c285e45c4-kube-api-access-kflps\") pod \"packageserver-d55dfcdfc-gvxk8\" (UID: \"bac9c408-8a21-4db2-b450-214c285e45c4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gvxk8" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.962290 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a39a1e38-d1f2-4cda-a088-3a22446407cc-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2m2wr\" (UID: \"a39a1e38-d1f2-4cda-a088-3a22446407cc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2m2wr" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.962394 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nk27\" (UniqueName: \"kubernetes.io/projected/b575e8ec-7b85-4647-b0af-03274d67afc8-kube-api-access-4nk27\") pod \"marketplace-operator-79b997595-nt7q4\" (UID: \"b575e8ec-7b85-4647-b0af-03274d67afc8\") " pod="openshift-marketplace/marketplace-operator-79b997595-nt7q4" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.962417 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b0544feb-caff-4805-8c58-bae454503fa0-stats-auth\") pod \"router-default-5444994796-rtcqn\" (UID: \"b0544feb-caff-4805-8c58-bae454503fa0\") " pod="openshift-ingress/router-default-5444994796-rtcqn" Feb 02 10:33:36 crc 
kubenswrapper[4909]: I0202 10:33:36.962457 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0544feb-caff-4805-8c58-bae454503fa0-metrics-certs\") pod \"router-default-5444994796-rtcqn\" (UID: \"b0544feb-caff-4805-8c58-bae454503fa0\") " pod="openshift-ingress/router-default-5444994796-rtcqn" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.962486 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b575e8ec-7b85-4647-b0af-03274d67afc8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nt7q4\" (UID: \"b575e8ec-7b85-4647-b0af-03274d67afc8\") " pod="openshift-marketplace/marketplace-operator-79b997595-nt7q4" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.962506 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e592fee-28c7-49ef-b1e8-203a72d633b7-proxy-tls\") pod \"machine-config-controller-84d6567774-hx79g\" (UID: \"6e592fee-28c7-49ef-b1e8-203a72d633b7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hx79g" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.962539 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bac9c408-8a21-4db2-b450-214c285e45c4-webhook-cert\") pod \"packageserver-d55dfcdfc-gvxk8\" (UID: \"bac9c408-8a21-4db2-b450-214c285e45c4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gvxk8" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.962562 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-installation-pull-secrets\") pod \"image-registry-697d97f7c8-slspb\" (UID: 
\"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.962583 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/925654f4-6239-4a63-b5d2-9482939d83a8-config\") pod \"service-ca-operator-777779d784-tsvwl\" (UID: \"925654f4-6239-4a63-b5d2-9482939d83a8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tsvwl" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.962604 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-bound-sa-token\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.962625 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98llx\" (UniqueName: \"kubernetes.io/projected/7110f507-8731-4643-9322-a43cd0aef174-kube-api-access-98llx\") pod \"machine-config-server-f59sb\" (UID: \"7110f507-8731-4643-9322-a43cd0aef174\") " pod="openshift-machine-config-operator/machine-config-server-f59sb" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.962675 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a8d139ed-0b63-4cfa-9b66-0fe970d40006-csi-data-dir\") pod \"csi-hostpathplugin-9sjgx\" (UID: \"a8d139ed-0b63-4cfa-9b66-0fe970d40006\") " pod="hostpath-provisioner/csi-hostpathplugin-9sjgx" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.962717 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-ca-trust-extracted\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.962740 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsdpv\" (UniqueName: \"kubernetes.io/projected/a39a1e38-d1f2-4cda-a088-3a22446407cc-kube-api-access-gsdpv\") pod \"multus-admission-controller-857f4d67dd-2m2wr\" (UID: \"a39a1e38-d1f2-4cda-a088-3a22446407cc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2m2wr" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.962767 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rnmq\" (UniqueName: \"kubernetes.io/projected/ec5d8d6b-4bf8-442a-813d-7312fe78ab8a-kube-api-access-8rnmq\") pod \"ingress-canary-9gvvq\" (UID: \"ec5d8d6b-4bf8-442a-813d-7312fe78ab8a\") " pod="openshift-ingress-canary/ingress-canary-9gvvq" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.963866 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b575e8ec-7b85-4647-b0af-03274d67afc8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-nt7q4\" (UID: \"b575e8ec-7b85-4647-b0af-03274d67afc8\") " pod="openshift-marketplace/marketplace-operator-79b997595-nt7q4" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.966757 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bac9c408-8a21-4db2-b450-214c285e45c4-apiservice-cert\") pod \"packageserver-d55dfcdfc-gvxk8\" (UID: \"bac9c408-8a21-4db2-b450-214c285e45c4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gvxk8" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.968714 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8bb8059-30d1-4fca-9f9e-c4dac4b0854f-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ftbpn\" (UID: \"d8bb8059-30d1-4fca-9f9e-c4dac4b0854f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ftbpn" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.969518 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b35c3a0c-04bf-4603-997b-09b0a4976d67-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kw2jt\" (UID: \"b35c3a0c-04bf-4603-997b-09b0a4976d67\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kw2jt" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.969537 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/50e06f1d-ae35-4177-afd0-55fc1112f0a7-config-volume\") pod \"collect-profiles-29500470-c2btb\" (UID: \"50e06f1d-ae35-4177-afd0-55fc1112f0a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-c2btb" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.969723 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/97e0804e-f2d3-45b2-bd8e-c5df2099dea5-signing-cabundle\") pod \"service-ca-9c57cc56f-7rkdz\" (UID: \"97e0804e-f2d3-45b2-bd8e-c5df2099dea5\") " pod="openshift-service-ca/service-ca-9c57cc56f-7rkdz" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.969936 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a39a1e38-d1f2-4cda-a088-3a22446407cc-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2m2wr\" (UID: \"a39a1e38-d1f2-4cda-a088-3a22446407cc\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-2m2wr" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.970513 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0544feb-caff-4805-8c58-bae454503fa0-service-ca-bundle\") pod \"router-default-5444994796-rtcqn\" (UID: \"b0544feb-caff-4805-8c58-bae454503fa0\") " pod="openshift-ingress/router-default-5444994796-rtcqn" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.970952 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-trusted-ca\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.970996 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/925654f4-6239-4a63-b5d2-9482939d83a8-serving-cert\") pod \"service-ca-operator-777779d784-tsvwl\" (UID: \"925654f4-6239-4a63-b5d2-9482939d83a8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tsvwl" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.971633 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-ca-trust-extracted\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:36 crc kubenswrapper[4909]: E0202 10:33:36.973024 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-02 10:33:37.47301033 +0000 UTC m=+143.219111065 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.975032 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-registry-certificates\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.976463 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b575e8ec-7b85-4647-b0af-03274d67afc8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-nt7q4\" (UID: \"b575e8ec-7b85-4647-b0af-03274d67afc8\") " pod="openshift-marketplace/marketplace-operator-79b997595-nt7q4" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.976939 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/925654f4-6239-4a63-b5d2-9482939d83a8-config\") pod \"service-ca-operator-777779d784-tsvwl\" (UID: \"925654f4-6239-4a63-b5d2-9482939d83a8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tsvwl" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.976976 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6jr9" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.978741 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6e592fee-28c7-49ef-b1e8-203a72d633b7-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hx79g\" (UID: \"6e592fee-28c7-49ef-b1e8-203a72d633b7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hx79g" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.986112 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/839bd402-e470-4226-aa0c-d8d295773a5b-profile-collector-cert\") pod \"catalog-operator-68c6474976-rdmsw\" (UID: \"839bd402-e470-4226-aa0c-d8d295773a5b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdmsw" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.988673 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0544feb-caff-4805-8c58-bae454503fa0-metrics-certs\") pod \"router-default-5444994796-rtcqn\" (UID: \"b0544feb-caff-4805-8c58-bae454503fa0\") " pod="openshift-ingress/router-default-5444994796-rtcqn" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.996463 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bac9c408-8a21-4db2-b450-214c285e45c4-webhook-cert\") pod \"packageserver-d55dfcdfc-gvxk8\" (UID: \"bac9c408-8a21-4db2-b450-214c285e45c4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gvxk8" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.996862 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/97e0804e-f2d3-45b2-bd8e-c5df2099dea5-signing-key\") pod \"service-ca-9c57cc56f-7rkdz\" (UID: \"97e0804e-f2d3-45b2-bd8e-c5df2099dea5\") " pod="openshift-service-ca/service-ca-9c57cc56f-7rkdz" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.997051 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6c932fa7-7181-4aac-bf6c-8a6d56f92ece-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ptpkp\" (UID: \"6c932fa7-7181-4aac-bf6c-8a6d56f92ece\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptpkp" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.997181 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b35c3a0c-04bf-4603-997b-09b0a4976d67-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kw2jt\" (UID: \"b35c3a0c-04bf-4603-997b-09b0a4976d67\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kw2jt" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.997566 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec5d8d6b-4bf8-442a-813d-7312fe78ab8a-cert\") pod \"ingress-canary-9gvvq\" (UID: \"ec5d8d6b-4bf8-442a-813d-7312fe78ab8a\") " pod="openshift-ingress-canary/ingress-canary-9gvvq" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.997821 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8bb8059-30d1-4fca-9f9e-c4dac4b0854f-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ftbpn\" (UID: \"d8bb8059-30d1-4fca-9f9e-c4dac4b0854f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ftbpn" Feb 02 10:33:36 crc 
kubenswrapper[4909]: I0202 10:33:36.998167 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/50e06f1d-ae35-4177-afd0-55fc1112f0a7-secret-volume\") pod \"collect-profiles-29500470-c2btb\" (UID: \"50e06f1d-ae35-4177-afd0-55fc1112f0a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-c2btb" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.998967 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e592fee-28c7-49ef-b1e8-203a72d633b7-proxy-tls\") pod \"machine-config-controller-84d6567774-hx79g\" (UID: \"6e592fee-28c7-49ef-b1e8-203a72d633b7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hx79g" Feb 02 10:33:36 crc kubenswrapper[4909]: I0202 10:33:36.999468 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-registry-tls\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.001267 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d4d413d-3b6e-436a-8e63-7c17ddb1cf86-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-kddv4\" (UID: \"7d4d413d-3b6e-436a-8e63-7c17ddb1cf86\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kddv4" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.001466 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b0544feb-caff-4805-8c58-bae454503fa0-stats-auth\") pod \"router-default-5444994796-rtcqn\" (UID: 
\"b0544feb-caff-4805-8c58-bae454503fa0\") " pod="openshift-ingress/router-default-5444994796-rtcqn" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.007225 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-gprz6" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.009046 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-installation-pull-secrets\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.009160 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/839bd402-e470-4226-aa0c-d8d295773a5b-srv-cert\") pod \"catalog-operator-68c6474976-rdmsw\" (UID: \"839bd402-e470-4226-aa0c-d8d295773a5b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdmsw" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.016081 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b0544feb-caff-4805-8c58-bae454503fa0-default-certificate\") pod \"router-default-5444994796-rtcqn\" (UID: \"b0544feb-caff-4805-8c58-bae454503fa0\") " pod="openshift-ingress/router-default-5444994796-rtcqn" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.020461 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5zq2\" (UniqueName: \"kubernetes.io/projected/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-kube-api-access-g5zq2\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.033414 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl498\" (UniqueName: \"kubernetes.io/projected/6e592fee-28c7-49ef-b1e8-203a72d633b7-kube-api-access-pl498\") pod \"machine-config-controller-84d6567774-hx79g\" (UID: \"6e592fee-28c7-49ef-b1e8-203a72d633b7\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hx79g" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.037985 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgxl7\" (UniqueName: \"kubernetes.io/projected/50e06f1d-ae35-4177-afd0-55fc1112f0a7-kube-api-access-vgxl7\") pod \"collect-profiles-29500470-c2btb\" (UID: \"50e06f1d-ae35-4177-afd0-55fc1112f0a7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-c2btb" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.059088 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwdjf\" (UniqueName: \"kubernetes.io/projected/6c932fa7-7181-4aac-bf6c-8a6d56f92ece-kube-api-access-nwdjf\") pod \"control-plane-machine-set-operator-78cbb6b69f-ptpkp\" (UID: \"6c932fa7-7181-4aac-bf6c-8a6d56f92ece\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptpkp" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.062496 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hx79g" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.063320 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.063515 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7110f507-8731-4643-9322-a43cd0aef174-node-bootstrap-token\") pod \"machine-config-server-f59sb\" (UID: \"7110f507-8731-4643-9322-a43cd0aef174\") " pod="openshift-machine-config-operator/machine-config-server-f59sb" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.063572 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2b6be365-e876-4c20-9105-67da9ad35291-metrics-tls\") pod \"dns-default-jp9hd\" (UID: \"2b6be365-e876-4c20-9105-67da9ad35291\") " pod="openshift-dns/dns-default-jp9hd" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.063628 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98llx\" (UniqueName: \"kubernetes.io/projected/7110f507-8731-4643-9322-a43cd0aef174-kube-api-access-98llx\") pod \"machine-config-server-f59sb\" (UID: \"7110f507-8731-4643-9322-a43cd0aef174\") " pod="openshift-machine-config-operator/machine-config-server-f59sb" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.063650 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a8d139ed-0b63-4cfa-9b66-0fe970d40006-csi-data-dir\") pod \"csi-hostpathplugin-9sjgx\" 
(UID: \"a8d139ed-0b63-4cfa-9b66-0fe970d40006\") " pod="hostpath-provisioner/csi-hostpathplugin-9sjgx" Feb 02 10:33:37 crc kubenswrapper[4909]: E0202 10:33:37.063715 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:37.563659956 +0000 UTC m=+143.309760691 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.063849 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a8d139ed-0b63-4cfa-9b66-0fe970d40006-csi-data-dir\") pod \"csi-hostpathplugin-9sjgx\" (UID: \"a8d139ed-0b63-4cfa-9b66-0fe970d40006\") " pod="hostpath-provisioner/csi-hostpathplugin-9sjgx" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.063916 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h468c\" (UniqueName: \"kubernetes.io/projected/a8d139ed-0b63-4cfa-9b66-0fe970d40006-kube-api-access-h468c\") pod \"csi-hostpathplugin-9sjgx\" (UID: \"a8d139ed-0b63-4cfa-9b66-0fe970d40006\") " pod="hostpath-provisioner/csi-hostpathplugin-9sjgx" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.064028 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a8d139ed-0b63-4cfa-9b66-0fe970d40006-socket-dir\") pod 
\"csi-hostpathplugin-9sjgx\" (UID: \"a8d139ed-0b63-4cfa-9b66-0fe970d40006\") " pod="hostpath-provisioner/csi-hostpathplugin-9sjgx" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.064104 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b6be365-e876-4c20-9105-67da9ad35291-config-volume\") pod \"dns-default-jp9hd\" (UID: \"2b6be365-e876-4c20-9105-67da9ad35291\") " pod="openshift-dns/dns-default-jp9hd" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.064206 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7110f507-8731-4643-9322-a43cd0aef174-certs\") pod \"machine-config-server-f59sb\" (UID: \"7110f507-8731-4643-9322-a43cd0aef174\") " pod="openshift-machine-config-operator/machine-config-server-f59sb" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.064240 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgp5l\" (UniqueName: \"kubernetes.io/projected/2b6be365-e876-4c20-9105-67da9ad35291-kube-api-access-fgp5l\") pod \"dns-default-jp9hd\" (UID: \"2b6be365-e876-4c20-9105-67da9ad35291\") " pod="openshift-dns/dns-default-jp9hd" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.064292 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a8d139ed-0b63-4cfa-9b66-0fe970d40006-mountpoint-dir\") pod \"csi-hostpathplugin-9sjgx\" (UID: \"a8d139ed-0b63-4cfa-9b66-0fe970d40006\") " pod="hostpath-provisioner/csi-hostpathplugin-9sjgx" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.064380 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a8d139ed-0b63-4cfa-9b66-0fe970d40006-plugins-dir\") pod \"csi-hostpathplugin-9sjgx\" (UID: \"a8d139ed-0b63-4cfa-9b66-0fe970d40006\") 
" pod="hostpath-provisioner/csi-hostpathplugin-9sjgx" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.064443 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a8d139ed-0b63-4cfa-9b66-0fe970d40006-registration-dir\") pod \"csi-hostpathplugin-9sjgx\" (UID: \"a8d139ed-0b63-4cfa-9b66-0fe970d40006\") " pod="hostpath-provisioner/csi-hostpathplugin-9sjgx" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.064775 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a8d139ed-0b63-4cfa-9b66-0fe970d40006-mountpoint-dir\") pod \"csi-hostpathplugin-9sjgx\" (UID: \"a8d139ed-0b63-4cfa-9b66-0fe970d40006\") " pod="hostpath-provisioner/csi-hostpathplugin-9sjgx" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.065219 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a8d139ed-0b63-4cfa-9b66-0fe970d40006-socket-dir\") pod \"csi-hostpathplugin-9sjgx\" (UID: \"a8d139ed-0b63-4cfa-9b66-0fe970d40006\") " pod="hostpath-provisioner/csi-hostpathplugin-9sjgx" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.065278 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a8d139ed-0b63-4cfa-9b66-0fe970d40006-plugins-dir\") pod \"csi-hostpathplugin-9sjgx\" (UID: \"a8d139ed-0b63-4cfa-9b66-0fe970d40006\") " pod="hostpath-provisioner/csi-hostpathplugin-9sjgx" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.065463 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a8d139ed-0b63-4cfa-9b66-0fe970d40006-registration-dir\") pod \"csi-hostpathplugin-9sjgx\" (UID: \"a8d139ed-0b63-4cfa-9b66-0fe970d40006\") " pod="hostpath-provisioner/csi-hostpathplugin-9sjgx" Feb 02 10:33:37 crc 
kubenswrapper[4909]: I0202 10:33:37.065847 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b6be365-e876-4c20-9105-67da9ad35291-config-volume\") pod \"dns-default-jp9hd\" (UID: \"2b6be365-e876-4c20-9105-67da9ad35291\") " pod="openshift-dns/dns-default-jp9hd" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.071200 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2b6be365-e876-4c20-9105-67da9ad35291-metrics-tls\") pod \"dns-default-jp9hd\" (UID: \"2b6be365-e876-4c20-9105-67da9ad35291\") " pod="openshift-dns/dns-default-jp9hd" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.071441 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7110f507-8731-4643-9322-a43cd0aef174-certs\") pod \"machine-config-server-f59sb\" (UID: \"7110f507-8731-4643-9322-a43cd0aef174\") " pod="openshift-machine-config-operator/machine-config-server-f59sb" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.075848 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7110f507-8731-4643-9322-a43cd0aef174-node-bootstrap-token\") pod \"machine-config-server-f59sb\" (UID: \"7110f507-8731-4643-9322-a43cd0aef174\") " pod="openshift-machine-config-operator/machine-config-server-f59sb" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.077306 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-c2btb" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.077849 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvx9m\" (UniqueName: \"kubernetes.io/projected/7d4d413d-3b6e-436a-8e63-7c17ddb1cf86-kube-api-access-mvx9m\") pod \"package-server-manager-789f6589d5-kddv4\" (UID: \"7d4d413d-3b6e-436a-8e63-7c17ddb1cf86\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kddv4" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.105409 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rnmq\" (UniqueName: \"kubernetes.io/projected/ec5d8d6b-4bf8-442a-813d-7312fe78ab8a-kube-api-access-8rnmq\") pod \"ingress-canary-9gvvq\" (UID: \"ec5d8d6b-4bf8-442a-813d-7312fe78ab8a\") " pod="openshift-ingress-canary/ingress-canary-9gvvq" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.113924 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kddv4" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.126764 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9gvvq" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.130713 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d8bb8059-30d1-4fca-9f9e-c4dac4b0854f-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ftbpn\" (UID: \"d8bb8059-30d1-4fca-9f9e-c4dac4b0854f\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ftbpn" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.168366 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:37 crc kubenswrapper[4909]: E0202 10:33:37.168801 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:37.668789757 +0000 UTC m=+143.414890492 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.187117 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7w9d\" (UniqueName: \"kubernetes.io/projected/839bd402-e470-4226-aa0c-d8d295773a5b-kube-api-access-n7w9d\") pod \"catalog-operator-68c6474976-rdmsw\" (UID: \"839bd402-e470-4226-aa0c-d8d295773a5b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdmsw" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.189827 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-thqss"] Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.193507 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f8wcs"] Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.194921 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj"] Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.202904 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bngzv\" (UniqueName: \"kubernetes.io/projected/b0544feb-caff-4805-8c58-bae454503fa0-kube-api-access-bngzv\") pod \"router-default-5444994796-rtcqn\" (UID: \"b0544feb-caff-4805-8c58-bae454503fa0\") " pod="openshift-ingress/router-default-5444994796-rtcqn" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.204719 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console-operator/console-operator-58897d9998-64hf5"] Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.205000 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ftbpn" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.207568 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndv2f\" (UniqueName: \"kubernetes.io/projected/78b7f343-023b-4ed9-bbb0-7b43c7d7a7d0-kube-api-access-ndv2f\") pod \"migrator-59844c95c7-4cdjr\" (UID: \"78b7f343-023b-4ed9-bbb0-7b43c7d7a7d0\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4cdjr" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.249443 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnsvr\" (UniqueName: \"kubernetes.io/projected/b35c3a0c-04bf-4603-997b-09b0a4976d67-kube-api-access-jnsvr\") pod \"kube-storage-version-migrator-operator-b67b599dd-kw2jt\" (UID: \"b35c3a0c-04bf-4603-997b-09b0a4976d67\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kw2jt" Feb 02 10:33:37 crc kubenswrapper[4909]: W0202 10:33:37.259714 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3bd1cfb_92ee_4f7b_a8aa_20d6e1727e5e.slice/crio-fe93efac4fb3f0a1151e82a6d9963e7371b07a9f602ef97966edf0dc02f09036 WatchSource:0}: Error finding container fe93efac4fb3f0a1151e82a6d9963e7371b07a9f602ef97966edf0dc02f09036: Status 404 returned error can't find the container with id fe93efac4fb3f0a1151e82a6d9963e7371b07a9f602ef97966edf0dc02f09036 Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.261870 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq"] Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.261945 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shgb5\" (UniqueName: \"kubernetes.io/projected/97e0804e-f2d3-45b2-bd8e-c5df2099dea5-kube-api-access-shgb5\") pod \"service-ca-9c57cc56f-7rkdz\" (UID: \"97e0804e-f2d3-45b2-bd8e-c5df2099dea5\") " pod="openshift-service-ca/service-ca-9c57cc56f-7rkdz" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.273138 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:37 crc kubenswrapper[4909]: E0202 10:33:37.273835 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:37.773820223 +0000 UTC m=+143.519920958 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.274908 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nk27\" (UniqueName: \"kubernetes.io/projected/b575e8ec-7b85-4647-b0af-03274d67afc8-kube-api-access-4nk27\") pod \"marketplace-operator-79b997595-nt7q4\" (UID: \"b575e8ec-7b85-4647-b0af-03274d67afc8\") " pod="openshift-marketplace/marketplace-operator-79b997595-nt7q4" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.285861 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4cdjr" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.293086 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-rtcqn" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.308117 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptpkp" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.316279 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdmsw" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.316884 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsdpv\" (UniqueName: \"kubernetes.io/projected/a39a1e38-d1f2-4cda-a088-3a22446407cc-kube-api-access-gsdpv\") pod \"multus-admission-controller-857f4d67dd-2m2wr\" (UID: \"a39a1e38-d1f2-4cda-a088-3a22446407cc\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2m2wr" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.321907 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jcpbk"] Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.329626 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kflps\" (UniqueName: \"kubernetes.io/projected/bac9c408-8a21-4db2-b450-214c285e45c4-kube-api-access-kflps\") pod \"packageserver-d55dfcdfc-gvxk8\" (UID: \"bac9c408-8a21-4db2-b450-214c285e45c4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gvxk8" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.334107 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2m2wr" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.344947 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-bound-sa-token\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.352365 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mstlw"] Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.358389 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpnhz\" (UniqueName: \"kubernetes.io/projected/925654f4-6239-4a63-b5d2-9482939d83a8-kube-api-access-hpnhz\") pod \"service-ca-operator-777779d784-tsvwl\" (UID: \"925654f4-6239-4a63-b5d2-9482939d83a8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tsvwl" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.365576 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgp5l\" (UniqueName: \"kubernetes.io/projected/2b6be365-e876-4c20-9105-67da9ad35291-kube-api-access-fgp5l\") pod \"dns-default-jp9hd\" (UID: \"2b6be365-e876-4c20-9105-67da9ad35291\") " pod="openshift-dns/dns-default-jp9hd" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.374414 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:37 crc 
kubenswrapper[4909]: E0202 10:33:37.374729 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:37.874716442 +0000 UTC m=+143.620817177 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.379417 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h468c\" (UniqueName: \"kubernetes.io/projected/a8d139ed-0b63-4cfa-9b66-0fe970d40006-kube-api-access-h468c\") pod \"csi-hostpathplugin-9sjgx\" (UID: \"a8d139ed-0b63-4cfa-9b66-0fe970d40006\") " pod="hostpath-provisioner/csi-hostpathplugin-9sjgx" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.386249 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gvxk8" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.395455 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kw2jt" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.413249 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98llx\" (UniqueName: \"kubernetes.io/projected/7110f507-8731-4643-9322-a43cd0aef174-kube-api-access-98llx\") pod \"machine-config-server-f59sb\" (UID: \"7110f507-8731-4643-9322-a43cd0aef174\") " pod="openshift-machine-config-operator/machine-config-server-f59sb" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.414891 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7rkdz" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.452284 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-9sjgx" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.460035 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jp9hd" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.468399 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-f59sb" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.474942 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:37 crc kubenswrapper[4909]: E0202 10:33:37.475487 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-02 10:33:37.975465437 +0000 UTC m=+143.721566182 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.475791 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:37 crc kubenswrapper[4909]: E0202 10:33:37.476139 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:37.976129506 +0000 UTC m=+143.722230241 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.482040 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nt7q4" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.509738 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ghmnw"] Feb 02 10:33:37 crc kubenswrapper[4909]: W0202 10:33:37.562196 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode650bcf6_f84c_4322_86ac_6df17841176d.slice/crio-760f52c91eb34c82018a78f0d84c21d2083d4c626f2cc61cfbf33e0907720695 WatchSource:0}: Error finding container 760f52c91eb34c82018a78f0d84c21d2083d4c626f2cc61cfbf33e0907720695: Status 404 returned error can't find the container with id 760f52c91eb34c82018a78f0d84c21d2083d4c626f2cc61cfbf33e0907720695 Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.579031 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:37 crc kubenswrapper[4909]: E0202 10:33:37.579441 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:38.079426674 +0000 UTC m=+143.825527409 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.606625 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2xztg"] Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.620726 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6wjkz"] Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.641752 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tsvwl" Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.680414 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:37 crc kubenswrapper[4909]: E0202 10:33:37.681196 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:38.181180548 +0000 UTC m=+143.927281303 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.701975 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mwjvx"] Feb 02 10:33:37 crc kubenswrapper[4909]: W0202 10:33:37.707211 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf318b6bc_f681_4794_bab8_059ccf270229.slice/crio-a224533818df57af5143aa2036c76bcbf4e36a011e56b62f431b4a1e4623bfeb WatchSource:0}: Error finding container a224533818df57af5143aa2036c76bcbf4e36a011e56b62f431b4a1e4623bfeb: Status 404 returned error can't find the container with id a224533818df57af5143aa2036c76bcbf4e36a011e56b62f431b4a1e4623bfeb Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.713938 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-h8sfq"] Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.721714 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ddclx"] Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.723169 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ghmnw" event={"ID":"e650bcf6-f84c-4322-86ac-6df17841176d","Type":"ContainerStarted","Data":"760f52c91eb34c82018a78f0d84c21d2083d4c626f2cc61cfbf33e0907720695"} Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.726098 4909 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-rtcqn" event={"ID":"b0544feb-caff-4805-8c58-bae454503fa0","Type":"ContainerStarted","Data":"e60ed669d195f32aa21d841b5c4bf2cf94f32de470b188fcce7da5b6099b25fd"} Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.727005 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-64hf5" event={"ID":"6514ff15-d150-4c13-b96b-cf885b71504a","Type":"ContainerStarted","Data":"14c19faab88f4da1a2232ac50408feb0fc4c601957ef588f8193eec59f4bf54b"} Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.739210 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj" event={"ID":"d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e","Type":"ContainerStarted","Data":"fe93efac4fb3f0a1151e82a6d9963e7371b07a9f602ef97966edf0dc02f09036"} Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.743100 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jcpbk" event={"ID":"a2b2e114-73ec-4cc9-9c62-a57b8f7aebb2","Type":"ContainerStarted","Data":"79e208987a1743758fba7c2baacec6c4d0d439b89689860a42a63855b4cfe3f2"} Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.745781 4909 generic.go:334] "Generic (PLEG): container finished" podID="2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2" containerID="ada0aed82aadbebbfe2622330f6e1d8e86ccff604483970517de17f20d765bb8" exitCode=0 Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.745911 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-knpc2" event={"ID":"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2","Type":"ContainerDied","Data":"ada0aed82aadbebbfe2622330f6e1d8e86ccff604483970517de17f20d765bb8"} Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.759626 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rqcgf" event={"ID":"349306e1-fe46-449a-96c5-0b1e13f29733","Type":"ContainerStarted","Data":"b0b307da219c202ac4cc0f3dbfdf725758783a2cc0fee02a3290cd6cfb737cb2"} Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.759671 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rqcgf" event={"ID":"349306e1-fe46-449a-96c5-0b1e13f29733","Type":"ContainerStarted","Data":"ffd7cc164a7fc0b4215a8d4e68e26fa647b8f3bb200753e55a84d781634ee46c"} Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.761047 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mstlw" event={"ID":"87f3258e-a4f7-47bb-b448-aaa85f41c9d3","Type":"ContainerStarted","Data":"064314e28d1fa0c8b5a0bcfaec38482c0ff48f94c65e6da61590d0520d6f130d"} Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.764532 4909 generic.go:334] "Generic (PLEG): container finished" podID="175ecff3-6a2a-4076-a012-7eee503357f9" containerID="c77f9f516549a663efbcca7f46e05aa9c952578966d77aa9ddc373e36fe70556" exitCode=0 Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.764670 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-26rcr" event={"ID":"175ecff3-6a2a-4076-a012-7eee503357f9","Type":"ContainerDied","Data":"c77f9f516549a663efbcca7f46e05aa9c952578966d77aa9ddc373e36fe70556"} Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.764688 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-26rcr" event={"ID":"175ecff3-6a2a-4076-a012-7eee503357f9","Type":"ContainerStarted","Data":"01d9ca1ed29d17c832f0aaaf9c1e31a9aaf2fb6ca99b084816fe929a36a514f9"} Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.769770 4909 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq" event={"ID":"ba5776ad-fa9b-4f21-9dca-40958f01e293","Type":"ContainerStarted","Data":"7cf2af20901baa79ddc998b52509ceac48b0f1dc761cb78d4398487c4fefcaa2"} Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.772318 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f8wcs" event={"ID":"bec6950a-558b-4df9-a47b-128b7a3e4edb","Type":"ContainerStarted","Data":"f8b09b37d32fb4c0f5984828e1cf2585e053222eafdb925bed163f5f06665ac2"} Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.779718 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-gxpkn" event={"ID":"112f9f29-0947-4a25-adde-b74961a6001b","Type":"ContainerStarted","Data":"6db07d6a5d8ae0c6d2a315e61717ef2277b7b89efddc58eebec50d7a21406d15"} Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.779776 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-gxpkn" event={"ID":"112f9f29-0947-4a25-adde-b74961a6001b","Type":"ContainerStarted","Data":"577a949010dda0966c3504e46e2d83b45d077dc7253fb0df7445f07d3b02ec4a"} Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.781801 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:37 crc kubenswrapper[4909]: E0202 10:33:37.782231 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-02 10:33:38.282213981 +0000 UTC m=+144.028314716 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.795593 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-trqkm" event={"ID":"8ef937a5-1513-4ca5-b719-5418118d0987","Type":"ContainerStarted","Data":"c117c4e8ebe43624694ef6ab90fdfbb287b1c306215ecc58fad327b726d2ce21"} Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.795642 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-trqkm" event={"ID":"8ef937a5-1513-4ca5-b719-5418118d0987","Type":"ContainerStarted","Data":"968ee9d08502507695f35c60e10db365da75f0d9aabf754abe10d4bb8f735323"} Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.808764 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ljw49"] Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.809183 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-thqss" event={"ID":"e6595d49-3b53-44fc-a253-a252a53333a2","Type":"ContainerStarted","Data":"8d117dee701883ac44f5756c5ca8ba7d23dba807e5bb426f196d3e1e0d88aae3"} Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.809225 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-thqss" 
event={"ID":"e6595d49-3b53-44fc-a253-a252a53333a2","Type":"ContainerStarted","Data":"181978bc01aa6ffc24bb048edec6fdc8696962aff1b749918e63f791fc400b2f"} Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.815391 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpxbr"] Feb 02 10:33:37 crc kubenswrapper[4909]: W0202 10:33:37.827558 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29fc305d_0c7f_4ef9_8ae2_6534374de1ea.slice/crio-ade8c6b52cc55d23f081505b81b8f86383e96a5abc44c849eec75803b4704582 WatchSource:0}: Error finding container ade8c6b52cc55d23f081505b81b8f86383e96a5abc44c849eec75803b4704582: Status 404 returned error can't find the container with id ade8c6b52cc55d23f081505b81b8f86383e96a5abc44c849eec75803b4704582 Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.883308 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:37 crc kubenswrapper[4909]: E0202 10:33:37.886026 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:38.386009143 +0000 UTC m=+144.132109878 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.985144 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:37 crc kubenswrapper[4909]: E0202 10:33:37.985925 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:38.485903743 +0000 UTC m=+144.232004478 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:37 crc kubenswrapper[4909]: W0202 10:33:37.986719 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod166ca18b_e724_4858_b168_c5f607ed9def.slice/crio-281020fb37822ee6ff651d46d5ca88c904846b9f90ef3802b4130cc20eb5329c WatchSource:0}: Error finding container 281020fb37822ee6ff651d46d5ca88c904846b9f90ef3802b4130cc20eb5329c: Status 404 returned error can't find the container with id 281020fb37822ee6ff651d46d5ca88c904846b9f90ef3802b4130cc20eb5329c Feb 02 10:33:37 crc kubenswrapper[4909]: I0202 10:33:37.988859 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:37 crc kubenswrapper[4909]: E0202 10:33:37.990681 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:38.490662229 +0000 UTC m=+144.236762954 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.090512 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:38 crc kubenswrapper[4909]: E0202 10:33:38.090671 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:38.590648593 +0000 UTC m=+144.336749328 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.090799 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:38 crc kubenswrapper[4909]: E0202 10:33:38.091136 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:38.591129206 +0000 UTC m=+144.337229941 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.192527 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:38 crc kubenswrapper[4909]: E0202 10:33:38.193401 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:38.693383414 +0000 UTC m=+144.439484149 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.219233 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qqbpq"] Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.222324 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500470-c2btb"] Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.224418 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ftbpn"] Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.238237 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-gprz6"] Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.240030 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hx79g"] Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.243966 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-thqss" podStartSLOduration=123.243945732 podStartE2EDuration="2m3.243945732s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:38.237024924 +0000 UTC m=+143.983125679" watchObservedRunningTime="2026-02-02 10:33:38.243945732 +0000 UTC m=+143.990046477" Feb 02 10:33:38 crc 
kubenswrapper[4909]: I0202 10:33:38.287973 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-trqkm" podStartSLOduration=123.287954612 podStartE2EDuration="2m3.287954612s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:38.280776837 +0000 UTC m=+144.026877572" watchObservedRunningTime="2026-02-02 10:33:38.287954612 +0000 UTC m=+144.034055347" Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.294626 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:38 crc kubenswrapper[4909]: E0202 10:33:38.294926 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:38.794915822 +0000 UTC m=+144.541016557 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.395918 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:38 crc kubenswrapper[4909]: E0202 10:33:38.396493 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:38.896469159 +0000 UTC m=+144.642569894 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.396577 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:38 crc kubenswrapper[4909]: E0202 10:33:38.396938 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:38.896926553 +0000 UTC m=+144.643027288 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.434959 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9gvvq"] Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.449361 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdmsw"] Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.451331 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-v6jr9"] Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.454995 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4cdjr"] Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.490080 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptpkp"] Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.492125 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kddv4"] Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.497750 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Feb 02 10:33:38 crc kubenswrapper[4909]: E0202 10:33:38.498264 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:38.998245204 +0000 UTC m=+144.744345939 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:38 crc kubenswrapper[4909]: W0202 10:33:38.501846 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d4d413d_3b6e_436a_8e63_7c17ddb1cf86.slice/crio-594feb89005570d721a3e7e2995b38478bbcbdf3e2faa7b72e39894838f7ef8e WatchSource:0}: Error finding container 594feb89005570d721a3e7e2995b38478bbcbdf3e2faa7b72e39894838f7ef8e: Status 404 returned error can't find the container with id 594feb89005570d721a3e7e2995b38478bbcbdf3e2faa7b72e39894838f7ef8e Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.503730 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7rkdz"] Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.505844 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2m2wr"] Feb 02 10:33:38 crc kubenswrapper[4909]: W0202 10:33:38.525433 4909 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78b7f343_023b_4ed9_bbb0_7b43c7d7a7d0.slice/crio-f06dc9e76a9c8998a8fb852be0b9e22e6cff27b817155aa69e8894832cdfdb5e WatchSource:0}: Error finding container f06dc9e76a9c8998a8fb852be0b9e22e6cff27b817155aa69e8894832cdfdb5e: Status 404 returned error can't find the container with id f06dc9e76a9c8998a8fb852be0b9e22e6cff27b817155aa69e8894832cdfdb5e Feb 02 10:33:38 crc kubenswrapper[4909]: W0202 10:33:38.546409 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c932fa7_7181_4aac_bf6c_8a6d56f92ece.slice/crio-97e5e7a29aeb5ba2a46d44c307b191742da3ebbe71e0afcf52ebaa63f92c03eb WatchSource:0}: Error finding container 97e5e7a29aeb5ba2a46d44c307b191742da3ebbe71e0afcf52ebaa63f92c03eb: Status 404 returned error can't find the container with id 97e5e7a29aeb5ba2a46d44c307b191742da3ebbe71e0afcf52ebaa63f92c03eb Feb 02 10:33:38 crc kubenswrapper[4909]: W0202 10:33:38.548013 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06cb86a1_a324_4ba8_b508_1ddcc20b4025.slice/crio-b903fce40f13f4a04f8b642d10dff10ae734b0e1483c24aa1f9e09962a212879 WatchSource:0}: Error finding container b903fce40f13f4a04f8b642d10dff10ae734b0e1483c24aa1f9e09962a212879: Status 404 returned error can't find the container with id b903fce40f13f4a04f8b642d10dff10ae734b0e1483c24aa1f9e09962a212879 Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.590505 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-tsvwl"] Feb 02 10:33:38 crc kubenswrapper[4909]: W0202 10:33:38.597107 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda39a1e38_d1f2_4cda_a088_3a22446407cc.slice/crio-ebf5358deb0aebac0189010b4b0d89247702ecda8b41429ff3a8414f20c76377 
WatchSource:0}: Error finding container ebf5358deb0aebac0189010b4b0d89247702ecda8b41429ff3a8414f20c76377: Status 404 returned error can't find the container with id ebf5358deb0aebac0189010b4b0d89247702ecda8b41429ff3a8414f20c76377 Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.598842 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:38 crc kubenswrapper[4909]: E0202 10:33:38.599092 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:39.099082581 +0000 UTC m=+144.845183306 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:38 crc kubenswrapper[4909]: W0202 10:33:38.601896 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97e0804e_f2d3_45b2_bd8e_c5df2099dea5.slice/crio-a9786d62fb9186b2951132f639c457cdb35f965771feab5e07f12d6102401893 WatchSource:0}: Error finding container a9786d62fb9186b2951132f639c457cdb35f965771feab5e07f12d6102401893: Status 404 returned error can't find the container with id a9786d62fb9186b2951132f639c457cdb35f965771feab5e07f12d6102401893 Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.602749 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-gxpkn" podStartSLOduration=123.602684314 podStartE2EDuration="2m3.602684314s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:38.597938038 +0000 UTC m=+144.344038783" watchObservedRunningTime="2026-02-02 10:33:38.602684314 +0000 UTC m=+144.348785069" Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.689360 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kw2jt"] Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.699828 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:38 crc kubenswrapper[4909]: E0202 10:33:38.700145 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:39.200130115 +0000 UTC m=+144.946230850 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.720250 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jp9hd"] Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.722669 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gvxk8"] Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.738280 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-9sjgx"] Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.751600 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nt7q4"] Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.803714 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:38 crc kubenswrapper[4909]: E0202 10:33:38.804012 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:39.303999379 +0000 UTC m=+145.050100114 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.846311 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6jr9" event={"ID":"06cb86a1-a324-4ba8-b508-1ddcc20b4025","Type":"ContainerStarted","Data":"b903fce40f13f4a04f8b642d10dff10ae734b0e1483c24aa1f9e09962a212879"} Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.847512 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj" event={"ID":"d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e","Type":"ContainerStarted","Data":"cca01d117134f71590bcefef5d26afeb5988e0f341f55c6b49846c5a9954d1c9"} Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.848427 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj" Feb 02 
10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.849780 4909 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-5mlmj container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.849830 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj" podUID="d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.851033 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-f59sb" event={"ID":"7110f507-8731-4643-9322-a43cd0aef174","Type":"ContainerStarted","Data":"66bf859b4b46e69826017d76e15a8f920d0f980c1e2c94abc52b49fd03db9755"} Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.851093 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-f59sb" event={"ID":"7110f507-8731-4643-9322-a43cd0aef174","Type":"ContainerStarted","Data":"fa0f57635cc6ca3407e14332e984036cc512c2a8ab2af6f899d960c42e5f4bae"} Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.855294 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-gprz6" event={"ID":"b85dc0d9-6b5b-4254-bd4b-a99712b8a1fb","Type":"ContainerStarted","Data":"52c066dcd8c8325ab9a08d488de8e4ebad10659eb43031ac72a3118f8ee70a4f"} Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.865485 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jcpbk" 
event={"ID":"a2b2e114-73ec-4cc9-9c62-a57b8f7aebb2","Type":"ContainerStarted","Data":"548fc749e878c5ff4d0506fd142467fc64aed7b93d822f2d0c02f0ccb20253c9"} Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.878667 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hx79g" event={"ID":"6e592fee-28c7-49ef-b1e8-203a72d633b7","Type":"ContainerStarted","Data":"a6fa92bc6065eaecd1e0351ce0022b08fb92860bb914ad30371264e7ac9acf67"} Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.878757 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hx79g" event={"ID":"6e592fee-28c7-49ef-b1e8-203a72d633b7","Type":"ContainerStarted","Data":"940992163c3cd0f93dec499fac472bf4adaa714d28698ee97441dbff0d168abe"} Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.880280 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-64hf5" event={"ID":"6514ff15-d150-4c13-b96b-cf885b71504a","Type":"ContainerStarted","Data":"49c69386ce47a5127f6f4c89cc1e7f4104e5db67f7ba591d8de1ab9bb5feef6f"} Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.880733 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-64hf5" Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.886522 4909 patch_prober.go:28] interesting pod/console-operator-58897d9998-64hf5 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.886572 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-64hf5" podUID="6514ff15-d150-4c13-b96b-cf885b71504a" 
containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.905444 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:38 crc kubenswrapper[4909]: E0202 10:33:38.907367 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:39.407321387 +0000 UTC m=+145.153422122 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.907959 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rqcgf" event={"ID":"349306e1-fe46-449a-96c5-0b1e13f29733","Type":"ContainerStarted","Data":"902821403b49e31f97b32ee5820b22ea38c4b23b6dce0f3eb5ed20c5b7c41494"} Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.921263 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpxbr" 
event={"ID":"166ca18b-e724-4858-b168-c5f607ed9def","Type":"ContainerStarted","Data":"935ea60af83a0894c2077be7cd26259414fb5036000043a337bd86a68661c72b"} Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.921315 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpxbr" event={"ID":"166ca18b-e724-4858-b168-c5f607ed9def","Type":"ContainerStarted","Data":"281020fb37822ee6ff651d46d5ca88c904846b9f90ef3802b4130cc20eb5329c"} Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.924704 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7rkdz" event={"ID":"97e0804e-f2d3-45b2-bd8e-c5df2099dea5","Type":"ContainerStarted","Data":"a9786d62fb9186b2951132f639c457cdb35f965771feab5e07f12d6102401893"} Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.927026 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mstlw" event={"ID":"87f3258e-a4f7-47bb-b448-aaa85f41c9d3","Type":"ContainerStarted","Data":"cfd95ff010a3cf6bfc6733cd96c57afd78211798547ea15837b259af09288fd7"} Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.930762 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2xztg" event={"ID":"f318b6bc-f681-4794-bab8-059ccf270229","Type":"ContainerStarted","Data":"40956b9638cf9d636bea022d46adefb77e878c427cd98ffb2398bd321872e0f3"} Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.930802 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2xztg" event={"ID":"f318b6bc-f681-4794-bab8-059ccf270229","Type":"ContainerStarted","Data":"a224533818df57af5143aa2036c76bcbf4e36a011e56b62f431b4a1e4623bfeb"} Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.931014 4909 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2xztg" Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.935167 4909 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-2xztg container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.935207 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2xztg" podUID="f318b6bc-f681-4794-bab8-059ccf270229" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.962966 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9gvvq" event={"ID":"ec5d8d6b-4bf8-442a-813d-7312fe78ab8a","Type":"ContainerStarted","Data":"bb3e8d710263d929c3dd966bf832baf63b6ab51dea7b140a0093a126853b7c9d"} Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.967788 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4cdjr" event={"ID":"78b7f343-023b-4ed9-bbb0-7b43c7d7a7d0","Type":"ContainerStarted","Data":"f06dc9e76a9c8998a8fb852be0b9e22e6cff27b817155aa69e8894832cdfdb5e"} Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.968682 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-c2btb" event={"ID":"50e06f1d-ae35-4177-afd0-55fc1112f0a7","Type":"ContainerStarted","Data":"126605d80c8022267bcac88bf24e8306217321d268c2ee1fd8f6fd716e526ed4"} Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.971705 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx" event={"ID":"29fc305d-0c7f-4ef9-8ae2-6534374de1ea","Type":"ContainerStarted","Data":"6d0292200ab0f3bdf2319652ac825f16659c360bb2c2eab7dff4c3d8d1c9c0d3"} Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.971733 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx" event={"ID":"29fc305d-0c7f-4ef9-8ae2-6534374de1ea","Type":"ContainerStarted","Data":"ade8c6b52cc55d23f081505b81b8f86383e96a5abc44c849eec75803b4704582"} Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.971797 4909 csr.go:261] certificate signing request csr-n4xdh is approved, waiting to be issued Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.972219 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx" Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.975267 4909 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-mwjvx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.975309 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx" podUID="29fc305d-0c7f-4ef9-8ae2-6534374de1ea" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.976134 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6wjkz" 
event={"ID":"369d9bc5-f128-4daf-afcb-3e1cdf7da3fb","Type":"ContainerStarted","Data":"f217d4fcc8c6ce607ee9742ebeeaa95f864eb732e0d164e49268732c3e8ece86"} Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.976170 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6wjkz" event={"ID":"369d9bc5-f128-4daf-afcb-3e1cdf7da3fb","Type":"ContainerStarted","Data":"31caa5b6ab5886358d53d66f5ee25b02737b7f0f47580ddbc74ac5d2ddff0447"} Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.981635 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-rtcqn" event={"ID":"b0544feb-caff-4805-8c58-bae454503fa0","Type":"ContainerStarted","Data":"0e8fc87c242ba5edef0ea3bed99cabcf284e7296c677fdb82b0f50fa2251a8cb"} Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.985033 4909 csr.go:257] certificate signing request csr-n4xdh is issued Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.987113 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ljw49" event={"ID":"0e581142-928a-4e07-888c-362d0ae7b49f","Type":"ContainerStarted","Data":"8ce8321bf287e0eca48ebf6d7e906bff70bea114b0f8522d64a12be348fd97a7"} Feb 02 10:33:38 crc kubenswrapper[4909]: I0202 10:33:38.997213 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f8wcs" event={"ID":"bec6950a-558b-4df9-a47b-128b7a3e4edb","Type":"ContainerStarted","Data":"d324c1c9b6d91767e85f1ddc512cb97aa0cbd3ad81f8b835cbd716a3b8dc3e06"} Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.007147 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: 
\"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:39 crc kubenswrapper[4909]: E0202 10:33:39.009237 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:39.509216405 +0000 UTC m=+145.255317260 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.038747 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tsvwl" event={"ID":"925654f4-6239-4a63-b5d2-9482939d83a8","Type":"ContainerStarted","Data":"a17a4a77299ea5bca1482fbcd3376b9366b13eae4e333a9af0c9c6ba1ae02f29"} Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.066406 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdmsw" event={"ID":"839bd402-e470-4226-aa0c-d8d295773a5b","Type":"ContainerStarted","Data":"7ccd49bad5cd3c61493f58fb2d4ad688bab126a5f937ae67488ecec9a4108882"} Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.072403 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-knpc2" event={"ID":"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2","Type":"ContainerStarted","Data":"5d03ebffc24ba7adff2b377b4805c6c9ada641cffc5b22dc2668cdf8a447aa8b"} Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.086141 
4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-f59sb" podStartSLOduration=5.086125517 podStartE2EDuration="5.086125517s" podCreationTimestamp="2026-02-02 10:33:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:39.08552913 +0000 UTC m=+144.831629865" watchObservedRunningTime="2026-02-02 10:33:39.086125517 +0000 UTC m=+144.832226242" Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.086478 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj" podStartSLOduration=123.086472927 podStartE2EDuration="2m3.086472927s" podCreationTimestamp="2026-02-02 10:31:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:39.048898001 +0000 UTC m=+144.794998736" watchObservedRunningTime="2026-02-02 10:33:39.086472927 +0000 UTC m=+144.832573662" Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.087775 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2m2wr" event={"ID":"a39a1e38-d1f2-4cda-a088-3a22446407cc","Type":"ContainerStarted","Data":"ebf5358deb0aebac0189010b4b0d89247702ecda8b41429ff3a8414f20c76377"} Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.089884 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptpkp" event={"ID":"6c932fa7-7181-4aac-bf6c-8a6d56f92ece","Type":"ContainerStarted","Data":"97e5e7a29aeb5ba2a46d44c307b191742da3ebbe71e0afcf52ebaa63f92c03eb"} Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.090897 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ftbpn" event={"ID":"d8bb8059-30d1-4fca-9f9e-c4dac4b0854f","Type":"ContainerStarted","Data":"c0c440965901de415e6728439ee37a21486656689c0b90c1ac8ed4f4a0541d0a"} Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.107464 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" event={"ID":"35861719-3b6b-4572-8761-bb9c8bfce573","Type":"ContainerStarted","Data":"96307e4d81072ce1836a138f7c748ec8031d8c4bbe1a16060f4c35e4a89acbfd"} Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.107512 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" event={"ID":"35861719-3b6b-4572-8761-bb9c8bfce573","Type":"ContainerStarted","Data":"fa2fb108174000fef77fae837385d0ee2e7f119d3d115802fdb3d7eaf8c3521d"} Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.108468 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.109233 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:39 crc kubenswrapper[4909]: E0202 10:33:39.109421 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:39.609397774 +0000 UTC m=+145.355498519 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.109931 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:39 crc kubenswrapper[4909]: E0202 10:33:39.113505 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:39.613490311 +0000 UTC m=+145.359591036 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.113796 4909 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-ddclx container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body= Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.113845 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" podUID="35861719-3b6b-4572-8761-bb9c8bfce573" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.125899 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ghmnw" event={"ID":"e650bcf6-f84c-4322-86ac-6df17841176d","Type":"ContainerStarted","Data":"86c5de4eca0cddd45044a274ff2e6952f2ae82512f8e0211a12a0237f26f55b4"} Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.130354 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qqbpq" event={"ID":"aa665eb8-a288-4806-9192-fc30b30868db","Type":"ContainerStarted","Data":"4387fcb3bc46aa155076dd34d287966d7e0c75fdfed35e05a9eb716e7ba1e546"} Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.144724 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-26rcr" event={"ID":"175ecff3-6a2a-4076-a012-7eee503357f9","Type":"ContainerStarted","Data":"006ac812649556831c14f0d2f2e36f2156115474e9948b87d618b9e604ba0ea1"} Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.145447 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-26rcr" Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.183451 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-64hf5" podStartSLOduration=124.183435314 podStartE2EDuration="2m4.183435314s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:39.181599501 +0000 UTC m=+144.927700256" watchObservedRunningTime="2026-02-02 10:33:39.183435314 +0000 UTC m=+144.929536049" Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.188943 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kw2jt" event={"ID":"b35c3a0c-04bf-4603-997b-09b0a4976d67","Type":"ContainerStarted","Data":"dd948bacc16576ce528ca9655a6de289c1ce494e5bdce02b55942e8601f629d1"} Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.197767 4909 generic.go:334] "Generic (PLEG): container finished" podID="ba5776ad-fa9b-4f21-9dca-40958f01e293" containerID="bcf58743512d2ac0f1a0979f7631210f1f0657f33b64af8680eafc6fa38e1ca6" exitCode=0 Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.199897 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq" event={"ID":"ba5776ad-fa9b-4f21-9dca-40958f01e293","Type":"ContainerDied","Data":"bcf58743512d2ac0f1a0979f7631210f1f0657f33b64af8680eafc6fa38e1ca6"} Feb 02 
10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.210851 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:39 crc kubenswrapper[4909]: E0202 10:33:39.211231 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:39.711215139 +0000 UTC m=+145.457315874 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.211529 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:39 crc kubenswrapper[4909]: E0202 10:33:39.213269 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-02 10:33:39.713257558 +0000 UTC m=+145.459358373 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.232380 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kddv4" event={"ID":"7d4d413d-3b6e-436a-8e63-7c17ddb1cf86","Type":"ContainerStarted","Data":"594feb89005570d721a3e7e2995b38478bbcbdf3e2faa7b72e39894838f7ef8e"} Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.278341 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h8sfq" event={"ID":"3a292656-f231-47aa-9ad8-f47d92cafb32","Type":"ContainerStarted","Data":"ab078da7459c3e4bee439874be529f1f26326fdf04f01e46be3a6ee1bde0c6a2"} Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.278411 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h8sfq" event={"ID":"3a292656-f231-47aa-9ad8-f47d92cafb32","Type":"ContainerStarted","Data":"5b8ea9a9d7b206ff1c900d620df3e37071e266ed204ab7cbe59c97bcf53853c3"} Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.294022 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-rtcqn" Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.296784 4909 patch_prober.go:28] interesting pod/router-default-5444994796-rtcqn container/router 
namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.296953 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rtcqn" podUID="b0544feb-caff-4805-8c58-bae454503fa0" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.312307 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:39 crc kubenswrapper[4909]: E0202 10:33:39.312682 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:39.812662494 +0000 UTC m=+145.558763229 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.313365 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:39 crc kubenswrapper[4909]: E0202 10:33:39.314296 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:39.814286211 +0000 UTC m=+145.560387046 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.377308 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jcpbk" podStartSLOduration=124.377288335 podStartE2EDuration="2m4.377288335s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:39.374452634 +0000 UTC m=+145.120553369" watchObservedRunningTime="2026-02-02 10:33:39.377288335 +0000 UTC m=+145.123389070" Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.420393 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:39 crc kubenswrapper[4909]: E0202 10:33:39.422087 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:39.922062697 +0000 UTC m=+145.668163482 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.461531 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-rtcqn" podStartSLOduration=124.461510686 podStartE2EDuration="2m4.461510686s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:39.459342434 +0000 UTC m=+145.205443169" watchObservedRunningTime="2026-02-02 10:33:39.461510686 +0000 UTC m=+145.207611421" Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.522237 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:39 crc kubenswrapper[4909]: E0202 10:33:39.522754 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:40.02274039 +0000 UTC m=+145.768841125 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.576304 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-rqcgf" podStartSLOduration=124.576289813 podStartE2EDuration="2m4.576289813s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:39.57374982 +0000 UTC m=+145.319850555" watchObservedRunningTime="2026-02-02 10:33:39.576289813 +0000 UTC m=+145.322390548" Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.625126 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:39 crc kubenswrapper[4909]: E0202 10:33:39.625246 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:40.125220794 +0000 UTC m=+145.871321529 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.625454 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:39 crc kubenswrapper[4909]: E0202 10:33:39.625905 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:40.125892533 +0000 UTC m=+145.871993268 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.656884 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx" podStartSLOduration=124.65686369 podStartE2EDuration="2m4.65686369s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:39.65582009 +0000 UTC m=+145.401920825" watchObservedRunningTime="2026-02-02 10:33:39.65686369 +0000 UTC m=+145.402964425" Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.727258 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:39 crc kubenswrapper[4909]: E0202 10:33:39.727837 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:40.22741305 +0000 UTC m=+145.973513785 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.728059 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:39 crc kubenswrapper[4909]: E0202 10:33:39.728447 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:40.22843708 +0000 UTC m=+145.974537815 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.737875 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6wjkz" podStartSLOduration=124.737843559 podStartE2EDuration="2m4.737843559s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:39.737611822 +0000 UTC m=+145.483712557" watchObservedRunningTime="2026-02-02 10:33:39.737843559 +0000 UTC m=+145.483944314" Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.827259 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kpxbr" podStartSLOduration=124.827237359 podStartE2EDuration="2m4.827237359s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:39.774497649 +0000 UTC m=+145.520598384" watchObservedRunningTime="2026-02-02 10:33:39.827237359 +0000 UTC m=+145.573338094" Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.830858 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:39 crc kubenswrapper[4909]: E0202 10:33:39.831208 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:40.331188942 +0000 UTC m=+146.077289677 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.835578 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2xztg" podStartSLOduration=123.835556267 podStartE2EDuration="2m3.835556267s" podCreationTimestamp="2026-02-02 10:31:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:39.822269377 +0000 UTC m=+145.568370122" watchObservedRunningTime="2026-02-02 10:33:39.835556267 +0000 UTC m=+145.581657002" Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.934008 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 
10:33:39 crc kubenswrapper[4909]: E0202 10:33:39.934298 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:40.434286495 +0000 UTC m=+146.180387230 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.985959 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-02 10:28:38 +0000 UTC, rotation deadline is 2026-12-21 17:48:03.116788767 +0000 UTC Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.986403 4909 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7735h14m23.13038869s for next certificate rotation Feb 02 10:33:39 crc kubenswrapper[4909]: I0202 10:33:39.988003 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-mstlw" podStartSLOduration=124.987987972 podStartE2EDuration="2m4.987987972s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:39.907752935 +0000 UTC m=+145.653853680" watchObservedRunningTime="2026-02-02 10:33:39.987987972 +0000 UTC m=+145.734088707" Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.035855 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:40 crc kubenswrapper[4909]: E0202 10:33:40.036163 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:40.536150362 +0000 UTC m=+146.282251097 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.077933 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" podStartSLOduration=125.077917888 podStartE2EDuration="2m5.077917888s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:40.077105764 +0000 UTC m=+145.823206499" watchObservedRunningTime="2026-02-02 10:33:40.077917888 +0000 UTC m=+145.824018623" Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.130569 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h8sfq" podStartSLOduration=125.130544275 podStartE2EDuration="2m5.130544275s" 
podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:40.130305998 +0000 UTC m=+145.876406733" watchObservedRunningTime="2026-02-02 10:33:40.130544275 +0000 UTC m=+145.876645010" Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.140399 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:40 crc kubenswrapper[4909]: E0202 10:33:40.140742 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:40.640730356 +0000 UTC m=+146.386831091 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.157859 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-26rcr" podStartSLOduration=125.157777824 podStartE2EDuration="2m5.157777824s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:40.154068668 +0000 UTC m=+145.900169403" watchObservedRunningTime="2026-02-02 10:33:40.157777824 +0000 UTC m=+145.903878569" Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.179104 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-f8wcs" podStartSLOduration=125.179090185 podStartE2EDuration="2m5.179090185s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:40.176125 +0000 UTC m=+145.922225735" watchObservedRunningTime="2026-02-02 10:33:40.179090185 +0000 UTC m=+145.925190920" Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.241768 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:40 crc kubenswrapper[4909]: E0202 10:33:40.242229 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:40.742210082 +0000 UTC m=+146.488310817 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.286744 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jp9hd" event={"ID":"2b6be365-e876-4c20-9105-67da9ad35291","Type":"ContainerStarted","Data":"38de74bd6b3002148035c1046dc8723311e643fc9321df3a91c0903f3bf7e316"} Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.287019 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jp9hd" event={"ID":"2b6be365-e876-4c20-9105-67da9ad35291","Type":"ContainerStarted","Data":"76d637802a2e82f747a1cf12e1a9c16f9a43c4d9ba26c0d552522b8363ca4499"} Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.290004 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9gvvq" event={"ID":"ec5d8d6b-4bf8-442a-813d-7312fe78ab8a","Type":"ContainerStarted","Data":"4466afdb2b3ecaa07129c5f9dd2fd8148ab490d56349e79c813c96a07123f69e"} Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.301351 4909 patch_prober.go:28] interesting 
pod/router-default-5444994796-rtcqn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:33:40 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Feb 02 10:33:40 crc kubenswrapper[4909]: [+]process-running ok Feb 02 10:33:40 crc kubenswrapper[4909]: healthz check failed Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.301824 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rtcqn" podUID="b0544feb-caff-4805-8c58-bae454503fa0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.318314 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9gvvq" podStartSLOduration=6.318298171 podStartE2EDuration="6.318298171s" podCreationTimestamp="2026-02-02 10:33:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:40.315849861 +0000 UTC m=+146.061950596" watchObservedRunningTime="2026-02-02 10:33:40.318298171 +0000 UTC m=+146.064398906" Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.319536 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6jr9" event={"ID":"06cb86a1-a324-4ba8-b508-1ddcc20b4025","Type":"ContainerStarted","Data":"8bf988c5cd81e709de7b69e4c5b405bac5986c110b86b943758cd257613c102e"} Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.319665 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6jr9" event={"ID":"06cb86a1-a324-4ba8-b508-1ddcc20b4025","Type":"ContainerStarted","Data":"773ba5dd2d78fbc7eda5352483baf9e0ed9cf28bf61633fc87138da4b3aaa55e"} Feb 02 10:33:40 crc 
kubenswrapper[4909]: I0202 10:33:40.320788 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9sjgx" event={"ID":"a8d139ed-0b63-4cfa-9b66-0fe970d40006","Type":"ContainerStarted","Data":"896b0f0198b4e93bf3329dffb2a081c6324c3ecbb5fdb2a11e6ea2e356b1c177"} Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.343480 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:40 crc kubenswrapper[4909]: E0202 10:33:40.344009 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:40.843990747 +0000 UTC m=+146.590091482 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.354149 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qqbpq" event={"ID":"aa665eb8-a288-4806-9192-fc30b30868db","Type":"ContainerStarted","Data":"d2ae88a4efcd19b46139a3dc254a452ca0fd7426db96c8d636b31c5297357899"} Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.385876 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-gprz6" event={"ID":"b85dc0d9-6b5b-4254-bd4b-a99712b8a1fb","Type":"ContainerStarted","Data":"135550bb04b8827afb7b1857b7e73365166054bb46defa196624e64901fd143f"} Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.386459 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-gprz6" Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.389558 4909 patch_prober.go:28] interesting pod/downloads-7954f5f757-gprz6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.389671 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gprz6" podUID="b85dc0d9-6b5b-4254-bd4b-a99712b8a1fb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Feb 02 
10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.398038 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-v6jr9" podStartSLOduration=125.398020314 podStartE2EDuration="2m5.398020314s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:40.39648747 +0000 UTC m=+146.142588205" watchObservedRunningTime="2026-02-02 10:33:40.398020314 +0000 UTC m=+146.144121049" Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.410310 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kddv4" event={"ID":"7d4d413d-3b6e-436a-8e63-7c17ddb1cf86","Type":"ContainerStarted","Data":"e9c4648bace7d7861a222cc4c44b133b7671644e1f3274475319a09249ce0844"} Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.410355 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kddv4" event={"ID":"7d4d413d-3b6e-436a-8e63-7c17ddb1cf86","Type":"ContainerStarted","Data":"25464815a90fd78076f4a67ef86323196a3b31c3193fdae526d1e3cfd1077863"} Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.411088 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kddv4" Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.421734 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ghmnw" event={"ID":"e650bcf6-f84c-4322-86ac-6df17841176d","Type":"ContainerStarted","Data":"eff9ec98bac1bb6bc859a43153271077c144cea198e2827e8026ee775a88b776"} Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.434067 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/downloads-7954f5f757-gprz6" podStartSLOduration=125.434051295 podStartE2EDuration="2m5.434051295s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:40.433251783 +0000 UTC m=+146.179352518" watchObservedRunningTime="2026-02-02 10:33:40.434051295 +0000 UTC m=+146.180152030" Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.445782 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:40 crc kubenswrapper[4909]: E0202 10:33:40.447039 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:40.947022037 +0000 UTC m=+146.693122772 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.455057 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-h8sfq" event={"ID":"3a292656-f231-47aa-9ad8-f47d92cafb32","Type":"ContainerStarted","Data":"f5286d17d0e247039852684e7e4007793419348ccf98e30d2cee0d273acaacbc"} Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.469389 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kw2jt" event={"ID":"b35c3a0c-04bf-4603-997b-09b0a4976d67","Type":"ContainerStarted","Data":"fe829cf7eb894373654da41a8ea1729560abf14f4ed08790b677702764ee698d"} Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.477424 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ljw49" event={"ID":"0e581142-928a-4e07-888c-362d0ae7b49f","Type":"ContainerStarted","Data":"251f8cb6cebc5f99b058f599699e0963cdd72b5769bb4cfef89f603ea8c4c353"} Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.477462 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ljw49" event={"ID":"0e581142-928a-4e07-888c-362d0ae7b49f","Type":"ContainerStarted","Data":"9f017880859bf943d6d0f7beef410f24e769ff416bdfad3e5b0d814fb6e981d9"} Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.487592 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver/apiserver-76f77b778f-knpc2" event={"ID":"2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2","Type":"ContainerStarted","Data":"961fa97675a6e5556a14af4241ac5f58ac6c8d301320ad22e94c1b13a4141bd7"} Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.492746 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-qqbpq" podStartSLOduration=125.492732506 podStartE2EDuration="2m5.492732506s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:40.456079716 +0000 UTC m=+146.202180451" watchObservedRunningTime="2026-02-02 10:33:40.492732506 +0000 UTC m=+146.238833241" Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.494096 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kddv4" podStartSLOduration=124.494089935 podStartE2EDuration="2m4.494089935s" podCreationTimestamp="2026-02-02 10:31:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:40.477548331 +0000 UTC m=+146.223649066" watchObservedRunningTime="2026-02-02 10:33:40.494089935 +0000 UTC m=+146.240190670" Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.500067 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4cdjr" event={"ID":"78b7f343-023b-4ed9-bbb0-7b43c7d7a7d0","Type":"ContainerStarted","Data":"06b9bed002ee6e8799386f74827f5c3dc020e1a57c2b7287709eeee5848df46e"} Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.500109 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4cdjr" 
event={"ID":"78b7f343-023b-4ed9-bbb0-7b43c7d7a7d0","Type":"ContainerStarted","Data":"e05ed4a9226220b1e9607da6517f864515ea0df95a162b33a77588c0a053d7c8"} Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.502895 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-c2btb" event={"ID":"50e06f1d-ae35-4177-afd0-55fc1112f0a7","Type":"ContainerStarted","Data":"f2ce9a446a9f4c9bdabeb2a92038d6a2aff0a57b0b772e5fde0eed499f1ff4ee"} Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.539184 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq" event={"ID":"ba5776ad-fa9b-4f21-9dca-40958f01e293","Type":"ContainerStarted","Data":"d9a55767fe6e6df126e04580dd11fb3a6c3f5416e9d47ab8cf0a71d37a95740e"} Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.544752 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ftbpn" event={"ID":"d8bb8059-30d1-4fca-9f9e-c4dac4b0854f","Type":"ContainerStarted","Data":"d803340a9e2ec65e8457485d783108cfd04c26e17fdcf9b9f2fa47ed79b7a080"} Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.549995 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:40 crc kubenswrapper[4909]: E0202 10:33:40.550582 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:41.050568652 +0000 UTC m=+146.796669387 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.555211 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-ghmnw" podStartSLOduration=125.555196454 podStartE2EDuration="2m5.555196454s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:40.50998759 +0000 UTC m=+146.256088345" watchObservedRunningTime="2026-02-02 10:33:40.555196454 +0000 UTC m=+146.301297189" Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.555327 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4cdjr" podStartSLOduration=125.555322898 podStartE2EDuration="2m5.555322898s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:40.549150571 +0000 UTC m=+146.295251306" watchObservedRunningTime="2026-02-02 10:33:40.555322898 +0000 UTC m=+146.301423643" Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.559180 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7rkdz" event={"ID":"97e0804e-f2d3-45b2-bd8e-c5df2099dea5","Type":"ContainerStarted","Data":"4bf818fe46e2e71aeb53aa7402f62106944e305a253db15c7fa8cbeb02dc0816"} Feb 02 10:33:40 crc 
kubenswrapper[4909]: I0202 10:33:40.573185 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2m2wr" event={"ID":"a39a1e38-d1f2-4cda-a088-3a22446407cc","Type":"ContainerStarted","Data":"7864d82de238665fa9a60e3c5077239f8e44314caa0358ea15eb0062d8b5ef1f"} Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.573240 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2m2wr" event={"ID":"a39a1e38-d1f2-4cda-a088-3a22446407cc","Type":"ContainerStarted","Data":"f6716d3005bd753ec8a5103de61fad4b5b4b3318c2333b8de405d48a9c7cbd88"} Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.582016 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-ljw49" podStartSLOduration=125.581997232 podStartE2EDuration="2m5.581997232s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:40.580570811 +0000 UTC m=+146.326671546" watchObservedRunningTime="2026-02-02 10:33:40.581997232 +0000 UTC m=+146.328097967" Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.585007 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptpkp" event={"ID":"6c932fa7-7181-4aac-bf6c-8a6d56f92ece","Type":"ContainerStarted","Data":"ce559b7f6687080d01e57978cabc1e09b23dbd89052541f0b6ef58145f8910bf"} Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.590504 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hx79g" event={"ID":"6e592fee-28c7-49ef-b1e8-203a72d633b7","Type":"ContainerStarted","Data":"575d2239ac130b0e14844f7131c26101168e4155b2d3027d063ec37fa2fa2ccc"} Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 
10:33:40.601939 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tsvwl" event={"ID":"925654f4-6239-4a63-b5d2-9482939d83a8","Type":"ContainerStarted","Data":"96673945387b92371bab369ad649e77a73d7605b6d8733b413c7b7636971cf67"} Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.616848 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdmsw" event={"ID":"839bd402-e470-4226-aa0c-d8d295773a5b","Type":"ContainerStarted","Data":"9584c169a7176989160b5b22051cb17b2bbc118c857348e82d104679ec974aea"} Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.618049 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdmsw" Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.618927 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-knpc2" podStartSLOduration=125.618910089 podStartE2EDuration="2m5.618910089s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:40.61754709 +0000 UTC m=+146.363647845" watchObservedRunningTime="2026-02-02 10:33:40.618910089 +0000 UTC m=+146.365010824" Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.620688 4909 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-rdmsw container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.620735 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdmsw" 
podUID="839bd402-e470-4226-aa0c-d8d295773a5b" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.632046 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nt7q4" event={"ID":"b575e8ec-7b85-4647-b0af-03274d67afc8","Type":"ContainerStarted","Data":"fa9a9e026623d194f2e469fce2e458deac81c27f84db141aec5e4634c22f9654"} Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.632117 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nt7q4" event={"ID":"b575e8ec-7b85-4647-b0af-03274d67afc8","Type":"ContainerStarted","Data":"55af95ee6adcd8e541b2adcf7ef60a39a2f7e8dd151aec5dc3b1955a0893f448"} Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.633445 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-nt7q4" Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.642519 4909 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-nt7q4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/healthz\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.642563 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-nt7q4" podUID="b575e8ec-7b85-4647-b0af-03274d67afc8" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.18:8080/healthz\": dial tcp 10.217.0.18:8080: connect: connection refused" Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.645361 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-c2btb" podStartSLOduration=124.645351826 podStartE2EDuration="2m4.645351826s" podCreationTimestamp="2026-02-02 10:31:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:40.643015159 +0000 UTC m=+146.389115894" watchObservedRunningTime="2026-02-02 10:33:40.645351826 +0000 UTC m=+146.391452561"
Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.649459 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gvxk8" event={"ID":"bac9c408-8a21-4db2-b450-214c285e45c4","Type":"ContainerStarted","Data":"52cffbeabb8df6c294ee19b1c779e763a7254671d8df45b0c488b2e6f767ed92"}
Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.649509 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gvxk8"
Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.649520 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gvxk8" event={"ID":"bac9c408-8a21-4db2-b450-214c285e45c4","Type":"ContainerStarted","Data":"9b25573a3f89e82482ff38d0ea5c3623720d506ae7ac9ed349be00d6349b4265"}
Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.650990 4909 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-mwjvx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body=
Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.651023 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx" podUID="29fc305d-0c7f-4ef9-8ae2-6534374de1ea" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused"
Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.651603 4909 patch_prober.go:28] interesting pod/console-operator-58897d9998-64hf5 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body=
Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.651657 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-64hf5" podUID="6514ff15-d150-4c13-b96b-cf885b71504a" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused"
Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.651983 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 10:33:40 crc kubenswrapper[4909]: E0202 10:33:40.652823 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:41.152229173 +0000 UTC m=+146.898329908 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.652898 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb"
Feb 02 10:33:40 crc kubenswrapper[4909]: E0202 10:33:40.654398 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:41.154390525 +0000 UTC m=+146.900491260 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.661404 4909 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-gvxk8 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body=
Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.661666 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gvxk8" podUID="bac9c408-8a21-4db2-b450-214c285e45c4" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused"
Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.669407 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj"
Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.694213 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2xztg"
Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.716656 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hx79g" podStartSLOduration=125.716638727 podStartE2EDuration="2m5.716638727s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:40.716216525 +0000 UTC m=+146.462317260" watchObservedRunningTime="2026-02-02 10:33:40.716638727 +0000 UTC m=+146.462739462"
Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.717867 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kw2jt" podStartSLOduration=125.717860132 podStartE2EDuration="2m5.717860132s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:40.679469793 +0000 UTC m=+146.425570528" watchObservedRunningTime="2026-02-02 10:33:40.717860132 +0000 UTC m=+146.463960867"
Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.759039 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 10:33:40 crc kubenswrapper[4909]: E0202 10:33:40.760648 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:41.260627467 +0000 UTC m=+147.006728202 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.797217 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq" podStartSLOduration=124.797198844 podStartE2EDuration="2m4.797198844s" podCreationTimestamp="2026-02-02 10:31:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:40.797163453 +0000 UTC m=+146.543264198" watchObservedRunningTime="2026-02-02 10:33:40.797198844 +0000 UTC m=+146.543299579"
Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.797453 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ptpkp" podStartSLOduration=125.797448081 podStartE2EDuration="2m5.797448081s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:40.743024723 +0000 UTC m=+146.489125458" watchObservedRunningTime="2026-02-02 10:33:40.797448081 +0000 UTC m=+146.543548816"
Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.854972 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tsvwl" podStartSLOduration=124.854949646 podStartE2EDuration="2m4.854949646s" podCreationTimestamp="2026-02-02 10:31:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:40.854655778 +0000 UTC m=+146.600756513" watchObservedRunningTime="2026-02-02 10:33:40.854949646 +0000 UTC m=+146.601050381"
Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.855482 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdmsw" podStartSLOduration=124.855476242 podStartE2EDuration="2m4.855476242s" podCreationTimestamp="2026-02-02 10:31:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:40.840118632 +0000 UTC m=+146.586219367" watchObservedRunningTime="2026-02-02 10:33:40.855476242 +0000 UTC m=+146.601576977"
Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.863143 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb"
Feb 02 10:33:40 crc kubenswrapper[4909]: E0202 10:33:40.863567 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:41.363542933 +0000 UTC m=+147.109643668 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.908466 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gvxk8" podStartSLOduration=124.908448788 podStartE2EDuration="2m4.908448788s" podCreationTimestamp="2026-02-02 10:31:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:40.907216443 +0000 UTC m=+146.653317178" watchObservedRunningTime="2026-02-02 10:33:40.908448788 +0000 UTC m=+146.654549523"
Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.946991 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-nt7q4" podStartSLOduration=124.946973342 podStartE2EDuration="2m4.946973342s" podCreationTimestamp="2026-02-02 10:31:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:40.945073807 +0000 UTC m=+146.691174542" watchObservedRunningTime="2026-02-02 10:33:40.946973342 +0000 UTC m=+146.693074077"
Feb 02 10:33:40 crc kubenswrapper[4909]: I0202 10:33:40.964430 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 10:33:40 crc kubenswrapper[4909]: E0202 10:33:40.964924 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:41.464903545 +0000 UTC m=+147.211004280 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.019682 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ftbpn" podStartSLOduration=126.019664763 podStartE2EDuration="2m6.019664763s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:40.976163407 +0000 UTC m=+146.722264162" watchObservedRunningTime="2026-02-02 10:33:41.019664763 +0000 UTC m=+146.765765498"
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.019764 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-2m2wr" podStartSLOduration=125.019760976 podStartE2EDuration="2m5.019760976s" podCreationTimestamp="2026-02-02 10:31:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:41.019296002 +0000 UTC m=+146.765396737" watchObservedRunningTime="2026-02-02 10:33:41.019760976 +0000 UTC m=+146.765861711"
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.065788 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb"
Feb 02 10:33:41 crc kubenswrapper[4909]: E0202 10:33:41.066129 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:41.566114423 +0000 UTC m=+147.312215148 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.067000 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-7rkdz" podStartSLOduration=125.066989108 podStartE2EDuration="2m5.066989108s" podCreationTimestamp="2026-02-02 10:31:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:41.064306421 +0000 UTC m=+146.810407156" watchObservedRunningTime="2026-02-02 10:33:41.066989108 +0000 UTC m=+146.813089843"
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.167671 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 10:33:41 crc kubenswrapper[4909]: E0202 10:33:41.168020 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:41.668005261 +0000 UTC m=+147.414105996 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.269145 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb"
Feb 02 10:33:41 crc kubenswrapper[4909]: E0202 10:33:41.269632 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:41.76961247 +0000 UTC m=+147.515713275 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.296772 4909 patch_prober.go:28] interesting pod/router-default-5444994796-rtcqn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 02 10:33:41 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld
Feb 02 10:33:41 crc kubenswrapper[4909]: [+]process-running ok
Feb 02 10:33:41 crc kubenswrapper[4909]: healthz check failed
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.297091 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rtcqn" podUID="b0544feb-caff-4805-8c58-bae454503fa0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.355843 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-knpc2"
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.356196 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-knpc2"
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.357233 4909 patch_prober.go:28] interesting pod/apiserver-76f77b778f-knpc2 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.357275 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-knpc2" podUID="2c5a0a27-fe7f-4298-a3ea-734c4bd8c3f2" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused"
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.365755 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-ddclx"
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.370336 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 10:33:41 crc kubenswrapper[4909]: E0202 10:33:41.370648 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:41.870633973 +0000 UTC m=+147.616734708 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.471302 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb"
Feb 02 10:33:41 crc kubenswrapper[4909]: E0202 10:33:41.471582 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:41.971568813 +0000 UTC m=+147.717669548 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.528927 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq"
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.528972 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq"
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.531176 4909 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-q62sq container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.12:8443/livez\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.531246 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq" podUID="ba5776ad-fa9b-4f21-9dca-40958f01e293" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.12:8443/livez\": dial tcp 10.217.0.12:8443: connect: connection refused"
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.572549 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 10:33:41 crc kubenswrapper[4909]: E0202 10:33:41.572769 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:42.0727338 +0000 UTC m=+147.818834545 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.572878 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb"
Feb 02 10:33:41 crc kubenswrapper[4909]: E0202 10:33:41.573190 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:42.073182813 +0000 UTC m=+147.819283548 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.652631 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jp9hd" event={"ID":"2b6be365-e876-4c20-9105-67da9ad35291","Type":"ContainerStarted","Data":"45c3c50578e56ed9879ffa30737cba338136da2cadeafe37570a61a2930308ab"}
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.653551 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-jp9hd"
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.655731 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9sjgx" event={"ID":"a8d139ed-0b63-4cfa-9b66-0fe970d40006","Type":"ContainerStarted","Data":"138ca84d8272ae83ec32910178bd64d19d4b7524fed7c03429a3be01ec693ca9"}
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.666614 4909 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-nt7q4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/healthz\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body=
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.666668 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-nt7q4" podUID="b575e8ec-7b85-4647-b0af-03274d67afc8" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.18:8080/healthz\": dial tcp 10.217.0.18:8080: connect: connection refused"
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.667558 4909 patch_prober.go:28] interesting pod/downloads-7954f5f757-gprz6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body=
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.667591 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gprz6" podUID="b85dc0d9-6b5b-4254-bd4b-a99712b8a1fb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused"
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.673631 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-rdmsw"
Feb 02 10:33:41 crc kubenswrapper[4909]: E0202 10:33:41.674278 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:42.174260798 +0000 UTC m=+147.920361533 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.674310 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.674606 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb"
Feb 02 10:33:41 crc kubenswrapper[4909]: E0202 10:33:41.674974 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:42.174963178 +0000 UTC m=+147.921063923 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.676174 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-26rcr"
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.775934 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 10:33:41 crc kubenswrapper[4909]: E0202 10:33:41.777359 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:42.277345419 +0000 UTC m=+148.023446154 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.841404 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jp9hd" podStartSLOduration=7.841388233 podStartE2EDuration="7.841388233s" podCreationTimestamp="2026-02-02 10:33:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:41.725443963 +0000 UTC m=+147.471544698" watchObservedRunningTime="2026-02-02 10:33:41.841388233 +0000 UTC m=+147.587488968"
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.879267 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb"
Feb 02 10:33:41 crc kubenswrapper[4909]: E0202 10:33:41.879989 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:42.379961168 +0000 UTC m=+148.126061903 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:33:41 crc kubenswrapper[4909]: I0202 10:33:41.980380 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 02 10:33:41 crc kubenswrapper[4909]: E0202 10:33:41.980823 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:42.480787205 +0000 UTC m=+148.226887940 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.081577 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb"
Feb 02 10:33:42 crc kubenswrapper[4909]: E0202 10:33:42.081858 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:42.581846059 +0000 UTC m=+148.327946794 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.182444 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:42 crc kubenswrapper[4909]: E0202 10:33:42.182734 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:42.682691956 +0000 UTC m=+148.428792691 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.182984 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:42 crc kubenswrapper[4909]: E0202 10:33:42.183338 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:42.683322734 +0000 UTC m=+148.429423469 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.284351 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:42 crc kubenswrapper[4909]: E0202 10:33:42.284541 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:42.784508912 +0000 UTC m=+148.530609657 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.284720 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:42 crc kubenswrapper[4909]: E0202 10:33:42.285284 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:42.785268293 +0000 UTC m=+148.531369028 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.302352 4909 patch_prober.go:28] interesting pod/router-default-5444994796-rtcqn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:33:42 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Feb 02 10:33:42 crc kubenswrapper[4909]: [+]process-running ok Feb 02 10:33:42 crc kubenswrapper[4909]: healthz check failed Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.302478 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rtcqn" podUID="b0544feb-caff-4805-8c58-bae454503fa0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.385792 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:42 crc kubenswrapper[4909]: E0202 10:33:42.386015 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-02 10:33:42.885902025 +0000 UTC m=+148.632002760 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.386155 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:42 crc kubenswrapper[4909]: E0202 10:33:42.386399 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:42.886392099 +0000 UTC m=+148.632492834 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.487140 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:42 crc kubenswrapper[4909]: E0202 10:33:42.487302 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:42.987276908 +0000 UTC m=+148.733377643 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.487402 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:42 crc kubenswrapper[4909]: E0202 10:33:42.487781 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:42.987744801 +0000 UTC m=+148.733845536 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.588738 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:42 crc kubenswrapper[4909]: E0202 10:33:42.588883 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:43.088866477 +0000 UTC m=+148.834967212 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.589297 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:42 crc kubenswrapper[4909]: E0202 10:33:42.589722 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:43.089702471 +0000 UTC m=+148.835803256 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.661015 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9sjgx" event={"ID":"a8d139ed-0b63-4cfa-9b66-0fe970d40006","Type":"ContainerStarted","Data":"96a1e8093e9a4408ec10a27a67d65642131f36fdf9e7df75af9a16403187bad9"} Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.662723 4909 generic.go:334] "Generic (PLEG): container finished" podID="50e06f1d-ae35-4177-afd0-55fc1112f0a7" containerID="f2ce9a446a9f4c9bdabeb2a92038d6a2aff0a57b0b772e5fde0eed499f1ff4ee" exitCode=0 Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.662877 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-c2btb" event={"ID":"50e06f1d-ae35-4177-afd0-55fc1112f0a7","Type":"ContainerDied","Data":"f2ce9a446a9f4c9bdabeb2a92038d6a2aff0a57b0b772e5fde0eed499f1ff4ee"} Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.663442 4909 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-nt7q4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.18:8080/healthz\": dial tcp 10.217.0.18:8080: connect: connection refused" start-of-body= Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.663483 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-nt7q4" podUID="b575e8ec-7b85-4647-b0af-03274d67afc8" containerName="marketplace-operator" 
probeResult="failure" output="Get \"http://10.217.0.18:8080/healthz\": dial tcp 10.217.0.18:8080: connect: connection refused" Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.664283 4909 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-gvxk8 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.664314 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gvxk8" podUID="bac9c408-8a21-4db2-b450-214c285e45c4" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.690356 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:42 crc kubenswrapper[4909]: E0202 10:33:42.690506 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:43.190481726 +0000 UTC m=+148.936582461 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.690607 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:42 crc kubenswrapper[4909]: E0202 10:33:42.690973 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:43.1909624 +0000 UTC m=+148.937063135 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.794359 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:42 crc kubenswrapper[4909]: E0202 10:33:42.794553 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:43.294526096 +0000 UTC m=+149.040626841 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.795391 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:42 crc kubenswrapper[4909]: E0202 10:33:42.795473 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:43.295459022 +0000 UTC m=+149.041559757 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.898053 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:42 crc kubenswrapper[4909]: E0202 10:33:42.898161 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:43.398138093 +0000 UTC m=+149.144238818 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.898229 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.898321 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.898365 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.898394 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.898415 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:42 crc kubenswrapper[4909]: E0202 10:33:42.898494 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:43.398484522 +0000 UTC m=+149.144585247 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.903124 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.905145 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.908077 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.912525 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.998911 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:42 crc kubenswrapper[4909]: E0202 10:33:42.999115 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:43.499086163 +0000 UTC m=+149.245186908 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.999518 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:42 crc kubenswrapper[4909]: I0202 10:33:42.999531 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mlhl9"] Feb 02 10:33:42 crc kubenswrapper[4909]: E0202 10:33:42.999815 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:43.499778553 +0000 UTC m=+149.245879288 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.000664 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mlhl9" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.036172 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mlhl9"] Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.037896 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.058768 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.059209 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.059887 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.100438 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.100689 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7164d60d-218c-47e0-a74a-677793e589b0-utilities\") pod \"certified-operators-mlhl9\" (UID: \"7164d60d-218c-47e0-a74a-677793e589b0\") " pod="openshift-marketplace/certified-operators-mlhl9" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.100728 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46jr9\" (UniqueName: \"kubernetes.io/projected/7164d60d-218c-47e0-a74a-677793e589b0-kube-api-access-46jr9\") pod \"certified-operators-mlhl9\" (UID: \"7164d60d-218c-47e0-a74a-677793e589b0\") " pod="openshift-marketplace/certified-operators-mlhl9" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.100777 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7164d60d-218c-47e0-a74a-677793e589b0-catalog-content\") pod \"certified-operators-mlhl9\" (UID: \"7164d60d-218c-47e0-a74a-677793e589b0\") " pod="openshift-marketplace/certified-operators-mlhl9" Feb 02 10:33:43 crc kubenswrapper[4909]: E0202 10:33:43.100906 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:43.600891898 +0000 UTC m=+149.346992633 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.202427 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7164d60d-218c-47e0-a74a-677793e589b0-utilities\") pod \"certified-operators-mlhl9\" (UID: \"7164d60d-218c-47e0-a74a-677793e589b0\") " pod="openshift-marketplace/certified-operators-mlhl9" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.202483 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46jr9\" (UniqueName: \"kubernetes.io/projected/7164d60d-218c-47e0-a74a-677793e589b0-kube-api-access-46jr9\") pod \"certified-operators-mlhl9\" (UID: \"7164d60d-218c-47e0-a74a-677793e589b0\") " pod="openshift-marketplace/certified-operators-mlhl9" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.202516 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.202564 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7164d60d-218c-47e0-a74a-677793e589b0-catalog-content\") pod \"certified-operators-mlhl9\" (UID: \"7164d60d-218c-47e0-a74a-677793e589b0\") " pod="openshift-marketplace/certified-operators-mlhl9" Feb 02 10:33:43 crc kubenswrapper[4909]: E0202 10:33:43.202915 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:43.702900979 +0000 UTC m=+149.449001714 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.202946 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7164d60d-218c-47e0-a74a-677793e589b0-utilities\") pod \"certified-operators-mlhl9\" (UID: \"7164d60d-218c-47e0-a74a-677793e589b0\") " pod="openshift-marketplace/certified-operators-mlhl9" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.203015 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7164d60d-218c-47e0-a74a-677793e589b0-catalog-content\") pod \"certified-operators-mlhl9\" (UID: \"7164d60d-218c-47e0-a74a-677793e589b0\") " pod="openshift-marketplace/certified-operators-mlhl9" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.221721 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-46jr9\" (UniqueName: \"kubernetes.io/projected/7164d60d-218c-47e0-a74a-677793e589b0-kube-api-access-46jr9\") pod \"certified-operators-mlhl9\" (UID: \"7164d60d-218c-47e0-a74a-677793e589b0\") " pod="openshift-marketplace/certified-operators-mlhl9" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.300968 4909 patch_prober.go:28] interesting pod/router-default-5444994796-rtcqn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:33:43 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Feb 02 10:33:43 crc kubenswrapper[4909]: [+]process-running ok Feb 02 10:33:43 crc kubenswrapper[4909]: healthz check failed Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.301015 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rtcqn" podUID="b0544feb-caff-4805-8c58-bae454503fa0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.303552 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:43 crc kubenswrapper[4909]: E0202 10:33:43.303904 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:43.803874821 +0000 UTC m=+149.549975556 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.321766 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gwvb8"] Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.322732 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gwvb8" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.323596 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mlhl9" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.362310 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gwvb8"] Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.406636 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62300332-ecea-47ea-9809-6cc89e9593bf-catalog-content\") pod \"certified-operators-gwvb8\" (UID: \"62300332-ecea-47ea-9809-6cc89e9593bf\") " pod="openshift-marketplace/certified-operators-gwvb8" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.406714 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.406767 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62300332-ecea-47ea-9809-6cc89e9593bf-utilities\") pod \"certified-operators-gwvb8\" (UID: \"62300332-ecea-47ea-9809-6cc89e9593bf\") " pod="openshift-marketplace/certified-operators-gwvb8" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.406832 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nds6\" (UniqueName: \"kubernetes.io/projected/62300332-ecea-47ea-9809-6cc89e9593bf-kube-api-access-7nds6\") pod \"certified-operators-gwvb8\" (UID: \"62300332-ecea-47ea-9809-6cc89e9593bf\") " pod="openshift-marketplace/certified-operators-gwvb8" Feb 02 10:33:43 crc kubenswrapper[4909]: E0202 10:33:43.407153 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:43.907140948 +0000 UTC m=+149.653241683 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.434007 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.435127 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.442639 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.442866 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.460736 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.507596 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.507924 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nds6\" (UniqueName: 
\"kubernetes.io/projected/62300332-ecea-47ea-9809-6cc89e9593bf-kube-api-access-7nds6\") pod \"certified-operators-gwvb8\" (UID: \"62300332-ecea-47ea-9809-6cc89e9593bf\") " pod="openshift-marketplace/certified-operators-gwvb8" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.507985 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62300332-ecea-47ea-9809-6cc89e9593bf-catalog-content\") pod \"certified-operators-gwvb8\" (UID: \"62300332-ecea-47ea-9809-6cc89e9593bf\") " pod="openshift-marketplace/certified-operators-gwvb8" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.508051 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6950001-fb3a-457a-82f7-4c0cb20a876a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c6950001-fb3a-457a-82f7-4c0cb20a876a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.508078 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62300332-ecea-47ea-9809-6cc89e9593bf-utilities\") pod \"certified-operators-gwvb8\" (UID: \"62300332-ecea-47ea-9809-6cc89e9593bf\") " pod="openshift-marketplace/certified-operators-gwvb8" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.508101 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c6950001-fb3a-457a-82f7-4c0cb20a876a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c6950001-fb3a-457a-82f7-4c0cb20a876a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:33:43 crc kubenswrapper[4909]: E0202 10:33:43.508214 4909 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:44.008193831 +0000 UTC m=+149.754294576 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.508883 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62300332-ecea-47ea-9809-6cc89e9593bf-catalog-content\") pod \"certified-operators-gwvb8\" (UID: \"62300332-ecea-47ea-9809-6cc89e9593bf\") " pod="openshift-marketplace/certified-operators-gwvb8" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.509143 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62300332-ecea-47ea-9809-6cc89e9593bf-utilities\") pod \"certified-operators-gwvb8\" (UID: \"62300332-ecea-47ea-9809-6cc89e9593bf\") " pod="openshift-marketplace/certified-operators-gwvb8" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.523973 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-72ltv"] Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.524872 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-72ltv" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.544542 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-72ltv"] Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.544746 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.571612 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nds6\" (UniqueName: \"kubernetes.io/projected/62300332-ecea-47ea-9809-6cc89e9593bf-kube-api-access-7nds6\") pod \"certified-operators-gwvb8\" (UID: \"62300332-ecea-47ea-9809-6cc89e9593bf\") " pod="openshift-marketplace/certified-operators-gwvb8" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.592151 4909 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.609579 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntsgz\" (UniqueName: \"kubernetes.io/projected/58068665-fe9b-4bd9-ac11-a3d6c9ad888e-kube-api-access-ntsgz\") pod \"community-operators-72ltv\" (UID: \"58068665-fe9b-4bd9-ac11-a3d6c9ad888e\") " pod="openshift-marketplace/community-operators-72ltv" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.609659 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:43 crc 
kubenswrapper[4909]: I0202 10:33:43.609705 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6950001-fb3a-457a-82f7-4c0cb20a876a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c6950001-fb3a-457a-82f7-4c0cb20a876a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.609735 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c6950001-fb3a-457a-82f7-4c0cb20a876a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c6950001-fb3a-457a-82f7-4c0cb20a876a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.609763 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58068665-fe9b-4bd9-ac11-a3d6c9ad888e-utilities\") pod \"community-operators-72ltv\" (UID: \"58068665-fe9b-4bd9-ac11-a3d6c9ad888e\") " pod="openshift-marketplace/community-operators-72ltv" Feb 02 10:33:43 crc kubenswrapper[4909]: E0202 10:33:43.610437 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:44.110418228 +0000 UTC m=+149.856519023 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.614015 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58068665-fe9b-4bd9-ac11-a3d6c9ad888e-catalog-content\") pod \"community-operators-72ltv\" (UID: \"58068665-fe9b-4bd9-ac11-a3d6c9ad888e\") " pod="openshift-marketplace/community-operators-72ltv" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.614420 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c6950001-fb3a-457a-82f7-4c0cb20a876a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c6950001-fb3a-457a-82f7-4c0cb20a876a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.655905 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6950001-fb3a-457a-82f7-4c0cb20a876a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c6950001-fb3a-457a-82f7-4c0cb20a876a\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.679148 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gwvb8" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.702514 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9sjgx" event={"ID":"a8d139ed-0b63-4cfa-9b66-0fe970d40006","Type":"ContainerStarted","Data":"be323af4e5a2c4646ed31a80e7fe6e2519fc5b7cc5be579a939931e8127ce884"} Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.702555 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9sjgx" event={"ID":"a8d139ed-0b63-4cfa-9b66-0fe970d40006","Type":"ContainerStarted","Data":"6534c66477778f903c2f87af6a5a391c35bc9097d6d11e8ee5ba946b614da0d7"} Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.714921 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.715187 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58068665-fe9b-4bd9-ac11-a3d6c9ad888e-catalog-content\") pod \"community-operators-72ltv\" (UID: \"58068665-fe9b-4bd9-ac11-a3d6c9ad888e\") " pod="openshift-marketplace/community-operators-72ltv" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.715240 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntsgz\" (UniqueName: \"kubernetes.io/projected/58068665-fe9b-4bd9-ac11-a3d6c9ad888e-kube-api-access-ntsgz\") pod \"community-operators-72ltv\" (UID: \"58068665-fe9b-4bd9-ac11-a3d6c9ad888e\") " pod="openshift-marketplace/community-operators-72ltv" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.715333 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58068665-fe9b-4bd9-ac11-a3d6c9ad888e-utilities\") pod \"community-operators-72ltv\" (UID: \"58068665-fe9b-4bd9-ac11-a3d6c9ad888e\") " pod="openshift-marketplace/community-operators-72ltv" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.715926 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58068665-fe9b-4bd9-ac11-a3d6c9ad888e-utilities\") pod \"community-operators-72ltv\" (UID: \"58068665-fe9b-4bd9-ac11-a3d6c9ad888e\") " pod="openshift-marketplace/community-operators-72ltv" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.723419 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bxkq4"] Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.724081 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58068665-fe9b-4bd9-ac11-a3d6c9ad888e-catalog-content\") pod \"community-operators-72ltv\" (UID: \"58068665-fe9b-4bd9-ac11-a3d6c9ad888e\") " pod="openshift-marketplace/community-operators-72ltv" Feb 02 10:33:43 crc kubenswrapper[4909]: E0202 10:33:43.724398 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:44.224379612 +0000 UTC m=+149.970480347 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.736575 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bxkq4" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.751344 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bxkq4"] Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.751531 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-9sjgx" podStartSLOduration=9.751508068 podStartE2EDuration="9.751508068s" podCreationTimestamp="2026-02-02 10:33:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:43.746631619 +0000 UTC m=+149.492732354" watchObservedRunningTime="2026-02-02 10:33:43.751508068 +0000 UTC m=+149.497608803" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.760383 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntsgz\" (UniqueName: \"kubernetes.io/projected/58068665-fe9b-4bd9-ac11-a3d6c9ad888e-kube-api-access-ntsgz\") pod \"community-operators-72ltv\" (UID: \"58068665-fe9b-4bd9-ac11-a3d6c9ad888e\") " pod="openshift-marketplace/community-operators-72ltv" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.795196 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.818706 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7986860a-5f33-47fb-af58-a2925c4572a4-catalog-content\") pod \"community-operators-bxkq4\" (UID: \"7986860a-5f33-47fb-af58-a2925c4572a4\") " pod="openshift-marketplace/community-operators-bxkq4" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.818758 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.818851 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7986860a-5f33-47fb-af58-a2925c4572a4-utilities\") pod \"community-operators-bxkq4\" (UID: \"7986860a-5f33-47fb-af58-a2925c4572a4\") " pod="openshift-marketplace/community-operators-bxkq4" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.818878 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw9mk\" (UniqueName: \"kubernetes.io/projected/7986860a-5f33-47fb-af58-a2925c4572a4-kube-api-access-tw9mk\") pod \"community-operators-bxkq4\" (UID: \"7986860a-5f33-47fb-af58-a2925c4572a4\") " pod="openshift-marketplace/community-operators-bxkq4" Feb 02 10:33:43 crc kubenswrapper[4909]: E0202 10:33:43.820634 4909 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:44.320620487 +0000 UTC m=+150.066721222 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.886038 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-72ltv" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.919976 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.920384 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7986860a-5f33-47fb-af58-a2925c4572a4-catalog-content\") pod \"community-operators-bxkq4\" (UID: \"7986860a-5f33-47fb-af58-a2925c4572a4\") " pod="openshift-marketplace/community-operators-bxkq4" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.920461 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7986860a-5f33-47fb-af58-a2925c4572a4-utilities\") pod \"community-operators-bxkq4\" (UID: 
\"7986860a-5f33-47fb-af58-a2925c4572a4\") " pod="openshift-marketplace/community-operators-bxkq4" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.920481 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw9mk\" (UniqueName: \"kubernetes.io/projected/7986860a-5f33-47fb-af58-a2925c4572a4-kube-api-access-tw9mk\") pod \"community-operators-bxkq4\" (UID: \"7986860a-5f33-47fb-af58-a2925c4572a4\") " pod="openshift-marketplace/community-operators-bxkq4" Feb 02 10:33:43 crc kubenswrapper[4909]: E0202 10:33:43.920847 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:44.420830977 +0000 UTC m=+150.166931712 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.921143 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7986860a-5f33-47fb-af58-a2925c4572a4-catalog-content\") pod \"community-operators-bxkq4\" (UID: \"7986860a-5f33-47fb-af58-a2925c4572a4\") " pod="openshift-marketplace/community-operators-bxkq4" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.921365 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7986860a-5f33-47fb-af58-a2925c4572a4-utilities\") pod 
\"community-operators-bxkq4\" (UID: \"7986860a-5f33-47fb-af58-a2925c4572a4\") " pod="openshift-marketplace/community-operators-bxkq4" Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.959837 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mlhl9"] Feb 02 10:33:43 crc kubenswrapper[4909]: I0202 10:33:43.978601 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw9mk\" (UniqueName: \"kubernetes.io/projected/7986860a-5f33-47fb-af58-a2925c4572a4-kube-api-access-tw9mk\") pod \"community-operators-bxkq4\" (UID: \"7986860a-5f33-47fb-af58-a2925c4572a4\") " pod="openshift-marketplace/community-operators-bxkq4" Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.025462 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:44 crc kubenswrapper[4909]: E0202 10:33:44.025948 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:44.525927806 +0000 UTC m=+150.272028541 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:44 crc kubenswrapper[4909]: W0202 10:33:44.029263 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7164d60d_218c_47e0_a74a_677793e589b0.slice/crio-09b5d6e00db532a19671a99815f400e9f9db763dd41c1b34fd494908bf963d7b WatchSource:0}: Error finding container 09b5d6e00db532a19671a99815f400e9f9db763dd41c1b34fd494908bf963d7b: Status 404 returned error can't find the container with id 09b5d6e00db532a19671a99815f400e9f9db763dd41c1b34fd494908bf963d7b Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.095025 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bxkq4" Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.129598 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:44 crc kubenswrapper[4909]: E0202 10:33:44.129834 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-02 10:33:44.629800741 +0000 UTC m=+150.375901476 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.129896 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:44 crc kubenswrapper[4909]: E0202 10:33:44.130160 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-02 10:33:44.630153261 +0000 UTC m=+150.376253996 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-slspb" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.185913 4909 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-02T10:33:43.592406573Z","Handler":null,"Name":""} Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.199116 4909 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.199156 4909 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.231928 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.244415 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: 
"8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.248727 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gwvb8"] Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.252254 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-c2btb" Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.315037 4909 patch_prober.go:28] interesting pod/router-default-5444994796-rtcqn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:33:44 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Feb 02 10:33:44 crc kubenswrapper[4909]: [+]process-running ok Feb 02 10:33:44 crc kubenswrapper[4909]: healthz check failed Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.315087 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rtcqn" podUID="b0544feb-caff-4805-8c58-bae454503fa0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.333109 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/50e06f1d-ae35-4177-afd0-55fc1112f0a7-config-volume\") pod \"50e06f1d-ae35-4177-afd0-55fc1112f0a7\" (UID: \"50e06f1d-ae35-4177-afd0-55fc1112f0a7\") " Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.333200 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgxl7\" (UniqueName: 
\"kubernetes.io/projected/50e06f1d-ae35-4177-afd0-55fc1112f0a7-kube-api-access-vgxl7\") pod \"50e06f1d-ae35-4177-afd0-55fc1112f0a7\" (UID: \"50e06f1d-ae35-4177-afd0-55fc1112f0a7\") " Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.333252 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/50e06f1d-ae35-4177-afd0-55fc1112f0a7-secret-volume\") pod \"50e06f1d-ae35-4177-afd0-55fc1112f0a7\" (UID: \"50e06f1d-ae35-4177-afd0-55fc1112f0a7\") " Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.333391 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.334561 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50e06f1d-ae35-4177-afd0-55fc1112f0a7-config-volume" (OuterVolumeSpecName: "config-volume") pod "50e06f1d-ae35-4177-afd0-55fc1112f0a7" (UID: "50e06f1d-ae35-4177-afd0-55fc1112f0a7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.353748 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50e06f1d-ae35-4177-afd0-55fc1112f0a7-kube-api-access-vgxl7" (OuterVolumeSpecName: "kube-api-access-vgxl7") pod "50e06f1d-ae35-4177-afd0-55fc1112f0a7" (UID: "50e06f1d-ae35-4177-afd0-55fc1112f0a7"). InnerVolumeSpecName "kube-api-access-vgxl7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.375131 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50e06f1d-ae35-4177-afd0-55fc1112f0a7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "50e06f1d-ae35-4177-afd0-55fc1112f0a7" (UID: "50e06f1d-ae35-4177-afd0-55fc1112f0a7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.435471 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgxl7\" (UniqueName: \"kubernetes.io/projected/50e06f1d-ae35-4177-afd0-55fc1112f0a7-kube-api-access-vgxl7\") on node \"crc\" DevicePath \"\"" Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.435505 4909 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/50e06f1d-ae35-4177-afd0-55fc1112f0a7-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.435517 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/50e06f1d-ae35-4177-afd0-55fc1112f0a7-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.457138 4909 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.457185 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.611490 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.623490 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-72ltv"] Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.726321 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-slspb\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") " pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.761420 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"7c3464c55ab4a8466c657348f58bc578c03a6c4c82e02ac53b4a0b1b90ed13cb"} Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.761466 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9f3faa533eae5a7624c799827b39de24e68440a21f89cf9d6445d590bc491015"} Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.769173 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-72ltv" event={"ID":"58068665-fe9b-4bd9-ac11-a3d6c9ad888e","Type":"ContainerStarted","Data":"7ce4853e060f2381f0c12ffd62d69c2294355588172203b587da8db6b890e5cf"} Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.770410 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-c2btb" event={"ID":"50e06f1d-ae35-4177-afd0-55fc1112f0a7","Type":"ContainerDied","Data":"126605d80c8022267bcac88bf24e8306217321d268c2ee1fd8f6fd716e526ed4"} Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.770432 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="126605d80c8022267bcac88bf24e8306217321d268c2ee1fd8f6fd716e526ed4" Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.770497 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500470-c2btb" Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.776932 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c6950001-fb3a-457a-82f7-4c0cb20a876a","Type":"ContainerStarted","Data":"5ea14b10f05bbad9ac8a1c81e472c4e2ee614fbc4cf8e316005dd8d1494b029e"} Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.791677 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c16ac531808aae0bb249ca1fbbea16288d3cda9eb21b0e69b8d1c9312cb790d2"} Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.791752 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c3694922d82cde768df25ad2133c4aacd888b4d9fc26ce4bbf70bc82a65e72f3"} Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.792190 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.794763 4909 generic.go:334] "Generic (PLEG): container finished" podID="7164d60d-218c-47e0-a74a-677793e589b0" containerID="6b75b836db82aca720c06f2c34d41058856e207d97d5f8462a43887cf1d653fd" exitCode=0 Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.795435 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlhl9" event={"ID":"7164d60d-218c-47e0-a74a-677793e589b0","Type":"ContainerDied","Data":"6b75b836db82aca720c06f2c34d41058856e207d97d5f8462a43887cf1d653fd"} Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.795460 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-mlhl9" event={"ID":"7164d60d-218c-47e0-a74a-677793e589b0","Type":"ContainerStarted","Data":"09b5d6e00db532a19671a99815f400e9f9db763dd41c1b34fd494908bf963d7b"} Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.796276 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bxkq4"] Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.800381 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.805554 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"534ff3a8bb0f41a0a7e52a7633434d1b4117eef65894937415658a0f8101ef53"} Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.805601 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"8648be5b09c52d6869f4ee98acd9a81df426b94c89353501a036141adbd1aa38"} Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.814982 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwvb8" event={"ID":"62300332-ecea-47ea-9809-6cc89e9593bf","Type":"ContainerStarted","Data":"059509f5569ee76ec766d592f7902d5d3903a54858eee07a453a48f47469f8ac"} Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.815018 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwvb8" event={"ID":"62300332-ecea-47ea-9809-6cc89e9593bf","Type":"ContainerStarted","Data":"1a75680fe74618edb29fa32d4126a337774008ec0a6c95997c21076547980bf0"} Feb 02 10:33:44 crc kubenswrapper[4909]: W0202 10:33:44.835492 4909 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7986860a_5f33_47fb_af58_a2925c4572a4.slice/crio-cfc8a81cdbddec4918fc6243e193940dc129a149cd7269f00198e88901d4725d WatchSource:0}: Error finding container cfc8a81cdbddec4918fc6243e193940dc129a149cd7269f00198e88901d4725d: Status 404 returned error can't find the container with id cfc8a81cdbddec4918fc6243e193940dc129a149cd7269f00198e88901d4725d Feb 02 10:33:44 crc kubenswrapper[4909]: I0202 10:33:44.997215 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.030278 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.300550 4909 patch_prober.go:28] interesting pod/router-default-5444994796-rtcqn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:33:45 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Feb 02 10:33:45 crc kubenswrapper[4909]: [+]process-running ok Feb 02 10:33:45 crc kubenswrapper[4909]: healthz check failed Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.300907 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rtcqn" podUID="b0544feb-caff-4805-8c58-bae454503fa0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.305722 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-slspb"] Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.317440 4909 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-marketplace-vn7ln"] Feb 02 10:33:45 crc kubenswrapper[4909]: E0202 10:33:45.317628 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50e06f1d-ae35-4177-afd0-55fc1112f0a7" containerName="collect-profiles" Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.317639 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="50e06f1d-ae35-4177-afd0-55fc1112f0a7" containerName="collect-profiles" Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.317743 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="50e06f1d-ae35-4177-afd0-55fc1112f0a7" containerName="collect-profiles" Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.318827 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vn7ln" Feb 02 10:33:45 crc kubenswrapper[4909]: W0202 10:33:45.325129 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e75bc1a_54b9_4897_9da5_0a04a1d952cf.slice/crio-c84f1af740a7a9fc310fc4d6ca91881d1a08c31155b9950c171aa5d8d311232e WatchSource:0}: Error finding container c84f1af740a7a9fc310fc4d6ca91881d1a08c31155b9950c171aa5d8d311232e: Status 404 returned error can't find the container with id c84f1af740a7a9fc310fc4d6ca91881d1a08c31155b9950c171aa5d8d311232e Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.326393 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.334901 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vn7ln"] Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.471452 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx6mt\" (UniqueName: 
\"kubernetes.io/projected/9483225f-edd3-4728-8e95-67f872692af9-kube-api-access-lx6mt\") pod \"redhat-marketplace-vn7ln\" (UID: \"9483225f-edd3-4728-8e95-67f872692af9\") " pod="openshift-marketplace/redhat-marketplace-vn7ln" Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.471531 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9483225f-edd3-4728-8e95-67f872692af9-catalog-content\") pod \"redhat-marketplace-vn7ln\" (UID: \"9483225f-edd3-4728-8e95-67f872692af9\") " pod="openshift-marketplace/redhat-marketplace-vn7ln" Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.471554 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9483225f-edd3-4728-8e95-67f872692af9-utilities\") pod \"redhat-marketplace-vn7ln\" (UID: \"9483225f-edd3-4728-8e95-67f872692af9\") " pod="openshift-marketplace/redhat-marketplace-vn7ln" Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.573092 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx6mt\" (UniqueName: \"kubernetes.io/projected/9483225f-edd3-4728-8e95-67f872692af9-kube-api-access-lx6mt\") pod \"redhat-marketplace-vn7ln\" (UID: \"9483225f-edd3-4728-8e95-67f872692af9\") " pod="openshift-marketplace/redhat-marketplace-vn7ln" Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.573156 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9483225f-edd3-4728-8e95-67f872692af9-catalog-content\") pod \"redhat-marketplace-vn7ln\" (UID: \"9483225f-edd3-4728-8e95-67f872692af9\") " pod="openshift-marketplace/redhat-marketplace-vn7ln" Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.573187 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/9483225f-edd3-4728-8e95-67f872692af9-utilities\") pod \"redhat-marketplace-vn7ln\" (UID: \"9483225f-edd3-4728-8e95-67f872692af9\") " pod="openshift-marketplace/redhat-marketplace-vn7ln" Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.573741 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9483225f-edd3-4728-8e95-67f872692af9-utilities\") pod \"redhat-marketplace-vn7ln\" (UID: \"9483225f-edd3-4728-8e95-67f872692af9\") " pod="openshift-marketplace/redhat-marketplace-vn7ln" Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.575057 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9483225f-edd3-4728-8e95-67f872692af9-catalog-content\") pod \"redhat-marketplace-vn7ln\" (UID: \"9483225f-edd3-4728-8e95-67f872692af9\") " pod="openshift-marketplace/redhat-marketplace-vn7ln" Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.595631 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx6mt\" (UniqueName: \"kubernetes.io/projected/9483225f-edd3-4728-8e95-67f872692af9-kube-api-access-lx6mt\") pod \"redhat-marketplace-vn7ln\" (UID: \"9483225f-edd3-4728-8e95-67f872692af9\") " pod="openshift-marketplace/redhat-marketplace-vn7ln" Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.636588 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vn7ln" Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.713866 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fgxpg"] Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.715072 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fgxpg" Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.732372 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgxpg"] Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.847170 4909 generic.go:334] "Generic (PLEG): container finished" podID="62300332-ecea-47ea-9809-6cc89e9593bf" containerID="059509f5569ee76ec766d592f7902d5d3903a54858eee07a453a48f47469f8ac" exitCode=0 Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.847233 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwvb8" event={"ID":"62300332-ecea-47ea-9809-6cc89e9593bf","Type":"ContainerDied","Data":"059509f5569ee76ec766d592f7902d5d3903a54858eee07a453a48f47469f8ac"} Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.870571 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-slspb" event={"ID":"6e75bc1a-54b9-4897-9da5-0a04a1d952cf","Type":"ContainerStarted","Data":"537045191ecb95920e1a42ac027fe9a1e92efd58f52bf485d065d443e471b70a"} Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.870621 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-slspb" event={"ID":"6e75bc1a-54b9-4897-9da5-0a04a1d952cf","Type":"ContainerStarted","Data":"c84f1af740a7a9fc310fc4d6ca91881d1a08c31155b9950c171aa5d8d311232e"} Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.870672 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.873392 4909 generic.go:334] "Generic (PLEG): container finished" podID="58068665-fe9b-4bd9-ac11-a3d6c9ad888e" containerID="ff969be085ae8d43c0f5ab045162c9554085bfd5d2d418f2e8787d3210432184" exitCode=0 Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 
10:33:45.873447 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-72ltv" event={"ID":"58068665-fe9b-4bd9-ac11-a3d6c9ad888e","Type":"ContainerDied","Data":"ff969be085ae8d43c0f5ab045162c9554085bfd5d2d418f2e8787d3210432184"} Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.877152 4909 generic.go:334] "Generic (PLEG): container finished" podID="7986860a-5f33-47fb-af58-a2925c4572a4" containerID="084647b93fb11309f8a05aef3463e8fc8476127d5db1e02be4f24bc9b0432307" exitCode=0 Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.877208 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxkq4" event={"ID":"7986860a-5f33-47fb-af58-a2925c4572a4","Type":"ContainerDied","Data":"084647b93fb11309f8a05aef3463e8fc8476127d5db1e02be4f24bc9b0432307"} Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.877231 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxkq4" event={"ID":"7986860a-5f33-47fb-af58-a2925c4572a4","Type":"ContainerStarted","Data":"cfc8a81cdbddec4918fc6243e193940dc129a149cd7269f00198e88901d4725d"} Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.883953 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9818ea74-330d-4bd0-8931-91fef529ef29-utilities\") pod \"redhat-marketplace-fgxpg\" (UID: \"9818ea74-330d-4bd0-8931-91fef529ef29\") " pod="openshift-marketplace/redhat-marketplace-fgxpg" Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.884068 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9818ea74-330d-4bd0-8931-91fef529ef29-catalog-content\") pod \"redhat-marketplace-fgxpg\" (UID: \"9818ea74-330d-4bd0-8931-91fef529ef29\") " pod="openshift-marketplace/redhat-marketplace-fgxpg" Feb 02 
10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.884104 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnp7n\" (UniqueName: \"kubernetes.io/projected/9818ea74-330d-4bd0-8931-91fef529ef29-kube-api-access-fnp7n\") pod \"redhat-marketplace-fgxpg\" (UID: \"9818ea74-330d-4bd0-8931-91fef529ef29\") " pod="openshift-marketplace/redhat-marketplace-fgxpg" Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.889235 4909 generic.go:334] "Generic (PLEG): container finished" podID="c6950001-fb3a-457a-82f7-4c0cb20a876a" containerID="680b0e2bf8014ed40bbacacfb75317674e32988085c4c1bf1f4f7364f41647ff" exitCode=0 Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.891093 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c6950001-fb3a-457a-82f7-4c0cb20a876a","Type":"ContainerDied","Data":"680b0e2bf8014ed40bbacacfb75317674e32988085c4c1bf1f4f7364f41647ff"} Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.910657 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-slspb" podStartSLOduration=130.910641303 podStartE2EDuration="2m10.910641303s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:33:45.906662079 +0000 UTC m=+151.652762814" watchObservedRunningTime="2026-02-02 10:33:45.910641303 +0000 UTC m=+151.656742038" Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.963965 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vn7ln"] Feb 02 10:33:45 crc kubenswrapper[4909]: W0202 10:33:45.984326 4909 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9483225f_edd3_4728_8e95_67f872692af9.slice/crio-8dd5b04ea166b12c4f995cffeb401200d60e383f8a7e80d7f52ce5cfea93ae7f WatchSource:0}: Error finding container 8dd5b04ea166b12c4f995cffeb401200d60e383f8a7e80d7f52ce5cfea93ae7f: Status 404 returned error can't find the container with id 8dd5b04ea166b12c4f995cffeb401200d60e383f8a7e80d7f52ce5cfea93ae7f Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.984987 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9818ea74-330d-4bd0-8931-91fef529ef29-utilities\") pod \"redhat-marketplace-fgxpg\" (UID: \"9818ea74-330d-4bd0-8931-91fef529ef29\") " pod="openshift-marketplace/redhat-marketplace-fgxpg" Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.985034 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9818ea74-330d-4bd0-8931-91fef529ef29-catalog-content\") pod \"redhat-marketplace-fgxpg\" (UID: \"9818ea74-330d-4bd0-8931-91fef529ef29\") " pod="openshift-marketplace/redhat-marketplace-fgxpg" Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.985050 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnp7n\" (UniqueName: \"kubernetes.io/projected/9818ea74-330d-4bd0-8931-91fef529ef29-kube-api-access-fnp7n\") pod \"redhat-marketplace-fgxpg\" (UID: \"9818ea74-330d-4bd0-8931-91fef529ef29\") " pod="openshift-marketplace/redhat-marketplace-fgxpg" Feb 02 10:33:45 crc kubenswrapper[4909]: I0202 10:33:45.986312 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9818ea74-330d-4bd0-8931-91fef529ef29-catalog-content\") pod \"redhat-marketplace-fgxpg\" (UID: \"9818ea74-330d-4bd0-8931-91fef529ef29\") " pod="openshift-marketplace/redhat-marketplace-fgxpg" Feb 02 10:33:45 
crc kubenswrapper[4909]: I0202 10:33:45.986466 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9818ea74-330d-4bd0-8931-91fef529ef29-utilities\") pod \"redhat-marketplace-fgxpg\" (UID: \"9818ea74-330d-4bd0-8931-91fef529ef29\") " pod="openshift-marketplace/redhat-marketplace-fgxpg" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.019286 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnp7n\" (UniqueName: \"kubernetes.io/projected/9818ea74-330d-4bd0-8931-91fef529ef29-kube-api-access-fnp7n\") pod \"redhat-marketplace-fgxpg\" (UID: \"9818ea74-330d-4bd0-8931-91fef529ef29\") " pod="openshift-marketplace/redhat-marketplace-fgxpg" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.048322 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fgxpg" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.297601 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgxpg"] Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.298996 4909 patch_prober.go:28] interesting pod/router-default-5444994796-rtcqn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:33:46 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Feb 02 10:33:46 crc kubenswrapper[4909]: [+]process-running ok Feb 02 10:33:46 crc kubenswrapper[4909]: healthz check failed Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.299061 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rtcqn" podUID="b0544feb-caff-4805-8c58-bae454503fa0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 
10:33:46.319275 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2qpc6"] Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.320470 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2qpc6" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.323145 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.329084 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2qpc6"] Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.362115 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-knpc2" Feb 02 10:33:46 crc kubenswrapper[4909]: W0202 10:33:46.362211 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9818ea74_330d_4bd0_8931_91fef529ef29.slice/crio-1e1ca68921ab9b028a5530519dd4740da4d66013789a7b4840dda337d37402dc WatchSource:0}: Error finding container 1e1ca68921ab9b028a5530519dd4740da4d66013789a7b4840dda337d37402dc: Status 404 returned error can't find the container with id 1e1ca68921ab9b028a5530519dd4740da4d66013789a7b4840dda337d37402dc Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.367615 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-knpc2" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.391347 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8422q\" (UniqueName: \"kubernetes.io/projected/20869d58-911c-44ab-8f33-07ffc1056b3b-kube-api-access-8422q\") pod \"redhat-operators-2qpc6\" (UID: \"20869d58-911c-44ab-8f33-07ffc1056b3b\") " 
pod="openshift-marketplace/redhat-operators-2qpc6" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.391445 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20869d58-911c-44ab-8f33-07ffc1056b3b-utilities\") pod \"redhat-operators-2qpc6\" (UID: \"20869d58-911c-44ab-8f33-07ffc1056b3b\") " pod="openshift-marketplace/redhat-operators-2qpc6" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.391693 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20869d58-911c-44ab-8f33-07ffc1056b3b-catalog-content\") pod \"redhat-operators-2qpc6\" (UID: \"20869d58-911c-44ab-8f33-07ffc1056b3b\") " pod="openshift-marketplace/redhat-operators-2qpc6" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.492710 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8422q\" (UniqueName: \"kubernetes.io/projected/20869d58-911c-44ab-8f33-07ffc1056b3b-kube-api-access-8422q\") pod \"redhat-operators-2qpc6\" (UID: \"20869d58-911c-44ab-8f33-07ffc1056b3b\") " pod="openshift-marketplace/redhat-operators-2qpc6" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.492756 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20869d58-911c-44ab-8f33-07ffc1056b3b-utilities\") pod \"redhat-operators-2qpc6\" (UID: \"20869d58-911c-44ab-8f33-07ffc1056b3b\") " pod="openshift-marketplace/redhat-operators-2qpc6" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.492786 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20869d58-911c-44ab-8f33-07ffc1056b3b-catalog-content\") pod \"redhat-operators-2qpc6\" (UID: \"20869d58-911c-44ab-8f33-07ffc1056b3b\") " 
pod="openshift-marketplace/redhat-operators-2qpc6" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.499135 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20869d58-911c-44ab-8f33-07ffc1056b3b-utilities\") pod \"redhat-operators-2qpc6\" (UID: \"20869d58-911c-44ab-8f33-07ffc1056b3b\") " pod="openshift-marketplace/redhat-operators-2qpc6" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.499157 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20869d58-911c-44ab-8f33-07ffc1056b3b-catalog-content\") pod \"redhat-operators-2qpc6\" (UID: \"20869d58-911c-44ab-8f33-07ffc1056b3b\") " pod="openshift-marketplace/redhat-operators-2qpc6" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.500477 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-thqss" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.501108 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-thqss" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.503590 4909 patch_prober.go:28] interesting pod/console-f9d7485db-thqss container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.503631 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-thqss" podUID="e6595d49-3b53-44fc-a253-a252a53333a2" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.532701 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8422q\" (UniqueName: \"kubernetes.io/projected/20869d58-911c-44ab-8f33-07ffc1056b3b-kube-api-access-8422q\") pod \"redhat-operators-2qpc6\" (UID: \"20869d58-911c-44ab-8f33-07ffc1056b3b\") " pod="openshift-marketplace/redhat-operators-2qpc6" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.539510 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.554431 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q62sq" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.561168 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-64hf5" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.674199 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2qpc6" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.688675 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.725490 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-x94kg"] Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.726621 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x94kg" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.755302 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x94kg"] Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.840828 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6smm4\" (UniqueName: \"kubernetes.io/projected/65681cb4-6a6f-4fce-8322-b6efffeecc78-kube-api-access-6smm4\") pod \"redhat-operators-x94kg\" (UID: \"65681cb4-6a6f-4fce-8322-b6efffeecc78\") " pod="openshift-marketplace/redhat-operators-x94kg" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.841457 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65681cb4-6a6f-4fce-8322-b6efffeecc78-utilities\") pod \"redhat-operators-x94kg\" (UID: \"65681cb4-6a6f-4fce-8322-b6efffeecc78\") " pod="openshift-marketplace/redhat-operators-x94kg" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.841542 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65681cb4-6a6f-4fce-8322-b6efffeecc78-catalog-content\") pod \"redhat-operators-x94kg\" (UID: \"65681cb4-6a6f-4fce-8322-b6efffeecc78\") " pod="openshift-marketplace/redhat-operators-x94kg" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.905219 4909 generic.go:334] "Generic (PLEG): container finished" podID="9483225f-edd3-4728-8e95-67f872692af9" containerID="df52aced6438fbde03f7ed1f02ce7aeedd49844457c5a0459216293dcbd47f52" exitCode=0 Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.905306 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vn7ln" 
event={"ID":"9483225f-edd3-4728-8e95-67f872692af9","Type":"ContainerDied","Data":"df52aced6438fbde03f7ed1f02ce7aeedd49844457c5a0459216293dcbd47f52"} Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.905349 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vn7ln" event={"ID":"9483225f-edd3-4728-8e95-67f872692af9","Type":"ContainerStarted","Data":"8dd5b04ea166b12c4f995cffeb401200d60e383f8a7e80d7f52ce5cfea93ae7f"} Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.914001 4909 generic.go:334] "Generic (PLEG): container finished" podID="9818ea74-330d-4bd0-8931-91fef529ef29" containerID="56173d3130fabdeeadf5dd2d8ae21b05a9fc8793edbf092d33b09ecc96044a42" exitCode=0 Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.915589 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgxpg" event={"ID":"9818ea74-330d-4bd0-8931-91fef529ef29","Type":"ContainerDied","Data":"56173d3130fabdeeadf5dd2d8ae21b05a9fc8793edbf092d33b09ecc96044a42"} Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.915629 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgxpg" event={"ID":"9818ea74-330d-4bd0-8931-91fef529ef29","Type":"ContainerStarted","Data":"1e1ca68921ab9b028a5530519dd4740da4d66013789a7b4840dda337d37402dc"} Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.947448 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65681cb4-6a6f-4fce-8322-b6efffeecc78-catalog-content\") pod \"redhat-operators-x94kg\" (UID: \"65681cb4-6a6f-4fce-8322-b6efffeecc78\") " pod="openshift-marketplace/redhat-operators-x94kg" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.947647 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6smm4\" (UniqueName: 
\"kubernetes.io/projected/65681cb4-6a6f-4fce-8322-b6efffeecc78-kube-api-access-6smm4\") pod \"redhat-operators-x94kg\" (UID: \"65681cb4-6a6f-4fce-8322-b6efffeecc78\") " pod="openshift-marketplace/redhat-operators-x94kg" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.947853 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65681cb4-6a6f-4fce-8322-b6efffeecc78-utilities\") pod \"redhat-operators-x94kg\" (UID: \"65681cb4-6a6f-4fce-8322-b6efffeecc78\") " pod="openshift-marketplace/redhat-operators-x94kg" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.948674 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65681cb4-6a6f-4fce-8322-b6efffeecc78-catalog-content\") pod \"redhat-operators-x94kg\" (UID: \"65681cb4-6a6f-4fce-8322-b6efffeecc78\") " pod="openshift-marketplace/redhat-operators-x94kg" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.949886 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65681cb4-6a6f-4fce-8322-b6efffeecc78-utilities\") pod \"redhat-operators-x94kg\" (UID: \"65681cb4-6a6f-4fce-8322-b6efffeecc78\") " pod="openshift-marketplace/redhat-operators-x94kg" Feb 02 10:33:46 crc kubenswrapper[4909]: I0202 10:33:46.974840 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6smm4\" (UniqueName: \"kubernetes.io/projected/65681cb4-6a6f-4fce-8322-b6efffeecc78-kube-api-access-6smm4\") pod \"redhat-operators-x94kg\" (UID: \"65681cb4-6a6f-4fce-8322-b6efffeecc78\") " pod="openshift-marketplace/redhat-operators-x94kg" Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:46.999445 4909 patch_prober.go:28] interesting pod/downloads-7954f5f757-gprz6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get 
\"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:46.999517 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-gprz6" podUID="b85dc0d9-6b5b-4254-bd4b-a99712b8a1fb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:46.999460 4909 patch_prober.go:28] interesting pod/downloads-7954f5f757-gprz6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:46.999588 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gprz6" podUID="b85dc0d9-6b5b-4254-bd4b-a99712b8a1fb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.012864 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.013517 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.015736 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.015993 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.056077 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.102721 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x94kg" Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.150145 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76a6d60a-1322-4aaf-bdf1-e85bce501ed7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"76a6d60a-1322-4aaf-bdf1-e85bce501ed7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.150565 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/76a6d60a-1322-4aaf-bdf1-e85bce501ed7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"76a6d60a-1322-4aaf-bdf1-e85bce501ed7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.252258 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/76a6d60a-1322-4aaf-bdf1-e85bce501ed7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"76a6d60a-1322-4aaf-bdf1-e85bce501ed7\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.252388 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76a6d60a-1322-4aaf-bdf1-e85bce501ed7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"76a6d60a-1322-4aaf-bdf1-e85bce501ed7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.252482 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76a6d60a-1322-4aaf-bdf1-e85bce501ed7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"76a6d60a-1322-4aaf-bdf1-e85bce501ed7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.260389 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.279042 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/76a6d60a-1322-4aaf-bdf1-e85bce501ed7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"76a6d60a-1322-4aaf-bdf1-e85bce501ed7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.293900 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-rtcqn" Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.303583 4909 patch_prober.go:28] interesting pod/router-default-5444994796-rtcqn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 02 10:33:47 crc kubenswrapper[4909]: [-]has-synced failed: reason withheld Feb 02 10:33:47 
crc kubenswrapper[4909]: [+]process-running ok Feb 02 10:33:47 crc kubenswrapper[4909]: healthz check failed Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.303674 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-rtcqn" podUID="b0544feb-caff-4805-8c58-bae454503fa0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.334225 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.337973 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2qpc6"] Feb 02 10:33:47 crc kubenswrapper[4909]: W0202 10:33:47.363652 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20869d58_911c_44ab_8f33_07ffc1056b3b.slice/crio-67b8c45b75d900ca86b2984496e2e08af3b612ac38275e0b8d9af0b8d7cc9837 WatchSource:0}: Error finding container 67b8c45b75d900ca86b2984496e2e08af3b612ac38275e0b8d9af0b8d7cc9837: Status 404 returned error can't find the container with id 67b8c45b75d900ca86b2984496e2e08af3b612ac38275e0b8d9af0b8d7cc9837 Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.399269 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gvxk8" Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.454919 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c6950001-fb3a-457a-82f7-4c0cb20a876a-kubelet-dir\") pod \"c6950001-fb3a-457a-82f7-4c0cb20a876a\" (UID: \"c6950001-fb3a-457a-82f7-4c0cb20a876a\") " Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.454988 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6950001-fb3a-457a-82f7-4c0cb20a876a-kube-api-access\") pod \"c6950001-fb3a-457a-82f7-4c0cb20a876a\" (UID: \"c6950001-fb3a-457a-82f7-4c0cb20a876a\") " Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.455089 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c6950001-fb3a-457a-82f7-4c0cb20a876a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c6950001-fb3a-457a-82f7-4c0cb20a876a" (UID: "c6950001-fb3a-457a-82f7-4c0cb20a876a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.456430 4909 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c6950001-fb3a-457a-82f7-4c0cb20a876a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.482072 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6950001-fb3a-457a-82f7-4c0cb20a876a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c6950001-fb3a-457a-82f7-4c0cb20a876a" (UID: "c6950001-fb3a-457a-82f7-4c0cb20a876a"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.494001 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-nt7q4" Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.557191 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6950001-fb3a-457a-82f7-4c0cb20a876a-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.775529 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x94kg"] Feb 02 10:33:47 crc kubenswrapper[4909]: W0202 10:33:47.822017 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65681cb4_6a6f_4fce_8322_b6efffeecc78.slice/crio-f15a8822628645b4851610f112d8d76612fa8cf859599f7702659f914ec7d170 WatchSource:0}: Error finding container f15a8822628645b4851610f112d8d76612fa8cf859599f7702659f914ec7d170: Status 404 returned error can't find the container with id f15a8822628645b4851610f112d8d76612fa8cf859599f7702659f914ec7d170 Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.879429 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 02 10:33:47 crc kubenswrapper[4909]: W0202 10:33:47.886308 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod76a6d60a_1322_4aaf_bdf1_e85bce501ed7.slice/crio-0925e14d3a3fc14846bcde4d9f99d302bd6a68268e1546e6b48dac1e4730c4dd WatchSource:0}: Error finding container 0925e14d3a3fc14846bcde4d9f99d302bd6a68268e1546e6b48dac1e4730c4dd: Status 404 returned error can't find the container with id 0925e14d3a3fc14846bcde4d9f99d302bd6a68268e1546e6b48dac1e4730c4dd Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.924685 4909 generic.go:334] 
"Generic (PLEG): container finished" podID="20869d58-911c-44ab-8f33-07ffc1056b3b" containerID="8dd68818e21d052db74034a7b1ee30eb2a6a632a6c638f91cd8b059f576b004e" exitCode=0 Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.924741 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qpc6" event={"ID":"20869d58-911c-44ab-8f33-07ffc1056b3b","Type":"ContainerDied","Data":"8dd68818e21d052db74034a7b1ee30eb2a6a632a6c638f91cd8b059f576b004e"} Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.924766 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qpc6" event={"ID":"20869d58-911c-44ab-8f33-07ffc1056b3b","Type":"ContainerStarted","Data":"67b8c45b75d900ca86b2984496e2e08af3b612ac38275e0b8d9af0b8d7cc9837"} Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.929914 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x94kg" event={"ID":"65681cb4-6a6f-4fce-8322-b6efffeecc78","Type":"ContainerStarted","Data":"f15a8822628645b4851610f112d8d76612fa8cf859599f7702659f914ec7d170"} Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.944853 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"76a6d60a-1322-4aaf-bdf1-e85bce501ed7","Type":"ContainerStarted","Data":"0925e14d3a3fc14846bcde4d9f99d302bd6a68268e1546e6b48dac1e4730c4dd"} Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.952567 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c6950001-fb3a-457a-82f7-4c0cb20a876a","Type":"ContainerDied","Data":"5ea14b10f05bbad9ac8a1c81e472c4e2ee614fbc4cf8e316005dd8d1494b029e"} Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.952602 4909 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="5ea14b10f05bbad9ac8a1c81e472c4e2ee614fbc4cf8e316005dd8d1494b029e" Feb 02 10:33:47 crc kubenswrapper[4909]: I0202 10:33:47.952648 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 02 10:33:48 crc kubenswrapper[4909]: I0202 10:33:48.299308 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-rtcqn" Feb 02 10:33:48 crc kubenswrapper[4909]: I0202 10:33:48.304369 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-rtcqn" Feb 02 10:33:48 crc kubenswrapper[4909]: I0202 10:33:48.973348 4909 generic.go:334] "Generic (PLEG): container finished" podID="65681cb4-6a6f-4fce-8322-b6efffeecc78" containerID="05f03a6492746357c30e6cd225e8e444732a077b005d4f5647476b3edff54d2c" exitCode=0 Feb 02 10:33:48 crc kubenswrapper[4909]: I0202 10:33:48.974209 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x94kg" event={"ID":"65681cb4-6a6f-4fce-8322-b6efffeecc78","Type":"ContainerDied","Data":"05f03a6492746357c30e6cd225e8e444732a077b005d4f5647476b3edff54d2c"} Feb 02 10:33:48 crc kubenswrapper[4909]: I0202 10:33:48.983599 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"76a6d60a-1322-4aaf-bdf1-e85bce501ed7","Type":"ContainerStarted","Data":"229d3f45404869a39f465080f0cbcfaa8b1fe05f3aadfc3b785c972a70f4f137"} Feb 02 10:33:49 crc kubenswrapper[4909]: I0202 10:33:49.031748 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.031728213 podStartE2EDuration="2.031728213s" podCreationTimestamp="2026-02-02 10:33:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-02 10:33:49.026011639 +0000 UTC m=+154.772112374" watchObservedRunningTime="2026-02-02 10:33:49.031728213 +0000 UTC m=+154.777828948" Feb 02 10:33:49 crc kubenswrapper[4909]: I0202 10:33:49.477543 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jp9hd" Feb 02 10:33:49 crc kubenswrapper[4909]: I0202 10:33:49.512191 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:33:49 crc kubenswrapper[4909]: I0202 10:33:49.512260 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:33:50 crc kubenswrapper[4909]: I0202 10:33:50.021586 4909 generic.go:334] "Generic (PLEG): container finished" podID="76a6d60a-1322-4aaf-bdf1-e85bce501ed7" containerID="229d3f45404869a39f465080f0cbcfaa8b1fe05f3aadfc3b785c972a70f4f137" exitCode=0 Feb 02 10:33:50 crc kubenswrapper[4909]: I0202 10:33:50.021634 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"76a6d60a-1322-4aaf-bdf1-e85bce501ed7","Type":"ContainerDied","Data":"229d3f45404869a39f465080f0cbcfaa8b1fe05f3aadfc3b785c972a70f4f137"} Feb 02 10:33:51 crc kubenswrapper[4909]: I0202 10:33:51.286054 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:33:51 crc kubenswrapper[4909]: I0202 10:33:51.428953 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76a6d60a-1322-4aaf-bdf1-e85bce501ed7-kubelet-dir\") pod \"76a6d60a-1322-4aaf-bdf1-e85bce501ed7\" (UID: \"76a6d60a-1322-4aaf-bdf1-e85bce501ed7\") " Feb 02 10:33:51 crc kubenswrapper[4909]: I0202 10:33:51.429083 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/76a6d60a-1322-4aaf-bdf1-e85bce501ed7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "76a6d60a-1322-4aaf-bdf1-e85bce501ed7" (UID: "76a6d60a-1322-4aaf-bdf1-e85bce501ed7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:33:51 crc kubenswrapper[4909]: I0202 10:33:51.429417 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/76a6d60a-1322-4aaf-bdf1-e85bce501ed7-kube-api-access\") pod \"76a6d60a-1322-4aaf-bdf1-e85bce501ed7\" (UID: \"76a6d60a-1322-4aaf-bdf1-e85bce501ed7\") " Feb 02 10:33:51 crc kubenswrapper[4909]: I0202 10:33:51.429632 4909 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76a6d60a-1322-4aaf-bdf1-e85bce501ed7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 02 10:33:51 crc kubenswrapper[4909]: I0202 10:33:51.438444 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76a6d60a-1322-4aaf-bdf1-e85bce501ed7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "76a6d60a-1322-4aaf-bdf1-e85bce501ed7" (UID: "76a6d60a-1322-4aaf-bdf1-e85bce501ed7"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:33:51 crc kubenswrapper[4909]: I0202 10:33:51.530424 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/76a6d60a-1322-4aaf-bdf1-e85bce501ed7-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 02 10:33:52 crc kubenswrapper[4909]: I0202 10:33:52.050127 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"76a6d60a-1322-4aaf-bdf1-e85bce501ed7","Type":"ContainerDied","Data":"0925e14d3a3fc14846bcde4d9f99d302bd6a68268e1546e6b48dac1e4730c4dd"} Feb 02 10:33:52 crc kubenswrapper[4909]: I0202 10:33:52.050178 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0925e14d3a3fc14846bcde4d9f99d302bd6a68268e1546e6b48dac1e4730c4dd" Feb 02 10:33:52 crc kubenswrapper[4909]: I0202 10:33:52.050240 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 02 10:33:56 crc kubenswrapper[4909]: I0202 10:33:56.517656 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-thqss" Feb 02 10:33:56 crc kubenswrapper[4909]: I0202 10:33:56.522534 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-thqss" Feb 02 10:33:56 crc kubenswrapper[4909]: I0202 10:33:56.998856 4909 patch_prober.go:28] interesting pod/downloads-7954f5f757-gprz6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Feb 02 10:33:56 crc kubenswrapper[4909]: I0202 10:33:56.998858 4909 patch_prober.go:28] interesting pod/downloads-7954f5f757-gprz6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get 
\"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Feb 02 10:33:56 crc kubenswrapper[4909]: I0202 10:33:56.998923 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-gprz6" podUID="b85dc0d9-6b5b-4254-bd4b-a99712b8a1fb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Feb 02 10:33:56 crc kubenswrapper[4909]: I0202 10:33:56.998923 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-gprz6" podUID="b85dc0d9-6b5b-4254-bd4b-a99712b8a1fb" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.35:8080/\": dial tcp 10.217.0.35:8080: connect: connection refused" Feb 02 10:33:57 crc kubenswrapper[4909]: I0202 10:33:57.238655 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f457793-f4e0-4417-ae91-4455722372c1-metrics-certs\") pod \"network-metrics-daemon-2v5vw\" (UID: \"0f457793-f4e0-4417-ae91-4455722372c1\") " pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:33:57 crc kubenswrapper[4909]: I0202 10:33:57.682911 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f457793-f4e0-4417-ae91-4455722372c1-metrics-certs\") pod \"network-metrics-daemon-2v5vw\" (UID: \"0f457793-f4e0-4417-ae91-4455722372c1\") " pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:33:57 crc kubenswrapper[4909]: I0202 10:33:57.731554 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2v5vw" Feb 02 10:34:02 crc kubenswrapper[4909]: I0202 10:34:02.523363 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mwjvx"] Feb 02 10:34:02 crc kubenswrapper[4909]: I0202 10:34:02.524049 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx" podUID="29fc305d-0c7f-4ef9-8ae2-6534374de1ea" containerName="controller-manager" containerID="cri-o://6d0292200ab0f3bdf2319652ac825f16659c360bb2c2eab7dff4c3d8d1c9c0d3" gracePeriod=30 Feb 02 10:34:02 crc kubenswrapper[4909]: I0202 10:34:02.554010 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj"] Feb 02 10:34:02 crc kubenswrapper[4909]: I0202 10:34:02.554767 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj" podUID="d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e" containerName="route-controller-manager" containerID="cri-o://cca01d117134f71590bcefef5d26afeb5988e0f341f55c6b49846c5a9954d1c9" gracePeriod=30 Feb 02 10:34:05 crc kubenswrapper[4909]: I0202 10:34:05.003494 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-slspb" Feb 02 10:34:05 crc kubenswrapper[4909]: I0202 10:34:05.157063 4909 generic.go:334] "Generic (PLEG): container finished" podID="29fc305d-0c7f-4ef9-8ae2-6534374de1ea" containerID="6d0292200ab0f3bdf2319652ac825f16659c360bb2c2eab7dff4c3d8d1c9c0d3" exitCode=0 Feb 02 10:34:05 crc kubenswrapper[4909]: I0202 10:34:05.157106 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx" 
event={"ID":"29fc305d-0c7f-4ef9-8ae2-6534374de1ea","Type":"ContainerDied","Data":"6d0292200ab0f3bdf2319652ac825f16659c360bb2c2eab7dff4c3d8d1c9c0d3"} Feb 02 10:34:06 crc kubenswrapper[4909]: I0202 10:34:06.516686 4909 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-5mlmj container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Feb 02 10:34:06 crc kubenswrapper[4909]: I0202 10:34:06.516740 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj" podUID="d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Feb 02 10:34:06 crc kubenswrapper[4909]: I0202 10:34:06.671075 4909 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-mwjvx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Feb 02 10:34:06 crc kubenswrapper[4909]: I0202 10:34:06.671209 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx" podUID="29fc305d-0c7f-4ef9-8ae2-6534374de1ea" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Feb 02 10:34:07 crc kubenswrapper[4909]: I0202 10:34:07.025095 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-gprz6" Feb 02 10:34:07 crc kubenswrapper[4909]: I0202 10:34:07.169610 4909 generic.go:334] "Generic (PLEG): container 
finished" podID="d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e" containerID="cca01d117134f71590bcefef5d26afeb5988e0f341f55c6b49846c5a9954d1c9" exitCode=0 Feb 02 10:34:07 crc kubenswrapper[4909]: I0202 10:34:07.169683 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj" event={"ID":"d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e","Type":"ContainerDied","Data":"cca01d117134f71590bcefef5d26afeb5988e0f341f55c6b49846c5a9954d1c9"} Feb 02 10:34:09 crc kubenswrapper[4909]: I0202 10:34:09.940508 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj" Feb 02 10:34:09 crc kubenswrapper[4909]: I0202 10:34:09.947738 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx" Feb 02 10:34:09 crc kubenswrapper[4909]: I0202 10:34:09.974399 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv"] Feb 02 10:34:09 crc kubenswrapper[4909]: E0202 10:34:09.974596 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6950001-fb3a-457a-82f7-4c0cb20a876a" containerName="pruner" Feb 02 10:34:09 crc kubenswrapper[4909]: I0202 10:34:09.974608 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6950001-fb3a-457a-82f7-4c0cb20a876a" containerName="pruner" Feb 02 10:34:09 crc kubenswrapper[4909]: E0202 10:34:09.974619 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29fc305d-0c7f-4ef9-8ae2-6534374de1ea" containerName="controller-manager" Feb 02 10:34:09 crc kubenswrapper[4909]: I0202 10:34:09.974626 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="29fc305d-0c7f-4ef9-8ae2-6534374de1ea" containerName="controller-manager" Feb 02 10:34:09 crc kubenswrapper[4909]: E0202 10:34:09.974633 4909 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76a6d60a-1322-4aaf-bdf1-e85bce501ed7" containerName="pruner" Feb 02 10:34:09 crc kubenswrapper[4909]: I0202 10:34:09.974639 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="76a6d60a-1322-4aaf-bdf1-e85bce501ed7" containerName="pruner" Feb 02 10:34:09 crc kubenswrapper[4909]: E0202 10:34:09.974656 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e" containerName="route-controller-manager" Feb 02 10:34:09 crc kubenswrapper[4909]: I0202 10:34:09.974662 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e" containerName="route-controller-manager" Feb 02 10:34:09 crc kubenswrapper[4909]: I0202 10:34:09.974749 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e" containerName="route-controller-manager" Feb 02 10:34:09 crc kubenswrapper[4909]: I0202 10:34:09.974759 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6950001-fb3a-457a-82f7-4c0cb20a876a" containerName="pruner" Feb 02 10:34:09 crc kubenswrapper[4909]: I0202 10:34:09.974770 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="76a6d60a-1322-4aaf-bdf1-e85bce501ed7" containerName="pruner" Feb 02 10:34:09 crc kubenswrapper[4909]: I0202 10:34:09.974777 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="29fc305d-0c7f-4ef9-8ae2-6534374de1ea" containerName="controller-manager" Feb 02 10:34:09 crc kubenswrapper[4909]: I0202 10:34:09.975138 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv" Feb 02 10:34:09 crc kubenswrapper[4909]: I0202 10:34:09.986922 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv"] Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.097590 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e-config\") pod \"d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e\" (UID: \"d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e\") " Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.097638 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-proxy-ca-bundles\") pod \"29fc305d-0c7f-4ef9-8ae2-6534374de1ea\" (UID: \"29fc305d-0c7f-4ef9-8ae2-6534374de1ea\") " Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.097668 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e-serving-cert\") pod \"d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e\" (UID: \"d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e\") " Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.097706 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-serving-cert\") pod \"29fc305d-0c7f-4ef9-8ae2-6534374de1ea\" (UID: \"29fc305d-0c7f-4ef9-8ae2-6534374de1ea\") " Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.098581 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e-config" (OuterVolumeSpecName: "config") pod "d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e" (UID: 
"d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.098634 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "29fc305d-0c7f-4ef9-8ae2-6534374de1ea" (UID: "29fc305d-0c7f-4ef9-8ae2-6534374de1ea"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.098886 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69s26\" (UniqueName: \"kubernetes.io/projected/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-kube-api-access-69s26\") pod \"29fc305d-0c7f-4ef9-8ae2-6534374de1ea\" (UID: \"29fc305d-0c7f-4ef9-8ae2-6534374de1ea\") " Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.098916 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-client-ca\") pod \"29fc305d-0c7f-4ef9-8ae2-6534374de1ea\" (UID: \"29fc305d-0c7f-4ef9-8ae2-6534374de1ea\") " Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.098984 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-config\") pod \"29fc305d-0c7f-4ef9-8ae2-6534374de1ea\" (UID: \"29fc305d-0c7f-4ef9-8ae2-6534374de1ea\") " Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.099013 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e-client-ca\") pod \"d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e\" (UID: \"d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e\") " Feb 02 10:34:10 crc 
kubenswrapper[4909]: I0202 10:34:10.099034 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzcnm\" (UniqueName: \"kubernetes.io/projected/d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e-kube-api-access-xzcnm\") pod \"d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e\" (UID: \"d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e\") " Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.099169 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xqqt\" (UniqueName: \"kubernetes.io/projected/f3cc2ba7-eaa3-4656-8174-a37340b34c19-kube-api-access-8xqqt\") pod \"route-controller-manager-5c97956df5-ncggv\" (UID: \"f3cc2ba7-eaa3-4656-8174-a37340b34c19\") " pod="openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.099208 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f3cc2ba7-eaa3-4656-8174-a37340b34c19-client-ca\") pod \"route-controller-manager-5c97956df5-ncggv\" (UID: \"f3cc2ba7-eaa3-4656-8174-a37340b34c19\") " pod="openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.099238 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3cc2ba7-eaa3-4656-8174-a37340b34c19-config\") pod \"route-controller-manager-5c97956df5-ncggv\" (UID: \"f3cc2ba7-eaa3-4656-8174-a37340b34c19\") " pod="openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.099333 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3cc2ba7-eaa3-4656-8174-a37340b34c19-serving-cert\") pod 
\"route-controller-manager-5c97956df5-ncggv\" (UID: \"f3cc2ba7-eaa3-4656-8174-a37340b34c19\") " pod="openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.099424 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.099448 4909 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.099863 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e-client-ca" (OuterVolumeSpecName: "client-ca") pod "d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e" (UID: "d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.099924 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-config" (OuterVolumeSpecName: "config") pod "29fc305d-0c7f-4ef9-8ae2-6534374de1ea" (UID: "29fc305d-0c7f-4ef9-8ae2-6534374de1ea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.100283 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-client-ca" (OuterVolumeSpecName: "client-ca") pod "29fc305d-0c7f-4ef9-8ae2-6534374de1ea" (UID: "29fc305d-0c7f-4ef9-8ae2-6534374de1ea"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.107437 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e-kube-api-access-xzcnm" (OuterVolumeSpecName: "kube-api-access-xzcnm") pod "d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e" (UID: "d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e"). InnerVolumeSpecName "kube-api-access-xzcnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.107457 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "29fc305d-0c7f-4ef9-8ae2-6534374de1ea" (UID: "29fc305d-0c7f-4ef9-8ae2-6534374de1ea"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.107480 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e" (UID: "d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.107794 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-kube-api-access-69s26" (OuterVolumeSpecName: "kube-api-access-69s26") pod "29fc305d-0c7f-4ef9-8ae2-6534374de1ea" (UID: "29fc305d-0c7f-4ef9-8ae2-6534374de1ea"). InnerVolumeSpecName "kube-api-access-69s26". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.190375 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.190386 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mwjvx" event={"ID":"29fc305d-0c7f-4ef9-8ae2-6534374de1ea","Type":"ContainerDied","Data":"ade8c6b52cc55d23f081505b81b8f86383e96a5abc44c849eec75803b4704582"} Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.190443 4909 scope.go:117] "RemoveContainer" containerID="6d0292200ab0f3bdf2319652ac825f16659c360bb2c2eab7dff4c3d8d1c9c0d3" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.193933 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj" event={"ID":"d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e","Type":"ContainerDied","Data":"fe93efac4fb3f0a1151e82a6d9963e7371b07a9f602ef97966edf0dc02f09036"} Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.194019 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.200560 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3cc2ba7-eaa3-4656-8174-a37340b34c19-serving-cert\") pod \"route-controller-manager-5c97956df5-ncggv\" (UID: \"f3cc2ba7-eaa3-4656-8174-a37340b34c19\") " pod="openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.200832 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xqqt\" (UniqueName: \"kubernetes.io/projected/f3cc2ba7-eaa3-4656-8174-a37340b34c19-kube-api-access-8xqqt\") pod \"route-controller-manager-5c97956df5-ncggv\" (UID: \"f3cc2ba7-eaa3-4656-8174-a37340b34c19\") " pod="openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.200952 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f3cc2ba7-eaa3-4656-8174-a37340b34c19-client-ca\") pod \"route-controller-manager-5c97956df5-ncggv\" (UID: \"f3cc2ba7-eaa3-4656-8174-a37340b34c19\") " pod="openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.201097 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3cc2ba7-eaa3-4656-8174-a37340b34c19-config\") pod \"route-controller-manager-5c97956df5-ncggv\" (UID: \"f3cc2ba7-eaa3-4656-8174-a37340b34c19\") " pod="openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.201294 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.201390 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.201478 4909 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-client-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.201561 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69s26\" (UniqueName: \"kubernetes.io/projected/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-kube-api-access-69s26\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.201650 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29fc305d-0c7f-4ef9-8ae2-6534374de1ea-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.201735 4909 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e-client-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.201842 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzcnm\" (UniqueName: \"kubernetes.io/projected/d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e-kube-api-access-xzcnm\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.202350 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f3cc2ba7-eaa3-4656-8174-a37340b34c19-client-ca\") pod 
\"route-controller-manager-5c97956df5-ncggv\" (UID: \"f3cc2ba7-eaa3-4656-8174-a37340b34c19\") " pod="openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.202404 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3cc2ba7-eaa3-4656-8174-a37340b34c19-config\") pod \"route-controller-manager-5c97956df5-ncggv\" (UID: \"f3cc2ba7-eaa3-4656-8174-a37340b34c19\") " pod="openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.206247 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3cc2ba7-eaa3-4656-8174-a37340b34c19-serving-cert\") pod \"route-controller-manager-5c97956df5-ncggv\" (UID: \"f3cc2ba7-eaa3-4656-8174-a37340b34c19\") " pod="openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.216586 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xqqt\" (UniqueName: \"kubernetes.io/projected/f3cc2ba7-eaa3-4656-8174-a37340b34c19-kube-api-access-8xqqt\") pod \"route-controller-manager-5c97956df5-ncggv\" (UID: \"f3cc2ba7-eaa3-4656-8174-a37340b34c19\") " pod="openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv" Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.245079 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj"] Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.254097 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5mlmj"] Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.257273 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-mwjvx"] Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.260064 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mwjvx"] Feb 02 10:34:10 crc kubenswrapper[4909]: I0202 10:34:10.303239 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv" Feb 02 10:34:11 crc kubenswrapper[4909]: I0202 10:34:11.027638 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29fc305d-0c7f-4ef9-8ae2-6534374de1ea" path="/var/lib/kubelet/pods/29fc305d-0c7f-4ef9-8ae2-6534374de1ea/volumes" Feb 02 10:34:11 crc kubenswrapper[4909]: I0202 10:34:11.028582 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e" path="/var/lib/kubelet/pods/d3bd1cfb-92ee-4f7b-a8aa-20d6e1727e5e/volumes" Feb 02 10:34:12 crc kubenswrapper[4909]: I0202 10:34:12.811815 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-c6744c7c6-hd4cv"] Feb 02 10:34:12 crc kubenswrapper[4909]: I0202 10:34:12.813133 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-c6744c7c6-hd4cv" Feb 02 10:34:12 crc kubenswrapper[4909]: I0202 10:34:12.815788 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c6744c7c6-hd4cv"] Feb 02 10:34:12 crc kubenswrapper[4909]: I0202 10:34:12.816735 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 02 10:34:12 crc kubenswrapper[4909]: I0202 10:34:12.816917 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 02 10:34:12 crc kubenswrapper[4909]: I0202 10:34:12.819050 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 02 10:34:12 crc kubenswrapper[4909]: I0202 10:34:12.820317 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 02 10:34:12 crc kubenswrapper[4909]: I0202 10:34:12.820420 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 02 10:34:12 crc kubenswrapper[4909]: I0202 10:34:12.821623 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 02 10:34:12 crc kubenswrapper[4909]: I0202 10:34:12.826263 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 02 10:34:12 crc kubenswrapper[4909]: I0202 10:34:12.940063 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d499336c-c3a2-476d-b5d0-41ad4513cfec-client-ca\") pod \"controller-manager-c6744c7c6-hd4cv\" (UID: \"d499336c-c3a2-476d-b5d0-41ad4513cfec\") " 
pod="openshift-controller-manager/controller-manager-c6744c7c6-hd4cv" Feb 02 10:34:12 crc kubenswrapper[4909]: I0202 10:34:12.940144 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhkd7\" (UniqueName: \"kubernetes.io/projected/d499336c-c3a2-476d-b5d0-41ad4513cfec-kube-api-access-zhkd7\") pod \"controller-manager-c6744c7c6-hd4cv\" (UID: \"d499336c-c3a2-476d-b5d0-41ad4513cfec\") " pod="openshift-controller-manager/controller-manager-c6744c7c6-hd4cv" Feb 02 10:34:12 crc kubenswrapper[4909]: I0202 10:34:12.940184 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d499336c-c3a2-476d-b5d0-41ad4513cfec-config\") pod \"controller-manager-c6744c7c6-hd4cv\" (UID: \"d499336c-c3a2-476d-b5d0-41ad4513cfec\") " pod="openshift-controller-manager/controller-manager-c6744c7c6-hd4cv" Feb 02 10:34:12 crc kubenswrapper[4909]: I0202 10:34:12.940234 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d499336c-c3a2-476d-b5d0-41ad4513cfec-serving-cert\") pod \"controller-manager-c6744c7c6-hd4cv\" (UID: \"d499336c-c3a2-476d-b5d0-41ad4513cfec\") " pod="openshift-controller-manager/controller-manager-c6744c7c6-hd4cv" Feb 02 10:34:12 crc kubenswrapper[4909]: I0202 10:34:12.940257 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d499336c-c3a2-476d-b5d0-41ad4513cfec-proxy-ca-bundles\") pod \"controller-manager-c6744c7c6-hd4cv\" (UID: \"d499336c-c3a2-476d-b5d0-41ad4513cfec\") " pod="openshift-controller-manager/controller-manager-c6744c7c6-hd4cv" Feb 02 10:34:13 crc kubenswrapper[4909]: I0202 10:34:13.041861 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/d499336c-c3a2-476d-b5d0-41ad4513cfec-client-ca\") pod \"controller-manager-c6744c7c6-hd4cv\" (UID: \"d499336c-c3a2-476d-b5d0-41ad4513cfec\") " pod="openshift-controller-manager/controller-manager-c6744c7c6-hd4cv" Feb 02 10:34:13 crc kubenswrapper[4909]: I0202 10:34:13.042248 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhkd7\" (UniqueName: \"kubernetes.io/projected/d499336c-c3a2-476d-b5d0-41ad4513cfec-kube-api-access-zhkd7\") pod \"controller-manager-c6744c7c6-hd4cv\" (UID: \"d499336c-c3a2-476d-b5d0-41ad4513cfec\") " pod="openshift-controller-manager/controller-manager-c6744c7c6-hd4cv" Feb 02 10:34:13 crc kubenswrapper[4909]: I0202 10:34:13.042296 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d499336c-c3a2-476d-b5d0-41ad4513cfec-config\") pod \"controller-manager-c6744c7c6-hd4cv\" (UID: \"d499336c-c3a2-476d-b5d0-41ad4513cfec\") " pod="openshift-controller-manager/controller-manager-c6744c7c6-hd4cv" Feb 02 10:34:13 crc kubenswrapper[4909]: I0202 10:34:13.042355 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d499336c-c3a2-476d-b5d0-41ad4513cfec-serving-cert\") pod \"controller-manager-c6744c7c6-hd4cv\" (UID: \"d499336c-c3a2-476d-b5d0-41ad4513cfec\") " pod="openshift-controller-manager/controller-manager-c6744c7c6-hd4cv" Feb 02 10:34:13 crc kubenswrapper[4909]: I0202 10:34:13.042378 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d499336c-c3a2-476d-b5d0-41ad4513cfec-proxy-ca-bundles\") pod \"controller-manager-c6744c7c6-hd4cv\" (UID: \"d499336c-c3a2-476d-b5d0-41ad4513cfec\") " pod="openshift-controller-manager/controller-manager-c6744c7c6-hd4cv" Feb 02 10:34:13 crc kubenswrapper[4909]: I0202 10:34:13.044617 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d499336c-c3a2-476d-b5d0-41ad4513cfec-config\") pod \"controller-manager-c6744c7c6-hd4cv\" (UID: \"d499336c-c3a2-476d-b5d0-41ad4513cfec\") " pod="openshift-controller-manager/controller-manager-c6744c7c6-hd4cv" Feb 02 10:34:13 crc kubenswrapper[4909]: I0202 10:34:13.044621 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d499336c-c3a2-476d-b5d0-41ad4513cfec-client-ca\") pod \"controller-manager-c6744c7c6-hd4cv\" (UID: \"d499336c-c3a2-476d-b5d0-41ad4513cfec\") " pod="openshift-controller-manager/controller-manager-c6744c7c6-hd4cv" Feb 02 10:34:13 crc kubenswrapper[4909]: I0202 10:34:13.045675 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d499336c-c3a2-476d-b5d0-41ad4513cfec-proxy-ca-bundles\") pod \"controller-manager-c6744c7c6-hd4cv\" (UID: \"d499336c-c3a2-476d-b5d0-41ad4513cfec\") " pod="openshift-controller-manager/controller-manager-c6744c7c6-hd4cv" Feb 02 10:34:13 crc kubenswrapper[4909]: I0202 10:34:13.049094 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d499336c-c3a2-476d-b5d0-41ad4513cfec-serving-cert\") pod \"controller-manager-c6744c7c6-hd4cv\" (UID: \"d499336c-c3a2-476d-b5d0-41ad4513cfec\") " pod="openshift-controller-manager/controller-manager-c6744c7c6-hd4cv" Feb 02 10:34:13 crc kubenswrapper[4909]: I0202 10:34:13.060413 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhkd7\" (UniqueName: \"kubernetes.io/projected/d499336c-c3a2-476d-b5d0-41ad4513cfec-kube-api-access-zhkd7\") pod \"controller-manager-c6744c7c6-hd4cv\" (UID: \"d499336c-c3a2-476d-b5d0-41ad4513cfec\") " pod="openshift-controller-manager/controller-manager-c6744c7c6-hd4cv" Feb 02 10:34:13 crc 
kubenswrapper[4909]: I0202 10:34:13.131732 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c6744c7c6-hd4cv" Feb 02 10:34:14 crc kubenswrapper[4909]: E0202 10:34:14.487104 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 02 10:34:14 crc kubenswrapper[4909]: E0202 10:34:14.487501 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-46jr9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},T
erminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-mlhl9_openshift-marketplace(7164d60d-218c-47e0-a74a-677793e589b0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 02 10:34:14 crc kubenswrapper[4909]: E0202 10:34:14.488693 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-mlhl9" podUID="7164d60d-218c-47e0-a74a-677793e589b0" Feb 02 10:34:15 crc kubenswrapper[4909]: E0202 10:34:15.787189 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-mlhl9" podUID="7164d60d-218c-47e0-a74a-677793e589b0" Feb 02 10:34:15 crc kubenswrapper[4909]: E0202 10:34:15.879861 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 02 10:34:15 crc kubenswrapper[4909]: E0202 10:34:15.880003 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fnp7n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-fgxpg_openshift-marketplace(9818ea74-330d-4bd0-8931-91fef529ef29): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 02 10:34:15 crc kubenswrapper[4909]: E0202 10:34:15.882080 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-fgxpg" podUID="9818ea74-330d-4bd0-8931-91fef529ef29" Feb 02 10:34:17 crc 
kubenswrapper[4909]: E0202 10:34:17.084222 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-fgxpg" podUID="9818ea74-330d-4bd0-8931-91fef529ef29" Feb 02 10:34:17 crc kubenswrapper[4909]: I0202 10:34:17.121519 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kddv4" Feb 02 10:34:17 crc kubenswrapper[4909]: I0202 10:34:17.282470 4909 scope.go:117] "RemoveContainer" containerID="cca01d117134f71590bcefef5d26afeb5988e0f341f55c6b49846c5a9954d1c9" Feb 02 10:34:17 crc kubenswrapper[4909]: I0202 10:34:17.556302 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c6744c7c6-hd4cv"] Feb 02 10:34:17 crc kubenswrapper[4909]: I0202 10:34:17.622359 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv"] Feb 02 10:34:17 crc kubenswrapper[4909]: I0202 10:34:17.636248 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2v5vw"] Feb 02 10:34:17 crc kubenswrapper[4909]: W0202 10:34:17.716668 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3cc2ba7_eaa3_4656_8174_a37340b34c19.slice/crio-23213c7857f76e0f8ea7e7fa5ee96861ee7c2b81e0b0e132f3349fb1eb178e36 WatchSource:0}: Error finding container 23213c7857f76e0f8ea7e7fa5ee96861ee7c2b81e0b0e132f3349fb1eb178e36: Status 404 returned error can't find the container with id 23213c7857f76e0f8ea7e7fa5ee96861ee7c2b81e0b0e132f3349fb1eb178e36 Feb 02 10:34:18 crc kubenswrapper[4909]: I0202 10:34:18.242750 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv" event={"ID":"f3cc2ba7-eaa3-4656-8174-a37340b34c19","Type":"ContainerStarted","Data":"23213c7857f76e0f8ea7e7fa5ee96861ee7c2b81e0b0e132f3349fb1eb178e36"} Feb 02 10:34:18 crc kubenswrapper[4909]: I0202 10:34:18.245419 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x94kg" event={"ID":"65681cb4-6a6f-4fce-8322-b6efffeecc78","Type":"ContainerStarted","Data":"638a2bed8eb11ef192acb4759a1053e3da4c211ec3a998956e0ae1332f2b136d"} Feb 02 10:34:18 crc kubenswrapper[4909]: I0202 10:34:18.247376 4909 generic.go:334] "Generic (PLEG): container finished" podID="58068665-fe9b-4bd9-ac11-a3d6c9ad888e" containerID="a0f5998bd01e5e9a2c083e1c74d42ac73c83dd2e6ebcfb63dbae3eccd5d92323" exitCode=0 Feb 02 10:34:18 crc kubenswrapper[4909]: I0202 10:34:18.247420 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-72ltv" event={"ID":"58068665-fe9b-4bd9-ac11-a3d6c9ad888e","Type":"ContainerDied","Data":"a0f5998bd01e5e9a2c083e1c74d42ac73c83dd2e6ebcfb63dbae3eccd5d92323"} Feb 02 10:34:18 crc kubenswrapper[4909]: I0202 10:34:18.249463 4909 generic.go:334] "Generic (PLEG): container finished" podID="7986860a-5f33-47fb-af58-a2925c4572a4" containerID="d0177008ab3cf7a24f4da5a4e3899cca448d6e1911204e67585482592ea529c5" exitCode=0 Feb 02 10:34:18 crc kubenswrapper[4909]: I0202 10:34:18.249502 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxkq4" event={"ID":"7986860a-5f33-47fb-af58-a2925c4572a4","Type":"ContainerDied","Data":"d0177008ab3cf7a24f4da5a4e3899cca448d6e1911204e67585482592ea529c5"} Feb 02 10:34:18 crc kubenswrapper[4909]: I0202 10:34:18.261766 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qpc6" 
event={"ID":"20869d58-911c-44ab-8f33-07ffc1056b3b","Type":"ContainerStarted","Data":"82b876000a029687ce66c11ed9367a0da960d149eb488f5543d1003426afefea"} Feb 02 10:34:18 crc kubenswrapper[4909]: I0202 10:34:18.263153 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c6744c7c6-hd4cv" event={"ID":"d499336c-c3a2-476d-b5d0-41ad4513cfec","Type":"ContainerStarted","Data":"251f0f99cbf9de1bfb87c2e8ecf8629274f41a73872068079fac144e4f406f7a"} Feb 02 10:34:18 crc kubenswrapper[4909]: I0202 10:34:18.263195 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c6744c7c6-hd4cv" event={"ID":"d499336c-c3a2-476d-b5d0-41ad4513cfec","Type":"ContainerStarted","Data":"048ad216a4f774c1bf3988cf2d2eec12aef7f71ca35174f5da019d6dfcaed85c"} Feb 02 10:34:18 crc kubenswrapper[4909]: I0202 10:34:18.263215 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-c6744c7c6-hd4cv" Feb 02 10:34:18 crc kubenswrapper[4909]: I0202 10:34:18.272570 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-c6744c7c6-hd4cv" Feb 02 10:34:18 crc kubenswrapper[4909]: I0202 10:34:18.273562 4909 generic.go:334] "Generic (PLEG): container finished" podID="9483225f-edd3-4728-8e95-67f872692af9" containerID="0f2ec9f22118bd1fd91af8be07d418cd8978c48f9b91c8a1d147f55b2e62fc89" exitCode=0 Feb 02 10:34:18 crc kubenswrapper[4909]: I0202 10:34:18.273608 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vn7ln" event={"ID":"9483225f-edd3-4728-8e95-67f872692af9","Type":"ContainerDied","Data":"0f2ec9f22118bd1fd91af8be07d418cd8978c48f9b91c8a1d147f55b2e62fc89"} Feb 02 10:34:18 crc kubenswrapper[4909]: I0202 10:34:18.276682 4909 generic.go:334] "Generic (PLEG): container finished" podID="62300332-ecea-47ea-9809-6cc89e9593bf" 
containerID="4f46b8c5a6eb5313b34b14ddc05afd2abbe800d99415a5172b4fb66dfdf23012" exitCode=0 Feb 02 10:34:18 crc kubenswrapper[4909]: I0202 10:34:18.276786 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwvb8" event={"ID":"62300332-ecea-47ea-9809-6cc89e9593bf","Type":"ContainerDied","Data":"4f46b8c5a6eb5313b34b14ddc05afd2abbe800d99415a5172b4fb66dfdf23012"} Feb 02 10:34:18 crc kubenswrapper[4909]: I0202 10:34:18.278333 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2v5vw" event={"ID":"0f457793-f4e0-4417-ae91-4455722372c1","Type":"ContainerStarted","Data":"09e3c6c068af06aef0443bc92b624e834853e88640379165413d8007bd21923a"} Feb 02 10:34:18 crc kubenswrapper[4909]: I0202 10:34:18.309942 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-c6744c7c6-hd4cv" podStartSLOduration=16.309924114 podStartE2EDuration="16.309924114s" podCreationTimestamp="2026-02-02 10:34:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:18.307767603 +0000 UTC m=+184.053868338" watchObservedRunningTime="2026-02-02 10:34:18.309924114 +0000 UTC m=+184.056024849" Feb 02 10:34:19 crc kubenswrapper[4909]: I0202 10:34:19.285688 4909 generic.go:334] "Generic (PLEG): container finished" podID="20869d58-911c-44ab-8f33-07ffc1056b3b" containerID="82b876000a029687ce66c11ed9367a0da960d149eb488f5543d1003426afefea" exitCode=0 Feb 02 10:34:19 crc kubenswrapper[4909]: I0202 10:34:19.286257 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qpc6" event={"ID":"20869d58-911c-44ab-8f33-07ffc1056b3b","Type":"ContainerDied","Data":"82b876000a029687ce66c11ed9367a0da960d149eb488f5543d1003426afefea"} Feb 02 10:34:19 crc kubenswrapper[4909]: I0202 10:34:19.298970 4909 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv" event={"ID":"f3cc2ba7-eaa3-4656-8174-a37340b34c19","Type":"ContainerStarted","Data":"96408de21a034f0be028f0f012955af1a16e9e8b154024822b5e4a5f1a2a7b98"} Feb 02 10:34:19 crc kubenswrapper[4909]: I0202 10:34:19.299426 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv" Feb 02 10:34:19 crc kubenswrapper[4909]: I0202 10:34:19.300925 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2v5vw" event={"ID":"0f457793-f4e0-4417-ae91-4455722372c1","Type":"ContainerStarted","Data":"ca9c4ef797167e73066f20dbce6ee9df06302039b91146bc9eb4c20c4f30d16a"} Feb 02 10:34:19 crc kubenswrapper[4909]: I0202 10:34:19.303328 4909 generic.go:334] "Generic (PLEG): container finished" podID="65681cb4-6a6f-4fce-8322-b6efffeecc78" containerID="638a2bed8eb11ef192acb4759a1053e3da4c211ec3a998956e0ae1332f2b136d" exitCode=0 Feb 02 10:34:19 crc kubenswrapper[4909]: I0202 10:34:19.303462 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x94kg" event={"ID":"65681cb4-6a6f-4fce-8322-b6efffeecc78","Type":"ContainerDied","Data":"638a2bed8eb11ef192acb4759a1053e3da4c211ec3a998956e0ae1332f2b136d"} Feb 02 10:34:19 crc kubenswrapper[4909]: I0202 10:34:19.313008 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv" Feb 02 10:34:19 crc kubenswrapper[4909]: I0202 10:34:19.333938 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv" podStartSLOduration=17.333922696 podStartE2EDuration="17.333922696s" podCreationTimestamp="2026-02-02 10:34:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:19.328501281 +0000 UTC m=+185.074602016" watchObservedRunningTime="2026-02-02 10:34:19.333922696 +0000 UTC m=+185.080023431" Feb 02 10:34:19 crc kubenswrapper[4909]: I0202 10:34:19.510875 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:34:19 crc kubenswrapper[4909]: I0202 10:34:19.510939 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:34:20 crc kubenswrapper[4909]: I0202 10:34:20.310869 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2v5vw" event={"ID":"0f457793-f4e0-4417-ae91-4455722372c1","Type":"ContainerStarted","Data":"d12b1daf1d40cdc630be3ac28f906e958ae7560bb19904c21dbd843151a51372"} Feb 02 10:34:20 crc kubenswrapper[4909]: I0202 10:34:20.328340 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2v5vw" podStartSLOduration=165.328325339 podStartE2EDuration="2m45.328325339s" podCreationTimestamp="2026-02-02 10:31:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:20.325083176 +0000 UTC m=+186.071183911" watchObservedRunningTime="2026-02-02 10:34:20.328325339 +0000 UTC m=+186.074426064" Feb 02 10:34:22 crc kubenswrapper[4909]: I0202 10:34:22.323749 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-bxkq4" event={"ID":"7986860a-5f33-47fb-af58-a2925c4572a4","Type":"ContainerStarted","Data":"2e8067e3821be8c88d2728fa2f815cb671c795db2ade99e6a83765bb9cca8ea7"} Feb 02 10:34:22 crc kubenswrapper[4909]: I0202 10:34:22.345793 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bxkq4" podStartSLOduration=3.722908881 podStartE2EDuration="39.345771788s" podCreationTimestamp="2026-02-02 10:33:43 +0000 UTC" firstStartedPulling="2026-02-02 10:33:45.881175739 +0000 UTC m=+151.627276474" lastFinishedPulling="2026-02-02 10:34:21.504038646 +0000 UTC m=+187.250139381" observedRunningTime="2026-02-02 10:34:22.34235624 +0000 UTC m=+188.088456985" watchObservedRunningTime="2026-02-02 10:34:22.345771788 +0000 UTC m=+188.091872523" Feb 02 10:34:22 crc kubenswrapper[4909]: I0202 10:34:22.411131 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 02 10:34:22 crc kubenswrapper[4909]: I0202 10:34:22.412078 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 10:34:22 crc kubenswrapper[4909]: I0202 10:34:22.420298 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 02 10:34:22 crc kubenswrapper[4909]: I0202 10:34:22.420550 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 02 10:34:22 crc kubenswrapper[4909]: I0202 10:34:22.420609 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 02 10:34:22 crc kubenswrapper[4909]: I0202 10:34:22.491165 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-c6744c7c6-hd4cv"] Feb 02 10:34:22 crc kubenswrapper[4909]: I0202 10:34:22.491437 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-c6744c7c6-hd4cv" podUID="d499336c-c3a2-476d-b5d0-41ad4513cfec" containerName="controller-manager" containerID="cri-o://251f0f99cbf9de1bfb87c2e8ecf8629274f41a73872068079fac144e4f406f7a" gracePeriod=30 Feb 02 10:34:22 crc kubenswrapper[4909]: I0202 10:34:22.574283 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b15359b-7d40-44fd-91e3-742c7ed025f7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3b15359b-7d40-44fd-91e3-742c7ed025f7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 10:34:22 crc kubenswrapper[4909]: I0202 10:34:22.574363 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b15359b-7d40-44fd-91e3-742c7ed025f7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3b15359b-7d40-44fd-91e3-742c7ed025f7\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 10:34:22 crc kubenswrapper[4909]: I0202 10:34:22.585382 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv"] Feb 02 10:34:22 crc kubenswrapper[4909]: I0202 10:34:22.585618 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv" podUID="f3cc2ba7-eaa3-4656-8174-a37340b34c19" containerName="route-controller-manager" containerID="cri-o://96408de21a034f0be028f0f012955af1a16e9e8b154024822b5e4a5f1a2a7b98" gracePeriod=30 Feb 02 10:34:22 crc kubenswrapper[4909]: I0202 10:34:22.675560 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b15359b-7d40-44fd-91e3-742c7ed025f7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3b15359b-7d40-44fd-91e3-742c7ed025f7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 10:34:22 crc kubenswrapper[4909]: I0202 10:34:22.675647 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b15359b-7d40-44fd-91e3-742c7ed025f7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3b15359b-7d40-44fd-91e3-742c7ed025f7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 10:34:22 crc kubenswrapper[4909]: I0202 10:34:22.676017 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b15359b-7d40-44fd-91e3-742c7ed025f7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3b15359b-7d40-44fd-91e3-742c7ed025f7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 10:34:22 crc kubenswrapper[4909]: I0202 10:34:22.697421 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/3b15359b-7d40-44fd-91e3-742c7ed025f7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3b15359b-7d40-44fd-91e3-742c7ed025f7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 10:34:22 crc kubenswrapper[4909]: I0202 10:34:22.728915 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 10:34:23 crc kubenswrapper[4909]: I0202 10:34:23.067827 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 02 10:34:23 crc kubenswrapper[4909]: I0202 10:34:23.133356 4909 patch_prober.go:28] interesting pod/controller-manager-c6744c7c6-hd4cv container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" start-of-body= Feb 02 10:34:23 crc kubenswrapper[4909]: I0202 10:34:23.133431 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-c6744c7c6-hd4cv" podUID="d499336c-c3a2-476d-b5d0-41ad4513cfec" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: connect: connection refused" Feb 02 10:34:23 crc kubenswrapper[4909]: I0202 10:34:23.186540 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 02 10:34:23 crc kubenswrapper[4909]: W0202 10:34:23.195166 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3b15359b_7d40_44fd_91e3_742c7ed025f7.slice/crio-73de77f1a53acda002647dc790b459a7b4eefdfa671d9d6d4dc5ef21e7b344ca WatchSource:0}: Error finding container 73de77f1a53acda002647dc790b459a7b4eefdfa671d9d6d4dc5ef21e7b344ca: Status 404 returned error can't find the container with id 
73de77f1a53acda002647dc790b459a7b4eefdfa671d9d6d4dc5ef21e7b344ca Feb 02 10:34:23 crc kubenswrapper[4909]: I0202 10:34:23.333579 4909 generic.go:334] "Generic (PLEG): container finished" podID="d499336c-c3a2-476d-b5d0-41ad4513cfec" containerID="251f0f99cbf9de1bfb87c2e8ecf8629274f41a73872068079fac144e4f406f7a" exitCode=0 Feb 02 10:34:23 crc kubenswrapper[4909]: I0202 10:34:23.333645 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c6744c7c6-hd4cv" event={"ID":"d499336c-c3a2-476d-b5d0-41ad4513cfec","Type":"ContainerDied","Data":"251f0f99cbf9de1bfb87c2e8ecf8629274f41a73872068079fac144e4f406f7a"} Feb 02 10:34:23 crc kubenswrapper[4909]: I0202 10:34:23.335024 4909 generic.go:334] "Generic (PLEG): container finished" podID="f3cc2ba7-eaa3-4656-8174-a37340b34c19" containerID="96408de21a034f0be028f0f012955af1a16e9e8b154024822b5e4a5f1a2a7b98" exitCode=0 Feb 02 10:34:23 crc kubenswrapper[4909]: I0202 10:34:23.335072 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv" event={"ID":"f3cc2ba7-eaa3-4656-8174-a37340b34c19","Type":"ContainerDied","Data":"96408de21a034f0be028f0f012955af1a16e9e8b154024822b5e4a5f1a2a7b98"} Feb 02 10:34:23 crc kubenswrapper[4909]: I0202 10:34:23.337556 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-72ltv" event={"ID":"58068665-fe9b-4bd9-ac11-a3d6c9ad888e","Type":"ContainerStarted","Data":"fd84459d6b5dd305397ff426e6d97cea6881fbf3eb15e9df62a9513f52197354"} Feb 02 10:34:23 crc kubenswrapper[4909]: I0202 10:34:23.338706 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3b15359b-7d40-44fd-91e3-742c7ed025f7","Type":"ContainerStarted","Data":"73de77f1a53acda002647dc790b459a7b4eefdfa671d9d6d4dc5ef21e7b344ca"} Feb 02 10:34:23 crc kubenswrapper[4909]: I0202 10:34:23.915701 4909 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv" Feb 02 10:34:23 crc kubenswrapper[4909]: I0202 10:34:23.927933 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c6744c7c6-hd4cv" Feb 02 10:34:23 crc kubenswrapper[4909]: I0202 10:34:23.946858 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99"] Feb 02 10:34:23 crc kubenswrapper[4909]: E0202 10:34:23.947111 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3cc2ba7-eaa3-4656-8174-a37340b34c19" containerName="route-controller-manager" Feb 02 10:34:23 crc kubenswrapper[4909]: I0202 10:34:23.947130 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3cc2ba7-eaa3-4656-8174-a37340b34c19" containerName="route-controller-manager" Feb 02 10:34:23 crc kubenswrapper[4909]: E0202 10:34:23.947151 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d499336c-c3a2-476d-b5d0-41ad4513cfec" containerName="controller-manager" Feb 02 10:34:23 crc kubenswrapper[4909]: I0202 10:34:23.947158 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d499336c-c3a2-476d-b5d0-41ad4513cfec" containerName="controller-manager" Feb 02 10:34:23 crc kubenswrapper[4909]: I0202 10:34:23.947254 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3cc2ba7-eaa3-4656-8174-a37340b34c19" containerName="route-controller-manager" Feb 02 10:34:23 crc kubenswrapper[4909]: I0202 10:34:23.947263 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d499336c-c3a2-476d-b5d0-41ad4513cfec" containerName="controller-manager" Feb 02 10:34:23 crc kubenswrapper[4909]: I0202 10:34:23.947689 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99" Feb 02 10:34:23 crc kubenswrapper[4909]: I0202 10:34:23.960631 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99"] Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.095189 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d499336c-c3a2-476d-b5d0-41ad4513cfec-serving-cert\") pod \"d499336c-c3a2-476d-b5d0-41ad4513cfec\" (UID: \"d499336c-c3a2-476d-b5d0-41ad4513cfec\") " Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.095243 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d499336c-c3a2-476d-b5d0-41ad4513cfec-config\") pod \"d499336c-c3a2-476d-b5d0-41ad4513cfec\" (UID: \"d499336c-c3a2-476d-b5d0-41ad4513cfec\") " Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.095309 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d499336c-c3a2-476d-b5d0-41ad4513cfec-proxy-ca-bundles\") pod \"d499336c-c3a2-476d-b5d0-41ad4513cfec\" (UID: \"d499336c-c3a2-476d-b5d0-41ad4513cfec\") " Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.095348 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3cc2ba7-eaa3-4656-8174-a37340b34c19-config\") pod \"f3cc2ba7-eaa3-4656-8174-a37340b34c19\" (UID: \"f3cc2ba7-eaa3-4656-8174-a37340b34c19\") " Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.095399 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f3cc2ba7-eaa3-4656-8174-a37340b34c19-client-ca\") pod \"f3cc2ba7-eaa3-4656-8174-a37340b34c19\" (UID: 
\"f3cc2ba7-eaa3-4656-8174-a37340b34c19\") " Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.095419 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d499336c-c3a2-476d-b5d0-41ad4513cfec-client-ca\") pod \"d499336c-c3a2-476d-b5d0-41ad4513cfec\" (UID: \"d499336c-c3a2-476d-b5d0-41ad4513cfec\") " Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.095447 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhkd7\" (UniqueName: \"kubernetes.io/projected/d499336c-c3a2-476d-b5d0-41ad4513cfec-kube-api-access-zhkd7\") pod \"d499336c-c3a2-476d-b5d0-41ad4513cfec\" (UID: \"d499336c-c3a2-476d-b5d0-41ad4513cfec\") " Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.095467 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3cc2ba7-eaa3-4656-8174-a37340b34c19-serving-cert\") pod \"f3cc2ba7-eaa3-4656-8174-a37340b34c19\" (UID: \"f3cc2ba7-eaa3-4656-8174-a37340b34c19\") " Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.095494 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xqqt\" (UniqueName: \"kubernetes.io/projected/f3cc2ba7-eaa3-4656-8174-a37340b34c19-kube-api-access-8xqqt\") pod \"f3cc2ba7-eaa3-4656-8174-a37340b34c19\" (UID: \"f3cc2ba7-eaa3-4656-8174-a37340b34c19\") " Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.095656 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b545\" (UniqueName: \"kubernetes.io/projected/2d50641a-3b94-4d0a-9d73-7be005b9a395-kube-api-access-5b545\") pod \"route-controller-manager-7d56d5955c-27w99\" (UID: \"2d50641a-3b94-4d0a-9d73-7be005b9a395\") " pod="openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 
10:34:24.095689 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d50641a-3b94-4d0a-9d73-7be005b9a395-serving-cert\") pod \"route-controller-manager-7d56d5955c-27w99\" (UID: \"2d50641a-3b94-4d0a-9d73-7be005b9a395\") " pod="openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.095723 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d50641a-3b94-4d0a-9d73-7be005b9a395-config\") pod \"route-controller-manager-7d56d5955c-27w99\" (UID: \"2d50641a-3b94-4d0a-9d73-7be005b9a395\") " pod="openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.095738 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d50641a-3b94-4d0a-9d73-7be005b9a395-client-ca\") pod \"route-controller-manager-7d56d5955c-27w99\" (UID: \"2d50641a-3b94-4d0a-9d73-7be005b9a395\") " pod="openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.097082 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bxkq4" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.097160 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bxkq4" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.098053 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d499336c-c3a2-476d-b5d0-41ad4513cfec-client-ca" (OuterVolumeSpecName: "client-ca") pod "d499336c-c3a2-476d-b5d0-41ad4513cfec" (UID: 
"d499336c-c3a2-476d-b5d0-41ad4513cfec"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.098244 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d499336c-c3a2-476d-b5d0-41ad4513cfec-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d499336c-c3a2-476d-b5d0-41ad4513cfec" (UID: "d499336c-c3a2-476d-b5d0-41ad4513cfec"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.098246 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3cc2ba7-eaa3-4656-8174-a37340b34c19-client-ca" (OuterVolumeSpecName: "client-ca") pod "f3cc2ba7-eaa3-4656-8174-a37340b34c19" (UID: "f3cc2ba7-eaa3-4656-8174-a37340b34c19"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.098730 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3cc2ba7-eaa3-4656-8174-a37340b34c19-config" (OuterVolumeSpecName: "config") pod "f3cc2ba7-eaa3-4656-8174-a37340b34c19" (UID: "f3cc2ba7-eaa3-4656-8174-a37340b34c19"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.098897 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d499336c-c3a2-476d-b5d0-41ad4513cfec-config" (OuterVolumeSpecName: "config") pod "d499336c-c3a2-476d-b5d0-41ad4513cfec" (UID: "d499336c-c3a2-476d-b5d0-41ad4513cfec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.101957 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3cc2ba7-eaa3-4656-8174-a37340b34c19-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f3cc2ba7-eaa3-4656-8174-a37340b34c19" (UID: "f3cc2ba7-eaa3-4656-8174-a37340b34c19"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.101998 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3cc2ba7-eaa3-4656-8174-a37340b34c19-kube-api-access-8xqqt" (OuterVolumeSpecName: "kube-api-access-8xqqt") pod "f3cc2ba7-eaa3-4656-8174-a37340b34c19" (UID: "f3cc2ba7-eaa3-4656-8174-a37340b34c19"). InnerVolumeSpecName "kube-api-access-8xqqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.102132 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d499336c-c3a2-476d-b5d0-41ad4513cfec-kube-api-access-zhkd7" (OuterVolumeSpecName: "kube-api-access-zhkd7") pod "d499336c-c3a2-476d-b5d0-41ad4513cfec" (UID: "d499336c-c3a2-476d-b5d0-41ad4513cfec"). InnerVolumeSpecName "kube-api-access-zhkd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.102376 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d499336c-c3a2-476d-b5d0-41ad4513cfec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d499336c-c3a2-476d-b5d0-41ad4513cfec" (UID: "d499336c-c3a2-476d-b5d0-41ad4513cfec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.197379 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b545\" (UniqueName: \"kubernetes.io/projected/2d50641a-3b94-4d0a-9d73-7be005b9a395-kube-api-access-5b545\") pod \"route-controller-manager-7d56d5955c-27w99\" (UID: \"2d50641a-3b94-4d0a-9d73-7be005b9a395\") " pod="openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.197860 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d50641a-3b94-4d0a-9d73-7be005b9a395-serving-cert\") pod \"route-controller-manager-7d56d5955c-27w99\" (UID: \"2d50641a-3b94-4d0a-9d73-7be005b9a395\") " pod="openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.197900 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d50641a-3b94-4d0a-9d73-7be005b9a395-config\") pod \"route-controller-manager-7d56d5955c-27w99\" (UID: \"2d50641a-3b94-4d0a-9d73-7be005b9a395\") " pod="openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.197925 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d50641a-3b94-4d0a-9d73-7be005b9a395-client-ca\") pod \"route-controller-manager-7d56d5955c-27w99\" (UID: \"2d50641a-3b94-4d0a-9d73-7be005b9a395\") " pod="openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.198174 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d499336c-c3a2-476d-b5d0-41ad4513cfec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.199262 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d50641a-3b94-4d0a-9d73-7be005b9a395-config\") pod \"route-controller-manager-7d56d5955c-27w99\" (UID: \"2d50641a-3b94-4d0a-9d73-7be005b9a395\") " pod="openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.199276 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d50641a-3b94-4d0a-9d73-7be005b9a395-client-ca\") pod \"route-controller-manager-7d56d5955c-27w99\" (UID: \"2d50641a-3b94-4d0a-9d73-7be005b9a395\") " pod="openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.199308 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d499336c-c3a2-476d-b5d0-41ad4513cfec-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.199696 4909 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d499336c-c3a2-476d-b5d0-41ad4513cfec-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.199715 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3cc2ba7-eaa3-4656-8174-a37340b34c19-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.199728 4909 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f3cc2ba7-eaa3-4656-8174-a37340b34c19-client-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:24 
crc kubenswrapper[4909]: I0202 10:34:24.199738 4909 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d499336c-c3a2-476d-b5d0-41ad4513cfec-client-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.199751 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhkd7\" (UniqueName: \"kubernetes.io/projected/d499336c-c3a2-476d-b5d0-41ad4513cfec-kube-api-access-zhkd7\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.199764 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3cc2ba7-eaa3-4656-8174-a37340b34c19-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.199775 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xqqt\" (UniqueName: \"kubernetes.io/projected/f3cc2ba7-eaa3-4656-8174-a37340b34c19-kube-api-access-8xqqt\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.203315 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d50641a-3b94-4d0a-9d73-7be005b9a395-serving-cert\") pod \"route-controller-manager-7d56d5955c-27w99\" (UID: \"2d50641a-3b94-4d0a-9d73-7be005b9a395\") " pod="openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.215047 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b545\" (UniqueName: \"kubernetes.io/projected/2d50641a-3b94-4d0a-9d73-7be005b9a395-kube-api-access-5b545\") pod \"route-controller-manager-7d56d5955c-27w99\" (UID: \"2d50641a-3b94-4d0a-9d73-7be005b9a395\") " pod="openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 
10:34:24.269095 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.345071 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv" event={"ID":"f3cc2ba7-eaa3-4656-8174-a37340b34c19","Type":"ContainerDied","Data":"23213c7857f76e0f8ea7e7fa5ee96861ee7c2b81e0b0e132f3349fb1eb178e36"} Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.345121 4909 scope.go:117] "RemoveContainer" containerID="96408de21a034f0be028f0f012955af1a16e9e8b154024822b5e4a5f1a2a7b98" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.345165 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.352969 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c6744c7c6-hd4cv" event={"ID":"d499336c-c3a2-476d-b5d0-41ad4513cfec","Type":"ContainerDied","Data":"048ad216a4f774c1bf3988cf2d2eec12aef7f71ca35174f5da019d6dfcaed85c"} Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.353266 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-c6744c7c6-hd4cv" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.379206 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-72ltv" podStartSLOduration=4.4377555 podStartE2EDuration="41.379173793s" podCreationTimestamp="2026-02-02 10:33:43 +0000 UTC" firstStartedPulling="2026-02-02 10:33:45.876175376 +0000 UTC m=+151.622276111" lastFinishedPulling="2026-02-02 10:34:22.817593669 +0000 UTC m=+188.563694404" observedRunningTime="2026-02-02 10:34:24.377369461 +0000 UTC m=+190.123470186" watchObservedRunningTime="2026-02-02 10:34:24.379173793 +0000 UTC m=+190.125274518" Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.392679 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-c6744c7c6-hd4cv"] Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.401378 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-c6744c7c6-hd4cv"] Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.405320 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv"] Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.408502 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c97956df5-ncggv"] Feb 02 10:34:24 crc kubenswrapper[4909]: I0202 10:34:24.899265 4909 scope.go:117] "RemoveContainer" containerID="251f0f99cbf9de1bfb87c2e8ecf8629274f41a73872068079fac144e4f406f7a" Feb 02 10:34:25 crc kubenswrapper[4909]: I0202 10:34:25.008309 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ddclx"] Feb 02 10:34:25 crc kubenswrapper[4909]: I0202 10:34:25.028947 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d499336c-c3a2-476d-b5d0-41ad4513cfec" path="/var/lib/kubelet/pods/d499336c-c3a2-476d-b5d0-41ad4513cfec/volumes" Feb 02 10:34:25 crc kubenswrapper[4909]: I0202 10:34:25.029512 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3cc2ba7-eaa3-4656-8174-a37340b34c19" path="/var/lib/kubelet/pods/f3cc2ba7-eaa3-4656-8174-a37340b34c19/volumes" Feb 02 10:34:25 crc kubenswrapper[4909]: I0202 10:34:25.349121 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99"] Feb 02 10:34:25 crc kubenswrapper[4909]: W0202 10:34:25.370860 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d50641a_3b94_4d0a_9d73_7be005b9a395.slice/crio-136eec769a5dfb72996cf1c40999e54c0b11508b1d3a027bf4737939f06f43dc WatchSource:0}: Error finding container 136eec769a5dfb72996cf1c40999e54c0b11508b1d3a027bf4737939f06f43dc: Status 404 returned error can't find the container with id 136eec769a5dfb72996cf1c40999e54c0b11508b1d3a027bf4737939f06f43dc Feb 02 10:34:25 crc kubenswrapper[4909]: I0202 10:34:25.379469 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3b15359b-7d40-44fd-91e3-742c7ed025f7","Type":"ContainerStarted","Data":"a1f021aee4f32a7d4b304f7510c042e2d507c31c20cbffa65c220561a9173602"} Feb 02 10:34:25 crc kubenswrapper[4909]: I0202 10:34:25.407063 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=3.407039105 podStartE2EDuration="3.407039105s" podCreationTimestamp="2026-02-02 10:34:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:25.399738586 +0000 UTC m=+191.145839321" watchObservedRunningTime="2026-02-02 10:34:25.407039105 +0000 UTC 
m=+191.153139840" Feb 02 10:34:25 crc kubenswrapper[4909]: I0202 10:34:25.440451 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-bxkq4" podUID="7986860a-5f33-47fb-af58-a2925c4572a4" containerName="registry-server" probeResult="failure" output=< Feb 02 10:34:25 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Feb 02 10:34:25 crc kubenswrapper[4909]: > Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.402972 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99" event={"ID":"2d50641a-3b94-4d0a-9d73-7be005b9a395","Type":"ContainerStarted","Data":"b47d02dda8a78823a1afc5a2e2ee50f690dfa3af87f0498872aa2f8b9a2df4e9"} Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.403661 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99" event={"ID":"2d50641a-3b94-4d0a-9d73-7be005b9a395","Type":"ContainerStarted","Data":"136eec769a5dfb72996cf1c40999e54c0b11508b1d3a027bf4737939f06f43dc"} Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.404019 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99" Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.407358 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwvb8" event={"ID":"62300332-ecea-47ea-9809-6cc89e9593bf","Type":"ContainerStarted","Data":"c3252d7af68d94ad637af95ad67c451bf55023486e3f8bae94e33545f65d62ac"} Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.410331 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x94kg" 
event={"ID":"65681cb4-6a6f-4fce-8322-b6efffeecc78","Type":"ContainerStarted","Data":"7aaffea257a196b278c9b906495218b5885030d05463dd37e7d03d47bc02061d"} Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.414336 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vn7ln" event={"ID":"9483225f-edd3-4728-8e95-67f872692af9","Type":"ContainerStarted","Data":"87395505e45e0fa9c6d0c5b4105d2959b5928c2bf1d6cf5e4f8684baf8f3d33b"} Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.414486 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99" Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.417276 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qpc6" event={"ID":"20869d58-911c-44ab-8f33-07ffc1056b3b","Type":"ContainerStarted","Data":"41298415446fe6302e48ab3ed17896c921c3e1f955919a0ae503353e8157769b"} Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.419334 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3b15359b-7d40-44fd-91e3-742c7ed025f7","Type":"ContainerDied","Data":"a1f021aee4f32a7d4b304f7510c042e2d507c31c20cbffa65c220561a9173602"} Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.419399 4909 generic.go:334] "Generic (PLEG): container finished" podID="3b15359b-7d40-44fd-91e3-742c7ed025f7" containerID="a1f021aee4f32a7d4b304f7510c042e2d507c31c20cbffa65c220561a9173602" exitCode=0 Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.477636 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99" podStartSLOduration=4.477618531 podStartE2EDuration="4.477618531s" podCreationTimestamp="2026-02-02 10:34:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:26.456733833 +0000 UTC m=+192.202834568" watchObservedRunningTime="2026-02-02 10:34:26.477618531 +0000 UTC m=+192.223719266" Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.479374 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2qpc6" podStartSLOduration=3.063524444 podStartE2EDuration="40.479367791s" podCreationTimestamp="2026-02-02 10:33:46 +0000 UTC" firstStartedPulling="2026-02-02 10:33:47.932745415 +0000 UTC m=+153.678846150" lastFinishedPulling="2026-02-02 10:34:25.348588762 +0000 UTC m=+191.094689497" observedRunningTime="2026-02-02 10:34:26.475670835 +0000 UTC m=+192.221771570" watchObservedRunningTime="2026-02-02 10:34:26.479367791 +0000 UTC m=+192.225468526" Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.507637 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gwvb8" podStartSLOduration=2.981723028 podStartE2EDuration="43.50761782s" podCreationTimestamp="2026-02-02 10:33:43 +0000 UTC" firstStartedPulling="2026-02-02 10:33:44.821619139 +0000 UTC m=+150.567719874" lastFinishedPulling="2026-02-02 10:34:25.347513931 +0000 UTC m=+191.093614666" observedRunningTime="2026-02-02 10:34:26.503207293 +0000 UTC m=+192.249308038" watchObservedRunningTime="2026-02-02 10:34:26.50761782 +0000 UTC m=+192.253718565" Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.531892 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-x94kg" podStartSLOduration=4.090193101 podStartE2EDuration="40.531873124s" podCreationTimestamp="2026-02-02 10:33:46 +0000 UTC" firstStartedPulling="2026-02-02 10:33:48.975574535 +0000 UTC m=+154.721675270" lastFinishedPulling="2026-02-02 10:34:25.417254558 +0000 UTC m=+191.163355293" observedRunningTime="2026-02-02 10:34:26.527379696 +0000 
UTC m=+192.273480431" watchObservedRunningTime="2026-02-02 10:34:26.531873124 +0000 UTC m=+192.277973869" Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.675783 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2qpc6" Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.675849 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2qpc6" Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.808406 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vn7ln" podStartSLOduration=4.454961033 podStartE2EDuration="41.808390732s" podCreationTimestamp="2026-02-02 10:33:45 +0000 UTC" firstStartedPulling="2026-02-02 10:33:46.911115721 +0000 UTC m=+152.657216456" lastFinishedPulling="2026-02-02 10:34:24.26454542 +0000 UTC m=+190.010646155" observedRunningTime="2026-02-02 10:34:26.633149874 +0000 UTC m=+192.379250609" watchObservedRunningTime="2026-02-02 10:34:26.808390732 +0000 UTC m=+192.554491467" Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.809042 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg"] Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.809654 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg" Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.811355 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.812943 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.813643 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.814168 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.814540 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.817928 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.820232 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.826719 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg"] Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.944098 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0625823-24a7-4ccb-a37e-57f8307cb09f-serving-cert\") pod \"controller-manager-6cc4f874d5-jcgcg\" (UID: \"b0625823-24a7-4ccb-a37e-57f8307cb09f\") " 
pod="openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg" Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.944190 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b0625823-24a7-4ccb-a37e-57f8307cb09f-client-ca\") pod \"controller-manager-6cc4f874d5-jcgcg\" (UID: \"b0625823-24a7-4ccb-a37e-57f8307cb09f\") " pod="openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg" Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.944224 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnmrp\" (UniqueName: \"kubernetes.io/projected/b0625823-24a7-4ccb-a37e-57f8307cb09f-kube-api-access-tnmrp\") pod \"controller-manager-6cc4f874d5-jcgcg\" (UID: \"b0625823-24a7-4ccb-a37e-57f8307cb09f\") " pod="openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg" Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.944439 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0625823-24a7-4ccb-a37e-57f8307cb09f-config\") pod \"controller-manager-6cc4f874d5-jcgcg\" (UID: \"b0625823-24a7-4ccb-a37e-57f8307cb09f\") " pod="openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg" Feb 02 10:34:26 crc kubenswrapper[4909]: I0202 10:34:26.944629 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b0625823-24a7-4ccb-a37e-57f8307cb09f-proxy-ca-bundles\") pod \"controller-manager-6cc4f874d5-jcgcg\" (UID: \"b0625823-24a7-4ccb-a37e-57f8307cb09f\") " pod="openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg" Feb 02 10:34:27 crc kubenswrapper[4909]: I0202 10:34:27.045330 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/b0625823-24a7-4ccb-a37e-57f8307cb09f-proxy-ca-bundles\") pod \"controller-manager-6cc4f874d5-jcgcg\" (UID: \"b0625823-24a7-4ccb-a37e-57f8307cb09f\") " pod="openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg" Feb 02 10:34:27 crc kubenswrapper[4909]: I0202 10:34:27.045397 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0625823-24a7-4ccb-a37e-57f8307cb09f-serving-cert\") pod \"controller-manager-6cc4f874d5-jcgcg\" (UID: \"b0625823-24a7-4ccb-a37e-57f8307cb09f\") " pod="openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg" Feb 02 10:34:27 crc kubenswrapper[4909]: I0202 10:34:27.045432 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b0625823-24a7-4ccb-a37e-57f8307cb09f-client-ca\") pod \"controller-manager-6cc4f874d5-jcgcg\" (UID: \"b0625823-24a7-4ccb-a37e-57f8307cb09f\") " pod="openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg" Feb 02 10:34:27 crc kubenswrapper[4909]: I0202 10:34:27.045451 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnmrp\" (UniqueName: \"kubernetes.io/projected/b0625823-24a7-4ccb-a37e-57f8307cb09f-kube-api-access-tnmrp\") pod \"controller-manager-6cc4f874d5-jcgcg\" (UID: \"b0625823-24a7-4ccb-a37e-57f8307cb09f\") " pod="openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg" Feb 02 10:34:27 crc kubenswrapper[4909]: I0202 10:34:27.045482 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0625823-24a7-4ccb-a37e-57f8307cb09f-config\") pod \"controller-manager-6cc4f874d5-jcgcg\" (UID: \"b0625823-24a7-4ccb-a37e-57f8307cb09f\") " pod="openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg" Feb 02 10:34:27 crc kubenswrapper[4909]: I0202 10:34:27.047308 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b0625823-24a7-4ccb-a37e-57f8307cb09f-proxy-ca-bundles\") pod \"controller-manager-6cc4f874d5-jcgcg\" (UID: \"b0625823-24a7-4ccb-a37e-57f8307cb09f\") " pod="openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg" Feb 02 10:34:27 crc kubenswrapper[4909]: I0202 10:34:27.047625 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b0625823-24a7-4ccb-a37e-57f8307cb09f-client-ca\") pod \"controller-manager-6cc4f874d5-jcgcg\" (UID: \"b0625823-24a7-4ccb-a37e-57f8307cb09f\") " pod="openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg" Feb 02 10:34:27 crc kubenswrapper[4909]: I0202 10:34:27.047793 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0625823-24a7-4ccb-a37e-57f8307cb09f-config\") pod \"controller-manager-6cc4f874d5-jcgcg\" (UID: \"b0625823-24a7-4ccb-a37e-57f8307cb09f\") " pod="openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg" Feb 02 10:34:27 crc kubenswrapper[4909]: I0202 10:34:27.055620 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0625823-24a7-4ccb-a37e-57f8307cb09f-serving-cert\") pod \"controller-manager-6cc4f874d5-jcgcg\" (UID: \"b0625823-24a7-4ccb-a37e-57f8307cb09f\") " pod="openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg" Feb 02 10:34:27 crc kubenswrapper[4909]: I0202 10:34:27.068679 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnmrp\" (UniqueName: \"kubernetes.io/projected/b0625823-24a7-4ccb-a37e-57f8307cb09f-kube-api-access-tnmrp\") pod \"controller-manager-6cc4f874d5-jcgcg\" (UID: \"b0625823-24a7-4ccb-a37e-57f8307cb09f\") " pod="openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg" Feb 02 
10:34:27 crc kubenswrapper[4909]: I0202 10:34:27.103546 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-x94kg" Feb 02 10:34:27 crc kubenswrapper[4909]: I0202 10:34:27.103609 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-x94kg" Feb 02 10:34:27 crc kubenswrapper[4909]: I0202 10:34:27.171028 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg" Feb 02 10:34:27 crc kubenswrapper[4909]: I0202 10:34:27.393213 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg"] Feb 02 10:34:27 crc kubenswrapper[4909]: I0202 10:34:27.435758 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg" event={"ID":"b0625823-24a7-4ccb-a37e-57f8307cb09f","Type":"ContainerStarted","Data":"c53532f7de9b0bfe0e705234c79fc5398fa121ab57c2f2e96b9aeaf1aa05a89a"} Feb 02 10:34:27 crc kubenswrapper[4909]: I0202 10:34:27.727374 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2qpc6" podUID="20869d58-911c-44ab-8f33-07ffc1056b3b" containerName="registry-server" probeResult="failure" output=< Feb 02 10:34:27 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Feb 02 10:34:27 crc kubenswrapper[4909]: > Feb 02 10:34:27 crc kubenswrapper[4909]: I0202 10:34:27.737357 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 10:34:27 crc kubenswrapper[4909]: I0202 10:34:27.855793 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b15359b-7d40-44fd-91e3-742c7ed025f7-kubelet-dir\") pod \"3b15359b-7d40-44fd-91e3-742c7ed025f7\" (UID: \"3b15359b-7d40-44fd-91e3-742c7ed025f7\") " Feb 02 10:34:27 crc kubenswrapper[4909]: I0202 10:34:27.855928 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b15359b-7d40-44fd-91e3-742c7ed025f7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3b15359b-7d40-44fd-91e3-742c7ed025f7" (UID: "3b15359b-7d40-44fd-91e3-742c7ed025f7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:34:27 crc kubenswrapper[4909]: I0202 10:34:27.855968 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b15359b-7d40-44fd-91e3-742c7ed025f7-kube-api-access\") pod \"3b15359b-7d40-44fd-91e3-742c7ed025f7\" (UID: \"3b15359b-7d40-44fd-91e3-742c7ed025f7\") " Feb 02 10:34:27 crc kubenswrapper[4909]: I0202 10:34:27.856332 4909 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b15359b-7d40-44fd-91e3-742c7ed025f7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:27 crc kubenswrapper[4909]: I0202 10:34:27.865782 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b15359b-7d40-44fd-91e3-742c7ed025f7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3b15359b-7d40-44fd-91e3-742c7ed025f7" (UID: "3b15359b-7d40-44fd-91e3-742c7ed025f7"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:34:27 crc kubenswrapper[4909]: I0202 10:34:27.958054 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b15359b-7d40-44fd-91e3-742c7ed025f7-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:28 crc kubenswrapper[4909]: I0202 10:34:28.152359 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-x94kg" podUID="65681cb4-6a6f-4fce-8322-b6efffeecc78" containerName="registry-server" probeResult="failure" output=< Feb 02 10:34:28 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Feb 02 10:34:28 crc kubenswrapper[4909]: > Feb 02 10:34:28 crc kubenswrapper[4909]: I0202 10:34:28.404580 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 02 10:34:28 crc kubenswrapper[4909]: E0202 10:34:28.404864 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b15359b-7d40-44fd-91e3-742c7ed025f7" containerName="pruner" Feb 02 10:34:28 crc kubenswrapper[4909]: I0202 10:34:28.404877 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b15359b-7d40-44fd-91e3-742c7ed025f7" containerName="pruner" Feb 02 10:34:28 crc kubenswrapper[4909]: I0202 10:34:28.404983 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b15359b-7d40-44fd-91e3-742c7ed025f7" containerName="pruner" Feb 02 10:34:28 crc kubenswrapper[4909]: I0202 10:34:28.405341 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:34:28 crc kubenswrapper[4909]: I0202 10:34:28.420584 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 02 10:34:28 crc kubenswrapper[4909]: I0202 10:34:28.448864 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg" event={"ID":"b0625823-24a7-4ccb-a37e-57f8307cb09f","Type":"ContainerStarted","Data":"0c9cd46f9045b1a5719a0334c989c3171daa5427d3f45466b412eeff50ce12cc"} Feb 02 10:34:28 crc kubenswrapper[4909]: I0202 10:34:28.451536 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"3b15359b-7d40-44fd-91e3-742c7ed025f7","Type":"ContainerDied","Data":"73de77f1a53acda002647dc790b459a7b4eefdfa671d9d6d4dc5ef21e7b344ca"} Feb 02 10:34:28 crc kubenswrapper[4909]: I0202 10:34:28.451612 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73de77f1a53acda002647dc790b459a7b4eefdfa671d9d6d4dc5ef21e7b344ca" Feb 02 10:34:28 crc kubenswrapper[4909]: I0202 10:34:28.451640 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 02 10:34:28 crc kubenswrapper[4909]: I0202 10:34:28.475851 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg" podStartSLOduration=6.475789456 podStartE2EDuration="6.475789456s" podCreationTimestamp="2026-02-02 10:34:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:28.473308135 +0000 UTC m=+194.219408870" watchObservedRunningTime="2026-02-02 10:34:28.475789456 +0000 UTC m=+194.221890191" Feb 02 10:34:28 crc kubenswrapper[4909]: I0202 10:34:28.568726 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc3e167c-a86d-4fb1-baf2-98233797a107-kube-api-access\") pod \"installer-9-crc\" (UID: \"dc3e167c-a86d-4fb1-baf2-98233797a107\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:34:28 crc kubenswrapper[4909]: I0202 10:34:28.568829 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dc3e167c-a86d-4fb1-baf2-98233797a107-var-lock\") pod \"installer-9-crc\" (UID: \"dc3e167c-a86d-4fb1-baf2-98233797a107\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:34:28 crc kubenswrapper[4909]: I0202 10:34:28.568878 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc3e167c-a86d-4fb1-baf2-98233797a107-kubelet-dir\") pod \"installer-9-crc\" (UID: \"dc3e167c-a86d-4fb1-baf2-98233797a107\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:34:28 crc kubenswrapper[4909]: I0202 10:34:28.670347 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc3e167c-a86d-4fb1-baf2-98233797a107-kubelet-dir\") pod \"installer-9-crc\" (UID: \"dc3e167c-a86d-4fb1-baf2-98233797a107\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:34:28 crc kubenswrapper[4909]: I0202 10:34:28.670469 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc3e167c-a86d-4fb1-baf2-98233797a107-kube-api-access\") pod \"installer-9-crc\" (UID: \"dc3e167c-a86d-4fb1-baf2-98233797a107\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:34:28 crc kubenswrapper[4909]: I0202 10:34:28.670516 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dc3e167c-a86d-4fb1-baf2-98233797a107-var-lock\") pod \"installer-9-crc\" (UID: \"dc3e167c-a86d-4fb1-baf2-98233797a107\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:34:28 crc kubenswrapper[4909]: I0202 10:34:28.670509 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc3e167c-a86d-4fb1-baf2-98233797a107-kubelet-dir\") pod \"installer-9-crc\" (UID: \"dc3e167c-a86d-4fb1-baf2-98233797a107\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:34:28 crc kubenswrapper[4909]: I0202 10:34:28.671068 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dc3e167c-a86d-4fb1-baf2-98233797a107-var-lock\") pod \"installer-9-crc\" (UID: \"dc3e167c-a86d-4fb1-baf2-98233797a107\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:34:28 crc kubenswrapper[4909]: I0202 10:34:28.693671 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc3e167c-a86d-4fb1-baf2-98233797a107-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"dc3e167c-a86d-4fb1-baf2-98233797a107\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:34:28 crc kubenswrapper[4909]: I0202 10:34:28.718901 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:34:29 crc kubenswrapper[4909]: I0202 10:34:29.137105 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 02 10:34:29 crc kubenswrapper[4909]: I0202 10:34:29.469320 4909 generic.go:334] "Generic (PLEG): container finished" podID="7164d60d-218c-47e0-a74a-677793e589b0" containerID="36a34c9180ca4fb5850058b6c97fcc54de5ec05ad1814483e266ffd72db9f50e" exitCode=0 Feb 02 10:34:29 crc kubenswrapper[4909]: I0202 10:34:29.469403 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlhl9" event={"ID":"7164d60d-218c-47e0-a74a-677793e589b0","Type":"ContainerDied","Data":"36a34c9180ca4fb5850058b6c97fcc54de5ec05ad1814483e266ffd72db9f50e"} Feb 02 10:34:29 crc kubenswrapper[4909]: I0202 10:34:29.474434 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"dc3e167c-a86d-4fb1-baf2-98233797a107","Type":"ContainerStarted","Data":"114e95411d95016419b44923a8f1df7c9557dd1eaf16822d7272542330cd7111"} Feb 02 10:34:29 crc kubenswrapper[4909]: I0202 10:34:29.474508 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg" Feb 02 10:34:29 crc kubenswrapper[4909]: I0202 10:34:29.482547 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg" Feb 02 10:34:30 crc kubenswrapper[4909]: I0202 10:34:30.479204 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"dc3e167c-a86d-4fb1-baf2-98233797a107","Type":"ContainerStarted","Data":"b7b7cd03e7f10147debac05bafb26dea18dbce1c59b7c71462f1a0471a06653b"} Feb 02 10:34:30 crc kubenswrapper[4909]: I0202 10:34:30.494946 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.494931014 podStartE2EDuration="2.494931014s" podCreationTimestamp="2026-02-02 10:34:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:30.491038642 +0000 UTC m=+196.237139397" watchObservedRunningTime="2026-02-02 10:34:30.494931014 +0000 UTC m=+196.241031749" Feb 02 10:34:32 crc kubenswrapper[4909]: I0202 10:34:32.491113 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlhl9" event={"ID":"7164d60d-218c-47e0-a74a-677793e589b0","Type":"ContainerStarted","Data":"55959a236926f75800be59e6899fb48a94b11dd291e4d485c7f7a8b7aade6bd5"} Feb 02 10:34:33 crc kubenswrapper[4909]: I0202 10:34:33.515268 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mlhl9" podStartSLOduration=4.352027313 podStartE2EDuration="51.515250075s" podCreationTimestamp="2026-02-02 10:33:42 +0000 UTC" firstStartedPulling="2026-02-02 10:33:44.800131104 +0000 UTC m=+150.546231839" lastFinishedPulling="2026-02-02 10:34:31.963353866 +0000 UTC m=+197.709454601" observedRunningTime="2026-02-02 10:34:33.513944798 +0000 UTC m=+199.260045543" watchObservedRunningTime="2026-02-02 10:34:33.515250075 +0000 UTC m=+199.261350810" Feb 02 10:34:33 crc kubenswrapper[4909]: I0202 10:34:33.680508 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gwvb8" Feb 02 10:34:33 crc kubenswrapper[4909]: I0202 10:34:33.680562 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-gwvb8" Feb 02 10:34:33 crc kubenswrapper[4909]: I0202 10:34:33.764690 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gwvb8" Feb 02 10:34:33 crc kubenswrapper[4909]: I0202 10:34:33.887000 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-72ltv" Feb 02 10:34:33 crc kubenswrapper[4909]: I0202 10:34:33.887056 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-72ltv" Feb 02 10:34:33 crc kubenswrapper[4909]: I0202 10:34:33.930874 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-72ltv" Feb 02 10:34:34 crc kubenswrapper[4909]: I0202 10:34:34.142085 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bxkq4" Feb 02 10:34:34 crc kubenswrapper[4909]: I0202 10:34:34.184825 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bxkq4" Feb 02 10:34:34 crc kubenswrapper[4909]: I0202 10:34:34.547084 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-72ltv" Feb 02 10:34:34 crc kubenswrapper[4909]: I0202 10:34:34.547379 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gwvb8" Feb 02 10:34:35 crc kubenswrapper[4909]: I0202 10:34:35.508914 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgxpg" event={"ID":"9818ea74-330d-4bd0-8931-91fef529ef29","Type":"ContainerStarted","Data":"f9bf48fa100974b52786100d06a870e205e392c85992ac5043530d091a045953"} Feb 02 10:34:35 crc kubenswrapper[4909]: I0202 10:34:35.636975 4909 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vn7ln" Feb 02 10:34:35 crc kubenswrapper[4909]: I0202 10:34:35.637044 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vn7ln" Feb 02 10:34:35 crc kubenswrapper[4909]: I0202 10:34:35.673868 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vn7ln" Feb 02 10:34:36 crc kubenswrapper[4909]: I0202 10:34:36.329215 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bxkq4"] Feb 02 10:34:36 crc kubenswrapper[4909]: I0202 10:34:36.329583 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bxkq4" podUID="7986860a-5f33-47fb-af58-a2925c4572a4" containerName="registry-server" containerID="cri-o://2e8067e3821be8c88d2728fa2f815cb671c795db2ade99e6a83765bb9cca8ea7" gracePeriod=2 Feb 02 10:34:36 crc kubenswrapper[4909]: E0202 10:34:36.504717 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7986860a_5f33_47fb_af58_a2925c4572a4.slice/crio-conmon-2e8067e3821be8c88d2728fa2f815cb671c795db2ade99e6a83765bb9cca8ea7.scope\": RecentStats: unable to find data in memory cache]" Feb 02 10:34:36 crc kubenswrapper[4909]: I0202 10:34:36.517526 4909 generic.go:334] "Generic (PLEG): container finished" podID="7986860a-5f33-47fb-af58-a2925c4572a4" containerID="2e8067e3821be8c88d2728fa2f815cb671c795db2ade99e6a83765bb9cca8ea7" exitCode=0 Feb 02 10:34:36 crc kubenswrapper[4909]: I0202 10:34:36.517737 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxkq4" 
event={"ID":"7986860a-5f33-47fb-af58-a2925c4572a4","Type":"ContainerDied","Data":"2e8067e3821be8c88d2728fa2f815cb671c795db2ade99e6a83765bb9cca8ea7"} Feb 02 10:34:36 crc kubenswrapper[4909]: I0202 10:34:36.521294 4909 generic.go:334] "Generic (PLEG): container finished" podID="9818ea74-330d-4bd0-8931-91fef529ef29" containerID="f9bf48fa100974b52786100d06a870e205e392c85992ac5043530d091a045953" exitCode=0 Feb 02 10:34:36 crc kubenswrapper[4909]: I0202 10:34:36.521349 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgxpg" event={"ID":"9818ea74-330d-4bd0-8931-91fef529ef29","Type":"ContainerDied","Data":"f9bf48fa100974b52786100d06a870e205e392c85992ac5043530d091a045953"} Feb 02 10:34:36 crc kubenswrapper[4909]: I0202 10:34:36.568283 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vn7ln" Feb 02 10:34:36 crc kubenswrapper[4909]: I0202 10:34:36.721874 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2qpc6" Feb 02 10:34:36 crc kubenswrapper[4909]: I0202 10:34:36.763342 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2qpc6" Feb 02 10:34:36 crc kubenswrapper[4909]: I0202 10:34:36.926009 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gwvb8"] Feb 02 10:34:37 crc kubenswrapper[4909]: I0202 10:34:37.152429 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-x94kg" Feb 02 10:34:37 crc kubenswrapper[4909]: I0202 10:34:37.199534 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-x94kg" Feb 02 10:34:37 crc kubenswrapper[4909]: I0202 10:34:37.329202 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bxkq4" Feb 02 10:34:37 crc kubenswrapper[4909]: I0202 10:34:37.501797 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw9mk\" (UniqueName: \"kubernetes.io/projected/7986860a-5f33-47fb-af58-a2925c4572a4-kube-api-access-tw9mk\") pod \"7986860a-5f33-47fb-af58-a2925c4572a4\" (UID: \"7986860a-5f33-47fb-af58-a2925c4572a4\") " Feb 02 10:34:37 crc kubenswrapper[4909]: I0202 10:34:37.501869 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7986860a-5f33-47fb-af58-a2925c4572a4-catalog-content\") pod \"7986860a-5f33-47fb-af58-a2925c4572a4\" (UID: \"7986860a-5f33-47fb-af58-a2925c4572a4\") " Feb 02 10:34:37 crc kubenswrapper[4909]: I0202 10:34:37.501934 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7986860a-5f33-47fb-af58-a2925c4572a4-utilities\") pod \"7986860a-5f33-47fb-af58-a2925c4572a4\" (UID: \"7986860a-5f33-47fb-af58-a2925c4572a4\") " Feb 02 10:34:37 crc kubenswrapper[4909]: I0202 10:34:37.502800 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7986860a-5f33-47fb-af58-a2925c4572a4-utilities" (OuterVolumeSpecName: "utilities") pod "7986860a-5f33-47fb-af58-a2925c4572a4" (UID: "7986860a-5f33-47fb-af58-a2925c4572a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:34:37 crc kubenswrapper[4909]: I0202 10:34:37.508350 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7986860a-5f33-47fb-af58-a2925c4572a4-kube-api-access-tw9mk" (OuterVolumeSpecName: "kube-api-access-tw9mk") pod "7986860a-5f33-47fb-af58-a2925c4572a4" (UID: "7986860a-5f33-47fb-af58-a2925c4572a4"). InnerVolumeSpecName "kube-api-access-tw9mk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:34:37 crc kubenswrapper[4909]: I0202 10:34:37.528027 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgxpg" event={"ID":"9818ea74-330d-4bd0-8931-91fef529ef29","Type":"ContainerStarted","Data":"4a58ec92b413320aadbaa6aa6ce7095a3484112cf9d1871ebc57d60df2de70ea"} Feb 02 10:34:37 crc kubenswrapper[4909]: I0202 10:34:37.531207 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bxkq4" Feb 02 10:34:37 crc kubenswrapper[4909]: I0202 10:34:37.531357 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bxkq4" event={"ID":"7986860a-5f33-47fb-af58-a2925c4572a4","Type":"ContainerDied","Data":"cfc8a81cdbddec4918fc6243e193940dc129a149cd7269f00198e88901d4725d"} Feb 02 10:34:37 crc kubenswrapper[4909]: I0202 10:34:37.531386 4909 scope.go:117] "RemoveContainer" containerID="2e8067e3821be8c88d2728fa2f815cb671c795db2ade99e6a83765bb9cca8ea7" Feb 02 10:34:37 crc kubenswrapper[4909]: I0202 10:34:37.531586 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gwvb8" podUID="62300332-ecea-47ea-9809-6cc89e9593bf" containerName="registry-server" containerID="cri-o://c3252d7af68d94ad637af95ad67c451bf55023486e3f8bae94e33545f65d62ac" gracePeriod=2 Feb 02 10:34:37 crc kubenswrapper[4909]: I0202 10:34:37.547098 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fgxpg" podStartSLOduration=2.402495178 podStartE2EDuration="52.547082512s" podCreationTimestamp="2026-02-02 10:33:45 +0000 UTC" firstStartedPulling="2026-02-02 10:33:46.917122473 +0000 UTC m=+152.663223208" lastFinishedPulling="2026-02-02 10:34:37.061709807 +0000 UTC m=+202.807810542" observedRunningTime="2026-02-02 10:34:37.545018323 +0000 UTC m=+203.291119058" 
watchObservedRunningTime="2026-02-02 10:34:37.547082512 +0000 UTC m=+203.293183247" Feb 02 10:34:37 crc kubenswrapper[4909]: I0202 10:34:37.555964 4909 scope.go:117] "RemoveContainer" containerID="d0177008ab3cf7a24f4da5a4e3899cca448d6e1911204e67585482592ea529c5" Feb 02 10:34:37 crc kubenswrapper[4909]: I0202 10:34:37.558027 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7986860a-5f33-47fb-af58-a2925c4572a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7986860a-5f33-47fb-af58-a2925c4572a4" (UID: "7986860a-5f33-47fb-af58-a2925c4572a4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:34:37 crc kubenswrapper[4909]: I0202 10:34:37.570285 4909 scope.go:117] "RemoveContainer" containerID="084647b93fb11309f8a05aef3463e8fc8476127d5db1e02be4f24bc9b0432307" Feb 02 10:34:37 crc kubenswrapper[4909]: I0202 10:34:37.603651 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw9mk\" (UniqueName: \"kubernetes.io/projected/7986860a-5f33-47fb-af58-a2925c4572a4-kube-api-access-tw9mk\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:37 crc kubenswrapper[4909]: I0202 10:34:37.603729 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7986860a-5f33-47fb-af58-a2925c4572a4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:37 crc kubenswrapper[4909]: I0202 10:34:37.604077 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7986860a-5f33-47fb-af58-a2925c4572a4-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:37 crc kubenswrapper[4909]: I0202 10:34:37.864923 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bxkq4"] Feb 02 10:34:37 crc kubenswrapper[4909]: I0202 10:34:37.867456 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-bxkq4"] Feb 02 10:34:37 crc kubenswrapper[4909]: I0202 10:34:37.914013 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gwvb8" Feb 02 10:34:38 crc kubenswrapper[4909]: I0202 10:34:38.012753 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62300332-ecea-47ea-9809-6cc89e9593bf-catalog-content\") pod \"62300332-ecea-47ea-9809-6cc89e9593bf\" (UID: \"62300332-ecea-47ea-9809-6cc89e9593bf\") " Feb 02 10:34:38 crc kubenswrapper[4909]: I0202 10:34:38.012823 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62300332-ecea-47ea-9809-6cc89e9593bf-utilities\") pod \"62300332-ecea-47ea-9809-6cc89e9593bf\" (UID: \"62300332-ecea-47ea-9809-6cc89e9593bf\") " Feb 02 10:34:38 crc kubenswrapper[4909]: I0202 10:34:38.012851 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nds6\" (UniqueName: \"kubernetes.io/projected/62300332-ecea-47ea-9809-6cc89e9593bf-kube-api-access-7nds6\") pod \"62300332-ecea-47ea-9809-6cc89e9593bf\" (UID: \"62300332-ecea-47ea-9809-6cc89e9593bf\") " Feb 02 10:34:38 crc kubenswrapper[4909]: I0202 10:34:38.013543 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62300332-ecea-47ea-9809-6cc89e9593bf-utilities" (OuterVolumeSpecName: "utilities") pod "62300332-ecea-47ea-9809-6cc89e9593bf" (UID: "62300332-ecea-47ea-9809-6cc89e9593bf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:34:38 crc kubenswrapper[4909]: I0202 10:34:38.017166 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62300332-ecea-47ea-9809-6cc89e9593bf-kube-api-access-7nds6" (OuterVolumeSpecName: "kube-api-access-7nds6") pod "62300332-ecea-47ea-9809-6cc89e9593bf" (UID: "62300332-ecea-47ea-9809-6cc89e9593bf"). InnerVolumeSpecName "kube-api-access-7nds6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:34:38 crc kubenswrapper[4909]: I0202 10:34:38.057578 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62300332-ecea-47ea-9809-6cc89e9593bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "62300332-ecea-47ea-9809-6cc89e9593bf" (UID: "62300332-ecea-47ea-9809-6cc89e9593bf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:34:38 crc kubenswrapper[4909]: I0202 10:34:38.114603 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62300332-ecea-47ea-9809-6cc89e9593bf-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:38 crc kubenswrapper[4909]: I0202 10:34:38.114641 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62300332-ecea-47ea-9809-6cc89e9593bf-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:38 crc kubenswrapper[4909]: I0202 10:34:38.114651 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nds6\" (UniqueName: \"kubernetes.io/projected/62300332-ecea-47ea-9809-6cc89e9593bf-kube-api-access-7nds6\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:38 crc kubenswrapper[4909]: I0202 10:34:38.537861 4909 generic.go:334] "Generic (PLEG): container finished" podID="62300332-ecea-47ea-9809-6cc89e9593bf" 
containerID="c3252d7af68d94ad637af95ad67c451bf55023486e3f8bae94e33545f65d62ac" exitCode=0 Feb 02 10:34:38 crc kubenswrapper[4909]: I0202 10:34:38.537925 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gwvb8" Feb 02 10:34:38 crc kubenswrapper[4909]: I0202 10:34:38.537939 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwvb8" event={"ID":"62300332-ecea-47ea-9809-6cc89e9593bf","Type":"ContainerDied","Data":"c3252d7af68d94ad637af95ad67c451bf55023486e3f8bae94e33545f65d62ac"} Feb 02 10:34:38 crc kubenswrapper[4909]: I0202 10:34:38.538240 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwvb8" event={"ID":"62300332-ecea-47ea-9809-6cc89e9593bf","Type":"ContainerDied","Data":"1a75680fe74618edb29fa32d4126a337774008ec0a6c95997c21076547980bf0"} Feb 02 10:34:38 crc kubenswrapper[4909]: I0202 10:34:38.538265 4909 scope.go:117] "RemoveContainer" containerID="c3252d7af68d94ad637af95ad67c451bf55023486e3f8bae94e33545f65d62ac" Feb 02 10:34:38 crc kubenswrapper[4909]: I0202 10:34:38.553788 4909 scope.go:117] "RemoveContainer" containerID="4f46b8c5a6eb5313b34b14ddc05afd2abbe800d99415a5172b4fb66dfdf23012" Feb 02 10:34:38 crc kubenswrapper[4909]: I0202 10:34:38.561021 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gwvb8"] Feb 02 10:34:38 crc kubenswrapper[4909]: I0202 10:34:38.569047 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gwvb8"] Feb 02 10:34:38 crc kubenswrapper[4909]: I0202 10:34:38.580879 4909 scope.go:117] "RemoveContainer" containerID="059509f5569ee76ec766d592f7902d5d3903a54858eee07a453a48f47469f8ac" Feb 02 10:34:38 crc kubenswrapper[4909]: I0202 10:34:38.593461 4909 scope.go:117] "RemoveContainer" containerID="c3252d7af68d94ad637af95ad67c451bf55023486e3f8bae94e33545f65d62ac" Feb 02 
10:34:38 crc kubenswrapper[4909]: E0202 10:34:38.593894 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3252d7af68d94ad637af95ad67c451bf55023486e3f8bae94e33545f65d62ac\": container with ID starting with c3252d7af68d94ad637af95ad67c451bf55023486e3f8bae94e33545f65d62ac not found: ID does not exist" containerID="c3252d7af68d94ad637af95ad67c451bf55023486e3f8bae94e33545f65d62ac" Feb 02 10:34:38 crc kubenswrapper[4909]: I0202 10:34:38.593940 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3252d7af68d94ad637af95ad67c451bf55023486e3f8bae94e33545f65d62ac"} err="failed to get container status \"c3252d7af68d94ad637af95ad67c451bf55023486e3f8bae94e33545f65d62ac\": rpc error: code = NotFound desc = could not find container \"c3252d7af68d94ad637af95ad67c451bf55023486e3f8bae94e33545f65d62ac\": container with ID starting with c3252d7af68d94ad637af95ad67c451bf55023486e3f8bae94e33545f65d62ac not found: ID does not exist" Feb 02 10:34:38 crc kubenswrapper[4909]: I0202 10:34:38.593994 4909 scope.go:117] "RemoveContainer" containerID="4f46b8c5a6eb5313b34b14ddc05afd2abbe800d99415a5172b4fb66dfdf23012" Feb 02 10:34:38 crc kubenswrapper[4909]: E0202 10:34:38.594401 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f46b8c5a6eb5313b34b14ddc05afd2abbe800d99415a5172b4fb66dfdf23012\": container with ID starting with 4f46b8c5a6eb5313b34b14ddc05afd2abbe800d99415a5172b4fb66dfdf23012 not found: ID does not exist" containerID="4f46b8c5a6eb5313b34b14ddc05afd2abbe800d99415a5172b4fb66dfdf23012" Feb 02 10:34:38 crc kubenswrapper[4909]: I0202 10:34:38.594531 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f46b8c5a6eb5313b34b14ddc05afd2abbe800d99415a5172b4fb66dfdf23012"} err="failed to get container status 
\"4f46b8c5a6eb5313b34b14ddc05afd2abbe800d99415a5172b4fb66dfdf23012\": rpc error: code = NotFound desc = could not find container \"4f46b8c5a6eb5313b34b14ddc05afd2abbe800d99415a5172b4fb66dfdf23012\": container with ID starting with 4f46b8c5a6eb5313b34b14ddc05afd2abbe800d99415a5172b4fb66dfdf23012 not found: ID does not exist" Feb 02 10:34:38 crc kubenswrapper[4909]: I0202 10:34:38.594642 4909 scope.go:117] "RemoveContainer" containerID="059509f5569ee76ec766d592f7902d5d3903a54858eee07a453a48f47469f8ac" Feb 02 10:34:38 crc kubenswrapper[4909]: E0202 10:34:38.595073 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"059509f5569ee76ec766d592f7902d5d3903a54858eee07a453a48f47469f8ac\": container with ID starting with 059509f5569ee76ec766d592f7902d5d3903a54858eee07a453a48f47469f8ac not found: ID does not exist" containerID="059509f5569ee76ec766d592f7902d5d3903a54858eee07a453a48f47469f8ac" Feb 02 10:34:38 crc kubenswrapper[4909]: I0202 10:34:38.595109 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"059509f5569ee76ec766d592f7902d5d3903a54858eee07a453a48f47469f8ac"} err="failed to get container status \"059509f5569ee76ec766d592f7902d5d3903a54858eee07a453a48f47469f8ac\": rpc error: code = NotFound desc = could not find container \"059509f5569ee76ec766d592f7902d5d3903a54858eee07a453a48f47469f8ac\": container with ID starting with 059509f5569ee76ec766d592f7902d5d3903a54858eee07a453a48f47469f8ac not found: ID does not exist" Feb 02 10:34:39 crc kubenswrapper[4909]: I0202 10:34:39.024642 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62300332-ecea-47ea-9809-6cc89e9593bf" path="/var/lib/kubelet/pods/62300332-ecea-47ea-9809-6cc89e9593bf/volumes" Feb 02 10:34:39 crc kubenswrapper[4909]: I0202 10:34:39.026787 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7986860a-5f33-47fb-af58-a2925c4572a4" 
path="/var/lib/kubelet/pods/7986860a-5f33-47fb-af58-a2925c4572a4/volumes" Feb 02 10:34:40 crc kubenswrapper[4909]: I0202 10:34:40.727705 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x94kg"] Feb 02 10:34:40 crc kubenswrapper[4909]: I0202 10:34:40.728279 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-x94kg" podUID="65681cb4-6a6f-4fce-8322-b6efffeecc78" containerName="registry-server" containerID="cri-o://7aaffea257a196b278c9b906495218b5885030d05463dd37e7d03d47bc02061d" gracePeriod=2 Feb 02 10:34:41 crc kubenswrapper[4909]: I0202 10:34:41.558617 4909 generic.go:334] "Generic (PLEG): container finished" podID="65681cb4-6a6f-4fce-8322-b6efffeecc78" containerID="7aaffea257a196b278c9b906495218b5885030d05463dd37e7d03d47bc02061d" exitCode=0 Feb 02 10:34:41 crc kubenswrapper[4909]: I0202 10:34:41.558778 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x94kg" event={"ID":"65681cb4-6a6f-4fce-8322-b6efffeecc78","Type":"ContainerDied","Data":"7aaffea257a196b278c9b906495218b5885030d05463dd37e7d03d47bc02061d"} Feb 02 10:34:41 crc kubenswrapper[4909]: I0202 10:34:41.712948 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x94kg" Feb 02 10:34:41 crc kubenswrapper[4909]: I0202 10:34:41.760228 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65681cb4-6a6f-4fce-8322-b6efffeecc78-catalog-content\") pod \"65681cb4-6a6f-4fce-8322-b6efffeecc78\" (UID: \"65681cb4-6a6f-4fce-8322-b6efffeecc78\") " Feb 02 10:34:41 crc kubenswrapper[4909]: I0202 10:34:41.760314 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65681cb4-6a6f-4fce-8322-b6efffeecc78-utilities\") pod \"65681cb4-6a6f-4fce-8322-b6efffeecc78\" (UID: \"65681cb4-6a6f-4fce-8322-b6efffeecc78\") " Feb 02 10:34:41 crc kubenswrapper[4909]: I0202 10:34:41.760405 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6smm4\" (UniqueName: \"kubernetes.io/projected/65681cb4-6a6f-4fce-8322-b6efffeecc78-kube-api-access-6smm4\") pod \"65681cb4-6a6f-4fce-8322-b6efffeecc78\" (UID: \"65681cb4-6a6f-4fce-8322-b6efffeecc78\") " Feb 02 10:34:41 crc kubenswrapper[4909]: I0202 10:34:41.761188 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65681cb4-6a6f-4fce-8322-b6efffeecc78-utilities" (OuterVolumeSpecName: "utilities") pod "65681cb4-6a6f-4fce-8322-b6efffeecc78" (UID: "65681cb4-6a6f-4fce-8322-b6efffeecc78"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:34:41 crc kubenswrapper[4909]: I0202 10:34:41.767092 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65681cb4-6a6f-4fce-8322-b6efffeecc78-kube-api-access-6smm4" (OuterVolumeSpecName: "kube-api-access-6smm4") pod "65681cb4-6a6f-4fce-8322-b6efffeecc78" (UID: "65681cb4-6a6f-4fce-8322-b6efffeecc78"). InnerVolumeSpecName "kube-api-access-6smm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:34:41 crc kubenswrapper[4909]: I0202 10:34:41.862710 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65681cb4-6a6f-4fce-8322-b6efffeecc78-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:41 crc kubenswrapper[4909]: I0202 10:34:41.862758 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6smm4\" (UniqueName: \"kubernetes.io/projected/65681cb4-6a6f-4fce-8322-b6efffeecc78-kube-api-access-6smm4\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:41 crc kubenswrapper[4909]: I0202 10:34:41.876276 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65681cb4-6a6f-4fce-8322-b6efffeecc78-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65681cb4-6a6f-4fce-8322-b6efffeecc78" (UID: "65681cb4-6a6f-4fce-8322-b6efffeecc78"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:34:41 crc kubenswrapper[4909]: I0202 10:34:41.963407 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65681cb4-6a6f-4fce-8322-b6efffeecc78-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:42 crc kubenswrapper[4909]: I0202 10:34:42.491257 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg"] Feb 02 10:34:42 crc kubenswrapper[4909]: I0202 10:34:42.491510 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg" podUID="b0625823-24a7-4ccb-a37e-57f8307cb09f" containerName="controller-manager" containerID="cri-o://0c9cd46f9045b1a5719a0334c989c3171daa5427d3f45466b412eeff50ce12cc" gracePeriod=30 Feb 02 10:34:42 crc kubenswrapper[4909]: I0202 10:34:42.506186 4909 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99"] Feb 02 10:34:42 crc kubenswrapper[4909]: I0202 10:34:42.506442 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99" podUID="2d50641a-3b94-4d0a-9d73-7be005b9a395" containerName="route-controller-manager" containerID="cri-o://b47d02dda8a78823a1afc5a2e2ee50f690dfa3af87f0498872aa2f8b9a2df4e9" gracePeriod=30 Feb 02 10:34:42 crc kubenswrapper[4909]: I0202 10:34:42.567189 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x94kg" event={"ID":"65681cb4-6a6f-4fce-8322-b6efffeecc78","Type":"ContainerDied","Data":"f15a8822628645b4851610f112d8d76612fa8cf859599f7702659f914ec7d170"} Feb 02 10:34:42 crc kubenswrapper[4909]: I0202 10:34:42.567256 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x94kg" Feb 02 10:34:42 crc kubenswrapper[4909]: I0202 10:34:42.567271 4909 scope.go:117] "RemoveContainer" containerID="7aaffea257a196b278c9b906495218b5885030d05463dd37e7d03d47bc02061d" Feb 02 10:34:42 crc kubenswrapper[4909]: I0202 10:34:42.583032 4909 scope.go:117] "RemoveContainer" containerID="638a2bed8eb11ef192acb4759a1053e3da4c211ec3a998956e0ae1332f2b136d" Feb 02 10:34:42 crc kubenswrapper[4909]: I0202 10:34:42.596298 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x94kg"] Feb 02 10:34:42 crc kubenswrapper[4909]: I0202 10:34:42.599625 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-x94kg"] Feb 02 10:34:42 crc kubenswrapper[4909]: I0202 10:34:42.601253 4909 scope.go:117] "RemoveContainer" containerID="05f03a6492746357c30e6cd225e8e444732a077b005d4f5647476b3edff54d2c" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.022866 4909 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="65681cb4-6a6f-4fce-8322-b6efffeecc78" path="/var/lib/kubelet/pods/65681cb4-6a6f-4fce-8322-b6efffeecc78/volumes" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.327130 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mlhl9" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.341666 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mlhl9" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.376061 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mlhl9" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.577393 4909 generic.go:334] "Generic (PLEG): container finished" podID="b0625823-24a7-4ccb-a37e-57f8307cb09f" containerID="0c9cd46f9045b1a5719a0334c989c3171daa5427d3f45466b412eeff50ce12cc" exitCode=0 Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.577436 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg" event={"ID":"b0625823-24a7-4ccb-a37e-57f8307cb09f","Type":"ContainerDied","Data":"0c9cd46f9045b1a5719a0334c989c3171daa5427d3f45466b412eeff50ce12cc"} Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.579536 4909 generic.go:334] "Generic (PLEG): container finished" podID="2d50641a-3b94-4d0a-9d73-7be005b9a395" containerID="b47d02dda8a78823a1afc5a2e2ee50f690dfa3af87f0498872aa2f8b9a2df4e9" exitCode=0 Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.580536 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99" event={"ID":"2d50641a-3b94-4d0a-9d73-7be005b9a395","Type":"ContainerDied","Data":"b47d02dda8a78823a1afc5a2e2ee50f690dfa3af87f0498872aa2f8b9a2df4e9"} Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.580565 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99" event={"ID":"2d50641a-3b94-4d0a-9d73-7be005b9a395","Type":"ContainerDied","Data":"136eec769a5dfb72996cf1c40999e54c0b11508b1d3a027bf4737939f06f43dc"} Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.580580 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="136eec769a5dfb72996cf1c40999e54c0b11508b1d3a027bf4737939f06f43dc" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.587552 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.624885 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mlhl9" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.629261 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk"] Feb 02 10:34:43 crc kubenswrapper[4909]: E0202 10:34:43.629498 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7986860a-5f33-47fb-af58-a2925c4572a4" containerName="extract-utilities" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.629520 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7986860a-5f33-47fb-af58-a2925c4572a4" containerName="extract-utilities" Feb 02 10:34:43 crc kubenswrapper[4909]: E0202 10:34:43.629533 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62300332-ecea-47ea-9809-6cc89e9593bf" containerName="extract-content" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.629543 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="62300332-ecea-47ea-9809-6cc89e9593bf" containerName="extract-content" Feb 02 10:34:43 crc kubenswrapper[4909]: E0202 10:34:43.629553 4909 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7986860a-5f33-47fb-af58-a2925c4572a4" containerName="extract-content" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.629563 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7986860a-5f33-47fb-af58-a2925c4572a4" containerName="extract-content" Feb 02 10:34:43 crc kubenswrapper[4909]: E0202 10:34:43.629583 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62300332-ecea-47ea-9809-6cc89e9593bf" containerName="extract-utilities" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.629591 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="62300332-ecea-47ea-9809-6cc89e9593bf" containerName="extract-utilities" Feb 02 10:34:43 crc kubenswrapper[4909]: E0202 10:34:43.629602 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62300332-ecea-47ea-9809-6cc89e9593bf" containerName="registry-server" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.629610 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="62300332-ecea-47ea-9809-6cc89e9593bf" containerName="registry-server" Feb 02 10:34:43 crc kubenswrapper[4909]: E0202 10:34:43.629621 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7986860a-5f33-47fb-af58-a2925c4572a4" containerName="registry-server" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.629631 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7986860a-5f33-47fb-af58-a2925c4572a4" containerName="registry-server" Feb 02 10:34:43 crc kubenswrapper[4909]: E0202 10:34:43.629647 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65681cb4-6a6f-4fce-8322-b6efffeecc78" containerName="extract-content" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.629655 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="65681cb4-6a6f-4fce-8322-b6efffeecc78" containerName="extract-content" Feb 02 10:34:43 crc kubenswrapper[4909]: E0202 10:34:43.629667 4909 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65681cb4-6a6f-4fce-8322-b6efffeecc78" containerName="extract-utilities" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.629674 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="65681cb4-6a6f-4fce-8322-b6efffeecc78" containerName="extract-utilities" Feb 02 10:34:43 crc kubenswrapper[4909]: E0202 10:34:43.629685 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65681cb4-6a6f-4fce-8322-b6efffeecc78" containerName="registry-server" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.629694 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="65681cb4-6a6f-4fce-8322-b6efffeecc78" containerName="registry-server" Feb 02 10:34:43 crc kubenswrapper[4909]: E0202 10:34:43.629704 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d50641a-3b94-4d0a-9d73-7be005b9a395" containerName="route-controller-manager" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.629712 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d50641a-3b94-4d0a-9d73-7be005b9a395" containerName="route-controller-manager" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.630952 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="65681cb4-6a6f-4fce-8322-b6efffeecc78" containerName="registry-server" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.631063 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7986860a-5f33-47fb-af58-a2925c4572a4" containerName="registry-server" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.631086 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="62300332-ecea-47ea-9809-6cc89e9593bf" containerName="registry-server" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.631100 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d50641a-3b94-4d0a-9d73-7be005b9a395" containerName="route-controller-manager" Feb 02 10:34:43 crc 
kubenswrapper[4909]: I0202 10:34:43.632578 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.639144 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk"] Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.686045 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5b545\" (UniqueName: \"kubernetes.io/projected/2d50641a-3b94-4d0a-9d73-7be005b9a395-kube-api-access-5b545\") pod \"2d50641a-3b94-4d0a-9d73-7be005b9a395\" (UID: \"2d50641a-3b94-4d0a-9d73-7be005b9a395\") " Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.686121 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d50641a-3b94-4d0a-9d73-7be005b9a395-client-ca\") pod \"2d50641a-3b94-4d0a-9d73-7be005b9a395\" (UID: \"2d50641a-3b94-4d0a-9d73-7be005b9a395\") " Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.686161 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d50641a-3b94-4d0a-9d73-7be005b9a395-config\") pod \"2d50641a-3b94-4d0a-9d73-7be005b9a395\" (UID: \"2d50641a-3b94-4d0a-9d73-7be005b9a395\") " Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.686186 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d50641a-3b94-4d0a-9d73-7be005b9a395-serving-cert\") pod \"2d50641a-3b94-4d0a-9d73-7be005b9a395\" (UID: \"2d50641a-3b94-4d0a-9d73-7be005b9a395\") " Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.686413 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/20b0879a-ca51-45a3-8075-9443292e3a4a-client-ca\") pod \"route-controller-manager-796d6c9969-w45gk\" (UID: \"20b0879a-ca51-45a3-8075-9443292e3a4a\") " pod="openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.686440 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20b0879a-ca51-45a3-8075-9443292e3a4a-serving-cert\") pod \"route-controller-manager-796d6c9969-w45gk\" (UID: \"20b0879a-ca51-45a3-8075-9443292e3a4a\") " pod="openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.686472 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20b0879a-ca51-45a3-8075-9443292e3a4a-config\") pod \"route-controller-manager-796d6c9969-w45gk\" (UID: \"20b0879a-ca51-45a3-8075-9443292e3a4a\") " pod="openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.686514 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v87mn\" (UniqueName: \"kubernetes.io/projected/20b0879a-ca51-45a3-8075-9443292e3a4a-kube-api-access-v87mn\") pod \"route-controller-manager-796d6c9969-w45gk\" (UID: \"20b0879a-ca51-45a3-8075-9443292e3a4a\") " pod="openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.688216 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d50641a-3b94-4d0a-9d73-7be005b9a395-client-ca" (OuterVolumeSpecName: "client-ca") pod "2d50641a-3b94-4d0a-9d73-7be005b9a395" (UID: "2d50641a-3b94-4d0a-9d73-7be005b9a395"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.688514 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d50641a-3b94-4d0a-9d73-7be005b9a395-config" (OuterVolumeSpecName: "config") pod "2d50641a-3b94-4d0a-9d73-7be005b9a395" (UID: "2d50641a-3b94-4d0a-9d73-7be005b9a395"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.692002 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d50641a-3b94-4d0a-9d73-7be005b9a395-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2d50641a-3b94-4d0a-9d73-7be005b9a395" (UID: "2d50641a-3b94-4d0a-9d73-7be005b9a395"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.692056 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d50641a-3b94-4d0a-9d73-7be005b9a395-kube-api-access-5b545" (OuterVolumeSpecName: "kube-api-access-5b545") pod "2d50641a-3b94-4d0a-9d73-7be005b9a395" (UID: "2d50641a-3b94-4d0a-9d73-7be005b9a395"). InnerVolumeSpecName "kube-api-access-5b545". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.727799 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg"
Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.787015 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0625823-24a7-4ccb-a37e-57f8307cb09f-config\") pod \"b0625823-24a7-4ccb-a37e-57f8307cb09f\" (UID: \"b0625823-24a7-4ccb-a37e-57f8307cb09f\") "
Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.787063 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnmrp\" (UniqueName: \"kubernetes.io/projected/b0625823-24a7-4ccb-a37e-57f8307cb09f-kube-api-access-tnmrp\") pod \"b0625823-24a7-4ccb-a37e-57f8307cb09f\" (UID: \"b0625823-24a7-4ccb-a37e-57f8307cb09f\") "
Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.787208 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b0625823-24a7-4ccb-a37e-57f8307cb09f-proxy-ca-bundles\") pod \"b0625823-24a7-4ccb-a37e-57f8307cb09f\" (UID: \"b0625823-24a7-4ccb-a37e-57f8307cb09f\") "
Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.787991 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0625823-24a7-4ccb-a37e-57f8307cb09f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b0625823-24a7-4ccb-a37e-57f8307cb09f" (UID: "b0625823-24a7-4ccb-a37e-57f8307cb09f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.788039 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0625823-24a7-4ccb-a37e-57f8307cb09f-serving-cert\") pod \"b0625823-24a7-4ccb-a37e-57f8307cb09f\" (UID: \"b0625823-24a7-4ccb-a37e-57f8307cb09f\") "
Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.788077 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b0625823-24a7-4ccb-a37e-57f8307cb09f-client-ca\") pod \"b0625823-24a7-4ccb-a37e-57f8307cb09f\" (UID: \"b0625823-24a7-4ccb-a37e-57f8307cb09f\") "
Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.788085 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0625823-24a7-4ccb-a37e-57f8307cb09f-config" (OuterVolumeSpecName: "config") pod "b0625823-24a7-4ccb-a37e-57f8307cb09f" (UID: "b0625823-24a7-4ccb-a37e-57f8307cb09f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.788236 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v87mn\" (UniqueName: \"kubernetes.io/projected/20b0879a-ca51-45a3-8075-9443292e3a4a-kube-api-access-v87mn\") pod \"route-controller-manager-796d6c9969-w45gk\" (UID: \"20b0879a-ca51-45a3-8075-9443292e3a4a\") " pod="openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk"
Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.788350 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20b0879a-ca51-45a3-8075-9443292e3a4a-client-ca\") pod \"route-controller-manager-796d6c9969-w45gk\" (UID: \"20b0879a-ca51-45a3-8075-9443292e3a4a\") " pod="openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk"
Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.788371 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20b0879a-ca51-45a3-8075-9443292e3a4a-serving-cert\") pod \"route-controller-manager-796d6c9969-w45gk\" (UID: \"20b0879a-ca51-45a3-8075-9443292e3a4a\") " pod="openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk"
Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.788567 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20b0879a-ca51-45a3-8075-9443292e3a4a-config\") pod \"route-controller-manager-796d6c9969-w45gk\" (UID: \"20b0879a-ca51-45a3-8075-9443292e3a4a\") " pod="openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk"
Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.788704 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0625823-24a7-4ccb-a37e-57f8307cb09f-client-ca" (OuterVolumeSpecName: "client-ca") pod "b0625823-24a7-4ccb-a37e-57f8307cb09f" (UID: "b0625823-24a7-4ccb-a37e-57f8307cb09f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.789351 4909 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b0625823-24a7-4ccb-a37e-57f8307cb09f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.789385 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5b545\" (UniqueName: \"kubernetes.io/projected/2d50641a-3b94-4d0a-9d73-7be005b9a395-kube-api-access-5b545\") on node \"crc\" DevicePath \"\""
Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.789402 4909 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d50641a-3b94-4d0a-9d73-7be005b9a395-client-ca\") on node \"crc\" DevicePath \"\""
Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.789415 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d50641a-3b94-4d0a-9d73-7be005b9a395-config\") on node \"crc\" DevicePath \"\""
Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.789427 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d50641a-3b94-4d0a-9d73-7be005b9a395-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.789438 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0625823-24a7-4ccb-a37e-57f8307cb09f-config\") on node \"crc\" DevicePath \"\""
Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.789773 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20b0879a-ca51-45a3-8075-9443292e3a4a-client-ca\") pod \"route-controller-manager-796d6c9969-w45gk\" (UID: \"20b0879a-ca51-45a3-8075-9443292e3a4a\") " pod="openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk"
Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.790569 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20b0879a-ca51-45a3-8075-9443292e3a4a-config\") pod \"route-controller-manager-796d6c9969-w45gk\" (UID: \"20b0879a-ca51-45a3-8075-9443292e3a4a\") " pod="openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk"
Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.790719 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0625823-24a7-4ccb-a37e-57f8307cb09f-kube-api-access-tnmrp" (OuterVolumeSpecName: "kube-api-access-tnmrp") pod "b0625823-24a7-4ccb-a37e-57f8307cb09f" (UID: "b0625823-24a7-4ccb-a37e-57f8307cb09f"). InnerVolumeSpecName "kube-api-access-tnmrp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.791608 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20b0879a-ca51-45a3-8075-9443292e3a4a-serving-cert\") pod \"route-controller-manager-796d6c9969-w45gk\" (UID: \"20b0879a-ca51-45a3-8075-9443292e3a4a\") " pod="openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk"
Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.792177 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0625823-24a7-4ccb-a37e-57f8307cb09f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b0625823-24a7-4ccb-a37e-57f8307cb09f" (UID: "b0625823-24a7-4ccb-a37e-57f8307cb09f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.807398 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v87mn\" (UniqueName: \"kubernetes.io/projected/20b0879a-ca51-45a3-8075-9443292e3a4a-kube-api-access-v87mn\") pod \"route-controller-manager-796d6c9969-w45gk\" (UID: \"20b0879a-ca51-45a3-8075-9443292e3a4a\") " pod="openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk"
Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.890033 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0625823-24a7-4ccb-a37e-57f8307cb09f-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.890082 4909 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b0625823-24a7-4ccb-a37e-57f8307cb09f-client-ca\") on node \"crc\" DevicePath \"\""
Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.890101 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnmrp\" (UniqueName: \"kubernetes.io/projected/b0625823-24a7-4ccb-a37e-57f8307cb09f-kube-api-access-tnmrp\") on node \"crc\" DevicePath \"\""
Feb 02 10:34:43 crc kubenswrapper[4909]: I0202 10:34:43.960184 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk"
Feb 02 10:34:44 crc kubenswrapper[4909]: I0202 10:34:44.356065 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk"]
Feb 02 10:34:44 crc kubenswrapper[4909]: W0202 10:34:44.362309 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20b0879a_ca51_45a3_8075_9443292e3a4a.slice/crio-b59da5e80140876194dedb29c8545698fffd1c0db3c41285b1dd0458f25e77db WatchSource:0}: Error finding container b59da5e80140876194dedb29c8545698fffd1c0db3c41285b1dd0458f25e77db: Status 404 returned error can't find the container with id b59da5e80140876194dedb29c8545698fffd1c0db3c41285b1dd0458f25e77db
Feb 02 10:34:44 crc kubenswrapper[4909]: I0202 10:34:44.606454 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg" event={"ID":"b0625823-24a7-4ccb-a37e-57f8307cb09f","Type":"ContainerDied","Data":"c53532f7de9b0bfe0e705234c79fc5398fa121ab57c2f2e96b9aeaf1aa05a89a"}
Feb 02 10:34:44 crc kubenswrapper[4909]: I0202 10:34:44.606512 4909 scope.go:117] "RemoveContainer" containerID="0c9cd46f9045b1a5719a0334c989c3171daa5427d3f45466b412eeff50ce12cc"
Feb 02 10:34:44 crc kubenswrapper[4909]: I0202 10:34:44.606522 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg"
Feb 02 10:34:44 crc kubenswrapper[4909]: I0202 10:34:44.610102 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99"
Feb 02 10:34:44 crc kubenswrapper[4909]: I0202 10:34:44.609950 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk" event={"ID":"20b0879a-ca51-45a3-8075-9443292e3a4a","Type":"ContainerStarted","Data":"869ca1e13267f38fafb474468ff645d0500e7b469e2b9abc9799c07975aac6ea"}
Feb 02 10:34:44 crc kubenswrapper[4909]: I0202 10:34:44.610629 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk"
Feb 02 10:34:44 crc kubenswrapper[4909]: I0202 10:34:44.610731 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk" event={"ID":"20b0879a-ca51-45a3-8075-9443292e3a4a","Type":"ContainerStarted","Data":"b59da5e80140876194dedb29c8545698fffd1c0db3c41285b1dd0458f25e77db"}
Feb 02 10:34:44 crc kubenswrapper[4909]: I0202 10:34:44.631253 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk" podStartSLOduration=2.631235088 podStartE2EDuration="2.631235088s" podCreationTimestamp="2026-02-02 10:34:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:44.628277334 +0000 UTC m=+210.374378069" watchObservedRunningTime="2026-02-02 10:34:44.631235088 +0000 UTC m=+210.377335823"
Feb 02 10:34:44 crc kubenswrapper[4909]: I0202 10:34:44.643595 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg"]
Feb 02 10:34:44 crc kubenswrapper[4909]: I0202 10:34:44.649033 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6cc4f874d5-jcgcg"]
Feb 02 10:34:44 crc kubenswrapper[4909]: I0202 10:34:44.657286 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99"]
Feb 02 10:34:44 crc kubenswrapper[4909]: I0202 10:34:44.661311 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d56d5955c-27w99"]
Feb 02 10:34:44 crc kubenswrapper[4909]: I0202 10:34:44.975252 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk"
Feb 02 10:34:45 crc kubenswrapper[4909]: I0202 10:34:45.028375 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d50641a-3b94-4d0a-9d73-7be005b9a395" path="/var/lib/kubelet/pods/2d50641a-3b94-4d0a-9d73-7be005b9a395/volumes"
Feb 02 10:34:45 crc kubenswrapper[4909]: I0202 10:34:45.029039 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0625823-24a7-4ccb-a37e-57f8307cb09f" path="/var/lib/kubelet/pods/b0625823-24a7-4ccb-a37e-57f8307cb09f/volumes"
Feb 02 10:34:45 crc kubenswrapper[4909]: I0202 10:34:45.823766 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5f976bb66f-cgdfb"]
Feb 02 10:34:45 crc kubenswrapper[4909]: E0202 10:34:45.824233 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0625823-24a7-4ccb-a37e-57f8307cb09f" containerName="controller-manager"
Feb 02 10:34:45 crc kubenswrapper[4909]: I0202 10:34:45.824246 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0625823-24a7-4ccb-a37e-57f8307cb09f" containerName="controller-manager"
Feb 02 10:34:45 crc kubenswrapper[4909]: I0202 10:34:45.824346 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0625823-24a7-4ccb-a37e-57f8307cb09f" containerName="controller-manager"
Feb 02 10:34:45 crc kubenswrapper[4909]: I0202 10:34:45.824690 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f976bb66f-cgdfb"
Feb 02 10:34:45 crc kubenswrapper[4909]: I0202 10:34:45.827284 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 02 10:34:45 crc kubenswrapper[4909]: I0202 10:34:45.827591 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 02 10:34:45 crc kubenswrapper[4909]: I0202 10:34:45.827703 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 02 10:34:45 crc kubenswrapper[4909]: I0202 10:34:45.827770 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 02 10:34:45 crc kubenswrapper[4909]: I0202 10:34:45.830613 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 02 10:34:45 crc kubenswrapper[4909]: I0202 10:34:45.830955 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 02 10:34:45 crc kubenswrapper[4909]: I0202 10:34:45.835926 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 02 10:34:45 crc kubenswrapper[4909]: I0202 10:34:45.839048 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f976bb66f-cgdfb"]
Feb 02 10:34:45 crc kubenswrapper[4909]: I0202 10:34:45.913665 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dad69359-949f-4445-92da-5b7157fa3a07-proxy-ca-bundles\") pod \"controller-manager-5f976bb66f-cgdfb\" (UID: \"dad69359-949f-4445-92da-5b7157fa3a07\") " pod="openshift-controller-manager/controller-manager-5f976bb66f-cgdfb"
Feb 02 10:34:45 crc kubenswrapper[4909]: I0202 10:34:45.913715 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dad69359-949f-4445-92da-5b7157fa3a07-config\") pod \"controller-manager-5f976bb66f-cgdfb\" (UID: \"dad69359-949f-4445-92da-5b7157fa3a07\") " pod="openshift-controller-manager/controller-manager-5f976bb66f-cgdfb"
Feb 02 10:34:45 crc kubenswrapper[4909]: I0202 10:34:45.913735 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dad69359-949f-4445-92da-5b7157fa3a07-serving-cert\") pod \"controller-manager-5f976bb66f-cgdfb\" (UID: \"dad69359-949f-4445-92da-5b7157fa3a07\") " pod="openshift-controller-manager/controller-manager-5f976bb66f-cgdfb"
Feb 02 10:34:45 crc kubenswrapper[4909]: I0202 10:34:45.913758 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dad69359-949f-4445-92da-5b7157fa3a07-client-ca\") pod \"controller-manager-5f976bb66f-cgdfb\" (UID: \"dad69359-949f-4445-92da-5b7157fa3a07\") " pod="openshift-controller-manager/controller-manager-5f976bb66f-cgdfb"
Feb 02 10:34:45 crc kubenswrapper[4909]: I0202 10:34:45.913913 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnxtb\" (UniqueName: \"kubernetes.io/projected/dad69359-949f-4445-92da-5b7157fa3a07-kube-api-access-dnxtb\") pod \"controller-manager-5f976bb66f-cgdfb\" (UID: \"dad69359-949f-4445-92da-5b7157fa3a07\") " pod="openshift-controller-manager/controller-manager-5f976bb66f-cgdfb"
Feb 02 10:34:46 crc kubenswrapper[4909]: I0202 10:34:46.015245 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnxtb\" (UniqueName: \"kubernetes.io/projected/dad69359-949f-4445-92da-5b7157fa3a07-kube-api-access-dnxtb\") pod \"controller-manager-5f976bb66f-cgdfb\" (UID: \"dad69359-949f-4445-92da-5b7157fa3a07\") " pod="openshift-controller-manager/controller-manager-5f976bb66f-cgdfb"
Feb 02 10:34:46 crc kubenswrapper[4909]: I0202 10:34:46.015310 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dad69359-949f-4445-92da-5b7157fa3a07-proxy-ca-bundles\") pod \"controller-manager-5f976bb66f-cgdfb\" (UID: \"dad69359-949f-4445-92da-5b7157fa3a07\") " pod="openshift-controller-manager/controller-manager-5f976bb66f-cgdfb"
Feb 02 10:34:46 crc kubenswrapper[4909]: I0202 10:34:46.015338 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dad69359-949f-4445-92da-5b7157fa3a07-config\") pod \"controller-manager-5f976bb66f-cgdfb\" (UID: \"dad69359-949f-4445-92da-5b7157fa3a07\") " pod="openshift-controller-manager/controller-manager-5f976bb66f-cgdfb"
Feb 02 10:34:46 crc kubenswrapper[4909]: I0202 10:34:46.015356 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dad69359-949f-4445-92da-5b7157fa3a07-serving-cert\") pod \"controller-manager-5f976bb66f-cgdfb\" (UID: \"dad69359-949f-4445-92da-5b7157fa3a07\") " pod="openshift-controller-manager/controller-manager-5f976bb66f-cgdfb"
Feb 02 10:34:46 crc kubenswrapper[4909]: I0202 10:34:46.015385 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dad69359-949f-4445-92da-5b7157fa3a07-client-ca\") pod \"controller-manager-5f976bb66f-cgdfb\" (UID: \"dad69359-949f-4445-92da-5b7157fa3a07\") " pod="openshift-controller-manager/controller-manager-5f976bb66f-cgdfb"
Feb 02 10:34:46 crc kubenswrapper[4909]: I0202 10:34:46.016522 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dad69359-949f-4445-92da-5b7157fa3a07-client-ca\") pod \"controller-manager-5f976bb66f-cgdfb\" (UID: \"dad69359-949f-4445-92da-5b7157fa3a07\") " pod="openshift-controller-manager/controller-manager-5f976bb66f-cgdfb"
Feb 02 10:34:46 crc kubenswrapper[4909]: I0202 10:34:46.016888 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dad69359-949f-4445-92da-5b7157fa3a07-proxy-ca-bundles\") pod \"controller-manager-5f976bb66f-cgdfb\" (UID: \"dad69359-949f-4445-92da-5b7157fa3a07\") " pod="openshift-controller-manager/controller-manager-5f976bb66f-cgdfb"
Feb 02 10:34:46 crc kubenswrapper[4909]: I0202 10:34:46.017748 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dad69359-949f-4445-92da-5b7157fa3a07-config\") pod \"controller-manager-5f976bb66f-cgdfb\" (UID: \"dad69359-949f-4445-92da-5b7157fa3a07\") " pod="openshift-controller-manager/controller-manager-5f976bb66f-cgdfb"
Feb 02 10:34:46 crc kubenswrapper[4909]: I0202 10:34:46.020833 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dad69359-949f-4445-92da-5b7157fa3a07-serving-cert\") pod \"controller-manager-5f976bb66f-cgdfb\" (UID: \"dad69359-949f-4445-92da-5b7157fa3a07\") " pod="openshift-controller-manager/controller-manager-5f976bb66f-cgdfb"
Feb 02 10:34:46 crc kubenswrapper[4909]: I0202 10:34:46.031328 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnxtb\" (UniqueName: \"kubernetes.io/projected/dad69359-949f-4445-92da-5b7157fa3a07-kube-api-access-dnxtb\") pod \"controller-manager-5f976bb66f-cgdfb\" (UID: \"dad69359-949f-4445-92da-5b7157fa3a07\") " pod="openshift-controller-manager/controller-manager-5f976bb66f-cgdfb"
Feb 02 10:34:46 crc kubenswrapper[4909]: I0202 10:34:46.048923 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fgxpg"
Feb 02 10:34:46 crc kubenswrapper[4909]: I0202 10:34:46.049430 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fgxpg"
Feb 02 10:34:46 crc kubenswrapper[4909]: I0202 10:34:46.088450 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fgxpg"
Feb 02 10:34:46 crc kubenswrapper[4909]: I0202 10:34:46.145960 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f976bb66f-cgdfb"
Feb 02 10:34:46 crc kubenswrapper[4909]: I0202 10:34:46.527493 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f976bb66f-cgdfb"]
Feb 02 10:34:46 crc kubenswrapper[4909]: W0202 10:34:46.538939 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddad69359_949f_4445_92da_5b7157fa3a07.slice/crio-b4ad86092c8911aa34dfaa16fe2f84250a4d2f32949ed567e15629288fd3cbfa WatchSource:0}: Error finding container b4ad86092c8911aa34dfaa16fe2f84250a4d2f32949ed567e15629288fd3cbfa: Status 404 returned error can't find the container with id b4ad86092c8911aa34dfaa16fe2f84250a4d2f32949ed567e15629288fd3cbfa
Feb 02 10:34:46 crc kubenswrapper[4909]: I0202 10:34:46.624780 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f976bb66f-cgdfb" event={"ID":"dad69359-949f-4445-92da-5b7157fa3a07","Type":"ContainerStarted","Data":"b4ad86092c8911aa34dfaa16fe2f84250a4d2f32949ed567e15629288fd3cbfa"}
Feb 02 10:34:46 crc kubenswrapper[4909]: I0202 10:34:46.701653 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fgxpg"
Feb 02 10:34:47 crc kubenswrapper[4909]: I0202 10:34:47.631468 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f976bb66f-cgdfb" event={"ID":"dad69359-949f-4445-92da-5b7157fa3a07","Type":"ContainerStarted","Data":"39d95f31646bad0b19361da8666154e0a3893104376fcfa16952c8ebdddacaf4"}
Feb 02 10:34:47 crc kubenswrapper[4909]: I0202 10:34:47.645853 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5f976bb66f-cgdfb" podStartSLOduration=5.645830928 podStartE2EDuration="5.645830928s" podCreationTimestamp="2026-02-02 10:34:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:34:47.644455749 +0000 UTC m=+213.390556504" watchObservedRunningTime="2026-02-02 10:34:47.645830928 +0000 UTC m=+213.391931673"
Feb 02 10:34:48 crc kubenswrapper[4909]: I0202 10:34:48.646882 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5f976bb66f-cgdfb"
Feb 02 10:34:48 crc kubenswrapper[4909]: I0202 10:34:48.652024 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5f976bb66f-cgdfb"
Feb 02 10:34:48 crc kubenswrapper[4909]: I0202 10:34:48.723151 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgxpg"]
Feb 02 10:34:49 crc kubenswrapper[4909]: I0202 10:34:49.510654 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 10:34:49 crc kubenswrapper[4909]: I0202 10:34:49.510734 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 10:34:49 crc kubenswrapper[4909]: I0202 10:34:49.510789 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z"
Feb 02 10:34:49 crc kubenswrapper[4909]: I0202 10:34:49.511566 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"57158c7f1bb82761a2d89c6d387d7bf2743b574ed97a86684b55ed0855c7f013"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 02 10:34:49 crc kubenswrapper[4909]: I0202 10:34:49.511669 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://57158c7f1bb82761a2d89c6d387d7bf2743b574ed97a86684b55ed0855c7f013" gracePeriod=600
Feb 02 10:34:49 crc kubenswrapper[4909]: I0202 10:34:49.653006 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="57158c7f1bb82761a2d89c6d387d7bf2743b574ed97a86684b55ed0855c7f013" exitCode=0
Feb 02 10:34:49 crc kubenswrapper[4909]: I0202 10:34:49.653078 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"57158c7f1bb82761a2d89c6d387d7bf2743b574ed97a86684b55ed0855c7f013"}
Feb 02 10:34:49 crc kubenswrapper[4909]: I0202 10:34:49.654014 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fgxpg" podUID="9818ea74-330d-4bd0-8931-91fef529ef29" containerName="registry-server" containerID="cri-o://4a58ec92b413320aadbaa6aa6ce7095a3484112cf9d1871ebc57d60df2de70ea" gracePeriod=2
Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.030362 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" podUID="35861719-3b6b-4572-8761-bb9c8bfce573" containerName="oauth-openshift" containerID="cri-o://96307e4d81072ce1836a138f7c748ec8031d8c4bbe1a16060f4c35e4a89acbfd" gracePeriod=15
Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.108489 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fgxpg"
Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.167734 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9818ea74-330d-4bd0-8931-91fef529ef29-catalog-content\") pod \"9818ea74-330d-4bd0-8931-91fef529ef29\" (UID: \"9818ea74-330d-4bd0-8931-91fef529ef29\") "
Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.167797 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnp7n\" (UniqueName: \"kubernetes.io/projected/9818ea74-330d-4bd0-8931-91fef529ef29-kube-api-access-fnp7n\") pod \"9818ea74-330d-4bd0-8931-91fef529ef29\" (UID: \"9818ea74-330d-4bd0-8931-91fef529ef29\") "
Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.167952 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9818ea74-330d-4bd0-8931-91fef529ef29-utilities\") pod \"9818ea74-330d-4bd0-8931-91fef529ef29\" (UID: \"9818ea74-330d-4bd0-8931-91fef529ef29\") "
Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.168732 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9818ea74-330d-4bd0-8931-91fef529ef29-utilities" (OuterVolumeSpecName: "utilities") pod "9818ea74-330d-4bd0-8931-91fef529ef29" (UID: "9818ea74-330d-4bd0-8931-91fef529ef29"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.174882 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9818ea74-330d-4bd0-8931-91fef529ef29-kube-api-access-fnp7n" (OuterVolumeSpecName: "kube-api-access-fnp7n") pod "9818ea74-330d-4bd0-8931-91fef529ef29" (UID: "9818ea74-330d-4bd0-8931-91fef529ef29"). InnerVolumeSpecName "kube-api-access-fnp7n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.190271 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9818ea74-330d-4bd0-8931-91fef529ef29-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9818ea74-330d-4bd0-8931-91fef529ef29" (UID: "9818ea74-330d-4bd0-8931-91fef529ef29"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.269059 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9818ea74-330d-4bd0-8931-91fef529ef29-utilities\") on node \"crc\" DevicePath \"\""
Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.269092 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9818ea74-330d-4bd0-8931-91fef529ef29-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.269105 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnp7n\" (UniqueName: \"kubernetes.io/projected/9818ea74-330d-4bd0-8931-91fef529ef29-kube-api-access-fnp7n\") on node \"crc\" DevicePath \"\""
Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.389700 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ddclx"
Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.472462 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-ocp-branding-template\") pod \"35861719-3b6b-4572-8761-bb9c8bfce573\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") "
Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.472837 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-cliconfig\") pod \"35861719-3b6b-4572-8761-bb9c8bfce573\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") "
Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.472870 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wstr\" (UniqueName: \"kubernetes.io/projected/35861719-3b6b-4572-8761-bb9c8bfce573-kube-api-access-4wstr\") pod \"35861719-3b6b-4572-8761-bb9c8bfce573\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") "
Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.472901 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-service-ca\") pod \"35861719-3b6b-4572-8761-bb9c8bfce573\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") "
Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.472924 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-user-template-provider-selection\") pod \"35861719-3b6b-4572-8761-bb9c8bfce573\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") "
Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.472981 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-user-template-login\") pod \"35861719-3b6b-4572-8761-bb9c8bfce573\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") "
Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.473008 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-trusted-ca-bundle\") pod \"35861719-3b6b-4572-8761-bb9c8bfce573\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") "
Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.473032 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-session\") pod \"35861719-3b6b-4572-8761-bb9c8bfce573\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") "
Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.473059 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-router-certs\") pod \"35861719-3b6b-4572-8761-bb9c8bfce573\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") "
Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.473085 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-serving-cert\") pod \"35861719-3b6b-4572-8761-bb9c8bfce573\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") "
Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.473107 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/35861719-3b6b-4572-8761-bb9c8bfce573-audit-policies\") pod \"35861719-3b6b-4572-8761-bb9c8bfce573\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") "
Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.473137 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/35861719-3b6b-4572-8761-bb9c8bfce573-audit-dir\") pod \"35861719-3b6b-4572-8761-bb9c8bfce573\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") "
Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.473159 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-user-template-error\") pod \"35861719-3b6b-4572-8761-bb9c8bfce573\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") "
Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.473192 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-user-idp-0-file-data\") pod \"35861719-3b6b-4572-8761-bb9c8bfce573\" (UID: \"35861719-3b6b-4572-8761-bb9c8bfce573\") "
Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.473483 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "35861719-3b6b-4572-8761-bb9c8bfce573" (UID: "35861719-3b6b-4572-8761-bb9c8bfce573"). InnerVolumeSpecName "v4-0-config-system-cliconfig".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.474384 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "35861719-3b6b-4572-8761-bb9c8bfce573" (UID: "35861719-3b6b-4572-8761-bb9c8bfce573"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.474573 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "35861719-3b6b-4572-8761-bb9c8bfce573" (UID: "35861719-3b6b-4572-8761-bb9c8bfce573"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.474632 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/35861719-3b6b-4572-8761-bb9c8bfce573-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "35861719-3b6b-4572-8761-bb9c8bfce573" (UID: "35861719-3b6b-4572-8761-bb9c8bfce573"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.474794 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35861719-3b6b-4572-8761-bb9c8bfce573-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "35861719-3b6b-4572-8761-bb9c8bfce573" (UID: "35861719-3b6b-4572-8761-bb9c8bfce573"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.477543 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "35861719-3b6b-4572-8761-bb9c8bfce573" (UID: "35861719-3b6b-4572-8761-bb9c8bfce573"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.477704 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "35861719-3b6b-4572-8761-bb9c8bfce573" (UID: "35861719-3b6b-4572-8761-bb9c8bfce573"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.477962 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "35861719-3b6b-4572-8761-bb9c8bfce573" (UID: "35861719-3b6b-4572-8761-bb9c8bfce573"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.478271 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35861719-3b6b-4572-8761-bb9c8bfce573-kube-api-access-4wstr" (OuterVolumeSpecName: "kube-api-access-4wstr") pod "35861719-3b6b-4572-8761-bb9c8bfce573" (UID: "35861719-3b6b-4572-8761-bb9c8bfce573"). InnerVolumeSpecName "kube-api-access-4wstr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.478423 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "35861719-3b6b-4572-8761-bb9c8bfce573" (UID: "35861719-3b6b-4572-8761-bb9c8bfce573"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.478502 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "35861719-3b6b-4572-8761-bb9c8bfce573" (UID: "35861719-3b6b-4572-8761-bb9c8bfce573"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.478707 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "35861719-3b6b-4572-8761-bb9c8bfce573" (UID: "35861719-3b6b-4572-8761-bb9c8bfce573"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.480051 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "35861719-3b6b-4572-8761-bb9c8bfce573" (UID: "35861719-3b6b-4572-8761-bb9c8bfce573"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.480104 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "35861719-3b6b-4572-8761-bb9c8bfce573" (UID: "35861719-3b6b-4572-8761-bb9c8bfce573"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.574696 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.574726 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.574756 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.574766 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wstr\" (UniqueName: \"kubernetes.io/projected/35861719-3b6b-4572-8761-bb9c8bfce573-kube-api-access-4wstr\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.574775 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.574784 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.574795 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.574818 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.574827 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.574835 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.574843 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 
10:34:50.574853 4909 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/35861719-3b6b-4572-8761-bb9c8bfce573-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.574862 4909 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/35861719-3b6b-4572-8761-bb9c8bfce573-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.574869 4909 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/35861719-3b6b-4572-8761-bb9c8bfce573-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.661689 4909 generic.go:334] "Generic (PLEG): container finished" podID="35861719-3b6b-4572-8761-bb9c8bfce573" containerID="96307e4d81072ce1836a138f7c748ec8031d8c4bbe1a16060f4c35e4a89acbfd" exitCode=0 Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.661759 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" event={"ID":"35861719-3b6b-4572-8761-bb9c8bfce573","Type":"ContainerDied","Data":"96307e4d81072ce1836a138f7c748ec8031d8c4bbe1a16060f4c35e4a89acbfd"} Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.661845 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" event={"ID":"35861719-3b6b-4572-8761-bb9c8bfce573","Type":"ContainerDied","Data":"fa2fb108174000fef77fae837385d0ee2e7f119d3d115802fdb3d7eaf8c3521d"} Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.661864 4909 scope.go:117] "RemoveContainer" containerID="96307e4d81072ce1836a138f7c748ec8031d8c4bbe1a16060f4c35e4a89acbfd" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.661775 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-ddclx" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.670113 4909 generic.go:334] "Generic (PLEG): container finished" podID="9818ea74-330d-4bd0-8931-91fef529ef29" containerID="4a58ec92b413320aadbaa6aa6ce7095a3484112cf9d1871ebc57d60df2de70ea" exitCode=0 Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.670217 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fgxpg" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.670510 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgxpg" event={"ID":"9818ea74-330d-4bd0-8931-91fef529ef29","Type":"ContainerDied","Data":"4a58ec92b413320aadbaa6aa6ce7095a3484112cf9d1871ebc57d60df2de70ea"} Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.670568 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fgxpg" event={"ID":"9818ea74-330d-4bd0-8931-91fef529ef29","Type":"ContainerDied","Data":"1e1ca68921ab9b028a5530519dd4740da4d66013789a7b4840dda337d37402dc"} Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.674845 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"55fd471bfde53741a19ef0c82ccf0a2fc7d599b14ec95f480294c68c01189727"} Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.689565 4909 scope.go:117] "RemoveContainer" containerID="96307e4d81072ce1836a138f7c748ec8031d8c4bbe1a16060f4c35e4a89acbfd" Feb 02 10:34:50 crc kubenswrapper[4909]: E0202 10:34:50.690459 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96307e4d81072ce1836a138f7c748ec8031d8c4bbe1a16060f4c35e4a89acbfd\": container with ID starting with 
96307e4d81072ce1836a138f7c748ec8031d8c4bbe1a16060f4c35e4a89acbfd not found: ID does not exist" containerID="96307e4d81072ce1836a138f7c748ec8031d8c4bbe1a16060f4c35e4a89acbfd" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.690517 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96307e4d81072ce1836a138f7c748ec8031d8c4bbe1a16060f4c35e4a89acbfd"} err="failed to get container status \"96307e4d81072ce1836a138f7c748ec8031d8c4bbe1a16060f4c35e4a89acbfd\": rpc error: code = NotFound desc = could not find container \"96307e4d81072ce1836a138f7c748ec8031d8c4bbe1a16060f4c35e4a89acbfd\": container with ID starting with 96307e4d81072ce1836a138f7c748ec8031d8c4bbe1a16060f4c35e4a89acbfd not found: ID does not exist" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.690555 4909 scope.go:117] "RemoveContainer" containerID="4a58ec92b413320aadbaa6aa6ce7095a3484112cf9d1871ebc57d60df2de70ea" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.714199 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ddclx"] Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.718326 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-ddclx"] Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.725638 4909 scope.go:117] "RemoveContainer" containerID="f9bf48fa100974b52786100d06a870e205e392c85992ac5043530d091a045953" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.736640 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgxpg"] Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.751206 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fgxpg"] Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.753618 4909 scope.go:117] "RemoveContainer" 
containerID="56173d3130fabdeeadf5dd2d8ae21b05a9fc8793edbf092d33b09ecc96044a42" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.780863 4909 scope.go:117] "RemoveContainer" containerID="4a58ec92b413320aadbaa6aa6ce7095a3484112cf9d1871ebc57d60df2de70ea" Feb 02 10:34:50 crc kubenswrapper[4909]: E0202 10:34:50.785326 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a58ec92b413320aadbaa6aa6ce7095a3484112cf9d1871ebc57d60df2de70ea\": container with ID starting with 4a58ec92b413320aadbaa6aa6ce7095a3484112cf9d1871ebc57d60df2de70ea not found: ID does not exist" containerID="4a58ec92b413320aadbaa6aa6ce7095a3484112cf9d1871ebc57d60df2de70ea" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.785364 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a58ec92b413320aadbaa6aa6ce7095a3484112cf9d1871ebc57d60df2de70ea"} err="failed to get container status \"4a58ec92b413320aadbaa6aa6ce7095a3484112cf9d1871ebc57d60df2de70ea\": rpc error: code = NotFound desc = could not find container \"4a58ec92b413320aadbaa6aa6ce7095a3484112cf9d1871ebc57d60df2de70ea\": container with ID starting with 4a58ec92b413320aadbaa6aa6ce7095a3484112cf9d1871ebc57d60df2de70ea not found: ID does not exist" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.785391 4909 scope.go:117] "RemoveContainer" containerID="f9bf48fa100974b52786100d06a870e205e392c85992ac5043530d091a045953" Feb 02 10:34:50 crc kubenswrapper[4909]: E0202 10:34:50.785604 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9bf48fa100974b52786100d06a870e205e392c85992ac5043530d091a045953\": container with ID starting with f9bf48fa100974b52786100d06a870e205e392c85992ac5043530d091a045953 not found: ID does not exist" containerID="f9bf48fa100974b52786100d06a870e205e392c85992ac5043530d091a045953" Feb 02 10:34:50 crc 
kubenswrapper[4909]: I0202 10:34:50.785624 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9bf48fa100974b52786100d06a870e205e392c85992ac5043530d091a045953"} err="failed to get container status \"f9bf48fa100974b52786100d06a870e205e392c85992ac5043530d091a045953\": rpc error: code = NotFound desc = could not find container \"f9bf48fa100974b52786100d06a870e205e392c85992ac5043530d091a045953\": container with ID starting with f9bf48fa100974b52786100d06a870e205e392c85992ac5043530d091a045953 not found: ID does not exist" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.785640 4909 scope.go:117] "RemoveContainer" containerID="56173d3130fabdeeadf5dd2d8ae21b05a9fc8793edbf092d33b09ecc96044a42" Feb 02 10:34:50 crc kubenswrapper[4909]: E0202 10:34:50.785950 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56173d3130fabdeeadf5dd2d8ae21b05a9fc8793edbf092d33b09ecc96044a42\": container with ID starting with 56173d3130fabdeeadf5dd2d8ae21b05a9fc8793edbf092d33b09ecc96044a42 not found: ID does not exist" containerID="56173d3130fabdeeadf5dd2d8ae21b05a9fc8793edbf092d33b09ecc96044a42" Feb 02 10:34:50 crc kubenswrapper[4909]: I0202 10:34:50.785976 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56173d3130fabdeeadf5dd2d8ae21b05a9fc8793edbf092d33b09ecc96044a42"} err="failed to get container status \"56173d3130fabdeeadf5dd2d8ae21b05a9fc8793edbf092d33b09ecc96044a42\": rpc error: code = NotFound desc = could not find container \"56173d3130fabdeeadf5dd2d8ae21b05a9fc8793edbf092d33b09ecc96044a42\": container with ID starting with 56173d3130fabdeeadf5dd2d8ae21b05a9fc8793edbf092d33b09ecc96044a42 not found: ID does not exist" Feb 02 10:34:51 crc kubenswrapper[4909]: I0202 10:34:51.022760 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35861719-3b6b-4572-8761-bb9c8bfce573" 
path="/var/lib/kubelet/pods/35861719-3b6b-4572-8761-bb9c8bfce573/volumes" Feb 02 10:34:51 crc kubenswrapper[4909]: I0202 10:34:51.023394 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9818ea74-330d-4bd0-8931-91fef529ef29" path="/var/lib/kubelet/pods/9818ea74-330d-4bd0-8931-91fef529ef29/volumes" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.834829 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd"] Feb 02 10:34:59 crc kubenswrapper[4909]: E0202 10:34:59.835510 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9818ea74-330d-4bd0-8931-91fef529ef29" containerName="registry-server" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.835523 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9818ea74-330d-4bd0-8931-91fef529ef29" containerName="registry-server" Feb 02 10:34:59 crc kubenswrapper[4909]: E0202 10:34:59.835534 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35861719-3b6b-4572-8761-bb9c8bfce573" containerName="oauth-openshift" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.835540 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="35861719-3b6b-4572-8761-bb9c8bfce573" containerName="oauth-openshift" Feb 02 10:34:59 crc kubenswrapper[4909]: E0202 10:34:59.835555 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9818ea74-330d-4bd0-8931-91fef529ef29" containerName="extract-content" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.835562 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9818ea74-330d-4bd0-8931-91fef529ef29" containerName="extract-content" Feb 02 10:34:59 crc kubenswrapper[4909]: E0202 10:34:59.835572 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9818ea74-330d-4bd0-8931-91fef529ef29" containerName="extract-utilities" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.835579 4909 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="9818ea74-330d-4bd0-8931-91fef529ef29" containerName="extract-utilities" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.835687 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="35861719-3b6b-4572-8761-bb9c8bfce573" containerName="oauth-openshift" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.835695 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9818ea74-330d-4bd0-8931-91fef529ef29" containerName="registry-server" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.836168 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.838798 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.839167 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.839609 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.839848 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.840069 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.840181 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.840621 4909 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.840822 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.841746 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.842268 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.842401 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.847333 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.850720 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.851275 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd"] Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.855413 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.860427 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.979460 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-user-template-login\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.979500 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-user-template-error\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.979517 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.979545 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.979602 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-audit-dir\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " 
pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.979618 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.979637 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.979653 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.979678 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-system-session\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.979693 
4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.979725 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-audit-policies\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.979747 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.979764 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng4wz\" (UniqueName: \"kubernetes.io/projected/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-kube-api-access-ng4wz\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:34:59 crc kubenswrapper[4909]: I0202 10:34:59.979781 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.080789 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-user-template-login\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.081063 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-user-template-error\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.081165 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.081250 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " 
pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.082153 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.082245 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-audit-dir\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.082345 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.082447 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.082540 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-system-session\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.082619 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.082723 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-audit-policies\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.082829 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.083599 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " 
pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.083567 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-audit-policies\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.082344 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-audit-dir\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.083916 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng4wz\" (UniqueName: \"kubernetes.io/projected/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-kube-api-access-ng4wz\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.084029 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.084373 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.086839 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-system-service-ca\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.086953 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-user-template-error\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.087095 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.089248 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-system-session\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 
10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.089342 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.089397 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-system-router-certs\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.089634 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.090697 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.091124 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-v4-0-config-user-template-login\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.101801 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng4wz\" (UniqueName: \"kubernetes.io/projected/ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb-kube-api-access-ng4wz\") pod \"oauth-openshift-6b8bc975dc-gp5rd\" (UID: \"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb\") " pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.163519 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.537670 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd"] Feb 02 10:35:00 crc kubenswrapper[4909]: W0202 10:35:00.544295 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac69ed08_7b6a_438f_9d2f_7ed4f3d191eb.slice/crio-d356da714b3d166acbd4d1d29765433a7cec8c09f60536479132b34387c5ebe0 WatchSource:0}: Error finding container d356da714b3d166acbd4d1d29765433a7cec8c09f60536479132b34387c5ebe0: Status 404 returned error can't find the container with id d356da714b3d166acbd4d1d29765433a7cec8c09f60536479132b34387c5ebe0 Feb 02 10:35:00 crc kubenswrapper[4909]: I0202 10:35:00.727152 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" event={"ID":"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb","Type":"ContainerStarted","Data":"d356da714b3d166acbd4d1d29765433a7cec8c09f60536479132b34387c5ebe0"} Feb 02 10:35:01 crc kubenswrapper[4909]: I0202 10:35:01.733233 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" event={"ID":"ac69ed08-7b6a-438f-9d2f-7ed4f3d191eb","Type":"ContainerStarted","Data":"b6ae3a110333f84d36e7612fcd5b1aa5c135a046a1406a3742260cc1482e51fc"} Feb 02 10:35:01 crc kubenswrapper[4909]: I0202 10:35:01.733561 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:01 crc kubenswrapper[4909]: I0202 10:35:01.742150 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" Feb 02 10:35:01 crc kubenswrapper[4909]: I0202 10:35:01.752729 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6b8bc975dc-gp5rd" podStartSLOduration=36.752713385 podStartE2EDuration="36.752713385s" podCreationTimestamp="2026-02-02 10:34:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:35:01.751787909 +0000 UTC m=+227.497888654" watchObservedRunningTime="2026-02-02 10:35:01.752713385 +0000 UTC m=+227.498814120" Feb 02 10:35:02 crc kubenswrapper[4909]: I0202 10:35:02.452946 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5f976bb66f-cgdfb"] Feb 02 10:35:02 crc kubenswrapper[4909]: I0202 10:35:02.453205 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5f976bb66f-cgdfb" podUID="dad69359-949f-4445-92da-5b7157fa3a07" containerName="controller-manager" containerID="cri-o://39d95f31646bad0b19361da8666154e0a3893104376fcfa16952c8ebdddacaf4" gracePeriod=30 Feb 02 10:35:02 crc kubenswrapper[4909]: I0202 10:35:02.552083 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk"] Feb 02 10:35:02 crc kubenswrapper[4909]: I0202 10:35:02.552606 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk" podUID="20b0879a-ca51-45a3-8075-9443292e3a4a" containerName="route-controller-manager" containerID="cri-o://869ca1e13267f38fafb474468ff645d0500e7b469e2b9abc9799c07975aac6ea" gracePeriod=30 Feb 02 10:35:02 crc kubenswrapper[4909]: I0202 10:35:02.739926 4909 generic.go:334] "Generic (PLEG): container finished" podID="dad69359-949f-4445-92da-5b7157fa3a07" containerID="39d95f31646bad0b19361da8666154e0a3893104376fcfa16952c8ebdddacaf4" exitCode=0 Feb 02 10:35:02 crc kubenswrapper[4909]: I0202 10:35:02.740007 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f976bb66f-cgdfb" event={"ID":"dad69359-949f-4445-92da-5b7157fa3a07","Type":"ContainerDied","Data":"39d95f31646bad0b19361da8666154e0a3893104376fcfa16952c8ebdddacaf4"} Feb 02 10:35:02 crc kubenswrapper[4909]: I0202 10:35:02.741646 4909 generic.go:334] "Generic (PLEG): container finished" podID="20b0879a-ca51-45a3-8075-9443292e3a4a" containerID="869ca1e13267f38fafb474468ff645d0500e7b469e2b9abc9799c07975aac6ea" exitCode=0 Feb 02 10:35:02 crc kubenswrapper[4909]: I0202 10:35:02.741729 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk" event={"ID":"20b0879a-ca51-45a3-8075-9443292e3a4a","Type":"ContainerDied","Data":"869ca1e13267f38fafb474468ff645d0500e7b469e2b9abc9799c07975aac6ea"} Feb 02 10:35:02 crc kubenswrapper[4909]: I0202 10:35:02.995174 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.019985 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20b0879a-ca51-45a3-8075-9443292e3a4a-client-ca\") pod \"20b0879a-ca51-45a3-8075-9443292e3a4a\" (UID: \"20b0879a-ca51-45a3-8075-9443292e3a4a\") " Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.020251 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20b0879a-ca51-45a3-8075-9443292e3a4a-serving-cert\") pod \"20b0879a-ca51-45a3-8075-9443292e3a4a\" (UID: \"20b0879a-ca51-45a3-8075-9443292e3a4a\") " Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.020386 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v87mn\" (UniqueName: \"kubernetes.io/projected/20b0879a-ca51-45a3-8075-9443292e3a4a-kube-api-access-v87mn\") pod \"20b0879a-ca51-45a3-8075-9443292e3a4a\" (UID: \"20b0879a-ca51-45a3-8075-9443292e3a4a\") " Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.020497 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20b0879a-ca51-45a3-8075-9443292e3a4a-config\") pod \"20b0879a-ca51-45a3-8075-9443292e3a4a\" (UID: \"20b0879a-ca51-45a3-8075-9443292e3a4a\") " Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.021901 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20b0879a-ca51-45a3-8075-9443292e3a4a-config" (OuterVolumeSpecName: "config") pod "20b0879a-ca51-45a3-8075-9443292e3a4a" (UID: "20b0879a-ca51-45a3-8075-9443292e3a4a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.023833 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20b0879a-ca51-45a3-8075-9443292e3a4a-client-ca" (OuterVolumeSpecName: "client-ca") pod "20b0879a-ca51-45a3-8075-9443292e3a4a" (UID: "20b0879a-ca51-45a3-8075-9443292e3a4a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.027776 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0879a-ca51-45a3-8075-9443292e3a4a-kube-api-access-v87mn" (OuterVolumeSpecName: "kube-api-access-v87mn") pod "20b0879a-ca51-45a3-8075-9443292e3a4a" (UID: "20b0879a-ca51-45a3-8075-9443292e3a4a"). InnerVolumeSpecName "kube-api-access-v87mn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.029056 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0879a-ca51-45a3-8075-9443292e3a4a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "20b0879a-ca51-45a3-8075-9443292e3a4a" (UID: "20b0879a-ca51-45a3-8075-9443292e3a4a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.057213 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5f976bb66f-cgdfb" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.121393 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dad69359-949f-4445-92da-5b7157fa3a07-client-ca\") pod \"dad69359-949f-4445-92da-5b7157fa3a07\" (UID: \"dad69359-949f-4445-92da-5b7157fa3a07\") " Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.121724 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnxtb\" (UniqueName: \"kubernetes.io/projected/dad69359-949f-4445-92da-5b7157fa3a07-kube-api-access-dnxtb\") pod \"dad69359-949f-4445-92da-5b7157fa3a07\" (UID: \"dad69359-949f-4445-92da-5b7157fa3a07\") " Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.121910 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dad69359-949f-4445-92da-5b7157fa3a07-proxy-ca-bundles\") pod \"dad69359-949f-4445-92da-5b7157fa3a07\" (UID: \"dad69359-949f-4445-92da-5b7157fa3a07\") " Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.122115 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dad69359-949f-4445-92da-5b7157fa3a07-serving-cert\") pod \"dad69359-949f-4445-92da-5b7157fa3a07\" (UID: \"dad69359-949f-4445-92da-5b7157fa3a07\") " Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.122257 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dad69359-949f-4445-92da-5b7157fa3a07-client-ca" (OuterVolumeSpecName: "client-ca") pod "dad69359-949f-4445-92da-5b7157fa3a07" (UID: "dad69359-949f-4445-92da-5b7157fa3a07"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.122404 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dad69359-949f-4445-92da-5b7157fa3a07-config\") pod \"dad69359-949f-4445-92da-5b7157fa3a07\" (UID: \"dad69359-949f-4445-92da-5b7157fa3a07\") " Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.122823 4909 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20b0879a-ca51-45a3-8075-9443292e3a4a-client-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.122935 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20b0879a-ca51-45a3-8075-9443292e3a4a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.123018 4909 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dad69359-949f-4445-92da-5b7157fa3a07-client-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.123079 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v87mn\" (UniqueName: \"kubernetes.io/projected/20b0879a-ca51-45a3-8075-9443292e3a4a-kube-api-access-v87mn\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.123138 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20b0879a-ca51-45a3-8075-9443292e3a4a-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.122955 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dad69359-949f-4445-92da-5b7157fa3a07-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod 
"dad69359-949f-4445-92da-5b7157fa3a07" (UID: "dad69359-949f-4445-92da-5b7157fa3a07"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.123329 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dad69359-949f-4445-92da-5b7157fa3a07-config" (OuterVolumeSpecName: "config") pod "dad69359-949f-4445-92da-5b7157fa3a07" (UID: "dad69359-949f-4445-92da-5b7157fa3a07"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.125608 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dad69359-949f-4445-92da-5b7157fa3a07-kube-api-access-dnxtb" (OuterVolumeSpecName: "kube-api-access-dnxtb") pod "dad69359-949f-4445-92da-5b7157fa3a07" (UID: "dad69359-949f-4445-92da-5b7157fa3a07"). InnerVolumeSpecName "kube-api-access-dnxtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.126464 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dad69359-949f-4445-92da-5b7157fa3a07-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "dad69359-949f-4445-92da-5b7157fa3a07" (UID: "dad69359-949f-4445-92da-5b7157fa3a07"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.223760 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnxtb\" (UniqueName: \"kubernetes.io/projected/dad69359-949f-4445-92da-5b7157fa3a07-kube-api-access-dnxtb\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.223788 4909 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/dad69359-949f-4445-92da-5b7157fa3a07-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.223797 4909 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dad69359-949f-4445-92da-5b7157fa3a07-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.223819 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dad69359-949f-4445-92da-5b7157fa3a07-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.748616 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f976bb66f-cgdfb" event={"ID":"dad69359-949f-4445-92da-5b7157fa3a07","Type":"ContainerDied","Data":"b4ad86092c8911aa34dfaa16fe2f84250a4d2f32949ed567e15629288fd3cbfa"} Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.749242 4909 scope.go:117] "RemoveContainer" containerID="39d95f31646bad0b19361da8666154e0a3893104376fcfa16952c8ebdddacaf4" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.748663 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f976bb66f-cgdfb" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.750460 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.751459 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk" event={"ID":"20b0879a-ca51-45a3-8075-9443292e3a4a","Type":"ContainerDied","Data":"b59da5e80140876194dedb29c8545698fffd1c0db3c41285b1dd0458f25e77db"} Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.769646 4909 scope.go:117] "RemoveContainer" containerID="869ca1e13267f38fafb474468ff645d0500e7b469e2b9abc9799c07975aac6ea" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.778521 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5f976bb66f-cgdfb"] Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.785365 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5f976bb66f-cgdfb"] Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.787776 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk"] Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.789919 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-796d6c9969-w45gk"] Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.840412 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6fbd566777-lg7l4"] Feb 02 10:35:03 crc kubenswrapper[4909]: E0202 10:35:03.843297 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20b0879a-ca51-45a3-8075-9443292e3a4a" containerName="route-controller-manager" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.843324 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="20b0879a-ca51-45a3-8075-9443292e3a4a" 
containerName="route-controller-manager" Feb 02 10:35:03 crc kubenswrapper[4909]: E0202 10:35:03.843340 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dad69359-949f-4445-92da-5b7157fa3a07" containerName="controller-manager" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.843346 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="dad69359-949f-4445-92da-5b7157fa3a07" containerName="controller-manager" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.843443 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="dad69359-949f-4445-92da-5b7157fa3a07" containerName="controller-manager" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.843458 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="20b0879a-ca51-45a3-8075-9443292e3a4a" containerName="route-controller-manager" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.843841 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6fbd566777-lg7l4" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.844620 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bb4f48756-c5r8c"] Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.846078 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.846826 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.847655 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.847879 4909 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"client-ca" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.847887 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.848038 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.849344 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bb4f48756-c5r8c" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.853885 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.854403 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.854406 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.854469 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.854534 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.854921 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.859937 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6fbd566777-lg7l4"] Feb 02 10:35:03 crc 
kubenswrapper[4909]: I0202 10:35:03.869894 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.875419 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bb4f48756-c5r8c"] Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.931006 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14cbc8e4-6dcf-4d70-811a-c7ad72c62050-serving-cert\") pod \"controller-manager-6fbd566777-lg7l4\" (UID: \"14cbc8e4-6dcf-4d70-811a-c7ad72c62050\") " pod="openshift-controller-manager/controller-manager-6fbd566777-lg7l4" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.931293 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93423862-598b-42a7-8758-6e27c9f8e0bf-serving-cert\") pod \"route-controller-manager-5bb4f48756-c5r8c\" (UID: \"93423862-598b-42a7-8758-6e27c9f8e0bf\") " pod="openshift-route-controller-manager/route-controller-manager-5bb4f48756-c5r8c" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.931324 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14cbc8e4-6dcf-4d70-811a-c7ad72c62050-config\") pod \"controller-manager-6fbd566777-lg7l4\" (UID: \"14cbc8e4-6dcf-4d70-811a-c7ad72c62050\") " pod="openshift-controller-manager/controller-manager-6fbd566777-lg7l4" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.931356 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93423862-598b-42a7-8758-6e27c9f8e0bf-config\") pod \"route-controller-manager-5bb4f48756-c5r8c\" (UID: 
\"93423862-598b-42a7-8758-6e27c9f8e0bf\") " pod="openshift-route-controller-manager/route-controller-manager-5bb4f48756-c5r8c" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.931379 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drcvd\" (UniqueName: \"kubernetes.io/projected/14cbc8e4-6dcf-4d70-811a-c7ad72c62050-kube-api-access-drcvd\") pod \"controller-manager-6fbd566777-lg7l4\" (UID: \"14cbc8e4-6dcf-4d70-811a-c7ad72c62050\") " pod="openshift-controller-manager/controller-manager-6fbd566777-lg7l4" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.931396 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14cbc8e4-6dcf-4d70-811a-c7ad72c62050-client-ca\") pod \"controller-manager-6fbd566777-lg7l4\" (UID: \"14cbc8e4-6dcf-4d70-811a-c7ad72c62050\") " pod="openshift-controller-manager/controller-manager-6fbd566777-lg7l4" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.931411 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgk56\" (UniqueName: \"kubernetes.io/projected/93423862-598b-42a7-8758-6e27c9f8e0bf-kube-api-access-cgk56\") pod \"route-controller-manager-5bb4f48756-c5r8c\" (UID: \"93423862-598b-42a7-8758-6e27c9f8e0bf\") " pod="openshift-route-controller-manager/route-controller-manager-5bb4f48756-c5r8c" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.931576 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14cbc8e4-6dcf-4d70-811a-c7ad72c62050-proxy-ca-bundles\") pod \"controller-manager-6fbd566777-lg7l4\" (UID: \"14cbc8e4-6dcf-4d70-811a-c7ad72c62050\") " pod="openshift-controller-manager/controller-manager-6fbd566777-lg7l4" Feb 02 10:35:03 crc kubenswrapper[4909]: I0202 10:35:03.931641 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/93423862-598b-42a7-8758-6e27c9f8e0bf-client-ca\") pod \"route-controller-manager-5bb4f48756-c5r8c\" (UID: \"93423862-598b-42a7-8758-6e27c9f8e0bf\") " pod="openshift-route-controller-manager/route-controller-manager-5bb4f48756-c5r8c" Feb 02 10:35:04 crc kubenswrapper[4909]: I0202 10:35:04.033102 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drcvd\" (UniqueName: \"kubernetes.io/projected/14cbc8e4-6dcf-4d70-811a-c7ad72c62050-kube-api-access-drcvd\") pod \"controller-manager-6fbd566777-lg7l4\" (UID: \"14cbc8e4-6dcf-4d70-811a-c7ad72c62050\") " pod="openshift-controller-manager/controller-manager-6fbd566777-lg7l4" Feb 02 10:35:04 crc kubenswrapper[4909]: I0202 10:35:04.033170 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14cbc8e4-6dcf-4d70-811a-c7ad72c62050-client-ca\") pod \"controller-manager-6fbd566777-lg7l4\" (UID: \"14cbc8e4-6dcf-4d70-811a-c7ad72c62050\") " pod="openshift-controller-manager/controller-manager-6fbd566777-lg7l4" Feb 02 10:35:04 crc kubenswrapper[4909]: I0202 10:35:04.033208 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgk56\" (UniqueName: \"kubernetes.io/projected/93423862-598b-42a7-8758-6e27c9f8e0bf-kube-api-access-cgk56\") pod \"route-controller-manager-5bb4f48756-c5r8c\" (UID: \"93423862-598b-42a7-8758-6e27c9f8e0bf\") " pod="openshift-route-controller-manager/route-controller-manager-5bb4f48756-c5r8c" Feb 02 10:35:04 crc kubenswrapper[4909]: I0202 10:35:04.033269 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14cbc8e4-6dcf-4d70-811a-c7ad72c62050-proxy-ca-bundles\") pod \"controller-manager-6fbd566777-lg7l4\" (UID: 
\"14cbc8e4-6dcf-4d70-811a-c7ad72c62050\") " pod="openshift-controller-manager/controller-manager-6fbd566777-lg7l4" Feb 02 10:35:04 crc kubenswrapper[4909]: I0202 10:35:04.033291 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/93423862-598b-42a7-8758-6e27c9f8e0bf-client-ca\") pod \"route-controller-manager-5bb4f48756-c5r8c\" (UID: \"93423862-598b-42a7-8758-6e27c9f8e0bf\") " pod="openshift-route-controller-manager/route-controller-manager-5bb4f48756-c5r8c" Feb 02 10:35:04 crc kubenswrapper[4909]: I0202 10:35:04.033339 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14cbc8e4-6dcf-4d70-811a-c7ad72c62050-serving-cert\") pod \"controller-manager-6fbd566777-lg7l4\" (UID: \"14cbc8e4-6dcf-4d70-811a-c7ad72c62050\") " pod="openshift-controller-manager/controller-manager-6fbd566777-lg7l4" Feb 02 10:35:04 crc kubenswrapper[4909]: I0202 10:35:04.033364 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93423862-598b-42a7-8758-6e27c9f8e0bf-serving-cert\") pod \"route-controller-manager-5bb4f48756-c5r8c\" (UID: \"93423862-598b-42a7-8758-6e27c9f8e0bf\") " pod="openshift-route-controller-manager/route-controller-manager-5bb4f48756-c5r8c" Feb 02 10:35:04 crc kubenswrapper[4909]: I0202 10:35:04.033423 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14cbc8e4-6dcf-4d70-811a-c7ad72c62050-config\") pod \"controller-manager-6fbd566777-lg7l4\" (UID: \"14cbc8e4-6dcf-4d70-811a-c7ad72c62050\") " pod="openshift-controller-manager/controller-manager-6fbd566777-lg7l4" Feb 02 10:35:04 crc kubenswrapper[4909]: I0202 10:35:04.033469 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/93423862-598b-42a7-8758-6e27c9f8e0bf-config\") pod \"route-controller-manager-5bb4f48756-c5r8c\" (UID: \"93423862-598b-42a7-8758-6e27c9f8e0bf\") " pod="openshift-route-controller-manager/route-controller-manager-5bb4f48756-c5r8c" Feb 02 10:35:04 crc kubenswrapper[4909]: I0202 10:35:04.034758 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93423862-598b-42a7-8758-6e27c9f8e0bf-config\") pod \"route-controller-manager-5bb4f48756-c5r8c\" (UID: \"93423862-598b-42a7-8758-6e27c9f8e0bf\") " pod="openshift-route-controller-manager/route-controller-manager-5bb4f48756-c5r8c" Feb 02 10:35:04 crc kubenswrapper[4909]: I0202 10:35:04.035261 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14cbc8e4-6dcf-4d70-811a-c7ad72c62050-client-ca\") pod \"controller-manager-6fbd566777-lg7l4\" (UID: \"14cbc8e4-6dcf-4d70-811a-c7ad72c62050\") " pod="openshift-controller-manager/controller-manager-6fbd566777-lg7l4" Feb 02 10:35:04 crc kubenswrapper[4909]: I0202 10:35:04.036097 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/93423862-598b-42a7-8758-6e27c9f8e0bf-client-ca\") pod \"route-controller-manager-5bb4f48756-c5r8c\" (UID: \"93423862-598b-42a7-8758-6e27c9f8e0bf\") " pod="openshift-route-controller-manager/route-controller-manager-5bb4f48756-c5r8c" Feb 02 10:35:04 crc kubenswrapper[4909]: I0202 10:35:04.036294 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14cbc8e4-6dcf-4d70-811a-c7ad72c62050-config\") pod \"controller-manager-6fbd566777-lg7l4\" (UID: \"14cbc8e4-6dcf-4d70-811a-c7ad72c62050\") " pod="openshift-controller-manager/controller-manager-6fbd566777-lg7l4" Feb 02 10:35:04 crc kubenswrapper[4909]: I0202 10:35:04.036359 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14cbc8e4-6dcf-4d70-811a-c7ad72c62050-proxy-ca-bundles\") pod \"controller-manager-6fbd566777-lg7l4\" (UID: \"14cbc8e4-6dcf-4d70-811a-c7ad72c62050\") " pod="openshift-controller-manager/controller-manager-6fbd566777-lg7l4" Feb 02 10:35:04 crc kubenswrapper[4909]: I0202 10:35:04.039357 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14cbc8e4-6dcf-4d70-811a-c7ad72c62050-serving-cert\") pod \"controller-manager-6fbd566777-lg7l4\" (UID: \"14cbc8e4-6dcf-4d70-811a-c7ad72c62050\") " pod="openshift-controller-manager/controller-manager-6fbd566777-lg7l4" Feb 02 10:35:04 crc kubenswrapper[4909]: I0202 10:35:04.039497 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93423862-598b-42a7-8758-6e27c9f8e0bf-serving-cert\") pod \"route-controller-manager-5bb4f48756-c5r8c\" (UID: \"93423862-598b-42a7-8758-6e27c9f8e0bf\") " pod="openshift-route-controller-manager/route-controller-manager-5bb4f48756-c5r8c" Feb 02 10:35:04 crc kubenswrapper[4909]: I0202 10:35:04.051679 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgk56\" (UniqueName: \"kubernetes.io/projected/93423862-598b-42a7-8758-6e27c9f8e0bf-kube-api-access-cgk56\") pod \"route-controller-manager-5bb4f48756-c5r8c\" (UID: \"93423862-598b-42a7-8758-6e27c9f8e0bf\") " pod="openshift-route-controller-manager/route-controller-manager-5bb4f48756-c5r8c" Feb 02 10:35:04 crc kubenswrapper[4909]: I0202 10:35:04.051732 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drcvd\" (UniqueName: \"kubernetes.io/projected/14cbc8e4-6dcf-4d70-811a-c7ad72c62050-kube-api-access-drcvd\") pod \"controller-manager-6fbd566777-lg7l4\" (UID: \"14cbc8e4-6dcf-4d70-811a-c7ad72c62050\") " 
pod="openshift-controller-manager/controller-manager-6fbd566777-lg7l4" Feb 02 10:35:04 crc kubenswrapper[4909]: I0202 10:35:04.175020 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6fbd566777-lg7l4" Feb 02 10:35:04 crc kubenswrapper[4909]: I0202 10:35:04.186728 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bb4f48756-c5r8c" Feb 02 10:35:04 crc kubenswrapper[4909]: I0202 10:35:04.559790 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6fbd566777-lg7l4"] Feb 02 10:35:04 crc kubenswrapper[4909]: W0202 10:35:04.569292 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14cbc8e4_6dcf_4d70_811a_c7ad72c62050.slice/crio-28e5d2f5d7586541026ab46ca0b86031cbb30db66d079d360b9b341bfa803a55 WatchSource:0}: Error finding container 28e5d2f5d7586541026ab46ca0b86031cbb30db66d079d360b9b341bfa803a55: Status 404 returned error can't find the container with id 28e5d2f5d7586541026ab46ca0b86031cbb30db66d079d360b9b341bfa803a55 Feb 02 10:35:04 crc kubenswrapper[4909]: I0202 10:35:04.613131 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bb4f48756-c5r8c"] Feb 02 10:35:04 crc kubenswrapper[4909]: W0202 10:35:04.625054 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93423862_598b_42a7_8758_6e27c9f8e0bf.slice/crio-35b55886b5d26e63d868bd8c9b47a01822c7d190b39df19d34fb68cb883abfd6 WatchSource:0}: Error finding container 35b55886b5d26e63d868bd8c9b47a01822c7d190b39df19d34fb68cb883abfd6: Status 404 returned error can't find the container with id 35b55886b5d26e63d868bd8c9b47a01822c7d190b39df19d34fb68cb883abfd6 Feb 02 10:35:04 crc 
kubenswrapper[4909]: I0202 10:35:04.762174 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fbd566777-lg7l4" event={"ID":"14cbc8e4-6dcf-4d70-811a-c7ad72c62050","Type":"ContainerStarted","Data":"f7935b266754dba0c29015e5a45d92186baedd83ab7a0bbcd984f357ebbc4b95"} Feb 02 10:35:04 crc kubenswrapper[4909]: I0202 10:35:04.762219 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fbd566777-lg7l4" event={"ID":"14cbc8e4-6dcf-4d70-811a-c7ad72c62050","Type":"ContainerStarted","Data":"28e5d2f5d7586541026ab46ca0b86031cbb30db66d079d360b9b341bfa803a55"} Feb 02 10:35:04 crc kubenswrapper[4909]: I0202 10:35:04.763370 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6fbd566777-lg7l4" Feb 02 10:35:04 crc kubenswrapper[4909]: I0202 10:35:04.764481 4909 patch_prober.go:28] interesting pod/controller-manager-6fbd566777-lg7l4 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused" start-of-body= Feb 02 10:35:04 crc kubenswrapper[4909]: I0202 10:35:04.764521 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6fbd566777-lg7l4" podUID="14cbc8e4-6dcf-4d70-811a-c7ad72c62050" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused" Feb 02 10:35:04 crc kubenswrapper[4909]: I0202 10:35:04.766700 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bb4f48756-c5r8c" event={"ID":"93423862-598b-42a7-8758-6e27c9f8e0bf","Type":"ContainerStarted","Data":"35b55886b5d26e63d868bd8c9b47a01822c7d190b39df19d34fb68cb883abfd6"} Feb 02 10:35:04 crc 
kubenswrapper[4909]: I0202 10:35:04.782540 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6fbd566777-lg7l4" podStartSLOduration=2.782511896 podStartE2EDuration="2.782511896s" podCreationTimestamp="2026-02-02 10:35:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:35:04.777490904 +0000 UTC m=+230.523591659" watchObservedRunningTime="2026-02-02 10:35:04.782511896 +0000 UTC m=+230.528612641" Feb 02 10:35:05 crc kubenswrapper[4909]: I0202 10:35:05.021895 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0879a-ca51-45a3-8075-9443292e3a4a" path="/var/lib/kubelet/pods/20b0879a-ca51-45a3-8075-9443292e3a4a/volumes" Feb 02 10:35:05 crc kubenswrapper[4909]: I0202 10:35:05.022548 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dad69359-949f-4445-92da-5b7157fa3a07" path="/var/lib/kubelet/pods/dad69359-949f-4445-92da-5b7157fa3a07/volumes" Feb 02 10:35:05 crc kubenswrapper[4909]: I0202 10:35:05.773208 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bb4f48756-c5r8c" event={"ID":"93423862-598b-42a7-8758-6e27c9f8e0bf","Type":"ContainerStarted","Data":"ec3ce6c696c2cb6aa5e5b2174f2d9b69243dc28d39e15dcaf8470dc0da0c8f2c"} Feb 02 10:35:05 crc kubenswrapper[4909]: I0202 10:35:05.777160 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6fbd566777-lg7l4" Feb 02 10:35:05 crc kubenswrapper[4909]: I0202 10:35:05.793234 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5bb4f48756-c5r8c" podStartSLOduration=3.793205168 podStartE2EDuration="3.793205168s" podCreationTimestamp="2026-02-02 10:35:02 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:35:05.788157295 +0000 UTC m=+231.534258050" watchObservedRunningTime="2026-02-02 10:35:05.793205168 +0000 UTC m=+231.539305903" Feb 02 10:35:06 crc kubenswrapper[4909]: I0202 10:35:06.778616 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5bb4f48756-c5r8c" Feb 02 10:35:06 crc kubenswrapper[4909]: I0202 10:35:06.784356 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5bb4f48756-c5r8c" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.425951 4909 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.426910 4909 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.427046 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.427263 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc" gracePeriod=15 Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.427321 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://19ef42c996e51c799b8d1a2ab415c176e6cad28f2363892ec7d72678faaf5630" gracePeriod=15 Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.427478 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880" gracePeriod=15 Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.427487 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a" gracePeriod=15 Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.427626 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe" gracePeriod=15 Feb 02 10:35:07 crc 
kubenswrapper[4909]: I0202 10:35:07.432492 4909 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 02 10:35:07 crc kubenswrapper[4909]: E0202 10:35:07.432774 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.432798 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 02 10:35:07 crc kubenswrapper[4909]: E0202 10:35:07.432827 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.432835 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 02 10:35:07 crc kubenswrapper[4909]: E0202 10:35:07.432848 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.432855 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 02 10:35:07 crc kubenswrapper[4909]: E0202 10:35:07.432867 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.432873 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 02 10:35:07 crc kubenswrapper[4909]: E0202 10:35:07.432884 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 
10:35:07.432891 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 02 10:35:07 crc kubenswrapper[4909]: E0202 10:35:07.432899 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.432906 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 02 10:35:07 crc kubenswrapper[4909]: E0202 10:35:07.432915 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.432923 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.433013 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.433021 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.433030 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.433039 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.433046 4909 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.433055 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.433062 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 02 10:35:07 crc kubenswrapper[4909]: E0202 10:35:07.433143 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.433150 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 02 10:35:07 crc kubenswrapper[4909]: E0202 10:35:07.465020 4909 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.30:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.473943 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.474034 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.474102 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.474137 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.474385 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.474461 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.474482 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.474565 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.575573 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.575614 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.575638 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.575666 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.575696 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.575719 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.575729 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.575753 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.575750 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.575739 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.575776 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.575800 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.575820 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.575831 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.575962 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.576051 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.766711 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.784533 4909 generic.go:334] "Generic (PLEG): container finished" podID="dc3e167c-a86d-4fb1-baf2-98233797a107" containerID="b7b7cd03e7f10147debac05bafb26dea18dbce1c59b7c71462f1a0471a06653b" exitCode=0 Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.784619 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"dc3e167c-a86d-4fb1-baf2-98233797a107","Type":"ContainerDied","Data":"b7b7cd03e7f10147debac05bafb26dea18dbce1c59b7c71462f1a0471a06653b"} Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.786961 4909 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 
38.102.83.30:6443: connect: connection refused" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.787998 4909 status_manager.go:851] "Failed to get status for pod" podUID="dc3e167c-a86d-4fb1-baf2-98233797a107" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.788725 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.789843 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.790891 4909 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="19ef42c996e51c799b8d1a2ab415c176e6cad28f2363892ec7d72678faaf5630" exitCode=0 Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.790927 4909 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc" exitCode=0 Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.790951 4909 scope.go:117] "RemoveContainer" containerID="98974f3f5c9ddf30ee41e553d16c36536112e32393ea0b50f8531d9b9bdfa09a" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.790955 4909 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880" exitCode=0 Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.791042 4909 generic.go:334] "Generic (PLEG): container finished" 
podID="f4b27818a5e8e43d0dc095d08835c792" containerID="27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a" exitCode=2 Feb 02 10:35:07 crc kubenswrapper[4909]: W0202 10:35:07.796331 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-55054f1c10b55d4cfd7535262264124939714a902dc8b89921d1b72417aaa2ed WatchSource:0}: Error finding container 55054f1c10b55d4cfd7535262264124939714a902dc8b89921d1b72417aaa2ed: Status 404 returned error can't find the container with id 55054f1c10b55d4cfd7535262264124939714a902dc8b89921d1b72417aaa2ed Feb 02 10:35:07 crc kubenswrapper[4909]: E0202 10:35:07.799246 4909 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.30:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1890678cc1a44be2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-02 10:35:07.798457314 +0000 UTC m=+233.544558049,LastTimestamp:2026-02-02 10:35:07.798457314 +0000 UTC m=+233.544558049,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.854997 4909 patch_prober.go:28] interesting pod/kube-apiserver-crc 
container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Feb 02 10:35:07 crc kubenswrapper[4909]: I0202 10:35:07.855066 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" Feb 02 10:35:08 crc kubenswrapper[4909]: I0202 10:35:08.798557 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 02 10:35:08 crc kubenswrapper[4909]: I0202 10:35:08.800538 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"d4f6ae2cd6fe1150fe7725bd998f91102e3d5cde1187afcc1b1368c67e018589"} Feb 02 10:35:08 crc kubenswrapper[4909]: I0202 10:35:08.800581 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"55054f1c10b55d4cfd7535262264124939714a902dc8b89921d1b72417aaa2ed"} Feb 02 10:35:08 crc kubenswrapper[4909]: E0202 10:35:08.801257 4909 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.30:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:08 crc kubenswrapper[4909]: I0202 10:35:08.801262 4909 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Feb 02 10:35:08 crc kubenswrapper[4909]: I0202 10:35:08.801777 4909 status_manager.go:851] "Failed to get status for pod" podUID="dc3e167c-a86d-4fb1-baf2-98233797a107" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Feb 02 10:35:09 crc kubenswrapper[4909]: E0202 10:35:09.088948 4909 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.30:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-slspb" volumeName="registry-storage" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.169005 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.169599 4909 status_manager.go:851] "Failed to get status for pod" podUID="dc3e167c-a86d-4fb1-baf2-98233797a107" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.195607 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dc3e167c-a86d-4fb1-baf2-98233797a107-var-lock\") pod \"dc3e167c-a86d-4fb1-baf2-98233797a107\" (UID: \"dc3e167c-a86d-4fb1-baf2-98233797a107\") " Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.195681 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc3e167c-a86d-4fb1-baf2-98233797a107-kubelet-dir\") pod \"dc3e167c-a86d-4fb1-baf2-98233797a107\" (UID: \"dc3e167c-a86d-4fb1-baf2-98233797a107\") " Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.195726 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc3e167c-a86d-4fb1-baf2-98233797a107-kube-api-access\") pod \"dc3e167c-a86d-4fb1-baf2-98233797a107\" (UID: \"dc3e167c-a86d-4fb1-baf2-98233797a107\") " Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.195751 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc3e167c-a86d-4fb1-baf2-98233797a107-var-lock" (OuterVolumeSpecName: "var-lock") pod "dc3e167c-a86d-4fb1-baf2-98233797a107" (UID: "dc3e167c-a86d-4fb1-baf2-98233797a107"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.195817 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc3e167c-a86d-4fb1-baf2-98233797a107-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "dc3e167c-a86d-4fb1-baf2-98233797a107" (UID: "dc3e167c-a86d-4fb1-baf2-98233797a107"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.195970 4909 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/dc3e167c-a86d-4fb1-baf2-98233797a107-var-lock\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.195985 4909 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dc3e167c-a86d-4fb1-baf2-98233797a107-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.200794 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc3e167c-a86d-4fb1-baf2-98233797a107-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "dc3e167c-a86d-4fb1-baf2-98233797a107" (UID: "dc3e167c-a86d-4fb1-baf2-98233797a107"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.296876 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dc3e167c-a86d-4fb1-baf2-98233797a107-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.736328 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.737219 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.737820 4909 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.738179 4909 status_manager.go:851] "Failed to get status for pod" podUID="dc3e167c-a86d-4fb1-baf2-98233797a107" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.810323 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"dc3e167c-a86d-4fb1-baf2-98233797a107","Type":"ContainerDied","Data":"114e95411d95016419b44923a8f1df7c9557dd1eaf16822d7272542330cd7111"} Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.810369 4909 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="114e95411d95016419b44923a8f1df7c9557dd1eaf16822d7272542330cd7111" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.810377 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.815396 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.816373 4909 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe" exitCode=0 Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.816429 4909 scope.go:117] "RemoveContainer" containerID="19ef42c996e51c799b8d1a2ab415c176e6cad28f2363892ec7d72678faaf5630" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.816501 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.828044 4909 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.828603 4909 status_manager.go:851] "Failed to get status for pod" podUID="dc3e167c-a86d-4fb1-baf2-98233797a107" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.830786 4909 scope.go:117] "RemoveContainer" containerID="7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.842374 4909 scope.go:117] "RemoveContainer" containerID="9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.853603 4909 scope.go:117] "RemoveContainer" containerID="27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.866945 4909 scope.go:117] "RemoveContainer" containerID="6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.882978 4909 scope.go:117] "RemoveContainer" containerID="570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.900116 4909 scope.go:117] "RemoveContainer" containerID="19ef42c996e51c799b8d1a2ab415c176e6cad28f2363892ec7d72678faaf5630" Feb 02 10:35:09 crc kubenswrapper[4909]: E0202 10:35:09.900521 4909 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19ef42c996e51c799b8d1a2ab415c176e6cad28f2363892ec7d72678faaf5630\": container with ID starting with 19ef42c996e51c799b8d1a2ab415c176e6cad28f2363892ec7d72678faaf5630 not found: ID does not exist" containerID="19ef42c996e51c799b8d1a2ab415c176e6cad28f2363892ec7d72678faaf5630" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.900553 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19ef42c996e51c799b8d1a2ab415c176e6cad28f2363892ec7d72678faaf5630"} err="failed to get container status \"19ef42c996e51c799b8d1a2ab415c176e6cad28f2363892ec7d72678faaf5630\": rpc error: code = NotFound desc = could not find container \"19ef42c996e51c799b8d1a2ab415c176e6cad28f2363892ec7d72678faaf5630\": container with ID starting with 19ef42c996e51c799b8d1a2ab415c176e6cad28f2363892ec7d72678faaf5630 not found: ID does not exist" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.900583 4909 scope.go:117] "RemoveContainer" containerID="7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc" Feb 02 10:35:09 crc kubenswrapper[4909]: E0202 10:35:09.900970 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\": container with ID starting with 7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc not found: ID does not exist" containerID="7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.900996 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc"} err="failed to get container status \"7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\": rpc error: code = NotFound desc = could 
not find container \"7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc\": container with ID starting with 7bda9c0beb55ea3ed52963e9fdf41da3d812319ca487c2d031afca64ae96d6bc not found: ID does not exist" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.901015 4909 scope.go:117] "RemoveContainer" containerID="9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880" Feb 02 10:35:09 crc kubenswrapper[4909]: E0202 10:35:09.901507 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\": container with ID starting with 9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880 not found: ID does not exist" containerID="9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.901571 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880"} err="failed to get container status \"9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\": rpc error: code = NotFound desc = could not find container \"9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880\": container with ID starting with 9d7729d0421b054909fdfc308261be4af4ef4e811f79d042f7f26ca6cdb10880 not found: ID does not exist" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.901609 4909 scope.go:117] "RemoveContainer" containerID="27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a" Feb 02 10:35:09 crc kubenswrapper[4909]: E0202 10:35:09.901970 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\": container with ID starting with 27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a not found: 
ID does not exist" containerID="27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.902008 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a"} err="failed to get container status \"27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\": rpc error: code = NotFound desc = could not find container \"27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a\": container with ID starting with 27da77e08501e74dc6ad8801706e4a4efb556c80b996e27db4cedd0a57c67d1a not found: ID does not exist" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.902029 4909 scope.go:117] "RemoveContainer" containerID="6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe" Feb 02 10:35:09 crc kubenswrapper[4909]: E0202 10:35:09.902332 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\": container with ID starting with 6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe not found: ID does not exist" containerID="6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.902359 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe"} err="failed to get container status \"6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\": rpc error: code = NotFound desc = could not find container \"6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe\": container with ID starting with 6556ad9ecb5ba89b8be6eedb093ac549a495b6efa36ecad99a416c61eae976fe not found: ID does not exist" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.902375 4909 
scope.go:117] "RemoveContainer" containerID="570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4" Feb 02 10:35:09 crc kubenswrapper[4909]: E0202 10:35:09.902635 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\": container with ID starting with 570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4 not found: ID does not exist" containerID="570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.902698 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4"} err="failed to get container status \"570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\": rpc error: code = NotFound desc = could not find container \"570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4\": container with ID starting with 570a7b9cec1cf23c348f8058ea49ce78f2fe37eaa34cbc6085be31d90302aac4 not found: ID does not exist" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.904406 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.904462 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.904480 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.904506 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.904579 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.904668 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.904688 4909 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:09 crc kubenswrapper[4909]: I0202 10:35:09.904701 4909 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:10 crc kubenswrapper[4909]: I0202 10:35:10.005601 4909 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:10 crc kubenswrapper[4909]: I0202 10:35:10.131194 4909 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Feb 02 10:35:10 crc kubenswrapper[4909]: I0202 10:35:10.131553 4909 status_manager.go:851] "Failed to get status for pod" podUID="dc3e167c-a86d-4fb1-baf2-98233797a107" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Feb 02 10:35:10 crc kubenswrapper[4909]: E0202 10:35:10.971289 4909 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.30:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1890678cc1a44be2 openshift-kube-apiserver 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-02 10:35:07.798457314 +0000 UTC m=+233.544558049,LastTimestamp:2026-02-02 10:35:07.798457314 +0000 UTC m=+233.544558049,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 02 10:35:11 crc kubenswrapper[4909]: I0202 10:35:11.023224 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 02 10:35:14 crc kubenswrapper[4909]: E0202 10:35:14.243427 4909 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" Feb 02 10:35:14 crc kubenswrapper[4909]: E0202 10:35:14.243628 4909 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" Feb 02 10:35:14 crc kubenswrapper[4909]: E0202 10:35:14.243773 4909 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" Feb 02 10:35:14 crc kubenswrapper[4909]: E0202 10:35:14.243954 4909 
controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" Feb 02 10:35:14 crc kubenswrapper[4909]: E0202 10:35:14.244209 4909 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" Feb 02 10:35:14 crc kubenswrapper[4909]: I0202 10:35:14.244256 4909 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 02 10:35:14 crc kubenswrapper[4909]: E0202 10:35:14.244652 4909 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="200ms" Feb 02 10:35:14 crc kubenswrapper[4909]: E0202 10:35:14.445438 4909 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="400ms" Feb 02 10:35:14 crc kubenswrapper[4909]: E0202 10:35:14.846979 4909 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="800ms" Feb 02 10:35:15 crc kubenswrapper[4909]: I0202 10:35:15.019968 4909 status_manager.go:851] "Failed to get status for pod" podUID="dc3e167c-a86d-4fb1-baf2-98233797a107" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Feb 02 10:35:15 crc kubenswrapper[4909]: E0202 10:35:15.647860 4909 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="1.6s" Feb 02 10:35:17 crc kubenswrapper[4909]: E0202 10:35:17.248911 4909 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="3.2s" Feb 02 10:35:20 crc kubenswrapper[4909]: E0202 10:35:20.450473 4909 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="6.4s" Feb 02 10:35:20 crc kubenswrapper[4909]: E0202 10:35:20.972157 4909 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.30:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1890678cc1a44be2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" 
already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-02 10:35:07.798457314 +0000 UTC m=+233.544558049,LastTimestamp:2026-02-02 10:35:07.798457314 +0000 UTC m=+233.544558049,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 02 10:35:21 crc kubenswrapper[4909]: I0202 10:35:21.015551 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:21 crc kubenswrapper[4909]: I0202 10:35:21.016371 4909 status_manager.go:851] "Failed to get status for pod" podUID="dc3e167c-a86d-4fb1-baf2-98233797a107" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Feb 02 10:35:21 crc kubenswrapper[4909]: I0202 10:35:21.035787 4909 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e9f3ab5-f3f2-495a-8f51-ee432c06f828" Feb 02 10:35:21 crc kubenswrapper[4909]: I0202 10:35:21.035844 4909 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e9f3ab5-f3f2-495a-8f51-ee432c06f828" Feb 02 10:35:21 crc kubenswrapper[4909]: E0202 10:35:21.036435 4909 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:21 crc kubenswrapper[4909]: I0202 10:35:21.037195 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:21 crc kubenswrapper[4909]: W0202 10:35:21.054106 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-6df0296e455c767bf9d7c7263ecf3241015e3926e3c7e3c21d12f3c8ec5871b9 WatchSource:0}: Error finding container 6df0296e455c767bf9d7c7263ecf3241015e3926e3c7e3c21d12f3c8ec5871b9: Status 404 returned error can't find the container with id 6df0296e455c767bf9d7c7263ecf3241015e3926e3c7e3c21d12f3c8ec5871b9 Feb 02 10:35:21 crc kubenswrapper[4909]: I0202 10:35:21.888620 4909 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="310daa617e2e86735be319c6424a7aee6db71921093fd5dd54205ae6500a77ad" exitCode=0 Feb 02 10:35:21 crc kubenswrapper[4909]: I0202 10:35:21.888707 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"310daa617e2e86735be319c6424a7aee6db71921093fd5dd54205ae6500a77ad"} Feb 02 10:35:21 crc kubenswrapper[4909]: I0202 10:35:21.888934 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6df0296e455c767bf9d7c7263ecf3241015e3926e3c7e3c21d12f3c8ec5871b9"} Feb 02 10:35:21 crc kubenswrapper[4909]: I0202 10:35:21.889199 4909 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e9f3ab5-f3f2-495a-8f51-ee432c06f828" Feb 02 10:35:21 crc kubenswrapper[4909]: I0202 10:35:21.889214 4909 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e9f3ab5-f3f2-495a-8f51-ee432c06f828" Feb 02 10:35:21 crc kubenswrapper[4909]: I0202 10:35:21.889695 4909 status_manager.go:851] 
"Failed to get status for pod" podUID="dc3e167c-a86d-4fb1-baf2-98233797a107" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" Feb 02 10:35:21 crc kubenswrapper[4909]: E0202 10:35:21.889722 4909 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.30:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:22 crc kubenswrapper[4909]: I0202 10:35:22.897133 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 02 10:35:22 crc kubenswrapper[4909]: I0202 10:35:22.897380 4909 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a" exitCode=1 Feb 02 10:35:22 crc kubenswrapper[4909]: I0202 10:35:22.897421 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a"} Feb 02 10:35:22 crc kubenswrapper[4909]: I0202 10:35:22.897855 4909 scope.go:117] "RemoveContainer" containerID="e9be37368bc4381d5e11d2cff45e4d98862b68a4a3083a34a547c605424c5e2a" Feb 02 10:35:22 crc kubenswrapper[4909]: I0202 10:35:22.901422 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8157b0722226357e06a5b0e337ba81384bcc4bbfaa423dc7a94d496855fb2a1c"} Feb 02 10:35:22 crc 
kubenswrapper[4909]: I0202 10:35:22.901474 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3d2021967c6a53b5702b36d7dd438ebe0ec95b0ad61227b56fcaa5e813d6e469"} Feb 02 10:35:22 crc kubenswrapper[4909]: I0202 10:35:22.901491 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ce902d4a9d921b6e1238e53a3db689352183ca1f42a126bc00c83f110f45a407"} Feb 02 10:35:22 crc kubenswrapper[4909]: I0202 10:35:22.901504 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"22e3767640204922e241e1346c2811ee4e05d6d1618f33bf092bf43aed0a5b4f"} Feb 02 10:35:22 crc kubenswrapper[4909]: I0202 10:35:22.901516 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d5b543207facaf0b330470baa4f8ff6fb91bd848f2bd28171050cda7b273c710"} Feb 02 10:35:22 crc kubenswrapper[4909]: I0202 10:35:22.901660 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:22 crc kubenswrapper[4909]: I0202 10:35:22.901715 4909 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e9f3ab5-f3f2-495a-8f51-ee432c06f828" Feb 02 10:35:22 crc kubenswrapper[4909]: I0202 10:35:22.901731 4909 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e9f3ab5-f3f2-495a-8f51-ee432c06f828" Feb 02 10:35:23 crc kubenswrapper[4909]: I0202 10:35:23.908755 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 02 10:35:23 crc kubenswrapper[4909]: I0202 10:35:23.908904 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d52ae81b7d02476c632c95a6a12a8b7337e47d436c1da959180e59d1f3e85da3"} Feb 02 10:35:26 crc kubenswrapper[4909]: I0202 10:35:26.037995 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:26 crc kubenswrapper[4909]: I0202 10:35:26.038521 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:26 crc kubenswrapper[4909]: I0202 10:35:26.045879 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:28 crc kubenswrapper[4909]: I0202 10:35:28.558238 4909 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:28 crc kubenswrapper[4909]: I0202 10:35:28.650793 4909 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="bcb95764-5590-430b-a25d-8908c4dfa403" Feb 02 10:35:28 crc kubenswrapper[4909]: I0202 10:35:28.935390 4909 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e9f3ab5-f3f2-495a-8f51-ee432c06f828" Feb 02 10:35:28 crc kubenswrapper[4909]: I0202 10:35:28.935422 4909 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6e9f3ab5-f3f2-495a-8f51-ee432c06f828" Feb 02 10:35:28 crc kubenswrapper[4909]: I0202 10:35:28.940108 4909 
status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="bcb95764-5590-430b-a25d-8908c4dfa403" Feb 02 10:35:31 crc kubenswrapper[4909]: I0202 10:35:31.272181 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:35:31 crc kubenswrapper[4909]: I0202 10:35:31.529215 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:35:31 crc kubenswrapper[4909]: I0202 10:35:31.534754 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:35:38 crc kubenswrapper[4909]: I0202 10:35:38.824350 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 02 10:35:38 crc kubenswrapper[4909]: I0202 10:35:38.833743 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 02 10:35:39 crc kubenswrapper[4909]: I0202 10:35:39.200733 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 02 10:35:39 crc kubenswrapper[4909]: I0202 10:35:39.551681 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 02 10:35:39 crc kubenswrapper[4909]: I0202 10:35:39.745598 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 02 10:35:39 crc kubenswrapper[4909]: I0202 10:35:39.929549 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 02 10:35:40 
crc kubenswrapper[4909]: I0202 10:35:40.053452 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 02 10:35:40 crc kubenswrapper[4909]: I0202 10:35:40.094159 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 02 10:35:40 crc kubenswrapper[4909]: I0202 10:35:40.190311 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 02 10:35:40 crc kubenswrapper[4909]: I0202 10:35:40.337953 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 02 10:35:40 crc kubenswrapper[4909]: I0202 10:35:40.788520 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 02 10:35:40 crc kubenswrapper[4909]: I0202 10:35:40.794057 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 02 10:35:40 crc kubenswrapper[4909]: I0202 10:35:40.904385 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 02 10:35:40 crc kubenswrapper[4909]: I0202 10:35:40.962537 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 02 10:35:41 crc kubenswrapper[4909]: I0202 10:35:41.068727 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 02 10:35:41 crc kubenswrapper[4909]: I0202 10:35:41.089945 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 02 10:35:41 crc kubenswrapper[4909]: I0202 10:35:41.211617 4909 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 02 10:35:41 crc kubenswrapper[4909]: I0202 10:35:41.247335 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 02 10:35:41 crc kubenswrapper[4909]: I0202 10:35:41.278280 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 02 10:35:41 crc kubenswrapper[4909]: I0202 10:35:41.328689 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 02 10:35:41 crc kubenswrapper[4909]: I0202 10:35:41.477746 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 02 10:35:41 crc kubenswrapper[4909]: I0202 10:35:41.484522 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 02 10:35:41 crc kubenswrapper[4909]: I0202 10:35:41.582563 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 02 10:35:41 crc kubenswrapper[4909]: I0202 10:35:41.596549 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 02 10:35:41 crc kubenswrapper[4909]: I0202 10:35:41.716019 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 02 10:35:41 crc kubenswrapper[4909]: I0202 10:35:41.730485 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 02 10:35:41 crc kubenswrapper[4909]: I0202 10:35:41.756534 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 02 10:35:41 crc kubenswrapper[4909]: I0202 10:35:41.910255 4909 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 02 10:35:42 crc kubenswrapper[4909]: I0202 10:35:42.019239 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 02 10:35:42 crc kubenswrapper[4909]: I0202 10:35:42.037838 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 02 10:35:42 crc kubenswrapper[4909]: I0202 10:35:42.107529 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 02 10:35:42 crc kubenswrapper[4909]: I0202 10:35:42.264351 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 02 10:35:42 crc kubenswrapper[4909]: I0202 10:35:42.290045 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 02 10:35:42 crc kubenswrapper[4909]: I0202 10:35:42.448786 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 02 10:35:42 crc kubenswrapper[4909]: I0202 10:35:42.479321 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 02 10:35:42 crc kubenswrapper[4909]: I0202 10:35:42.531616 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 02 10:35:42 crc kubenswrapper[4909]: I0202 10:35:42.553023 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 02 10:35:42 crc kubenswrapper[4909]: I0202 10:35:42.565643 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 02 10:35:42 crc 
kubenswrapper[4909]: I0202 10:35:42.623289 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 02 10:35:42 crc kubenswrapper[4909]: I0202 10:35:42.687318 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 02 10:35:42 crc kubenswrapper[4909]: I0202 10:35:42.822779 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 02 10:35:42 crc kubenswrapper[4909]: I0202 10:35:42.932143 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 02 10:35:42 crc kubenswrapper[4909]: I0202 10:35:42.939229 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 02 10:35:42 crc kubenswrapper[4909]: I0202 10:35:42.939677 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 02 10:35:42 crc kubenswrapper[4909]: I0202 10:35:42.950579 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 02 10:35:43 crc kubenswrapper[4909]: I0202 10:35:43.094920 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 02 10:35:43 crc kubenswrapper[4909]: I0202 10:35:43.186224 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 02 10:35:43 crc kubenswrapper[4909]: I0202 10:35:43.518291 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 02 10:35:43 crc kubenswrapper[4909]: I0202 10:35:43.656674 4909 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 02 10:35:43 crc kubenswrapper[4909]: I0202 10:35:43.733073 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 02 10:35:43 crc kubenswrapper[4909]: I0202 10:35:43.786874 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 02 10:35:43 crc kubenswrapper[4909]: I0202 10:35:43.795532 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 02 10:35:43 crc kubenswrapper[4909]: I0202 10:35:43.939523 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 02 10:35:44 crc kubenswrapper[4909]: I0202 10:35:44.009568 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 02 10:35:44 crc kubenswrapper[4909]: I0202 10:35:44.058906 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 02 10:35:44 crc kubenswrapper[4909]: I0202 10:35:44.085128 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 02 10:35:44 crc kubenswrapper[4909]: I0202 10:35:44.113897 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 02 10:35:44 crc kubenswrapper[4909]: I0202 10:35:44.126573 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 02 10:35:44 crc kubenswrapper[4909]: I0202 10:35:44.174712 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 02 10:35:44 crc kubenswrapper[4909]: I0202 10:35:44.275437 4909 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 02 10:35:44 crc kubenswrapper[4909]: I0202 10:35:44.316005 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 02 10:35:44 crc kubenswrapper[4909]: I0202 10:35:44.492842 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 02 10:35:44 crc kubenswrapper[4909]: I0202 10:35:44.543505 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 02 10:35:44 crc kubenswrapper[4909]: I0202 10:35:44.592117 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 02 10:35:44 crc kubenswrapper[4909]: I0202 10:35:44.706788 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 02 10:35:44 crc kubenswrapper[4909]: I0202 10:35:44.897058 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 02 10:35:44 crc kubenswrapper[4909]: I0202 10:35:44.939315 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 02 10:35:44 crc kubenswrapper[4909]: I0202 10:35:44.996001 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 02 10:35:45 crc kubenswrapper[4909]: I0202 10:35:45.136492 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 02 10:35:45 crc kubenswrapper[4909]: I0202 10:35:45.186601 4909 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 02 10:35:45 crc kubenswrapper[4909]: I0202 10:35:45.233256 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 02 10:35:45 crc kubenswrapper[4909]: I0202 10:35:45.265368 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 02 10:35:45 crc kubenswrapper[4909]: I0202 10:35:45.289784 4909 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 02 10:35:45 crc kubenswrapper[4909]: I0202 10:35:45.321841 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 02 10:35:45 crc kubenswrapper[4909]: I0202 10:35:45.361532 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 02 10:35:45 crc kubenswrapper[4909]: I0202 10:35:45.415881 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 02 10:35:45 crc kubenswrapper[4909]: I0202 10:35:45.460445 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 02 10:35:45 crc kubenswrapper[4909]: I0202 10:35:45.462599 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 02 10:35:45 crc kubenswrapper[4909]: I0202 10:35:45.487334 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 02 10:35:45 crc kubenswrapper[4909]: I0202 10:35:45.513299 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 02 10:35:45 crc kubenswrapper[4909]: 
I0202 10:35:45.582945 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 02 10:35:45 crc kubenswrapper[4909]: I0202 10:35:45.685801 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 02 10:35:45 crc kubenswrapper[4909]: I0202 10:35:45.701512 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 02 10:35:45 crc kubenswrapper[4909]: I0202 10:35:45.753251 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 02 10:35:45 crc kubenswrapper[4909]: I0202 10:35:45.827780 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 02 10:35:45 crc kubenswrapper[4909]: I0202 10:35:45.956621 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 02 10:35:45 crc kubenswrapper[4909]: I0202 10:35:45.975777 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 02 10:35:46 crc kubenswrapper[4909]: I0202 10:35:46.038018 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 02 10:35:46 crc kubenswrapper[4909]: I0202 10:35:46.064227 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 02 10:35:46 crc kubenswrapper[4909]: I0202 10:35:46.117689 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 02 10:35:46 crc kubenswrapper[4909]: I0202 10:35:46.122282 4909 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-config-operator"/"config-operator-serving-cert" Feb 02 10:35:46 crc kubenswrapper[4909]: I0202 10:35:46.193266 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 02 10:35:46 crc kubenswrapper[4909]: I0202 10:35:46.278500 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 02 10:35:46 crc kubenswrapper[4909]: I0202 10:35:46.314010 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 02 10:35:46 crc kubenswrapper[4909]: I0202 10:35:46.315998 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 02 10:35:46 crc kubenswrapper[4909]: I0202 10:35:46.321240 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 02 10:35:46 crc kubenswrapper[4909]: I0202 10:35:46.413060 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 02 10:35:46 crc kubenswrapper[4909]: I0202 10:35:46.443800 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 02 10:35:46 crc kubenswrapper[4909]: I0202 10:35:46.521154 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 02 10:35:46 crc kubenswrapper[4909]: I0202 10:35:46.530878 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 02 10:35:46 crc kubenswrapper[4909]: I0202 10:35:46.545750 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 02 10:35:46 crc kubenswrapper[4909]: I0202 10:35:46.583733 4909 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 02 10:35:46 crc kubenswrapper[4909]: I0202 10:35:46.741493 4909 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 02 10:35:46 crc kubenswrapper[4909]: I0202 10:35:46.788931 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 02 10:35:46 crc kubenswrapper[4909]: I0202 10:35:46.824273 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 02 10:35:46 crc kubenswrapper[4909]: I0202 10:35:46.837915 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 02 10:35:46 crc kubenswrapper[4909]: I0202 10:35:46.855988 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 02 10:35:46 crc kubenswrapper[4909]: I0202 10:35:46.881881 4909 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 02 10:35:46 crc kubenswrapper[4909]: I0202 10:35:46.965820 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 02 10:35:46 crc kubenswrapper[4909]: I0202 10:35:46.983273 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 02 10:35:46 crc kubenswrapper[4909]: I0202 10:35:46.992676 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 02 10:35:46 crc kubenswrapper[4909]: I0202 10:35:46.995670 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 02 10:35:47 crc kubenswrapper[4909]: I0202 
10:35:47.044695 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 02 10:35:47 crc kubenswrapper[4909]: I0202 10:35:47.048488 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 02 10:35:47 crc kubenswrapper[4909]: I0202 10:35:47.079393 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 02 10:35:47 crc kubenswrapper[4909]: I0202 10:35:47.080215 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 02 10:35:47 crc kubenswrapper[4909]: I0202 10:35:47.146779 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 02 10:35:47 crc kubenswrapper[4909]: I0202 10:35:47.153959 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 02 10:35:47 crc kubenswrapper[4909]: I0202 10:35:47.202894 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 02 10:35:47 crc kubenswrapper[4909]: I0202 10:35:47.203010 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 02 10:35:47 crc kubenswrapper[4909]: I0202 10:35:47.229463 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 02 10:35:47 crc kubenswrapper[4909]: I0202 10:35:47.329347 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 02 10:35:47 crc kubenswrapper[4909]: I0202 10:35:47.459276 4909 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 02 10:35:47 crc kubenswrapper[4909]: I0202 10:35:47.577002 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 02 10:35:47 crc kubenswrapper[4909]: I0202 10:35:47.663646 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 02 10:35:47 crc kubenswrapper[4909]: I0202 10:35:47.674518 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 02 10:35:47 crc kubenswrapper[4909]: I0202 10:35:47.766663 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 02 10:35:47 crc kubenswrapper[4909]: I0202 10:35:47.774841 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 02 10:35:47 crc kubenswrapper[4909]: I0202 10:35:47.806072 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 02 10:35:47 crc kubenswrapper[4909]: I0202 10:35:47.827119 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 02 10:35:47 crc kubenswrapper[4909]: I0202 10:35:47.831690 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 02 10:35:47 crc kubenswrapper[4909]: I0202 10:35:47.837089 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 02 10:35:47 crc kubenswrapper[4909]: I0202 10:35:47.841408 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 02 10:35:47 crc kubenswrapper[4909]: I0202 10:35:47.845754 4909 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 02 10:35:47 crc kubenswrapper[4909]: I0202 10:35:47.879372 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 02 10:35:47 crc kubenswrapper[4909]: I0202 10:35:47.910016 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 02 10:35:47 crc kubenswrapper[4909]: I0202 10:35:47.949302 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 02 10:35:48 crc kubenswrapper[4909]: I0202 10:35:48.049595 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 02 10:35:48 crc kubenswrapper[4909]: I0202 10:35:48.153619 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 02 10:35:48 crc kubenswrapper[4909]: I0202 10:35:48.180512 4909 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 02 10:35:48 crc kubenswrapper[4909]: I0202 10:35:48.185880 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 02 10:35:48 crc kubenswrapper[4909]: I0202 10:35:48.185925 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 02 10:35:48 crc kubenswrapper[4909]: I0202 10:35:48.189874 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:48 crc kubenswrapper[4909]: I0202 10:35:48.190469 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 02 10:35:48 crc kubenswrapper[4909]: I0202 10:35:48.198464 4909 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 02 10:35:48 crc kubenswrapper[4909]: I0202 10:35:48.201873 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=20.201851556 podStartE2EDuration="20.201851556s" podCreationTimestamp="2026-02-02 10:35:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:35:48.200438797 +0000 UTC m=+273.946539552" watchObservedRunningTime="2026-02-02 10:35:48.201851556 +0000 UTC m=+273.947952291" Feb 02 10:35:48 crc kubenswrapper[4909]: I0202 10:35:48.222634 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 02 10:35:48 crc kubenswrapper[4909]: I0202 10:35:48.222947 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 02 10:35:48 crc kubenswrapper[4909]: I0202 10:35:48.489057 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 02 10:35:48 crc kubenswrapper[4909]: I0202 10:35:48.511108 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 02 10:35:48 crc kubenswrapper[4909]: I0202 10:35:48.527509 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 02 10:35:48 crc kubenswrapper[4909]: I0202 10:35:48.541718 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 02 10:35:48 crc kubenswrapper[4909]: I0202 10:35:48.606819 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 02 
10:35:48 crc kubenswrapper[4909]: I0202 10:35:48.606836 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 02 10:35:48 crc kubenswrapper[4909]: I0202 10:35:48.672493 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 02 10:35:48 crc kubenswrapper[4909]: I0202 10:35:48.833443 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 02 10:35:48 crc kubenswrapper[4909]: I0202 10:35:48.836034 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 02 10:35:48 crc kubenswrapper[4909]: I0202 10:35:48.898674 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 02 10:35:48 crc kubenswrapper[4909]: I0202 10:35:48.908905 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 02 10:35:49 crc kubenswrapper[4909]: I0202 10:35:49.207373 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 02 10:35:49 crc kubenswrapper[4909]: I0202 10:35:49.228578 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 02 10:35:49 crc kubenswrapper[4909]: I0202 10:35:49.272373 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 02 10:35:49 crc kubenswrapper[4909]: I0202 10:35:49.288085 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 02 10:35:49 crc kubenswrapper[4909]: 
I0202 10:35:49.319599 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 02 10:35:49 crc kubenswrapper[4909]: I0202 10:35:49.433468 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 02 10:35:49 crc kubenswrapper[4909]: I0202 10:35:49.449279 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 02 10:35:49 crc kubenswrapper[4909]: I0202 10:35:49.481782 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 02 10:35:49 crc kubenswrapper[4909]: I0202 10:35:49.499420 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 02 10:35:49 crc kubenswrapper[4909]: I0202 10:35:49.586428 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 02 10:35:49 crc kubenswrapper[4909]: I0202 10:35:49.637092 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 02 10:35:49 crc kubenswrapper[4909]: I0202 10:35:49.651459 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 02 10:35:49 crc kubenswrapper[4909]: I0202 10:35:49.688486 4909 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 02 10:35:49 crc kubenswrapper[4909]: I0202 10:35:49.734989 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 02 10:35:49 crc kubenswrapper[4909]: I0202 10:35:49.737003 4909 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 02 10:35:49 crc kubenswrapper[4909]: I0202 10:35:49.789348 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 02 10:35:49 crc kubenswrapper[4909]: I0202 10:35:49.825117 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 02 10:35:49 crc kubenswrapper[4909]: I0202 10:35:49.837674 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 02 10:35:49 crc kubenswrapper[4909]: I0202 10:35:49.859487 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 02 10:35:49 crc kubenswrapper[4909]: I0202 10:35:49.881855 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 02 10:35:49 crc kubenswrapper[4909]: I0202 10:35:49.946351 4909 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 02 10:35:49 crc kubenswrapper[4909]: I0202 10:35:49.946558 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://d4f6ae2cd6fe1150fe7725bd998f91102e3d5cde1187afcc1b1368c67e018589" gracePeriod=5 Feb 02 10:35:49 crc kubenswrapper[4909]: I0202 10:35:49.959184 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 02 10:35:50 crc kubenswrapper[4909]: I0202 10:35:50.092578 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 02 10:35:50 crc kubenswrapper[4909]: 
I0202 10:35:50.128531 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 02 10:35:50 crc kubenswrapper[4909]: I0202 10:35:50.201536 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 02 10:35:50 crc kubenswrapper[4909]: I0202 10:35:50.203525 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 02 10:35:50 crc kubenswrapper[4909]: I0202 10:35:50.245541 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 02 10:35:50 crc kubenswrapper[4909]: I0202 10:35:50.296834 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 02 10:35:50 crc kubenswrapper[4909]: I0202 10:35:50.317051 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 02 10:35:50 crc kubenswrapper[4909]: I0202 10:35:50.336186 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 02 10:35:50 crc kubenswrapper[4909]: I0202 10:35:50.491006 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 02 10:35:50 crc kubenswrapper[4909]: I0202 10:35:50.594503 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 02 10:35:50 crc kubenswrapper[4909]: I0202 10:35:50.606019 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 02 10:35:50 crc kubenswrapper[4909]: I0202 10:35:50.656090 4909 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 02 10:35:50 crc kubenswrapper[4909]: I0202 10:35:50.678752 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 02 10:35:50 crc kubenswrapper[4909]: I0202 10:35:50.686705 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 02 10:35:50 crc kubenswrapper[4909]: I0202 10:35:50.721266 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 02 10:35:50 crc kubenswrapper[4909]: I0202 10:35:50.793637 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 02 10:35:50 crc kubenswrapper[4909]: I0202 10:35:50.825823 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 02 10:35:50 crc kubenswrapper[4909]: I0202 10:35:50.994165 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 02 10:35:51 crc kubenswrapper[4909]: I0202 10:35:51.041968 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 02 10:35:51 crc kubenswrapper[4909]: I0202 10:35:51.133371 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 02 10:35:51 crc kubenswrapper[4909]: I0202 10:35:51.200520 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 02 10:35:51 crc kubenswrapper[4909]: I0202 10:35:51.223537 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 02 10:35:51 crc 
kubenswrapper[4909]: I0202 10:35:51.253108 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 02 10:35:51 crc kubenswrapper[4909]: I0202 10:35:51.253399 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 02 10:35:51 crc kubenswrapper[4909]: I0202 10:35:51.409759 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 02 10:35:51 crc kubenswrapper[4909]: I0202 10:35:51.575732 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 02 10:35:51 crc kubenswrapper[4909]: I0202 10:35:51.577122 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 02 10:35:51 crc kubenswrapper[4909]: I0202 10:35:51.633719 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 02 10:35:51 crc kubenswrapper[4909]: I0202 10:35:51.668645 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 02 10:35:51 crc kubenswrapper[4909]: I0202 10:35:51.688711 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 02 10:35:51 crc kubenswrapper[4909]: I0202 10:35:51.744671 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 02 10:35:51 crc kubenswrapper[4909]: I0202 10:35:51.751445 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 02 10:35:51 crc kubenswrapper[4909]: I0202 10:35:51.789614 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 02 10:35:51 crc 
kubenswrapper[4909]: I0202 10:35:51.867958 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 02 10:35:52 crc kubenswrapper[4909]: I0202 10:35:52.030999 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 02 10:35:52 crc kubenswrapper[4909]: I0202 10:35:52.094702 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 02 10:35:52 crc kubenswrapper[4909]: I0202 10:35:52.184949 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 02 10:35:52 crc kubenswrapper[4909]: I0202 10:35:52.196009 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 02 10:35:52 crc kubenswrapper[4909]: I0202 10:35:52.339697 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 02 10:35:52 crc kubenswrapper[4909]: I0202 10:35:52.356717 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 02 10:35:52 crc kubenswrapper[4909]: I0202 10:35:52.449643 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 02 10:35:52 crc kubenswrapper[4909]: I0202 10:35:52.469004 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 02 10:35:52 crc kubenswrapper[4909]: I0202 10:35:52.470463 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 02 10:35:52 crc kubenswrapper[4909]: I0202 10:35:52.491853 4909 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 02 10:35:52 crc kubenswrapper[4909]: I0202 10:35:52.528902 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 02 10:35:52 crc kubenswrapper[4909]: I0202 10:35:52.712100 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 02 10:35:52 crc kubenswrapper[4909]: I0202 10:35:52.778856 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 02 10:35:53 crc kubenswrapper[4909]: I0202 10:35:53.142063 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 02 10:35:53 crc kubenswrapper[4909]: I0202 10:35:53.196752 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 02 10:35:53 crc kubenswrapper[4909]: I0202 10:35:53.253078 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 02 10:35:53 crc kubenswrapper[4909]: I0202 10:35:53.503222 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 02 10:35:53 crc kubenswrapper[4909]: I0202 10:35:53.517573 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 02 10:35:53 crc kubenswrapper[4909]: I0202 10:35:53.581292 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 02 10:35:53 crc kubenswrapper[4909]: I0202 10:35:53.605436 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 02 10:35:53 crc kubenswrapper[4909]: I0202 10:35:53.632799 4909 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 02 10:35:53 crc kubenswrapper[4909]: I0202 10:35:53.724867 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 02 10:35:53 crc kubenswrapper[4909]: I0202 10:35:53.814498 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 02 10:35:53 crc kubenswrapper[4909]: I0202 10:35:53.853464 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 02 10:35:53 crc kubenswrapper[4909]: I0202 10:35:53.883853 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 02 10:35:53 crc kubenswrapper[4909]: I0202 10:35:53.912020 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 02 10:35:54 crc kubenswrapper[4909]: I0202 10:35:54.032753 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 02 10:35:54 crc kubenswrapper[4909]: I0202 10:35:54.086684 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 02 10:35:54 crc kubenswrapper[4909]: I0202 10:35:54.172267 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 02 10:35:54 crc kubenswrapper[4909]: I0202 10:35:54.440741 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 02 10:35:54 crc kubenswrapper[4909]: I0202 10:35:54.472295 4909 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 02 10:35:54 crc kubenswrapper[4909]: I0202 10:35:54.692042 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 02 10:35:55 crc kubenswrapper[4909]: I0202 10:35:55.004963 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 02 10:35:55 crc kubenswrapper[4909]: I0202 10:35:55.064352 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 02 10:35:55 crc kubenswrapper[4909]: I0202 10:35:55.064422 4909 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="d4f6ae2cd6fe1150fe7725bd998f91102e3d5cde1187afcc1b1368c67e018589" exitCode=137 Feb 02 10:35:55 crc kubenswrapper[4909]: I0202 10:35:55.064458 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55054f1c10b55d4cfd7535262264124939714a902dc8b89921d1b72417aaa2ed" Feb 02 10:35:55 crc kubenswrapper[4909]: I0202 10:35:55.065479 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 02 10:35:55 crc kubenswrapper[4909]: I0202 10:35:55.071543 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 02 10:35:55 crc kubenswrapper[4909]: I0202 10:35:55.078781 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 02 10:35:55 crc kubenswrapper[4909]: I0202 10:35:55.078881 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:55 crc kubenswrapper[4909]: I0202 10:35:55.182895 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 02 10:35:55 crc kubenswrapper[4909]: I0202 10:35:55.182942 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 02 10:35:55 crc kubenswrapper[4909]: I0202 10:35:55.182968 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 02 10:35:55 crc kubenswrapper[4909]: I0202 10:35:55.182992 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 02 10:35:55 crc kubenswrapper[4909]: I0202 10:35:55.183017 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 02 10:35:55 crc kubenswrapper[4909]: I0202 10:35:55.183230 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: 
"var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:35:55 crc kubenswrapper[4909]: I0202 10:35:55.183253 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:35:55 crc kubenswrapper[4909]: I0202 10:35:55.183282 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:35:55 crc kubenswrapper[4909]: I0202 10:35:55.183315 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:35:55 crc kubenswrapper[4909]: I0202 10:35:55.190998 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:35:55 crc kubenswrapper[4909]: I0202 10:35:55.196255 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 02 10:35:55 crc kubenswrapper[4909]: I0202 10:35:55.244584 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 02 10:35:55 crc kubenswrapper[4909]: I0202 10:35:55.284761 4909 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:55 crc kubenswrapper[4909]: I0202 10:35:55.284801 4909 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:55 crc kubenswrapper[4909]: I0202 10:35:55.284855 4909 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:55 crc kubenswrapper[4909]: I0202 10:35:55.284865 4909 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:55 crc kubenswrapper[4909]: I0202 10:35:55.284872 4909 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 02 10:35:55 crc kubenswrapper[4909]: I0202 10:35:55.377207 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 02 10:35:55 crc kubenswrapper[4909]: I0202 10:35:55.381333 4909 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 02 10:35:55 crc kubenswrapper[4909]: I0202 10:35:55.431617 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 02 10:35:55 crc kubenswrapper[4909]: I0202 10:35:55.907188 4909 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 02 10:35:56 crc kubenswrapper[4909]: I0202 10:35:56.069529 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 02 10:35:56 crc kubenswrapper[4909]: I0202 10:35:56.098546 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 02 10:35:56 crc kubenswrapper[4909]: I0202 10:35:56.721286 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 02 10:35:56 crc kubenswrapper[4909]: I0202 10:35:56.781881 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 02 10:35:57 crc kubenswrapper[4909]: I0202 10:35:57.021902 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 02 10:36:14 crc kubenswrapper[4909]: I0202 10:36:14.157678 4909 generic.go:334] "Generic (PLEG): container finished" podID="b575e8ec-7b85-4647-b0af-03274d67afc8" containerID="fa9a9e026623d194f2e469fce2e458deac81c27f84db141aec5e4634c22f9654" exitCode=0 Feb 02 10:36:14 crc kubenswrapper[4909]: I0202 10:36:14.158604 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nt7q4" 
event={"ID":"b575e8ec-7b85-4647-b0af-03274d67afc8","Type":"ContainerDied","Data":"fa9a9e026623d194f2e469fce2e458deac81c27f84db141aec5e4634c22f9654"} Feb 02 10:36:14 crc kubenswrapper[4909]: I0202 10:36:14.159223 4909 scope.go:117] "RemoveContainer" containerID="fa9a9e026623d194f2e469fce2e458deac81c27f84db141aec5e4634c22f9654" Feb 02 10:36:14 crc kubenswrapper[4909]: I0202 10:36:14.813959 4909 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 02 10:36:15 crc kubenswrapper[4909]: I0202 10:36:15.164395 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nt7q4" event={"ID":"b575e8ec-7b85-4647-b0af-03274d67afc8","Type":"ContainerStarted","Data":"bb69e8cccdf6fc618d4e442b672cebb8e47697c4026ead7857ed3bafdfc3ffc9"} Feb 02 10:36:15 crc kubenswrapper[4909]: I0202 10:36:15.164770 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-nt7q4" Feb 02 10:36:15 crc kubenswrapper[4909]: I0202 10:36:15.167418 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-nt7q4" Feb 02 10:36:49 crc kubenswrapper[4909]: I0202 10:36:49.511323 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:36:49 crc kubenswrapper[4909]: I0202 10:36:49.511948 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 
10:36:56 crc kubenswrapper[4909]: I0202 10:36:56.821894 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-fgbbv"] Feb 02 10:36:56 crc kubenswrapper[4909]: E0202 10:36:56.822397 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 02 10:36:56 crc kubenswrapper[4909]: I0202 10:36:56.822408 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 02 10:36:56 crc kubenswrapper[4909]: E0202 10:36:56.822419 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc3e167c-a86d-4fb1-baf2-98233797a107" containerName="installer" Feb 02 10:36:56 crc kubenswrapper[4909]: I0202 10:36:56.822425 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc3e167c-a86d-4fb1-baf2-98233797a107" containerName="installer" Feb 02 10:36:56 crc kubenswrapper[4909]: I0202 10:36:56.822526 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc3e167c-a86d-4fb1-baf2-98233797a107" containerName="installer" Feb 02 10:36:56 crc kubenswrapper[4909]: I0202 10:36:56.822537 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 02 10:36:56 crc kubenswrapper[4909]: I0202 10:36:56.822930 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" Feb 02 10:36:56 crc kubenswrapper[4909]: I0202 10:36:56.831001 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-fgbbv"] Feb 02 10:36:56 crc kubenswrapper[4909]: I0202 10:36:56.977469 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpvnq\" (UniqueName: \"kubernetes.io/projected/4a57aee1-288b-4b1b-ade3-2fe5003ab06e-kube-api-access-cpvnq\") pod \"image-registry-66df7c8f76-fgbbv\" (UID: \"4a57aee1-288b-4b1b-ade3-2fe5003ab06e\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" Feb 02 10:36:56 crc kubenswrapper[4909]: I0202 10:36:56.977539 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-fgbbv\" (UID: \"4a57aee1-288b-4b1b-ade3-2fe5003ab06e\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" Feb 02 10:36:56 crc kubenswrapper[4909]: I0202 10:36:56.977567 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4a57aee1-288b-4b1b-ade3-2fe5003ab06e-bound-sa-token\") pod \"image-registry-66df7c8f76-fgbbv\" (UID: \"4a57aee1-288b-4b1b-ade3-2fe5003ab06e\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" Feb 02 10:36:56 crc kubenswrapper[4909]: I0202 10:36:56.977674 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a57aee1-288b-4b1b-ade3-2fe5003ab06e-trusted-ca\") pod \"image-registry-66df7c8f76-fgbbv\" (UID: \"4a57aee1-288b-4b1b-ade3-2fe5003ab06e\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" Feb 02 10:36:56 crc kubenswrapper[4909]: I0202 10:36:56.977767 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4a57aee1-288b-4b1b-ade3-2fe5003ab06e-registry-tls\") pod \"image-registry-66df7c8f76-fgbbv\" (UID: \"4a57aee1-288b-4b1b-ade3-2fe5003ab06e\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" Feb 02 10:36:56 crc kubenswrapper[4909]: I0202 10:36:56.977873 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4a57aee1-288b-4b1b-ade3-2fe5003ab06e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-fgbbv\" (UID: \"4a57aee1-288b-4b1b-ade3-2fe5003ab06e\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" Feb 02 10:36:56 crc kubenswrapper[4909]: I0202 10:36:56.977893 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4a57aee1-288b-4b1b-ade3-2fe5003ab06e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-fgbbv\" (UID: \"4a57aee1-288b-4b1b-ade3-2fe5003ab06e\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" Feb 02 10:36:56 crc kubenswrapper[4909]: I0202 10:36:56.977989 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4a57aee1-288b-4b1b-ade3-2fe5003ab06e-registry-certificates\") pod \"image-registry-66df7c8f76-fgbbv\" (UID: \"4a57aee1-288b-4b1b-ade3-2fe5003ab06e\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" Feb 02 10:36:57 crc kubenswrapper[4909]: I0202 10:36:57.004625 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-fgbbv\" (UID: \"4a57aee1-288b-4b1b-ade3-2fe5003ab06e\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" Feb 02 10:36:57 crc kubenswrapper[4909]: I0202 10:36:57.079383 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4a57aee1-288b-4b1b-ade3-2fe5003ab06e-registry-certificates\") pod \"image-registry-66df7c8f76-fgbbv\" (UID: \"4a57aee1-288b-4b1b-ade3-2fe5003ab06e\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" Feb 02 10:36:57 crc kubenswrapper[4909]: I0202 10:36:57.079429 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpvnq\" (UniqueName: \"kubernetes.io/projected/4a57aee1-288b-4b1b-ade3-2fe5003ab06e-kube-api-access-cpvnq\") pod \"image-registry-66df7c8f76-fgbbv\" (UID: \"4a57aee1-288b-4b1b-ade3-2fe5003ab06e\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" Feb 02 10:36:57 crc kubenswrapper[4909]: I0202 10:36:57.079471 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4a57aee1-288b-4b1b-ade3-2fe5003ab06e-bound-sa-token\") pod \"image-registry-66df7c8f76-fgbbv\" (UID: \"4a57aee1-288b-4b1b-ade3-2fe5003ab06e\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" Feb 02 10:36:57 crc kubenswrapper[4909]: I0202 10:36:57.079491 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a57aee1-288b-4b1b-ade3-2fe5003ab06e-trusted-ca\") pod \"image-registry-66df7c8f76-fgbbv\" (UID: \"4a57aee1-288b-4b1b-ade3-2fe5003ab06e\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" Feb 02 10:36:57 crc kubenswrapper[4909]: I0202 10:36:57.079518 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4a57aee1-288b-4b1b-ade3-2fe5003ab06e-registry-tls\") pod \"image-registry-66df7c8f76-fgbbv\" (UID: \"4a57aee1-288b-4b1b-ade3-2fe5003ab06e\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" Feb 02 10:36:57 crc kubenswrapper[4909]: I0202 10:36:57.079546 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4a57aee1-288b-4b1b-ade3-2fe5003ab06e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-fgbbv\" (UID: \"4a57aee1-288b-4b1b-ade3-2fe5003ab06e\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" Feb 02 10:36:57 crc kubenswrapper[4909]: I0202 10:36:57.079568 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4a57aee1-288b-4b1b-ade3-2fe5003ab06e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-fgbbv\" (UID: \"4a57aee1-288b-4b1b-ade3-2fe5003ab06e\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" Feb 02 10:36:57 crc kubenswrapper[4909]: I0202 10:36:57.080052 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4a57aee1-288b-4b1b-ade3-2fe5003ab06e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-fgbbv\" (UID: \"4a57aee1-288b-4b1b-ade3-2fe5003ab06e\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" Feb 02 10:36:57 crc kubenswrapper[4909]: I0202 10:36:57.080689 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4a57aee1-288b-4b1b-ade3-2fe5003ab06e-registry-certificates\") pod \"image-registry-66df7c8f76-fgbbv\" (UID: \"4a57aee1-288b-4b1b-ade3-2fe5003ab06e\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" Feb 02 10:36:57 crc kubenswrapper[4909]: I0202 10:36:57.081031 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4a57aee1-288b-4b1b-ade3-2fe5003ab06e-trusted-ca\") pod \"image-registry-66df7c8f76-fgbbv\" (UID: \"4a57aee1-288b-4b1b-ade3-2fe5003ab06e\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" Feb 02 10:36:57 crc kubenswrapper[4909]: I0202 10:36:57.085190 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4a57aee1-288b-4b1b-ade3-2fe5003ab06e-registry-tls\") pod \"image-registry-66df7c8f76-fgbbv\" (UID: \"4a57aee1-288b-4b1b-ade3-2fe5003ab06e\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" Feb 02 10:36:57 crc kubenswrapper[4909]: I0202 10:36:57.085215 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4a57aee1-288b-4b1b-ade3-2fe5003ab06e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-fgbbv\" (UID: \"4a57aee1-288b-4b1b-ade3-2fe5003ab06e\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" Feb 02 10:36:57 crc kubenswrapper[4909]: I0202 10:36:57.100549 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpvnq\" (UniqueName: \"kubernetes.io/projected/4a57aee1-288b-4b1b-ade3-2fe5003ab06e-kube-api-access-cpvnq\") pod \"image-registry-66df7c8f76-fgbbv\" (UID: \"4a57aee1-288b-4b1b-ade3-2fe5003ab06e\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" Feb 02 10:36:57 crc kubenswrapper[4909]: I0202 10:36:57.100571 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4a57aee1-288b-4b1b-ade3-2fe5003ab06e-bound-sa-token\") pod \"image-registry-66df7c8f76-fgbbv\" (UID: 
\"4a57aee1-288b-4b1b-ade3-2fe5003ab06e\") " pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" Feb 02 10:36:57 crc kubenswrapper[4909]: I0202 10:36:57.138059 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" Feb 02 10:36:57 crc kubenswrapper[4909]: I0202 10:36:57.343561 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-fgbbv"] Feb 02 10:36:57 crc kubenswrapper[4909]: I0202 10:36:57.404450 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" event={"ID":"4a57aee1-288b-4b1b-ade3-2fe5003ab06e","Type":"ContainerStarted","Data":"2dafc40e5bfdc7a241bfe6ddd4856b76cd8847b353a2950d7275a31c0ad34da0"} Feb 02 10:36:58 crc kubenswrapper[4909]: I0202 10:36:58.411974 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" event={"ID":"4a57aee1-288b-4b1b-ade3-2fe5003ab06e","Type":"ContainerStarted","Data":"eb7c1bab4d390288987c69f6c15ac9a997c53903b5afe95d7e30f309649615bf"} Feb 02 10:36:58 crc kubenswrapper[4909]: I0202 10:36:58.412315 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" Feb 02 10:36:58 crc kubenswrapper[4909]: I0202 10:36:58.436633 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" podStartSLOduration=2.436616827 podStartE2EDuration="2.436616827s" podCreationTimestamp="2026-02-02 10:36:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:36:58.434876098 +0000 UTC m=+344.180976863" watchObservedRunningTime="2026-02-02 10:36:58.436616827 +0000 UTC m=+344.182717552" Feb 02 10:37:17 crc kubenswrapper[4909]: I0202 
10:37:17.143948 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-fgbbv" Feb 02 10:37:17 crc kubenswrapper[4909]: I0202 10:37:17.194635 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-slspb"] Feb 02 10:37:19 crc kubenswrapper[4909]: I0202 10:37:19.510650 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:37:19 crc kubenswrapper[4909]: I0202 10:37:19.510932 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.175283 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mlhl9"] Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.176682 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mlhl9" podUID="7164d60d-218c-47e0-a74a-677793e589b0" containerName="registry-server" containerID="cri-o://55959a236926f75800be59e6899fb48a94b11dd291e4d485c7f7a8b7aade6bd5" gracePeriod=30 Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.181641 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-72ltv"] Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.181915 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-72ltv" 
podUID="58068665-fe9b-4bd9-ac11-a3d6c9ad888e" containerName="registry-server" containerID="cri-o://fd84459d6b5dd305397ff426e6d97cea6881fbf3eb15e9df62a9513f52197354" gracePeriod=30 Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.217063 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nt7q4"] Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.218051 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-nt7q4" podUID="b575e8ec-7b85-4647-b0af-03274d67afc8" containerName="marketplace-operator" containerID="cri-o://bb69e8cccdf6fc618d4e442b672cebb8e47697c4026ead7857ed3bafdfc3ffc9" gracePeriod=30 Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.225144 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vn7ln"] Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.225655 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vn7ln" podUID="9483225f-edd3-4728-8e95-67f872692af9" containerName="registry-server" containerID="cri-o://87395505e45e0fa9c6d0c5b4105d2959b5928c2bf1d6cf5e4f8684baf8f3d33b" gracePeriod=30 Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.232247 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9df28"] Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.233032 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9df28" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.236431 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2qpc6"] Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.236681 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2qpc6" podUID="20869d58-911c-44ab-8f33-07ffc1056b3b" containerName="registry-server" containerID="cri-o://41298415446fe6302e48ab3ed17896c921c3e1f955919a0ae503353e8157769b" gracePeriod=30 Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.240728 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9df28"] Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.335429 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a36d99a-11d4-4311-bc30-3852c1580fc1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9df28\" (UID: \"0a36d99a-11d4-4311-bc30-3852c1580fc1\") " pod="openshift-marketplace/marketplace-operator-79b997595-9df28" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.335904 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq5qb\" (UniqueName: \"kubernetes.io/projected/0a36d99a-11d4-4311-bc30-3852c1580fc1-kube-api-access-pq5qb\") pod \"marketplace-operator-79b997595-9df28\" (UID: \"0a36d99a-11d4-4311-bc30-3852c1580fc1\") " pod="openshift-marketplace/marketplace-operator-79b997595-9df28" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.335979 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/0a36d99a-11d4-4311-bc30-3852c1580fc1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9df28\" (UID: \"0a36d99a-11d4-4311-bc30-3852c1580fc1\") " pod="openshift-marketplace/marketplace-operator-79b997595-9df28" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.437168 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a36d99a-11d4-4311-bc30-3852c1580fc1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9df28\" (UID: \"0a36d99a-11d4-4311-bc30-3852c1580fc1\") " pod="openshift-marketplace/marketplace-operator-79b997595-9df28" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.437222 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq5qb\" (UniqueName: \"kubernetes.io/projected/0a36d99a-11d4-4311-bc30-3852c1580fc1-kube-api-access-pq5qb\") pod \"marketplace-operator-79b997595-9df28\" (UID: \"0a36d99a-11d4-4311-bc30-3852c1580fc1\") " pod="openshift-marketplace/marketplace-operator-79b997595-9df28" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.437258 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0a36d99a-11d4-4311-bc30-3852c1580fc1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9df28\" (UID: \"0a36d99a-11d4-4311-bc30-3852c1580fc1\") " pod="openshift-marketplace/marketplace-operator-79b997595-9df28" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.442273 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a36d99a-11d4-4311-bc30-3852c1580fc1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9df28\" (UID: \"0a36d99a-11d4-4311-bc30-3852c1580fc1\") " pod="openshift-marketplace/marketplace-operator-79b997595-9df28" Feb 02 10:37:38 
crc kubenswrapper[4909]: I0202 10:37:38.443496 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0a36d99a-11d4-4311-bc30-3852c1580fc1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9df28\" (UID: \"0a36d99a-11d4-4311-bc30-3852c1580fc1\") " pod="openshift-marketplace/marketplace-operator-79b997595-9df28" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.458578 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq5qb\" (UniqueName: \"kubernetes.io/projected/0a36d99a-11d4-4311-bc30-3852c1580fc1-kube-api-access-pq5qb\") pod \"marketplace-operator-79b997595-9df28\" (UID: \"0a36d99a-11d4-4311-bc30-3852c1580fc1\") " pod="openshift-marketplace/marketplace-operator-79b997595-9df28" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.602750 4909 generic.go:334] "Generic (PLEG): container finished" podID="58068665-fe9b-4bd9-ac11-a3d6c9ad888e" containerID="fd84459d6b5dd305397ff426e6d97cea6881fbf3eb15e9df62a9513f52197354" exitCode=0 Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.602842 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-72ltv" event={"ID":"58068665-fe9b-4bd9-ac11-a3d6c9ad888e","Type":"ContainerDied","Data":"fd84459d6b5dd305397ff426e6d97cea6881fbf3eb15e9df62a9513f52197354"} Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.603007 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-72ltv" event={"ID":"58068665-fe9b-4bd9-ac11-a3d6c9ad888e","Type":"ContainerDied","Data":"7ce4853e060f2381f0c12ffd62d69c2294355588172203b587da8db6b890e5cf"} Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.603031 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ce4853e060f2381f0c12ffd62d69c2294355588172203b587da8db6b890e5cf" Feb 02 10:37:38 crc kubenswrapper[4909]: 
I0202 10:37:38.605460 4909 generic.go:334] "Generic (PLEG): container finished" podID="7164d60d-218c-47e0-a74a-677793e589b0" containerID="55959a236926f75800be59e6899fb48a94b11dd291e4d485c7f7a8b7aade6bd5" exitCode=0 Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.605516 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlhl9" event={"ID":"7164d60d-218c-47e0-a74a-677793e589b0","Type":"ContainerDied","Data":"55959a236926f75800be59e6899fb48a94b11dd291e4d485c7f7a8b7aade6bd5"} Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.605538 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlhl9" event={"ID":"7164d60d-218c-47e0-a74a-677793e589b0","Type":"ContainerDied","Data":"09b5d6e00db532a19671a99815f400e9f9db763dd41c1b34fd494908bf963d7b"} Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.605550 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09b5d6e00db532a19671a99815f400e9f9db763dd41c1b34fd494908bf963d7b" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.608011 4909 generic.go:334] "Generic (PLEG): container finished" podID="9483225f-edd3-4728-8e95-67f872692af9" containerID="87395505e45e0fa9c6d0c5b4105d2959b5928c2bf1d6cf5e4f8684baf8f3d33b" exitCode=0 Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.608081 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vn7ln" event={"ID":"9483225f-edd3-4728-8e95-67f872692af9","Type":"ContainerDied","Data":"87395505e45e0fa9c6d0c5b4105d2959b5928c2bf1d6cf5e4f8684baf8f3d33b"} Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.608107 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vn7ln" event={"ID":"9483225f-edd3-4728-8e95-67f872692af9","Type":"ContainerDied","Data":"8dd5b04ea166b12c4f995cffeb401200d60e383f8a7e80d7f52ce5cfea93ae7f"} Feb 02 10:37:38 crc 
kubenswrapper[4909]: I0202 10:37:38.608120 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dd5b04ea166b12c4f995cffeb401200d60e383f8a7e80d7f52ce5cfea93ae7f" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.610020 4909 generic.go:334] "Generic (PLEG): container finished" podID="20869d58-911c-44ab-8f33-07ffc1056b3b" containerID="41298415446fe6302e48ab3ed17896c921c3e1f955919a0ae503353e8157769b" exitCode=0 Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.610100 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qpc6" event={"ID":"20869d58-911c-44ab-8f33-07ffc1056b3b","Type":"ContainerDied","Data":"41298415446fe6302e48ab3ed17896c921c3e1f955919a0ae503353e8157769b"} Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.611675 4909 generic.go:334] "Generic (PLEG): container finished" podID="b575e8ec-7b85-4647-b0af-03274d67afc8" containerID="bb69e8cccdf6fc618d4e442b672cebb8e47697c4026ead7857ed3bafdfc3ffc9" exitCode=0 Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.611787 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nt7q4" event={"ID":"b575e8ec-7b85-4647-b0af-03274d67afc8","Type":"ContainerDied","Data":"bb69e8cccdf6fc618d4e442b672cebb8e47697c4026ead7857ed3bafdfc3ffc9"} Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.611875 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-nt7q4" event={"ID":"b575e8ec-7b85-4647-b0af-03274d67afc8","Type":"ContainerDied","Data":"55af95ee6adcd8e541b2adcf7ef60a39a2f7e8dd151aec5dc3b1955a0893f448"} Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.611891 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55af95ee6adcd8e541b2adcf7ef60a39a2f7e8dd151aec5dc3b1955a0893f448" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.611907 4909 scope.go:117] 
"RemoveContainer" containerID="fa9a9e026623d194f2e469fce2e458deac81c27f84db141aec5e4634c22f9654" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.648987 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9df28" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.652862 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-72ltv" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.661396 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mlhl9" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.678603 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vn7ln" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.691506 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nt7q4" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.699049 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2qpc6" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.739373 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7164d60d-218c-47e0-a74a-677793e589b0-catalog-content\") pod \"7164d60d-218c-47e0-a74a-677793e589b0\" (UID: \"7164d60d-218c-47e0-a74a-677793e589b0\") " Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.739452 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58068665-fe9b-4bd9-ac11-a3d6c9ad888e-utilities\") pod \"58068665-fe9b-4bd9-ac11-a3d6c9ad888e\" (UID: \"58068665-fe9b-4bd9-ac11-a3d6c9ad888e\") " Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.739475 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lx6mt\" (UniqueName: \"kubernetes.io/projected/9483225f-edd3-4728-8e95-67f872692af9-kube-api-access-lx6mt\") pod \"9483225f-edd3-4728-8e95-67f872692af9\" (UID: \"9483225f-edd3-4728-8e95-67f872692af9\") " Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.739494 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntsgz\" (UniqueName: \"kubernetes.io/projected/58068665-fe9b-4bd9-ac11-a3d6c9ad888e-kube-api-access-ntsgz\") pod \"58068665-fe9b-4bd9-ac11-a3d6c9ad888e\" (UID: \"58068665-fe9b-4bd9-ac11-a3d6c9ad888e\") " Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.739514 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46jr9\" (UniqueName: \"kubernetes.io/projected/7164d60d-218c-47e0-a74a-677793e589b0-kube-api-access-46jr9\") pod \"7164d60d-218c-47e0-a74a-677793e589b0\" (UID: \"7164d60d-218c-47e0-a74a-677793e589b0\") " Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.739536 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58068665-fe9b-4bd9-ac11-a3d6c9ad888e-catalog-content\") pod \"58068665-fe9b-4bd9-ac11-a3d6c9ad888e\" (UID: \"58068665-fe9b-4bd9-ac11-a3d6c9ad888e\") " Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.739562 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7164d60d-218c-47e0-a74a-677793e589b0-utilities\") pod \"7164d60d-218c-47e0-a74a-677793e589b0\" (UID: \"7164d60d-218c-47e0-a74a-677793e589b0\") " Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.739617 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9483225f-edd3-4728-8e95-67f872692af9-utilities\") pod \"9483225f-edd3-4728-8e95-67f872692af9\" (UID: \"9483225f-edd3-4728-8e95-67f872692af9\") " Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.739652 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9483225f-edd3-4728-8e95-67f872692af9-catalog-content\") pod \"9483225f-edd3-4728-8e95-67f872692af9\" (UID: \"9483225f-edd3-4728-8e95-67f872692af9\") " Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.742520 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7164d60d-218c-47e0-a74a-677793e589b0-utilities" (OuterVolumeSpecName: "utilities") pod "7164d60d-218c-47e0-a74a-677793e589b0" (UID: "7164d60d-218c-47e0-a74a-677793e589b0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.743133 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9483225f-edd3-4728-8e95-67f872692af9-utilities" (OuterVolumeSpecName: "utilities") pod "9483225f-edd3-4728-8e95-67f872692af9" (UID: "9483225f-edd3-4728-8e95-67f872692af9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.743686 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58068665-fe9b-4bd9-ac11-a3d6c9ad888e-utilities" (OuterVolumeSpecName: "utilities") pod "58068665-fe9b-4bd9-ac11-a3d6c9ad888e" (UID: "58068665-fe9b-4bd9-ac11-a3d6c9ad888e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.751129 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58068665-fe9b-4bd9-ac11-a3d6c9ad888e-kube-api-access-ntsgz" (OuterVolumeSpecName: "kube-api-access-ntsgz") pod "58068665-fe9b-4bd9-ac11-a3d6c9ad888e" (UID: "58068665-fe9b-4bd9-ac11-a3d6c9ad888e"). InnerVolumeSpecName "kube-api-access-ntsgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.752475 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7164d60d-218c-47e0-a74a-677793e589b0-kube-api-access-46jr9" (OuterVolumeSpecName: "kube-api-access-46jr9") pod "7164d60d-218c-47e0-a74a-677793e589b0" (UID: "7164d60d-218c-47e0-a74a-677793e589b0"). InnerVolumeSpecName "kube-api-access-46jr9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.756084 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9483225f-edd3-4728-8e95-67f872692af9-kube-api-access-lx6mt" (OuterVolumeSpecName: "kube-api-access-lx6mt") pod "9483225f-edd3-4728-8e95-67f872692af9" (UID: "9483225f-edd3-4728-8e95-67f872692af9"). InnerVolumeSpecName "kube-api-access-lx6mt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.802425 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9483225f-edd3-4728-8e95-67f872692af9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9483225f-edd3-4728-8e95-67f872692af9" (UID: "9483225f-edd3-4728-8e95-67f872692af9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.814184 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7164d60d-218c-47e0-a74a-677793e589b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7164d60d-218c-47e0-a74a-677793e589b0" (UID: "7164d60d-218c-47e0-a74a-677793e589b0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.838143 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58068665-fe9b-4bd9-ac11-a3d6c9ad888e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58068665-fe9b-4bd9-ac11-a3d6c9ad888e" (UID: "58068665-fe9b-4bd9-ac11-a3d6c9ad888e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.840409 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b575e8ec-7b85-4647-b0af-03274d67afc8-marketplace-trusted-ca\") pod \"b575e8ec-7b85-4647-b0af-03274d67afc8\" (UID: \"b575e8ec-7b85-4647-b0af-03274d67afc8\") " Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.840509 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nk27\" (UniqueName: \"kubernetes.io/projected/b575e8ec-7b85-4647-b0af-03274d67afc8-kube-api-access-4nk27\") pod \"b575e8ec-7b85-4647-b0af-03274d67afc8\" (UID: \"b575e8ec-7b85-4647-b0af-03274d67afc8\") " Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.840536 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8422q\" (UniqueName: \"kubernetes.io/projected/20869d58-911c-44ab-8f33-07ffc1056b3b-kube-api-access-8422q\") pod \"20869d58-911c-44ab-8f33-07ffc1056b3b\" (UID: \"20869d58-911c-44ab-8f33-07ffc1056b3b\") " Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.840592 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b575e8ec-7b85-4647-b0af-03274d67afc8-marketplace-operator-metrics\") pod \"b575e8ec-7b85-4647-b0af-03274d67afc8\" (UID: \"b575e8ec-7b85-4647-b0af-03274d67afc8\") " Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.840618 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20869d58-911c-44ab-8f33-07ffc1056b3b-catalog-content\") pod \"20869d58-911c-44ab-8f33-07ffc1056b3b\" (UID: \"20869d58-911c-44ab-8f33-07ffc1056b3b\") " Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.840661 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20869d58-911c-44ab-8f33-07ffc1056b3b-utilities\") pod \"20869d58-911c-44ab-8f33-07ffc1056b3b\" (UID: \"20869d58-911c-44ab-8f33-07ffc1056b3b\") " Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.840946 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7164d60d-218c-47e0-a74a-677793e589b0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.840965 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58068665-fe9b-4bd9-ac11-a3d6c9ad888e-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.840978 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lx6mt\" (UniqueName: \"kubernetes.io/projected/9483225f-edd3-4728-8e95-67f872692af9-kube-api-access-lx6mt\") on node \"crc\" DevicePath \"\"" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.840993 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntsgz\" (UniqueName: \"kubernetes.io/projected/58068665-fe9b-4bd9-ac11-a3d6c9ad888e-kube-api-access-ntsgz\") on node \"crc\" DevicePath \"\"" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.841004 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46jr9\" (UniqueName: \"kubernetes.io/projected/7164d60d-218c-47e0-a74a-677793e589b0-kube-api-access-46jr9\") on node \"crc\" DevicePath \"\"" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.841018 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58068665-fe9b-4bd9-ac11-a3d6c9ad888e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.841030 4909 reconciler_common.go:293] "Volume 
detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7164d60d-218c-47e0-a74a-677793e589b0-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.841041 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9483225f-edd3-4728-8e95-67f872692af9-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.841051 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9483225f-edd3-4728-8e95-67f872692af9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.841778 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20869d58-911c-44ab-8f33-07ffc1056b3b-utilities" (OuterVolumeSpecName: "utilities") pod "20869d58-911c-44ab-8f33-07ffc1056b3b" (UID: "20869d58-911c-44ab-8f33-07ffc1056b3b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.842590 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b575e8ec-7b85-4647-b0af-03274d67afc8-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b575e8ec-7b85-4647-b0af-03274d67afc8" (UID: "b575e8ec-7b85-4647-b0af-03274d67afc8"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.844010 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20869d58-911c-44ab-8f33-07ffc1056b3b-kube-api-access-8422q" (OuterVolumeSpecName: "kube-api-access-8422q") pod "20869d58-911c-44ab-8f33-07ffc1056b3b" (UID: "20869d58-911c-44ab-8f33-07ffc1056b3b"). InnerVolumeSpecName "kube-api-access-8422q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.845404 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b575e8ec-7b85-4647-b0af-03274d67afc8-kube-api-access-4nk27" (OuterVolumeSpecName: "kube-api-access-4nk27") pod "b575e8ec-7b85-4647-b0af-03274d67afc8" (UID: "b575e8ec-7b85-4647-b0af-03274d67afc8"). InnerVolumeSpecName "kube-api-access-4nk27". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.846756 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b575e8ec-7b85-4647-b0af-03274d67afc8-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b575e8ec-7b85-4647-b0af-03274d67afc8" (UID: "b575e8ec-7b85-4647-b0af-03274d67afc8"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.869680 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9df28"] Feb 02 10:37:38 crc kubenswrapper[4909]: W0202 10:37:38.883299 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a36d99a_11d4_4311_bc30_3852c1580fc1.slice/crio-3f59c7aa6e011124fe5006cf1769eb0e6252938d3d8d949f7fbb00c43a3e8437 WatchSource:0}: Error finding container 3f59c7aa6e011124fe5006cf1769eb0e6252938d3d8d949f7fbb00c43a3e8437: Status 404 returned error can't find the container with id 3f59c7aa6e011124fe5006cf1769eb0e6252938d3d8d949f7fbb00c43a3e8437 Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.942654 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nk27\" (UniqueName: \"kubernetes.io/projected/b575e8ec-7b85-4647-b0af-03274d67afc8-kube-api-access-4nk27\") on node \"crc\" DevicePath 
\"\"" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.942679 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8422q\" (UniqueName: \"kubernetes.io/projected/20869d58-911c-44ab-8f33-07ffc1056b3b-kube-api-access-8422q\") on node \"crc\" DevicePath \"\"" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.942691 4909 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b575e8ec-7b85-4647-b0af-03274d67afc8-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.942700 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20869d58-911c-44ab-8f33-07ffc1056b3b-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.942709 4909 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b575e8ec-7b85-4647-b0af-03274d67afc8-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:37:38 crc kubenswrapper[4909]: I0202 10:37:38.998519 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20869d58-911c-44ab-8f33-07ffc1056b3b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "20869d58-911c-44ab-8f33-07ffc1056b3b" (UID: "20869d58-911c-44ab-8f33-07ffc1056b3b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:37:39 crc kubenswrapper[4909]: I0202 10:37:39.043573 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20869d58-911c-44ab-8f33-07ffc1056b3b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:37:39 crc kubenswrapper[4909]: I0202 10:37:39.620182 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2qpc6" Feb 02 10:37:39 crc kubenswrapper[4909]: I0202 10:37:39.620171 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2qpc6" event={"ID":"20869d58-911c-44ab-8f33-07ffc1056b3b","Type":"ContainerDied","Data":"67b8c45b75d900ca86b2984496e2e08af3b612ac38275e0b8d9af0b8d7cc9837"} Feb 02 10:37:39 crc kubenswrapper[4909]: I0202 10:37:39.621665 4909 scope.go:117] "RemoveContainer" containerID="41298415446fe6302e48ab3ed17896c921c3e1f955919a0ae503353e8157769b" Feb 02 10:37:39 crc kubenswrapper[4909]: I0202 10:37:39.623244 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-nt7q4" Feb 02 10:37:39 crc kubenswrapper[4909]: I0202 10:37:39.625620 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mlhl9" Feb 02 10:37:39 crc kubenswrapper[4909]: I0202 10:37:39.626057 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9df28" event={"ID":"0a36d99a-11d4-4311-bc30-3852c1580fc1","Type":"ContainerStarted","Data":"95c3cd66712d4f93fdccb044aa6d2ce09ccb3da854ce03db5be516418df319ec"} Feb 02 10:37:39 crc kubenswrapper[4909]: I0202 10:37:39.626125 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9df28" event={"ID":"0a36d99a-11d4-4311-bc30-3852c1580fc1","Type":"ContainerStarted","Data":"3f59c7aa6e011124fe5006cf1769eb0e6252938d3d8d949f7fbb00c43a3e8437"} Feb 02 10:37:39 crc kubenswrapper[4909]: I0202 10:37:39.626259 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-72ltv" Feb 02 10:37:39 crc kubenswrapper[4909]: I0202 10:37:39.626927 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vn7ln" Feb 02 10:37:39 crc kubenswrapper[4909]: I0202 10:37:39.627763 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-9df28" Feb 02 10:37:39 crc kubenswrapper[4909]: I0202 10:37:39.639714 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-9df28" Feb 02 10:37:39 crc kubenswrapper[4909]: I0202 10:37:39.646049 4909 scope.go:117] "RemoveContainer" containerID="82b876000a029687ce66c11ed9367a0da960d149eb488f5543d1003426afefea" Feb 02 10:37:39 crc kubenswrapper[4909]: I0202 10:37:39.654575 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2qpc6"] Feb 02 10:37:39 crc kubenswrapper[4909]: I0202 10:37:39.658973 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2qpc6"] Feb 02 10:37:39 crc kubenswrapper[4909]: I0202 10:37:39.667199 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nt7q4"] Feb 02 10:37:39 crc kubenswrapper[4909]: I0202 10:37:39.671747 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-nt7q4"] Feb 02 10:37:39 crc kubenswrapper[4909]: I0202 10:37:39.673478 4909 scope.go:117] "RemoveContainer" containerID="8dd68818e21d052db74034a7b1ee30eb2a6a632a6c638f91cd8b059f576b004e" Feb 02 10:37:39 crc kubenswrapper[4909]: I0202 10:37:39.675025 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-72ltv"] Feb 02 10:37:39 crc kubenswrapper[4909]: I0202 10:37:39.678149 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-72ltv"] Feb 02 10:37:39 crc kubenswrapper[4909]: I0202 10:37:39.685063 4909 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/certified-operators-mlhl9"] Feb 02 10:37:39 crc kubenswrapper[4909]: I0202 10:37:39.687167 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mlhl9"] Feb 02 10:37:39 crc kubenswrapper[4909]: I0202 10:37:39.702015 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-9df28" podStartSLOduration=1.7019931000000001 podStartE2EDuration="1.7019931s" podCreationTimestamp="2026-02-02 10:37:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:37:39.698739288 +0000 UTC m=+385.444840023" watchObservedRunningTime="2026-02-02 10:37:39.7019931 +0000 UTC m=+385.448093835" Feb 02 10:37:39 crc kubenswrapper[4909]: I0202 10:37:39.711048 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vn7ln"] Feb 02 10:37:39 crc kubenswrapper[4909]: I0202 10:37:39.721668 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vn7ln"] Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.187857 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zgmhk"] Feb 02 10:37:40 crc kubenswrapper[4909]: E0202 10:37:40.188355 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b575e8ec-7b85-4647-b0af-03274d67afc8" containerName="marketplace-operator" Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.188369 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="b575e8ec-7b85-4647-b0af-03274d67afc8" containerName="marketplace-operator" Feb 02 10:37:40 crc kubenswrapper[4909]: E0202 10:37:40.188380 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20869d58-911c-44ab-8f33-07ffc1056b3b" containerName="registry-server" Feb 02 10:37:40 crc 
kubenswrapper[4909]: I0202 10:37:40.188387 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="20869d58-911c-44ab-8f33-07ffc1056b3b" containerName="registry-server"
Feb 02 10:37:40 crc kubenswrapper[4909]: E0202 10:37:40.188398 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58068665-fe9b-4bd9-ac11-a3d6c9ad888e" containerName="extract-utilities"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.188406 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="58068665-fe9b-4bd9-ac11-a3d6c9ad888e" containerName="extract-utilities"
Feb 02 10:37:40 crc kubenswrapper[4909]: E0202 10:37:40.188415 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7164d60d-218c-47e0-a74a-677793e589b0" containerName="extract-utilities"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.188423 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7164d60d-218c-47e0-a74a-677793e589b0" containerName="extract-utilities"
Feb 02 10:37:40 crc kubenswrapper[4909]: E0202 10:37:40.188433 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7164d60d-218c-47e0-a74a-677793e589b0" containerName="registry-server"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.188440 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7164d60d-218c-47e0-a74a-677793e589b0" containerName="registry-server"
Feb 02 10:37:40 crc kubenswrapper[4909]: E0202 10:37:40.188448 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9483225f-edd3-4728-8e95-67f872692af9" containerName="registry-server"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.188456 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9483225f-edd3-4728-8e95-67f872692af9" containerName="registry-server"
Feb 02 10:37:40 crc kubenswrapper[4909]: E0202 10:37:40.188467 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7164d60d-218c-47e0-a74a-677793e589b0" containerName="extract-content"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.188476 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7164d60d-218c-47e0-a74a-677793e589b0" containerName="extract-content"
Feb 02 10:37:40 crc kubenswrapper[4909]: E0202 10:37:40.188489 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20869d58-911c-44ab-8f33-07ffc1056b3b" containerName="extract-content"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.188496 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="20869d58-911c-44ab-8f33-07ffc1056b3b" containerName="extract-content"
Feb 02 10:37:40 crc kubenswrapper[4909]: E0202 10:37:40.188574 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58068665-fe9b-4bd9-ac11-a3d6c9ad888e" containerName="registry-server"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.188582 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="58068665-fe9b-4bd9-ac11-a3d6c9ad888e" containerName="registry-server"
Feb 02 10:37:40 crc kubenswrapper[4909]: E0202 10:37:40.188598 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9483225f-edd3-4728-8e95-67f872692af9" containerName="extract-content"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.188606 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9483225f-edd3-4728-8e95-67f872692af9" containerName="extract-content"
Feb 02 10:37:40 crc kubenswrapper[4909]: E0202 10:37:40.188617 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58068665-fe9b-4bd9-ac11-a3d6c9ad888e" containerName="extract-content"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.188624 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="58068665-fe9b-4bd9-ac11-a3d6c9ad888e" containerName="extract-content"
Feb 02 10:37:40 crc kubenswrapper[4909]: E0202 10:37:40.188635 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20869d58-911c-44ab-8f33-07ffc1056b3b" containerName="extract-utilities"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.188643 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="20869d58-911c-44ab-8f33-07ffc1056b3b" containerName="extract-utilities"
Feb 02 10:37:40 crc kubenswrapper[4909]: E0202 10:37:40.188652 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9483225f-edd3-4728-8e95-67f872692af9" containerName="extract-utilities"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.188659 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9483225f-edd3-4728-8e95-67f872692af9" containerName="extract-utilities"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.188763 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7164d60d-218c-47e0-a74a-677793e589b0" containerName="registry-server"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.188780 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="20869d58-911c-44ab-8f33-07ffc1056b3b" containerName="registry-server"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.188792 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="58068665-fe9b-4bd9-ac11-a3d6c9ad888e" containerName="registry-server"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.188820 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="b575e8ec-7b85-4647-b0af-03274d67afc8" containerName="marketplace-operator"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.188831 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="b575e8ec-7b85-4647-b0af-03274d67afc8" containerName="marketplace-operator"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.188839 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9483225f-edd3-4728-8e95-67f872692af9" containerName="registry-server"
Feb 02 10:37:40 crc kubenswrapper[4909]: E0202 10:37:40.188961 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b575e8ec-7b85-4647-b0af-03274d67afc8" containerName="marketplace-operator"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.188971 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="b575e8ec-7b85-4647-b0af-03274d67afc8" containerName="marketplace-operator"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.189799 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zgmhk"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.193046 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.202302 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zgmhk"]
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.259133 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/417649dc-6430-472e-9f33-2eb65290602c-catalog-content\") pod \"certified-operators-zgmhk\" (UID: \"417649dc-6430-472e-9f33-2eb65290602c\") " pod="openshift-marketplace/certified-operators-zgmhk"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.259232 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq7cx\" (UniqueName: \"kubernetes.io/projected/417649dc-6430-472e-9f33-2eb65290602c-kube-api-access-nq7cx\") pod \"certified-operators-zgmhk\" (UID: \"417649dc-6430-472e-9f33-2eb65290602c\") " pod="openshift-marketplace/certified-operators-zgmhk"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.259284 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/417649dc-6430-472e-9f33-2eb65290602c-utilities\") pod \"certified-operators-zgmhk\" (UID: \"417649dc-6430-472e-9f33-2eb65290602c\") " pod="openshift-marketplace/certified-operators-zgmhk"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.360267 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq7cx\" (UniqueName: \"kubernetes.io/projected/417649dc-6430-472e-9f33-2eb65290602c-kube-api-access-nq7cx\") pod \"certified-operators-zgmhk\" (UID: \"417649dc-6430-472e-9f33-2eb65290602c\") " pod="openshift-marketplace/certified-operators-zgmhk"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.360327 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/417649dc-6430-472e-9f33-2eb65290602c-utilities\") pod \"certified-operators-zgmhk\" (UID: \"417649dc-6430-472e-9f33-2eb65290602c\") " pod="openshift-marketplace/certified-operators-zgmhk"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.360365 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/417649dc-6430-472e-9f33-2eb65290602c-catalog-content\") pod \"certified-operators-zgmhk\" (UID: \"417649dc-6430-472e-9f33-2eb65290602c\") " pod="openshift-marketplace/certified-operators-zgmhk"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.360871 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/417649dc-6430-472e-9f33-2eb65290602c-catalog-content\") pod \"certified-operators-zgmhk\" (UID: \"417649dc-6430-472e-9f33-2eb65290602c\") " pod="openshift-marketplace/certified-operators-zgmhk"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.361047 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/417649dc-6430-472e-9f33-2eb65290602c-utilities\") pod \"certified-operators-zgmhk\" (UID: \"417649dc-6430-472e-9f33-2eb65290602c\") " pod="openshift-marketplace/certified-operators-zgmhk"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.386565 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq7cx\" (UniqueName: \"kubernetes.io/projected/417649dc-6430-472e-9f33-2eb65290602c-kube-api-access-nq7cx\") pod \"certified-operators-zgmhk\" (UID: \"417649dc-6430-472e-9f33-2eb65290602c\") " pod="openshift-marketplace/certified-operators-zgmhk"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.503249 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zgmhk"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.660734 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zgmhk"]
Feb 02 10:37:40 crc kubenswrapper[4909]: W0202 10:37:40.666825 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod417649dc_6430_472e_9f33_2eb65290602c.slice/crio-c48aef7db02a8c6688cca5464259dc4bd7ab586a43757fb1178b65b7fa3a9523 WatchSource:0}: Error finding container c48aef7db02a8c6688cca5464259dc4bd7ab586a43757fb1178b65b7fa3a9523: Status 404 returned error can't find the container with id c48aef7db02a8c6688cca5464259dc4bd7ab586a43757fb1178b65b7fa3a9523
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.788755 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mn5bb"]
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.789922 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mn5bb"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.792001 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.802205 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mn5bb"]
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.868626 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f-catalog-content\") pod \"redhat-marketplace-mn5bb\" (UID: \"f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f\") " pod="openshift-marketplace/redhat-marketplace-mn5bb"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.868678 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htj6k\" (UniqueName: \"kubernetes.io/projected/f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f-kube-api-access-htj6k\") pod \"redhat-marketplace-mn5bb\" (UID: \"f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f\") " pod="openshift-marketplace/redhat-marketplace-mn5bb"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.868769 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f-utilities\") pod \"redhat-marketplace-mn5bb\" (UID: \"f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f\") " pod="openshift-marketplace/redhat-marketplace-mn5bb"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.970351 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f-utilities\") pod \"redhat-marketplace-mn5bb\" (UID: \"f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f\") " pod="openshift-marketplace/redhat-marketplace-mn5bb"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.970400 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f-catalog-content\") pod \"redhat-marketplace-mn5bb\" (UID: \"f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f\") " pod="openshift-marketplace/redhat-marketplace-mn5bb"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.970421 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htj6k\" (UniqueName: \"kubernetes.io/projected/f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f-kube-api-access-htj6k\") pod \"redhat-marketplace-mn5bb\" (UID: \"f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f\") " pod="openshift-marketplace/redhat-marketplace-mn5bb"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.971115 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f-utilities\") pod \"redhat-marketplace-mn5bb\" (UID: \"f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f\") " pod="openshift-marketplace/redhat-marketplace-mn5bb"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.971161 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f-catalog-content\") pod \"redhat-marketplace-mn5bb\" (UID: \"f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f\") " pod="openshift-marketplace/redhat-marketplace-mn5bb"
Feb 02 10:37:40 crc kubenswrapper[4909]: I0202 10:37:40.988663 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htj6k\" (UniqueName: \"kubernetes.io/projected/f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f-kube-api-access-htj6k\") pod \"redhat-marketplace-mn5bb\" (UID: \"f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f\") " pod="openshift-marketplace/redhat-marketplace-mn5bb"
Feb 02 10:37:41 crc kubenswrapper[4909]: I0202 10:37:41.024731 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20869d58-911c-44ab-8f33-07ffc1056b3b" path="/var/lib/kubelet/pods/20869d58-911c-44ab-8f33-07ffc1056b3b/volumes"
Feb 02 10:37:41 crc kubenswrapper[4909]: I0202 10:37:41.025372 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58068665-fe9b-4bd9-ac11-a3d6c9ad888e" path="/var/lib/kubelet/pods/58068665-fe9b-4bd9-ac11-a3d6c9ad888e/volumes"
Feb 02 10:37:41 crc kubenswrapper[4909]: I0202 10:37:41.026003 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7164d60d-218c-47e0-a74a-677793e589b0" path="/var/lib/kubelet/pods/7164d60d-218c-47e0-a74a-677793e589b0/volumes"
Feb 02 10:37:41 crc kubenswrapper[4909]: I0202 10:37:41.027157 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9483225f-edd3-4728-8e95-67f872692af9" path="/var/lib/kubelet/pods/9483225f-edd3-4728-8e95-67f872692af9/volumes"
Feb 02 10:37:41 crc kubenswrapper[4909]: I0202 10:37:41.027877 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b575e8ec-7b85-4647-b0af-03274d67afc8" path="/var/lib/kubelet/pods/b575e8ec-7b85-4647-b0af-03274d67afc8/volumes"
Feb 02 10:37:41 crc kubenswrapper[4909]: I0202 10:37:41.110987 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mn5bb"
Feb 02 10:37:41 crc kubenswrapper[4909]: I0202 10:37:41.281637 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mn5bb"]
Feb 02 10:37:41 crc kubenswrapper[4909]: I0202 10:37:41.639339 4909 generic.go:334] "Generic (PLEG): container finished" podID="f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f" containerID="322012e67849e7bf68b89a6e87dcd9731a5614414f8381ea1740370a6039290a" exitCode=0
Feb 02 10:37:41 crc kubenswrapper[4909]: I0202 10:37:41.639385 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mn5bb" event={"ID":"f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f","Type":"ContainerDied","Data":"322012e67849e7bf68b89a6e87dcd9731a5614414f8381ea1740370a6039290a"}
Feb 02 10:37:41 crc kubenswrapper[4909]: I0202 10:37:41.639628 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mn5bb" event={"ID":"f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f","Type":"ContainerStarted","Data":"3ba82ff185c97ee4faa6c16eda2c088872f67b2d09ca9e57f58a7d4d884f13b1"}
Feb 02 10:37:41 crc kubenswrapper[4909]: I0202 10:37:41.643681 4909 generic.go:334] "Generic (PLEG): container finished" podID="417649dc-6430-472e-9f33-2eb65290602c" containerID="cc4e46de01a057ece86c99f2eeb695203a8a6f1425b66c5b4ca0c1f7d853f98f" exitCode=0
Feb 02 10:37:41 crc kubenswrapper[4909]: I0202 10:37:41.643990 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgmhk" event={"ID":"417649dc-6430-472e-9f33-2eb65290602c","Type":"ContainerDied","Data":"cc4e46de01a057ece86c99f2eeb695203a8a6f1425b66c5b4ca0c1f7d853f98f"}
Feb 02 10:37:41 crc kubenswrapper[4909]: I0202 10:37:41.644037 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgmhk" event={"ID":"417649dc-6430-472e-9f33-2eb65290602c","Type":"ContainerStarted","Data":"c48aef7db02a8c6688cca5464259dc4bd7ab586a43757fb1178b65b7fa3a9523"}
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.239050 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-slspb" podUID="6e75bc1a-54b9-4897-9da5-0a04a1d952cf" containerName="registry" containerID="cri-o://537045191ecb95920e1a42ac027fe9a1e92efd58f52bf485d065d443e471b70a" gracePeriod=30
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.589301 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q8sjs"]
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.591239 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q8sjs"
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.593997 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.599244 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q8sjs"]
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.653708 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-slspb"
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.666117 4909 generic.go:334] "Generic (PLEG): container finished" podID="417649dc-6430-472e-9f33-2eb65290602c" containerID="14e60b50d65fcfc15fa875ae956dbb76c33d9e984d3583374547cd70aaf839ab" exitCode=0
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.666202 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgmhk" event={"ID":"417649dc-6430-472e-9f33-2eb65290602c","Type":"ContainerDied","Data":"14e60b50d65fcfc15fa875ae956dbb76c33d9e984d3583374547cd70aaf839ab"}
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.669754 4909 generic.go:334] "Generic (PLEG): container finished" podID="6e75bc1a-54b9-4897-9da5-0a04a1d952cf" containerID="537045191ecb95920e1a42ac027fe9a1e92efd58f52bf485d065d443e471b70a" exitCode=0
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.669832 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-slspb"
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.669863 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-slspb" event={"ID":"6e75bc1a-54b9-4897-9da5-0a04a1d952cf","Type":"ContainerDied","Data":"537045191ecb95920e1a42ac027fe9a1e92efd58f52bf485d065d443e471b70a"}
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.669960 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-slspb" event={"ID":"6e75bc1a-54b9-4897-9da5-0a04a1d952cf","Type":"ContainerDied","Data":"c84f1af740a7a9fc310fc4d6ca91881d1a08c31155b9950c171aa5d8d311232e"}
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.669990 4909 scope.go:117] "RemoveContainer" containerID="537045191ecb95920e1a42ac027fe9a1e92efd58f52bf485d065d443e471b70a"
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.672399 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mn5bb" event={"ID":"f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f","Type":"ContainerStarted","Data":"c9ffcd0559ac80df6a90f44b226962a920fe000a56dec6d6c1c51cf2c4f9e106"}
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.708804 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53984a35-198b-4d3e-bfc6-948f14ab1a39-utilities\") pod \"redhat-operators-q8sjs\" (UID: \"53984a35-198b-4d3e-bfc6-948f14ab1a39\") " pod="openshift-marketplace/redhat-operators-q8sjs"
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.708863 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgdxt\" (UniqueName: \"kubernetes.io/projected/53984a35-198b-4d3e-bfc6-948f14ab1a39-kube-api-access-pgdxt\") pod \"redhat-operators-q8sjs\" (UID: \"53984a35-198b-4d3e-bfc6-948f14ab1a39\") " pod="openshift-marketplace/redhat-operators-q8sjs"
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.708897 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53984a35-198b-4d3e-bfc6-948f14ab1a39-catalog-content\") pod \"redhat-operators-q8sjs\" (UID: \"53984a35-198b-4d3e-bfc6-948f14ab1a39\") " pod="openshift-marketplace/redhat-operators-q8sjs"
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.732567 4909 scope.go:117] "RemoveContainer" containerID="537045191ecb95920e1a42ac027fe9a1e92efd58f52bf485d065d443e471b70a"
Feb 02 10:37:42 crc kubenswrapper[4909]: E0202 10:37:42.733324 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"537045191ecb95920e1a42ac027fe9a1e92efd58f52bf485d065d443e471b70a\": container with ID starting with 537045191ecb95920e1a42ac027fe9a1e92efd58f52bf485d065d443e471b70a not found: ID does not exist" containerID="537045191ecb95920e1a42ac027fe9a1e92efd58f52bf485d065d443e471b70a"
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.733393 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"537045191ecb95920e1a42ac027fe9a1e92efd58f52bf485d065d443e471b70a"} err="failed to get container status \"537045191ecb95920e1a42ac027fe9a1e92efd58f52bf485d065d443e471b70a\": rpc error: code = NotFound desc = could not find container \"537045191ecb95920e1a42ac027fe9a1e92efd58f52bf485d065d443e471b70a\": container with ID starting with 537045191ecb95920e1a42ac027fe9a1e92efd58f52bf485d065d443e471b70a not found: ID does not exist"
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.810895 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-registry-certificates\") pod \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") "
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.810989 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-trusted-ca\") pod \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") "
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.811018 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-registry-tls\") pod \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") "
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.811055 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5zq2\" (UniqueName: \"kubernetes.io/projected/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-kube-api-access-g5zq2\") pod \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") "
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.811080 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-bound-sa-token\") pod \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") "
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.811104 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-installation-pull-secrets\") pod \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") "
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.811326 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") "
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.811358 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-ca-trust-extracted\") pod \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\" (UID: \"6e75bc1a-54b9-4897-9da5-0a04a1d952cf\") "
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.811519 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53984a35-198b-4d3e-bfc6-948f14ab1a39-catalog-content\") pod \"redhat-operators-q8sjs\" (UID: \"53984a35-198b-4d3e-bfc6-948f14ab1a39\") " pod="openshift-marketplace/redhat-operators-q8sjs"
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.811646 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53984a35-198b-4d3e-bfc6-948f14ab1a39-utilities\") pod \"redhat-operators-q8sjs\" (UID: \"53984a35-198b-4d3e-bfc6-948f14ab1a39\") " pod="openshift-marketplace/redhat-operators-q8sjs"
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.811684 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgdxt\" (UniqueName: \"kubernetes.io/projected/53984a35-198b-4d3e-bfc6-948f14ab1a39-kube-api-access-pgdxt\") pod \"redhat-operators-q8sjs\" (UID: \"53984a35-198b-4d3e-bfc6-948f14ab1a39\") " pod="openshift-marketplace/redhat-operators-q8sjs"
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.812870 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "6e75bc1a-54b9-4897-9da5-0a04a1d952cf" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.813199 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "6e75bc1a-54b9-4897-9da5-0a04a1d952cf" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.815017 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53984a35-198b-4d3e-bfc6-948f14ab1a39-catalog-content\") pod \"redhat-operators-q8sjs\" (UID: \"53984a35-198b-4d3e-bfc6-948f14ab1a39\") " pod="openshift-marketplace/redhat-operators-q8sjs"
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.815397 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53984a35-198b-4d3e-bfc6-948f14ab1a39-utilities\") pod \"redhat-operators-q8sjs\" (UID: \"53984a35-198b-4d3e-bfc6-948f14ab1a39\") " pod="openshift-marketplace/redhat-operators-q8sjs"
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.827039 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "6e75bc1a-54b9-4897-9da5-0a04a1d952cf" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.827693 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-kube-api-access-g5zq2" (OuterVolumeSpecName: "kube-api-access-g5zq2") pod "6e75bc1a-54b9-4897-9da5-0a04a1d952cf" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf"). InnerVolumeSpecName "kube-api-access-g5zq2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.827951 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "6e75bc1a-54b9-4897-9da5-0a04a1d952cf" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.828160 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "6e75bc1a-54b9-4897-9da5-0a04a1d952cf" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.829238 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgdxt\" (UniqueName: \"kubernetes.io/projected/53984a35-198b-4d3e-bfc6-948f14ab1a39-kube-api-access-pgdxt\") pod \"redhat-operators-q8sjs\" (UID: \"53984a35-198b-4d3e-bfc6-948f14ab1a39\") " pod="openshift-marketplace/redhat-operators-q8sjs"
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.832424 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "6e75bc1a-54b9-4897-9da5-0a04a1d952cf" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.843083 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "6e75bc1a-54b9-4897-9da5-0a04a1d952cf" (UID: "6e75bc1a-54b9-4897-9da5-0a04a1d952cf"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.913174 4909 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.913207 4909 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-registry-certificates\") on node \"crc\" DevicePath \"\""
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.913218 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.913226 4909 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.913235 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5zq2\" (UniqueName: \"kubernetes.io/projected/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-kube-api-access-g5zq2\") on node \"crc\" DevicePath \"\""
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.913243 4909 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.913253 4909 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6e75bc1a-54b9-4897-9da5-0a04a1d952cf-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Feb 02 10:37:42 crc kubenswrapper[4909]: I0202 10:37:42.999897 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-slspb"]
Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.004158 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-slspb"]
Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.022425 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e75bc1a-54b9-4897-9da5-0a04a1d952cf" path="/var/lib/kubelet/pods/6e75bc1a-54b9-4897-9da5-0a04a1d952cf/volumes"
Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.027450 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q8sjs"
Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.196911 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-254vv"]
Feb 02 10:37:43 crc kubenswrapper[4909]: E0202 10:37:43.197410 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e75bc1a-54b9-4897-9da5-0a04a1d952cf" containerName="registry"
Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.197427 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e75bc1a-54b9-4897-9da5-0a04a1d952cf" containerName="registry"
Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.197561 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e75bc1a-54b9-4897-9da5-0a04a1d952cf" containerName="registry"
Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.198388 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-254vv"
Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.202256 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.212226 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-254vv"]
Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.232105 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q8sjs"]
Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.318562 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzrr8\" (UniqueName: \"kubernetes.io/projected/15f674f6-58e6-4a73-8044-12919a852001-kube-api-access-pzrr8\") pod \"community-operators-254vv\" (UID: \"15f674f6-58e6-4a73-8044-12919a852001\") " pod="openshift-marketplace/community-operators-254vv"
Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.318614 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15f674f6-58e6-4a73-8044-12919a852001-utilities\") pod \"community-operators-254vv\" (UID: \"15f674f6-58e6-4a73-8044-12919a852001\") " pod="openshift-marketplace/community-operators-254vv"
Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.318669 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15f674f6-58e6-4a73-8044-12919a852001-catalog-content\") pod \"community-operators-254vv\" (UID: \"15f674f6-58e6-4a73-8044-12919a852001\") " pod="openshift-marketplace/community-operators-254vv"
Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.419471 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15f674f6-58e6-4a73-8044-12919a852001-catalog-content\") pod \"community-operators-254vv\" (UID: \"15f674f6-58e6-4a73-8044-12919a852001\") " pod="openshift-marketplace/community-operators-254vv" Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.419577 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzrr8\" (UniqueName: \"kubernetes.io/projected/15f674f6-58e6-4a73-8044-12919a852001-kube-api-access-pzrr8\") pod \"community-operators-254vv\" (UID: \"15f674f6-58e6-4a73-8044-12919a852001\") " pod="openshift-marketplace/community-operators-254vv" Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.419597 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15f674f6-58e6-4a73-8044-12919a852001-utilities\") pod \"community-operators-254vv\" (UID: \"15f674f6-58e6-4a73-8044-12919a852001\") " pod="openshift-marketplace/community-operators-254vv" Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.420188 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15f674f6-58e6-4a73-8044-12919a852001-utilities\") pod \"community-operators-254vv\" (UID: \"15f674f6-58e6-4a73-8044-12919a852001\") " pod="openshift-marketplace/community-operators-254vv" Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.420205 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15f674f6-58e6-4a73-8044-12919a852001-catalog-content\") pod \"community-operators-254vv\" (UID: \"15f674f6-58e6-4a73-8044-12919a852001\") " pod="openshift-marketplace/community-operators-254vv" Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.445007 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzrr8\" (UniqueName: 
\"kubernetes.io/projected/15f674f6-58e6-4a73-8044-12919a852001-kube-api-access-pzrr8\") pod \"community-operators-254vv\" (UID: \"15f674f6-58e6-4a73-8044-12919a852001\") " pod="openshift-marketplace/community-operators-254vv" Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.518736 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-254vv" Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.687475 4909 generic.go:334] "Generic (PLEG): container finished" podID="f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f" containerID="c9ffcd0559ac80df6a90f44b226962a920fe000a56dec6d6c1c51cf2c4f9e106" exitCode=0 Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.687523 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mn5bb" event={"ID":"f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f","Type":"ContainerDied","Data":"c9ffcd0559ac80df6a90f44b226962a920fe000a56dec6d6c1c51cf2c4f9e106"} Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.687549 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mn5bb" event={"ID":"f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f","Type":"ContainerStarted","Data":"cf447640ed18497e00a11a37371736578aa5dc6f3dfb28b794b8737f53622e4b"} Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.689992 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgmhk" event={"ID":"417649dc-6430-472e-9f33-2eb65290602c","Type":"ContainerStarted","Data":"306f5b1f9979d0d8311b7022237962a7709aa5f955acf130498ace0240e91e9d"} Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.691996 4909 generic.go:334] "Generic (PLEG): container finished" podID="53984a35-198b-4d3e-bfc6-948f14ab1a39" containerID="84d7d63f610ad0fa0ae9e0da5096344e27abd1c8b5c3e88add43651da70e98e7" exitCode=0 Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.692019 4909 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q8sjs" event={"ID":"53984a35-198b-4d3e-bfc6-948f14ab1a39","Type":"ContainerDied","Data":"84d7d63f610ad0fa0ae9e0da5096344e27abd1c8b5c3e88add43651da70e98e7"} Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.692034 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q8sjs" event={"ID":"53984a35-198b-4d3e-bfc6-948f14ab1a39","Type":"ContainerStarted","Data":"5d18a54f19c35e064cbcca7150eb4f6f2c49446143bd71c7066caca7f5a71309"} Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.699414 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-254vv"] Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.714219 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mn5bb" podStartSLOduration=2.175257939 podStartE2EDuration="3.714186873s" podCreationTimestamp="2026-02-02 10:37:40 +0000 UTC" firstStartedPulling="2026-02-02 10:37:41.640756063 +0000 UTC m=+387.386856798" lastFinishedPulling="2026-02-02 10:37:43.179684997 +0000 UTC m=+388.925785732" observedRunningTime="2026-02-02 10:37:43.709117541 +0000 UTC m=+389.455218296" watchObservedRunningTime="2026-02-02 10:37:43.714186873 +0000 UTC m=+389.460287608" Feb 02 10:37:43 crc kubenswrapper[4909]: W0202 10:37:43.714345 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15f674f6_58e6_4a73_8044_12919a852001.slice/crio-a240a643838b314383b1062f908a22e50895511da59920bc28997e91b1c6bde5 WatchSource:0}: Error finding container a240a643838b314383b1062f908a22e50895511da59920bc28997e91b1c6bde5: Status 404 returned error can't find the container with id a240a643838b314383b1062f908a22e50895511da59920bc28997e91b1c6bde5 Feb 02 10:37:43 crc kubenswrapper[4909]: I0202 10:37:43.735534 4909 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-marketplace/certified-operators-zgmhk" podStartSLOduration=2.310003808 podStartE2EDuration="3.735510563s" podCreationTimestamp="2026-02-02 10:37:40 +0000 UTC" firstStartedPulling="2026-02-02 10:37:41.645005593 +0000 UTC m=+387.391106328" lastFinishedPulling="2026-02-02 10:37:43.070512348 +0000 UTC m=+388.816613083" observedRunningTime="2026-02-02 10:37:43.731402447 +0000 UTC m=+389.477503182" watchObservedRunningTime="2026-02-02 10:37:43.735510563 +0000 UTC m=+389.481611298" Feb 02 10:37:44 crc kubenswrapper[4909]: I0202 10:37:44.699012 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q8sjs" event={"ID":"53984a35-198b-4d3e-bfc6-948f14ab1a39","Type":"ContainerStarted","Data":"f65428532ecaf6f4af59542ebe3c53cfa6d73799288816f52074af3a99ef53cb"} Feb 02 10:37:44 crc kubenswrapper[4909]: I0202 10:37:44.701680 4909 generic.go:334] "Generic (PLEG): container finished" podID="15f674f6-58e6-4a73-8044-12919a852001" containerID="008f634e4f3094c8bd5667e859c9ba8a28d0ac3f7109e8049dfec00a192c4740" exitCode=0 Feb 02 10:37:44 crc kubenswrapper[4909]: I0202 10:37:44.702851 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-254vv" event={"ID":"15f674f6-58e6-4a73-8044-12919a852001","Type":"ContainerDied","Data":"008f634e4f3094c8bd5667e859c9ba8a28d0ac3f7109e8049dfec00a192c4740"} Feb 02 10:37:44 crc kubenswrapper[4909]: I0202 10:37:44.702971 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-254vv" event={"ID":"15f674f6-58e6-4a73-8044-12919a852001","Type":"ContainerStarted","Data":"a240a643838b314383b1062f908a22e50895511da59920bc28997e91b1c6bde5"} Feb 02 10:37:45 crc kubenswrapper[4909]: I0202 10:37:45.719985 4909 generic.go:334] "Generic (PLEG): container finished" podID="15f674f6-58e6-4a73-8044-12919a852001" containerID="1028fb620c3f50fbce3aac139773ef83b05245265ecf1547b80ffbc9ec27e263" exitCode=0 Feb 
02 10:37:45 crc kubenswrapper[4909]: I0202 10:37:45.720075 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-254vv" event={"ID":"15f674f6-58e6-4a73-8044-12919a852001","Type":"ContainerDied","Data":"1028fb620c3f50fbce3aac139773ef83b05245265ecf1547b80ffbc9ec27e263"} Feb 02 10:37:45 crc kubenswrapper[4909]: I0202 10:37:45.724333 4909 generic.go:334] "Generic (PLEG): container finished" podID="53984a35-198b-4d3e-bfc6-948f14ab1a39" containerID="f65428532ecaf6f4af59542ebe3c53cfa6d73799288816f52074af3a99ef53cb" exitCode=0 Feb 02 10:37:45 crc kubenswrapper[4909]: I0202 10:37:45.724389 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q8sjs" event={"ID":"53984a35-198b-4d3e-bfc6-948f14ab1a39","Type":"ContainerDied","Data":"f65428532ecaf6f4af59542ebe3c53cfa6d73799288816f52074af3a99ef53cb"} Feb 02 10:37:46 crc kubenswrapper[4909]: I0202 10:37:46.732383 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q8sjs" event={"ID":"53984a35-198b-4d3e-bfc6-948f14ab1a39","Type":"ContainerStarted","Data":"8bbdfd9d370433112f1aa00a93c3daf08b12a6a75d8b25508adde008fb1e3138"} Feb 02 10:37:46 crc kubenswrapper[4909]: I0202 10:37:46.734674 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-254vv" event={"ID":"15f674f6-58e6-4a73-8044-12919a852001","Type":"ContainerStarted","Data":"77e5a82452e090280b98e3855b41aab0b1be48ea3b9c6d834107ed0a186919bd"} Feb 02 10:37:46 crc kubenswrapper[4909]: I0202 10:37:46.766984 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q8sjs" podStartSLOduration=2.233555499 podStartE2EDuration="4.766964433s" podCreationTimestamp="2026-02-02 10:37:42 +0000 UTC" firstStartedPulling="2026-02-02 10:37:43.693001998 +0000 UTC m=+389.439102733" lastFinishedPulling="2026-02-02 10:37:46.226410932 +0000 UTC 
m=+391.972511667" observedRunningTime="2026-02-02 10:37:46.750944043 +0000 UTC m=+392.497044788" watchObservedRunningTime="2026-02-02 10:37:46.766964433 +0000 UTC m=+392.513065168" Feb 02 10:37:46 crc kubenswrapper[4909]: I0202 10:37:46.767579 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-254vv" podStartSLOduration=2.315239894 podStartE2EDuration="3.76757224s" podCreationTimestamp="2026-02-02 10:37:43 +0000 UTC" firstStartedPulling="2026-02-02 10:37:44.703630048 +0000 UTC m=+390.449730783" lastFinishedPulling="2026-02-02 10:37:46.155962394 +0000 UTC m=+391.902063129" observedRunningTime="2026-02-02 10:37:46.765176323 +0000 UTC m=+392.511277058" watchObservedRunningTime="2026-02-02 10:37:46.76757224 +0000 UTC m=+392.513672975" Feb 02 10:37:49 crc kubenswrapper[4909]: I0202 10:37:49.511120 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:37:49 crc kubenswrapper[4909]: I0202 10:37:49.511669 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:37:49 crc kubenswrapper[4909]: I0202 10:37:49.511723 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 10:37:49 crc kubenswrapper[4909]: I0202 10:37:49.512688 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"55fd471bfde53741a19ef0c82ccf0a2fc7d599b14ec95f480294c68c01189727"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 10:37:49 crc kubenswrapper[4909]: I0202 10:37:49.512753 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://55fd471bfde53741a19ef0c82ccf0a2fc7d599b14ec95f480294c68c01189727" gracePeriod=600 Feb 02 10:37:49 crc kubenswrapper[4909]: I0202 10:37:49.752051 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="55fd471bfde53741a19ef0c82ccf0a2fc7d599b14ec95f480294c68c01189727" exitCode=0 Feb 02 10:37:49 crc kubenswrapper[4909]: I0202 10:37:49.752131 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"55fd471bfde53741a19ef0c82ccf0a2fc7d599b14ec95f480294c68c01189727"} Feb 02 10:37:49 crc kubenswrapper[4909]: I0202 10:37:49.752427 4909 scope.go:117] "RemoveContainer" containerID="57158c7f1bb82761a2d89c6d387d7bf2743b574ed97a86684b55ed0855c7f013" Feb 02 10:37:50 crc kubenswrapper[4909]: I0202 10:37:50.504113 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zgmhk" Feb 02 10:37:50 crc kubenswrapper[4909]: I0202 10:37:50.504452 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zgmhk" Feb 02 10:37:50 crc kubenswrapper[4909]: I0202 10:37:50.541503 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zgmhk" 
Feb 02 10:37:50 crc kubenswrapper[4909]: I0202 10:37:50.761499 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"fd03e9658ae912ac531e07a183a5497530ad4917f1b07362f010e9175550e2e4"} Feb 02 10:37:50 crc kubenswrapper[4909]: I0202 10:37:50.811003 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zgmhk" Feb 02 10:37:51 crc kubenswrapper[4909]: I0202 10:37:51.111510 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mn5bb" Feb 02 10:37:51 crc kubenswrapper[4909]: I0202 10:37:51.111551 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mn5bb" Feb 02 10:37:51 crc kubenswrapper[4909]: I0202 10:37:51.157289 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mn5bb" Feb 02 10:37:51 crc kubenswrapper[4909]: I0202 10:37:51.823391 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mn5bb" Feb 02 10:37:53 crc kubenswrapper[4909]: I0202 10:37:53.028147 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q8sjs" Feb 02 10:37:53 crc kubenswrapper[4909]: I0202 10:37:53.029736 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q8sjs" Feb 02 10:37:53 crc kubenswrapper[4909]: I0202 10:37:53.073252 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q8sjs" Feb 02 10:37:53 crc kubenswrapper[4909]: I0202 10:37:53.519511 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-254vv" Feb 02 10:37:53 crc kubenswrapper[4909]: I0202 10:37:53.519588 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-254vv" Feb 02 10:37:53 crc kubenswrapper[4909]: I0202 10:37:53.558709 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-254vv" Feb 02 10:37:53 crc kubenswrapper[4909]: I0202 10:37:53.823285 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q8sjs" Feb 02 10:37:53 crc kubenswrapper[4909]: I0202 10:37:53.825495 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-254vv" Feb 02 10:40:15 crc kubenswrapper[4909]: I0202 10:40:15.200078 4909 scope.go:117] "RemoveContainer" containerID="ff969be085ae8d43c0f5ab045162c9554085bfd5d2d418f2e8787d3210432184" Feb 02 10:40:15 crc kubenswrapper[4909]: I0202 10:40:15.220004 4909 scope.go:117] "RemoveContainer" containerID="6b75b836db82aca720c06f2c34d41058856e207d97d5f8462a43887cf1d653fd" Feb 02 10:40:15 crc kubenswrapper[4909]: I0202 10:40:15.238829 4909 scope.go:117] "RemoveContainer" containerID="df52aced6438fbde03f7ed1f02ce7aeedd49844457c5a0459216293dcbd47f52" Feb 02 10:40:19 crc kubenswrapper[4909]: I0202 10:40:19.511378 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:40:19 crc kubenswrapper[4909]: I0202 10:40:19.511781 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:40:49 crc kubenswrapper[4909]: I0202 10:40:49.511234 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:40:49 crc kubenswrapper[4909]: I0202 10:40:49.512760 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:41:15 crc kubenswrapper[4909]: I0202 10:41:15.269011 4909 scope.go:117] "RemoveContainer" containerID="87395505e45e0fa9c6d0c5b4105d2959b5928c2bf1d6cf5e4f8684baf8f3d33b" Feb 02 10:41:15 crc kubenswrapper[4909]: I0202 10:41:15.287373 4909 scope.go:117] "RemoveContainer" containerID="0f2ec9f22118bd1fd91af8be07d418cd8978c48f9b91c8a1d147f55b2e62fc89" Feb 02 10:41:15 crc kubenswrapper[4909]: I0202 10:41:15.301746 4909 scope.go:117] "RemoveContainer" containerID="b47d02dda8a78823a1afc5a2e2ee50f690dfa3af87f0498872aa2f8b9a2df4e9" Feb 02 10:41:15 crc kubenswrapper[4909]: I0202 10:41:15.317465 4909 scope.go:117] "RemoveContainer" containerID="55959a236926f75800be59e6899fb48a94b11dd291e4d485c7f7a8b7aade6bd5" Feb 02 10:41:15 crc kubenswrapper[4909]: I0202 10:41:15.327098 4909 scope.go:117] "RemoveContainer" containerID="fd84459d6b5dd305397ff426e6d97cea6881fbf3eb15e9df62a9513f52197354" Feb 02 10:41:15 crc kubenswrapper[4909]: I0202 10:41:15.339894 4909 scope.go:117] "RemoveContainer" containerID="36a34c9180ca4fb5850058b6c97fcc54de5ec05ad1814483e266ffd72db9f50e" Feb 02 10:41:15 crc kubenswrapper[4909]: I0202 
10:41:15.353985 4909 scope.go:117] "RemoveContainer" containerID="a0f5998bd01e5e9a2c083e1c74d42ac73c83dd2e6ebcfb63dbae3eccd5d92323" Feb 02 10:41:15 crc kubenswrapper[4909]: I0202 10:41:15.368145 4909 scope.go:117] "RemoveContainer" containerID="d4f6ae2cd6fe1150fe7725bd998f91102e3d5cde1187afcc1b1368c67e018589" Feb 02 10:41:19 crc kubenswrapper[4909]: I0202 10:41:19.511485 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:41:19 crc kubenswrapper[4909]: I0202 10:41:19.511777 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:41:19 crc kubenswrapper[4909]: I0202 10:41:19.511844 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 10:41:19 crc kubenswrapper[4909]: I0202 10:41:19.512414 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fd03e9658ae912ac531e07a183a5497530ad4917f1b07362f010e9175550e2e4"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 10:41:19 crc kubenswrapper[4909]: I0202 10:41:19.512477 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" 
containerID="cri-o://fd03e9658ae912ac531e07a183a5497530ad4917f1b07362f010e9175550e2e4" gracePeriod=600 Feb 02 10:41:19 crc kubenswrapper[4909]: I0202 10:41:19.890978 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="fd03e9658ae912ac531e07a183a5497530ad4917f1b07362f010e9175550e2e4" exitCode=0 Feb 02 10:41:19 crc kubenswrapper[4909]: I0202 10:41:19.891072 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"fd03e9658ae912ac531e07a183a5497530ad4917f1b07362f010e9175550e2e4"} Feb 02 10:41:19 crc kubenswrapper[4909]: I0202 10:41:19.891366 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"93779139e6330b1d279baec90b6f5cebca5bcec1fa26d6a2c9986b097b6f7fb9"} Feb 02 10:41:19 crc kubenswrapper[4909]: I0202 10:41:19.891407 4909 scope.go:117] "RemoveContainer" containerID="55fd471bfde53741a19ef0c82ccf0a2fc7d599b14ec95f480294c68c01189727" Feb 02 10:42:15 crc kubenswrapper[4909]: I0202 10:42:15.440763 4909 scope.go:117] "RemoveContainer" containerID="bb69e8cccdf6fc618d4e442b672cebb8e47697c4026ead7857ed3bafdfc3ffc9" Feb 02 10:43:19 crc kubenswrapper[4909]: I0202 10:43:19.510897 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:43:19 crc kubenswrapper[4909]: I0202 10:43:19.511649 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:43:48 crc kubenswrapper[4909]: I0202 10:43:48.473513 4909 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 02 10:43:49 crc kubenswrapper[4909]: I0202 10:43:49.511503 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:43:49 crc kubenswrapper[4909]: I0202 10:43:49.511571 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:43:55 crc kubenswrapper[4909]: I0202 10:43:55.152667 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h6q5x"] Feb 02 10:43:55 crc kubenswrapper[4909]: I0202 10:43:55.155045 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6q5x" Feb 02 10:43:55 crc kubenswrapper[4909]: I0202 10:43:55.159585 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6q5x"] Feb 02 10:43:55 crc kubenswrapper[4909]: I0202 10:43:55.330728 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-982cl\" (UniqueName: \"kubernetes.io/projected/0dbae5d6-d6ad-477a-9add-ffc5796df097-kube-api-access-982cl\") pod \"redhat-marketplace-h6q5x\" (UID: \"0dbae5d6-d6ad-477a-9add-ffc5796df097\") " pod="openshift-marketplace/redhat-marketplace-h6q5x" Feb 02 10:43:55 crc kubenswrapper[4909]: I0202 10:43:55.330785 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dbae5d6-d6ad-477a-9add-ffc5796df097-utilities\") pod \"redhat-marketplace-h6q5x\" (UID: \"0dbae5d6-d6ad-477a-9add-ffc5796df097\") " pod="openshift-marketplace/redhat-marketplace-h6q5x" Feb 02 10:43:55 crc kubenswrapper[4909]: I0202 10:43:55.330821 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dbae5d6-d6ad-477a-9add-ffc5796df097-catalog-content\") pod \"redhat-marketplace-h6q5x\" (UID: \"0dbae5d6-d6ad-477a-9add-ffc5796df097\") " pod="openshift-marketplace/redhat-marketplace-h6q5x" Feb 02 10:43:55 crc kubenswrapper[4909]: I0202 10:43:55.431673 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-982cl\" (UniqueName: \"kubernetes.io/projected/0dbae5d6-d6ad-477a-9add-ffc5796df097-kube-api-access-982cl\") pod \"redhat-marketplace-h6q5x\" (UID: \"0dbae5d6-d6ad-477a-9add-ffc5796df097\") " pod="openshift-marketplace/redhat-marketplace-h6q5x" Feb 02 10:43:55 crc kubenswrapper[4909]: I0202 10:43:55.431746 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dbae5d6-d6ad-477a-9add-ffc5796df097-utilities\") pod \"redhat-marketplace-h6q5x\" (UID: \"0dbae5d6-d6ad-477a-9add-ffc5796df097\") " pod="openshift-marketplace/redhat-marketplace-h6q5x" Feb 02 10:43:55 crc kubenswrapper[4909]: I0202 10:43:55.431771 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dbae5d6-d6ad-477a-9add-ffc5796df097-catalog-content\") pod \"redhat-marketplace-h6q5x\" (UID: \"0dbae5d6-d6ad-477a-9add-ffc5796df097\") " pod="openshift-marketplace/redhat-marketplace-h6q5x" Feb 02 10:43:55 crc kubenswrapper[4909]: I0202 10:43:55.432314 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dbae5d6-d6ad-477a-9add-ffc5796df097-catalog-content\") pod \"redhat-marketplace-h6q5x\" (UID: \"0dbae5d6-d6ad-477a-9add-ffc5796df097\") " pod="openshift-marketplace/redhat-marketplace-h6q5x" Feb 02 10:43:55 crc kubenswrapper[4909]: I0202 10:43:55.432444 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dbae5d6-d6ad-477a-9add-ffc5796df097-utilities\") pod \"redhat-marketplace-h6q5x\" (UID: \"0dbae5d6-d6ad-477a-9add-ffc5796df097\") " pod="openshift-marketplace/redhat-marketplace-h6q5x" Feb 02 10:43:55 crc kubenswrapper[4909]: I0202 10:43:55.457107 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-982cl\" (UniqueName: \"kubernetes.io/projected/0dbae5d6-d6ad-477a-9add-ffc5796df097-kube-api-access-982cl\") pod \"redhat-marketplace-h6q5x\" (UID: \"0dbae5d6-d6ad-477a-9add-ffc5796df097\") " pod="openshift-marketplace/redhat-marketplace-h6q5x" Feb 02 10:43:55 crc kubenswrapper[4909]: I0202 10:43:55.470926 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6q5x" Feb 02 10:43:55 crc kubenswrapper[4909]: I0202 10:43:55.662100 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6q5x"] Feb 02 10:43:56 crc kubenswrapper[4909]: I0202 10:43:56.650140 4909 generic.go:334] "Generic (PLEG): container finished" podID="0dbae5d6-d6ad-477a-9add-ffc5796df097" containerID="463dd916fcd215d74bc3183d98ea873d33461228debea49139c05d694f6ea8ac" exitCode=0 Feb 02 10:43:56 crc kubenswrapper[4909]: I0202 10:43:56.650180 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6q5x" event={"ID":"0dbae5d6-d6ad-477a-9add-ffc5796df097","Type":"ContainerDied","Data":"463dd916fcd215d74bc3183d98ea873d33461228debea49139c05d694f6ea8ac"} Feb 02 10:43:56 crc kubenswrapper[4909]: I0202 10:43:56.650209 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6q5x" event={"ID":"0dbae5d6-d6ad-477a-9add-ffc5796df097","Type":"ContainerStarted","Data":"281ed299c27ef1ad809088ea0864902e9d611b4ac567ee765b52afe7e9816a5a"} Feb 02 10:43:56 crc kubenswrapper[4909]: I0202 10:43:56.651836 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 10:43:57 crc kubenswrapper[4909]: I0202 10:43:57.658200 4909 generic.go:334] "Generic (PLEG): container finished" podID="0dbae5d6-d6ad-477a-9add-ffc5796df097" containerID="cb8ec86b8297bea235934dbbb01df146768179f9c3be7330b7a31a3302de9cd1" exitCode=0 Feb 02 10:43:57 crc kubenswrapper[4909]: I0202 10:43:57.658392 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6q5x" event={"ID":"0dbae5d6-d6ad-477a-9add-ffc5796df097","Type":"ContainerDied","Data":"cb8ec86b8297bea235934dbbb01df146768179f9c3be7330b7a31a3302de9cd1"} Feb 02 10:43:58 crc kubenswrapper[4909]: I0202 10:43:58.665547 4909 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-h6q5x" event={"ID":"0dbae5d6-d6ad-477a-9add-ffc5796df097","Type":"ContainerStarted","Data":"49e536929011ec8bf1c427dc076f3a6170507f4e896c62fc2bf5302610af9ff0"} Feb 02 10:43:58 crc kubenswrapper[4909]: I0202 10:43:58.688462 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h6q5x" podStartSLOduration=2.300036834 podStartE2EDuration="3.68841286s" podCreationTimestamp="2026-02-02 10:43:55 +0000 UTC" firstStartedPulling="2026-02-02 10:43:56.651534455 +0000 UTC m=+762.397635190" lastFinishedPulling="2026-02-02 10:43:58.039910481 +0000 UTC m=+763.786011216" observedRunningTime="2026-02-02 10:43:58.686144755 +0000 UTC m=+764.432245490" watchObservedRunningTime="2026-02-02 10:43:58.68841286 +0000 UTC m=+764.434513595" Feb 02 10:44:05 crc kubenswrapper[4909]: I0202 10:44:05.471476 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h6q5x" Feb 02 10:44:05 crc kubenswrapper[4909]: I0202 10:44:05.472080 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h6q5x" Feb 02 10:44:05 crc kubenswrapper[4909]: I0202 10:44:05.508883 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h6q5x" Feb 02 10:44:05 crc kubenswrapper[4909]: I0202 10:44:05.735956 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h6q5x" Feb 02 10:44:05 crc kubenswrapper[4909]: I0202 10:44:05.778498 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6q5x"] Feb 02 10:44:07 crc kubenswrapper[4909]: I0202 10:44:07.711386 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h6q5x" 
podUID="0dbae5d6-d6ad-477a-9add-ffc5796df097" containerName="registry-server" containerID="cri-o://49e536929011ec8bf1c427dc076f3a6170507f4e896c62fc2bf5302610af9ff0" gracePeriod=2 Feb 02 10:44:08 crc kubenswrapper[4909]: I0202 10:44:08.034385 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6q5x" Feb 02 10:44:08 crc kubenswrapper[4909]: I0202 10:44:08.183248 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dbae5d6-d6ad-477a-9add-ffc5796df097-utilities\") pod \"0dbae5d6-d6ad-477a-9add-ffc5796df097\" (UID: \"0dbae5d6-d6ad-477a-9add-ffc5796df097\") " Feb 02 10:44:08 crc kubenswrapper[4909]: I0202 10:44:08.183335 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-982cl\" (UniqueName: \"kubernetes.io/projected/0dbae5d6-d6ad-477a-9add-ffc5796df097-kube-api-access-982cl\") pod \"0dbae5d6-d6ad-477a-9add-ffc5796df097\" (UID: \"0dbae5d6-d6ad-477a-9add-ffc5796df097\") " Feb 02 10:44:08 crc kubenswrapper[4909]: I0202 10:44:08.183379 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dbae5d6-d6ad-477a-9add-ffc5796df097-catalog-content\") pod \"0dbae5d6-d6ad-477a-9add-ffc5796df097\" (UID: \"0dbae5d6-d6ad-477a-9add-ffc5796df097\") " Feb 02 10:44:08 crc kubenswrapper[4909]: I0202 10:44:08.185181 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dbae5d6-d6ad-477a-9add-ffc5796df097-utilities" (OuterVolumeSpecName: "utilities") pod "0dbae5d6-d6ad-477a-9add-ffc5796df097" (UID: "0dbae5d6-d6ad-477a-9add-ffc5796df097"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:44:08 crc kubenswrapper[4909]: I0202 10:44:08.193033 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dbae5d6-d6ad-477a-9add-ffc5796df097-kube-api-access-982cl" (OuterVolumeSpecName: "kube-api-access-982cl") pod "0dbae5d6-d6ad-477a-9add-ffc5796df097" (UID: "0dbae5d6-d6ad-477a-9add-ffc5796df097"). InnerVolumeSpecName "kube-api-access-982cl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:44:08 crc kubenswrapper[4909]: I0202 10:44:08.206517 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dbae5d6-d6ad-477a-9add-ffc5796df097-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0dbae5d6-d6ad-477a-9add-ffc5796df097" (UID: "0dbae5d6-d6ad-477a-9add-ffc5796df097"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:44:08 crc kubenswrapper[4909]: I0202 10:44:08.284973 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-982cl\" (UniqueName: \"kubernetes.io/projected/0dbae5d6-d6ad-477a-9add-ffc5796df097-kube-api-access-982cl\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:08 crc kubenswrapper[4909]: I0202 10:44:08.285005 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dbae5d6-d6ad-477a-9add-ffc5796df097-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:08 crc kubenswrapper[4909]: I0202 10:44:08.285014 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dbae5d6-d6ad-477a-9add-ffc5796df097-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:08 crc kubenswrapper[4909]: I0202 10:44:08.719624 4909 generic.go:334] "Generic (PLEG): container finished" podID="0dbae5d6-d6ad-477a-9add-ffc5796df097" 
containerID="49e536929011ec8bf1c427dc076f3a6170507f4e896c62fc2bf5302610af9ff0" exitCode=0 Feb 02 10:44:08 crc kubenswrapper[4909]: I0202 10:44:08.719667 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6q5x" event={"ID":"0dbae5d6-d6ad-477a-9add-ffc5796df097","Type":"ContainerDied","Data":"49e536929011ec8bf1c427dc076f3a6170507f4e896c62fc2bf5302610af9ff0"} Feb 02 10:44:08 crc kubenswrapper[4909]: I0202 10:44:08.719693 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6q5x" event={"ID":"0dbae5d6-d6ad-477a-9add-ffc5796df097","Type":"ContainerDied","Data":"281ed299c27ef1ad809088ea0864902e9d611b4ac567ee765b52afe7e9816a5a"} Feb 02 10:44:08 crc kubenswrapper[4909]: I0202 10:44:08.719704 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6q5x" Feb 02 10:44:08 crc kubenswrapper[4909]: I0202 10:44:08.719711 4909 scope.go:117] "RemoveContainer" containerID="49e536929011ec8bf1c427dc076f3a6170507f4e896c62fc2bf5302610af9ff0" Feb 02 10:44:08 crc kubenswrapper[4909]: I0202 10:44:08.734677 4909 scope.go:117] "RemoveContainer" containerID="cb8ec86b8297bea235934dbbb01df146768179f9c3be7330b7a31a3302de9cd1" Feb 02 10:44:08 crc kubenswrapper[4909]: I0202 10:44:08.751009 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6q5x"] Feb 02 10:44:08 crc kubenswrapper[4909]: I0202 10:44:08.755375 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6q5x"] Feb 02 10:44:08 crc kubenswrapper[4909]: I0202 10:44:08.770715 4909 scope.go:117] "RemoveContainer" containerID="463dd916fcd215d74bc3183d98ea873d33461228debea49139c05d694f6ea8ac" Feb 02 10:44:08 crc kubenswrapper[4909]: I0202 10:44:08.785571 4909 scope.go:117] "RemoveContainer" containerID="49e536929011ec8bf1c427dc076f3a6170507f4e896c62fc2bf5302610af9ff0" Feb 02 
10:44:08 crc kubenswrapper[4909]: E0202 10:44:08.786103 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49e536929011ec8bf1c427dc076f3a6170507f4e896c62fc2bf5302610af9ff0\": container with ID starting with 49e536929011ec8bf1c427dc076f3a6170507f4e896c62fc2bf5302610af9ff0 not found: ID does not exist" containerID="49e536929011ec8bf1c427dc076f3a6170507f4e896c62fc2bf5302610af9ff0" Feb 02 10:44:08 crc kubenswrapper[4909]: I0202 10:44:08.786149 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49e536929011ec8bf1c427dc076f3a6170507f4e896c62fc2bf5302610af9ff0"} err="failed to get container status \"49e536929011ec8bf1c427dc076f3a6170507f4e896c62fc2bf5302610af9ff0\": rpc error: code = NotFound desc = could not find container \"49e536929011ec8bf1c427dc076f3a6170507f4e896c62fc2bf5302610af9ff0\": container with ID starting with 49e536929011ec8bf1c427dc076f3a6170507f4e896c62fc2bf5302610af9ff0 not found: ID does not exist" Feb 02 10:44:08 crc kubenswrapper[4909]: I0202 10:44:08.786177 4909 scope.go:117] "RemoveContainer" containerID="cb8ec86b8297bea235934dbbb01df146768179f9c3be7330b7a31a3302de9cd1" Feb 02 10:44:08 crc kubenswrapper[4909]: E0202 10:44:08.786481 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb8ec86b8297bea235934dbbb01df146768179f9c3be7330b7a31a3302de9cd1\": container with ID starting with cb8ec86b8297bea235934dbbb01df146768179f9c3be7330b7a31a3302de9cd1 not found: ID does not exist" containerID="cb8ec86b8297bea235934dbbb01df146768179f9c3be7330b7a31a3302de9cd1" Feb 02 10:44:08 crc kubenswrapper[4909]: I0202 10:44:08.786514 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb8ec86b8297bea235934dbbb01df146768179f9c3be7330b7a31a3302de9cd1"} err="failed to get container status 
\"cb8ec86b8297bea235934dbbb01df146768179f9c3be7330b7a31a3302de9cd1\": rpc error: code = NotFound desc = could not find container \"cb8ec86b8297bea235934dbbb01df146768179f9c3be7330b7a31a3302de9cd1\": container with ID starting with cb8ec86b8297bea235934dbbb01df146768179f9c3be7330b7a31a3302de9cd1 not found: ID does not exist" Feb 02 10:44:08 crc kubenswrapper[4909]: I0202 10:44:08.786527 4909 scope.go:117] "RemoveContainer" containerID="463dd916fcd215d74bc3183d98ea873d33461228debea49139c05d694f6ea8ac" Feb 02 10:44:08 crc kubenswrapper[4909]: E0202 10:44:08.786760 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"463dd916fcd215d74bc3183d98ea873d33461228debea49139c05d694f6ea8ac\": container with ID starting with 463dd916fcd215d74bc3183d98ea873d33461228debea49139c05d694f6ea8ac not found: ID does not exist" containerID="463dd916fcd215d74bc3183d98ea873d33461228debea49139c05d694f6ea8ac" Feb 02 10:44:08 crc kubenswrapper[4909]: I0202 10:44:08.786801 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"463dd916fcd215d74bc3183d98ea873d33461228debea49139c05d694f6ea8ac"} err="failed to get container status \"463dd916fcd215d74bc3183d98ea873d33461228debea49139c05d694f6ea8ac\": rpc error: code = NotFound desc = could not find container \"463dd916fcd215d74bc3183d98ea873d33461228debea49139c05d694f6ea8ac\": container with ID starting with 463dd916fcd215d74bc3183d98ea873d33461228debea49139c05d694f6ea8ac not found: ID does not exist" Feb 02 10:44:09 crc kubenswrapper[4909]: I0202 10:44:09.022374 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dbae5d6-d6ad-477a-9add-ffc5796df097" path="/var/lib/kubelet/pods/0dbae5d6-d6ad-477a-9add-ffc5796df097/volumes" Feb 02 10:44:15 crc kubenswrapper[4909]: I0202 10:44:15.026870 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8vqmg"] Feb 02 10:44:15 
crc kubenswrapper[4909]: E0202 10:44:15.027835 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dbae5d6-d6ad-477a-9add-ffc5796df097" containerName="extract-utilities" Feb 02 10:44:15 crc kubenswrapper[4909]: I0202 10:44:15.027851 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dbae5d6-d6ad-477a-9add-ffc5796df097" containerName="extract-utilities" Feb 02 10:44:15 crc kubenswrapper[4909]: E0202 10:44:15.027870 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dbae5d6-d6ad-477a-9add-ffc5796df097" containerName="registry-server" Feb 02 10:44:15 crc kubenswrapper[4909]: I0202 10:44:15.027877 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dbae5d6-d6ad-477a-9add-ffc5796df097" containerName="registry-server" Feb 02 10:44:15 crc kubenswrapper[4909]: E0202 10:44:15.027915 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dbae5d6-d6ad-477a-9add-ffc5796df097" containerName="extract-content" Feb 02 10:44:15 crc kubenswrapper[4909]: I0202 10:44:15.027925 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dbae5d6-d6ad-477a-9add-ffc5796df097" containerName="extract-content" Feb 02 10:44:15 crc kubenswrapper[4909]: I0202 10:44:15.028091 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dbae5d6-d6ad-477a-9add-ffc5796df097" containerName="registry-server" Feb 02 10:44:15 crc kubenswrapper[4909]: I0202 10:44:15.029341 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8vqmg" Feb 02 10:44:15 crc kubenswrapper[4909]: I0202 10:44:15.039588 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8vqmg"] Feb 02 10:44:15 crc kubenswrapper[4909]: I0202 10:44:15.056464 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04b8c107-1b45-4bc4-8b39-d958dc28818a-utilities\") pod \"community-operators-8vqmg\" (UID: \"04b8c107-1b45-4bc4-8b39-d958dc28818a\") " pod="openshift-marketplace/community-operators-8vqmg" Feb 02 10:44:15 crc kubenswrapper[4909]: I0202 10:44:15.056503 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7swc\" (UniqueName: \"kubernetes.io/projected/04b8c107-1b45-4bc4-8b39-d958dc28818a-kube-api-access-t7swc\") pod \"community-operators-8vqmg\" (UID: \"04b8c107-1b45-4bc4-8b39-d958dc28818a\") " pod="openshift-marketplace/community-operators-8vqmg" Feb 02 10:44:15 crc kubenswrapper[4909]: I0202 10:44:15.056543 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04b8c107-1b45-4bc4-8b39-d958dc28818a-catalog-content\") pod \"community-operators-8vqmg\" (UID: \"04b8c107-1b45-4bc4-8b39-d958dc28818a\") " pod="openshift-marketplace/community-operators-8vqmg" Feb 02 10:44:15 crc kubenswrapper[4909]: I0202 10:44:15.157584 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04b8c107-1b45-4bc4-8b39-d958dc28818a-utilities\") pod \"community-operators-8vqmg\" (UID: \"04b8c107-1b45-4bc4-8b39-d958dc28818a\") " pod="openshift-marketplace/community-operators-8vqmg" Feb 02 10:44:15 crc kubenswrapper[4909]: I0202 10:44:15.157637 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-t7swc\" (UniqueName: \"kubernetes.io/projected/04b8c107-1b45-4bc4-8b39-d958dc28818a-kube-api-access-t7swc\") pod \"community-operators-8vqmg\" (UID: \"04b8c107-1b45-4bc4-8b39-d958dc28818a\") " pod="openshift-marketplace/community-operators-8vqmg" Feb 02 10:44:15 crc kubenswrapper[4909]: I0202 10:44:15.157677 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04b8c107-1b45-4bc4-8b39-d958dc28818a-catalog-content\") pod \"community-operators-8vqmg\" (UID: \"04b8c107-1b45-4bc4-8b39-d958dc28818a\") " pod="openshift-marketplace/community-operators-8vqmg" Feb 02 10:44:15 crc kubenswrapper[4909]: I0202 10:44:15.158245 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04b8c107-1b45-4bc4-8b39-d958dc28818a-catalog-content\") pod \"community-operators-8vqmg\" (UID: \"04b8c107-1b45-4bc4-8b39-d958dc28818a\") " pod="openshift-marketplace/community-operators-8vqmg" Feb 02 10:44:15 crc kubenswrapper[4909]: I0202 10:44:15.158288 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04b8c107-1b45-4bc4-8b39-d958dc28818a-utilities\") pod \"community-operators-8vqmg\" (UID: \"04b8c107-1b45-4bc4-8b39-d958dc28818a\") " pod="openshift-marketplace/community-operators-8vqmg" Feb 02 10:44:15 crc kubenswrapper[4909]: I0202 10:44:15.180958 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7swc\" (UniqueName: \"kubernetes.io/projected/04b8c107-1b45-4bc4-8b39-d958dc28818a-kube-api-access-t7swc\") pod \"community-operators-8vqmg\" (UID: \"04b8c107-1b45-4bc4-8b39-d958dc28818a\") " pod="openshift-marketplace/community-operators-8vqmg" Feb 02 10:44:15 crc kubenswrapper[4909]: I0202 10:44:15.354747 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8vqmg" Feb 02 10:44:15 crc kubenswrapper[4909]: I0202 10:44:15.785306 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8vqmg"] Feb 02 10:44:16 crc kubenswrapper[4909]: I0202 10:44:16.763308 4909 generic.go:334] "Generic (PLEG): container finished" podID="04b8c107-1b45-4bc4-8b39-d958dc28818a" containerID="e5c291b1683154c2879bb0ac41398876ba8d668e2006637910dc15eaa9b31a3c" exitCode=0 Feb 02 10:44:16 crc kubenswrapper[4909]: I0202 10:44:16.763358 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vqmg" event={"ID":"04b8c107-1b45-4bc4-8b39-d958dc28818a","Type":"ContainerDied","Data":"e5c291b1683154c2879bb0ac41398876ba8d668e2006637910dc15eaa9b31a3c"} Feb 02 10:44:16 crc kubenswrapper[4909]: I0202 10:44:16.765373 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vqmg" event={"ID":"04b8c107-1b45-4bc4-8b39-d958dc28818a","Type":"ContainerStarted","Data":"541b2cba6c79139e29bc5d8c22f3502a0ed6deffb947eba301e5aa77990106ef"} Feb 02 10:44:17 crc kubenswrapper[4909]: I0202 10:44:17.770974 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vqmg" event={"ID":"04b8c107-1b45-4bc4-8b39-d958dc28818a","Type":"ContainerStarted","Data":"b70d6a955eb7b640acc5398cf7e6c36c074fc66b9183853bcc5ba586682891d7"} Feb 02 10:44:18 crc kubenswrapper[4909]: I0202 10:44:18.777251 4909 generic.go:334] "Generic (PLEG): container finished" podID="04b8c107-1b45-4bc4-8b39-d958dc28818a" containerID="b70d6a955eb7b640acc5398cf7e6c36c074fc66b9183853bcc5ba586682891d7" exitCode=0 Feb 02 10:44:18 crc kubenswrapper[4909]: I0202 10:44:18.777296 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vqmg" 
event={"ID":"04b8c107-1b45-4bc4-8b39-d958dc28818a","Type":"ContainerDied","Data":"b70d6a955eb7b640acc5398cf7e6c36c074fc66b9183853bcc5ba586682891d7"} Feb 02 10:44:19 crc kubenswrapper[4909]: I0202 10:44:19.511384 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:44:19 crc kubenswrapper[4909]: I0202 10:44:19.511802 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:44:19 crc kubenswrapper[4909]: I0202 10:44:19.512027 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 10:44:19 crc kubenswrapper[4909]: I0202 10:44:19.512538 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"93779139e6330b1d279baec90b6f5cebca5bcec1fa26d6a2c9986b097b6f7fb9"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 10:44:19 crc kubenswrapper[4909]: I0202 10:44:19.512600 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://93779139e6330b1d279baec90b6f5cebca5bcec1fa26d6a2c9986b097b6f7fb9" gracePeriod=600 Feb 02 10:44:19 crc kubenswrapper[4909]: I0202 10:44:19.784216 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vqmg" event={"ID":"04b8c107-1b45-4bc4-8b39-d958dc28818a","Type":"ContainerStarted","Data":"fcd2c3dce8feb0de84cd701cd5d746cc6268c9416a4f15fee99bdc106f96f479"} Feb 02 10:44:19 crc kubenswrapper[4909]: I0202 10:44:19.787710 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="93779139e6330b1d279baec90b6f5cebca5bcec1fa26d6a2c9986b097b6f7fb9" exitCode=0 Feb 02 10:44:19 crc kubenswrapper[4909]: I0202 10:44:19.787772 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"93779139e6330b1d279baec90b6f5cebca5bcec1fa26d6a2c9986b097b6f7fb9"} Feb 02 10:44:19 crc kubenswrapper[4909]: I0202 10:44:19.787833 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"e0831b6285fe2493141946d0a4e8629f9b6b1551f717985b11e1d8a63f78fa44"} Feb 02 10:44:19 crc kubenswrapper[4909]: I0202 10:44:19.787869 4909 scope.go:117] "RemoveContainer" containerID="fd03e9658ae912ac531e07a183a5497530ad4917f1b07362f010e9175550e2e4" Feb 02 10:44:19 crc kubenswrapper[4909]: I0202 10:44:19.803161 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8vqmg" podStartSLOduration=2.396141702 podStartE2EDuration="4.803142612s" podCreationTimestamp="2026-02-02 10:44:15 +0000 UTC" firstStartedPulling="2026-02-02 10:44:16.765714177 +0000 UTC m=+782.511814912" lastFinishedPulling="2026-02-02 10:44:19.172715097 +0000 UTC m=+784.918815822" observedRunningTime="2026-02-02 10:44:19.799786257 +0000 UTC m=+785.545886992" watchObservedRunningTime="2026-02-02 10:44:19.803142612 +0000 UTC m=+785.549243347" Feb 
02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.085178 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-775zr"] Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.086209 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="ovn-controller" containerID="cri-o://1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b" gracePeriod=30 Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.086299 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="nbdb" containerID="cri-o://034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f" gracePeriod=30 Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.086336 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177" gracePeriod=30 Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.086322 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="northd" containerID="cri-o://56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85" gracePeriod=30 Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.086375 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="ovn-acl-logging" containerID="cri-o://55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2" gracePeriod=30 Feb 
02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.086398 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="kube-rbac-proxy-node" containerID="cri-o://2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983" gracePeriod=30 Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.086654 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="sbdb" containerID="cri-o://311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700" gracePeriod=30 Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.116542 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="ovnkube-controller" containerID="cri-o://28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57" gracePeriod=30 Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.430079 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-775zr_ca5084bc-8bd1-4964-9a52-384222fc8374/ovnkube-controller/3.log" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.432123 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-775zr_ca5084bc-8bd1-4964-9a52-384222fc8374/ovn-acl-logging/0.log" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.432512 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-775zr_ca5084bc-8bd1-4964-9a52-384222fc8374/ovn-controller/0.log" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.432868 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.486722 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8rmxm"] Feb 02 10:44:21 crc kubenswrapper[4909]: E0202 10:44:21.486922 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="kube-rbac-proxy-node" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.486933 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="kube-rbac-proxy-node" Feb 02 10:44:21 crc kubenswrapper[4909]: E0202 10:44:21.486941 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="ovnkube-controller" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.486947 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="ovnkube-controller" Feb 02 10:44:21 crc kubenswrapper[4909]: E0202 10:44:21.486954 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="ovnkube-controller" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.486960 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="ovnkube-controller" Feb 02 10:44:21 crc kubenswrapper[4909]: E0202 10:44:21.486968 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="ovnkube-controller" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.486974 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="ovnkube-controller" Feb 02 10:44:21 crc kubenswrapper[4909]: E0202 10:44:21.486985 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="ovnkube-controller" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.486992 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="ovnkube-controller" Feb 02 10:44:21 crc kubenswrapper[4909]: E0202 10:44:21.487000 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="ovn-acl-logging" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.487007 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="ovn-acl-logging" Feb 02 10:44:21 crc kubenswrapper[4909]: E0202 10:44:21.487016 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="nbdb" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.487022 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="nbdb" Feb 02 10:44:21 crc kubenswrapper[4909]: E0202 10:44:21.487031 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="kubecfg-setup" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.487037 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="kubecfg-setup" Feb 02 10:44:21 crc kubenswrapper[4909]: E0202 10:44:21.487044 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="kube-rbac-proxy-ovn-metrics" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.487050 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="kube-rbac-proxy-ovn-metrics" Feb 02 10:44:21 crc kubenswrapper[4909]: E0202 10:44:21.487062 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="sbdb" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.487069 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="sbdb" Feb 02 10:44:21 crc kubenswrapper[4909]: E0202 10:44:21.487077 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="ovn-controller" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.487083 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="ovn-controller" Feb 02 10:44:21 crc kubenswrapper[4909]: E0202 10:44:21.487091 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="northd" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.487096 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="northd" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.487176 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="ovnkube-controller" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.487184 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="ovnkube-controller" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.487195 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="nbdb" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.487204 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="sbdb" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.487210 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" 
containerName="ovn-controller" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.487219 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="kube-rbac-proxy-node" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.487225 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="ovn-acl-logging" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.487232 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="ovnkube-controller" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.487239 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="northd" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.487247 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="kube-rbac-proxy-ovn-metrics" Feb 02 10:44:21 crc kubenswrapper[4909]: E0202 10:44:21.487341 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="ovnkube-controller" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.487348 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="ovnkube-controller" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.487425 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="ovnkube-controller" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.487574 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerName="ovnkube-controller" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.488780 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630343 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-run-systemd\") pod \"ca5084bc-8bd1-4964-9a52-384222fc8374\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630390 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-cni-bin\") pod \"ca5084bc-8bd1-4964-9a52-384222fc8374\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630420 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ca5084bc-8bd1-4964-9a52-384222fc8374-env-overrides\") pod \"ca5084bc-8bd1-4964-9a52-384222fc8374\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630439 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ca5084bc-8bd1-4964-9a52-384222fc8374\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630457 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ca5084bc-8bd1-4964-9a52-384222fc8374-ovnkube-script-lib\") pod \"ca5084bc-8bd1-4964-9a52-384222fc8374\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630479 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-run-netns\") pod \"ca5084bc-8bd1-4964-9a52-384222fc8374\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630500 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-run-ovn\") pod \"ca5084bc-8bd1-4964-9a52-384222fc8374\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630490 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "ca5084bc-8bd1-4964-9a52-384222fc8374" (UID: "ca5084bc-8bd1-4964-9a52-384222fc8374"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630517 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-var-lib-openvswitch\") pod \"ca5084bc-8bd1-4964-9a52-384222fc8374\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630535 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-systemd-units\") pod \"ca5084bc-8bd1-4964-9a52-384222fc8374\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630552 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tkj6\" (UniqueName: 
\"kubernetes.io/projected/ca5084bc-8bd1-4964-9a52-384222fc8374-kube-api-access-4tkj6\") pod \"ca5084bc-8bd1-4964-9a52-384222fc8374\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630569 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-run-ovn-kubernetes\") pod \"ca5084bc-8bd1-4964-9a52-384222fc8374\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630590 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ca5084bc-8bd1-4964-9a52-384222fc8374-ovnkube-config\") pod \"ca5084bc-8bd1-4964-9a52-384222fc8374\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630532 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "ca5084bc-8bd1-4964-9a52-384222fc8374" (UID: "ca5084bc-8bd1-4964-9a52-384222fc8374"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630581 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "ca5084bc-8bd1-4964-9a52-384222fc8374" (UID: "ca5084bc-8bd1-4964-9a52-384222fc8374"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630619 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "ca5084bc-8bd1-4964-9a52-384222fc8374" (UID: "ca5084bc-8bd1-4964-9a52-384222fc8374"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630610 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-etc-openvswitch\") pod \"ca5084bc-8bd1-4964-9a52-384222fc8374\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630638 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "ca5084bc-8bd1-4964-9a52-384222fc8374" (UID: "ca5084bc-8bd1-4964-9a52-384222fc8374"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630641 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "ca5084bc-8bd1-4964-9a52-384222fc8374" (UID: "ca5084bc-8bd1-4964-9a52-384222fc8374"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630705 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-node-log\") pod \"ca5084bc-8bd1-4964-9a52-384222fc8374\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630682 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "ca5084bc-8bd1-4964-9a52-384222fc8374" (UID: "ca5084bc-8bd1-4964-9a52-384222fc8374"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630725 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-kubelet\") pod \"ca5084bc-8bd1-4964-9a52-384222fc8374\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630742 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-slash\") pod \"ca5084bc-8bd1-4964-9a52-384222fc8374\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630746 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-node-log" (OuterVolumeSpecName: "node-log") pod "ca5084bc-8bd1-4964-9a52-384222fc8374" (UID: "ca5084bc-8bd1-4964-9a52-384222fc8374"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630759 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-log-socket\") pod \"ca5084bc-8bd1-4964-9a52-384222fc8374\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630780 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "ca5084bc-8bd1-4964-9a52-384222fc8374" (UID: "ca5084bc-8bd1-4964-9a52-384222fc8374"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630822 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "ca5084bc-8bd1-4964-9a52-384222fc8374" (UID: "ca5084bc-8bd1-4964-9a52-384222fc8374"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630840 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-run-openvswitch\") pod \"ca5084bc-8bd1-4964-9a52-384222fc8374\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630865 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-cni-netd\") pod \"ca5084bc-8bd1-4964-9a52-384222fc8374\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.630887 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ca5084bc-8bd1-4964-9a52-384222fc8374-ovn-node-metrics-cert\") pod \"ca5084bc-8bd1-4964-9a52-384222fc8374\" (UID: \"ca5084bc-8bd1-4964-9a52-384222fc8374\") " Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.631108 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca5084bc-8bd1-4964-9a52-384222fc8374-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "ca5084bc-8bd1-4964-9a52-384222fc8374" (UID: "ca5084bc-8bd1-4964-9a52-384222fc8374"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.631157 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-log-socket" (OuterVolumeSpecName: "log-socket") pod "ca5084bc-8bd1-4964-9a52-384222fc8374" (UID: "ca5084bc-8bd1-4964-9a52-384222fc8374"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.631155 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-var-lib-openvswitch\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.631191 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca5084bc-8bd1-4964-9a52-384222fc8374-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "ca5084bc-8bd1-4964-9a52-384222fc8374" (UID: "ca5084bc-8bd1-4964-9a52-384222fc8374"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.631200 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "ca5084bc-8bd1-4964-9a52-384222fc8374" (UID: "ca5084bc-8bd1-4964-9a52-384222fc8374"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.631220 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "ca5084bc-8bd1-4964-9a52-384222fc8374" (UID: "ca5084bc-8bd1-4964-9a52-384222fc8374"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.631229 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-slash" (OuterVolumeSpecName: "host-slash") pod "ca5084bc-8bd1-4964-9a52-384222fc8374" (UID: "ca5084bc-8bd1-4964-9a52-384222fc8374"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.631236 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca5084bc-8bd1-4964-9a52-384222fc8374-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "ca5084bc-8bd1-4964-9a52-384222fc8374" (UID: "ca5084bc-8bd1-4964-9a52-384222fc8374"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.631247 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-host-slash\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.631294 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68srp\" (UniqueName: \"kubernetes.io/projected/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-kube-api-access-68srp\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.631367 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-ovnkube-config\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.631393 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-ovnkube-script-lib\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.631420 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-run-openvswitch\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.631449 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-node-log\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.631610 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-host-cni-netd\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.631667 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-ovn-node-metrics-cert\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.631701 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.631751 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-run-ovn\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.631774 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-host-run-netns\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.631841 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-run-systemd\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.631871 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-log-socket\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.631901 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-env-overrides\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.631998 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-etc-openvswitch\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.632015 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-host-cni-bin\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.632030 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-systemd-units\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.632056 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-host-kubelet\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.632132 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-host-run-ovn-kubernetes\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.632248 4909 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.632264 4909 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ca5084bc-8bd1-4964-9a52-384222fc8374-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.632279 4909 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.632291 4909 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ca5084bc-8bd1-4964-9a52-384222fc8374-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.632305 4909 reconciler_common.go:293] "Volume detached for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.632319 4909 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.632329 4909 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.632340 4909 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.632352 4909 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.632364 4909 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ca5084bc-8bd1-4964-9a52-384222fc8374-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.632375 4909 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.632387 4909 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-node-log\") on node 
\"crc\" DevicePath \"\"" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.632398 4909 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.632409 4909 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-slash\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.632420 4909 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-log-socket\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.632432 4909 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.632443 4909 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.635970 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca5084bc-8bd1-4964-9a52-384222fc8374-kube-api-access-4tkj6" (OuterVolumeSpecName: "kube-api-access-4tkj6") pod "ca5084bc-8bd1-4964-9a52-384222fc8374" (UID: "ca5084bc-8bd1-4964-9a52-384222fc8374"). InnerVolumeSpecName "kube-api-access-4tkj6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.636364 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca5084bc-8bd1-4964-9a52-384222fc8374-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "ca5084bc-8bd1-4964-9a52-384222fc8374" (UID: "ca5084bc-8bd1-4964-9a52-384222fc8374"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.643538 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "ca5084bc-8bd1-4964-9a52-384222fc8374" (UID: "ca5084bc-8bd1-4964-9a52-384222fc8374"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.732851 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-run-systemd\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.732912 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-log-socket\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.732937 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-env-overrides\") pod \"ovnkube-node-8rmxm\" (UID: 
\"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.732965 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-etc-openvswitch\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.732979 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-host-cni-bin\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.732995 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-systemd-units\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.733010 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-host-kubelet\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.733030 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-host-run-ovn-kubernetes\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.733049 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-var-lib-openvswitch\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.733065 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-host-slash\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.733081 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68srp\" (UniqueName: \"kubernetes.io/projected/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-kube-api-access-68srp\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.733108 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-ovnkube-config\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.733122 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-ovnkube-script-lib\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 
02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.733135 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-run-openvswitch\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.733150 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-node-log\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.733164 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-host-cni-netd\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.733187 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-ovn-node-metrics-cert\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.733210 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc 
kubenswrapper[4909]: I0202 10:44:21.733232 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-run-ovn\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.733245 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-host-run-netns\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.733277 4909 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ca5084bc-8bd1-4964-9a52-384222fc8374-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.733290 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tkj6\" (UniqueName: \"kubernetes.io/projected/ca5084bc-8bd1-4964-9a52-384222fc8374-kube-api-access-4tkj6\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.733302 4909 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ca5084bc-8bd1-4964-9a52-384222fc8374-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.733303 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-var-lib-openvswitch\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 
10:44:21.733340 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-host-run-netns\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.733373 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-host-slash\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.733375 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-run-systemd\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.733403 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-log-socket\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.733795 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-host-cni-netd\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.733921 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-run-openvswitch\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.734246 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-env-overrides\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.734294 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-host-run-ovn-kubernetes\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.734306 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-host-kubelet\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.734324 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.734359 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-run-ovn\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.734404 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-etc-openvswitch\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.734436 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-host-cni-bin\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.734482 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-systemd-units\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.734557 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-ovnkube-config\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.734258 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-node-log\") pod \"ovnkube-node-8rmxm\" (UID: 
\"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.734717 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-ovnkube-script-lib\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.737323 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-ovn-node-metrics-cert\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.760868 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68srp\" (UniqueName: \"kubernetes.io/projected/6cda0d68-7af3-4be1-bd2e-569c5b0f21b5-kube-api-access-68srp\") pod \"ovnkube-node-8rmxm\" (UID: \"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.802032 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.810868 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qnbvb_bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af/kube-multus/2.log" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.815307 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qnbvb_bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af/kube-multus/1.log" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.815438 4909 generic.go:334] "Generic (PLEG): container finished" podID="bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af" containerID="8d522d4b79e66f21f5dd0d4fc865f589c28370d17f77d11076d1291b57cee4aa" exitCode=2 Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.815586 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qnbvb" event={"ID":"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af","Type":"ContainerDied","Data":"8d522d4b79e66f21f5dd0d4fc865f589c28370d17f77d11076d1291b57cee4aa"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.815690 4909 scope.go:117] "RemoveContainer" containerID="8fcd8c7992c7e8e9185f32fb8196e0709891ff501292ae2ae4dd5b85d017fa76" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.816617 4909 scope.go:117] "RemoveContainer" containerID="8d522d4b79e66f21f5dd0d4fc865f589c28370d17f77d11076d1291b57cee4aa" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.818500 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-775zr_ca5084bc-8bd1-4964-9a52-384222fc8374/ovnkube-controller/3.log" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.823016 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-775zr_ca5084bc-8bd1-4964-9a52-384222fc8374/ovn-acl-logging/0.log" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.823605 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-775zr_ca5084bc-8bd1-4964-9a52-384222fc8374/ovn-controller/0.log" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.824414 4909 generic.go:334] "Generic (PLEG): container finished" podID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerID="28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57" exitCode=0 Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.824444 4909 generic.go:334] "Generic (PLEG): container finished" podID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerID="311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700" exitCode=0 Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.824466 4909 generic.go:334] "Generic (PLEG): container finished" podID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerID="034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f" exitCode=0 Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.824477 4909 generic.go:334] "Generic (PLEG): container finished" podID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerID="56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85" exitCode=0 Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.824487 4909 generic.go:334] "Generic (PLEG): container finished" podID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerID="2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177" exitCode=0 Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.824495 4909 generic.go:334] "Generic (PLEG): container finished" podID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerID="2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983" exitCode=0 Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.824524 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" event={"ID":"ca5084bc-8bd1-4964-9a52-384222fc8374","Type":"ContainerDied","Data":"28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57"} Feb 02 10:44:21 crc 
kubenswrapper[4909]: I0202 10:44:21.824542 4909 generic.go:334] "Generic (PLEG): container finished" podID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerID="55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2" exitCode=143 Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.824557 4909 generic.go:334] "Generic (PLEG): container finished" podID="ca5084bc-8bd1-4964-9a52-384222fc8374" containerID="1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b" exitCode=143 Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.824563 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" event={"ID":"ca5084bc-8bd1-4964-9a52-384222fc8374","Type":"ContainerDied","Data":"311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.824588 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" event={"ID":"ca5084bc-8bd1-4964-9a52-384222fc8374","Type":"ContainerDied","Data":"034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.824604 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" event={"ID":"ca5084bc-8bd1-4964-9a52-384222fc8374","Type":"ContainerDied","Data":"56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.824617 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" event={"ID":"ca5084bc-8bd1-4964-9a52-384222fc8374","Type":"ContainerDied","Data":"2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.824636 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" 
event={"ID":"ca5084bc-8bd1-4964-9a52-384222fc8374","Type":"ContainerDied","Data":"2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.824652 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.824665 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.824675 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.824693 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.824701 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.824709 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.824716 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.824724 4909 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.824731 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.824739 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.824773 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" event={"ID":"ca5084bc-8bd1-4964-9a52-384222fc8374","Type":"ContainerDied","Data":"55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.824795 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.824821 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.824980 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.824830 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.825972 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.826064 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.826128 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.826190 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.826255 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.826394 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.826450 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.826518 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" event={"ID":"ca5084bc-8bd1-4964-9a52-384222fc8374","Type":"ContainerDied","Data":"1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.826585 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.826643 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.826690 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.826741 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.826787 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.826850 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.826926 4909 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.827005 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.827098 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.827179 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.827245 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-775zr" event={"ID":"ca5084bc-8bd1-4964-9a52-384222fc8374","Type":"ContainerDied","Data":"35583ab6ade1e2b1d5fd17a8639ef2ac48ee7c4d5bc253dd5ba9f97f41e8182d"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.827302 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.827352 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.827398 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700"} Feb 02 
10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.828105 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.828300 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.828368 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.828454 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.828507 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.828555 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.828606 4909 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae"} Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.866018 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-775zr"] Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.868964 4909 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-775zr"] Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.872539 4909 scope.go:117] "RemoveContainer" containerID="28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.901221 4909 scope.go:117] "RemoveContainer" containerID="f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.919847 4909 scope.go:117] "RemoveContainer" containerID="311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.934767 4909 scope.go:117] "RemoveContainer" containerID="034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f" Feb 02 10:44:21 crc kubenswrapper[4909]: I0202 10:44:21.952350 4909 scope.go:117] "RemoveContainer" containerID="56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.024248 4909 scope.go:117] "RemoveContainer" containerID="2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.036960 4909 scope.go:117] "RemoveContainer" containerID="2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.054020 4909 scope.go:117] "RemoveContainer" containerID="55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.084833 4909 scope.go:117] "RemoveContainer" containerID="1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.107988 4909 scope.go:117] "RemoveContainer" containerID="08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.122168 4909 scope.go:117] "RemoveContainer" 
containerID="28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57" Feb 02 10:44:22 crc kubenswrapper[4909]: E0202 10:44:22.122794 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57\": container with ID starting with 28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57 not found: ID does not exist" containerID="28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.122901 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57"} err="failed to get container status \"28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57\": rpc error: code = NotFound desc = could not find container \"28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57\": container with ID starting with 28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57 not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.122944 4909 scope.go:117] "RemoveContainer" containerID="f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f" Feb 02 10:44:22 crc kubenswrapper[4909]: E0202 10:44:22.123479 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f\": container with ID starting with f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f not found: ID does not exist" containerID="f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.123593 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f"} err="failed to get container status \"f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f\": rpc error: code = NotFound desc = could not find container \"f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f\": container with ID starting with f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.123668 4909 scope.go:117] "RemoveContainer" containerID="311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700" Feb 02 10:44:22 crc kubenswrapper[4909]: E0202 10:44:22.124120 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\": container with ID starting with 311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700 not found: ID does not exist" containerID="311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.124153 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700"} err="failed to get container status \"311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\": rpc error: code = NotFound desc = could not find container \"311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\": container with ID starting with 311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700 not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.124173 4909 scope.go:117] "RemoveContainer" containerID="034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f" Feb 02 10:44:22 crc kubenswrapper[4909]: E0202 10:44:22.124558 4909 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\": container with ID starting with 034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f not found: ID does not exist" containerID="034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.124616 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f"} err="failed to get container status \"034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\": rpc error: code = NotFound desc = could not find container \"034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\": container with ID starting with 034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.124653 4909 scope.go:117] "RemoveContainer" containerID="56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85" Feb 02 10:44:22 crc kubenswrapper[4909]: E0202 10:44:22.124995 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\": container with ID starting with 56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85 not found: ID does not exist" containerID="56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.125048 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85"} err="failed to get container status \"56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\": rpc error: code = NotFound desc = could not find container 
\"56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\": container with ID starting with 56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85 not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.125071 4909 scope.go:117] "RemoveContainer" containerID="2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177" Feb 02 10:44:22 crc kubenswrapper[4909]: E0202 10:44:22.125343 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\": container with ID starting with 2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177 not found: ID does not exist" containerID="2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.125369 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177"} err="failed to get container status \"2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\": rpc error: code = NotFound desc = could not find container \"2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\": container with ID starting with 2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177 not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.125385 4909 scope.go:117] "RemoveContainer" containerID="2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983" Feb 02 10:44:22 crc kubenswrapper[4909]: E0202 10:44:22.125632 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\": container with ID starting with 2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983 not found: ID does not exist" 
containerID="2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.125673 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983"} err="failed to get container status \"2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\": rpc error: code = NotFound desc = could not find container \"2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\": container with ID starting with 2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983 not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.125700 4909 scope.go:117] "RemoveContainer" containerID="55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2" Feb 02 10:44:22 crc kubenswrapper[4909]: E0202 10:44:22.125976 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\": container with ID starting with 55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2 not found: ID does not exist" containerID="55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.126006 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2"} err="failed to get container status \"55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\": rpc error: code = NotFound desc = could not find container \"55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\": container with ID starting with 55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2 not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.126024 4909 scope.go:117] 
"RemoveContainer" containerID="1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b" Feb 02 10:44:22 crc kubenswrapper[4909]: E0202 10:44:22.126289 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\": container with ID starting with 1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b not found: ID does not exist" containerID="1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.126328 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b"} err="failed to get container status \"1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\": rpc error: code = NotFound desc = could not find container \"1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\": container with ID starting with 1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.126350 4909 scope.go:117] "RemoveContainer" containerID="08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae" Feb 02 10:44:22 crc kubenswrapper[4909]: E0202 10:44:22.126593 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\": container with ID starting with 08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae not found: ID does not exist" containerID="08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.126637 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae"} err="failed to get container status \"08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\": rpc error: code = NotFound desc = could not find container \"08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\": container with ID starting with 08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.126658 4909 scope.go:117] "RemoveContainer" containerID="28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.126897 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57"} err="failed to get container status \"28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57\": rpc error: code = NotFound desc = could not find container \"28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57\": container with ID starting with 28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57 not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.126928 4909 scope.go:117] "RemoveContainer" containerID="f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.127305 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f"} err="failed to get container status \"f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f\": rpc error: code = NotFound desc = could not find container \"f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f\": container with ID starting with f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f not found: ID does not 
exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.127358 4909 scope.go:117] "RemoveContainer" containerID="311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.127651 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700"} err="failed to get container status \"311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\": rpc error: code = NotFound desc = could not find container \"311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\": container with ID starting with 311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700 not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.127691 4909 scope.go:117] "RemoveContainer" containerID="034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.128022 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f"} err="failed to get container status \"034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\": rpc error: code = NotFound desc = could not find container \"034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\": container with ID starting with 034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.128079 4909 scope.go:117] "RemoveContainer" containerID="56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.128405 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85"} err="failed to get container status 
\"56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\": rpc error: code = NotFound desc = could not find container \"56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\": container with ID starting with 56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85 not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.128435 4909 scope.go:117] "RemoveContainer" containerID="2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.128688 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177"} err="failed to get container status \"2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\": rpc error: code = NotFound desc = could not find container \"2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\": container with ID starting with 2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177 not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.128744 4909 scope.go:117] "RemoveContainer" containerID="2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.129008 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983"} err="failed to get container status \"2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\": rpc error: code = NotFound desc = could not find container \"2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\": container with ID starting with 2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983 not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.129031 4909 scope.go:117] "RemoveContainer" 
containerID="55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.129343 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2"} err="failed to get container status \"55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\": rpc error: code = NotFound desc = could not find container \"55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\": container with ID starting with 55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2 not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.129398 4909 scope.go:117] "RemoveContainer" containerID="1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.129617 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b"} err="failed to get container status \"1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\": rpc error: code = NotFound desc = could not find container \"1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\": container with ID starting with 1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.129641 4909 scope.go:117] "RemoveContainer" containerID="08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.129860 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae"} err="failed to get container status \"08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\": rpc error: code = NotFound desc = could 
not find container \"08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\": container with ID starting with 08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.129890 4909 scope.go:117] "RemoveContainer" containerID="28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.130129 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57"} err="failed to get container status \"28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57\": rpc error: code = NotFound desc = could not find container \"28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57\": container with ID starting with 28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57 not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.130156 4909 scope.go:117] "RemoveContainer" containerID="f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.130378 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f"} err="failed to get container status \"f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f\": rpc error: code = NotFound desc = could not find container \"f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f\": container with ID starting with f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.130409 4909 scope.go:117] "RemoveContainer" containerID="311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 
10:44:22.130611 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700"} err="failed to get container status \"311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\": rpc error: code = NotFound desc = could not find container \"311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\": container with ID starting with 311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700 not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.130631 4909 scope.go:117] "RemoveContainer" containerID="034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.130823 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f"} err="failed to get container status \"034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\": rpc error: code = NotFound desc = could not find container \"034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\": container with ID starting with 034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.130850 4909 scope.go:117] "RemoveContainer" containerID="56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.131087 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85"} err="failed to get container status \"56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\": rpc error: code = NotFound desc = could not find container \"56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\": container with ID starting with 
56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85 not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.131112 4909 scope.go:117] "RemoveContainer" containerID="2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.131386 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177"} err="failed to get container status \"2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\": rpc error: code = NotFound desc = could not find container \"2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\": container with ID starting with 2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177 not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.131441 4909 scope.go:117] "RemoveContainer" containerID="2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.131721 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983"} err="failed to get container status \"2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\": rpc error: code = NotFound desc = could not find container \"2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\": container with ID starting with 2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983 not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.131747 4909 scope.go:117] "RemoveContainer" containerID="55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.132093 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2"} err="failed to get container status \"55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\": rpc error: code = NotFound desc = could not find container \"55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\": container with ID starting with 55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2 not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.132158 4909 scope.go:117] "RemoveContainer" containerID="1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.132481 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b"} err="failed to get container status \"1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\": rpc error: code = NotFound desc = could not find container \"1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\": container with ID starting with 1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.132514 4909 scope.go:117] "RemoveContainer" containerID="08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.132798 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae"} err="failed to get container status \"08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\": rpc error: code = NotFound desc = could not find container \"08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\": container with ID starting with 08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae not found: ID does not 
exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.132897 4909 scope.go:117] "RemoveContainer" containerID="28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.133213 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57"} err="failed to get container status \"28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57\": rpc error: code = NotFound desc = could not find container \"28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57\": container with ID starting with 28b948b744e9529c1d816bc1333b22ae6c1a0e285c6a6f2496dc36a257d50c57 not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.133294 4909 scope.go:117] "RemoveContainer" containerID="f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.133592 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f"} err="failed to get container status \"f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f\": rpc error: code = NotFound desc = could not find container \"f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f\": container with ID starting with f2149ff3f6d16651e2f010a9807148d40050dbaaa59a9d631d95067eda75ad7f not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.133628 4909 scope.go:117] "RemoveContainer" containerID="311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.133939 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700"} err="failed to get container status 
\"311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\": rpc error: code = NotFound desc = could not find container \"311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700\": container with ID starting with 311be5f47b1e72f26a9a362b09a0c463cf8af5de3cbd5812436274f202650700 not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.133972 4909 scope.go:117] "RemoveContainer" containerID="034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.134280 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f"} err="failed to get container status \"034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\": rpc error: code = NotFound desc = could not find container \"034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f\": container with ID starting with 034adf4676d28d076b3e7e8cbec89d6bdeefc1b21e62d8aba01682f428a5151f not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.134322 4909 scope.go:117] "RemoveContainer" containerID="56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.134639 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85"} err="failed to get container status \"56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\": rpc error: code = NotFound desc = could not find container \"56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85\": container with ID starting with 56aacdbd2e6d723776eb7cb54adab073f2a9a848005c5c2d3b975398d1aa8b85 not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.134679 4909 scope.go:117] "RemoveContainer" 
containerID="2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.134967 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177"} err="failed to get container status \"2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\": rpc error: code = NotFound desc = could not find container \"2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177\": container with ID starting with 2b963f1b6c9a13482ce3993e699c86a622fd332a2000fe3b6d6190f904dd5177 not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.135001 4909 scope.go:117] "RemoveContainer" containerID="2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.135252 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983"} err="failed to get container status \"2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\": rpc error: code = NotFound desc = could not find container \"2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983\": container with ID starting with 2f64edf0e6f606149d4a70a6a9ef5ca1201676b31d6504496069174ab7342983 not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.135331 4909 scope.go:117] "RemoveContainer" containerID="55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.135585 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2"} err="failed to get container status \"55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\": rpc error: code = NotFound desc = could 
not find container \"55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2\": container with ID starting with 55727388235de4a9f405556beb3ccfd3a9f40f0ae0e5b4499cb49c36263ef2c2 not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.135615 4909 scope.go:117] "RemoveContainer" containerID="1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.135967 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b"} err="failed to get container status \"1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\": rpc error: code = NotFound desc = could not find container \"1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b\": container with ID starting with 1a73a93bb2e0a230133878b77e62dcfffe73ff8c3b8c14f010ca3c09e879504b not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.136017 4909 scope.go:117] "RemoveContainer" containerID="08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.136331 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae"} err="failed to get container status \"08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\": rpc error: code = NotFound desc = could not find container \"08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae\": container with ID starting with 08f1f0ecf873b4da9eb2e969e3e1d9383a88f40689725df64c74345d2d4d1dae not found: ID does not exist" Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.832711 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qnbvb_bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af/kube-multus/2.log" Feb 02 10:44:22 crc 
kubenswrapper[4909]: I0202 10:44:22.832858 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qnbvb" event={"ID":"bcf9f2a3-7bc1-4fb0-a51f-1855bf92b3af","Type":"ContainerStarted","Data":"b3cc6803fb41c55fc1cedf0e516aa9748b36b3e20a3c29163ba6e85afd6ab648"} Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.836220 4909 generic.go:334] "Generic (PLEG): container finished" podID="6cda0d68-7af3-4be1-bd2e-569c5b0f21b5" containerID="d944f4476e3f34acb4e61fbeecd46f400bb24a351ec77299025e1bbfc7c501c0" exitCode=0 Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.836267 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" event={"ID":"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5","Type":"ContainerDied","Data":"d944f4476e3f34acb4e61fbeecd46f400bb24a351ec77299025e1bbfc7c501c0"} Feb 02 10:44:22 crc kubenswrapper[4909]: I0202 10:44:22.836295 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" event={"ID":"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5","Type":"ContainerStarted","Data":"bd5f3f8a18c820ed42280878bd4454fb5ddd3fb27f5946ca79a686bb69d3d0e2"} Feb 02 10:44:23 crc kubenswrapper[4909]: I0202 10:44:23.023831 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca5084bc-8bd1-4964-9a52-384222fc8374" path="/var/lib/kubelet/pods/ca5084bc-8bd1-4964-9a52-384222fc8374/volumes" Feb 02 10:44:23 crc kubenswrapper[4909]: I0202 10:44:23.843577 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" event={"ID":"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5","Type":"ContainerStarted","Data":"9849dd6929df64567531ed3e9037d7ac6fc131224a7ac1dc6e019d1b18301b94"} Feb 02 10:44:23 crc kubenswrapper[4909]: I0202 10:44:23.844292 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" 
event={"ID":"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5","Type":"ContainerStarted","Data":"726635e63adddac18e6498f0d90d53b6ddce3ebc8276fe5b325c3935f96d74b7"} Feb 02 10:44:23 crc kubenswrapper[4909]: I0202 10:44:23.844308 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" event={"ID":"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5","Type":"ContainerStarted","Data":"eeaab8a91ccfcc0c3a47044c5ebb7bd973f1b225d63f70780c176b2aac9ef3a6"} Feb 02 10:44:23 crc kubenswrapper[4909]: I0202 10:44:23.844319 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" event={"ID":"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5","Type":"ContainerStarted","Data":"65219f6f2d4a6c82152bb643fb90809b4313a75e2140ac333cef0cde7b0b1bb9"} Feb 02 10:44:23 crc kubenswrapper[4909]: I0202 10:44:23.844329 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" event={"ID":"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5","Type":"ContainerStarted","Data":"69a967ae1499d5f98d4206429ff68c16113c5951f8cbf41818c431757fd3a210"} Feb 02 10:44:23 crc kubenswrapper[4909]: I0202 10:44:23.844339 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" event={"ID":"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5","Type":"ContainerStarted","Data":"d784f370da97a94f240b9358e6498c1576ad05d12e3440e613677e6a3cc94b51"} Feb 02 10:44:24 crc kubenswrapper[4909]: I0202 10:44:24.851998 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-5zcch"] Feb 02 10:44:24 crc kubenswrapper[4909]: I0202 10:44:24.852633 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-5zcch" Feb 02 10:44:24 crc kubenswrapper[4909]: I0202 10:44:24.854742 4909 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-d4gwn" Feb 02 10:44:24 crc kubenswrapper[4909]: I0202 10:44:24.854976 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 02 10:44:24 crc kubenswrapper[4909]: I0202 10:44:24.855940 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 02 10:44:24 crc kubenswrapper[4909]: I0202 10:44:24.856716 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 02 10:44:24 crc kubenswrapper[4909]: I0202 10:44:24.983269 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3f2752f4-33ed-4162-a9b6-481c1fc80957-node-mnt\") pod \"crc-storage-crc-5zcch\" (UID: \"3f2752f4-33ed-4162-a9b6-481c1fc80957\") " pod="crc-storage/crc-storage-crc-5zcch" Feb 02 10:44:24 crc kubenswrapper[4909]: I0202 10:44:24.983363 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3f2752f4-33ed-4162-a9b6-481c1fc80957-crc-storage\") pod \"crc-storage-crc-5zcch\" (UID: \"3f2752f4-33ed-4162-a9b6-481c1fc80957\") " pod="crc-storage/crc-storage-crc-5zcch" Feb 02 10:44:24 crc kubenswrapper[4909]: I0202 10:44:24.983410 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzr8z\" (UniqueName: \"kubernetes.io/projected/3f2752f4-33ed-4162-a9b6-481c1fc80957-kube-api-access-qzr8z\") pod \"crc-storage-crc-5zcch\" (UID: \"3f2752f4-33ed-4162-a9b6-481c1fc80957\") " pod="crc-storage/crc-storage-crc-5zcch" Feb 02 10:44:25 crc kubenswrapper[4909]: I0202 10:44:25.084322 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3f2752f4-33ed-4162-a9b6-481c1fc80957-node-mnt\") pod \"crc-storage-crc-5zcch\" (UID: \"3f2752f4-33ed-4162-a9b6-481c1fc80957\") " pod="crc-storage/crc-storage-crc-5zcch" Feb 02 10:44:25 crc kubenswrapper[4909]: I0202 10:44:25.084373 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3f2752f4-33ed-4162-a9b6-481c1fc80957-crc-storage\") pod \"crc-storage-crc-5zcch\" (UID: \"3f2752f4-33ed-4162-a9b6-481c1fc80957\") " pod="crc-storage/crc-storage-crc-5zcch" Feb 02 10:44:25 crc kubenswrapper[4909]: I0202 10:44:25.084395 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzr8z\" (UniqueName: \"kubernetes.io/projected/3f2752f4-33ed-4162-a9b6-481c1fc80957-kube-api-access-qzr8z\") pod \"crc-storage-crc-5zcch\" (UID: \"3f2752f4-33ed-4162-a9b6-481c1fc80957\") " pod="crc-storage/crc-storage-crc-5zcch" Feb 02 10:44:25 crc kubenswrapper[4909]: I0202 10:44:25.084658 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3f2752f4-33ed-4162-a9b6-481c1fc80957-node-mnt\") pod \"crc-storage-crc-5zcch\" (UID: \"3f2752f4-33ed-4162-a9b6-481c1fc80957\") " pod="crc-storage/crc-storage-crc-5zcch" Feb 02 10:44:25 crc kubenswrapper[4909]: I0202 10:44:25.085164 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3f2752f4-33ed-4162-a9b6-481c1fc80957-crc-storage\") pod \"crc-storage-crc-5zcch\" (UID: \"3f2752f4-33ed-4162-a9b6-481c1fc80957\") " pod="crc-storage/crc-storage-crc-5zcch" Feb 02 10:44:25 crc kubenswrapper[4909]: I0202 10:44:25.103468 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzr8z\" (UniqueName: 
\"kubernetes.io/projected/3f2752f4-33ed-4162-a9b6-481c1fc80957-kube-api-access-qzr8z\") pod \"crc-storage-crc-5zcch\" (UID: \"3f2752f4-33ed-4162-a9b6-481c1fc80957\") " pod="crc-storage/crc-storage-crc-5zcch" Feb 02 10:44:25 crc kubenswrapper[4909]: I0202 10:44:25.166387 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-5zcch" Feb 02 10:44:25 crc kubenswrapper[4909]: E0202 10:44:25.186936 4909 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5zcch_crc-storage_3f2752f4-33ed-4162-a9b6-481c1fc80957_0(0934d7db3b948f17df450d0fe0a4f70f99382c534569fb24fbb643f6aa17e2aa): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 02 10:44:25 crc kubenswrapper[4909]: E0202 10:44:25.186996 4909 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5zcch_crc-storage_3f2752f4-33ed-4162-a9b6-481c1fc80957_0(0934d7db3b948f17df450d0fe0a4f70f99382c534569fb24fbb643f6aa17e2aa): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-5zcch" Feb 02 10:44:25 crc kubenswrapper[4909]: E0202 10:44:25.187015 4909 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5zcch_crc-storage_3f2752f4-33ed-4162-a9b6-481c1fc80957_0(0934d7db3b948f17df450d0fe0a4f70f99382c534569fb24fbb643f6aa17e2aa): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-5zcch" Feb 02 10:44:25 crc kubenswrapper[4909]: E0202 10:44:25.187064 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-5zcch_crc-storage(3f2752f4-33ed-4162-a9b6-481c1fc80957)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-5zcch_crc-storage(3f2752f4-33ed-4162-a9b6-481c1fc80957)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5zcch_crc-storage_3f2752f4-33ed-4162-a9b6-481c1fc80957_0(0934d7db3b948f17df450d0fe0a4f70f99382c534569fb24fbb643f6aa17e2aa): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-5zcch" podUID="3f2752f4-33ed-4162-a9b6-481c1fc80957" Feb 02 10:44:25 crc kubenswrapper[4909]: I0202 10:44:25.355644 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8vqmg" Feb 02 10:44:25 crc kubenswrapper[4909]: I0202 10:44:25.355679 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8vqmg" Feb 02 10:44:25 crc kubenswrapper[4909]: I0202 10:44:25.405488 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8vqmg" Feb 02 10:44:25 crc kubenswrapper[4909]: I0202 10:44:25.918115 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8vqmg" Feb 02 10:44:25 crc kubenswrapper[4909]: I0202 10:44:25.964343 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8vqmg"] Feb 02 10:44:26 crc kubenswrapper[4909]: I0202 10:44:26.864105 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" 
event={"ID":"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5","Type":"ContainerStarted","Data":"f7116fda656980bb8f6778d66dc523fcada1eab66f229dd1e589b42a1cebab56"} Feb 02 10:44:27 crc kubenswrapper[4909]: I0202 10:44:27.869098 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8vqmg" podUID="04b8c107-1b45-4bc4-8b39-d958dc28818a" containerName="registry-server" containerID="cri-o://fcd2c3dce8feb0de84cd701cd5d746cc6268c9416a4f15fee99bdc106f96f479" gracePeriod=2 Feb 02 10:44:28 crc kubenswrapper[4909]: I0202 10:44:28.803543 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-5zcch"] Feb 02 10:44:28 crc kubenswrapper[4909]: I0202 10:44:28.803985 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-5zcch" Feb 02 10:44:28 crc kubenswrapper[4909]: I0202 10:44:28.804416 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-5zcch" Feb 02 10:44:28 crc kubenswrapper[4909]: E0202 10:44:28.839237 4909 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5zcch_crc-storage_3f2752f4-33ed-4162-a9b6-481c1fc80957_0(06fa0fd5499c442e297c1f53d0a0b18b980d8555ab9a8524bfdf3226979ed4d7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 02 10:44:28 crc kubenswrapper[4909]: E0202 10:44:28.839305 4909 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5zcch_crc-storage_3f2752f4-33ed-4162-a9b6-481c1fc80957_0(06fa0fd5499c442e297c1f53d0a0b18b980d8555ab9a8524bfdf3226979ed4d7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-5zcch" Feb 02 10:44:28 crc kubenswrapper[4909]: E0202 10:44:28.839327 4909 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5zcch_crc-storage_3f2752f4-33ed-4162-a9b6-481c1fc80957_0(06fa0fd5499c442e297c1f53d0a0b18b980d8555ab9a8524bfdf3226979ed4d7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-5zcch" Feb 02 10:44:28 crc kubenswrapper[4909]: E0202 10:44:28.839386 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-5zcch_crc-storage(3f2752f4-33ed-4162-a9b6-481c1fc80957)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-5zcch_crc-storage(3f2752f4-33ed-4162-a9b6-481c1fc80957)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5zcch_crc-storage_3f2752f4-33ed-4162-a9b6-481c1fc80957_0(06fa0fd5499c442e297c1f53d0a0b18b980d8555ab9a8524bfdf3226979ed4d7): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="crc-storage/crc-storage-crc-5zcch" podUID="3f2752f4-33ed-4162-a9b6-481c1fc80957" Feb 02 10:44:28 crc kubenswrapper[4909]: I0202 10:44:28.878374 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" event={"ID":"6cda0d68-7af3-4be1-bd2e-569c5b0f21b5","Type":"ContainerStarted","Data":"c21dbd306a7cbce8a5f66a8d832ca3f565ecfc794742825408623c2536d1b6ce"} Feb 02 10:44:28 crc kubenswrapper[4909]: I0202 10:44:28.878704 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:28 crc kubenswrapper[4909]: I0202 10:44:28.878736 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:28 crc kubenswrapper[4909]: I0202 10:44:28.878754 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:28 crc kubenswrapper[4909]: I0202 10:44:28.908346 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" podStartSLOduration=7.908325123 podStartE2EDuration="7.908325123s" podCreationTimestamp="2026-02-02 10:44:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:44:28.90749865 +0000 UTC m=+794.653599385" watchObservedRunningTime="2026-02-02 10:44:28.908325123 +0000 UTC m=+794.654425858" Feb 02 10:44:28 crc kubenswrapper[4909]: I0202 10:44:28.916066 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:28 crc kubenswrapper[4909]: I0202 10:44:28.925456 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:29 crc kubenswrapper[4909]: I0202 10:44:29.885655 
4909 generic.go:334] "Generic (PLEG): container finished" podID="04b8c107-1b45-4bc4-8b39-d958dc28818a" containerID="fcd2c3dce8feb0de84cd701cd5d746cc6268c9416a4f15fee99bdc106f96f479" exitCode=0 Feb 02 10:44:29 crc kubenswrapper[4909]: I0202 10:44:29.885783 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vqmg" event={"ID":"04b8c107-1b45-4bc4-8b39-d958dc28818a","Type":"ContainerDied","Data":"fcd2c3dce8feb0de84cd701cd5d746cc6268c9416a4f15fee99bdc106f96f479"} Feb 02 10:44:30 crc kubenswrapper[4909]: I0202 10:44:30.740629 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8vqmg" Feb 02 10:44:30 crc kubenswrapper[4909]: I0202 10:44:30.857040 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7swc\" (UniqueName: \"kubernetes.io/projected/04b8c107-1b45-4bc4-8b39-d958dc28818a-kube-api-access-t7swc\") pod \"04b8c107-1b45-4bc4-8b39-d958dc28818a\" (UID: \"04b8c107-1b45-4bc4-8b39-d958dc28818a\") " Feb 02 10:44:30 crc kubenswrapper[4909]: I0202 10:44:30.857147 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04b8c107-1b45-4bc4-8b39-d958dc28818a-catalog-content\") pod \"04b8c107-1b45-4bc4-8b39-d958dc28818a\" (UID: \"04b8c107-1b45-4bc4-8b39-d958dc28818a\") " Feb 02 10:44:30 crc kubenswrapper[4909]: I0202 10:44:30.857193 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04b8c107-1b45-4bc4-8b39-d958dc28818a-utilities\") pod \"04b8c107-1b45-4bc4-8b39-d958dc28818a\" (UID: \"04b8c107-1b45-4bc4-8b39-d958dc28818a\") " Feb 02 10:44:30 crc kubenswrapper[4909]: I0202 10:44:30.858317 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04b8c107-1b45-4bc4-8b39-d958dc28818a-utilities" 
(OuterVolumeSpecName: "utilities") pod "04b8c107-1b45-4bc4-8b39-d958dc28818a" (UID: "04b8c107-1b45-4bc4-8b39-d958dc28818a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:44:30 crc kubenswrapper[4909]: I0202 10:44:30.862610 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04b8c107-1b45-4bc4-8b39-d958dc28818a-kube-api-access-t7swc" (OuterVolumeSpecName: "kube-api-access-t7swc") pod "04b8c107-1b45-4bc4-8b39-d958dc28818a" (UID: "04b8c107-1b45-4bc4-8b39-d958dc28818a"). InnerVolumeSpecName "kube-api-access-t7swc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:44:30 crc kubenswrapper[4909]: I0202 10:44:30.893162 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8vqmg" event={"ID":"04b8c107-1b45-4bc4-8b39-d958dc28818a","Type":"ContainerDied","Data":"541b2cba6c79139e29bc5d8c22f3502a0ed6deffb947eba301e5aa77990106ef"} Feb 02 10:44:30 crc kubenswrapper[4909]: I0202 10:44:30.893225 4909 scope.go:117] "RemoveContainer" containerID="fcd2c3dce8feb0de84cd701cd5d746cc6268c9416a4f15fee99bdc106f96f479" Feb 02 10:44:30 crc kubenswrapper[4909]: I0202 10:44:30.893180 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8vqmg" Feb 02 10:44:30 crc kubenswrapper[4909]: I0202 10:44:30.908484 4909 scope.go:117] "RemoveContainer" containerID="b70d6a955eb7b640acc5398cf7e6c36c074fc66b9183853bcc5ba586682891d7" Feb 02 10:44:30 crc kubenswrapper[4909]: I0202 10:44:30.912340 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04b8c107-1b45-4bc4-8b39-d958dc28818a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04b8c107-1b45-4bc4-8b39-d958dc28818a" (UID: "04b8c107-1b45-4bc4-8b39-d958dc28818a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:44:30 crc kubenswrapper[4909]: I0202 10:44:30.922288 4909 scope.go:117] "RemoveContainer" containerID="e5c291b1683154c2879bb0ac41398876ba8d668e2006637910dc15eaa9b31a3c" Feb 02 10:44:30 crc kubenswrapper[4909]: I0202 10:44:30.958916 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04b8c107-1b45-4bc4-8b39-d958dc28818a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:30 crc kubenswrapper[4909]: I0202 10:44:30.958960 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04b8c107-1b45-4bc4-8b39-d958dc28818a-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:30 crc kubenswrapper[4909]: I0202 10:44:30.958994 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7swc\" (UniqueName: \"kubernetes.io/projected/04b8c107-1b45-4bc4-8b39-d958dc28818a-kube-api-access-t7swc\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:31 crc kubenswrapper[4909]: I0202 10:44:31.210478 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8vqmg"] Feb 02 10:44:31 crc kubenswrapper[4909]: I0202 10:44:31.214212 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8vqmg"] Feb 02 10:44:33 crc kubenswrapper[4909]: I0202 10:44:33.022391 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04b8c107-1b45-4bc4-8b39-d958dc28818a" path="/var/lib/kubelet/pods/04b8c107-1b45-4bc4-8b39-d958dc28818a/volumes" Feb 02 10:44:37 crc kubenswrapper[4909]: I0202 10:44:37.921392 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5dmd7"] Feb 02 10:44:37 crc kubenswrapper[4909]: E0202 10:44:37.921977 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b8c107-1b45-4bc4-8b39-d958dc28818a" 
containerName="registry-server" Feb 02 10:44:37 crc kubenswrapper[4909]: I0202 10:44:37.921997 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b8c107-1b45-4bc4-8b39-d958dc28818a" containerName="registry-server" Feb 02 10:44:37 crc kubenswrapper[4909]: E0202 10:44:37.922015 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b8c107-1b45-4bc4-8b39-d958dc28818a" containerName="extract-content" Feb 02 10:44:37 crc kubenswrapper[4909]: I0202 10:44:37.922023 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b8c107-1b45-4bc4-8b39-d958dc28818a" containerName="extract-content" Feb 02 10:44:37 crc kubenswrapper[4909]: E0202 10:44:37.922037 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b8c107-1b45-4bc4-8b39-d958dc28818a" containerName="extract-utilities" Feb 02 10:44:37 crc kubenswrapper[4909]: I0202 10:44:37.922045 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b8c107-1b45-4bc4-8b39-d958dc28818a" containerName="extract-utilities" Feb 02 10:44:37 crc kubenswrapper[4909]: I0202 10:44:37.922165 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="04b8c107-1b45-4bc4-8b39-d958dc28818a" containerName="registry-server" Feb 02 10:44:37 crc kubenswrapper[4909]: I0202 10:44:37.923060 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5dmd7" Feb 02 10:44:37 crc kubenswrapper[4909]: I0202 10:44:37.940969 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5dmd7"] Feb 02 10:44:37 crc kubenswrapper[4909]: I0202 10:44:37.943624 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbq57\" (UniqueName: \"kubernetes.io/projected/ac86f546-05a5-45b6-ae25-1229d7cfdae8-kube-api-access-pbq57\") pod \"redhat-operators-5dmd7\" (UID: \"ac86f546-05a5-45b6-ae25-1229d7cfdae8\") " pod="openshift-marketplace/redhat-operators-5dmd7" Feb 02 10:44:37 crc kubenswrapper[4909]: I0202 10:44:37.943681 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac86f546-05a5-45b6-ae25-1229d7cfdae8-catalog-content\") pod \"redhat-operators-5dmd7\" (UID: \"ac86f546-05a5-45b6-ae25-1229d7cfdae8\") " pod="openshift-marketplace/redhat-operators-5dmd7" Feb 02 10:44:37 crc kubenswrapper[4909]: I0202 10:44:37.943730 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac86f546-05a5-45b6-ae25-1229d7cfdae8-utilities\") pod \"redhat-operators-5dmd7\" (UID: \"ac86f546-05a5-45b6-ae25-1229d7cfdae8\") " pod="openshift-marketplace/redhat-operators-5dmd7" Feb 02 10:44:38 crc kubenswrapper[4909]: I0202 10:44:38.044714 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbq57\" (UniqueName: \"kubernetes.io/projected/ac86f546-05a5-45b6-ae25-1229d7cfdae8-kube-api-access-pbq57\") pod \"redhat-operators-5dmd7\" (UID: \"ac86f546-05a5-45b6-ae25-1229d7cfdae8\") " pod="openshift-marketplace/redhat-operators-5dmd7" Feb 02 10:44:38 crc kubenswrapper[4909]: I0202 10:44:38.045148 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac86f546-05a5-45b6-ae25-1229d7cfdae8-catalog-content\") pod \"redhat-operators-5dmd7\" (UID: \"ac86f546-05a5-45b6-ae25-1229d7cfdae8\") " pod="openshift-marketplace/redhat-operators-5dmd7" Feb 02 10:44:38 crc kubenswrapper[4909]: I0202 10:44:38.045187 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac86f546-05a5-45b6-ae25-1229d7cfdae8-utilities\") pod \"redhat-operators-5dmd7\" (UID: \"ac86f546-05a5-45b6-ae25-1229d7cfdae8\") " pod="openshift-marketplace/redhat-operators-5dmd7" Feb 02 10:44:38 crc kubenswrapper[4909]: I0202 10:44:38.045625 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac86f546-05a5-45b6-ae25-1229d7cfdae8-utilities\") pod \"redhat-operators-5dmd7\" (UID: \"ac86f546-05a5-45b6-ae25-1229d7cfdae8\") " pod="openshift-marketplace/redhat-operators-5dmd7" Feb 02 10:44:38 crc kubenswrapper[4909]: I0202 10:44:38.045903 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac86f546-05a5-45b6-ae25-1229d7cfdae8-catalog-content\") pod \"redhat-operators-5dmd7\" (UID: \"ac86f546-05a5-45b6-ae25-1229d7cfdae8\") " pod="openshift-marketplace/redhat-operators-5dmd7" Feb 02 10:44:38 crc kubenswrapper[4909]: I0202 10:44:38.064237 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbq57\" (UniqueName: \"kubernetes.io/projected/ac86f546-05a5-45b6-ae25-1229d7cfdae8-kube-api-access-pbq57\") pod \"redhat-operators-5dmd7\" (UID: \"ac86f546-05a5-45b6-ae25-1229d7cfdae8\") " pod="openshift-marketplace/redhat-operators-5dmd7" Feb 02 10:44:38 crc kubenswrapper[4909]: I0202 10:44:38.251574 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5dmd7" Feb 02 10:44:38 crc kubenswrapper[4909]: I0202 10:44:38.646474 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5dmd7"] Feb 02 10:44:38 crc kubenswrapper[4909]: W0202 10:44:38.653520 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac86f546_05a5_45b6_ae25_1229d7cfdae8.slice/crio-2693301be313183d88731cc934147c0af572c1ad113b0c8762884e41c6c88e71 WatchSource:0}: Error finding container 2693301be313183d88731cc934147c0af572c1ad113b0c8762884e41c6c88e71: Status 404 returned error can't find the container with id 2693301be313183d88731cc934147c0af572c1ad113b0c8762884e41c6c88e71 Feb 02 10:44:38 crc kubenswrapper[4909]: I0202 10:44:38.934483 4909 generic.go:334] "Generic (PLEG): container finished" podID="ac86f546-05a5-45b6-ae25-1229d7cfdae8" containerID="c74886b0ddd092e8bc4964aaf93feab4a59e52cfc23dbde06ed9760525a9f2a8" exitCode=0 Feb 02 10:44:38 crc kubenswrapper[4909]: I0202 10:44:38.934527 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5dmd7" event={"ID":"ac86f546-05a5-45b6-ae25-1229d7cfdae8","Type":"ContainerDied","Data":"c74886b0ddd092e8bc4964aaf93feab4a59e52cfc23dbde06ed9760525a9f2a8"} Feb 02 10:44:38 crc kubenswrapper[4909]: I0202 10:44:38.934552 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5dmd7" event={"ID":"ac86f546-05a5-45b6-ae25-1229d7cfdae8","Type":"ContainerStarted","Data":"2693301be313183d88731cc934147c0af572c1ad113b0c8762884e41c6c88e71"} Feb 02 10:44:40 crc kubenswrapper[4909]: I0202 10:44:40.015655 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-5zcch" Feb 02 10:44:40 crc kubenswrapper[4909]: I0202 10:44:40.016758 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-5zcch" Feb 02 10:44:40 crc kubenswrapper[4909]: I0202 10:44:40.204940 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-5zcch"] Feb 02 10:44:40 crc kubenswrapper[4909]: W0202 10:44:40.210503 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f2752f4_33ed_4162_a9b6_481c1fc80957.slice/crio-eae55ab1e01bfdfabfa25383848ebcd2c6ba1b3d91c92a3d43ce333c4b02e102 WatchSource:0}: Error finding container eae55ab1e01bfdfabfa25383848ebcd2c6ba1b3d91c92a3d43ce333c4b02e102: Status 404 returned error can't find the container with id eae55ab1e01bfdfabfa25383848ebcd2c6ba1b3d91c92a3d43ce333c4b02e102 Feb 02 10:44:40 crc kubenswrapper[4909]: I0202 10:44:40.947784 4909 generic.go:334] "Generic (PLEG): container finished" podID="ac86f546-05a5-45b6-ae25-1229d7cfdae8" containerID="e01abdb11b9e4b0f4b903df91502b3b7855b369379ac6a8ddad8a8e4e7c22c40" exitCode=0 Feb 02 10:44:40 crc kubenswrapper[4909]: I0202 10:44:40.947849 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5dmd7" event={"ID":"ac86f546-05a5-45b6-ae25-1229d7cfdae8","Type":"ContainerDied","Data":"e01abdb11b9e4b0f4b903df91502b3b7855b369379ac6a8ddad8a8e4e7c22c40"} Feb 02 10:44:40 crc kubenswrapper[4909]: I0202 10:44:40.949437 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-5zcch" event={"ID":"3f2752f4-33ed-4162-a9b6-481c1fc80957","Type":"ContainerStarted","Data":"eae55ab1e01bfdfabfa25383848ebcd2c6ba1b3d91c92a3d43ce333c4b02e102"} Feb 02 10:44:41 crc kubenswrapper[4909]: I0202 10:44:41.961070 4909 generic.go:334] "Generic (PLEG): container finished" podID="3f2752f4-33ed-4162-a9b6-481c1fc80957" containerID="1c4c423389ba1bc8c182ae50d3353d3b414a7fea0a568cc4c2a881d651e4a59d" exitCode=0 Feb 02 10:44:41 crc kubenswrapper[4909]: I0202 10:44:41.961189 4909 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="crc-storage/crc-storage-crc-5zcch" event={"ID":"3f2752f4-33ed-4162-a9b6-481c1fc80957","Type":"ContainerDied","Data":"1c4c423389ba1bc8c182ae50d3353d3b414a7fea0a568cc4c2a881d651e4a59d"} Feb 02 10:44:41 crc kubenswrapper[4909]: I0202 10:44:41.964376 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5dmd7" event={"ID":"ac86f546-05a5-45b6-ae25-1229d7cfdae8","Type":"ContainerStarted","Data":"a92037f6017b98c8b0646d18da55c10dc2c09469356d2a40a7cc9aa965a39861"} Feb 02 10:44:41 crc kubenswrapper[4909]: I0202 10:44:41.994668 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5dmd7" podStartSLOduration=2.44238824 podStartE2EDuration="4.994648549s" podCreationTimestamp="2026-02-02 10:44:37 +0000 UTC" firstStartedPulling="2026-02-02 10:44:38.935721052 +0000 UTC m=+804.681821777" lastFinishedPulling="2026-02-02 10:44:41.487981361 +0000 UTC m=+807.234082086" observedRunningTime="2026-02-02 10:44:41.990887002 +0000 UTC m=+807.736987747" watchObservedRunningTime="2026-02-02 10:44:41.994648549 +0000 UTC m=+807.740749284" Feb 02 10:44:43 crc kubenswrapper[4909]: I0202 10:44:43.186043 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-5zcch" Feb 02 10:44:43 crc kubenswrapper[4909]: I0202 10:44:43.208356 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzr8z\" (UniqueName: \"kubernetes.io/projected/3f2752f4-33ed-4162-a9b6-481c1fc80957-kube-api-access-qzr8z\") pod \"3f2752f4-33ed-4162-a9b6-481c1fc80957\" (UID: \"3f2752f4-33ed-4162-a9b6-481c1fc80957\") " Feb 02 10:44:43 crc kubenswrapper[4909]: I0202 10:44:43.208457 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3f2752f4-33ed-4162-a9b6-481c1fc80957-crc-storage\") pod \"3f2752f4-33ed-4162-a9b6-481c1fc80957\" (UID: \"3f2752f4-33ed-4162-a9b6-481c1fc80957\") " Feb 02 10:44:43 crc kubenswrapper[4909]: I0202 10:44:43.208532 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3f2752f4-33ed-4162-a9b6-481c1fc80957-node-mnt\") pod \"3f2752f4-33ed-4162-a9b6-481c1fc80957\" (UID: \"3f2752f4-33ed-4162-a9b6-481c1fc80957\") " Feb 02 10:44:43 crc kubenswrapper[4909]: I0202 10:44:43.208691 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f2752f4-33ed-4162-a9b6-481c1fc80957-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "3f2752f4-33ed-4162-a9b6-481c1fc80957" (UID: "3f2752f4-33ed-4162-a9b6-481c1fc80957"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:44:43 crc kubenswrapper[4909]: I0202 10:44:43.209019 4909 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/3f2752f4-33ed-4162-a9b6-481c1fc80957-node-mnt\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:43 crc kubenswrapper[4909]: I0202 10:44:43.214182 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f2752f4-33ed-4162-a9b6-481c1fc80957-kube-api-access-qzr8z" (OuterVolumeSpecName: "kube-api-access-qzr8z") pod "3f2752f4-33ed-4162-a9b6-481c1fc80957" (UID: "3f2752f4-33ed-4162-a9b6-481c1fc80957"). InnerVolumeSpecName "kube-api-access-qzr8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:44:43 crc kubenswrapper[4909]: I0202 10:44:43.222350 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f2752f4-33ed-4162-a9b6-481c1fc80957-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "3f2752f4-33ed-4162-a9b6-481c1fc80957" (UID: "3f2752f4-33ed-4162-a9b6-481c1fc80957"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:44:43 crc kubenswrapper[4909]: I0202 10:44:43.310179 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzr8z\" (UniqueName: \"kubernetes.io/projected/3f2752f4-33ed-4162-a9b6-481c1fc80957-kube-api-access-qzr8z\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:43 crc kubenswrapper[4909]: I0202 10:44:43.310220 4909 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/3f2752f4-33ed-4162-a9b6-481c1fc80957-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:43 crc kubenswrapper[4909]: I0202 10:44:43.979639 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-5zcch" event={"ID":"3f2752f4-33ed-4162-a9b6-481c1fc80957","Type":"ContainerDied","Data":"eae55ab1e01bfdfabfa25383848ebcd2c6ba1b3d91c92a3d43ce333c4b02e102"} Feb 02 10:44:43 crc kubenswrapper[4909]: I0202 10:44:43.979754 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eae55ab1e01bfdfabfa25383848ebcd2c6ba1b3d91c92a3d43ce333c4b02e102" Feb 02 10:44:43 crc kubenswrapper[4909]: I0202 10:44:43.979680 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-5zcch" Feb 02 10:44:48 crc kubenswrapper[4909]: I0202 10:44:48.252374 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5dmd7" Feb 02 10:44:48 crc kubenswrapper[4909]: I0202 10:44:48.252971 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5dmd7" Feb 02 10:44:48 crc kubenswrapper[4909]: I0202 10:44:48.288650 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5dmd7" Feb 02 10:44:49 crc kubenswrapper[4909]: I0202 10:44:49.055574 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5dmd7" Feb 02 10:44:50 crc kubenswrapper[4909]: I0202 10:44:50.118015 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5dmd7"] Feb 02 10:44:50 crc kubenswrapper[4909]: I0202 10:44:50.771403 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f"] Feb 02 10:44:50 crc kubenswrapper[4909]: E0202 10:44:50.772000 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f2752f4-33ed-4162-a9b6-481c1fc80957" containerName="storage" Feb 02 10:44:50 crc kubenswrapper[4909]: I0202 10:44:50.772030 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f2752f4-33ed-4162-a9b6-481c1fc80957" containerName="storage" Feb 02 10:44:50 crc kubenswrapper[4909]: I0202 10:44:50.772214 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f2752f4-33ed-4162-a9b6-481c1fc80957" containerName="storage" Feb 02 10:44:50 crc kubenswrapper[4909]: I0202 10:44:50.773207 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f" Feb 02 10:44:50 crc kubenswrapper[4909]: I0202 10:44:50.775156 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 02 10:44:50 crc kubenswrapper[4909]: I0202 10:44:50.787715 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f"] Feb 02 10:44:50 crc kubenswrapper[4909]: I0202 10:44:50.898071 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a163eaee-108d-441f-b814-93c208605cd2-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f\" (UID: \"a163eaee-108d-441f-b814-93c208605cd2\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f" Feb 02 10:44:50 crc kubenswrapper[4909]: I0202 10:44:50.898203 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6sb9\" (UniqueName: \"kubernetes.io/projected/a163eaee-108d-441f-b814-93c208605cd2-kube-api-access-n6sb9\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f\" (UID: \"a163eaee-108d-441f-b814-93c208605cd2\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f" Feb 02 10:44:50 crc kubenswrapper[4909]: I0202 10:44:50.898274 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a163eaee-108d-441f-b814-93c208605cd2-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f\" (UID: \"a163eaee-108d-441f-b814-93c208605cd2\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f" Feb 02 10:44:50 crc kubenswrapper[4909]: 
I0202 10:44:50.999678 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a163eaee-108d-441f-b814-93c208605cd2-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f\" (UID: \"a163eaee-108d-441f-b814-93c208605cd2\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f" Feb 02 10:44:51 crc kubenswrapper[4909]: I0202 10:44:50.999780 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6sb9\" (UniqueName: \"kubernetes.io/projected/a163eaee-108d-441f-b814-93c208605cd2-kube-api-access-n6sb9\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f\" (UID: \"a163eaee-108d-441f-b814-93c208605cd2\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f" Feb 02 10:44:51 crc kubenswrapper[4909]: I0202 10:44:50.999897 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a163eaee-108d-441f-b814-93c208605cd2-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f\" (UID: \"a163eaee-108d-441f-b814-93c208605cd2\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f" Feb 02 10:44:51 crc kubenswrapper[4909]: I0202 10:44:51.000509 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a163eaee-108d-441f-b814-93c208605cd2-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f\" (UID: \"a163eaee-108d-441f-b814-93c208605cd2\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f" Feb 02 10:44:51 crc kubenswrapper[4909]: I0202 10:44:51.000526 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/a163eaee-108d-441f-b814-93c208605cd2-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f\" (UID: \"a163eaee-108d-441f-b814-93c208605cd2\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f" Feb 02 10:44:51 crc kubenswrapper[4909]: I0202 10:44:51.012361 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5dmd7" podUID="ac86f546-05a5-45b6-ae25-1229d7cfdae8" containerName="registry-server" containerID="cri-o://a92037f6017b98c8b0646d18da55c10dc2c09469356d2a40a7cc9aa965a39861" gracePeriod=2 Feb 02 10:44:51 crc kubenswrapper[4909]: I0202 10:44:51.059361 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6sb9\" (UniqueName: \"kubernetes.io/projected/a163eaee-108d-441f-b814-93c208605cd2-kube-api-access-n6sb9\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f\" (UID: \"a163eaee-108d-441f-b814-93c208605cd2\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f" Feb 02 10:44:51 crc kubenswrapper[4909]: I0202 10:44:51.092578 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f" Feb 02 10:44:51 crc kubenswrapper[4909]: I0202 10:44:51.385652 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5dmd7" Feb 02 10:44:51 crc kubenswrapper[4909]: I0202 10:44:51.421870 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac86f546-05a5-45b6-ae25-1229d7cfdae8-catalog-content\") pod \"ac86f546-05a5-45b6-ae25-1229d7cfdae8\" (UID: \"ac86f546-05a5-45b6-ae25-1229d7cfdae8\") " Feb 02 10:44:51 crc kubenswrapper[4909]: I0202 10:44:51.522446 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac86f546-05a5-45b6-ae25-1229d7cfdae8-utilities\") pod \"ac86f546-05a5-45b6-ae25-1229d7cfdae8\" (UID: \"ac86f546-05a5-45b6-ae25-1229d7cfdae8\") " Feb 02 10:44:51 crc kubenswrapper[4909]: I0202 10:44:51.522520 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbq57\" (UniqueName: \"kubernetes.io/projected/ac86f546-05a5-45b6-ae25-1229d7cfdae8-kube-api-access-pbq57\") pod \"ac86f546-05a5-45b6-ae25-1229d7cfdae8\" (UID: \"ac86f546-05a5-45b6-ae25-1229d7cfdae8\") " Feb 02 10:44:51 crc kubenswrapper[4909]: I0202 10:44:51.523283 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac86f546-05a5-45b6-ae25-1229d7cfdae8-utilities" (OuterVolumeSpecName: "utilities") pod "ac86f546-05a5-45b6-ae25-1229d7cfdae8" (UID: "ac86f546-05a5-45b6-ae25-1229d7cfdae8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:44:51 crc kubenswrapper[4909]: I0202 10:44:51.526310 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac86f546-05a5-45b6-ae25-1229d7cfdae8-kube-api-access-pbq57" (OuterVolumeSpecName: "kube-api-access-pbq57") pod "ac86f546-05a5-45b6-ae25-1229d7cfdae8" (UID: "ac86f546-05a5-45b6-ae25-1229d7cfdae8"). InnerVolumeSpecName "kube-api-access-pbq57". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:44:51 crc kubenswrapper[4909]: I0202 10:44:51.544379 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac86f546-05a5-45b6-ae25-1229d7cfdae8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac86f546-05a5-45b6-ae25-1229d7cfdae8" (UID: "ac86f546-05a5-45b6-ae25-1229d7cfdae8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:44:51 crc kubenswrapper[4909]: I0202 10:44:51.566025 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f"] Feb 02 10:44:51 crc kubenswrapper[4909]: I0202 10:44:51.624117 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbq57\" (UniqueName: \"kubernetes.io/projected/ac86f546-05a5-45b6-ae25-1229d7cfdae8-kube-api-access-pbq57\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:51 crc kubenswrapper[4909]: I0202 10:44:51.624173 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac86f546-05a5-45b6-ae25-1229d7cfdae8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:51 crc kubenswrapper[4909]: I0202 10:44:51.624183 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac86f546-05a5-45b6-ae25-1229d7cfdae8-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:51 crc kubenswrapper[4909]: I0202 10:44:51.827043 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8rmxm" Feb 02 10:44:52 crc kubenswrapper[4909]: I0202 10:44:52.018890 4909 generic.go:334] "Generic (PLEG): container finished" podID="ac86f546-05a5-45b6-ae25-1229d7cfdae8" containerID="a92037f6017b98c8b0646d18da55c10dc2c09469356d2a40a7cc9aa965a39861" exitCode=0 Feb 02 10:44:52 crc 
kubenswrapper[4909]: I0202 10:44:52.018930 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5dmd7" Feb 02 10:44:52 crc kubenswrapper[4909]: I0202 10:44:52.018955 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5dmd7" event={"ID":"ac86f546-05a5-45b6-ae25-1229d7cfdae8","Type":"ContainerDied","Data":"a92037f6017b98c8b0646d18da55c10dc2c09469356d2a40a7cc9aa965a39861"} Feb 02 10:44:52 crc kubenswrapper[4909]: I0202 10:44:52.018978 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5dmd7" event={"ID":"ac86f546-05a5-45b6-ae25-1229d7cfdae8","Type":"ContainerDied","Data":"2693301be313183d88731cc934147c0af572c1ad113b0c8762884e41c6c88e71"} Feb 02 10:44:52 crc kubenswrapper[4909]: I0202 10:44:52.018998 4909 scope.go:117] "RemoveContainer" containerID="a92037f6017b98c8b0646d18da55c10dc2c09469356d2a40a7cc9aa965a39861" Feb 02 10:44:52 crc kubenswrapper[4909]: I0202 10:44:52.021244 4909 generic.go:334] "Generic (PLEG): container finished" podID="a163eaee-108d-441f-b814-93c208605cd2" containerID="7ba189e5365842eb4e86fb3c554312ed4a587e0dfad60d57c5fd6adeb5d77228" exitCode=0 Feb 02 10:44:52 crc kubenswrapper[4909]: I0202 10:44:52.021287 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f" event={"ID":"a163eaee-108d-441f-b814-93c208605cd2","Type":"ContainerDied","Data":"7ba189e5365842eb4e86fb3c554312ed4a587e0dfad60d57c5fd6adeb5d77228"} Feb 02 10:44:52 crc kubenswrapper[4909]: I0202 10:44:52.021312 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f" event={"ID":"a163eaee-108d-441f-b814-93c208605cd2","Type":"ContainerStarted","Data":"a7c9dd833b11f7469706b8440895fcce11480ba67c632db6bc1a2b3ee300affc"} Feb 02 10:44:52 crc 
kubenswrapper[4909]: I0202 10:44:52.069540 4909 scope.go:117] "RemoveContainer" containerID="e01abdb11b9e4b0f4b903df91502b3b7855b369379ac6a8ddad8a8e4e7c22c40" Feb 02 10:44:52 crc kubenswrapper[4909]: I0202 10:44:52.075974 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5dmd7"] Feb 02 10:44:52 crc kubenswrapper[4909]: I0202 10:44:52.079903 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5dmd7"] Feb 02 10:44:52 crc kubenswrapper[4909]: I0202 10:44:52.093956 4909 scope.go:117] "RemoveContainer" containerID="c74886b0ddd092e8bc4964aaf93feab4a59e52cfc23dbde06ed9760525a9f2a8" Feb 02 10:44:52 crc kubenswrapper[4909]: I0202 10:44:52.112229 4909 scope.go:117] "RemoveContainer" containerID="a92037f6017b98c8b0646d18da55c10dc2c09469356d2a40a7cc9aa965a39861" Feb 02 10:44:52 crc kubenswrapper[4909]: E0202 10:44:52.112693 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a92037f6017b98c8b0646d18da55c10dc2c09469356d2a40a7cc9aa965a39861\": container with ID starting with a92037f6017b98c8b0646d18da55c10dc2c09469356d2a40a7cc9aa965a39861 not found: ID does not exist" containerID="a92037f6017b98c8b0646d18da55c10dc2c09469356d2a40a7cc9aa965a39861" Feb 02 10:44:52 crc kubenswrapper[4909]: I0202 10:44:52.112739 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a92037f6017b98c8b0646d18da55c10dc2c09469356d2a40a7cc9aa965a39861"} err="failed to get container status \"a92037f6017b98c8b0646d18da55c10dc2c09469356d2a40a7cc9aa965a39861\": rpc error: code = NotFound desc = could not find container \"a92037f6017b98c8b0646d18da55c10dc2c09469356d2a40a7cc9aa965a39861\": container with ID starting with a92037f6017b98c8b0646d18da55c10dc2c09469356d2a40a7cc9aa965a39861 not found: ID does not exist" Feb 02 10:44:52 crc kubenswrapper[4909]: I0202 10:44:52.112772 4909 scope.go:117] 
"RemoveContainer" containerID="e01abdb11b9e4b0f4b903df91502b3b7855b369379ac6a8ddad8a8e4e7c22c40" Feb 02 10:44:52 crc kubenswrapper[4909]: E0202 10:44:52.113047 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e01abdb11b9e4b0f4b903df91502b3b7855b369379ac6a8ddad8a8e4e7c22c40\": container with ID starting with e01abdb11b9e4b0f4b903df91502b3b7855b369379ac6a8ddad8a8e4e7c22c40 not found: ID does not exist" containerID="e01abdb11b9e4b0f4b903df91502b3b7855b369379ac6a8ddad8a8e4e7c22c40" Feb 02 10:44:52 crc kubenswrapper[4909]: I0202 10:44:52.113079 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e01abdb11b9e4b0f4b903df91502b3b7855b369379ac6a8ddad8a8e4e7c22c40"} err="failed to get container status \"e01abdb11b9e4b0f4b903df91502b3b7855b369379ac6a8ddad8a8e4e7c22c40\": rpc error: code = NotFound desc = could not find container \"e01abdb11b9e4b0f4b903df91502b3b7855b369379ac6a8ddad8a8e4e7c22c40\": container with ID starting with e01abdb11b9e4b0f4b903df91502b3b7855b369379ac6a8ddad8a8e4e7c22c40 not found: ID does not exist" Feb 02 10:44:52 crc kubenswrapper[4909]: I0202 10:44:52.113098 4909 scope.go:117] "RemoveContainer" containerID="c74886b0ddd092e8bc4964aaf93feab4a59e52cfc23dbde06ed9760525a9f2a8" Feb 02 10:44:52 crc kubenswrapper[4909]: E0202 10:44:52.113433 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c74886b0ddd092e8bc4964aaf93feab4a59e52cfc23dbde06ed9760525a9f2a8\": container with ID starting with c74886b0ddd092e8bc4964aaf93feab4a59e52cfc23dbde06ed9760525a9f2a8 not found: ID does not exist" containerID="c74886b0ddd092e8bc4964aaf93feab4a59e52cfc23dbde06ed9760525a9f2a8" Feb 02 10:44:52 crc kubenswrapper[4909]: I0202 10:44:52.113460 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c74886b0ddd092e8bc4964aaf93feab4a59e52cfc23dbde06ed9760525a9f2a8"} err="failed to get container status \"c74886b0ddd092e8bc4964aaf93feab4a59e52cfc23dbde06ed9760525a9f2a8\": rpc error: code = NotFound desc = could not find container \"c74886b0ddd092e8bc4964aaf93feab4a59e52cfc23dbde06ed9760525a9f2a8\": container with ID starting with c74886b0ddd092e8bc4964aaf93feab4a59e52cfc23dbde06ed9760525a9f2a8 not found: ID does not exist" Feb 02 10:44:53 crc kubenswrapper[4909]: I0202 10:44:53.022976 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac86f546-05a5-45b6-ae25-1229d7cfdae8" path="/var/lib/kubelet/pods/ac86f546-05a5-45b6-ae25-1229d7cfdae8/volumes" Feb 02 10:44:54 crc kubenswrapper[4909]: I0202 10:44:54.031948 4909 generic.go:334] "Generic (PLEG): container finished" podID="a163eaee-108d-441f-b814-93c208605cd2" containerID="a406801dde492163238dfa97f493da0413a7113804f1cbb95e61d8eb60ea0a73" exitCode=0 Feb 02 10:44:54 crc kubenswrapper[4909]: I0202 10:44:54.032019 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f" event={"ID":"a163eaee-108d-441f-b814-93c208605cd2","Type":"ContainerDied","Data":"a406801dde492163238dfa97f493da0413a7113804f1cbb95e61d8eb60ea0a73"} Feb 02 10:44:55 crc kubenswrapper[4909]: I0202 10:44:55.039700 4909 generic.go:334] "Generic (PLEG): container finished" podID="a163eaee-108d-441f-b814-93c208605cd2" containerID="0ec5ae776db09c5762c5077d6067ec9aa7ff74182bc6535a85f0a83d9aa54039" exitCode=0 Feb 02 10:44:55 crc kubenswrapper[4909]: I0202 10:44:55.039752 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f" event={"ID":"a163eaee-108d-441f-b814-93c208605cd2","Type":"ContainerDied","Data":"0ec5ae776db09c5762c5077d6067ec9aa7ff74182bc6535a85f0a83d9aa54039"} Feb 02 10:44:56 crc kubenswrapper[4909]: I0202 
10:44:56.259871 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f" Feb 02 10:44:56 crc kubenswrapper[4909]: I0202 10:44:56.379167 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a163eaee-108d-441f-b814-93c208605cd2-util\") pod \"a163eaee-108d-441f-b814-93c208605cd2\" (UID: \"a163eaee-108d-441f-b814-93c208605cd2\") " Feb 02 10:44:56 crc kubenswrapper[4909]: I0202 10:44:56.379236 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a163eaee-108d-441f-b814-93c208605cd2-bundle\") pod \"a163eaee-108d-441f-b814-93c208605cd2\" (UID: \"a163eaee-108d-441f-b814-93c208605cd2\") " Feb 02 10:44:56 crc kubenswrapper[4909]: I0202 10:44:56.379277 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6sb9\" (UniqueName: \"kubernetes.io/projected/a163eaee-108d-441f-b814-93c208605cd2-kube-api-access-n6sb9\") pod \"a163eaee-108d-441f-b814-93c208605cd2\" (UID: \"a163eaee-108d-441f-b814-93c208605cd2\") " Feb 02 10:44:56 crc kubenswrapper[4909]: I0202 10:44:56.380156 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a163eaee-108d-441f-b814-93c208605cd2-bundle" (OuterVolumeSpecName: "bundle") pod "a163eaee-108d-441f-b814-93c208605cd2" (UID: "a163eaee-108d-441f-b814-93c208605cd2"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:44:56 crc kubenswrapper[4909]: I0202 10:44:56.384650 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a163eaee-108d-441f-b814-93c208605cd2-kube-api-access-n6sb9" (OuterVolumeSpecName: "kube-api-access-n6sb9") pod "a163eaee-108d-441f-b814-93c208605cd2" (UID: "a163eaee-108d-441f-b814-93c208605cd2"). InnerVolumeSpecName "kube-api-access-n6sb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:44:56 crc kubenswrapper[4909]: I0202 10:44:56.393500 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a163eaee-108d-441f-b814-93c208605cd2-util" (OuterVolumeSpecName: "util") pod "a163eaee-108d-441f-b814-93c208605cd2" (UID: "a163eaee-108d-441f-b814-93c208605cd2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:44:56 crc kubenswrapper[4909]: I0202 10:44:56.480975 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6sb9\" (UniqueName: \"kubernetes.io/projected/a163eaee-108d-441f-b814-93c208605cd2-kube-api-access-n6sb9\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:56 crc kubenswrapper[4909]: I0202 10:44:56.481009 4909 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a163eaee-108d-441f-b814-93c208605cd2-util\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:56 crc kubenswrapper[4909]: I0202 10:44:56.481020 4909 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a163eaee-108d-441f-b814-93c208605cd2-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:44:57 crc kubenswrapper[4909]: I0202 10:44:57.053235 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f" 
event={"ID":"a163eaee-108d-441f-b814-93c208605cd2","Type":"ContainerDied","Data":"a7c9dd833b11f7469706b8440895fcce11480ba67c632db6bc1a2b3ee300affc"} Feb 02 10:44:57 crc kubenswrapper[4909]: I0202 10:44:57.053282 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7c9dd833b11f7469706b8440895fcce11480ba67c632db6bc1a2b3ee300affc" Feb 02 10:44:57 crc kubenswrapper[4909]: I0202 10:44:57.053307 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f" Feb 02 10:44:59 crc kubenswrapper[4909]: I0202 10:44:59.212101 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-vvhvt"] Feb 02 10:44:59 crc kubenswrapper[4909]: E0202 10:44:59.212316 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac86f546-05a5-45b6-ae25-1229d7cfdae8" containerName="extract-utilities" Feb 02 10:44:59 crc kubenswrapper[4909]: I0202 10:44:59.212328 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac86f546-05a5-45b6-ae25-1229d7cfdae8" containerName="extract-utilities" Feb 02 10:44:59 crc kubenswrapper[4909]: E0202 10:44:59.212341 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a163eaee-108d-441f-b814-93c208605cd2" containerName="pull" Feb 02 10:44:59 crc kubenswrapper[4909]: I0202 10:44:59.212347 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a163eaee-108d-441f-b814-93c208605cd2" containerName="pull" Feb 02 10:44:59 crc kubenswrapper[4909]: E0202 10:44:59.212359 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a163eaee-108d-441f-b814-93c208605cd2" containerName="extract" Feb 02 10:44:59 crc kubenswrapper[4909]: I0202 10:44:59.212365 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a163eaee-108d-441f-b814-93c208605cd2" containerName="extract" Feb 02 10:44:59 crc kubenswrapper[4909]: E0202 10:44:59.212372 
4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac86f546-05a5-45b6-ae25-1229d7cfdae8" containerName="extract-content" Feb 02 10:44:59 crc kubenswrapper[4909]: I0202 10:44:59.212379 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac86f546-05a5-45b6-ae25-1229d7cfdae8" containerName="extract-content" Feb 02 10:44:59 crc kubenswrapper[4909]: E0202 10:44:59.212392 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a163eaee-108d-441f-b814-93c208605cd2" containerName="util" Feb 02 10:44:59 crc kubenswrapper[4909]: I0202 10:44:59.212397 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a163eaee-108d-441f-b814-93c208605cd2" containerName="util" Feb 02 10:44:59 crc kubenswrapper[4909]: E0202 10:44:59.212406 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac86f546-05a5-45b6-ae25-1229d7cfdae8" containerName="registry-server" Feb 02 10:44:59 crc kubenswrapper[4909]: I0202 10:44:59.212411 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac86f546-05a5-45b6-ae25-1229d7cfdae8" containerName="registry-server" Feb 02 10:44:59 crc kubenswrapper[4909]: I0202 10:44:59.212492 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac86f546-05a5-45b6-ae25-1229d7cfdae8" containerName="registry-server" Feb 02 10:44:59 crc kubenswrapper[4909]: I0202 10:44:59.212501 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a163eaee-108d-441f-b814-93c208605cd2" containerName="extract" Feb 02 10:44:59 crc kubenswrapper[4909]: I0202 10:44:59.212913 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-vvhvt" Feb 02 10:44:59 crc kubenswrapper[4909]: I0202 10:44:59.216072 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-mhk9k" Feb 02 10:44:59 crc kubenswrapper[4909]: I0202 10:44:59.216175 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 02 10:44:59 crc kubenswrapper[4909]: I0202 10:44:59.216367 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 02 10:44:59 crc kubenswrapper[4909]: I0202 10:44:59.229639 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-vvhvt"] Feb 02 10:44:59 crc kubenswrapper[4909]: I0202 10:44:59.313655 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfvrp\" (UniqueName: \"kubernetes.io/projected/7a287d80-b25f-49de-a25b-5cc2ab9b3096-kube-api-access-dfvrp\") pod \"nmstate-operator-646758c888-vvhvt\" (UID: \"7a287d80-b25f-49de-a25b-5cc2ab9b3096\") " pod="openshift-nmstate/nmstate-operator-646758c888-vvhvt" Feb 02 10:44:59 crc kubenswrapper[4909]: I0202 10:44:59.415559 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfvrp\" (UniqueName: \"kubernetes.io/projected/7a287d80-b25f-49de-a25b-5cc2ab9b3096-kube-api-access-dfvrp\") pod \"nmstate-operator-646758c888-vvhvt\" (UID: \"7a287d80-b25f-49de-a25b-5cc2ab9b3096\") " pod="openshift-nmstate/nmstate-operator-646758c888-vvhvt" Feb 02 10:44:59 crc kubenswrapper[4909]: I0202 10:44:59.436291 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfvrp\" (UniqueName: \"kubernetes.io/projected/7a287d80-b25f-49de-a25b-5cc2ab9b3096-kube-api-access-dfvrp\") pod \"nmstate-operator-646758c888-vvhvt\" (UID: 
\"7a287d80-b25f-49de-a25b-5cc2ab9b3096\") " pod="openshift-nmstate/nmstate-operator-646758c888-vvhvt" Feb 02 10:44:59 crc kubenswrapper[4909]: I0202 10:44:59.528088 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-vvhvt" Feb 02 10:44:59 crc kubenswrapper[4909]: I0202 10:44:59.920187 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-vvhvt"] Feb 02 10:45:00 crc kubenswrapper[4909]: I0202 10:45:00.069047 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-vvhvt" event={"ID":"7a287d80-b25f-49de-a25b-5cc2ab9b3096","Type":"ContainerStarted","Data":"062d55b78a3b8c797ecdaf9b56d9f66a47749adbf72a5a8246347b58e022dc6f"} Feb 02 10:45:00 crc kubenswrapper[4909]: I0202 10:45:00.150932 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500485-vsktj"] Feb 02 10:45:00 crc kubenswrapper[4909]: I0202 10:45:00.151673 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-vsktj" Feb 02 10:45:00 crc kubenswrapper[4909]: I0202 10:45:00.154107 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 02 10:45:00 crc kubenswrapper[4909]: I0202 10:45:00.154122 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 02 10:45:00 crc kubenswrapper[4909]: I0202 10:45:00.161526 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500485-vsktj"] Feb 02 10:45:00 crc kubenswrapper[4909]: I0202 10:45:00.325227 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z764b\" (UniqueName: \"kubernetes.io/projected/995fb696-26f8-4ae2-9552-14bea880b2ff-kube-api-access-z764b\") pod \"collect-profiles-29500485-vsktj\" (UID: \"995fb696-26f8-4ae2-9552-14bea880b2ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-vsktj" Feb 02 10:45:00 crc kubenswrapper[4909]: I0202 10:45:00.325317 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/995fb696-26f8-4ae2-9552-14bea880b2ff-secret-volume\") pod \"collect-profiles-29500485-vsktj\" (UID: \"995fb696-26f8-4ae2-9552-14bea880b2ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-vsktj" Feb 02 10:45:00 crc kubenswrapper[4909]: I0202 10:45:00.325355 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/995fb696-26f8-4ae2-9552-14bea880b2ff-config-volume\") pod \"collect-profiles-29500485-vsktj\" (UID: \"995fb696-26f8-4ae2-9552-14bea880b2ff\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-vsktj" Feb 02 10:45:00 crc kubenswrapper[4909]: I0202 10:45:00.426960 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z764b\" (UniqueName: \"kubernetes.io/projected/995fb696-26f8-4ae2-9552-14bea880b2ff-kube-api-access-z764b\") pod \"collect-profiles-29500485-vsktj\" (UID: \"995fb696-26f8-4ae2-9552-14bea880b2ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-vsktj" Feb 02 10:45:00 crc kubenswrapper[4909]: I0202 10:45:00.427910 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/995fb696-26f8-4ae2-9552-14bea880b2ff-secret-volume\") pod \"collect-profiles-29500485-vsktj\" (UID: \"995fb696-26f8-4ae2-9552-14bea880b2ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-vsktj" Feb 02 10:45:00 crc kubenswrapper[4909]: I0202 10:45:00.428956 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/995fb696-26f8-4ae2-9552-14bea880b2ff-config-volume\") pod \"collect-profiles-29500485-vsktj\" (UID: \"995fb696-26f8-4ae2-9552-14bea880b2ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-vsktj" Feb 02 10:45:00 crc kubenswrapper[4909]: I0202 10:45:00.429831 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/995fb696-26f8-4ae2-9552-14bea880b2ff-config-volume\") pod \"collect-profiles-29500485-vsktj\" (UID: \"995fb696-26f8-4ae2-9552-14bea880b2ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-vsktj" Feb 02 10:45:00 crc kubenswrapper[4909]: I0202 10:45:00.432481 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/995fb696-26f8-4ae2-9552-14bea880b2ff-secret-volume\") pod \"collect-profiles-29500485-vsktj\" (UID: \"995fb696-26f8-4ae2-9552-14bea880b2ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-vsktj" Feb 02 10:45:00 crc kubenswrapper[4909]: I0202 10:45:00.445139 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z764b\" (UniqueName: \"kubernetes.io/projected/995fb696-26f8-4ae2-9552-14bea880b2ff-kube-api-access-z764b\") pod \"collect-profiles-29500485-vsktj\" (UID: \"995fb696-26f8-4ae2-9552-14bea880b2ff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-vsktj" Feb 02 10:45:00 crc kubenswrapper[4909]: I0202 10:45:00.479903 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-vsktj" Feb 02 10:45:00 crc kubenswrapper[4909]: I0202 10:45:00.651559 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500485-vsktj"] Feb 02 10:45:01 crc kubenswrapper[4909]: I0202 10:45:01.076195 4909 generic.go:334] "Generic (PLEG): container finished" podID="995fb696-26f8-4ae2-9552-14bea880b2ff" containerID="c84ec98d2cc34adc0a281112c51ec90de99eb89f53680dc68950becfac8ae100" exitCode=0 Feb 02 10:45:01 crc kubenswrapper[4909]: I0202 10:45:01.076460 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-vsktj" event={"ID":"995fb696-26f8-4ae2-9552-14bea880b2ff","Type":"ContainerDied","Data":"c84ec98d2cc34adc0a281112c51ec90de99eb89f53680dc68950becfac8ae100"} Feb 02 10:45:01 crc kubenswrapper[4909]: I0202 10:45:01.076619 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-vsktj" 
event={"ID":"995fb696-26f8-4ae2-9552-14bea880b2ff","Type":"ContainerStarted","Data":"6fd9dd225711b227ea1f93f0a5e603eea69998082d17e3a255612c310e8eabbc"} Feb 02 10:45:02 crc kubenswrapper[4909]: I0202 10:45:02.082268 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-vvhvt" event={"ID":"7a287d80-b25f-49de-a25b-5cc2ab9b3096","Type":"ContainerStarted","Data":"18c3b1e3873be09a0ba96a2728b21f511a60c3c52d08b64c030d63ae07a767fc"} Feb 02 10:45:02 crc kubenswrapper[4909]: I0202 10:45:02.100535 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-vvhvt" podStartSLOduration=1.245357794 podStartE2EDuration="3.100511374s" podCreationTimestamp="2026-02-02 10:44:59 +0000 UTC" firstStartedPulling="2026-02-02 10:44:59.936793706 +0000 UTC m=+825.682894441" lastFinishedPulling="2026-02-02 10:45:01.791947286 +0000 UTC m=+827.538048021" observedRunningTime="2026-02-02 10:45:02.096732447 +0000 UTC m=+827.842833192" watchObservedRunningTime="2026-02-02 10:45:02.100511374 +0000 UTC m=+827.846612109" Feb 02 10:45:02 crc kubenswrapper[4909]: I0202 10:45:02.320918 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-vsktj" Feb 02 10:45:02 crc kubenswrapper[4909]: I0202 10:45:02.473640 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/995fb696-26f8-4ae2-9552-14bea880b2ff-config-volume\") pod \"995fb696-26f8-4ae2-9552-14bea880b2ff\" (UID: \"995fb696-26f8-4ae2-9552-14bea880b2ff\") " Feb 02 10:45:02 crc kubenswrapper[4909]: I0202 10:45:02.473764 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z764b\" (UniqueName: \"kubernetes.io/projected/995fb696-26f8-4ae2-9552-14bea880b2ff-kube-api-access-z764b\") pod \"995fb696-26f8-4ae2-9552-14bea880b2ff\" (UID: \"995fb696-26f8-4ae2-9552-14bea880b2ff\") " Feb 02 10:45:02 crc kubenswrapper[4909]: I0202 10:45:02.473877 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/995fb696-26f8-4ae2-9552-14bea880b2ff-secret-volume\") pod \"995fb696-26f8-4ae2-9552-14bea880b2ff\" (UID: \"995fb696-26f8-4ae2-9552-14bea880b2ff\") " Feb 02 10:45:02 crc kubenswrapper[4909]: I0202 10:45:02.477840 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/995fb696-26f8-4ae2-9552-14bea880b2ff-config-volume" (OuterVolumeSpecName: "config-volume") pod "995fb696-26f8-4ae2-9552-14bea880b2ff" (UID: "995fb696-26f8-4ae2-9552-14bea880b2ff"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:45:02 crc kubenswrapper[4909]: I0202 10:45:02.485077 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/995fb696-26f8-4ae2-9552-14bea880b2ff-kube-api-access-z764b" (OuterVolumeSpecName: "kube-api-access-z764b") pod "995fb696-26f8-4ae2-9552-14bea880b2ff" (UID: "995fb696-26f8-4ae2-9552-14bea880b2ff"). 
InnerVolumeSpecName "kube-api-access-z764b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:45:02 crc kubenswrapper[4909]: I0202 10:45:02.493559 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/995fb696-26f8-4ae2-9552-14bea880b2ff-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "995fb696-26f8-4ae2-9552-14bea880b2ff" (UID: "995fb696-26f8-4ae2-9552-14bea880b2ff"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:45:02 crc kubenswrapper[4909]: I0202 10:45:02.575136 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z764b\" (UniqueName: \"kubernetes.io/projected/995fb696-26f8-4ae2-9552-14bea880b2ff-kube-api-access-z764b\") on node \"crc\" DevicePath \"\"" Feb 02 10:45:02 crc kubenswrapper[4909]: I0202 10:45:02.575174 4909 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/995fb696-26f8-4ae2-9552-14bea880b2ff-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 02 10:45:02 crc kubenswrapper[4909]: I0202 10:45:02.575186 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/995fb696-26f8-4ae2-9552-14bea880b2ff-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 10:45:02 crc kubenswrapper[4909]: I0202 10:45:02.896823 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-vgpns"] Feb 02 10:45:02 crc kubenswrapper[4909]: E0202 10:45:02.897057 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="995fb696-26f8-4ae2-9552-14bea880b2ff" containerName="collect-profiles" Feb 02 10:45:02 crc kubenswrapper[4909]: I0202 10:45:02.897076 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="995fb696-26f8-4ae2-9552-14bea880b2ff" containerName="collect-profiles" Feb 02 10:45:02 crc kubenswrapper[4909]: I0202 10:45:02.897199 4909 
memory_manager.go:354] "RemoveStaleState removing state" podUID="995fb696-26f8-4ae2-9552-14bea880b2ff" containerName="collect-profiles" Feb 02 10:45:02 crc kubenswrapper[4909]: I0202 10:45:02.897869 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-vgpns" Feb 02 10:45:02 crc kubenswrapper[4909]: I0202 10:45:02.900569 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-sxmw4" Feb 02 10:45:02 crc kubenswrapper[4909]: I0202 10:45:02.906861 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-hjgm5"] Feb 02 10:45:02 crc kubenswrapper[4909]: I0202 10:45:02.907614 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-hjgm5" Feb 02 10:45:02 crc kubenswrapper[4909]: I0202 10:45:02.909243 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 02 10:45:02 crc kubenswrapper[4909]: I0202 10:45:02.929660 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-bc74h"] Feb 02 10:45:02 crc kubenswrapper[4909]: I0202 10:45:02.930419 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-bc74h" Feb 02 10:45:02 crc kubenswrapper[4909]: I0202 10:45:02.935229 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-hjgm5"] Feb 02 10:45:02 crc kubenswrapper[4909]: I0202 10:45:02.969889 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-vgpns"] Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.057616 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-lmkj4"] Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.059392 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-lmkj4" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.069648 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.069885 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.070473 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-8xnpk" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.080865 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfnrj\" (UniqueName: \"kubernetes.io/projected/c888a2e4-c3b1-4cf7-8058-c724fbf2cc74-kube-api-access-zfnrj\") pod \"nmstate-metrics-54757c584b-vgpns\" (UID: \"c888a2e4-c3b1-4cf7-8058-c724fbf2cc74\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-vgpns" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.080949 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: 
\"kubernetes.io/host-path/9240a23a-f4b9-48ca-a3a6-42dd07d0e461-ovs-socket\") pod \"nmstate-handler-bc74h\" (UID: \"9240a23a-f4b9-48ca-a3a6-42dd07d0e461\") " pod="openshift-nmstate/nmstate-handler-bc74h" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.080999 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9240a23a-f4b9-48ca-a3a6-42dd07d0e461-dbus-socket\") pod \"nmstate-handler-bc74h\" (UID: \"9240a23a-f4b9-48ca-a3a6-42dd07d0e461\") " pod="openshift-nmstate/nmstate-handler-bc74h" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.081019 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/96651658-986f-45ff-a87b-c84f9d98848b-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-hjgm5\" (UID: \"96651658-986f-45ff-a87b-c84f9d98848b\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-hjgm5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.081033 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz8kg\" (UniqueName: \"kubernetes.io/projected/9240a23a-f4b9-48ca-a3a6-42dd07d0e461-kube-api-access-cz8kg\") pod \"nmstate-handler-bc74h\" (UID: \"9240a23a-f4b9-48ca-a3a6-42dd07d0e461\") " pod="openshift-nmstate/nmstate-handler-bc74h" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.081087 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9240a23a-f4b9-48ca-a3a6-42dd07d0e461-nmstate-lock\") pod \"nmstate-handler-bc74h\" (UID: \"9240a23a-f4b9-48ca-a3a6-42dd07d0e461\") " pod="openshift-nmstate/nmstate-handler-bc74h" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.081154 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tp4fm\" (UniqueName: \"kubernetes.io/projected/96651658-986f-45ff-a87b-c84f9d98848b-kube-api-access-tp4fm\") pod \"nmstate-webhook-8474b5b9d8-hjgm5\" (UID: \"96651658-986f-45ff-a87b-c84f9d98848b\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-hjgm5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.082892 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-lmkj4"] Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.126287 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-vsktj" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.126618 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500485-vsktj" event={"ID":"995fb696-26f8-4ae2-9552-14bea880b2ff","Type":"ContainerDied","Data":"6fd9dd225711b227ea1f93f0a5e603eea69998082d17e3a255612c310e8eabbc"} Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.126733 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fd9dd225711b227ea1f93f0a5e603eea69998082d17e3a255612c310e8eabbc" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.182034 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9240a23a-f4b9-48ca-a3a6-42dd07d0e461-ovs-socket\") pod \"nmstate-handler-bc74h\" (UID: \"9240a23a-f4b9-48ca-a3a6-42dd07d0e461\") " pod="openshift-nmstate/nmstate-handler-bc74h" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.182310 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9240a23a-f4b9-48ca-a3a6-42dd07d0e461-dbus-socket\") pod \"nmstate-handler-bc74h\" (UID: \"9240a23a-f4b9-48ca-a3a6-42dd07d0e461\") " pod="openshift-nmstate/nmstate-handler-bc74h" Feb 02 
10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.182192 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9240a23a-f4b9-48ca-a3a6-42dd07d0e461-ovs-socket\") pod \"nmstate-handler-bc74h\" (UID: \"9240a23a-f4b9-48ca-a3a6-42dd07d0e461\") " pod="openshift-nmstate/nmstate-handler-bc74h" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.182444 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/96651658-986f-45ff-a87b-c84f9d98848b-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-hjgm5\" (UID: \"96651658-986f-45ff-a87b-c84f9d98848b\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-hjgm5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.182510 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz8kg\" (UniqueName: \"kubernetes.io/projected/9240a23a-f4b9-48ca-a3a6-42dd07d0e461-kube-api-access-cz8kg\") pod \"nmstate-handler-bc74h\" (UID: \"9240a23a-f4b9-48ca-a3a6-42dd07d0e461\") " pod="openshift-nmstate/nmstate-handler-bc74h" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.182591 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/397b3083-61cc-455c-aadb-80b21941b774-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-lmkj4\" (UID: \"397b3083-61cc-455c-aadb-80b21941b774\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-lmkj4" Feb 02 10:45:03 crc kubenswrapper[4909]: E0202 10:45:03.182641 4909 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 02 10:45:03 crc kubenswrapper[4909]: E0202 10:45:03.182721 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96651658-986f-45ff-a87b-c84f9d98848b-tls-key-pair 
podName:96651658-986f-45ff-a87b-c84f9d98848b nodeName:}" failed. No retries permitted until 2026-02-02 10:45:03.682701378 +0000 UTC m=+829.428802183 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/96651658-986f-45ff-a87b-c84f9d98848b-tls-key-pair") pod "nmstate-webhook-8474b5b9d8-hjgm5" (UID: "96651658-986f-45ff-a87b-c84f9d98848b") : secret "openshift-nmstate-webhook" not found Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.182719 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/397b3083-61cc-455c-aadb-80b21941b774-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-lmkj4\" (UID: \"397b3083-61cc-455c-aadb-80b21941b774\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-lmkj4" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.182746 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9240a23a-f4b9-48ca-a3a6-42dd07d0e461-dbus-socket\") pod \"nmstate-handler-bc74h\" (UID: \"9240a23a-f4b9-48ca-a3a6-42dd07d0e461\") " pod="openshift-nmstate/nmstate-handler-bc74h" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.182830 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9240a23a-f4b9-48ca-a3a6-42dd07d0e461-nmstate-lock\") pod \"nmstate-handler-bc74h\" (UID: \"9240a23a-f4b9-48ca-a3a6-42dd07d0e461\") " pod="openshift-nmstate/nmstate-handler-bc74h" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.182906 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlkpg\" (UniqueName: \"kubernetes.io/projected/397b3083-61cc-455c-aadb-80b21941b774-kube-api-access-rlkpg\") pod \"nmstate-console-plugin-7754f76f8b-lmkj4\" (UID: 
\"397b3083-61cc-455c-aadb-80b21941b774\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-lmkj4" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.182937 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9240a23a-f4b9-48ca-a3a6-42dd07d0e461-nmstate-lock\") pod \"nmstate-handler-bc74h\" (UID: \"9240a23a-f4b9-48ca-a3a6-42dd07d0e461\") " pod="openshift-nmstate/nmstate-handler-bc74h" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.182943 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp4fm\" (UniqueName: \"kubernetes.io/projected/96651658-986f-45ff-a87b-c84f9d98848b-kube-api-access-tp4fm\") pod \"nmstate-webhook-8474b5b9d8-hjgm5\" (UID: \"96651658-986f-45ff-a87b-c84f9d98848b\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-hjgm5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.183014 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfnrj\" (UniqueName: \"kubernetes.io/projected/c888a2e4-c3b1-4cf7-8058-c724fbf2cc74-kube-api-access-zfnrj\") pod \"nmstate-metrics-54757c584b-vgpns\" (UID: \"c888a2e4-c3b1-4cf7-8058-c724fbf2cc74\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-vgpns" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.201443 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp4fm\" (UniqueName: \"kubernetes.io/projected/96651658-986f-45ff-a87b-c84f9d98848b-kube-api-access-tp4fm\") pod \"nmstate-webhook-8474b5b9d8-hjgm5\" (UID: \"96651658-986f-45ff-a87b-c84f9d98848b\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-hjgm5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.204524 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfnrj\" (UniqueName: \"kubernetes.io/projected/c888a2e4-c3b1-4cf7-8058-c724fbf2cc74-kube-api-access-zfnrj\") pod 
\"nmstate-metrics-54757c584b-vgpns\" (UID: \"c888a2e4-c3b1-4cf7-8058-c724fbf2cc74\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-vgpns" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.210555 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz8kg\" (UniqueName: \"kubernetes.io/projected/9240a23a-f4b9-48ca-a3a6-42dd07d0e461-kube-api-access-cz8kg\") pod \"nmstate-handler-bc74h\" (UID: \"9240a23a-f4b9-48ca-a3a6-42dd07d0e461\") " pod="openshift-nmstate/nmstate-handler-bc74h" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.217476 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-vgpns" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.244313 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-bc74h" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.284128 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/397b3083-61cc-455c-aadb-80b21941b774-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-lmkj4\" (UID: \"397b3083-61cc-455c-aadb-80b21941b774\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-lmkj4" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.284179 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/397b3083-61cc-455c-aadb-80b21941b774-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-lmkj4\" (UID: \"397b3083-61cc-455c-aadb-80b21941b774\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-lmkj4" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.284231 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlkpg\" (UniqueName: 
\"kubernetes.io/projected/397b3083-61cc-455c-aadb-80b21941b774-kube-api-access-rlkpg\") pod \"nmstate-console-plugin-7754f76f8b-lmkj4\" (UID: \"397b3083-61cc-455c-aadb-80b21941b774\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-lmkj4" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.285454 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/397b3083-61cc-455c-aadb-80b21941b774-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-lmkj4\" (UID: \"397b3083-61cc-455c-aadb-80b21941b774\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-lmkj4" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.296792 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/397b3083-61cc-455c-aadb-80b21941b774-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-lmkj4\" (UID: \"397b3083-61cc-455c-aadb-80b21941b774\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-lmkj4" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.297951 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-756dbcc568-5j6h5"] Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.298841 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-756dbcc568-5j6h5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.315059 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlkpg\" (UniqueName: \"kubernetes.io/projected/397b3083-61cc-455c-aadb-80b21941b774-kube-api-access-rlkpg\") pod \"nmstate-console-plugin-7754f76f8b-lmkj4\" (UID: \"397b3083-61cc-455c-aadb-80b21941b774\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-lmkj4" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.315647 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-756dbcc568-5j6h5"] Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.390556 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-lmkj4" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.487354 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fae38c46-8413-475d-9e4b-ff53055fb5ac-console-oauth-config\") pod \"console-756dbcc568-5j6h5\" (UID: \"fae38c46-8413-475d-9e4b-ff53055fb5ac\") " pod="openshift-console/console-756dbcc568-5j6h5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.488004 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fae38c46-8413-475d-9e4b-ff53055fb5ac-oauth-serving-cert\") pod \"console-756dbcc568-5j6h5\" (UID: \"fae38c46-8413-475d-9e4b-ff53055fb5ac\") " pod="openshift-console/console-756dbcc568-5j6h5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.488057 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fae38c46-8413-475d-9e4b-ff53055fb5ac-console-serving-cert\") pod 
\"console-756dbcc568-5j6h5\" (UID: \"fae38c46-8413-475d-9e4b-ff53055fb5ac\") " pod="openshift-console/console-756dbcc568-5j6h5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.488188 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fae38c46-8413-475d-9e4b-ff53055fb5ac-service-ca\") pod \"console-756dbcc568-5j6h5\" (UID: \"fae38c46-8413-475d-9e4b-ff53055fb5ac\") " pod="openshift-console/console-756dbcc568-5j6h5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.488228 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzw6q\" (UniqueName: \"kubernetes.io/projected/fae38c46-8413-475d-9e4b-ff53055fb5ac-kube-api-access-gzw6q\") pod \"console-756dbcc568-5j6h5\" (UID: \"fae38c46-8413-475d-9e4b-ff53055fb5ac\") " pod="openshift-console/console-756dbcc568-5j6h5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.488255 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fae38c46-8413-475d-9e4b-ff53055fb5ac-trusted-ca-bundle\") pod \"console-756dbcc568-5j6h5\" (UID: \"fae38c46-8413-475d-9e4b-ff53055fb5ac\") " pod="openshift-console/console-756dbcc568-5j6h5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.488434 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fae38c46-8413-475d-9e4b-ff53055fb5ac-console-config\") pod \"console-756dbcc568-5j6h5\" (UID: \"fae38c46-8413-475d-9e4b-ff53055fb5ac\") " pod="openshift-console/console-756dbcc568-5j6h5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.589381 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/fae38c46-8413-475d-9e4b-ff53055fb5ac-console-config\") pod \"console-756dbcc568-5j6h5\" (UID: \"fae38c46-8413-475d-9e4b-ff53055fb5ac\") " pod="openshift-console/console-756dbcc568-5j6h5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.589465 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fae38c46-8413-475d-9e4b-ff53055fb5ac-console-oauth-config\") pod \"console-756dbcc568-5j6h5\" (UID: \"fae38c46-8413-475d-9e4b-ff53055fb5ac\") " pod="openshift-console/console-756dbcc568-5j6h5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.589495 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fae38c46-8413-475d-9e4b-ff53055fb5ac-oauth-serving-cert\") pod \"console-756dbcc568-5j6h5\" (UID: \"fae38c46-8413-475d-9e4b-ff53055fb5ac\") " pod="openshift-console/console-756dbcc568-5j6h5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.589521 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fae38c46-8413-475d-9e4b-ff53055fb5ac-console-serving-cert\") pod \"console-756dbcc568-5j6h5\" (UID: \"fae38c46-8413-475d-9e4b-ff53055fb5ac\") " pod="openshift-console/console-756dbcc568-5j6h5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.589560 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fae38c46-8413-475d-9e4b-ff53055fb5ac-service-ca\") pod \"console-756dbcc568-5j6h5\" (UID: \"fae38c46-8413-475d-9e4b-ff53055fb5ac\") " pod="openshift-console/console-756dbcc568-5j6h5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.589579 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzw6q\" (UniqueName: 
\"kubernetes.io/projected/fae38c46-8413-475d-9e4b-ff53055fb5ac-kube-api-access-gzw6q\") pod \"console-756dbcc568-5j6h5\" (UID: \"fae38c46-8413-475d-9e4b-ff53055fb5ac\") " pod="openshift-console/console-756dbcc568-5j6h5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.589599 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fae38c46-8413-475d-9e4b-ff53055fb5ac-trusted-ca-bundle\") pod \"console-756dbcc568-5j6h5\" (UID: \"fae38c46-8413-475d-9e4b-ff53055fb5ac\") " pod="openshift-console/console-756dbcc568-5j6h5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.591047 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fae38c46-8413-475d-9e4b-ff53055fb5ac-service-ca\") pod \"console-756dbcc568-5j6h5\" (UID: \"fae38c46-8413-475d-9e4b-ff53055fb5ac\") " pod="openshift-console/console-756dbcc568-5j6h5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.591607 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fae38c46-8413-475d-9e4b-ff53055fb5ac-oauth-serving-cert\") pod \"console-756dbcc568-5j6h5\" (UID: \"fae38c46-8413-475d-9e4b-ff53055fb5ac\") " pod="openshift-console/console-756dbcc568-5j6h5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.592299 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fae38c46-8413-475d-9e4b-ff53055fb5ac-console-config\") pod \"console-756dbcc568-5j6h5\" (UID: \"fae38c46-8413-475d-9e4b-ff53055fb5ac\") " pod="openshift-console/console-756dbcc568-5j6h5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.592548 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/fae38c46-8413-475d-9e4b-ff53055fb5ac-trusted-ca-bundle\") pod \"console-756dbcc568-5j6h5\" (UID: \"fae38c46-8413-475d-9e4b-ff53055fb5ac\") " pod="openshift-console/console-756dbcc568-5j6h5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.596936 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fae38c46-8413-475d-9e4b-ff53055fb5ac-console-oauth-config\") pod \"console-756dbcc568-5j6h5\" (UID: \"fae38c46-8413-475d-9e4b-ff53055fb5ac\") " pod="openshift-console/console-756dbcc568-5j6h5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.602598 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fae38c46-8413-475d-9e4b-ff53055fb5ac-console-serving-cert\") pod \"console-756dbcc568-5j6h5\" (UID: \"fae38c46-8413-475d-9e4b-ff53055fb5ac\") " pod="openshift-console/console-756dbcc568-5j6h5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.610266 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzw6q\" (UniqueName: \"kubernetes.io/projected/fae38c46-8413-475d-9e4b-ff53055fb5ac-kube-api-access-gzw6q\") pod \"console-756dbcc568-5j6h5\" (UID: \"fae38c46-8413-475d-9e4b-ff53055fb5ac\") " pod="openshift-console/console-756dbcc568-5j6h5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.634053 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-lmkj4"] Feb 02 10:45:03 crc kubenswrapper[4909]: W0202 10:45:03.639119 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod397b3083_61cc_455c_aadb_80b21941b774.slice/crio-04dc62127f21478a3d46bcd03670ab454b04cabb5ce72712ba70e0d182c95901 WatchSource:0}: Error finding container 04dc62127f21478a3d46bcd03670ab454b04cabb5ce72712ba70e0d182c95901: Status 404 
returned error can't find the container with id 04dc62127f21478a3d46bcd03670ab454b04cabb5ce72712ba70e0d182c95901 Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.643193 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-756dbcc568-5j6h5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.686444 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-vgpns"] Feb 02 10:45:03 crc kubenswrapper[4909]: W0202 10:45:03.689861 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc888a2e4_c3b1_4cf7_8058_c724fbf2cc74.slice/crio-a74a712b31e619b4ad0f78432bb1d3291829cd4495913a0803e96ed84f4c5ec4 WatchSource:0}: Error finding container a74a712b31e619b4ad0f78432bb1d3291829cd4495913a0803e96ed84f4c5ec4: Status 404 returned error can't find the container with id a74a712b31e619b4ad0f78432bb1d3291829cd4495913a0803e96ed84f4c5ec4 Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.690529 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/96651658-986f-45ff-a87b-c84f9d98848b-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-hjgm5\" (UID: \"96651658-986f-45ff-a87b-c84f9d98848b\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-hjgm5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.699426 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/96651658-986f-45ff-a87b-c84f9d98848b-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-hjgm5\" (UID: \"96651658-986f-45ff-a87b-c84f9d98848b\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-hjgm5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.826303 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-hjgm5" Feb 02 10:45:03 crc kubenswrapper[4909]: I0202 10:45:03.827228 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-756dbcc568-5j6h5"] Feb 02 10:45:04 crc kubenswrapper[4909]: I0202 10:45:04.131425 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-756dbcc568-5j6h5" event={"ID":"fae38c46-8413-475d-9e4b-ff53055fb5ac","Type":"ContainerStarted","Data":"46f5f4e6647678332c21ef4c11969945011b75a9f4a82517d7fa227da19e6e02"} Feb 02 10:45:04 crc kubenswrapper[4909]: I0202 10:45:04.131692 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-756dbcc568-5j6h5" event={"ID":"fae38c46-8413-475d-9e4b-ff53055fb5ac","Type":"ContainerStarted","Data":"c2aee5dd3483e77ad0c61f0cb186c0331957034be28fc87b27357a6cdfb1088b"} Feb 02 10:45:04 crc kubenswrapper[4909]: I0202 10:45:04.132767 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-bc74h" event={"ID":"9240a23a-f4b9-48ca-a3a6-42dd07d0e461","Type":"ContainerStarted","Data":"b59c2a6195d67b9b61e27c32db5fed44eea9b07eb5f476f36746585d21f4baf7"} Feb 02 10:45:04 crc kubenswrapper[4909]: I0202 10:45:04.133649 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-lmkj4" event={"ID":"397b3083-61cc-455c-aadb-80b21941b774","Type":"ContainerStarted","Data":"04dc62127f21478a3d46bcd03670ab454b04cabb5ce72712ba70e0d182c95901"} Feb 02 10:45:04 crc kubenswrapper[4909]: I0202 10:45:04.134523 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-vgpns" event={"ID":"c888a2e4-c3b1-4cf7-8058-c724fbf2cc74","Type":"ContainerStarted","Data":"a74a712b31e619b4ad0f78432bb1d3291829cd4495913a0803e96ed84f4c5ec4"} Feb 02 10:45:04 crc kubenswrapper[4909]: I0202 10:45:04.150665 4909 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-console/console-756dbcc568-5j6h5" podStartSLOduration=1.150646455 podStartE2EDuration="1.150646455s" podCreationTimestamp="2026-02-02 10:45:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:45:04.146157988 +0000 UTC m=+829.892258723" watchObservedRunningTime="2026-02-02 10:45:04.150646455 +0000 UTC m=+829.896747190" Feb 02 10:45:04 crc kubenswrapper[4909]: I0202 10:45:04.240569 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-hjgm5"] Feb 02 10:45:04 crc kubenswrapper[4909]: W0202 10:45:04.250400 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96651658_986f_45ff_a87b_c84f9d98848b.slice/crio-22e155e6368350038da960be6341d3b95a5872feaaf0959fcf6e418430f5359c WatchSource:0}: Error finding container 22e155e6368350038da960be6341d3b95a5872feaaf0959fcf6e418430f5359c: Status 404 returned error can't find the container with id 22e155e6368350038da960be6341d3b95a5872feaaf0959fcf6e418430f5359c Feb 02 10:45:05 crc kubenswrapper[4909]: I0202 10:45:05.141559 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-hjgm5" event={"ID":"96651658-986f-45ff-a87b-c84f9d98848b","Type":"ContainerStarted","Data":"22e155e6368350038da960be6341d3b95a5872feaaf0959fcf6e418430f5359c"} Feb 02 10:45:06 crc kubenswrapper[4909]: I0202 10:45:06.147563 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-bc74h" event={"ID":"9240a23a-f4b9-48ca-a3a6-42dd07d0e461","Type":"ContainerStarted","Data":"9b52d1d21a5a35738f1f1fb4cd932df0b7453a5ab18bfcc979a4a7d2aaa57a22"} Feb 02 10:45:06 crc kubenswrapper[4909]: I0202 10:45:06.147870 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-bc74h" Feb 02 10:45:06 
crc kubenswrapper[4909]: I0202 10:45:06.149754 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-lmkj4" event={"ID":"397b3083-61cc-455c-aadb-80b21941b774","Type":"ContainerStarted","Data":"bcb9e3f8c648173bd3b4b48379aaacd39a6a8a69ad30fefc3406e8958bba65db"} Feb 02 10:45:06 crc kubenswrapper[4909]: I0202 10:45:06.151456 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-hjgm5" event={"ID":"96651658-986f-45ff-a87b-c84f9d98848b","Type":"ContainerStarted","Data":"20d98987eac0ca08fd0e28597f116673bd42b24c5e835170578a56ac97f79f7d"} Feb 02 10:45:06 crc kubenswrapper[4909]: I0202 10:45:06.151583 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-hjgm5" Feb 02 10:45:06 crc kubenswrapper[4909]: I0202 10:45:06.153029 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-vgpns" event={"ID":"c888a2e4-c3b1-4cf7-8058-c724fbf2cc74","Type":"ContainerStarted","Data":"0e4835fd336684b0a8d9d24ab2f3dccb13607df4a86f296bfa2492456634177c"} Feb 02 10:45:06 crc kubenswrapper[4909]: I0202 10:45:06.168022 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-bc74h" podStartSLOduration=1.6875837470000001 podStartE2EDuration="4.168003384s" podCreationTimestamp="2026-02-02 10:45:02 +0000 UTC" firstStartedPulling="2026-02-02 10:45:03.276420752 +0000 UTC m=+829.022521487" lastFinishedPulling="2026-02-02 10:45:05.756840389 +0000 UTC m=+831.502941124" observedRunningTime="2026-02-02 10:45:06.167426408 +0000 UTC m=+831.913527143" watchObservedRunningTime="2026-02-02 10:45:06.168003384 +0000 UTC m=+831.914104119" Feb 02 10:45:06 crc kubenswrapper[4909]: I0202 10:45:06.184037 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-hjgm5" 
podStartSLOduration=2.679514465 podStartE2EDuration="4.184017349s" podCreationTimestamp="2026-02-02 10:45:02 +0000 UTC" firstStartedPulling="2026-02-02 10:45:04.25321179 +0000 UTC m=+829.999312525" lastFinishedPulling="2026-02-02 10:45:05.757714674 +0000 UTC m=+831.503815409" observedRunningTime="2026-02-02 10:45:06.181542909 +0000 UTC m=+831.927643654" watchObservedRunningTime="2026-02-02 10:45:06.184017349 +0000 UTC m=+831.930118084" Feb 02 10:45:08 crc kubenswrapper[4909]: I0202 10:45:08.166641 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-vgpns" event={"ID":"c888a2e4-c3b1-4cf7-8058-c724fbf2cc74","Type":"ContainerStarted","Data":"3569a195357ebbb06308f69843db5ea2d86777561375c51a791a4080e8594632"} Feb 02 10:45:08 crc kubenswrapper[4909]: I0202 10:45:08.184303 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-lmkj4" podStartSLOduration=3.069966258 podStartE2EDuration="5.184287422s" podCreationTimestamp="2026-02-02 10:45:03 +0000 UTC" firstStartedPulling="2026-02-02 10:45:03.641218028 +0000 UTC m=+829.387318763" lastFinishedPulling="2026-02-02 10:45:05.755539192 +0000 UTC m=+831.501639927" observedRunningTime="2026-02-02 10:45:06.214167526 +0000 UTC m=+831.960268261" watchObservedRunningTime="2026-02-02 10:45:08.184287422 +0000 UTC m=+833.930388157" Feb 02 10:45:08 crc kubenswrapper[4909]: I0202 10:45:08.186210 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-vgpns" podStartSLOduration=1.935030067 podStartE2EDuration="6.186203906s" podCreationTimestamp="2026-02-02 10:45:02 +0000 UTC" firstStartedPulling="2026-02-02 10:45:03.692917718 +0000 UTC m=+829.439018453" lastFinishedPulling="2026-02-02 10:45:07.944091557 +0000 UTC m=+833.690192292" observedRunningTime="2026-02-02 10:45:08.182572853 +0000 UTC m=+833.928673598" watchObservedRunningTime="2026-02-02 
10:45:08.186203906 +0000 UTC m=+833.932304631" Feb 02 10:45:13 crc kubenswrapper[4909]: I0202 10:45:13.261446 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-bc74h" Feb 02 10:45:13 crc kubenswrapper[4909]: I0202 10:45:13.644539 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-756dbcc568-5j6h5" Feb 02 10:45:13 crc kubenswrapper[4909]: I0202 10:45:13.644607 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-756dbcc568-5j6h5" Feb 02 10:45:13 crc kubenswrapper[4909]: I0202 10:45:13.649479 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-756dbcc568-5j6h5" Feb 02 10:45:14 crc kubenswrapper[4909]: I0202 10:45:14.213763 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-756dbcc568-5j6h5" Feb 02 10:45:14 crc kubenswrapper[4909]: I0202 10:45:14.280629 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-thqss"] Feb 02 10:45:23 crc kubenswrapper[4909]: I0202 10:45:23.835564 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-hjgm5" Feb 02 10:45:35 crc kubenswrapper[4909]: I0202 10:45:35.574846 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q"] Feb 02 10:45:35 crc kubenswrapper[4909]: I0202 10:45:35.576360 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q" Feb 02 10:45:35 crc kubenswrapper[4909]: I0202 10:45:35.579250 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 02 10:45:35 crc kubenswrapper[4909]: I0202 10:45:35.585026 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q"] Feb 02 10:45:35 crc kubenswrapper[4909]: I0202 10:45:35.709006 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4c0bd62a-0449-491c-aa8e-41bf967bd421-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q\" (UID: \"4c0bd62a-0449-491c-aa8e-41bf967bd421\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q" Feb 02 10:45:35 crc kubenswrapper[4909]: I0202 10:45:35.709413 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnhh9\" (UniqueName: \"kubernetes.io/projected/4c0bd62a-0449-491c-aa8e-41bf967bd421-kube-api-access-jnhh9\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q\" (UID: \"4c0bd62a-0449-491c-aa8e-41bf967bd421\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q" Feb 02 10:45:35 crc kubenswrapper[4909]: I0202 10:45:35.709648 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4c0bd62a-0449-491c-aa8e-41bf967bd421-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q\" (UID: \"4c0bd62a-0449-491c-aa8e-41bf967bd421\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q" Feb 02 10:45:35 crc kubenswrapper[4909]: 
I0202 10:45:35.810618 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4c0bd62a-0449-491c-aa8e-41bf967bd421-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q\" (UID: \"4c0bd62a-0449-491c-aa8e-41bf967bd421\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q" Feb 02 10:45:35 crc kubenswrapper[4909]: I0202 10:45:35.810682 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnhh9\" (UniqueName: \"kubernetes.io/projected/4c0bd62a-0449-491c-aa8e-41bf967bd421-kube-api-access-jnhh9\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q\" (UID: \"4c0bd62a-0449-491c-aa8e-41bf967bd421\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q" Feb 02 10:45:35 crc kubenswrapper[4909]: I0202 10:45:35.810735 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4c0bd62a-0449-491c-aa8e-41bf967bd421-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q\" (UID: \"4c0bd62a-0449-491c-aa8e-41bf967bd421\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q" Feb 02 10:45:35 crc kubenswrapper[4909]: I0202 10:45:35.811126 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4c0bd62a-0449-491c-aa8e-41bf967bd421-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q\" (UID: \"4c0bd62a-0449-491c-aa8e-41bf967bd421\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q" Feb 02 10:45:35 crc kubenswrapper[4909]: I0202 10:45:35.811148 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/4c0bd62a-0449-491c-aa8e-41bf967bd421-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q\" (UID: \"4c0bd62a-0449-491c-aa8e-41bf967bd421\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q" Feb 02 10:45:35 crc kubenswrapper[4909]: I0202 10:45:35.838826 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnhh9\" (UniqueName: \"kubernetes.io/projected/4c0bd62a-0449-491c-aa8e-41bf967bd421-kube-api-access-jnhh9\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q\" (UID: \"4c0bd62a-0449-491c-aa8e-41bf967bd421\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q" Feb 02 10:45:35 crc kubenswrapper[4909]: I0202 10:45:35.927315 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q" Feb 02 10:45:36 crc kubenswrapper[4909]: I0202 10:45:36.136015 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q"] Feb 02 10:45:36 crc kubenswrapper[4909]: I0202 10:45:36.323381 4909 generic.go:334] "Generic (PLEG): container finished" podID="4c0bd62a-0449-491c-aa8e-41bf967bd421" containerID="77a0a5c07583b38805089eb93cb75f092d17f74df65bf35a38e0376f56bc45b1" exitCode=0 Feb 02 10:45:36 crc kubenswrapper[4909]: I0202 10:45:36.323428 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q" event={"ID":"4c0bd62a-0449-491c-aa8e-41bf967bd421","Type":"ContainerDied","Data":"77a0a5c07583b38805089eb93cb75f092d17f74df65bf35a38e0376f56bc45b1"} Feb 02 10:45:36 crc kubenswrapper[4909]: I0202 10:45:36.323453 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q" event={"ID":"4c0bd62a-0449-491c-aa8e-41bf967bd421","Type":"ContainerStarted","Data":"f44685432859b6c90f34ebbfb8fa22d4ca426c84ac2904422a8b91af11b530ca"} Feb 02 10:45:38 crc kubenswrapper[4909]: I0202 10:45:38.334689 4909 generic.go:334] "Generic (PLEG): container finished" podID="4c0bd62a-0449-491c-aa8e-41bf967bd421" containerID="dffe3be89533185fcc6fdf5df772944b6d1d22c98ca01cad416750e25f776a3b" exitCode=0 Feb 02 10:45:38 crc kubenswrapper[4909]: I0202 10:45:38.334823 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q" event={"ID":"4c0bd62a-0449-491c-aa8e-41bf967bd421","Type":"ContainerDied","Data":"dffe3be89533185fcc6fdf5df772944b6d1d22c98ca01cad416750e25f776a3b"} Feb 02 10:45:39 crc kubenswrapper[4909]: I0202 10:45:39.316069 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-thqss" podUID="e6595d49-3b53-44fc-a253-a252a53333a2" containerName="console" containerID="cri-o://8d117dee701883ac44f5756c5ca8ba7d23dba807e5bb426f196d3e1e0d88aae3" gracePeriod=15 Feb 02 10:45:39 crc kubenswrapper[4909]: I0202 10:45:39.342528 4909 generic.go:334] "Generic (PLEG): container finished" podID="4c0bd62a-0449-491c-aa8e-41bf967bd421" containerID="1ca353ad56276533631409c7c09db8043eee5306ac97d08d5df676a8bf24e7cd" exitCode=0 Feb 02 10:45:39 crc kubenswrapper[4909]: I0202 10:45:39.342577 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q" event={"ID":"4c0bd62a-0449-491c-aa8e-41bf967bd421","Type":"ContainerDied","Data":"1ca353ad56276533631409c7c09db8043eee5306ac97d08d5df676a8bf24e7cd"} Feb 02 10:45:39 crc kubenswrapper[4909]: I0202 10:45:39.636465 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-f9d7485db-thqss_e6595d49-3b53-44fc-a253-a252a53333a2/console/0.log" Feb 02 10:45:39 crc kubenswrapper[4909]: I0202 10:45:39.636523 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-thqss" Feb 02 10:45:39 crc kubenswrapper[4909]: I0202 10:45:39.760498 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6595d49-3b53-44fc-a253-a252a53333a2-trusted-ca-bundle\") pod \"e6595d49-3b53-44fc-a253-a252a53333a2\" (UID: \"e6595d49-3b53-44fc-a253-a252a53333a2\") " Feb 02 10:45:39 crc kubenswrapper[4909]: I0202 10:45:39.760902 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e6595d49-3b53-44fc-a253-a252a53333a2-console-serving-cert\") pod \"e6595d49-3b53-44fc-a253-a252a53333a2\" (UID: \"e6595d49-3b53-44fc-a253-a252a53333a2\") " Feb 02 10:45:39 crc kubenswrapper[4909]: I0202 10:45:39.760934 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b9wk\" (UniqueName: \"kubernetes.io/projected/e6595d49-3b53-44fc-a253-a252a53333a2-kube-api-access-8b9wk\") pod \"e6595d49-3b53-44fc-a253-a252a53333a2\" (UID: \"e6595d49-3b53-44fc-a253-a252a53333a2\") " Feb 02 10:45:39 crc kubenswrapper[4909]: I0202 10:45:39.760970 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e6595d49-3b53-44fc-a253-a252a53333a2-console-oauth-config\") pod \"e6595d49-3b53-44fc-a253-a252a53333a2\" (UID: \"e6595d49-3b53-44fc-a253-a252a53333a2\") " Feb 02 10:45:39 crc kubenswrapper[4909]: I0202 10:45:39.760990 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/e6595d49-3b53-44fc-a253-a252a53333a2-console-config\") pod \"e6595d49-3b53-44fc-a253-a252a53333a2\" (UID: \"e6595d49-3b53-44fc-a253-a252a53333a2\") " Feb 02 10:45:39 crc kubenswrapper[4909]: I0202 10:45:39.761014 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e6595d49-3b53-44fc-a253-a252a53333a2-service-ca\") pod \"e6595d49-3b53-44fc-a253-a252a53333a2\" (UID: \"e6595d49-3b53-44fc-a253-a252a53333a2\") " Feb 02 10:45:39 crc kubenswrapper[4909]: I0202 10:45:39.761050 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e6595d49-3b53-44fc-a253-a252a53333a2-oauth-serving-cert\") pod \"e6595d49-3b53-44fc-a253-a252a53333a2\" (UID: \"e6595d49-3b53-44fc-a253-a252a53333a2\") " Feb 02 10:45:39 crc kubenswrapper[4909]: I0202 10:45:39.761085 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6595d49-3b53-44fc-a253-a252a53333a2-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e6595d49-3b53-44fc-a253-a252a53333a2" (UID: "e6595d49-3b53-44fc-a253-a252a53333a2"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:45:39 crc kubenswrapper[4909]: I0202 10:45:39.761458 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6595d49-3b53-44fc-a253-a252a53333a2-console-config" (OuterVolumeSpecName: "console-config") pod "e6595d49-3b53-44fc-a253-a252a53333a2" (UID: "e6595d49-3b53-44fc-a253-a252a53333a2"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:45:39 crc kubenswrapper[4909]: I0202 10:45:39.761639 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6595d49-3b53-44fc-a253-a252a53333a2-service-ca" (OuterVolumeSpecName: "service-ca") pod "e6595d49-3b53-44fc-a253-a252a53333a2" (UID: "e6595d49-3b53-44fc-a253-a252a53333a2"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:45:39 crc kubenswrapper[4909]: I0202 10:45:39.761927 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6595d49-3b53-44fc-a253-a252a53333a2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e6595d49-3b53-44fc-a253-a252a53333a2" (UID: "e6595d49-3b53-44fc-a253-a252a53333a2"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:45:39 crc kubenswrapper[4909]: I0202 10:45:39.762300 4909 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e6595d49-3b53-44fc-a253-a252a53333a2-console-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:45:39 crc kubenswrapper[4909]: I0202 10:45:39.762328 4909 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e6595d49-3b53-44fc-a253-a252a53333a2-service-ca\") on node \"crc\" DevicePath \"\"" Feb 02 10:45:39 crc kubenswrapper[4909]: I0202 10:45:39.762338 4909 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e6595d49-3b53-44fc-a253-a252a53333a2-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:45:39 crc kubenswrapper[4909]: I0202 10:45:39.762349 4909 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6595d49-3b53-44fc-a253-a252a53333a2-trusted-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Feb 02 10:45:39 crc kubenswrapper[4909]: I0202 10:45:39.767322 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6595d49-3b53-44fc-a253-a252a53333a2-kube-api-access-8b9wk" (OuterVolumeSpecName: "kube-api-access-8b9wk") pod "e6595d49-3b53-44fc-a253-a252a53333a2" (UID: "e6595d49-3b53-44fc-a253-a252a53333a2"). InnerVolumeSpecName "kube-api-access-8b9wk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:45:39 crc kubenswrapper[4909]: I0202 10:45:39.767384 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6595d49-3b53-44fc-a253-a252a53333a2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e6595d49-3b53-44fc-a253-a252a53333a2" (UID: "e6595d49-3b53-44fc-a253-a252a53333a2"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:45:39 crc kubenswrapper[4909]: I0202 10:45:39.767973 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6595d49-3b53-44fc-a253-a252a53333a2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e6595d49-3b53-44fc-a253-a252a53333a2" (UID: "e6595d49-3b53-44fc-a253-a252a53333a2"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:45:39 crc kubenswrapper[4909]: I0202 10:45:39.863309 4909 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e6595d49-3b53-44fc-a253-a252a53333a2-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 02 10:45:39 crc kubenswrapper[4909]: I0202 10:45:39.863352 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b9wk\" (UniqueName: \"kubernetes.io/projected/e6595d49-3b53-44fc-a253-a252a53333a2-kube-api-access-8b9wk\") on node \"crc\" DevicePath \"\"" Feb 02 10:45:39 crc kubenswrapper[4909]: I0202 10:45:39.863373 4909 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e6595d49-3b53-44fc-a253-a252a53333a2-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:45:40 crc kubenswrapper[4909]: I0202 10:45:40.353190 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-thqss_e6595d49-3b53-44fc-a253-a252a53333a2/console/0.log" Feb 02 10:45:40 crc kubenswrapper[4909]: I0202 10:45:40.353288 4909 generic.go:334] "Generic (PLEG): container finished" podID="e6595d49-3b53-44fc-a253-a252a53333a2" containerID="8d117dee701883ac44f5756c5ca8ba7d23dba807e5bb426f196d3e1e0d88aae3" exitCode=2 Feb 02 10:45:40 crc kubenswrapper[4909]: I0202 10:45:40.353422 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-thqss" Feb 02 10:45:40 crc kubenswrapper[4909]: I0202 10:45:40.353410 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-thqss" event={"ID":"e6595d49-3b53-44fc-a253-a252a53333a2","Type":"ContainerDied","Data":"8d117dee701883ac44f5756c5ca8ba7d23dba807e5bb426f196d3e1e0d88aae3"} Feb 02 10:45:40 crc kubenswrapper[4909]: I0202 10:45:40.353546 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-thqss" event={"ID":"e6595d49-3b53-44fc-a253-a252a53333a2","Type":"ContainerDied","Data":"181978bc01aa6ffc24bb048edec6fdc8696962aff1b749918e63f791fc400b2f"} Feb 02 10:45:40 crc kubenswrapper[4909]: I0202 10:45:40.353603 4909 scope.go:117] "RemoveContainer" containerID="8d117dee701883ac44f5756c5ca8ba7d23dba807e5bb426f196d3e1e0d88aae3" Feb 02 10:45:40 crc kubenswrapper[4909]: I0202 10:45:40.382531 4909 scope.go:117] "RemoveContainer" containerID="8d117dee701883ac44f5756c5ca8ba7d23dba807e5bb426f196d3e1e0d88aae3" Feb 02 10:45:40 crc kubenswrapper[4909]: E0202 10:45:40.383569 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d117dee701883ac44f5756c5ca8ba7d23dba807e5bb426f196d3e1e0d88aae3\": container with ID starting with 8d117dee701883ac44f5756c5ca8ba7d23dba807e5bb426f196d3e1e0d88aae3 not found: ID does not exist" containerID="8d117dee701883ac44f5756c5ca8ba7d23dba807e5bb426f196d3e1e0d88aae3" Feb 02 10:45:40 crc kubenswrapper[4909]: I0202 10:45:40.383620 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d117dee701883ac44f5756c5ca8ba7d23dba807e5bb426f196d3e1e0d88aae3"} err="failed to get container status \"8d117dee701883ac44f5756c5ca8ba7d23dba807e5bb426f196d3e1e0d88aae3\": rpc error: code = NotFound desc = could not find container \"8d117dee701883ac44f5756c5ca8ba7d23dba807e5bb426f196d3e1e0d88aae3\": 
container with ID starting with 8d117dee701883ac44f5756c5ca8ba7d23dba807e5bb426f196d3e1e0d88aae3 not found: ID does not exist" Feb 02 10:45:40 crc kubenswrapper[4909]: I0202 10:45:40.397688 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-thqss"] Feb 02 10:45:40 crc kubenswrapper[4909]: I0202 10:45:40.402009 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-thqss"] Feb 02 10:45:40 crc kubenswrapper[4909]: I0202 10:45:40.654145 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q" Feb 02 10:45:40 crc kubenswrapper[4909]: I0202 10:45:40.672083 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4c0bd62a-0449-491c-aa8e-41bf967bd421-bundle\") pod \"4c0bd62a-0449-491c-aa8e-41bf967bd421\" (UID: \"4c0bd62a-0449-491c-aa8e-41bf967bd421\") " Feb 02 10:45:40 crc kubenswrapper[4909]: I0202 10:45:40.672123 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnhh9\" (UniqueName: \"kubernetes.io/projected/4c0bd62a-0449-491c-aa8e-41bf967bd421-kube-api-access-jnhh9\") pod \"4c0bd62a-0449-491c-aa8e-41bf967bd421\" (UID: \"4c0bd62a-0449-491c-aa8e-41bf967bd421\") " Feb 02 10:45:40 crc kubenswrapper[4909]: I0202 10:45:40.674143 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c0bd62a-0449-491c-aa8e-41bf967bd421-bundle" (OuterVolumeSpecName: "bundle") pod "4c0bd62a-0449-491c-aa8e-41bf967bd421" (UID: "4c0bd62a-0449-491c-aa8e-41bf967bd421"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:45:40 crc kubenswrapper[4909]: I0202 10:45:40.676407 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c0bd62a-0449-491c-aa8e-41bf967bd421-kube-api-access-jnhh9" (OuterVolumeSpecName: "kube-api-access-jnhh9") pod "4c0bd62a-0449-491c-aa8e-41bf967bd421" (UID: "4c0bd62a-0449-491c-aa8e-41bf967bd421"). InnerVolumeSpecName "kube-api-access-jnhh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:45:40 crc kubenswrapper[4909]: I0202 10:45:40.773038 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4c0bd62a-0449-491c-aa8e-41bf967bd421-util\") pod \"4c0bd62a-0449-491c-aa8e-41bf967bd421\" (UID: \"4c0bd62a-0449-491c-aa8e-41bf967bd421\") " Feb 02 10:45:40 crc kubenswrapper[4909]: I0202 10:45:40.773218 4909 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4c0bd62a-0449-491c-aa8e-41bf967bd421-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:45:40 crc kubenswrapper[4909]: I0202 10:45:40.773230 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnhh9\" (UniqueName: \"kubernetes.io/projected/4c0bd62a-0449-491c-aa8e-41bf967bd421-kube-api-access-jnhh9\") on node \"crc\" DevicePath \"\"" Feb 02 10:45:40 crc kubenswrapper[4909]: I0202 10:45:40.786696 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c0bd62a-0449-491c-aa8e-41bf967bd421-util" (OuterVolumeSpecName: "util") pod "4c0bd62a-0449-491c-aa8e-41bf967bd421" (UID: "4c0bd62a-0449-491c-aa8e-41bf967bd421"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:45:40 crc kubenswrapper[4909]: I0202 10:45:40.874003 4909 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4c0bd62a-0449-491c-aa8e-41bf967bd421-util\") on node \"crc\" DevicePath \"\"" Feb 02 10:45:41 crc kubenswrapper[4909]: I0202 10:45:41.022751 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6595d49-3b53-44fc-a253-a252a53333a2" path="/var/lib/kubelet/pods/e6595d49-3b53-44fc-a253-a252a53333a2/volumes" Feb 02 10:45:41 crc kubenswrapper[4909]: I0202 10:45:41.361365 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q" event={"ID":"4c0bd62a-0449-491c-aa8e-41bf967bd421","Type":"ContainerDied","Data":"f44685432859b6c90f34ebbfb8fa22d4ca426c84ac2904422a8b91af11b530ca"} Feb 02 10:45:41 crc kubenswrapper[4909]: I0202 10:45:41.361400 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f44685432859b6c90f34ebbfb8fa22d4ca426c84ac2904422a8b91af11b530ca" Feb 02 10:45:41 crc kubenswrapper[4909]: I0202 10:45:41.361413 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.345377 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-86855cd4c5-5mr2k"] Feb 02 10:45:49 crc kubenswrapper[4909]: E0202 10:45:49.346415 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6595d49-3b53-44fc-a253-a252a53333a2" containerName="console" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.346432 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6595d49-3b53-44fc-a253-a252a53333a2" containerName="console" Feb 02 10:45:49 crc kubenswrapper[4909]: E0202 10:45:49.346442 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c0bd62a-0449-491c-aa8e-41bf967bd421" containerName="pull" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.346449 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c0bd62a-0449-491c-aa8e-41bf967bd421" containerName="pull" Feb 02 10:45:49 crc kubenswrapper[4909]: E0202 10:45:49.346462 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c0bd62a-0449-491c-aa8e-41bf967bd421" containerName="extract" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.346468 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c0bd62a-0449-491c-aa8e-41bf967bd421" containerName="extract" Feb 02 10:45:49 crc kubenswrapper[4909]: E0202 10:45:49.346480 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c0bd62a-0449-491c-aa8e-41bf967bd421" containerName="util" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.346486 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c0bd62a-0449-491c-aa8e-41bf967bd421" containerName="util" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.346614 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6595d49-3b53-44fc-a253-a252a53333a2" 
containerName="console" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.346625 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c0bd62a-0449-491c-aa8e-41bf967bd421" containerName="extract" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.347149 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-86855cd4c5-5mr2k" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.349437 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.349671 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.349717 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-cz5tr" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.349792 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.350610 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.373448 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-86855cd4c5-5mr2k"] Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.470946 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wph2\" (UniqueName: \"kubernetes.io/projected/f5a1b826-7555-4329-bad8-41387595bcdd-kube-api-access-4wph2\") pod \"metallb-operator-controller-manager-86855cd4c5-5mr2k\" (UID: \"f5a1b826-7555-4329-bad8-41387595bcdd\") " 
pod="metallb-system/metallb-operator-controller-manager-86855cd4c5-5mr2k" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.471027 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f5a1b826-7555-4329-bad8-41387595bcdd-apiservice-cert\") pod \"metallb-operator-controller-manager-86855cd4c5-5mr2k\" (UID: \"f5a1b826-7555-4329-bad8-41387595bcdd\") " pod="metallb-system/metallb-operator-controller-manager-86855cd4c5-5mr2k" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.471081 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f5a1b826-7555-4329-bad8-41387595bcdd-webhook-cert\") pod \"metallb-operator-controller-manager-86855cd4c5-5mr2k\" (UID: \"f5a1b826-7555-4329-bad8-41387595bcdd\") " pod="metallb-system/metallb-operator-controller-manager-86855cd4c5-5mr2k" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.571931 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f5a1b826-7555-4329-bad8-41387595bcdd-apiservice-cert\") pod \"metallb-operator-controller-manager-86855cd4c5-5mr2k\" (UID: \"f5a1b826-7555-4329-bad8-41387595bcdd\") " pod="metallb-system/metallb-operator-controller-manager-86855cd4c5-5mr2k" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.572019 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f5a1b826-7555-4329-bad8-41387595bcdd-webhook-cert\") pod \"metallb-operator-controller-manager-86855cd4c5-5mr2k\" (UID: \"f5a1b826-7555-4329-bad8-41387595bcdd\") " pod="metallb-system/metallb-operator-controller-manager-86855cd4c5-5mr2k" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.572069 4909 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-4wph2\" (UniqueName: \"kubernetes.io/projected/f5a1b826-7555-4329-bad8-41387595bcdd-kube-api-access-4wph2\") pod \"metallb-operator-controller-manager-86855cd4c5-5mr2k\" (UID: \"f5a1b826-7555-4329-bad8-41387595bcdd\") " pod="metallb-system/metallb-operator-controller-manager-86855cd4c5-5mr2k" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.578003 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f5a1b826-7555-4329-bad8-41387595bcdd-webhook-cert\") pod \"metallb-operator-controller-manager-86855cd4c5-5mr2k\" (UID: \"f5a1b826-7555-4329-bad8-41387595bcdd\") " pod="metallb-system/metallb-operator-controller-manager-86855cd4c5-5mr2k" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.580469 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f5a1b826-7555-4329-bad8-41387595bcdd-apiservice-cert\") pod \"metallb-operator-controller-manager-86855cd4c5-5mr2k\" (UID: \"f5a1b826-7555-4329-bad8-41387595bcdd\") " pod="metallb-system/metallb-operator-controller-manager-86855cd4c5-5mr2k" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.603594 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wph2\" (UniqueName: \"kubernetes.io/projected/f5a1b826-7555-4329-bad8-41387595bcdd-kube-api-access-4wph2\") pod \"metallb-operator-controller-manager-86855cd4c5-5mr2k\" (UID: \"f5a1b826-7555-4329-bad8-41387595bcdd\") " pod="metallb-system/metallb-operator-controller-manager-86855cd4c5-5mr2k" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.658202 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7cf86474db-m5gkg"] Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.658956 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7cf86474db-m5gkg" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.660884 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-jpmvb" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.661055 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.661099 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.664952 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-86855cd4c5-5mr2k" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.673303 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bb7a1e8d-3cb1-4269-aed3-822874a6b8e6-apiservice-cert\") pod \"metallb-operator-webhook-server-7cf86474db-m5gkg\" (UID: \"bb7a1e8d-3cb1-4269-aed3-822874a6b8e6\") " pod="metallb-system/metallb-operator-webhook-server-7cf86474db-m5gkg" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.673527 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bb7a1e8d-3cb1-4269-aed3-822874a6b8e6-webhook-cert\") pod \"metallb-operator-webhook-server-7cf86474db-m5gkg\" (UID: \"bb7a1e8d-3cb1-4269-aed3-822874a6b8e6\") " pod="metallb-system/metallb-operator-webhook-server-7cf86474db-m5gkg" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.673631 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh6vr\" (UniqueName: 
\"kubernetes.io/projected/bb7a1e8d-3cb1-4269-aed3-822874a6b8e6-kube-api-access-sh6vr\") pod \"metallb-operator-webhook-server-7cf86474db-m5gkg\" (UID: \"bb7a1e8d-3cb1-4269-aed3-822874a6b8e6\") " pod="metallb-system/metallb-operator-webhook-server-7cf86474db-m5gkg" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.710918 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7cf86474db-m5gkg"] Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.776160 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bb7a1e8d-3cb1-4269-aed3-822874a6b8e6-webhook-cert\") pod \"metallb-operator-webhook-server-7cf86474db-m5gkg\" (UID: \"bb7a1e8d-3cb1-4269-aed3-822874a6b8e6\") " pod="metallb-system/metallb-operator-webhook-server-7cf86474db-m5gkg" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.776218 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh6vr\" (UniqueName: \"kubernetes.io/projected/bb7a1e8d-3cb1-4269-aed3-822874a6b8e6-kube-api-access-sh6vr\") pod \"metallb-operator-webhook-server-7cf86474db-m5gkg\" (UID: \"bb7a1e8d-3cb1-4269-aed3-822874a6b8e6\") " pod="metallb-system/metallb-operator-webhook-server-7cf86474db-m5gkg" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.776326 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bb7a1e8d-3cb1-4269-aed3-822874a6b8e6-apiservice-cert\") pod \"metallb-operator-webhook-server-7cf86474db-m5gkg\" (UID: \"bb7a1e8d-3cb1-4269-aed3-822874a6b8e6\") " pod="metallb-system/metallb-operator-webhook-server-7cf86474db-m5gkg" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.779934 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/bb7a1e8d-3cb1-4269-aed3-822874a6b8e6-webhook-cert\") pod \"metallb-operator-webhook-server-7cf86474db-m5gkg\" (UID: \"bb7a1e8d-3cb1-4269-aed3-822874a6b8e6\") " pod="metallb-system/metallb-operator-webhook-server-7cf86474db-m5gkg" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.780018 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bb7a1e8d-3cb1-4269-aed3-822874a6b8e6-apiservice-cert\") pod \"metallb-operator-webhook-server-7cf86474db-m5gkg\" (UID: \"bb7a1e8d-3cb1-4269-aed3-822874a6b8e6\") " pod="metallb-system/metallb-operator-webhook-server-7cf86474db-m5gkg" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.808634 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh6vr\" (UniqueName: \"kubernetes.io/projected/bb7a1e8d-3cb1-4269-aed3-822874a6b8e6-kube-api-access-sh6vr\") pod \"metallb-operator-webhook-server-7cf86474db-m5gkg\" (UID: \"bb7a1e8d-3cb1-4269-aed3-822874a6b8e6\") " pod="metallb-system/metallb-operator-webhook-server-7cf86474db-m5gkg" Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.917554 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-86855cd4c5-5mr2k"] Feb 02 10:45:49 crc kubenswrapper[4909]: W0202 10:45:49.924068 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5a1b826_7555_4329_bad8_41387595bcdd.slice/crio-3f2c3e288ec1249d85ec18615e35bb87708df6ed51827c2ca87631332747865d WatchSource:0}: Error finding container 3f2c3e288ec1249d85ec18615e35bb87708df6ed51827c2ca87631332747865d: Status 404 returned error can't find the container with id 3f2c3e288ec1249d85ec18615e35bb87708df6ed51827c2ca87631332747865d Feb 02 10:45:49 crc kubenswrapper[4909]: I0202 10:45:49.974154 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7cf86474db-m5gkg" Feb 02 10:45:50 crc kubenswrapper[4909]: I0202 10:45:50.180549 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7cf86474db-m5gkg"] Feb 02 10:45:50 crc kubenswrapper[4909]: W0202 10:45:50.190219 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb7a1e8d_3cb1_4269_aed3_822874a6b8e6.slice/crio-82f61c2d8d104c2fbda7a02aaae67f2e1fbf766e6e2233d6922cb01a40422816 WatchSource:0}: Error finding container 82f61c2d8d104c2fbda7a02aaae67f2e1fbf766e6e2233d6922cb01a40422816: Status 404 returned error can't find the container with id 82f61c2d8d104c2fbda7a02aaae67f2e1fbf766e6e2233d6922cb01a40422816 Feb 02 10:45:50 crc kubenswrapper[4909]: I0202 10:45:50.430998 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-86855cd4c5-5mr2k" event={"ID":"f5a1b826-7555-4329-bad8-41387595bcdd","Type":"ContainerStarted","Data":"3f2c3e288ec1249d85ec18615e35bb87708df6ed51827c2ca87631332747865d"} Feb 02 10:45:50 crc kubenswrapper[4909]: I0202 10:45:50.431920 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7cf86474db-m5gkg" event={"ID":"bb7a1e8d-3cb1-4269-aed3-822874a6b8e6","Type":"ContainerStarted","Data":"82f61c2d8d104c2fbda7a02aaae67f2e1fbf766e6e2233d6922cb01a40422816"} Feb 02 10:45:53 crc kubenswrapper[4909]: I0202 10:45:53.449110 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-86855cd4c5-5mr2k" event={"ID":"f5a1b826-7555-4329-bad8-41387595bcdd","Type":"ContainerStarted","Data":"94120d5739cbec9a1643e8a9392e0196bac6ea76e6d3b96423a7f4555260c48e"} Feb 02 10:45:53 crc kubenswrapper[4909]: I0202 10:45:53.450657 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-86855cd4c5-5mr2k" Feb 02 10:45:53 crc kubenswrapper[4909]: I0202 10:45:53.468146 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-86855cd4c5-5mr2k" podStartSLOduration=1.821639695 podStartE2EDuration="4.468131401s" podCreationTimestamp="2026-02-02 10:45:49 +0000 UTC" firstStartedPulling="2026-02-02 10:45:49.926237973 +0000 UTC m=+875.672338708" lastFinishedPulling="2026-02-02 10:45:52.572729679 +0000 UTC m=+878.318830414" observedRunningTime="2026-02-02 10:45:53.466651919 +0000 UTC m=+879.212752654" watchObservedRunningTime="2026-02-02 10:45:53.468131401 +0000 UTC m=+879.214232136" Feb 02 10:45:54 crc kubenswrapper[4909]: I0202 10:45:54.462516 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7cf86474db-m5gkg" event={"ID":"bb7a1e8d-3cb1-4269-aed3-822874a6b8e6","Type":"ContainerStarted","Data":"6c937c7e5dbcecda201dac795f2b34aab72072168739f51808be9895652e86b8"} Feb 02 10:45:54 crc kubenswrapper[4909]: I0202 10:45:54.462962 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7cf86474db-m5gkg" Feb 02 10:45:54 crc kubenswrapper[4909]: I0202 10:45:54.492910 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7cf86474db-m5gkg" podStartSLOduration=1.690739073 podStartE2EDuration="5.492888832s" podCreationTimestamp="2026-02-02 10:45:49 +0000 UTC" firstStartedPulling="2026-02-02 10:45:50.194121641 +0000 UTC m=+875.940222376" lastFinishedPulling="2026-02-02 10:45:53.9962714 +0000 UTC m=+879.742372135" observedRunningTime="2026-02-02 10:45:54.490397431 +0000 UTC m=+880.236498166" watchObservedRunningTime="2026-02-02 10:45:54.492888832 +0000 UTC m=+880.238989557" Feb 02 10:46:09 crc kubenswrapper[4909]: I0202 10:46:09.978857 4909 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7cf86474db-m5gkg" Feb 02 10:46:19 crc kubenswrapper[4909]: I0202 10:46:19.510849 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:46:19 crc kubenswrapper[4909]: I0202 10:46:19.511406 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:46:29 crc kubenswrapper[4909]: I0202 10:46:29.667028 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-86855cd4c5-5mr2k" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.647628 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-x252g"] Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.650678 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-x252g" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.652408 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.652497 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-fhcdh"] Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.652673 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.653140 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-fhcdh" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.660264 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.660378 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-xlfkz" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.665369 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-fhcdh"] Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.732773 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-6w72v"] Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.733564 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-6w72v" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.738965 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-65nk8" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.739751 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.740509 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.740586 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.753697 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-mjpfz"] Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.754735 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-mjpfz" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.761225 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.768929 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f609075c-566c-46ad-bdab-f01502b06571-frr-conf\") pod \"frr-k8s-x252g\" (UID: \"f609075c-566c-46ad-bdab-f01502b06571\") " pod="metallb-system/frr-k8s-x252g" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.768974 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7aa1b8bf-83b9-40e4-81e7-30087a626c01-cert\") pod \"controller-6968d8fdc4-mjpfz\" (UID: \"7aa1b8bf-83b9-40e4-81e7-30087a626c01\") " pod="metallb-system/controller-6968d8fdc4-mjpfz" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.769003 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g6mj\" (UniqueName: \"kubernetes.io/projected/7aa1b8bf-83b9-40e4-81e7-30087a626c01-kube-api-access-2g6mj\") pod \"controller-6968d8fdc4-mjpfz\" (UID: \"7aa1b8bf-83b9-40e4-81e7-30087a626c01\") " pod="metallb-system/controller-6968d8fdc4-mjpfz" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.769032 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7aa1b8bf-83b9-40e4-81e7-30087a626c01-metrics-certs\") pod \"controller-6968d8fdc4-mjpfz\" (UID: \"7aa1b8bf-83b9-40e4-81e7-30087a626c01\") " pod="metallb-system/controller-6968d8fdc4-mjpfz" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.769071 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/a41d250d-abb4-43e0-b2e9-73d610ea3ced-metallb-excludel2\") pod \"speaker-6w72v\" (UID: \"a41d250d-abb4-43e0-b2e9-73d610ea3ced\") " pod="metallb-system/speaker-6w72v" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.769135 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f609075c-566c-46ad-bdab-f01502b06571-frr-sockets\") pod \"frr-k8s-x252g\" (UID: \"f609075c-566c-46ad-bdab-f01502b06571\") " pod="metallb-system/frr-k8s-x252g" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.769161 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f609075c-566c-46ad-bdab-f01502b06571-metrics-certs\") pod \"frr-k8s-x252g\" (UID: \"f609075c-566c-46ad-bdab-f01502b06571\") " pod="metallb-system/frr-k8s-x252g" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.769203 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f609075c-566c-46ad-bdab-f01502b06571-frr-startup\") pod \"frr-k8s-x252g\" (UID: \"f609075c-566c-46ad-bdab-f01502b06571\") " pod="metallb-system/frr-k8s-x252g" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.769241 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f609075c-566c-46ad-bdab-f01502b06571-metrics\") pod \"frr-k8s-x252g\" (UID: \"f609075c-566c-46ad-bdab-f01502b06571\") " pod="metallb-system/frr-k8s-x252g" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.769269 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b71e887-8283-4799-b19a-333c87d6fcaa-cert\") pod 
\"frr-k8s-webhook-server-7df86c4f6c-fhcdh\" (UID: \"1b71e887-8283-4799-b19a-333c87d6fcaa\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-fhcdh" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.769299 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a41d250d-abb4-43e0-b2e9-73d610ea3ced-memberlist\") pod \"speaker-6w72v\" (UID: \"a41d250d-abb4-43e0-b2e9-73d610ea3ced\") " pod="metallb-system/speaker-6w72v" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.769312 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn2vd\" (UniqueName: \"kubernetes.io/projected/a41d250d-abb4-43e0-b2e9-73d610ea3ced-kube-api-access-rn2vd\") pod \"speaker-6w72v\" (UID: \"a41d250d-abb4-43e0-b2e9-73d610ea3ced\") " pod="metallb-system/speaker-6w72v" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.769334 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a41d250d-abb4-43e0-b2e9-73d610ea3ced-metrics-certs\") pod \"speaker-6w72v\" (UID: \"a41d250d-abb4-43e0-b2e9-73d610ea3ced\") " pod="metallb-system/speaker-6w72v" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.769354 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9whd\" (UniqueName: \"kubernetes.io/projected/1b71e887-8283-4799-b19a-333c87d6fcaa-kube-api-access-f9whd\") pod \"frr-k8s-webhook-server-7df86c4f6c-fhcdh\" (UID: \"1b71e887-8283-4799-b19a-333c87d6fcaa\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-fhcdh" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.769376 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g5gb\" (UniqueName: 
\"kubernetes.io/projected/f609075c-566c-46ad-bdab-f01502b06571-kube-api-access-9g5gb\") pod \"frr-k8s-x252g\" (UID: \"f609075c-566c-46ad-bdab-f01502b06571\") " pod="metallb-system/frr-k8s-x252g" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.769392 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f609075c-566c-46ad-bdab-f01502b06571-reloader\") pod \"frr-k8s-x252g\" (UID: \"f609075c-566c-46ad-bdab-f01502b06571\") " pod="metallb-system/frr-k8s-x252g" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.769974 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-mjpfz"] Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.870317 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a41d250d-abb4-43e0-b2e9-73d610ea3ced-metrics-certs\") pod \"speaker-6w72v\" (UID: \"a41d250d-abb4-43e0-b2e9-73d610ea3ced\") " pod="metallb-system/speaker-6w72v" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.870615 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9whd\" (UniqueName: \"kubernetes.io/projected/1b71e887-8283-4799-b19a-333c87d6fcaa-kube-api-access-f9whd\") pod \"frr-k8s-webhook-server-7df86c4f6c-fhcdh\" (UID: \"1b71e887-8283-4799-b19a-333c87d6fcaa\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-fhcdh" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.870729 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g5gb\" (UniqueName: \"kubernetes.io/projected/f609075c-566c-46ad-bdab-f01502b06571-kube-api-access-9g5gb\") pod \"frr-k8s-x252g\" (UID: \"f609075c-566c-46ad-bdab-f01502b06571\") " pod="metallb-system/frr-k8s-x252g" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.870848 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f609075c-566c-46ad-bdab-f01502b06571-reloader\") pod \"frr-k8s-x252g\" (UID: \"f609075c-566c-46ad-bdab-f01502b06571\") " pod="metallb-system/frr-k8s-x252g" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.870954 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f609075c-566c-46ad-bdab-f01502b06571-frr-conf\") pod \"frr-k8s-x252g\" (UID: \"f609075c-566c-46ad-bdab-f01502b06571\") " pod="metallb-system/frr-k8s-x252g" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.871054 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7aa1b8bf-83b9-40e4-81e7-30087a626c01-cert\") pod \"controller-6968d8fdc4-mjpfz\" (UID: \"7aa1b8bf-83b9-40e4-81e7-30087a626c01\") " pod="metallb-system/controller-6968d8fdc4-mjpfz" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.871143 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g6mj\" (UniqueName: \"kubernetes.io/projected/7aa1b8bf-83b9-40e4-81e7-30087a626c01-kube-api-access-2g6mj\") pod \"controller-6968d8fdc4-mjpfz\" (UID: \"7aa1b8bf-83b9-40e4-81e7-30087a626c01\") " pod="metallb-system/controller-6968d8fdc4-mjpfz" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.871235 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7aa1b8bf-83b9-40e4-81e7-30087a626c01-metrics-certs\") pod \"controller-6968d8fdc4-mjpfz\" (UID: \"7aa1b8bf-83b9-40e4-81e7-30087a626c01\") " pod="metallb-system/controller-6968d8fdc4-mjpfz" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.871368 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: 
\"kubernetes.io/configmap/a41d250d-abb4-43e0-b2e9-73d610ea3ced-metallb-excludel2\") pod \"speaker-6w72v\" (UID: \"a41d250d-abb4-43e0-b2e9-73d610ea3ced\") " pod="metallb-system/speaker-6w72v" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.871477 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f609075c-566c-46ad-bdab-f01502b06571-frr-sockets\") pod \"frr-k8s-x252g\" (UID: \"f609075c-566c-46ad-bdab-f01502b06571\") " pod="metallb-system/frr-k8s-x252g" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.871562 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f609075c-566c-46ad-bdab-f01502b06571-metrics-certs\") pod \"frr-k8s-x252g\" (UID: \"f609075c-566c-46ad-bdab-f01502b06571\") " pod="metallb-system/frr-k8s-x252g" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.871659 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f609075c-566c-46ad-bdab-f01502b06571-frr-startup\") pod \"frr-k8s-x252g\" (UID: \"f609075c-566c-46ad-bdab-f01502b06571\") " pod="metallb-system/frr-k8s-x252g" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.871756 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f609075c-566c-46ad-bdab-f01502b06571-metrics\") pod \"frr-k8s-x252g\" (UID: \"f609075c-566c-46ad-bdab-f01502b06571\") " pod="metallb-system/frr-k8s-x252g" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.871880 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b71e887-8283-4799-b19a-333c87d6fcaa-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-fhcdh\" (UID: \"1b71e887-8283-4799-b19a-333c87d6fcaa\") " 
pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-fhcdh" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.872012 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a41d250d-abb4-43e0-b2e9-73d610ea3ced-memberlist\") pod \"speaker-6w72v\" (UID: \"a41d250d-abb4-43e0-b2e9-73d610ea3ced\") " pod="metallb-system/speaker-6w72v" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.872105 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn2vd\" (UniqueName: \"kubernetes.io/projected/a41d250d-abb4-43e0-b2e9-73d610ea3ced-kube-api-access-rn2vd\") pod \"speaker-6w72v\" (UID: \"a41d250d-abb4-43e0-b2e9-73d610ea3ced\") " pod="metallb-system/speaker-6w72v" Feb 02 10:46:30 crc kubenswrapper[4909]: E0202 10:46:30.870465 4909 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.871386 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f609075c-566c-46ad-bdab-f01502b06571-reloader\") pod \"frr-k8s-x252g\" (UID: \"f609075c-566c-46ad-bdab-f01502b06571\") " pod="metallb-system/frr-k8s-x252g" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.871421 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f609075c-566c-46ad-bdab-f01502b06571-frr-conf\") pod \"frr-k8s-x252g\" (UID: \"f609075c-566c-46ad-bdab-f01502b06571\") " pod="metallb-system/frr-k8s-x252g" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.872055 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/a41d250d-abb4-43e0-b2e9-73d610ea3ced-metallb-excludel2\") pod \"speaker-6w72v\" (UID: \"a41d250d-abb4-43e0-b2e9-73d610ea3ced\") " 
pod="metallb-system/speaker-6w72v" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.872222 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f609075c-566c-46ad-bdab-f01502b06571-frr-sockets\") pod \"frr-k8s-x252g\" (UID: \"f609075c-566c-46ad-bdab-f01502b06571\") " pod="metallb-system/frr-k8s-x252g" Feb 02 10:46:30 crc kubenswrapper[4909]: E0202 10:46:30.872269 4909 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Feb 02 10:46:30 crc kubenswrapper[4909]: E0202 10:46:30.872546 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f609075c-566c-46ad-bdab-f01502b06571-metrics-certs podName:f609075c-566c-46ad-bdab-f01502b06571 nodeName:}" failed. No retries permitted until 2026-02-02 10:46:31.372526271 +0000 UTC m=+917.118627006 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f609075c-566c-46ad-bdab-f01502b06571-metrics-certs") pod "frr-k8s-x252g" (UID: "f609075c-566c-46ad-bdab-f01502b06571") : secret "frr-k8s-certs-secret" not found Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.872469 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f609075c-566c-46ad-bdab-f01502b06571-metrics\") pod \"frr-k8s-x252g\" (UID: \"f609075c-566c-46ad-bdab-f01502b06571\") " pod="metallb-system/frr-k8s-x252g" Feb 02 10:46:30 crc kubenswrapper[4909]: E0202 10:46:30.872662 4909 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 02 10:46:30 crc kubenswrapper[4909]: E0202 10:46:30.872776 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a41d250d-abb4-43e0-b2e9-73d610ea3ced-metrics-certs podName:a41d250d-abb4-43e0-b2e9-73d610ea3ced nodeName:}" failed. 
No retries permitted until 2026-02-02 10:46:31.372643514 +0000 UTC m=+917.118744329 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a41d250d-abb4-43e0-b2e9-73d610ea3ced-metrics-certs") pod "speaker-6w72v" (UID: "a41d250d-abb4-43e0-b2e9-73d610ea3ced") : secret "speaker-certs-secret" not found Feb 02 10:46:30 crc kubenswrapper[4909]: E0202 10:46:30.872899 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a41d250d-abb4-43e0-b2e9-73d610ea3ced-memberlist podName:a41d250d-abb4-43e0-b2e9-73d610ea3ced nodeName:}" failed. No retries permitted until 2026-02-02 10:46:31.372883081 +0000 UTC m=+917.118983866 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/a41d250d-abb4-43e0-b2e9-73d610ea3ced-memberlist") pod "speaker-6w72v" (UID: "a41d250d-abb4-43e0-b2e9-73d610ea3ced") : secret "metallb-memberlist" not found Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.873184 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f609075c-566c-46ad-bdab-f01502b06571-frr-startup\") pod \"frr-k8s-x252g\" (UID: \"f609075c-566c-46ad-bdab-f01502b06571\") " pod="metallb-system/frr-k8s-x252g" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.892845 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b71e887-8283-4799-b19a-333c87d6fcaa-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-fhcdh\" (UID: \"1b71e887-8283-4799-b19a-333c87d6fcaa\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-fhcdh" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.893107 4909 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.896492 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7aa1b8bf-83b9-40e4-81e7-30087a626c01-metrics-certs\") pod \"controller-6968d8fdc4-mjpfz\" (UID: \"7aa1b8bf-83b9-40e4-81e7-30087a626c01\") " pod="metallb-system/controller-6968d8fdc4-mjpfz" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.898539 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g5gb\" (UniqueName: \"kubernetes.io/projected/f609075c-566c-46ad-bdab-f01502b06571-kube-api-access-9g5gb\") pod \"frr-k8s-x252g\" (UID: \"f609075c-566c-46ad-bdab-f01502b06571\") " pod="metallb-system/frr-k8s-x252g" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.899465 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7aa1b8bf-83b9-40e4-81e7-30087a626c01-cert\") pod \"controller-6968d8fdc4-mjpfz\" (UID: \"7aa1b8bf-83b9-40e4-81e7-30087a626c01\") " pod="metallb-system/controller-6968d8fdc4-mjpfz" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.905675 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g6mj\" (UniqueName: \"kubernetes.io/projected/7aa1b8bf-83b9-40e4-81e7-30087a626c01-kube-api-access-2g6mj\") pod \"controller-6968d8fdc4-mjpfz\" (UID: \"7aa1b8bf-83b9-40e4-81e7-30087a626c01\") " pod="metallb-system/controller-6968d8fdc4-mjpfz" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.912404 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9whd\" (UniqueName: \"kubernetes.io/projected/1b71e887-8283-4799-b19a-333c87d6fcaa-kube-api-access-f9whd\") pod \"frr-k8s-webhook-server-7df86c4f6c-fhcdh\" (UID: \"1b71e887-8283-4799-b19a-333c87d6fcaa\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-fhcdh" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.913495 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn2vd\" 
(UniqueName: \"kubernetes.io/projected/a41d250d-abb4-43e0-b2e9-73d610ea3ced-kube-api-access-rn2vd\") pod \"speaker-6w72v\" (UID: \"a41d250d-abb4-43e0-b2e9-73d610ea3ced\") " pod="metallb-system/speaker-6w72v" Feb 02 10:46:30 crc kubenswrapper[4909]: I0202 10:46:30.987544 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-fhcdh" Feb 02 10:46:31 crc kubenswrapper[4909]: I0202 10:46:31.124155 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6968d8fdc4-mjpfz" Feb 02 10:46:31 crc kubenswrapper[4909]: I0202 10:46:31.377078 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f609075c-566c-46ad-bdab-f01502b06571-metrics-certs\") pod \"frr-k8s-x252g\" (UID: \"f609075c-566c-46ad-bdab-f01502b06571\") " pod="metallb-system/frr-k8s-x252g" Feb 02 10:46:31 crc kubenswrapper[4909]: I0202 10:46:31.377148 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a41d250d-abb4-43e0-b2e9-73d610ea3ced-memberlist\") pod \"speaker-6w72v\" (UID: \"a41d250d-abb4-43e0-b2e9-73d610ea3ced\") " pod="metallb-system/speaker-6w72v" Feb 02 10:46:31 crc kubenswrapper[4909]: I0202 10:46:31.377177 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a41d250d-abb4-43e0-b2e9-73d610ea3ced-metrics-certs\") pod \"speaker-6w72v\" (UID: \"a41d250d-abb4-43e0-b2e9-73d610ea3ced\") " pod="metallb-system/speaker-6w72v" Feb 02 10:46:31 crc kubenswrapper[4909]: E0202 10:46:31.377334 4909 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 02 10:46:31 crc kubenswrapper[4909]: E0202 10:46:31.377413 4909 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a41d250d-abb4-43e0-b2e9-73d610ea3ced-memberlist podName:a41d250d-abb4-43e0-b2e9-73d610ea3ced nodeName:}" failed. No retries permitted until 2026-02-02 10:46:32.377396568 +0000 UTC m=+918.123497303 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/a41d250d-abb4-43e0-b2e9-73d610ea3ced-memberlist") pod "speaker-6w72v" (UID: "a41d250d-abb4-43e0-b2e9-73d610ea3ced") : secret "metallb-memberlist" not found Feb 02 10:46:31 crc kubenswrapper[4909]: I0202 10:46:31.383236 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a41d250d-abb4-43e0-b2e9-73d610ea3ced-metrics-certs\") pod \"speaker-6w72v\" (UID: \"a41d250d-abb4-43e0-b2e9-73d610ea3ced\") " pod="metallb-system/speaker-6w72v" Feb 02 10:46:31 crc kubenswrapper[4909]: I0202 10:46:31.383236 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f609075c-566c-46ad-bdab-f01502b06571-metrics-certs\") pod \"frr-k8s-x252g\" (UID: \"f609075c-566c-46ad-bdab-f01502b06571\") " pod="metallb-system/frr-k8s-x252g" Feb 02 10:46:31 crc kubenswrapper[4909]: I0202 10:46:31.385179 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-fhcdh"] Feb 02 10:46:31 crc kubenswrapper[4909]: I0202 10:46:31.537298 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-mjpfz"] Feb 02 10:46:31 crc kubenswrapper[4909]: W0202 10:46:31.540539 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7aa1b8bf_83b9_40e4_81e7_30087a626c01.slice/crio-1540138df4f8fc7f073a55406039b89a853ac212d73d2fda931d4cba81d63915 WatchSource:0}: Error finding container 1540138df4f8fc7f073a55406039b89a853ac212d73d2fda931d4cba81d63915: Status 404 returned error 
can't find the container with id 1540138df4f8fc7f073a55406039b89a853ac212d73d2fda931d4cba81d63915 Feb 02 10:46:31 crc kubenswrapper[4909]: I0202 10:46:31.578618 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-x252g" Feb 02 10:46:31 crc kubenswrapper[4909]: I0202 10:46:31.647063 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-fhcdh" event={"ID":"1b71e887-8283-4799-b19a-333c87d6fcaa","Type":"ContainerStarted","Data":"27e0ccaa50331440c14794280ebbfa0f2bb052a6226127441aee07b73945b528"} Feb 02 10:46:31 crc kubenswrapper[4909]: I0202 10:46:31.648657 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-mjpfz" event={"ID":"7aa1b8bf-83b9-40e4-81e7-30087a626c01","Type":"ContainerStarted","Data":"1540138df4f8fc7f073a55406039b89a853ac212d73d2fda931d4cba81d63915"} Feb 02 10:46:32 crc kubenswrapper[4909]: I0202 10:46:32.400274 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a41d250d-abb4-43e0-b2e9-73d610ea3ced-memberlist\") pod \"speaker-6w72v\" (UID: \"a41d250d-abb4-43e0-b2e9-73d610ea3ced\") " pod="metallb-system/speaker-6w72v" Feb 02 10:46:32 crc kubenswrapper[4909]: I0202 10:46:32.406006 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a41d250d-abb4-43e0-b2e9-73d610ea3ced-memberlist\") pod \"speaker-6w72v\" (UID: \"a41d250d-abb4-43e0-b2e9-73d610ea3ced\") " pod="metallb-system/speaker-6w72v" Feb 02 10:46:32 crc kubenswrapper[4909]: I0202 10:46:32.550512 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-6w72v" Feb 02 10:46:32 crc kubenswrapper[4909]: I0202 10:46:32.670363 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x252g" event={"ID":"f609075c-566c-46ad-bdab-f01502b06571","Type":"ContainerStarted","Data":"d7b300d2010bd0b9c1033351860ea4a6d302c46c1869513950ea93c1e6064b9f"} Feb 02 10:46:32 crc kubenswrapper[4909]: I0202 10:46:32.672276 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-mjpfz" event={"ID":"7aa1b8bf-83b9-40e4-81e7-30087a626c01","Type":"ContainerStarted","Data":"d5ad0c3f1759e7443b0728997c9097e9a7d6f1596f2665e42511b59bc5f3828c"} Feb 02 10:46:32 crc kubenswrapper[4909]: I0202 10:46:32.672310 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-mjpfz" event={"ID":"7aa1b8bf-83b9-40e4-81e7-30087a626c01","Type":"ContainerStarted","Data":"f3e5549d2dddb180d44e09388e728363872a7be4e0ba37d47b3f46beb0ff732d"} Feb 02 10:46:32 crc kubenswrapper[4909]: I0202 10:46:32.673152 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-mjpfz" Feb 02 10:46:32 crc kubenswrapper[4909]: I0202 10:46:32.675608 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-6w72v" event={"ID":"a41d250d-abb4-43e0-b2e9-73d610ea3ced","Type":"ContainerStarted","Data":"512645fd4c5d71ed18d0dc766d3294ca58ad487203c085cf0b1697163a7ff995"} Feb 02 10:46:32 crc kubenswrapper[4909]: I0202 10:46:32.701055 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-mjpfz" podStartSLOduration=2.701032627 podStartE2EDuration="2.701032627s" podCreationTimestamp="2026-02-02 10:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:46:32.698107164 +0000 UTC m=+918.444207899" 
watchObservedRunningTime="2026-02-02 10:46:32.701032627 +0000 UTC m=+918.447133362" Feb 02 10:46:33 crc kubenswrapper[4909]: I0202 10:46:33.683839 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-6w72v" event={"ID":"a41d250d-abb4-43e0-b2e9-73d610ea3ced","Type":"ContainerStarted","Data":"3c07047706bac538812444af0f8ec5cad50cbed7c65d33d052c68384ad476121"} Feb 02 10:46:33 crc kubenswrapper[4909]: I0202 10:46:33.684191 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-6w72v" event={"ID":"a41d250d-abb4-43e0-b2e9-73d610ea3ced","Type":"ContainerStarted","Data":"d143ab3f97bb61f1db66a3fb2b93d71bd1455ccf2e331caefe886fce20ff62c3"} Feb 02 10:46:33 crc kubenswrapper[4909]: I0202 10:46:33.684223 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-6w72v" Feb 02 10:46:33 crc kubenswrapper[4909]: I0202 10:46:33.705166 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-6w72v" podStartSLOduration=3.705147681 podStartE2EDuration="3.705147681s" podCreationTimestamp="2026-02-02 10:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:46:33.702543017 +0000 UTC m=+919.448643752" watchObservedRunningTime="2026-02-02 10:46:33.705147681 +0000 UTC m=+919.451248426" Feb 02 10:46:39 crc kubenswrapper[4909]: I0202 10:46:39.715324 4909 generic.go:334] "Generic (PLEG): container finished" podID="f609075c-566c-46ad-bdab-f01502b06571" containerID="ca553a31d07ee7b2b2468a4be126652917744ca35f1d06153c23cbe3f970ba22" exitCode=0 Feb 02 10:46:39 crc kubenswrapper[4909]: I0202 10:46:39.715426 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x252g" event={"ID":"f609075c-566c-46ad-bdab-f01502b06571","Type":"ContainerDied","Data":"ca553a31d07ee7b2b2468a4be126652917744ca35f1d06153c23cbe3f970ba22"} Feb 02 10:46:39 crc 
kubenswrapper[4909]: I0202 10:46:39.718103 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-fhcdh" event={"ID":"1b71e887-8283-4799-b19a-333c87d6fcaa","Type":"ContainerStarted","Data":"691eb77080bdfd12c7d5308e09f7c458ba6c75f866625161e8d3f83bd575a192"} Feb 02 10:46:39 crc kubenswrapper[4909]: I0202 10:46:39.718243 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-fhcdh" Feb 02 10:46:39 crc kubenswrapper[4909]: I0202 10:46:39.769343 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-fhcdh" podStartSLOduration=1.958914344 podStartE2EDuration="9.769315083s" podCreationTimestamp="2026-02-02 10:46:30 +0000 UTC" firstStartedPulling="2026-02-02 10:46:31.392456916 +0000 UTC m=+917.138557651" lastFinishedPulling="2026-02-02 10:46:39.202857655 +0000 UTC m=+924.948958390" observedRunningTime="2026-02-02 10:46:39.763677682 +0000 UTC m=+925.509778417" watchObservedRunningTime="2026-02-02 10:46:39.769315083 +0000 UTC m=+925.515415818" Feb 02 10:46:40 crc kubenswrapper[4909]: I0202 10:46:40.724702 4909 generic.go:334] "Generic (PLEG): container finished" podID="f609075c-566c-46ad-bdab-f01502b06571" containerID="e2e2fd6e430b117ec84d73469f0976ec9992068c067bd6549abf2be1aa56c0c2" exitCode=0 Feb 02 10:46:40 crc kubenswrapper[4909]: I0202 10:46:40.724800 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x252g" event={"ID":"f609075c-566c-46ad-bdab-f01502b06571","Type":"ContainerDied","Data":"e2e2fd6e430b117ec84d73469f0976ec9992068c067bd6549abf2be1aa56c0c2"} Feb 02 10:46:41 crc kubenswrapper[4909]: I0202 10:46:41.128196 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-mjpfz" Feb 02 10:46:41 crc kubenswrapper[4909]: I0202 10:46:41.732551 4909 generic.go:334] "Generic (PLEG): container 
finished" podID="f609075c-566c-46ad-bdab-f01502b06571" containerID="b299517cec7715764ba5cb604beab780dd3a6def3ee835871f741745b8614400" exitCode=0 Feb 02 10:46:41 crc kubenswrapper[4909]: I0202 10:46:41.732598 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x252g" event={"ID":"f609075c-566c-46ad-bdab-f01502b06571","Type":"ContainerDied","Data":"b299517cec7715764ba5cb604beab780dd3a6def3ee835871f741745b8614400"} Feb 02 10:46:42 crc kubenswrapper[4909]: I0202 10:46:42.553976 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-6w72v" Feb 02 10:46:42 crc kubenswrapper[4909]: I0202 10:46:42.742258 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x252g" event={"ID":"f609075c-566c-46ad-bdab-f01502b06571","Type":"ContainerStarted","Data":"b46b8fff125b9d53d1c2a6914b3cc75698b6bccd6917632c534b74657f3ca20d"} Feb 02 10:46:42 crc kubenswrapper[4909]: I0202 10:46:42.742304 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x252g" event={"ID":"f609075c-566c-46ad-bdab-f01502b06571","Type":"ContainerStarted","Data":"b17a96e71e818f3fe3404eff7e6054f2b731f3c86cce9a11e00bedafa76ec7c2"} Feb 02 10:46:42 crc kubenswrapper[4909]: I0202 10:46:42.742319 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x252g" event={"ID":"f609075c-566c-46ad-bdab-f01502b06571","Type":"ContainerStarted","Data":"501b32fd49451c6ea1e3273c25fe3c5b42015447b5ff2abf397b3765aa4e27cd"} Feb 02 10:46:42 crc kubenswrapper[4909]: I0202 10:46:42.742331 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x252g" event={"ID":"f609075c-566c-46ad-bdab-f01502b06571","Type":"ContainerStarted","Data":"0cf96837a60b006124407a5357641ea1982c8458c2bb0f9759efb822262056e9"} Feb 02 10:46:42 crc kubenswrapper[4909]: I0202 10:46:42.742342 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x252g" 
event={"ID":"f609075c-566c-46ad-bdab-f01502b06571","Type":"ContainerStarted","Data":"31008a2be757151874a14156301608e6948268db1a18fd357b46eec6e293bd8b"} Feb 02 10:46:42 crc kubenswrapper[4909]: I0202 10:46:42.742353 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-x252g" event={"ID":"f609075c-566c-46ad-bdab-f01502b06571","Type":"ContainerStarted","Data":"e728d0e2b1ecc4ed118307c5be3faa929174247889c66e9cccb44577c4b26326"} Feb 02 10:46:42 crc kubenswrapper[4909]: I0202 10:46:42.742440 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-x252g" Feb 02 10:46:42 crc kubenswrapper[4909]: I0202 10:46:42.765159 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-x252g" podStartSLOduration=5.269109954 podStartE2EDuration="12.765143253s" podCreationTimestamp="2026-02-02 10:46:30 +0000 UTC" firstStartedPulling="2026-02-02 10:46:31.684172081 +0000 UTC m=+917.430272806" lastFinishedPulling="2026-02-02 10:46:39.18020537 +0000 UTC m=+924.926306105" observedRunningTime="2026-02-02 10:46:42.764796753 +0000 UTC m=+928.510897488" watchObservedRunningTime="2026-02-02 10:46:42.765143253 +0000 UTC m=+928.511243998" Feb 02 10:46:43 crc kubenswrapper[4909]: I0202 10:46:43.922082 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8"] Feb 02 10:46:43 crc kubenswrapper[4909]: I0202 10:46:43.923522 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8" Feb 02 10:46:43 crc kubenswrapper[4909]: I0202 10:46:43.925669 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 02 10:46:43 crc kubenswrapper[4909]: I0202 10:46:43.933240 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8"] Feb 02 10:46:44 crc kubenswrapper[4909]: I0202 10:46:44.067662 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a54838f1-b5b8-4650-a518-8dbb1753d5c9-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8\" (UID: \"a54838f1-b5b8-4650-a518-8dbb1753d5c9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8" Feb 02 10:46:44 crc kubenswrapper[4909]: I0202 10:46:44.067763 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8bsq\" (UniqueName: \"kubernetes.io/projected/a54838f1-b5b8-4650-a518-8dbb1753d5c9-kube-api-access-c8bsq\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8\" (UID: \"a54838f1-b5b8-4650-a518-8dbb1753d5c9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8" Feb 02 10:46:44 crc kubenswrapper[4909]: I0202 10:46:44.068083 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a54838f1-b5b8-4650-a518-8dbb1753d5c9-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8\" (UID: \"a54838f1-b5b8-4650-a518-8dbb1753d5c9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8" Feb 02 10:46:44 crc kubenswrapper[4909]: 
I0202 10:46:44.169300 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a54838f1-b5b8-4650-a518-8dbb1753d5c9-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8\" (UID: \"a54838f1-b5b8-4650-a518-8dbb1753d5c9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8" Feb 02 10:46:44 crc kubenswrapper[4909]: I0202 10:46:44.169342 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8bsq\" (UniqueName: \"kubernetes.io/projected/a54838f1-b5b8-4650-a518-8dbb1753d5c9-kube-api-access-c8bsq\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8\" (UID: \"a54838f1-b5b8-4650-a518-8dbb1753d5c9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8" Feb 02 10:46:44 crc kubenswrapper[4909]: I0202 10:46:44.169398 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a54838f1-b5b8-4650-a518-8dbb1753d5c9-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8\" (UID: \"a54838f1-b5b8-4650-a518-8dbb1753d5c9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8" Feb 02 10:46:44 crc kubenswrapper[4909]: I0202 10:46:44.170388 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a54838f1-b5b8-4650-a518-8dbb1753d5c9-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8\" (UID: \"a54838f1-b5b8-4650-a518-8dbb1753d5c9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8" Feb 02 10:46:44 crc kubenswrapper[4909]: I0202 10:46:44.170457 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/a54838f1-b5b8-4650-a518-8dbb1753d5c9-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8\" (UID: \"a54838f1-b5b8-4650-a518-8dbb1753d5c9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8" Feb 02 10:46:44 crc kubenswrapper[4909]: I0202 10:46:44.210447 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8bsq\" (UniqueName: \"kubernetes.io/projected/a54838f1-b5b8-4650-a518-8dbb1753d5c9-kube-api-access-c8bsq\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8\" (UID: \"a54838f1-b5b8-4650-a518-8dbb1753d5c9\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8" Feb 02 10:46:44 crc kubenswrapper[4909]: I0202 10:46:44.241976 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8" Feb 02 10:46:44 crc kubenswrapper[4909]: I0202 10:46:44.432259 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8"] Feb 02 10:46:44 crc kubenswrapper[4909]: W0202 10:46:44.451155 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda54838f1_b5b8_4650_a518_8dbb1753d5c9.slice/crio-82b11cd0bb3a9e44890c04b689107ea0dd4850d6c99b5517c1827bad287e76a1 WatchSource:0}: Error finding container 82b11cd0bb3a9e44890c04b689107ea0dd4850d6c99b5517c1827bad287e76a1: Status 404 returned error can't find the container with id 82b11cd0bb3a9e44890c04b689107ea0dd4850d6c99b5517c1827bad287e76a1 Feb 02 10:46:44 crc kubenswrapper[4909]: I0202 10:46:44.760070 4909 generic.go:334] "Generic (PLEG): container finished" podID="a54838f1-b5b8-4650-a518-8dbb1753d5c9" containerID="f956206f095537c546ecc413f816969adbbffe0fbec3321bb455d3216aa00fd3" exitCode=0 
Feb 02 10:46:44 crc kubenswrapper[4909]: I0202 10:46:44.760144 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8" event={"ID":"a54838f1-b5b8-4650-a518-8dbb1753d5c9","Type":"ContainerDied","Data":"f956206f095537c546ecc413f816969adbbffe0fbec3321bb455d3216aa00fd3"} Feb 02 10:46:44 crc kubenswrapper[4909]: I0202 10:46:44.760185 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8" event={"ID":"a54838f1-b5b8-4650-a518-8dbb1753d5c9","Type":"ContainerStarted","Data":"82b11cd0bb3a9e44890c04b689107ea0dd4850d6c99b5517c1827bad287e76a1"} Feb 02 10:46:46 crc kubenswrapper[4909]: I0202 10:46:46.579845 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-x252g" Feb 02 10:46:46 crc kubenswrapper[4909]: I0202 10:46:46.619711 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-x252g" Feb 02 10:46:47 crc kubenswrapper[4909]: I0202 10:46:47.779123 4909 generic.go:334] "Generic (PLEG): container finished" podID="a54838f1-b5b8-4650-a518-8dbb1753d5c9" containerID="982d9c644e7ffd1217cb53dc39be768778e83b2e92bce6d5d5d05ae426dd77ab" exitCode=0 Feb 02 10:46:47 crc kubenswrapper[4909]: I0202 10:46:47.779208 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8" event={"ID":"a54838f1-b5b8-4650-a518-8dbb1753d5c9","Type":"ContainerDied","Data":"982d9c644e7ffd1217cb53dc39be768778e83b2e92bce6d5d5d05ae426dd77ab"} Feb 02 10:46:48 crc kubenswrapper[4909]: I0202 10:46:48.786682 4909 generic.go:334] "Generic (PLEG): container finished" podID="a54838f1-b5b8-4650-a518-8dbb1753d5c9" containerID="e7e9be419c58c3b8edc31d0198b5e9b8afb1688ee02d44617c2348b885290c99" exitCode=0 Feb 02 10:46:48 crc kubenswrapper[4909]: I0202 
10:46:48.786784 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8" event={"ID":"a54838f1-b5b8-4650-a518-8dbb1753d5c9","Type":"ContainerDied","Data":"e7e9be419c58c3b8edc31d0198b5e9b8afb1688ee02d44617c2348b885290c99"} Feb 02 10:46:49 crc kubenswrapper[4909]: I0202 10:46:49.510933 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:46:49 crc kubenswrapper[4909]: I0202 10:46:49.511022 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:46:50 crc kubenswrapper[4909]: I0202 10:46:50.050313 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8" Feb 02 10:46:50 crc kubenswrapper[4909]: I0202 10:46:50.154425 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8bsq\" (UniqueName: \"kubernetes.io/projected/a54838f1-b5b8-4650-a518-8dbb1753d5c9-kube-api-access-c8bsq\") pod \"a54838f1-b5b8-4650-a518-8dbb1753d5c9\" (UID: \"a54838f1-b5b8-4650-a518-8dbb1753d5c9\") " Feb 02 10:46:50 crc kubenswrapper[4909]: I0202 10:46:50.154842 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a54838f1-b5b8-4650-a518-8dbb1753d5c9-util\") pod \"a54838f1-b5b8-4650-a518-8dbb1753d5c9\" (UID: \"a54838f1-b5b8-4650-a518-8dbb1753d5c9\") " Feb 02 10:46:50 crc kubenswrapper[4909]: I0202 10:46:50.154910 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a54838f1-b5b8-4650-a518-8dbb1753d5c9-bundle\") pod \"a54838f1-b5b8-4650-a518-8dbb1753d5c9\" (UID: \"a54838f1-b5b8-4650-a518-8dbb1753d5c9\") " Feb 02 10:46:50 crc kubenswrapper[4909]: I0202 10:46:50.156006 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a54838f1-b5b8-4650-a518-8dbb1753d5c9-bundle" (OuterVolumeSpecName: "bundle") pod "a54838f1-b5b8-4650-a518-8dbb1753d5c9" (UID: "a54838f1-b5b8-4650-a518-8dbb1753d5c9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:46:50 crc kubenswrapper[4909]: I0202 10:46:50.161558 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a54838f1-b5b8-4650-a518-8dbb1753d5c9-kube-api-access-c8bsq" (OuterVolumeSpecName: "kube-api-access-c8bsq") pod "a54838f1-b5b8-4650-a518-8dbb1753d5c9" (UID: "a54838f1-b5b8-4650-a518-8dbb1753d5c9"). InnerVolumeSpecName "kube-api-access-c8bsq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:46:50 crc kubenswrapper[4909]: I0202 10:46:50.165412 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a54838f1-b5b8-4650-a518-8dbb1753d5c9-util" (OuterVolumeSpecName: "util") pod "a54838f1-b5b8-4650-a518-8dbb1753d5c9" (UID: "a54838f1-b5b8-4650-a518-8dbb1753d5c9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:46:50 crc kubenswrapper[4909]: I0202 10:46:50.259177 4909 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a54838f1-b5b8-4650-a518-8dbb1753d5c9-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:46:50 crc kubenswrapper[4909]: I0202 10:46:50.259216 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8bsq\" (UniqueName: \"kubernetes.io/projected/a54838f1-b5b8-4650-a518-8dbb1753d5c9-kube-api-access-c8bsq\") on node \"crc\" DevicePath \"\"" Feb 02 10:46:50 crc kubenswrapper[4909]: I0202 10:46:50.259226 4909 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a54838f1-b5b8-4650-a518-8dbb1753d5c9-util\") on node \"crc\" DevicePath \"\"" Feb 02 10:46:50 crc kubenswrapper[4909]: I0202 10:46:50.801871 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8" event={"ID":"a54838f1-b5b8-4650-a518-8dbb1753d5c9","Type":"ContainerDied","Data":"82b11cd0bb3a9e44890c04b689107ea0dd4850d6c99b5517c1827bad287e76a1"} Feb 02 10:46:50 crc kubenswrapper[4909]: I0202 10:46:50.801914 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82b11cd0bb3a9e44890c04b689107ea0dd4850d6c99b5517c1827bad287e76a1" Feb 02 10:46:50 crc kubenswrapper[4909]: I0202 10:46:50.801930 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8" Feb 02 10:46:50 crc kubenswrapper[4909]: I0202 10:46:50.993031 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-fhcdh" Feb 02 10:46:51 crc kubenswrapper[4909]: I0202 10:46:51.585074 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-x252g" Feb 02 10:46:57 crc kubenswrapper[4909]: I0202 10:46:57.387666 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-m6crw"] Feb 02 10:46:57 crc kubenswrapper[4909]: E0202 10:46:57.389981 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a54838f1-b5b8-4650-a518-8dbb1753d5c9" containerName="extract" Feb 02 10:46:57 crc kubenswrapper[4909]: I0202 10:46:57.390264 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a54838f1-b5b8-4650-a518-8dbb1753d5c9" containerName="extract" Feb 02 10:46:57 crc kubenswrapper[4909]: E0202 10:46:57.390281 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a54838f1-b5b8-4650-a518-8dbb1753d5c9" containerName="pull" Feb 02 10:46:57 crc kubenswrapper[4909]: I0202 10:46:57.390289 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a54838f1-b5b8-4650-a518-8dbb1753d5c9" containerName="pull" Feb 02 10:46:57 crc kubenswrapper[4909]: E0202 10:46:57.390300 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a54838f1-b5b8-4650-a518-8dbb1753d5c9" containerName="util" Feb 02 10:46:57 crc kubenswrapper[4909]: I0202 10:46:57.390308 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a54838f1-b5b8-4650-a518-8dbb1753d5c9" containerName="util" Feb 02 10:46:57 crc kubenswrapper[4909]: I0202 10:46:57.390445 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a54838f1-b5b8-4650-a518-8dbb1753d5c9" 
containerName="extract" Feb 02 10:46:57 crc kubenswrapper[4909]: I0202 10:46:57.391342 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-m6crw" Feb 02 10:46:57 crc kubenswrapper[4909]: I0202 10:46:57.394409 4909 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-nh25x" Feb 02 10:46:57 crc kubenswrapper[4909]: I0202 10:46:57.394412 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Feb 02 10:46:57 crc kubenswrapper[4909]: I0202 10:46:57.394971 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Feb 02 10:46:57 crc kubenswrapper[4909]: I0202 10:46:57.397572 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-m6crw"] Feb 02 10:46:57 crc kubenswrapper[4909]: I0202 10:46:57.551516 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2a7e4827-4680-42ef-a01c-0d440b8627be-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-m6crw\" (UID: \"2a7e4827-4680-42ef-a01c-0d440b8627be\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-m6crw" Feb 02 10:46:57 crc kubenswrapper[4909]: I0202 10:46:57.551608 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv755\" (UniqueName: \"kubernetes.io/projected/2a7e4827-4680-42ef-a01c-0d440b8627be-kube-api-access-bv755\") pod \"cert-manager-operator-controller-manager-66c8bdd694-m6crw\" (UID: \"2a7e4827-4680-42ef-a01c-0d440b8627be\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-m6crw" Feb 02 10:46:57 crc 
kubenswrapper[4909]: I0202 10:46:57.653249 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv755\" (UniqueName: \"kubernetes.io/projected/2a7e4827-4680-42ef-a01c-0d440b8627be-kube-api-access-bv755\") pod \"cert-manager-operator-controller-manager-66c8bdd694-m6crw\" (UID: \"2a7e4827-4680-42ef-a01c-0d440b8627be\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-m6crw" Feb 02 10:46:57 crc kubenswrapper[4909]: I0202 10:46:57.653318 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2a7e4827-4680-42ef-a01c-0d440b8627be-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-m6crw\" (UID: \"2a7e4827-4680-42ef-a01c-0d440b8627be\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-m6crw" Feb 02 10:46:57 crc kubenswrapper[4909]: I0202 10:46:57.653859 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2a7e4827-4680-42ef-a01c-0d440b8627be-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-m6crw\" (UID: \"2a7e4827-4680-42ef-a01c-0d440b8627be\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-m6crw" Feb 02 10:46:57 crc kubenswrapper[4909]: I0202 10:46:57.670630 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv755\" (UniqueName: \"kubernetes.io/projected/2a7e4827-4680-42ef-a01c-0d440b8627be-kube-api-access-bv755\") pod \"cert-manager-operator-controller-manager-66c8bdd694-m6crw\" (UID: \"2a7e4827-4680-42ef-a01c-0d440b8627be\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-m6crw" Feb 02 10:46:57 crc kubenswrapper[4909]: I0202 10:46:57.710259 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-m6crw" Feb 02 10:46:58 crc kubenswrapper[4909]: I0202 10:46:58.237459 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-m6crw"] Feb 02 10:46:58 crc kubenswrapper[4909]: I0202 10:46:58.858179 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-m6crw" event={"ID":"2a7e4827-4680-42ef-a01c-0d440b8627be","Type":"ContainerStarted","Data":"3bc2d33b143420d1a9bb5ec24a9c236df07b202033ebb40c89e5821ccc32c95c"} Feb 02 10:47:00 crc kubenswrapper[4909]: I0202 10:47:00.869824 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-m6crw" event={"ID":"2a7e4827-4680-42ef-a01c-0d440b8627be","Type":"ContainerStarted","Data":"9e999556ca3aa828d2e3d478d68de59228d67e57f600b8f6698fc5a1b64b1ac8"} Feb 02 10:47:00 crc kubenswrapper[4909]: I0202 10:47:00.889051 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-m6crw" podStartSLOduration=1.5146523109999999 podStartE2EDuration="3.88903515s" podCreationTimestamp="2026-02-02 10:46:57 +0000 UTC" firstStartedPulling="2026-02-02 10:46:58.252488276 +0000 UTC m=+943.998589021" lastFinishedPulling="2026-02-02 10:47:00.626871125 +0000 UTC m=+946.372971860" observedRunningTime="2026-02-02 10:47:00.884367037 +0000 UTC m=+946.630467782" watchObservedRunningTime="2026-02-02 10:47:00.88903515 +0000 UTC m=+946.635135885" Feb 02 10:47:05 crc kubenswrapper[4909]: I0202 10:47:05.409534 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-sb4rv"] Feb 02 10:47:05 crc kubenswrapper[4909]: I0202 10:47:05.410958 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-sb4rv" Feb 02 10:47:05 crc kubenswrapper[4909]: I0202 10:47:05.415534 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 02 10:47:05 crc kubenswrapper[4909]: I0202 10:47:05.415920 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 02 10:47:05 crc kubenswrapper[4909]: I0202 10:47:05.416124 4909 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-4tzkk" Feb 02 10:47:05 crc kubenswrapper[4909]: I0202 10:47:05.428468 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-sb4rv"] Feb 02 10:47:05 crc kubenswrapper[4909]: I0202 10:47:05.555034 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4pxl\" (UniqueName: \"kubernetes.io/projected/3a475b79-512c-42c3-ae33-9c4208d55edd-kube-api-access-b4pxl\") pod \"cert-manager-cainjector-5545bd876-sb4rv\" (UID: \"3a475b79-512c-42c3-ae33-9c4208d55edd\") " pod="cert-manager/cert-manager-cainjector-5545bd876-sb4rv" Feb 02 10:47:05 crc kubenswrapper[4909]: I0202 10:47:05.555223 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3a475b79-512c-42c3-ae33-9c4208d55edd-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-sb4rv\" (UID: \"3a475b79-512c-42c3-ae33-9c4208d55edd\") " pod="cert-manager/cert-manager-cainjector-5545bd876-sb4rv" Feb 02 10:47:05 crc kubenswrapper[4909]: I0202 10:47:05.656459 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3a475b79-512c-42c3-ae33-9c4208d55edd-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-sb4rv\" (UID: 
\"3a475b79-512c-42c3-ae33-9c4208d55edd\") " pod="cert-manager/cert-manager-cainjector-5545bd876-sb4rv" Feb 02 10:47:05 crc kubenswrapper[4909]: I0202 10:47:05.656517 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4pxl\" (UniqueName: \"kubernetes.io/projected/3a475b79-512c-42c3-ae33-9c4208d55edd-kube-api-access-b4pxl\") pod \"cert-manager-cainjector-5545bd876-sb4rv\" (UID: \"3a475b79-512c-42c3-ae33-9c4208d55edd\") " pod="cert-manager/cert-manager-cainjector-5545bd876-sb4rv" Feb 02 10:47:05 crc kubenswrapper[4909]: I0202 10:47:05.672560 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4pxl\" (UniqueName: \"kubernetes.io/projected/3a475b79-512c-42c3-ae33-9c4208d55edd-kube-api-access-b4pxl\") pod \"cert-manager-cainjector-5545bd876-sb4rv\" (UID: \"3a475b79-512c-42c3-ae33-9c4208d55edd\") " pod="cert-manager/cert-manager-cainjector-5545bd876-sb4rv" Feb 02 10:47:05 crc kubenswrapper[4909]: I0202 10:47:05.682746 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3a475b79-512c-42c3-ae33-9c4208d55edd-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-sb4rv\" (UID: \"3a475b79-512c-42c3-ae33-9c4208d55edd\") " pod="cert-manager/cert-manager-cainjector-5545bd876-sb4rv" Feb 02 10:47:05 crc kubenswrapper[4909]: I0202 10:47:05.731764 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-sb4rv" Feb 02 10:47:06 crc kubenswrapper[4909]: I0202 10:47:06.203532 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-sb4rv"] Feb 02 10:47:06 crc kubenswrapper[4909]: I0202 10:47:06.615024 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-v24jd"] Feb 02 10:47:06 crc kubenswrapper[4909]: I0202 10:47:06.615799 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-v24jd" Feb 02 10:47:06 crc kubenswrapper[4909]: I0202 10:47:06.617907 4909 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-66sjw" Feb 02 10:47:06 crc kubenswrapper[4909]: I0202 10:47:06.627299 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-v24jd"] Feb 02 10:47:06 crc kubenswrapper[4909]: I0202 10:47:06.773929 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/48108612-44a9-47ff-8d12-a52482a1c24c-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-v24jd\" (UID: \"48108612-44a9-47ff-8d12-a52482a1c24c\") " pod="cert-manager/cert-manager-webhook-6888856db4-v24jd" Feb 02 10:47:06 crc kubenswrapper[4909]: I0202 10:47:06.774012 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmbmw\" (UniqueName: \"kubernetes.io/projected/48108612-44a9-47ff-8d12-a52482a1c24c-kube-api-access-lmbmw\") pod \"cert-manager-webhook-6888856db4-v24jd\" (UID: \"48108612-44a9-47ff-8d12-a52482a1c24c\") " pod="cert-manager/cert-manager-webhook-6888856db4-v24jd" Feb 02 10:47:06 crc kubenswrapper[4909]: I0202 10:47:06.875292 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-lmbmw\" (UniqueName: \"kubernetes.io/projected/48108612-44a9-47ff-8d12-a52482a1c24c-kube-api-access-lmbmw\") pod \"cert-manager-webhook-6888856db4-v24jd\" (UID: \"48108612-44a9-47ff-8d12-a52482a1c24c\") " pod="cert-manager/cert-manager-webhook-6888856db4-v24jd" Feb 02 10:47:06 crc kubenswrapper[4909]: I0202 10:47:06.875394 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/48108612-44a9-47ff-8d12-a52482a1c24c-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-v24jd\" (UID: \"48108612-44a9-47ff-8d12-a52482a1c24c\") " pod="cert-manager/cert-manager-webhook-6888856db4-v24jd" Feb 02 10:47:06 crc kubenswrapper[4909]: I0202 10:47:06.891919 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/48108612-44a9-47ff-8d12-a52482a1c24c-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-v24jd\" (UID: \"48108612-44a9-47ff-8d12-a52482a1c24c\") " pod="cert-manager/cert-manager-webhook-6888856db4-v24jd" Feb 02 10:47:06 crc kubenswrapper[4909]: I0202 10:47:06.897179 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmbmw\" (UniqueName: \"kubernetes.io/projected/48108612-44a9-47ff-8d12-a52482a1c24c-kube-api-access-lmbmw\") pod \"cert-manager-webhook-6888856db4-v24jd\" (UID: \"48108612-44a9-47ff-8d12-a52482a1c24c\") " pod="cert-manager/cert-manager-webhook-6888856db4-v24jd" Feb 02 10:47:06 crc kubenswrapper[4909]: I0202 10:47:06.903103 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-sb4rv" event={"ID":"3a475b79-512c-42c3-ae33-9c4208d55edd","Type":"ContainerStarted","Data":"3b8c7cf1bca1e7e2d590fa246a6962a17496adabc2cd0426f284f7149017d307"} Feb 02 10:47:06 crc kubenswrapper[4909]: I0202 10:47:06.930132 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-v24jd" Feb 02 10:47:07 crc kubenswrapper[4909]: I0202 10:47:07.311440 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-v24jd"] Feb 02 10:47:07 crc kubenswrapper[4909]: W0202 10:47:07.320971 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48108612_44a9_47ff_8d12_a52482a1c24c.slice/crio-4bace60e90f4d685d6904e75bb09ee981ad523abc371cff5ea21f0d7fa6a8d91 WatchSource:0}: Error finding container 4bace60e90f4d685d6904e75bb09ee981ad523abc371cff5ea21f0d7fa6a8d91: Status 404 returned error can't find the container with id 4bace60e90f4d685d6904e75bb09ee981ad523abc371cff5ea21f0d7fa6a8d91 Feb 02 10:47:07 crc kubenswrapper[4909]: I0202 10:47:07.909407 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-v24jd" event={"ID":"48108612-44a9-47ff-8d12-a52482a1c24c","Type":"ContainerStarted","Data":"4bace60e90f4d685d6904e75bb09ee981ad523abc371cff5ea21f0d7fa6a8d91"} Feb 02 10:47:10 crc kubenswrapper[4909]: I0202 10:47:10.937690 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-v24jd" event={"ID":"48108612-44a9-47ff-8d12-a52482a1c24c","Type":"ContainerStarted","Data":"067b82e12e05aed9b812d14882b480fce5d71822b8f535c7a1ec6cb4a7dfc430"} Feb 02 10:47:10 crc kubenswrapper[4909]: I0202 10:47:10.939279 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-v24jd" Feb 02 10:47:10 crc kubenswrapper[4909]: I0202 10:47:10.940064 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-sb4rv" event={"ID":"3a475b79-512c-42c3-ae33-9c4208d55edd","Type":"ContainerStarted","Data":"a74bd4a615fc6f2c825e335f140cf3c5599363f8a95b26cb5b6aba5c81ea9c72"} Feb 02 10:47:10 crc 
kubenswrapper[4909]: I0202 10:47:10.966780 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-v24jd" podStartSLOduration=1.986945969 podStartE2EDuration="4.966762613s" podCreationTimestamp="2026-02-02 10:47:06 +0000 UTC" firstStartedPulling="2026-02-02 10:47:07.324959214 +0000 UTC m=+953.071059949" lastFinishedPulling="2026-02-02 10:47:10.304775858 +0000 UTC m=+956.050876593" observedRunningTime="2026-02-02 10:47:10.965637961 +0000 UTC m=+956.711738696" watchObservedRunningTime="2026-02-02 10:47:10.966762613 +0000 UTC m=+956.712863348" Feb 02 10:47:10 crc kubenswrapper[4909]: I0202 10:47:10.985633 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-sb4rv" podStartSLOduration=1.912751882 podStartE2EDuration="5.985609929s" podCreationTimestamp="2026-02-02 10:47:05 +0000 UTC" firstStartedPulling="2026-02-02 10:47:06.215554276 +0000 UTC m=+951.961655011" lastFinishedPulling="2026-02-02 10:47:10.288412323 +0000 UTC m=+956.034513058" observedRunningTime="2026-02-02 10:47:10.981530223 +0000 UTC m=+956.727630958" watchObservedRunningTime="2026-02-02 10:47:10.985609929 +0000 UTC m=+956.731710664" Feb 02 10:47:16 crc kubenswrapper[4909]: I0202 10:47:16.933375 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-v24jd" Feb 02 10:47:19 crc kubenswrapper[4909]: I0202 10:47:19.511031 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:47:19 crc kubenswrapper[4909]: I0202 10:47:19.511402 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" 
podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:47:19 crc kubenswrapper[4909]: I0202 10:47:19.511445 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 10:47:19 crc kubenswrapper[4909]: I0202 10:47:19.512046 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e0831b6285fe2493141946d0a4e8629f9b6b1551f717985b11e1d8a63f78fa44"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 10:47:19 crc kubenswrapper[4909]: I0202 10:47:19.512092 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://e0831b6285fe2493141946d0a4e8629f9b6b1551f717985b11e1d8a63f78fa44" gracePeriod=600 Feb 02 10:47:19 crc kubenswrapper[4909]: I0202 10:47:19.993947 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="e0831b6285fe2493141946d0a4e8629f9b6b1551f717985b11e1d8a63f78fa44" exitCode=0 Feb 02 10:47:19 crc kubenswrapper[4909]: I0202 10:47:19.994043 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"e0831b6285fe2493141946d0a4e8629f9b6b1551f717985b11e1d8a63f78fa44"} Feb 02 10:47:19 crc kubenswrapper[4909]: I0202 10:47:19.994274 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"8890ae7c7b5156c4d584d7bd5581da3d2b944d91026e2fa8dff7d54c88b8b78c"} Feb 02 10:47:19 crc kubenswrapper[4909]: I0202 10:47:19.994295 4909 scope.go:117] "RemoveContainer" containerID="93779139e6330b1d279baec90b6f5cebca5bcec1fa26d6a2c9986b097b6f7fb9" Feb 02 10:47:22 crc kubenswrapper[4909]: I0202 10:47:22.520093 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-rdsl8"] Feb 02 10:47:22 crc kubenswrapper[4909]: I0202 10:47:22.522007 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-rdsl8" Feb 02 10:47:22 crc kubenswrapper[4909]: I0202 10:47:22.524049 4909 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-nlwc2" Feb 02 10:47:22 crc kubenswrapper[4909]: I0202 10:47:22.527369 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-rdsl8"] Feb 02 10:47:22 crc kubenswrapper[4909]: I0202 10:47:22.681457 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cf5987aa-9823-41a1-ac90-a4d310f6fecb-bound-sa-token\") pod \"cert-manager-545d4d4674-rdsl8\" (UID: \"cf5987aa-9823-41a1-ac90-a4d310f6fecb\") " pod="cert-manager/cert-manager-545d4d4674-rdsl8" Feb 02 10:47:22 crc kubenswrapper[4909]: I0202 10:47:22.681555 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jjsn\" (UniqueName: \"kubernetes.io/projected/cf5987aa-9823-41a1-ac90-a4d310f6fecb-kube-api-access-7jjsn\") pod \"cert-manager-545d4d4674-rdsl8\" (UID: \"cf5987aa-9823-41a1-ac90-a4d310f6fecb\") " pod="cert-manager/cert-manager-545d4d4674-rdsl8" Feb 02 10:47:22 crc kubenswrapper[4909]: I0202 10:47:22.782992 
4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cf5987aa-9823-41a1-ac90-a4d310f6fecb-bound-sa-token\") pod \"cert-manager-545d4d4674-rdsl8\" (UID: \"cf5987aa-9823-41a1-ac90-a4d310f6fecb\") " pod="cert-manager/cert-manager-545d4d4674-rdsl8" Feb 02 10:47:22 crc kubenswrapper[4909]: I0202 10:47:22.783052 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jjsn\" (UniqueName: \"kubernetes.io/projected/cf5987aa-9823-41a1-ac90-a4d310f6fecb-kube-api-access-7jjsn\") pod \"cert-manager-545d4d4674-rdsl8\" (UID: \"cf5987aa-9823-41a1-ac90-a4d310f6fecb\") " pod="cert-manager/cert-manager-545d4d4674-rdsl8" Feb 02 10:47:22 crc kubenswrapper[4909]: I0202 10:47:22.802569 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cf5987aa-9823-41a1-ac90-a4d310f6fecb-bound-sa-token\") pod \"cert-manager-545d4d4674-rdsl8\" (UID: \"cf5987aa-9823-41a1-ac90-a4d310f6fecb\") " pod="cert-manager/cert-manager-545d4d4674-rdsl8" Feb 02 10:47:22 crc kubenswrapper[4909]: I0202 10:47:22.805509 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jjsn\" (UniqueName: \"kubernetes.io/projected/cf5987aa-9823-41a1-ac90-a4d310f6fecb-kube-api-access-7jjsn\") pod \"cert-manager-545d4d4674-rdsl8\" (UID: \"cf5987aa-9823-41a1-ac90-a4d310f6fecb\") " pod="cert-manager/cert-manager-545d4d4674-rdsl8" Feb 02 10:47:22 crc kubenswrapper[4909]: I0202 10:47:22.841083 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-rdsl8" Feb 02 10:47:23 crc kubenswrapper[4909]: I0202 10:47:23.290739 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-rdsl8"] Feb 02 10:47:23 crc kubenswrapper[4909]: W0202 10:47:23.315150 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf5987aa_9823_41a1_ac90_a4d310f6fecb.slice/crio-d9b99de7b455b377e466879a8f7ce0bd4b3cb7990ae7f43ef9623b556f031d01 WatchSource:0}: Error finding container d9b99de7b455b377e466879a8f7ce0bd4b3cb7990ae7f43ef9623b556f031d01: Status 404 returned error can't find the container with id d9b99de7b455b377e466879a8f7ce0bd4b3cb7990ae7f43ef9623b556f031d01 Feb 02 10:47:24 crc kubenswrapper[4909]: I0202 10:47:24.022841 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-rdsl8" event={"ID":"cf5987aa-9823-41a1-ac90-a4d310f6fecb","Type":"ContainerStarted","Data":"d9230be88d50a66b0af24bb91a1a5aae7033e1ebf18104f8d372e3d1271d37bd"} Feb 02 10:47:24 crc kubenswrapper[4909]: I0202 10:47:24.023232 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-rdsl8" event={"ID":"cf5987aa-9823-41a1-ac90-a4d310f6fecb","Type":"ContainerStarted","Data":"d9b99de7b455b377e466879a8f7ce0bd4b3cb7990ae7f43ef9623b556f031d01"} Feb 02 10:47:24 crc kubenswrapper[4909]: I0202 10:47:24.039231 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-rdsl8" podStartSLOduration=2.039213106 podStartE2EDuration="2.039213106s" podCreationTimestamp="2026-02-02 10:47:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:47:24.036499659 +0000 UTC m=+969.782600394" watchObservedRunningTime="2026-02-02 10:47:24.039213106 +0000 UTC m=+969.785313841" Feb 02 
10:47:29 crc kubenswrapper[4909]: I0202 10:47:29.931885 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-v88fb"] Feb 02 10:47:29 crc kubenswrapper[4909]: I0202 10:47:29.933302 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-v88fb" Feb 02 10:47:29 crc kubenswrapper[4909]: I0202 10:47:29.937657 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 02 10:47:29 crc kubenswrapper[4909]: I0202 10:47:29.938933 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-9kcvh" Feb 02 10:47:29 crc kubenswrapper[4909]: I0202 10:47:29.938997 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 02 10:47:29 crc kubenswrapper[4909]: I0202 10:47:29.945525 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-v88fb"] Feb 02 10:47:30 crc kubenswrapper[4909]: I0202 10:47:30.086710 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjlfv\" (UniqueName: \"kubernetes.io/projected/4ee2ee55-73a1-4b90-b1c5-47baab55f8f0-kube-api-access-hjlfv\") pod \"openstack-operator-index-v88fb\" (UID: \"4ee2ee55-73a1-4b90-b1c5-47baab55f8f0\") " pod="openstack-operators/openstack-operator-index-v88fb" Feb 02 10:47:30 crc kubenswrapper[4909]: I0202 10:47:30.188641 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjlfv\" (UniqueName: \"kubernetes.io/projected/4ee2ee55-73a1-4b90-b1c5-47baab55f8f0-kube-api-access-hjlfv\") pod \"openstack-operator-index-v88fb\" (UID: \"4ee2ee55-73a1-4b90-b1c5-47baab55f8f0\") " pod="openstack-operators/openstack-operator-index-v88fb" Feb 02 10:47:30 crc kubenswrapper[4909]: I0202 
10:47:30.208718 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjlfv\" (UniqueName: \"kubernetes.io/projected/4ee2ee55-73a1-4b90-b1c5-47baab55f8f0-kube-api-access-hjlfv\") pod \"openstack-operator-index-v88fb\" (UID: \"4ee2ee55-73a1-4b90-b1c5-47baab55f8f0\") " pod="openstack-operators/openstack-operator-index-v88fb" Feb 02 10:47:30 crc kubenswrapper[4909]: I0202 10:47:30.262592 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-v88fb" Feb 02 10:47:30 crc kubenswrapper[4909]: I0202 10:47:30.716388 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-v88fb"] Feb 02 10:47:30 crc kubenswrapper[4909]: W0202 10:47:30.723020 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ee2ee55_73a1_4b90_b1c5_47baab55f8f0.slice/crio-231035ad8b8d5856ff0127cdac0ac033b21f366f71e9eb1b70918e6674d6cc0b WatchSource:0}: Error finding container 231035ad8b8d5856ff0127cdac0ac033b21f366f71e9eb1b70918e6674d6cc0b: Status 404 returned error can't find the container with id 231035ad8b8d5856ff0127cdac0ac033b21f366f71e9eb1b70918e6674d6cc0b Feb 02 10:47:31 crc kubenswrapper[4909]: I0202 10:47:31.075058 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-v88fb" event={"ID":"4ee2ee55-73a1-4b90-b1c5-47baab55f8f0","Type":"ContainerStarted","Data":"231035ad8b8d5856ff0127cdac0ac033b21f366f71e9eb1b70918e6674d6cc0b"} Feb 02 10:47:32 crc kubenswrapper[4909]: I0202 10:47:32.083549 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-v88fb" event={"ID":"4ee2ee55-73a1-4b90-b1c5-47baab55f8f0","Type":"ContainerStarted","Data":"13d5f5105ba6f0c25687f50e6fcdece38a16c887fb03b481fd2607775cf17c13"} Feb 02 10:47:32 crc kubenswrapper[4909]: I0202 10:47:32.101697 4909 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-v88fb" podStartSLOduration=2.444004525 podStartE2EDuration="3.10166856s" podCreationTimestamp="2026-02-02 10:47:29 +0000 UTC" firstStartedPulling="2026-02-02 10:47:30.725171812 +0000 UTC m=+976.471272547" lastFinishedPulling="2026-02-02 10:47:31.382835847 +0000 UTC m=+977.128936582" observedRunningTime="2026-02-02 10:47:32.095870656 +0000 UTC m=+977.841971421" watchObservedRunningTime="2026-02-02 10:47:32.10166856 +0000 UTC m=+977.847769305" Feb 02 10:47:34 crc kubenswrapper[4909]: I0202 10:47:34.318640 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-v88fb"] Feb 02 10:47:34 crc kubenswrapper[4909]: I0202 10:47:34.321490 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-v88fb" podUID="4ee2ee55-73a1-4b90-b1c5-47baab55f8f0" containerName="registry-server" containerID="cri-o://13d5f5105ba6f0c25687f50e6fcdece38a16c887fb03b481fd2607775cf17c13" gracePeriod=2 Feb 02 10:47:34 crc kubenswrapper[4909]: I0202 10:47:34.928730 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wgm82"] Feb 02 10:47:34 crc kubenswrapper[4909]: I0202 10:47:34.929969 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wgm82" Feb 02 10:47:34 crc kubenswrapper[4909]: I0202 10:47:34.947778 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wgm82"] Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.045481 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-v88fb" Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.051313 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e23727f-2921-45d0-98f9-9a80e6de601a-utilities\") pod \"certified-operators-wgm82\" (UID: \"3e23727f-2921-45d0-98f9-9a80e6de601a\") " pod="openshift-marketplace/certified-operators-wgm82" Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.051393 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e23727f-2921-45d0-98f9-9a80e6de601a-catalog-content\") pod \"certified-operators-wgm82\" (UID: \"3e23727f-2921-45d0-98f9-9a80e6de601a\") " pod="openshift-marketplace/certified-operators-wgm82" Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.051415 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wn6c\" (UniqueName: \"kubernetes.io/projected/3e23727f-2921-45d0-98f9-9a80e6de601a-kube-api-access-8wn6c\") pod \"certified-operators-wgm82\" (UID: \"3e23727f-2921-45d0-98f9-9a80e6de601a\") " pod="openshift-marketplace/certified-operators-wgm82" Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.101687 4909 generic.go:334] "Generic (PLEG): container finished" podID="4ee2ee55-73a1-4b90-b1c5-47baab55f8f0" containerID="13d5f5105ba6f0c25687f50e6fcdece38a16c887fb03b481fd2607775cf17c13" exitCode=0 Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.101726 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-v88fb" event={"ID":"4ee2ee55-73a1-4b90-b1c5-47baab55f8f0","Type":"ContainerDied","Data":"13d5f5105ba6f0c25687f50e6fcdece38a16c887fb03b481fd2607775cf17c13"} Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.101757 4909 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-v88fb" event={"ID":"4ee2ee55-73a1-4b90-b1c5-47baab55f8f0","Type":"ContainerDied","Data":"231035ad8b8d5856ff0127cdac0ac033b21f366f71e9eb1b70918e6674d6cc0b"} Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.101765 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-v88fb" Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.101773 4909 scope.go:117] "RemoveContainer" containerID="13d5f5105ba6f0c25687f50e6fcdece38a16c887fb03b481fd2607775cf17c13" Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.118159 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-27vfj"] Feb 02 10:47:35 crc kubenswrapper[4909]: E0202 10:47:35.118422 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ee2ee55-73a1-4b90-b1c5-47baab55f8f0" containerName="registry-server" Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.118434 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ee2ee55-73a1-4b90-b1c5-47baab55f8f0" containerName="registry-server" Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.118535 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ee2ee55-73a1-4b90-b1c5-47baab55f8f0" containerName="registry-server" Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.118961 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-27vfj" Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.131200 4909 scope.go:117] "RemoveContainer" containerID="13d5f5105ba6f0c25687f50e6fcdece38a16c887fb03b481fd2607775cf17c13" Feb 02 10:47:35 crc kubenswrapper[4909]: E0202 10:47:35.131834 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13d5f5105ba6f0c25687f50e6fcdece38a16c887fb03b481fd2607775cf17c13\": container with ID starting with 13d5f5105ba6f0c25687f50e6fcdece38a16c887fb03b481fd2607775cf17c13 not found: ID does not exist" containerID="13d5f5105ba6f0c25687f50e6fcdece38a16c887fb03b481fd2607775cf17c13" Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.131890 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13d5f5105ba6f0c25687f50e6fcdece38a16c887fb03b481fd2607775cf17c13"} err="failed to get container status \"13d5f5105ba6f0c25687f50e6fcdece38a16c887fb03b481fd2607775cf17c13\": rpc error: code = NotFound desc = could not find container \"13d5f5105ba6f0c25687f50e6fcdece38a16c887fb03b481fd2607775cf17c13\": container with ID starting with 13d5f5105ba6f0c25687f50e6fcdece38a16c887fb03b481fd2607775cf17c13 not found: ID does not exist" Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.132910 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-27vfj"] Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.152873 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjlfv\" (UniqueName: \"kubernetes.io/projected/4ee2ee55-73a1-4b90-b1c5-47baab55f8f0-kube-api-access-hjlfv\") pod \"4ee2ee55-73a1-4b90-b1c5-47baab55f8f0\" (UID: \"4ee2ee55-73a1-4b90-b1c5-47baab55f8f0\") " Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.153366 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e23727f-2921-45d0-98f9-9a80e6de601a-catalog-content\") pod \"certified-operators-wgm82\" (UID: \"3e23727f-2921-45d0-98f9-9a80e6de601a\") " pod="openshift-marketplace/certified-operators-wgm82" Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.153712 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wn6c\" (UniqueName: \"kubernetes.io/projected/3e23727f-2921-45d0-98f9-9a80e6de601a-kube-api-access-8wn6c\") pod \"certified-operators-wgm82\" (UID: \"3e23727f-2921-45d0-98f9-9a80e6de601a\") " pod="openshift-marketplace/certified-operators-wgm82" Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.153775 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e23727f-2921-45d0-98f9-9a80e6de601a-utilities\") pod \"certified-operators-wgm82\" (UID: \"3e23727f-2921-45d0-98f9-9a80e6de601a\") " pod="openshift-marketplace/certified-operators-wgm82" Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.153835 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scj4r\" (UniqueName: \"kubernetes.io/projected/f0df9d40-bcc7-4f3a-8424-a8404d325e7b-kube-api-access-scj4r\") pod \"openstack-operator-index-27vfj\" (UID: \"f0df9d40-bcc7-4f3a-8424-a8404d325e7b\") " pod="openstack-operators/openstack-operator-index-27vfj" Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.153972 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e23727f-2921-45d0-98f9-9a80e6de601a-catalog-content\") pod \"certified-operators-wgm82\" (UID: \"3e23727f-2921-45d0-98f9-9a80e6de601a\") " pod="openshift-marketplace/certified-operators-wgm82" Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.154156 4909 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e23727f-2921-45d0-98f9-9a80e6de601a-utilities\") pod \"certified-operators-wgm82\" (UID: \"3e23727f-2921-45d0-98f9-9a80e6de601a\") " pod="openshift-marketplace/certified-operators-wgm82" Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.169363 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wn6c\" (UniqueName: \"kubernetes.io/projected/3e23727f-2921-45d0-98f9-9a80e6de601a-kube-api-access-8wn6c\") pod \"certified-operators-wgm82\" (UID: \"3e23727f-2921-45d0-98f9-9a80e6de601a\") " pod="openshift-marketplace/certified-operators-wgm82" Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.190509 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ee2ee55-73a1-4b90-b1c5-47baab55f8f0-kube-api-access-hjlfv" (OuterVolumeSpecName: "kube-api-access-hjlfv") pod "4ee2ee55-73a1-4b90-b1c5-47baab55f8f0" (UID: "4ee2ee55-73a1-4b90-b1c5-47baab55f8f0"). InnerVolumeSpecName "kube-api-access-hjlfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.250901 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wgm82" Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.254551 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scj4r\" (UniqueName: \"kubernetes.io/projected/f0df9d40-bcc7-4f3a-8424-a8404d325e7b-kube-api-access-scj4r\") pod \"openstack-operator-index-27vfj\" (UID: \"f0df9d40-bcc7-4f3a-8424-a8404d325e7b\") " pod="openstack-operators/openstack-operator-index-27vfj" Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.254692 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjlfv\" (UniqueName: \"kubernetes.io/projected/4ee2ee55-73a1-4b90-b1c5-47baab55f8f0-kube-api-access-hjlfv\") on node \"crc\" DevicePath \"\"" Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.276537 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scj4r\" (UniqueName: \"kubernetes.io/projected/f0df9d40-bcc7-4f3a-8424-a8404d325e7b-kube-api-access-scj4r\") pod \"openstack-operator-index-27vfj\" (UID: \"f0df9d40-bcc7-4f3a-8424-a8404d325e7b\") " pod="openstack-operators/openstack-operator-index-27vfj" Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.434485 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-27vfj" Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.490173 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-v88fb"] Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.497110 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-v88fb"] Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.507269 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wgm82"] Feb 02 10:47:35 crc kubenswrapper[4909]: I0202 10:47:35.712185 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-27vfj"] Feb 02 10:47:35 crc kubenswrapper[4909]: W0202 10:47:35.782566 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0df9d40_bcc7_4f3a_8424_a8404d325e7b.slice/crio-ff8944c2de4f591aef5a58b19552a8407b6997c7973f1cf04e9215d9b59d7307 WatchSource:0}: Error finding container ff8944c2de4f591aef5a58b19552a8407b6997c7973f1cf04e9215d9b59d7307: Status 404 returned error can't find the container with id ff8944c2de4f591aef5a58b19552a8407b6997c7973f1cf04e9215d9b59d7307 Feb 02 10:47:36 crc kubenswrapper[4909]: I0202 10:47:36.114274 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-27vfj" event={"ID":"f0df9d40-bcc7-4f3a-8424-a8404d325e7b","Type":"ContainerStarted","Data":"ff8944c2de4f591aef5a58b19552a8407b6997c7973f1cf04e9215d9b59d7307"} Feb 02 10:47:36 crc kubenswrapper[4909]: I0202 10:47:36.117062 4909 generic.go:334] "Generic (PLEG): container finished" podID="3e23727f-2921-45d0-98f9-9a80e6de601a" containerID="615ab8fcca306b159a743cf296341c91966563f576f2cc1795d104919d542a33" exitCode=0 Feb 02 10:47:36 crc kubenswrapper[4909]: I0202 10:47:36.117110 4909 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-wgm82" event={"ID":"3e23727f-2921-45d0-98f9-9a80e6de601a","Type":"ContainerDied","Data":"615ab8fcca306b159a743cf296341c91966563f576f2cc1795d104919d542a33"} Feb 02 10:47:36 crc kubenswrapper[4909]: I0202 10:47:36.117138 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wgm82" event={"ID":"3e23727f-2921-45d0-98f9-9a80e6de601a","Type":"ContainerStarted","Data":"e15705ac042022db8e8fff08e9749a58e53abdd8a8ce3b49e39d6df610a168cc"} Feb 02 10:47:37 crc kubenswrapper[4909]: I0202 10:47:37.025065 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ee2ee55-73a1-4b90-b1c5-47baab55f8f0" path="/var/lib/kubelet/pods/4ee2ee55-73a1-4b90-b1c5-47baab55f8f0/volumes" Feb 02 10:47:37 crc kubenswrapper[4909]: I0202 10:47:37.124266 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-27vfj" event={"ID":"f0df9d40-bcc7-4f3a-8424-a8404d325e7b","Type":"ContainerStarted","Data":"e127b4b3fd811b3b0fe8f0b86e1ac76a4b890ab75b861b13e454889e99a3e258"} Feb 02 10:47:37 crc kubenswrapper[4909]: I0202 10:47:37.127127 4909 generic.go:334] "Generic (PLEG): container finished" podID="3e23727f-2921-45d0-98f9-9a80e6de601a" containerID="b2eb60d3315aeb009af6298999413d534226688534842c6efcb9814f0725c56c" exitCode=0 Feb 02 10:47:37 crc kubenswrapper[4909]: I0202 10:47:37.127175 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wgm82" event={"ID":"3e23727f-2921-45d0-98f9-9a80e6de601a","Type":"ContainerDied","Data":"b2eb60d3315aeb009af6298999413d534226688534842c6efcb9814f0725c56c"} Feb 02 10:47:37 crc kubenswrapper[4909]: I0202 10:47:37.138418 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-27vfj" podStartSLOduration=1.758192143 podStartE2EDuration="2.138399226s" podCreationTimestamp="2026-02-02 
10:47:35 +0000 UTC" firstStartedPulling="2026-02-02 10:47:35.786446743 +0000 UTC m=+981.532547478" lastFinishedPulling="2026-02-02 10:47:36.166653806 +0000 UTC m=+981.912754561" observedRunningTime="2026-02-02 10:47:37.136907094 +0000 UTC m=+982.883007829" watchObservedRunningTime="2026-02-02 10:47:37.138399226 +0000 UTC m=+982.884499961" Feb 02 10:47:38 crc kubenswrapper[4909]: I0202 10:47:38.134870 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wgm82" event={"ID":"3e23727f-2921-45d0-98f9-9a80e6de601a","Type":"ContainerStarted","Data":"6955d0e7bcbfd886fd3de3f45ae56f7a3c1fd31861d4960f8848001cc7361437"} Feb 02 10:47:38 crc kubenswrapper[4909]: I0202 10:47:38.156521 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wgm82" podStartSLOduration=2.737989089 podStartE2EDuration="4.156505073s" podCreationTimestamp="2026-02-02 10:47:34 +0000 UTC" firstStartedPulling="2026-02-02 10:47:36.119744863 +0000 UTC m=+981.865845598" lastFinishedPulling="2026-02-02 10:47:37.538260847 +0000 UTC m=+983.284361582" observedRunningTime="2026-02-02 10:47:38.152557491 +0000 UTC m=+983.898658236" watchObservedRunningTime="2026-02-02 10:47:38.156505073 +0000 UTC m=+983.902605808" Feb 02 10:47:45 crc kubenswrapper[4909]: I0202 10:47:45.252044 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wgm82" Feb 02 10:47:45 crc kubenswrapper[4909]: I0202 10:47:45.252992 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wgm82" Feb 02 10:47:45 crc kubenswrapper[4909]: I0202 10:47:45.287075 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wgm82" Feb 02 10:47:45 crc kubenswrapper[4909]: I0202 10:47:45.435136 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-index-27vfj" Feb 02 10:47:45 crc kubenswrapper[4909]: I0202 10:47:45.435344 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-27vfj" Feb 02 10:47:45 crc kubenswrapper[4909]: I0202 10:47:45.463893 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-27vfj" Feb 02 10:47:46 crc kubenswrapper[4909]: I0202 10:47:46.200670 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-27vfj" Feb 02 10:47:46 crc kubenswrapper[4909]: I0202 10:47:46.221437 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wgm82" Feb 02 10:47:47 crc kubenswrapper[4909]: I0202 10:47:47.348957 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4"] Feb 02 10:47:47 crc kubenswrapper[4909]: I0202 10:47:47.351149 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4" Feb 02 10:47:47 crc kubenswrapper[4909]: I0202 10:47:47.353655 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-zfgrh" Feb 02 10:47:47 crc kubenswrapper[4909]: I0202 10:47:47.364114 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4"] Feb 02 10:47:47 crc kubenswrapper[4909]: I0202 10:47:47.505465 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aab415f2-2dde-4804-a951-2a0df278fd86-bundle\") pod \"805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4\" (UID: \"aab415f2-2dde-4804-a951-2a0df278fd86\") " pod="openstack-operators/805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4" Feb 02 10:47:47 crc kubenswrapper[4909]: I0202 10:47:47.505702 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aab415f2-2dde-4804-a951-2a0df278fd86-util\") pod \"805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4\" (UID: \"aab415f2-2dde-4804-a951-2a0df278fd86\") " pod="openstack-operators/805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4" Feb 02 10:47:47 crc kubenswrapper[4909]: I0202 10:47:47.505774 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkzd6\" (UniqueName: \"kubernetes.io/projected/aab415f2-2dde-4804-a951-2a0df278fd86-kube-api-access-wkzd6\") pod \"805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4\" (UID: \"aab415f2-2dde-4804-a951-2a0df278fd86\") " pod="openstack-operators/805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4" Feb 02 10:47:47 crc kubenswrapper[4909]: I0202 
10:47:47.607977 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aab415f2-2dde-4804-a951-2a0df278fd86-util\") pod \"805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4\" (UID: \"aab415f2-2dde-4804-a951-2a0df278fd86\") " pod="openstack-operators/805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4" Feb 02 10:47:47 crc kubenswrapper[4909]: I0202 10:47:47.608056 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkzd6\" (UniqueName: \"kubernetes.io/projected/aab415f2-2dde-4804-a951-2a0df278fd86-kube-api-access-wkzd6\") pod \"805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4\" (UID: \"aab415f2-2dde-4804-a951-2a0df278fd86\") " pod="openstack-operators/805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4" Feb 02 10:47:47 crc kubenswrapper[4909]: I0202 10:47:47.608162 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aab415f2-2dde-4804-a951-2a0df278fd86-bundle\") pod \"805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4\" (UID: \"aab415f2-2dde-4804-a951-2a0df278fd86\") " pod="openstack-operators/805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4" Feb 02 10:47:47 crc kubenswrapper[4909]: I0202 10:47:47.608884 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aab415f2-2dde-4804-a951-2a0df278fd86-util\") pod \"805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4\" (UID: \"aab415f2-2dde-4804-a951-2a0df278fd86\") " pod="openstack-operators/805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4" Feb 02 10:47:47 crc kubenswrapper[4909]: I0202 10:47:47.609079 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/aab415f2-2dde-4804-a951-2a0df278fd86-bundle\") pod \"805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4\" (UID: \"aab415f2-2dde-4804-a951-2a0df278fd86\") " pod="openstack-operators/805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4" Feb 02 10:47:47 crc kubenswrapper[4909]: I0202 10:47:47.634750 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkzd6\" (UniqueName: \"kubernetes.io/projected/aab415f2-2dde-4804-a951-2a0df278fd86-kube-api-access-wkzd6\") pod \"805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4\" (UID: \"aab415f2-2dde-4804-a951-2a0df278fd86\") " pod="openstack-operators/805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4" Feb 02 10:47:47 crc kubenswrapper[4909]: I0202 10:47:47.673094 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4" Feb 02 10:47:47 crc kubenswrapper[4909]: I0202 10:47:47.929991 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4"] Feb 02 10:47:48 crc kubenswrapper[4909]: I0202 10:47:48.197198 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4" event={"ID":"aab415f2-2dde-4804-a951-2a0df278fd86","Type":"ContainerStarted","Data":"dcc61d288612cef8787cd658d7419645fb13a7802f00fa866c347db61453677c"} Feb 02 10:47:48 crc kubenswrapper[4909]: I0202 10:47:48.197253 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4" event={"ID":"aab415f2-2dde-4804-a951-2a0df278fd86","Type":"ContainerStarted","Data":"4b216b43e6ef934ffcffe2477998f4589c6008d54df7779de4f39949b8801799"} Feb 02 10:47:48 crc kubenswrapper[4909]: I0202 10:47:48.908533 4909 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wgm82"] Feb 02 10:47:48 crc kubenswrapper[4909]: I0202 10:47:48.908993 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wgm82" podUID="3e23727f-2921-45d0-98f9-9a80e6de601a" containerName="registry-server" containerID="cri-o://6955d0e7bcbfd886fd3de3f45ae56f7a3c1fd31861d4960f8848001cc7361437" gracePeriod=2 Feb 02 10:47:49 crc kubenswrapper[4909]: I0202 10:47:49.207144 4909 generic.go:334] "Generic (PLEG): container finished" podID="aab415f2-2dde-4804-a951-2a0df278fd86" containerID="dcc61d288612cef8787cd658d7419645fb13a7802f00fa866c347db61453677c" exitCode=0 Feb 02 10:47:49 crc kubenswrapper[4909]: I0202 10:47:49.207298 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4" event={"ID":"aab415f2-2dde-4804-a951-2a0df278fd86","Type":"ContainerDied","Data":"dcc61d288612cef8787cd658d7419645fb13a7802f00fa866c347db61453677c"} Feb 02 10:47:49 crc kubenswrapper[4909]: I0202 10:47:49.215111 4909 generic.go:334] "Generic (PLEG): container finished" podID="3e23727f-2921-45d0-98f9-9a80e6de601a" containerID="6955d0e7bcbfd886fd3de3f45ae56f7a3c1fd31861d4960f8848001cc7361437" exitCode=0 Feb 02 10:47:49 crc kubenswrapper[4909]: I0202 10:47:49.215147 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wgm82" event={"ID":"3e23727f-2921-45d0-98f9-9a80e6de601a","Type":"ContainerDied","Data":"6955d0e7bcbfd886fd3de3f45ae56f7a3c1fd31861d4960f8848001cc7361437"} Feb 02 10:47:49 crc kubenswrapper[4909]: I0202 10:47:49.276937 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wgm82" Feb 02 10:47:49 crc kubenswrapper[4909]: I0202 10:47:49.333806 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e23727f-2921-45d0-98f9-9a80e6de601a-catalog-content\") pod \"3e23727f-2921-45d0-98f9-9a80e6de601a\" (UID: \"3e23727f-2921-45d0-98f9-9a80e6de601a\") " Feb 02 10:47:49 crc kubenswrapper[4909]: I0202 10:47:49.334018 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wn6c\" (UniqueName: \"kubernetes.io/projected/3e23727f-2921-45d0-98f9-9a80e6de601a-kube-api-access-8wn6c\") pod \"3e23727f-2921-45d0-98f9-9a80e6de601a\" (UID: \"3e23727f-2921-45d0-98f9-9a80e6de601a\") " Feb 02 10:47:49 crc kubenswrapper[4909]: I0202 10:47:49.334088 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e23727f-2921-45d0-98f9-9a80e6de601a-utilities\") pod \"3e23727f-2921-45d0-98f9-9a80e6de601a\" (UID: \"3e23727f-2921-45d0-98f9-9a80e6de601a\") " Feb 02 10:47:49 crc kubenswrapper[4909]: I0202 10:47:49.335202 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e23727f-2921-45d0-98f9-9a80e6de601a-utilities" (OuterVolumeSpecName: "utilities") pod "3e23727f-2921-45d0-98f9-9a80e6de601a" (UID: "3e23727f-2921-45d0-98f9-9a80e6de601a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:47:49 crc kubenswrapper[4909]: I0202 10:47:49.343419 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e23727f-2921-45d0-98f9-9a80e6de601a-kube-api-access-8wn6c" (OuterVolumeSpecName: "kube-api-access-8wn6c") pod "3e23727f-2921-45d0-98f9-9a80e6de601a" (UID: "3e23727f-2921-45d0-98f9-9a80e6de601a"). InnerVolumeSpecName "kube-api-access-8wn6c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:47:49 crc kubenswrapper[4909]: I0202 10:47:49.395158 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e23727f-2921-45d0-98f9-9a80e6de601a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3e23727f-2921-45d0-98f9-9a80e6de601a" (UID: "3e23727f-2921-45d0-98f9-9a80e6de601a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:47:49 crc kubenswrapper[4909]: I0202 10:47:49.435055 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e23727f-2921-45d0-98f9-9a80e6de601a-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:47:49 crc kubenswrapper[4909]: I0202 10:47:49.435091 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e23727f-2921-45d0-98f9-9a80e6de601a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:47:49 crc kubenswrapper[4909]: I0202 10:47:49.435104 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wn6c\" (UniqueName: \"kubernetes.io/projected/3e23727f-2921-45d0-98f9-9a80e6de601a-kube-api-access-8wn6c\") on node \"crc\" DevicePath \"\"" Feb 02 10:47:50 crc kubenswrapper[4909]: I0202 10:47:50.223499 4909 generic.go:334] "Generic (PLEG): container finished" podID="aab415f2-2dde-4804-a951-2a0df278fd86" containerID="e1c8bab26379c4c35c9dc4891ba5b4be6b51a7d6811f4dbf75372316f0020ab6" exitCode=0 Feb 02 10:47:50 crc kubenswrapper[4909]: I0202 10:47:50.223544 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4" event={"ID":"aab415f2-2dde-4804-a951-2a0df278fd86","Type":"ContainerDied","Data":"e1c8bab26379c4c35c9dc4891ba5b4be6b51a7d6811f4dbf75372316f0020ab6"} Feb 02 10:47:50 crc kubenswrapper[4909]: I0202 10:47:50.227306 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wgm82" event={"ID":"3e23727f-2921-45d0-98f9-9a80e6de601a","Type":"ContainerDied","Data":"e15705ac042022db8e8fff08e9749a58e53abdd8a8ce3b49e39d6df610a168cc"} Feb 02 10:47:50 crc kubenswrapper[4909]: I0202 10:47:50.227391 4909 scope.go:117] "RemoveContainer" containerID="6955d0e7bcbfd886fd3de3f45ae56f7a3c1fd31861d4960f8848001cc7361437" Feb 02 10:47:50 crc kubenswrapper[4909]: I0202 10:47:50.227405 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wgm82" Feb 02 10:47:50 crc kubenswrapper[4909]: I0202 10:47:50.242556 4909 scope.go:117] "RemoveContainer" containerID="b2eb60d3315aeb009af6298999413d534226688534842c6efcb9814f0725c56c" Feb 02 10:47:50 crc kubenswrapper[4909]: I0202 10:47:50.261010 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wgm82"] Feb 02 10:47:50 crc kubenswrapper[4909]: I0202 10:47:50.266007 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wgm82"] Feb 02 10:47:50 crc kubenswrapper[4909]: I0202 10:47:50.287876 4909 scope.go:117] "RemoveContainer" containerID="615ab8fcca306b159a743cf296341c91966563f576f2cc1795d104919d542a33" Feb 02 10:47:51 crc kubenswrapper[4909]: I0202 10:47:51.025124 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e23727f-2921-45d0-98f9-9a80e6de601a" path="/var/lib/kubelet/pods/3e23727f-2921-45d0-98f9-9a80e6de601a/volumes" Feb 02 10:47:51 crc kubenswrapper[4909]: I0202 10:47:51.235446 4909 generic.go:334] "Generic (PLEG): container finished" podID="aab415f2-2dde-4804-a951-2a0df278fd86" containerID="dee420576cc793b1dd49a96d0803dfb44138458bf995fc645a941235359e0f4e" exitCode=0 Feb 02 10:47:51 crc kubenswrapper[4909]: I0202 10:47:51.235525 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4" event={"ID":"aab415f2-2dde-4804-a951-2a0df278fd86","Type":"ContainerDied","Data":"dee420576cc793b1dd49a96d0803dfb44138458bf995fc645a941235359e0f4e"} Feb 02 10:47:52 crc kubenswrapper[4909]: I0202 10:47:52.576227 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4" Feb 02 10:47:52 crc kubenswrapper[4909]: I0202 10:47:52.775185 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aab415f2-2dde-4804-a951-2a0df278fd86-bundle\") pod \"aab415f2-2dde-4804-a951-2a0df278fd86\" (UID: \"aab415f2-2dde-4804-a951-2a0df278fd86\") " Feb 02 10:47:52 crc kubenswrapper[4909]: I0202 10:47:52.775278 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aab415f2-2dde-4804-a951-2a0df278fd86-util\") pod \"aab415f2-2dde-4804-a951-2a0df278fd86\" (UID: \"aab415f2-2dde-4804-a951-2a0df278fd86\") " Feb 02 10:47:52 crc kubenswrapper[4909]: I0202 10:47:52.775326 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkzd6\" (UniqueName: \"kubernetes.io/projected/aab415f2-2dde-4804-a951-2a0df278fd86-kube-api-access-wkzd6\") pod \"aab415f2-2dde-4804-a951-2a0df278fd86\" (UID: \"aab415f2-2dde-4804-a951-2a0df278fd86\") " Feb 02 10:47:52 crc kubenswrapper[4909]: I0202 10:47:52.776458 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aab415f2-2dde-4804-a951-2a0df278fd86-bundle" (OuterVolumeSpecName: "bundle") pod "aab415f2-2dde-4804-a951-2a0df278fd86" (UID: "aab415f2-2dde-4804-a951-2a0df278fd86"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:47:52 crc kubenswrapper[4909]: I0202 10:47:52.781016 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aab415f2-2dde-4804-a951-2a0df278fd86-kube-api-access-wkzd6" (OuterVolumeSpecName: "kube-api-access-wkzd6") pod "aab415f2-2dde-4804-a951-2a0df278fd86" (UID: "aab415f2-2dde-4804-a951-2a0df278fd86"). InnerVolumeSpecName "kube-api-access-wkzd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:47:52 crc kubenswrapper[4909]: I0202 10:47:52.797883 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aab415f2-2dde-4804-a951-2a0df278fd86-util" (OuterVolumeSpecName: "util") pod "aab415f2-2dde-4804-a951-2a0df278fd86" (UID: "aab415f2-2dde-4804-a951-2a0df278fd86"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:47:52 crc kubenswrapper[4909]: I0202 10:47:52.876241 4909 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aab415f2-2dde-4804-a951-2a0df278fd86-util\") on node \"crc\" DevicePath \"\"" Feb 02 10:47:52 crc kubenswrapper[4909]: I0202 10:47:52.876272 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkzd6\" (UniqueName: \"kubernetes.io/projected/aab415f2-2dde-4804-a951-2a0df278fd86-kube-api-access-wkzd6\") on node \"crc\" DevicePath \"\"" Feb 02 10:47:52 crc kubenswrapper[4909]: I0202 10:47:52.876282 4909 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aab415f2-2dde-4804-a951-2a0df278fd86-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:47:53 crc kubenswrapper[4909]: I0202 10:47:53.250864 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4" 
event={"ID":"aab415f2-2dde-4804-a951-2a0df278fd86","Type":"ContainerDied","Data":"4b216b43e6ef934ffcffe2477998f4589c6008d54df7779de4f39949b8801799"} Feb 02 10:47:53 crc kubenswrapper[4909]: I0202 10:47:53.250940 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4" Feb 02 10:47:53 crc kubenswrapper[4909]: I0202 10:47:53.250947 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b216b43e6ef934ffcffe2477998f4589c6008d54df7779de4f39949b8801799" Feb 02 10:47:56 crc kubenswrapper[4909]: I0202 10:47:56.143534 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6bf6665fd-mnrdq"] Feb 02 10:47:56 crc kubenswrapper[4909]: E0202 10:47:56.144379 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aab415f2-2dde-4804-a951-2a0df278fd86" containerName="util" Feb 02 10:47:56 crc kubenswrapper[4909]: I0202 10:47:56.144395 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="aab415f2-2dde-4804-a951-2a0df278fd86" containerName="util" Feb 02 10:47:56 crc kubenswrapper[4909]: E0202 10:47:56.144407 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aab415f2-2dde-4804-a951-2a0df278fd86" containerName="pull" Feb 02 10:47:56 crc kubenswrapper[4909]: I0202 10:47:56.144414 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="aab415f2-2dde-4804-a951-2a0df278fd86" containerName="pull" Feb 02 10:47:56 crc kubenswrapper[4909]: E0202 10:47:56.144429 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aab415f2-2dde-4804-a951-2a0df278fd86" containerName="extract" Feb 02 10:47:56 crc kubenswrapper[4909]: I0202 10:47:56.144439 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="aab415f2-2dde-4804-a951-2a0df278fd86" containerName="extract" Feb 02 10:47:56 crc kubenswrapper[4909]: E0202 10:47:56.144453 4909 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e23727f-2921-45d0-98f9-9a80e6de601a" containerName="extract-utilities" Feb 02 10:47:56 crc kubenswrapper[4909]: I0202 10:47:56.144462 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e23727f-2921-45d0-98f9-9a80e6de601a" containerName="extract-utilities" Feb 02 10:47:56 crc kubenswrapper[4909]: E0202 10:47:56.144481 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e23727f-2921-45d0-98f9-9a80e6de601a" containerName="registry-server" Feb 02 10:47:56 crc kubenswrapper[4909]: I0202 10:47:56.144490 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e23727f-2921-45d0-98f9-9a80e6de601a" containerName="registry-server" Feb 02 10:47:56 crc kubenswrapper[4909]: E0202 10:47:56.144504 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e23727f-2921-45d0-98f9-9a80e6de601a" containerName="extract-content" Feb 02 10:47:56 crc kubenswrapper[4909]: I0202 10:47:56.144512 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e23727f-2921-45d0-98f9-9a80e6de601a" containerName="extract-content" Feb 02 10:47:56 crc kubenswrapper[4909]: I0202 10:47:56.144644 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="aab415f2-2dde-4804-a951-2a0df278fd86" containerName="extract" Feb 02 10:47:56 crc kubenswrapper[4909]: I0202 10:47:56.144661 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e23727f-2921-45d0-98f9-9a80e6de601a" containerName="registry-server" Feb 02 10:47:56 crc kubenswrapper[4909]: I0202 10:47:56.145317 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6bf6665fd-mnrdq" Feb 02 10:47:56 crc kubenswrapper[4909]: I0202 10:47:56.148405 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-fqqk2" Feb 02 10:47:56 crc kubenswrapper[4909]: I0202 10:47:56.176442 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6bf6665fd-mnrdq"] Feb 02 10:47:56 crc kubenswrapper[4909]: I0202 10:47:56.313361 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5wmp\" (UniqueName: \"kubernetes.io/projected/5718c49e-365b-4d07-8a3f-69cdd9012758-kube-api-access-n5wmp\") pod \"openstack-operator-controller-init-6bf6665fd-mnrdq\" (UID: \"5718c49e-365b-4d07-8a3f-69cdd9012758\") " pod="openstack-operators/openstack-operator-controller-init-6bf6665fd-mnrdq" Feb 02 10:47:56 crc kubenswrapper[4909]: I0202 10:47:56.415035 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5wmp\" (UniqueName: \"kubernetes.io/projected/5718c49e-365b-4d07-8a3f-69cdd9012758-kube-api-access-n5wmp\") pod \"openstack-operator-controller-init-6bf6665fd-mnrdq\" (UID: \"5718c49e-365b-4d07-8a3f-69cdd9012758\") " pod="openstack-operators/openstack-operator-controller-init-6bf6665fd-mnrdq" Feb 02 10:47:56 crc kubenswrapper[4909]: I0202 10:47:56.438955 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5wmp\" (UniqueName: \"kubernetes.io/projected/5718c49e-365b-4d07-8a3f-69cdd9012758-kube-api-access-n5wmp\") pod \"openstack-operator-controller-init-6bf6665fd-mnrdq\" (UID: \"5718c49e-365b-4d07-8a3f-69cdd9012758\") " pod="openstack-operators/openstack-operator-controller-init-6bf6665fd-mnrdq" Feb 02 10:47:56 crc kubenswrapper[4909]: I0202 10:47:56.469947 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6bf6665fd-mnrdq" Feb 02 10:47:56 crc kubenswrapper[4909]: I0202 10:47:56.688217 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6bf6665fd-mnrdq"] Feb 02 10:47:57 crc kubenswrapper[4909]: I0202 10:47:57.273828 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6bf6665fd-mnrdq" event={"ID":"5718c49e-365b-4d07-8a3f-69cdd9012758","Type":"ContainerStarted","Data":"d0abc28d6cdb6fc96f272db70c6f7200aff1212d4d407f961c10ac09a9c87e1b"} Feb 02 10:48:01 crc kubenswrapper[4909]: I0202 10:48:01.296117 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6bf6665fd-mnrdq" event={"ID":"5718c49e-365b-4d07-8a3f-69cdd9012758","Type":"ContainerStarted","Data":"ef3bc840cf843b043b1b2430ebaa159609e5663b7819de4e7810f6eef5d20e2c"} Feb 02 10:48:01 crc kubenswrapper[4909]: I0202 10:48:01.296706 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6bf6665fd-mnrdq" Feb 02 10:48:01 crc kubenswrapper[4909]: I0202 10:48:01.326602 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6bf6665fd-mnrdq" podStartSLOduration=1.4086527819999999 podStartE2EDuration="5.32658592s" podCreationTimestamp="2026-02-02 10:47:56 +0000 UTC" firstStartedPulling="2026-02-02 10:47:56.697858397 +0000 UTC m=+1002.443959132" lastFinishedPulling="2026-02-02 10:48:00.615791535 +0000 UTC m=+1006.361892270" observedRunningTime="2026-02-02 10:48:01.321335581 +0000 UTC m=+1007.067436316" watchObservedRunningTime="2026-02-02 10:48:01.32658592 +0000 UTC m=+1007.072686645" Feb 02 10:48:06 crc kubenswrapper[4909]: I0202 10:48:06.473657 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/openstack-operator-controller-init-6bf6665fd-mnrdq" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.384658 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-dhq2f"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.386184 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-dhq2f" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.388765 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-4ss5j" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.402516 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-dhq2f"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.406953 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llsbv\" (UniqueName: \"kubernetes.io/projected/b3448119-ce0b-44b6-8491-2d2bc7a1352b-kube-api-access-llsbv\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-dhq2f\" (UID: \"b3448119-ce0b-44b6-8491-2d2bc7a1352b\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-dhq2f" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.413178 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-q85jz"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.413923 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-q85jz" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.419445 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-zstmh" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.440835 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-ztcq9"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.442199 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-ztcq9" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.446236 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-kl77j" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.454372 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-q85jz"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.493432 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-ztcq9"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.506912 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-prbwc"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.507929 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-prbwc" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.507994 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d269x\" (UniqueName: \"kubernetes.io/projected/fd689c48-8bd5-4200-9602-2a9c82503585-kube-api-access-d269x\") pod \"cinder-operator-controller-manager-8d874c8fc-q85jz\" (UID: \"fd689c48-8bd5-4200-9602-2a9c82503585\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-q85jz" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.508048 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7lxh\" (UniqueName: \"kubernetes.io/projected/f5680c7a-ca86-40e8-b724-de63f5a24da2-kube-api-access-h7lxh\") pod \"designate-operator-controller-manager-6d9697b7f4-ztcq9\" (UID: \"f5680c7a-ca86-40e8-b724-de63f5a24da2\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-ztcq9" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.508072 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llsbv\" (UniqueName: \"kubernetes.io/projected/b3448119-ce0b-44b6-8491-2d2bc7a1352b-kube-api-access-llsbv\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-dhq2f\" (UID: \"b3448119-ce0b-44b6-8491-2d2bc7a1352b\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-dhq2f" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.512329 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-j8qm6" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.516229 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-prbwc"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.544323 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llsbv\" (UniqueName: \"kubernetes.io/projected/b3448119-ce0b-44b6-8491-2d2bc7a1352b-kube-api-access-llsbv\") pod \"barbican-operator-controller-manager-7b6c4d8c5f-dhq2f\" (UID: \"b3448119-ce0b-44b6-8491-2d2bc7a1352b\") " pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-dhq2f" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.546260 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-kqmfw"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.556946 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-kqmfw" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.560287 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-lmntv" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.561248 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-h65nm"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.562644 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-h65nm" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.569347 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-szkhp" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.605169 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-kqmfw"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.611434 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vbtc\" (UniqueName: \"kubernetes.io/projected/113ec314-f58c-41b3-bf30-8925e5555c77-kube-api-access-2vbtc\") pod \"horizon-operator-controller-manager-5fb775575f-h65nm\" (UID: \"113ec314-f58c-41b3-bf30-8925e5555c77\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-h65nm" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.611473 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p5f2\" (UniqueName: \"kubernetes.io/projected/beeaa5f8-617f-4486-86a0-122ab355e4de-kube-api-access-7p5f2\") pod \"glance-operator-controller-manager-8886f4c47-prbwc\" (UID: \"beeaa5f8-617f-4486-86a0-122ab355e4de\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-prbwc" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.611525 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d269x\" (UniqueName: \"kubernetes.io/projected/fd689c48-8bd5-4200-9602-2a9c82503585-kube-api-access-d269x\") pod \"cinder-operator-controller-manager-8d874c8fc-q85jz\" (UID: \"fd689c48-8bd5-4200-9602-2a9c82503585\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-q85jz" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.611561 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpl98\" (UniqueName: \"kubernetes.io/projected/b2ab0041-b77e-4974-9ca4-7100b40c06e8-kube-api-access-qpl98\") pod \"heat-operator-controller-manager-69d6db494d-kqmfw\" (UID: \"b2ab0041-b77e-4974-9ca4-7100b40c06e8\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-kqmfw" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.611637 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7lxh\" (UniqueName: \"kubernetes.io/projected/f5680c7a-ca86-40e8-b724-de63f5a24da2-kube-api-access-h7lxh\") pod \"designate-operator-controller-manager-6d9697b7f4-ztcq9\" (UID: \"f5680c7a-ca86-40e8-b724-de63f5a24da2\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-ztcq9" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.616840 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-2rl4b"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.617666 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-2rl4b" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.624183 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-d6dqt" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.625988 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.640978 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7lxh\" (UniqueName: \"kubernetes.io/projected/f5680c7a-ca86-40e8-b724-de63f5a24da2-kube-api-access-h7lxh\") pod \"designate-operator-controller-manager-6d9697b7f4-ztcq9\" (UID: \"f5680c7a-ca86-40e8-b724-de63f5a24da2\") " pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-ztcq9" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.641380 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d269x\" (UniqueName: \"kubernetes.io/projected/fd689c48-8bd5-4200-9602-2a9c82503585-kube-api-access-d269x\") pod \"cinder-operator-controller-manager-8d874c8fc-q85jz\" (UID: \"fd689c48-8bd5-4200-9602-2a9c82503585\") " pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-q85jz" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.659013 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-h65nm"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.675633 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-2rl4b"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.695450 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-x9bhz"] Feb 02 
10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.696327 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-x9bhz" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.699937 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-2hsgz" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.707263 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-zfzjz"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.708341 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-zfzjz" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.709418 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-dhq2f" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.712665 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vbtc\" (UniqueName: \"kubernetes.io/projected/113ec314-f58c-41b3-bf30-8925e5555c77-kube-api-access-2vbtc\") pod \"horizon-operator-controller-manager-5fb775575f-h65nm\" (UID: \"113ec314-f58c-41b3-bf30-8925e5555c77\") " pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-h65nm" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.712786 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p5f2\" (UniqueName: \"kubernetes.io/projected/beeaa5f8-617f-4486-86a0-122ab355e4de-kube-api-access-7p5f2\") pod \"glance-operator-controller-manager-8886f4c47-prbwc\" (UID: \"beeaa5f8-617f-4486-86a0-122ab355e4de\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-prbwc" Feb 02 10:48:25 crc 
kubenswrapper[4909]: I0202 10:48:25.714014 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b2f22c4-e784-4742-86fa-aef6d4e31970-cert\") pod \"infra-operator-controller-manager-79955696d6-2rl4b\" (UID: \"2b2f22c4-e784-4742-86fa-aef6d4e31970\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-2rl4b" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.714166 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckh6c\" (UniqueName: \"kubernetes.io/projected/2b2f22c4-e784-4742-86fa-aef6d4e31970-kube-api-access-ckh6c\") pod \"infra-operator-controller-manager-79955696d6-2rl4b\" (UID: \"2b2f22c4-e784-4742-86fa-aef6d4e31970\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-2rl4b" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.714267 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpl98\" (UniqueName: \"kubernetes.io/projected/b2ab0041-b77e-4974-9ca4-7100b40c06e8-kube-api-access-qpl98\") pod \"heat-operator-controller-manager-69d6db494d-kqmfw\" (UID: \"b2ab0041-b77e-4974-9ca4-7100b40c06e8\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-kqmfw" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.714371 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-6q662" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.714478 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ptgm\" (UniqueName: \"kubernetes.io/projected/3a9ea22c-1031-4060-a4c1-7b65710bcb49-kube-api-access-5ptgm\") pod \"keystone-operator-controller-manager-84f48565d4-x9bhz\" (UID: \"3a9ea22c-1031-4060-a4c1-7b65710bcb49\") " 
pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-x9bhz" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.719894 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-k6628"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.720841 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-k6628" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.723393 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-jbvh4" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.732245 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-q85jz" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.740878 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-x9bhz"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.745363 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p5f2\" (UniqueName: \"kubernetes.io/projected/beeaa5f8-617f-4486-86a0-122ab355e4de-kube-api-access-7p5f2\") pod \"glance-operator-controller-manager-8886f4c47-prbwc\" (UID: \"beeaa5f8-617f-4486-86a0-122ab355e4de\") " pod="openstack-operators/glance-operator-controller-manager-8886f4c47-prbwc" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.748484 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vbtc\" (UniqueName: \"kubernetes.io/projected/113ec314-f58c-41b3-bf30-8925e5555c77-kube-api-access-2vbtc\") pod \"horizon-operator-controller-manager-5fb775575f-h65nm\" (UID: \"113ec314-f58c-41b3-bf30-8925e5555c77\") " 
pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-h65nm" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.758924 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpl98\" (UniqueName: \"kubernetes.io/projected/b2ab0041-b77e-4974-9ca4-7100b40c06e8-kube-api-access-qpl98\") pod \"heat-operator-controller-manager-69d6db494d-kqmfw\" (UID: \"b2ab0041-b77e-4974-9ca4-7100b40c06e8\") " pod="openstack-operators/heat-operator-controller-manager-69d6db494d-kqmfw" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.778900 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-zfzjz"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.782974 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-ztcq9" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.794573 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-ppnjd"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.796266 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ppnjd" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.798952 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-5w9xt" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.806621 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-k6628"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.811214 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-ppnjd"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.816296 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-5kc7k"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.817243 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-5kc7k" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.818660 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b2f22c4-e784-4742-86fa-aef6d4e31970-cert\") pod \"infra-operator-controller-manager-79955696d6-2rl4b\" (UID: \"2b2f22c4-e784-4742-86fa-aef6d4e31970\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-2rl4b" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.818707 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlxzg\" (UniqueName: \"kubernetes.io/projected/939074c8-53f8-4574-8868-34d99851993d-kube-api-access-jlxzg\") pod \"ironic-operator-controller-manager-5f4b8bd54d-zfzjz\" (UID: \"939074c8-53f8-4574-8868-34d99851993d\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-zfzjz" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.818743 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckh6c\" (UniqueName: \"kubernetes.io/projected/2b2f22c4-e784-4742-86fa-aef6d4e31970-kube-api-access-ckh6c\") pod \"infra-operator-controller-manager-79955696d6-2rl4b\" (UID: \"2b2f22c4-e784-4742-86fa-aef6d4e31970\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-2rl4b" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.818770 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9n7m\" (UniqueName: \"kubernetes.io/projected/9002663c-c720-4fde-be9c-47008dfd15a6-kube-api-access-d9n7m\") pod \"mariadb-operator-controller-manager-67bf948998-ppnjd\" (UID: \"9002663c-c720-4fde-be9c-47008dfd15a6\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ppnjd" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 
10:48:25.818800 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l68j\" (UniqueName: \"kubernetes.io/projected/eceab652-e90e-4c17-a629-91fbd88492e6-kube-api-access-4l68j\") pod \"manila-operator-controller-manager-7dd968899f-k6628\" (UID: \"eceab652-e90e-4c17-a629-91fbd88492e6\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-k6628"
Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.818850 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ptgm\" (UniqueName: \"kubernetes.io/projected/3a9ea22c-1031-4060-a4c1-7b65710bcb49-kube-api-access-5ptgm\") pod \"keystone-operator-controller-manager-84f48565d4-x9bhz\" (UID: \"3a9ea22c-1031-4060-a4c1-7b65710bcb49\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-x9bhz"
Feb 02 10:48:25 crc kubenswrapper[4909]: E0202 10:48:25.818847 4909 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 02 10:48:25 crc kubenswrapper[4909]: E0202 10:48:25.818933 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b2f22c4-e784-4742-86fa-aef6d4e31970-cert podName:2b2f22c4-e784-4742-86fa-aef6d4e31970 nodeName:}" failed. No retries permitted until 2026-02-02 10:48:26.318914395 +0000 UTC m=+1032.065015130 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2b2f22c4-e784-4742-86fa-aef6d4e31970-cert") pod "infra-operator-controller-manager-79955696d6-2rl4b" (UID: "2b2f22c4-e784-4742-86fa-aef6d4e31970") : secret "infra-operator-webhook-server-cert" not found
Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.823930 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-bhskj"
Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.825239 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-vcrsp"]
Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.825998 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-vcrsp"
Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.828857 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-dqqdw"
Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.833873 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-7bxgj"]
Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.834705 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-7bxgj"
Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.835631 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-prbwc" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.837378 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-vcrsp"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.838864 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-thbh8" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.841564 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckh6c\" (UniqueName: \"kubernetes.io/projected/2b2f22c4-e784-4742-86fa-aef6d4e31970-kube-api-access-ckh6c\") pod \"infra-operator-controller-manager-79955696d6-2rl4b\" (UID: \"2b2f22c4-e784-4742-86fa-aef6d4e31970\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-2rl4b" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.842416 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-5kc7k"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.844494 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ptgm\" (UniqueName: \"kubernetes.io/projected/3a9ea22c-1031-4060-a4c1-7b65710bcb49-kube-api-access-5ptgm\") pod \"keystone-operator-controller-manager-84f48565d4-x9bhz\" (UID: \"3a9ea22c-1031-4060-a4c1-7b65710bcb49\") " pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-x9bhz" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.848518 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-7bxgj"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.894114 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-845mj"] Feb 02 
10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.895516 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-845mj" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.899925 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-845mj"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.903292 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-kqmfw" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.903601 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-qm7xp" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.909737 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.910795 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.916459 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-9v7l6"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.917389 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.917571 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-m87mq" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.919588 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l68j\" (UniqueName: \"kubernetes.io/projected/eceab652-e90e-4c17-a629-91fbd88492e6-kube-api-access-4l68j\") pod \"manila-operator-controller-manager-7dd968899f-k6628\" (UID: \"eceab652-e90e-4c17-a629-91fbd88492e6\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-k6628" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.919627 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12c45a89-12fb-4ce4-aa9a-35931b51407a-cert\") pod \"openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp\" (UID: \"12c45a89-12fb-4ce4-aa9a-35931b51407a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.919682 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv78m\" (UniqueName: \"kubernetes.io/projected/f4fc8502-eb04-4488-aeef-02f233cac870-kube-api-access-jv78m\") pod 
\"nova-operator-controller-manager-55bff696bd-vcrsp\" (UID: \"f4fc8502-eb04-4488-aeef-02f233cac870\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-vcrsp" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.919714 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz57x\" (UniqueName: \"kubernetes.io/projected/4b7813c2-b207-47b4-a1da-099645dc5e7c-kube-api-access-nz57x\") pod \"ovn-operator-controller-manager-788c46999f-845mj\" (UID: \"4b7813c2-b207-47b4-a1da-099645dc5e7c\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-845mj" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.919736 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8hzn\" (UniqueName: \"kubernetes.io/projected/12c45a89-12fb-4ce4-aa9a-35931b51407a-kube-api-access-x8hzn\") pod \"openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp\" (UID: \"12c45a89-12fb-4ce4-aa9a-35931b51407a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.919758 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlxzg\" (UniqueName: \"kubernetes.io/projected/939074c8-53f8-4574-8868-34d99851993d-kube-api-access-jlxzg\") pod \"ironic-operator-controller-manager-5f4b8bd54d-zfzjz\" (UID: \"939074c8-53f8-4574-8868-34d99851993d\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-zfzjz" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.919794 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4k2r\" (UniqueName: \"kubernetes.io/projected/f79c4c2a-6eb9-4883-a6bf-3955b44fad05-kube-api-access-b4k2r\") pod \"octavia-operator-controller-manager-6687f8d877-7bxgj\" (UID: 
\"f79c4c2a-6eb9-4883-a6bf-3955b44fad05\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-7bxgj" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.919829 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9n7m\" (UniqueName: \"kubernetes.io/projected/9002663c-c720-4fde-be9c-47008dfd15a6-kube-api-access-d9n7m\") pod \"mariadb-operator-controller-manager-67bf948998-ppnjd\" (UID: \"9002663c-c720-4fde-be9c-47008dfd15a6\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ppnjd" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.919846 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f88d9\" (UniqueName: \"kubernetes.io/projected/dc7852b6-2562-4a40-8ba7-01764a270e45-kube-api-access-f88d9\") pod \"neutron-operator-controller-manager-585dbc889-5kc7k\" (UID: \"dc7852b6-2562-4a40-8ba7-01764a270e45\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-5kc7k" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.920743 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-h65nm" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.921597 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-9v7l6"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.923130 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-9v7l6" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.926322 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-zbmgb" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.945127 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l68j\" (UniqueName: \"kubernetes.io/projected/eceab652-e90e-4c17-a629-91fbd88492e6-kube-api-access-4l68j\") pod \"manila-operator-controller-manager-7dd968899f-k6628\" (UID: \"eceab652-e90e-4c17-a629-91fbd88492e6\") " pod="openstack-operators/manila-operator-controller-manager-7dd968899f-k6628" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.948325 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlxzg\" (UniqueName: \"kubernetes.io/projected/939074c8-53f8-4574-8868-34d99851993d-kube-api-access-jlxzg\") pod \"ironic-operator-controller-manager-5f4b8bd54d-zfzjz\" (UID: \"939074c8-53f8-4574-8868-34d99851993d\") " pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-zfzjz" Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.948390 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp"] Feb 02 10:48:25 crc kubenswrapper[4909]: I0202 10:48:25.963943 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9n7m\" (UniqueName: \"kubernetes.io/projected/9002663c-c720-4fde-be9c-47008dfd15a6-kube-api-access-d9n7m\") pod \"mariadb-operator-controller-manager-67bf948998-ppnjd\" (UID: \"9002663c-c720-4fde-be9c-47008dfd15a6\") " pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ppnjd" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.013739 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-x9bhz" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.015756 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-r9p5r"] Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.018555 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-r9p5r" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.022372 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jv78m\" (UniqueName: \"kubernetes.io/projected/f4fc8502-eb04-4488-aeef-02f233cac870-kube-api-access-jv78m\") pod \"nova-operator-controller-manager-55bff696bd-vcrsp\" (UID: \"f4fc8502-eb04-4488-aeef-02f233cac870\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-vcrsp" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.022605 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz57x\" (UniqueName: \"kubernetes.io/projected/4b7813c2-b207-47b4-a1da-099645dc5e7c-kube-api-access-nz57x\") pod \"ovn-operator-controller-manager-788c46999f-845mj\" (UID: \"4b7813c2-b207-47b4-a1da-099645dc5e7c\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-845mj" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.022907 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8hzn\" (UniqueName: \"kubernetes.io/projected/12c45a89-12fb-4ce4-aa9a-35931b51407a-kube-api-access-x8hzn\") pod \"openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp\" (UID: \"12c45a89-12fb-4ce4-aa9a-35931b51407a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.023076 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-b4k2r\" (UniqueName: \"kubernetes.io/projected/f79c4c2a-6eb9-4883-a6bf-3955b44fad05-kube-api-access-b4k2r\") pod \"octavia-operator-controller-manager-6687f8d877-7bxgj\" (UID: \"f79c4c2a-6eb9-4883-a6bf-3955b44fad05\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-7bxgj"
Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.023234 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f88d9\" (UniqueName: \"kubernetes.io/projected/dc7852b6-2562-4a40-8ba7-01764a270e45-kube-api-access-f88d9\") pod \"neutron-operator-controller-manager-585dbc889-5kc7k\" (UID: \"dc7852b6-2562-4a40-8ba7-01764a270e45\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-5kc7k"
Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.023350 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12c45a89-12fb-4ce4-aa9a-35931b51407a-cert\") pod \"openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp\" (UID: \"12c45a89-12fb-4ce4-aa9a-35931b51407a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp"
Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.023485 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4xtk\" (UniqueName: \"kubernetes.io/projected/74b20727-6371-480f-aece-9d33cfc2075a-kube-api-access-n4xtk\") pod \"placement-operator-controller-manager-5b964cf4cd-9v7l6\" (UID: \"74b20727-6371-480f-aece-9d33cfc2075a\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-9v7l6"
Feb 02 10:48:26 crc kubenswrapper[4909]: E0202 10:48:26.024550 4909 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 02 10:48:26 crc kubenswrapper[4909]: E0202 10:48:26.024624 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12c45a89-12fb-4ce4-aa9a-35931b51407a-cert podName:12c45a89-12fb-4ce4-aa9a-35931b51407a nodeName:}" failed. No retries permitted until 2026-02-02 10:48:26.524601049 +0000 UTC m=+1032.270701864 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/12c45a89-12fb-4ce4-aa9a-35931b51407a-cert") pod "openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp" (UID: "12c45a89-12fb-4ce4-aa9a-35931b51407a") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.048202 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-k9ld4"
Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.058520 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv78m\" (UniqueName: \"kubernetes.io/projected/f4fc8502-eb04-4488-aeef-02f233cac870-kube-api-access-jv78m\") pod \"nova-operator-controller-manager-55bff696bd-vcrsp\" (UID: \"f4fc8502-eb04-4488-aeef-02f233cac870\") " pod="openstack-operators/nova-operator-controller-manager-55bff696bd-vcrsp"
Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.101900 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-r9p5r"]
Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.101956 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-28qg2"]
Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.113315 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-28qg2" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.126543 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-zfzjz" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.127949 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8hzn\" (UniqueName: \"kubernetes.io/projected/12c45a89-12fb-4ce4-aa9a-35931b51407a-kube-api-access-x8hzn\") pod \"openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp\" (UID: \"12c45a89-12fb-4ce4-aa9a-35931b51407a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.130052 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f88d9\" (UniqueName: \"kubernetes.io/projected/dc7852b6-2562-4a40-8ba7-01764a270e45-kube-api-access-f88d9\") pod \"neutron-operator-controller-manager-585dbc889-5kc7k\" (UID: \"dc7852b6-2562-4a40-8ba7-01764a270e45\") " pod="openstack-operators/neutron-operator-controller-manager-585dbc889-5kc7k" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.132787 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4xtk\" (UniqueName: \"kubernetes.io/projected/74b20727-6371-480f-aece-9d33cfc2075a-kube-api-access-n4xtk\") pod \"placement-operator-controller-manager-5b964cf4cd-9v7l6\" (UID: \"74b20727-6371-480f-aece-9d33cfc2075a\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-9v7l6" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.134517 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-k6628" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.144281 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-sfckc" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.150401 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-28qg2"] Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.151634 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ppnjd" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.158688 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz57x\" (UniqueName: \"kubernetes.io/projected/4b7813c2-b207-47b4-a1da-099645dc5e7c-kube-api-access-nz57x\") pod \"ovn-operator-controller-manager-788c46999f-845mj\" (UID: \"4b7813c2-b207-47b4-a1da-099645dc5e7c\") " pod="openstack-operators/ovn-operator-controller-manager-788c46999f-845mj" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.170150 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-5kc7k" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.173505 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4k2r\" (UniqueName: \"kubernetes.io/projected/f79c4c2a-6eb9-4883-a6bf-3955b44fad05-kube-api-access-b4k2r\") pod \"octavia-operator-controller-manager-6687f8d877-7bxgj\" (UID: \"f79c4c2a-6eb9-4883-a6bf-3955b44fad05\") " pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-7bxgj" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.187576 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4xtk\" (UniqueName: \"kubernetes.io/projected/74b20727-6371-480f-aece-9d33cfc2075a-kube-api-access-n4xtk\") pod \"placement-operator-controller-manager-5b964cf4cd-9v7l6\" (UID: \"74b20727-6371-480f-aece-9d33cfc2075a\") " pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-9v7l6" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.222015 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-vcrsp" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.235427 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l58t9\" (UniqueName: \"kubernetes.io/projected/186cb127-7e1c-4edf-befe-5e8c89d5d819-kube-api-access-l58t9\") pod \"swift-operator-controller-manager-68fc8c869-r9p5r\" (UID: \"186cb127-7e1c-4edf-befe-5e8c89d5d819\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-r9p5r" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.235534 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlprs\" (UniqueName: \"kubernetes.io/projected/621ec5cc-d8f5-4be2-9f9d-3fc7420d7e9c-kube-api-access-qlprs\") pod \"telemetry-operator-controller-manager-64b5b76f97-28qg2\" (UID: \"621ec5cc-d8f5-4be2-9f9d-3fc7420d7e9c\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-28qg2" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.248168 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-zj4v5"] Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.262248 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-7bxgj" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.263386 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-845mj" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.265124 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-zj4v5" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.271193 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-zj4v5"] Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.272111 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-ps9nr" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.336716 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlprs\" (UniqueName: \"kubernetes.io/projected/621ec5cc-d8f5-4be2-9f9d-3fc7420d7e9c-kube-api-access-qlprs\") pod \"telemetry-operator-controller-manager-64b5b76f97-28qg2\" (UID: \"621ec5cc-d8f5-4be2-9f9d-3fc7420d7e9c\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-28qg2" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.336829 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l58t9\" (UniqueName: \"kubernetes.io/projected/186cb127-7e1c-4edf-befe-5e8c89d5d819-kube-api-access-l58t9\") pod \"swift-operator-controller-manager-68fc8c869-r9p5r\" (UID: \"186cb127-7e1c-4edf-befe-5e8c89d5d819\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-r9p5r" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.336865 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b2f22c4-e784-4742-86fa-aef6d4e31970-cert\") pod \"infra-operator-controller-manager-79955696d6-2rl4b\" (UID: \"2b2f22c4-e784-4742-86fa-aef6d4e31970\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-2rl4b" Feb 02 10:48:26 crc kubenswrapper[4909]: E0202 10:48:26.337024 4909 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: 
secret "infra-operator-webhook-server-cert" not found Feb 02 10:48:26 crc kubenswrapper[4909]: E0202 10:48:26.337086 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b2f22c4-e784-4742-86fa-aef6d4e31970-cert podName:2b2f22c4-e784-4742-86fa-aef6d4e31970 nodeName:}" failed. No retries permitted until 2026-02-02 10:48:27.337065017 +0000 UTC m=+1033.083165752 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2b2f22c4-e784-4742-86fa-aef6d4e31970-cert") pod "infra-operator-controller-manager-79955696d6-2rl4b" (UID: "2b2f22c4-e784-4742-86fa-aef6d4e31970") : secret "infra-operator-webhook-server-cert" not found Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.337410 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-9v7l6" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.353457 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-7z4k5"] Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.354344 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-7z4k5" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.365716 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-7z4k5"] Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.386949 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l58t9\" (UniqueName: \"kubernetes.io/projected/186cb127-7e1c-4edf-befe-5e8c89d5d819-kube-api-access-l58t9\") pod \"swift-operator-controller-manager-68fc8c869-r9p5r\" (UID: \"186cb127-7e1c-4edf-befe-5e8c89d5d819\") " pod="openstack-operators/swift-operator-controller-manager-68fc8c869-r9p5r" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.390335 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-dj885" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.401555 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlprs\" (UniqueName: \"kubernetes.io/projected/621ec5cc-d8f5-4be2-9f9d-3fc7420d7e9c-kube-api-access-qlprs\") pod \"telemetry-operator-controller-manager-64b5b76f97-28qg2\" (UID: \"621ec5cc-d8f5-4be2-9f9d-3fc7420d7e9c\") " pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-28qg2" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.431544 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9"] Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.432676 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.435089 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.435281 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.437726 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr2fj\" (UniqueName: \"kubernetes.io/projected/20dbebe4-193c-4657-a36f-45c3ec2c435c-kube-api-access-gr2fj\") pod \"test-operator-controller-manager-56f8bfcd9f-zj4v5\" (UID: \"20dbebe4-193c-4657-a36f-45c3ec2c435c\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-zj4v5" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.437790 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czkzz\" (UniqueName: \"kubernetes.io/projected/2ef2b6a8-5046-4a0c-82fd-b85d0e460c03-kube-api-access-czkzz\") pod \"watcher-operator-controller-manager-564965969-7z4k5\" (UID: \"2ef2b6a8-5046-4a0c-82fd-b85d0e460c03\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-7z4k5" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.447262 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-zwqt8" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.451087 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9"] Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.480420 4909 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mnk6q"] Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.487772 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mnk6q" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.490458 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-5k6dw" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.498875 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mnk6q"] Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.529555 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-r9p5r" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.549529 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr2fj\" (UniqueName: \"kubernetes.io/projected/20dbebe4-193c-4657-a36f-45c3ec2c435c-kube-api-access-gr2fj\") pod \"test-operator-controller-manager-56f8bfcd9f-zj4v5\" (UID: \"20dbebe4-193c-4657-a36f-45c3ec2c435c\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-zj4v5" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.549584 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czkzz\" (UniqueName: \"kubernetes.io/projected/2ef2b6a8-5046-4a0c-82fd-b85d0e460c03-kube-api-access-czkzz\") pod \"watcher-operator-controller-manager-564965969-7z4k5\" (UID: \"2ef2b6a8-5046-4a0c-82fd-b85d0e460c03\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-7z4k5" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.549624 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"webhook-certs\" (UniqueName: \"kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-webhook-certs\") pod \"openstack-operator-controller-manager-646f757d77-z4bz9\" (UID: \"124911b8-e918-4360-8206-e9d72be4448f\") " pod="openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.549642 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r998g\" (UniqueName: \"kubernetes.io/projected/124911b8-e918-4360-8206-e9d72be4448f-kube-api-access-r998g\") pod \"openstack-operator-controller-manager-646f757d77-z4bz9\" (UID: \"124911b8-e918-4360-8206-e9d72be4448f\") " pod="openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.549668 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-metrics-certs\") pod \"openstack-operator-controller-manager-646f757d77-z4bz9\" (UID: \"124911b8-e918-4360-8206-e9d72be4448f\") " pod="openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.549709 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12c45a89-12fb-4ce4-aa9a-35931b51407a-cert\") pod \"openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp\" (UID: \"12c45a89-12fb-4ce4-aa9a-35931b51407a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp" Feb 02 10:48:26 crc kubenswrapper[4909]: E0202 10:48:26.550657 4909 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:48:26 crc kubenswrapper[4909]: E0202 
10:48:26.550693 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12c45a89-12fb-4ce4-aa9a-35931b51407a-cert podName:12c45a89-12fb-4ce4-aa9a-35931b51407a nodeName:}" failed. No retries permitted until 2026-02-02 10:48:27.550680476 +0000 UTC m=+1033.296781211 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/12c45a89-12fb-4ce4-aa9a-35931b51407a-cert") pod "openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp" (UID: "12c45a89-12fb-4ce4-aa9a-35931b51407a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.563294 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-28qg2" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.567484 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-dhq2f"] Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.573993 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czkzz\" (UniqueName: \"kubernetes.io/projected/2ef2b6a8-5046-4a0c-82fd-b85d0e460c03-kube-api-access-czkzz\") pod \"watcher-operator-controller-manager-564965969-7z4k5\" (UID: \"2ef2b6a8-5046-4a0c-82fd-b85d0e460c03\") " pod="openstack-operators/watcher-operator-controller-manager-564965969-7z4k5" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.582004 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d874c8fc-q85jz"] Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.595149 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr2fj\" (UniqueName: \"kubernetes.io/projected/20dbebe4-193c-4657-a36f-45c3ec2c435c-kube-api-access-gr2fj\") pod 
\"test-operator-controller-manager-56f8bfcd9f-zj4v5\" (UID: \"20dbebe4-193c-4657-a36f-45c3ec2c435c\") " pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-zj4v5" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.620890 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-zj4v5" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.651169 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-webhook-certs\") pod \"openstack-operator-controller-manager-646f757d77-z4bz9\" (UID: \"124911b8-e918-4360-8206-e9d72be4448f\") " pod="openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.651751 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r998g\" (UniqueName: \"kubernetes.io/projected/124911b8-e918-4360-8206-e9d72be4448f-kube-api-access-r998g\") pod \"openstack-operator-controller-manager-646f757d77-z4bz9\" (UID: \"124911b8-e918-4360-8206-e9d72be4448f\") " pod="openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.651855 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-metrics-certs\") pod \"openstack-operator-controller-manager-646f757d77-z4bz9\" (UID: \"124911b8-e918-4360-8206-e9d72be4448f\") " pod="openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.651933 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5bcv\" (UniqueName: 
\"kubernetes.io/projected/5d71df35-50ab-41fe-ad0a-cfbc9b06cc71-kube-api-access-j5bcv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-mnk6q\" (UID: \"5d71df35-50ab-41fe-ad0a-cfbc9b06cc71\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mnk6q" Feb 02 10:48:26 crc kubenswrapper[4909]: E0202 10:48:26.651370 4909 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 02 10:48:26 crc kubenswrapper[4909]: E0202 10:48:26.652116 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-webhook-certs podName:124911b8-e918-4360-8206-e9d72be4448f nodeName:}" failed. No retries permitted until 2026-02-02 10:48:27.152096958 +0000 UTC m=+1032.898197693 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-webhook-certs") pod "openstack-operator-controller-manager-646f757d77-z4bz9" (UID: "124911b8-e918-4360-8206-e9d72be4448f") : secret "webhook-server-cert" not found Feb 02 10:48:26 crc kubenswrapper[4909]: E0202 10:48:26.652568 4909 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 02 10:48:26 crc kubenswrapper[4909]: E0202 10:48:26.652622 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-metrics-certs podName:124911b8-e918-4360-8206-e9d72be4448f nodeName:}" failed. No retries permitted until 2026-02-02 10:48:27.152612152 +0000 UTC m=+1032.898712887 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-metrics-certs") pod "openstack-operator-controller-manager-646f757d77-z4bz9" (UID: "124911b8-e918-4360-8206-e9d72be4448f") : secret "metrics-server-cert" not found Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.672721 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r998g\" (UniqueName: \"kubernetes.io/projected/124911b8-e918-4360-8206-e9d72be4448f-kube-api-access-r998g\") pod \"openstack-operator-controller-manager-646f757d77-z4bz9\" (UID: \"124911b8-e918-4360-8206-e9d72be4448f\") " pod="openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.696082 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d9697b7f4-ztcq9"] Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.708126 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-8886f4c47-prbwc"] Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.726197 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-564965969-7z4k5" Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.752935 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5bcv\" (UniqueName: \"kubernetes.io/projected/5d71df35-50ab-41fe-ad0a-cfbc9b06cc71-kube-api-access-j5bcv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-mnk6q\" (UID: \"5d71df35-50ab-41fe-ad0a-cfbc9b06cc71\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mnk6q" Feb 02 10:48:26 crc kubenswrapper[4909]: W0202 10:48:26.776191 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd689c48_8bd5_4200_9602_2a9c82503585.slice/crio-57e43d329dfb2f1cbd99de217fd048c2c59b40434f35af91125acacd0ff9521c WatchSource:0}: Error finding container 57e43d329dfb2f1cbd99de217fd048c2c59b40434f35af91125acacd0ff9521c: Status 404 returned error can't find the container with id 57e43d329dfb2f1cbd99de217fd048c2c59b40434f35af91125acacd0ff9521c Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.791252 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5bcv\" (UniqueName: \"kubernetes.io/projected/5d71df35-50ab-41fe-ad0a-cfbc9b06cc71-kube-api-access-j5bcv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-mnk6q\" (UID: \"5d71df35-50ab-41fe-ad0a-cfbc9b06cc71\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mnk6q" Feb 02 10:48:26 crc kubenswrapper[4909]: W0202 10:48:26.796494 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5680c7a_ca86_40e8_b724_de63f5a24da2.slice/crio-fb84a726638e483a2ac805a58e339b70f4ed0850c032208fb79e82646f4291c0 WatchSource:0}: Error finding container fb84a726638e483a2ac805a58e339b70f4ed0850c032208fb79e82646f4291c0: Status 404 returned error can't find 
the container with id fb84a726638e483a2ac805a58e339b70f4ed0850c032208fb79e82646f4291c0 Feb 02 10:48:26 crc kubenswrapper[4909]: W0202 10:48:26.797058 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbeeaa5f8_617f_4486_86a0_122ab355e4de.slice/crio-a15c0de49ac699242cd242f7680b813b8e95a6917ee4a014bc88dcc56b95ca8c WatchSource:0}: Error finding container a15c0de49ac699242cd242f7680b813b8e95a6917ee4a014bc88dcc56b95ca8c: Status 404 returned error can't find the container with id a15c0de49ac699242cd242f7680b813b8e95a6917ee4a014bc88dcc56b95ca8c Feb 02 10:48:26 crc kubenswrapper[4909]: I0202 10:48:26.834095 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mnk6q" Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.158216 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-webhook-certs\") pod \"openstack-operator-controller-manager-646f757d77-z4bz9\" (UID: \"124911b8-e918-4360-8206-e9d72be4448f\") " pod="openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9" Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.158257 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-metrics-certs\") pod \"openstack-operator-controller-manager-646f757d77-z4bz9\" (UID: \"124911b8-e918-4360-8206-e9d72be4448f\") " pod="openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9" Feb 02 10:48:27 crc kubenswrapper[4909]: E0202 10:48:27.158647 4909 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 02 10:48:27 crc kubenswrapper[4909]: E0202 10:48:27.158775 4909 secret.go:188] 
Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 02 10:48:27 crc kubenswrapper[4909]: E0202 10:48:27.159012 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-webhook-certs podName:124911b8-e918-4360-8206-e9d72be4448f nodeName:}" failed. No retries permitted until 2026-02-02 10:48:28.15899155 +0000 UTC m=+1033.905092355 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-webhook-certs") pod "openstack-operator-controller-manager-646f757d77-z4bz9" (UID: "124911b8-e918-4360-8206-e9d72be4448f") : secret "webhook-server-cert" not found Feb 02 10:48:27 crc kubenswrapper[4909]: E0202 10:48:27.159061 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-metrics-certs podName:124911b8-e918-4360-8206-e9d72be4448f nodeName:}" failed. No retries permitted until 2026-02-02 10:48:28.159046642 +0000 UTC m=+1033.905147377 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-metrics-certs") pod "openstack-operator-controller-manager-646f757d77-z4bz9" (UID: "124911b8-e918-4360-8206-e9d72be4448f") : secret "metrics-server-cert" not found Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.215725 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69d6db494d-kqmfw"] Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.231059 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-zfzjz"] Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.238034 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7dd968899f-k6628"] Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.244341 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-585dbc889-5kc7k"] Feb 02 10:48:27 crc kubenswrapper[4909]: W0202 10:48:27.249666 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeceab652_e90e_4c17_a629_91fbd88492e6.slice/crio-f4ef6bde429eca6950e39c44b964bb5f5a2aac36cc80c93279398c199bd986e6 WatchSource:0}: Error finding container f4ef6bde429eca6950e39c44b964bb5f5a2aac36cc80c93279398c199bd986e6: Status 404 returned error can't find the container with id f4ef6bde429eca6950e39c44b964bb5f5a2aac36cc80c93279398c199bd986e6 Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.263383 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5fb775575f-h65nm"] Feb 02 10:48:27 crc kubenswrapper[4909]: W0202 10:48:27.266786 4909 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a9ea22c_1031_4060_a4c1_7b65710bcb49.slice/crio-1187d71706820f2a081cb5060fb3299990fd2a3303cbf49a38c9b2bfbb6f012c WatchSource:0}: Error finding container 1187d71706820f2a081cb5060fb3299990fd2a3303cbf49a38c9b2bfbb6f012c: Status 404 returned error can't find the container with id 1187d71706820f2a081cb5060fb3299990fd2a3303cbf49a38c9b2bfbb6f012c Feb 02 10:48:27 crc kubenswrapper[4909]: W0202 10:48:27.268801 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod113ec314_f58c_41b3_bf30_8925e5555c77.slice/crio-9b2c907eb84766613bbbaa14b89bdd4e401bdcb7db673cc6bd8bc3b21de7ef00 WatchSource:0}: Error finding container 9b2c907eb84766613bbbaa14b89bdd4e401bdcb7db673cc6bd8bc3b21de7ef00: Status 404 returned error can't find the container with id 9b2c907eb84766613bbbaa14b89bdd4e401bdcb7db673cc6bd8bc3b21de7ef00 Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.270988 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-84f48565d4-x9bhz"] Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.334585 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67bf948998-ppnjd"] Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.353702 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-55bff696bd-vcrsp"] Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.360659 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b2f22c4-e784-4742-86fa-aef6d4e31970-cert\") pod \"infra-operator-controller-manager-79955696d6-2rl4b\" (UID: \"2b2f22c4-e784-4742-86fa-aef6d4e31970\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-2rl4b" Feb 02 10:48:27 crc 
kubenswrapper[4909]: E0202 10:48:27.360850 4909 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 02 10:48:27 crc kubenswrapper[4909]: E0202 10:48:27.360903 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b2f22c4-e784-4742-86fa-aef6d4e31970-cert podName:2b2f22c4-e784-4742-86fa-aef6d4e31970 nodeName:}" failed. No retries permitted until 2026-02-02 10:48:29.360887256 +0000 UTC m=+1035.106987991 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2b2f22c4-e784-4742-86fa-aef6d4e31970-cert") pod "infra-operator-controller-manager-79955696d6-2rl4b" (UID: "2b2f22c4-e784-4742-86fa-aef6d4e31970") : secret "infra-operator-webhook-server-cert" not found Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.362757 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-788c46999f-845mj"] Feb 02 10:48:27 crc kubenswrapper[4909]: W0202 10:48:27.380676 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4fc8502_eb04_4488_aeef_02f233cac870.slice/crio-d985d7584ba7fb8fc65506966aa8954e706f1cc3324c25645c21e77bd46c630b WatchSource:0}: Error finding container d985d7584ba7fb8fc65506966aa8954e706f1cc3324c25645c21e77bd46c630b: Status 404 returned error can't find the container with id d985d7584ba7fb8fc65506966aa8954e706f1cc3324c25645c21e77bd46c630b Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.449689 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-prbwc" event={"ID":"beeaa5f8-617f-4486-86a0-122ab355e4de","Type":"ContainerStarted","Data":"a15c0de49ac699242cd242f7680b813b8e95a6917ee4a014bc88dcc56b95ca8c"} Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.450916 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-kqmfw" event={"ID":"b2ab0041-b77e-4974-9ca4-7100b40c06e8","Type":"ContainerStarted","Data":"c9af342d3dd9222f4dace7d5b485ea888ac577dbdd2b406c62fc0ab7f2cfa4cf"} Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.452859 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-ztcq9" event={"ID":"f5680c7a-ca86-40e8-b724-de63f5a24da2","Type":"ContainerStarted","Data":"fb84a726638e483a2ac805a58e339b70f4ed0850c032208fb79e82646f4291c0"} Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.454491 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-x9bhz" event={"ID":"3a9ea22c-1031-4060-a4c1-7b65710bcb49","Type":"ContainerStarted","Data":"1187d71706820f2a081cb5060fb3299990fd2a3303cbf49a38c9b2bfbb6f012c"} Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.455712 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-845mj" event={"ID":"4b7813c2-b207-47b4-a1da-099645dc5e7c","Type":"ContainerStarted","Data":"f97c58591146282a74d43b171192705bf519c458030f4abcc18281963ed4bc86"} Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.456965 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-vcrsp" event={"ID":"f4fc8502-eb04-4488-aeef-02f233cac870","Type":"ContainerStarted","Data":"d985d7584ba7fb8fc65506966aa8954e706f1cc3324c25645c21e77bd46c630b"} Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.458119 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-h65nm" 
event={"ID":"113ec314-f58c-41b3-bf30-8925e5555c77","Type":"ContainerStarted","Data":"9b2c907eb84766613bbbaa14b89bdd4e401bdcb7db673cc6bd8bc3b21de7ef00"} Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.459372 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-q85jz" event={"ID":"fd689c48-8bd5-4200-9602-2a9c82503585","Type":"ContainerStarted","Data":"57e43d329dfb2f1cbd99de217fd048c2c59b40434f35af91125acacd0ff9521c"} Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.462324 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-dhq2f" event={"ID":"b3448119-ce0b-44b6-8491-2d2bc7a1352b","Type":"ContainerStarted","Data":"3ad3140ef43f19140c3d7255fbef3fde60622dd1370128a799f4c822cdd6beb2"} Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.463769 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ppnjd" event={"ID":"9002663c-c720-4fde-be9c-47008dfd15a6","Type":"ContainerStarted","Data":"235781e6590377151eabeec1761e7627792ab6ef71399727b257e25eb51c30f1"} Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.467136 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-5kc7k" event={"ID":"dc7852b6-2562-4a40-8ba7-01764a270e45","Type":"ContainerStarted","Data":"a5b5a0bdee166995b73116f4bbad3537d769c1e383203a26322f31fceab81e48"} Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.468388 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-k6628" event={"ID":"eceab652-e90e-4c17-a629-91fbd88492e6","Type":"ContainerStarted","Data":"f4ef6bde429eca6950e39c44b964bb5f5a2aac36cc80c93279398c199bd986e6"} Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.469979 4909 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-zfzjz" event={"ID":"939074c8-53f8-4574-8868-34d99851993d","Type":"ContainerStarted","Data":"35f25ce90516c363e1002f19455c9d11ea33e1bdd09efd982bd6b337537fbdfc"} Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.563256 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12c45a89-12fb-4ce4-aa9a-35931b51407a-cert\") pod \"openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp\" (UID: \"12c45a89-12fb-4ce4-aa9a-35931b51407a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp" Feb 02 10:48:27 crc kubenswrapper[4909]: E0202 10:48:27.563487 4909 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:48:27 crc kubenswrapper[4909]: E0202 10:48:27.563566 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12c45a89-12fb-4ce4-aa9a-35931b51407a-cert podName:12c45a89-12fb-4ce4-aa9a-35931b51407a nodeName:}" failed. No retries permitted until 2026-02-02 10:48:29.563547785 +0000 UTC m=+1035.309648520 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/12c45a89-12fb-4ce4-aa9a-35931b51407a-cert") pod "openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp" (UID: "12c45a89-12fb-4ce4-aa9a-35931b51407a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.606748 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mnk6q"] Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.635503 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68fc8c869-r9p5r"] Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.654470 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-564965969-7z4k5"] Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.658716 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b964cf4cd-9v7l6"] Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.673355 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-56f8bfcd9f-zj4v5"] Feb 02 10:48:27 crc kubenswrapper[4909]: E0202 10:48:27.691855 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-czkzz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-564965969-7z4k5_openstack-operators(2ef2b6a8-5046-4a0c-82fd-b85d0e460c03): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 02 10:48:27 crc kubenswrapper[4909]: E0202 10:48:27.692021 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gr2fj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-56f8bfcd9f-zj4v5_openstack-operators(20dbebe4-193c-4657-a36f-45c3ec2c435c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 02 10:48:27 crc kubenswrapper[4909]: E0202 10:48:27.694155 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-zj4v5" podUID="20dbebe4-193c-4657-a36f-45c3ec2c435c" Feb 02 10:48:27 crc 
kubenswrapper[4909]: E0202 10:48:27.694205 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-564965969-7z4k5" podUID="2ef2b6a8-5046-4a0c-82fd-b85d0e460c03" Feb 02 10:48:27 crc kubenswrapper[4909]: W0202 10:48:27.696426 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74b20727_6371_480f_aece_9d33cfc2075a.slice/crio-f8e635676558816c861593f498c647956ce5a41a748f256e0720daa2fd3533b8 WatchSource:0}: Error finding container f8e635676558816c861593f498c647956ce5a41a748f256e0720daa2fd3533b8: Status 404 returned error can't find the container with id f8e635676558816c861593f498c647956ce5a41a748f256e0720daa2fd3533b8 Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.701730 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6687f8d877-7bxgj"] Feb 02 10:48:27 crc kubenswrapper[4909]: E0202 10:48:27.706304 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n4xtk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5b964cf4cd-9v7l6_openstack-operators(74b20727-6371-480f-aece-9d33cfc2075a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 02 10:48:27 crc kubenswrapper[4909]: I0202 10:48:27.706606 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-64b5b76f97-28qg2"] Feb 02 10:48:27 crc kubenswrapper[4909]: E0202 10:48:27.706842 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:f9bf288cd0c13912404027a58ea3b90d4092b641e8265adc5c88644ea7fe901a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qlprs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-64b5b76f97-28qg2_openstack-operators(621ec5cc-d8f5-4be2-9f9d-3fc7420d7e9c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 02 10:48:27 crc kubenswrapper[4909]: E0202 10:48:27.707119 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b4k2r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-6687f8d877-7bxgj_openstack-operators(f79c4c2a-6eb9-4883-a6bf-3955b44fad05): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 02 10:48:27 crc kubenswrapper[4909]: E0202 10:48:27.708248 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-7bxgj" podUID="f79c4c2a-6eb9-4883-a6bf-3955b44fad05" Feb 02 10:48:27 crc 
kubenswrapper[4909]: E0202 10:48:27.708297 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-9v7l6" podUID="74b20727-6371-480f-aece-9d33cfc2075a" Feb 02 10:48:27 crc kubenswrapper[4909]: E0202 10:48:27.708323 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-28qg2" podUID="621ec5cc-d8f5-4be2-9f9d-3fc7420d7e9c" Feb 02 10:48:28 crc kubenswrapper[4909]: I0202 10:48:28.174180 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-metrics-certs\") pod \"openstack-operator-controller-manager-646f757d77-z4bz9\" (UID: \"124911b8-e918-4360-8206-e9d72be4448f\") " pod="openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9" Feb 02 10:48:28 crc kubenswrapper[4909]: E0202 10:48:28.174335 4909 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 02 10:48:28 crc kubenswrapper[4909]: E0202 10:48:28.174394 4909 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 02 10:48:28 crc kubenswrapper[4909]: E0202 10:48:28.174403 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-metrics-certs podName:124911b8-e918-4360-8206-e9d72be4448f nodeName:}" failed. No retries permitted until 2026-02-02 10:48:30.17438927 +0000 UTC m=+1035.920490005 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-metrics-certs") pod "openstack-operator-controller-manager-646f757d77-z4bz9" (UID: "124911b8-e918-4360-8206-e9d72be4448f") : secret "metrics-server-cert" not found Feb 02 10:48:28 crc kubenswrapper[4909]: E0202 10:48:28.174424 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-webhook-certs podName:124911b8-e918-4360-8206-e9d72be4448f nodeName:}" failed. No retries permitted until 2026-02-02 10:48:30.174413421 +0000 UTC m=+1035.920514156 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-webhook-certs") pod "openstack-operator-controller-manager-646f757d77-z4bz9" (UID: "124911b8-e918-4360-8206-e9d72be4448f") : secret "webhook-server-cert" not found Feb 02 10:48:28 crc kubenswrapper[4909]: I0202 10:48:28.174344 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-webhook-certs\") pod \"openstack-operator-controller-manager-646f757d77-z4bz9\" (UID: \"124911b8-e918-4360-8206-e9d72be4448f\") " pod="openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9" Feb 02 10:48:28 crc kubenswrapper[4909]: I0202 10:48:28.481360 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-zj4v5" event={"ID":"20dbebe4-193c-4657-a36f-45c3ec2c435c","Type":"ContainerStarted","Data":"edfb7a467d048561fb3bf9c72af329a87c78d14164c89c4de97dc74cf270a71a"} Feb 02 10:48:28 crc kubenswrapper[4909]: E0202 10:48:28.483418 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241\\\"\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-zj4v5" podUID="20dbebe4-193c-4657-a36f-45c3ec2c435c" Feb 02 10:48:28 crc kubenswrapper[4909]: I0202 10:48:28.486987 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-7bxgj" event={"ID":"f79c4c2a-6eb9-4883-a6bf-3955b44fad05","Type":"ContainerStarted","Data":"b26eb518157e2b7cc2534b9e65d8b0198dd86a6eea35c0a5ade28cc685a14103"} Feb 02 10:48:28 crc kubenswrapper[4909]: E0202 10:48:28.489316 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-7bxgj" podUID="f79c4c2a-6eb9-4883-a6bf-3955b44fad05" Feb 02 10:48:28 crc kubenswrapper[4909]: I0202 10:48:28.492572 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mnk6q" event={"ID":"5d71df35-50ab-41fe-ad0a-cfbc9b06cc71","Type":"ContainerStarted","Data":"52d6208bb935005e2902f288f4ee1507ba0053dae1d2d831fe357f9f2da67b8a"} Feb 02 10:48:28 crc kubenswrapper[4909]: I0202 10:48:28.495772 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-7z4k5" event={"ID":"2ef2b6a8-5046-4a0c-82fd-b85d0e460c03","Type":"ContainerStarted","Data":"58684f3af98891e1a62a6ac39bb22acfa229cfd21da731378d5484031b4a59ab"} Feb 02 10:48:28 crc kubenswrapper[4909]: E0202 10:48:28.499997 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-564965969-7z4k5" podUID="2ef2b6a8-5046-4a0c-82fd-b85d0e460c03" Feb 02 10:48:28 crc kubenswrapper[4909]: I0202 10:48:28.499996 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-9v7l6" event={"ID":"74b20727-6371-480f-aece-9d33cfc2075a","Type":"ContainerStarted","Data":"f8e635676558816c861593f498c647956ce5a41a748f256e0720daa2fd3533b8"} Feb 02 10:48:28 crc kubenswrapper[4909]: I0202 10:48:28.517474 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-r9p5r" event={"ID":"186cb127-7e1c-4edf-befe-5e8c89d5d819","Type":"ContainerStarted","Data":"cc91c23a38dfd2128b24f548d07a342770337b2d513f712e345d0c7a53038120"} Feb 02 10:48:28 crc kubenswrapper[4909]: I0202 10:48:28.528479 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-28qg2" event={"ID":"621ec5cc-d8f5-4be2-9f9d-3fc7420d7e9c","Type":"ContainerStarted","Data":"98d68c8093b8c585e882a355fe55e1b71bd61a7e77e3d52cafee3b2aa8aa8fe6"} Feb 02 10:48:28 crc kubenswrapper[4909]: E0202 10:48:28.531159 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-9v7l6" podUID="74b20727-6371-480f-aece-9d33cfc2075a" Feb 02 10:48:28 crc kubenswrapper[4909]: E0202 10:48:28.533253 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f9bf288cd0c13912404027a58ea3b90d4092b641e8265adc5c88644ea7fe901a\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-28qg2" podUID="621ec5cc-d8f5-4be2-9f9d-3fc7420d7e9c" Feb 02 10:48:29 crc kubenswrapper[4909]: I0202 10:48:29.411976 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b2f22c4-e784-4742-86fa-aef6d4e31970-cert\") pod \"infra-operator-controller-manager-79955696d6-2rl4b\" (UID: \"2b2f22c4-e784-4742-86fa-aef6d4e31970\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-2rl4b" Feb 02 10:48:29 crc kubenswrapper[4909]: E0202 10:48:29.412185 4909 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 02 10:48:29 crc kubenswrapper[4909]: E0202 10:48:29.412292 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b2f22c4-e784-4742-86fa-aef6d4e31970-cert podName:2b2f22c4-e784-4742-86fa-aef6d4e31970 nodeName:}" failed. No retries permitted until 2026-02-02 10:48:33.41226432 +0000 UTC m=+1039.158365115 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2b2f22c4-e784-4742-86fa-aef6d4e31970-cert") pod "infra-operator-controller-manager-79955696d6-2rl4b" (UID: "2b2f22c4-e784-4742-86fa-aef6d4e31970") : secret "infra-operator-webhook-server-cert" not found Feb 02 10:48:29 crc kubenswrapper[4909]: E0202 10:48:29.539765 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:f9bf288cd0c13912404027a58ea3b90d4092b641e8265adc5c88644ea7fe901a\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-28qg2" podUID="621ec5cc-d8f5-4be2-9f9d-3fc7420d7e9c" Feb 02 10:48:29 crc kubenswrapper[4909]: E0202 10:48:29.540978 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e0824d5d461ada59715eb3048ed9394c80abba09c45503f8f90ee3b34e525488\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-9v7l6" podUID="74b20727-6371-480f-aece-9d33cfc2075a" Feb 02 10:48:29 crc kubenswrapper[4909]: E0202 10:48:29.541034 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e6f2f361f1dcbb321407a5884951e16ff96e7b88942b10b548f27ad4de14a0be\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-7bxgj" podUID="f79c4c2a-6eb9-4883-a6bf-3955b44fad05" Feb 02 10:48:29 crc kubenswrapper[4909]: E0202 10:48:29.541065 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/test-operator@sha256:3e01e99d3ca1b6c20b1bb015b00cfcbffc584f22a93dc6fe4019d63b813c0241\\\"\"" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-zj4v5" podUID="20dbebe4-193c-4657-a36f-45c3ec2c435c" Feb 02 10:48:29 crc kubenswrapper[4909]: E0202 10:48:29.541075 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7869203f6f97de780368d507636031090fed3b658d2f7771acbd4481bdfc870b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-564965969-7z4k5" podUID="2ef2b6a8-5046-4a0c-82fd-b85d0e460c03" Feb 02 10:48:29 crc kubenswrapper[4909]: I0202 10:48:29.626719 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12c45a89-12fb-4ce4-aa9a-35931b51407a-cert\") pod \"openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp\" (UID: \"12c45a89-12fb-4ce4-aa9a-35931b51407a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp" Feb 02 10:48:29 crc kubenswrapper[4909]: E0202 10:48:29.626950 4909 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:48:29 crc kubenswrapper[4909]: E0202 10:48:29.627031 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12c45a89-12fb-4ce4-aa9a-35931b51407a-cert podName:12c45a89-12fb-4ce4-aa9a-35931b51407a nodeName:}" failed. No retries permitted until 2026-02-02 10:48:33.627013592 +0000 UTC m=+1039.373114327 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/12c45a89-12fb-4ce4-aa9a-35931b51407a-cert") pod "openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp" (UID: "12c45a89-12fb-4ce4-aa9a-35931b51407a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:48:30 crc kubenswrapper[4909]: I0202 10:48:30.240864 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-webhook-certs\") pod \"openstack-operator-controller-manager-646f757d77-z4bz9\" (UID: \"124911b8-e918-4360-8206-e9d72be4448f\") " pod="openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9" Feb 02 10:48:30 crc kubenswrapper[4909]: I0202 10:48:30.240921 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-metrics-certs\") pod \"openstack-operator-controller-manager-646f757d77-z4bz9\" (UID: \"124911b8-e918-4360-8206-e9d72be4448f\") " pod="openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9" Feb 02 10:48:30 crc kubenswrapper[4909]: E0202 10:48:30.241011 4909 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 02 10:48:30 crc kubenswrapper[4909]: E0202 10:48:30.241075 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-webhook-certs podName:124911b8-e918-4360-8206-e9d72be4448f nodeName:}" failed. No retries permitted until 2026-02-02 10:48:34.241057179 +0000 UTC m=+1039.987157914 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-webhook-certs") pod "openstack-operator-controller-manager-646f757d77-z4bz9" (UID: "124911b8-e918-4360-8206-e9d72be4448f") : secret "webhook-server-cert" not found Feb 02 10:48:30 crc kubenswrapper[4909]: E0202 10:48:30.241185 4909 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 02 10:48:30 crc kubenswrapper[4909]: E0202 10:48:30.241266 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-metrics-certs podName:124911b8-e918-4360-8206-e9d72be4448f nodeName:}" failed. No retries permitted until 2026-02-02 10:48:34.241242124 +0000 UTC m=+1039.987342859 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-metrics-certs") pod "openstack-operator-controller-manager-646f757d77-z4bz9" (UID: "124911b8-e918-4360-8206-e9d72be4448f") : secret "metrics-server-cert" not found Feb 02 10:48:33 crc kubenswrapper[4909]: I0202 10:48:33.486143 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b2f22c4-e784-4742-86fa-aef6d4e31970-cert\") pod \"infra-operator-controller-manager-79955696d6-2rl4b\" (UID: \"2b2f22c4-e784-4742-86fa-aef6d4e31970\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-2rl4b" Feb 02 10:48:33 crc kubenswrapper[4909]: E0202 10:48:33.486649 4909 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 02 10:48:33 crc kubenswrapper[4909]: E0202 10:48:33.486695 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b2f22c4-e784-4742-86fa-aef6d4e31970-cert 
podName:2b2f22c4-e784-4742-86fa-aef6d4e31970 nodeName:}" failed. No retries permitted until 2026-02-02 10:48:41.486681364 +0000 UTC m=+1047.232782099 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2b2f22c4-e784-4742-86fa-aef6d4e31970-cert") pod "infra-operator-controller-manager-79955696d6-2rl4b" (UID: "2b2f22c4-e784-4742-86fa-aef6d4e31970") : secret "infra-operator-webhook-server-cert" not found Feb 02 10:48:33 crc kubenswrapper[4909]: I0202 10:48:33.689537 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12c45a89-12fb-4ce4-aa9a-35931b51407a-cert\") pod \"openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp\" (UID: \"12c45a89-12fb-4ce4-aa9a-35931b51407a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp" Feb 02 10:48:33 crc kubenswrapper[4909]: E0202 10:48:33.689732 4909 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:48:33 crc kubenswrapper[4909]: E0202 10:48:33.689828 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12c45a89-12fb-4ce4-aa9a-35931b51407a-cert podName:12c45a89-12fb-4ce4-aa9a-35931b51407a nodeName:}" failed. No retries permitted until 2026-02-02 10:48:41.689794375 +0000 UTC m=+1047.435895110 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/12c45a89-12fb-4ce4-aa9a-35931b51407a-cert") pod "openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp" (UID: "12c45a89-12fb-4ce4-aa9a-35931b51407a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:48:34 crc kubenswrapper[4909]: I0202 10:48:34.305440 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-metrics-certs\") pod \"openstack-operator-controller-manager-646f757d77-z4bz9\" (UID: \"124911b8-e918-4360-8206-e9d72be4448f\") " pod="openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9" Feb 02 10:48:34 crc kubenswrapper[4909]: I0202 10:48:34.305654 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-webhook-certs\") pod \"openstack-operator-controller-manager-646f757d77-z4bz9\" (UID: \"124911b8-e918-4360-8206-e9d72be4448f\") " pod="openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9" Feb 02 10:48:34 crc kubenswrapper[4909]: E0202 10:48:34.305827 4909 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 02 10:48:34 crc kubenswrapper[4909]: E0202 10:48:34.305900 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-webhook-certs podName:124911b8-e918-4360-8206-e9d72be4448f nodeName:}" failed. No retries permitted until 2026-02-02 10:48:42.3058809 +0000 UTC m=+1048.051981635 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-webhook-certs") pod "openstack-operator-controller-manager-646f757d77-z4bz9" (UID: "124911b8-e918-4360-8206-e9d72be4448f") : secret "webhook-server-cert" not found Feb 02 10:48:34 crc kubenswrapper[4909]: E0202 10:48:34.305931 4909 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 02 10:48:34 crc kubenswrapper[4909]: E0202 10:48:34.306000 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-metrics-certs podName:124911b8-e918-4360-8206-e9d72be4448f nodeName:}" failed. No retries permitted until 2026-02-02 10:48:42.305983223 +0000 UTC m=+1048.052083958 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-metrics-certs") pod "openstack-operator-controller-manager-646f757d77-z4bz9" (UID: "124911b8-e918-4360-8206-e9d72be4448f") : secret "metrics-server-cert" not found Feb 02 10:48:40 crc kubenswrapper[4909]: E0202 10:48:40.599568 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:2d493137559b74e23edb4788b7fbdb38b3e239df0f2d7e6e540e50b2355fc3cf" Feb 02 10:48:40 crc kubenswrapper[4909]: E0202 10:48:40.600282 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:2d493137559b74e23edb4788b7fbdb38b3e239df0f2d7e6e540e50b2355fc3cf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d9n7m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-67bf948998-ppnjd_openstack-operators(9002663c-c720-4fde-be9c-47008dfd15a6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 10:48:40 crc kubenswrapper[4909]: E0202 10:48:40.601471 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ppnjd" podUID="9002663c-c720-4fde-be9c-47008dfd15a6" Feb 02 10:48:40 crc kubenswrapper[4909]: E0202 10:48:40.627980 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:2d493137559b74e23edb4788b7fbdb38b3e239df0f2d7e6e540e50b2355fc3cf\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ppnjd" podUID="9002663c-c720-4fde-be9c-47008dfd15a6" Feb 02 10:48:41 crc kubenswrapper[4909]: E0202 10:48:41.223334 4909 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:27d83ada27cf70cda0c5738f97551d81f1ea4068e83a090f3312e22172d72e10" Feb 02 10:48:41 crc kubenswrapper[4909]: E0202 10:48:41.223525 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:27d83ada27cf70cda0c5738f97551d81f1ea4068e83a090f3312e22172d72e10,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qpl98,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-69d6db494d-kqmfw_openstack-operators(b2ab0041-b77e-4974-9ca4-7100b40c06e8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 10:48:41 crc kubenswrapper[4909]: E0202 10:48:41.224739 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-kqmfw" podUID="b2ab0041-b77e-4974-9ca4-7100b40c06e8" Feb 02 10:48:41 crc kubenswrapper[4909]: I0202 10:48:41.521118 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b2f22c4-e784-4742-86fa-aef6d4e31970-cert\") pod \"infra-operator-controller-manager-79955696d6-2rl4b\" (UID: \"2b2f22c4-e784-4742-86fa-aef6d4e31970\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-2rl4b" Feb 02 10:48:41 crc kubenswrapper[4909]: E0202 10:48:41.521339 4909 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret 
"infra-operator-webhook-server-cert" not found Feb 02 10:48:41 crc kubenswrapper[4909]: E0202 10:48:41.521428 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b2f22c4-e784-4742-86fa-aef6d4e31970-cert podName:2b2f22c4-e784-4742-86fa-aef6d4e31970 nodeName:}" failed. No retries permitted until 2026-02-02 10:48:57.521410049 +0000 UTC m=+1063.267510784 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2b2f22c4-e784-4742-86fa-aef6d4e31970-cert") pod "infra-operator-controller-manager-79955696d6-2rl4b" (UID: "2b2f22c4-e784-4742-86fa-aef6d4e31970") : secret "infra-operator-webhook-server-cert" not found Feb 02 10:48:41 crc kubenswrapper[4909]: E0202 10:48:41.635561 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:27d83ada27cf70cda0c5738f97551d81f1ea4068e83a090f3312e22172d72e10\\\"\"" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-kqmfw" podUID="b2ab0041-b77e-4974-9ca4-7100b40c06e8" Feb 02 10:48:41 crc kubenswrapper[4909]: I0202 10:48:41.723360 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12c45a89-12fb-4ce4-aa9a-35931b51407a-cert\") pod \"openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp\" (UID: \"12c45a89-12fb-4ce4-aa9a-35931b51407a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp" Feb 02 10:48:41 crc kubenswrapper[4909]: E0202 10:48:41.723614 4909 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:48:41 crc kubenswrapper[4909]: E0202 10:48:41.723722 4909 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/12c45a89-12fb-4ce4-aa9a-35931b51407a-cert podName:12c45a89-12fb-4ce4-aa9a-35931b51407a nodeName:}" failed. No retries permitted until 2026-02-02 10:48:57.723700227 +0000 UTC m=+1063.469800962 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/12c45a89-12fb-4ce4-aa9a-35931b51407a-cert") pod "openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp" (UID: "12c45a89-12fb-4ce4-aa9a-35931b51407a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 02 10:48:42 crc kubenswrapper[4909]: I0202 10:48:42.331061 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-webhook-certs\") pod \"openstack-operator-controller-manager-646f757d77-z4bz9\" (UID: \"124911b8-e918-4360-8206-e9d72be4448f\") " pod="openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9" Feb 02 10:48:42 crc kubenswrapper[4909]: I0202 10:48:42.331749 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-metrics-certs\") pod \"openstack-operator-controller-manager-646f757d77-z4bz9\" (UID: \"124911b8-e918-4360-8206-e9d72be4448f\") " pod="openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9" Feb 02 10:48:42 crc kubenswrapper[4909]: E0202 10:48:42.331308 4909 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 02 10:48:42 crc kubenswrapper[4909]: E0202 10:48:42.331883 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-webhook-certs podName:124911b8-e918-4360-8206-e9d72be4448f nodeName:}" failed. 
No retries permitted until 2026-02-02 10:48:58.331841016 +0000 UTC m=+1064.077941791 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-webhook-certs") pod "openstack-operator-controller-manager-646f757d77-z4bz9" (UID: "124911b8-e918-4360-8206-e9d72be4448f") : secret "webhook-server-cert" not found Feb 02 10:48:42 crc kubenswrapper[4909]: E0202 10:48:42.331950 4909 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 02 10:48:42 crc kubenswrapper[4909]: E0202 10:48:42.332054 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-metrics-certs podName:124911b8-e918-4360-8206-e9d72be4448f nodeName:}" failed. No retries permitted until 2026-02-02 10:48:58.332027071 +0000 UTC m=+1064.078127856 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-metrics-certs") pod "openstack-operator-controller-manager-646f757d77-z4bz9" (UID: "124911b8-e918-4360-8206-e9d72be4448f") : secret "metrics-server-cert" not found Feb 02 10:48:43 crc kubenswrapper[4909]: E0202 10:48:43.183156 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Feb 02 10:48:43 crc kubenswrapper[4909]: E0202 10:48:43.183328 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j5bcv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-mnk6q_openstack-operators(5d71df35-50ab-41fe-ad0a-cfbc9b06cc71): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 10:48:43 crc kubenswrapper[4909]: E0202 10:48:43.185146 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mnk6q" podUID="5d71df35-50ab-41fe-ad0a-cfbc9b06cc71" Feb 02 10:48:43 crc kubenswrapper[4909]: I0202 10:48:43.655362 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-x9bhz" event={"ID":"3a9ea22c-1031-4060-a4c1-7b65710bcb49","Type":"ContainerStarted","Data":"5295242db16af88aa9dc0321eac83d0b615c391e5dcc9ae08a5c858f33e155cf"} Feb 02 10:48:43 crc kubenswrapper[4909]: I0202 10:48:43.655728 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-x9bhz" Feb 02 10:48:43 crc kubenswrapper[4909]: I0202 10:48:43.658359 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-845mj" event={"ID":"4b7813c2-b207-47b4-a1da-099645dc5e7c","Type":"ContainerStarted","Data":"aedf2e02be92ca8d046f21940ce198e0c95cd888228aa18c14d1f9bebbdb755d"} Feb 02 10:48:43 crc kubenswrapper[4909]: I0202 10:48:43.658457 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-845mj" Feb 02 10:48:43 crc kubenswrapper[4909]: I0202 10:48:43.660263 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-vcrsp" event={"ID":"f4fc8502-eb04-4488-aeef-02f233cac870","Type":"ContainerStarted","Data":"43fe50439e50b06fa08f0690e3a8febf1f453577490d642bc264ea38c947d408"} Feb 02 10:48:43 crc kubenswrapper[4909]: I0202 
10:48:43.661384 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-vcrsp" Feb 02 10:48:43 crc kubenswrapper[4909]: I0202 10:48:43.664028 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-h65nm" event={"ID":"113ec314-f58c-41b3-bf30-8925e5555c77","Type":"ContainerStarted","Data":"a780d812c5f66ee5f7143a02e5e26fe0e265118f40bbbdaaa3c94389890d3c02"} Feb 02 10:48:43 crc kubenswrapper[4909]: E0202 10:48:43.666169 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mnk6q" podUID="5d71df35-50ab-41fe-ad0a-cfbc9b06cc71" Feb 02 10:48:43 crc kubenswrapper[4909]: I0202 10:48:43.689904 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-x9bhz" podStartSLOduration=2.75170803 podStartE2EDuration="18.689707145s" podCreationTimestamp="2026-02-02 10:48:25 +0000 UTC" firstStartedPulling="2026-02-02 10:48:27.277430585 +0000 UTC m=+1033.023531320" lastFinishedPulling="2026-02-02 10:48:43.2154297 +0000 UTC m=+1048.961530435" observedRunningTime="2026-02-02 10:48:43.676232082 +0000 UTC m=+1049.422332817" watchObservedRunningTime="2026-02-02 10:48:43.689707145 +0000 UTC m=+1049.435807880" Feb 02 10:48:43 crc kubenswrapper[4909]: I0202 10:48:43.721873 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-845mj" podStartSLOduration=2.843934861 podStartE2EDuration="18.721848018s" podCreationTimestamp="2026-02-02 10:48:25 +0000 UTC" 
firstStartedPulling="2026-02-02 10:48:27.367017401 +0000 UTC m=+1033.113118136" lastFinishedPulling="2026-02-02 10:48:43.244930558 +0000 UTC m=+1048.991031293" observedRunningTime="2026-02-02 10:48:43.719581904 +0000 UTC m=+1049.465682639" watchObservedRunningTime="2026-02-02 10:48:43.721848018 +0000 UTC m=+1049.467948793" Feb 02 10:48:43 crc kubenswrapper[4909]: I0202 10:48:43.722593 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-h65nm" podStartSLOduration=2.784589025 podStartE2EDuration="18.722586189s" podCreationTimestamp="2026-02-02 10:48:25 +0000 UTC" firstStartedPulling="2026-02-02 10:48:27.277448846 +0000 UTC m=+1033.023549581" lastFinishedPulling="2026-02-02 10:48:43.21544601 +0000 UTC m=+1048.961546745" observedRunningTime="2026-02-02 10:48:43.699754941 +0000 UTC m=+1049.445855676" watchObservedRunningTime="2026-02-02 10:48:43.722586189 +0000 UTC m=+1049.468686924" Feb 02 10:48:43 crc kubenswrapper[4909]: I0202 10:48:43.750384 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-vcrsp" podStartSLOduration=2.922898145 podStartE2EDuration="18.750364769s" podCreationTimestamp="2026-02-02 10:48:25 +0000 UTC" firstStartedPulling="2026-02-02 10:48:27.387985736 +0000 UTC m=+1033.134086461" lastFinishedPulling="2026-02-02 10:48:43.21545235 +0000 UTC m=+1048.961553085" observedRunningTime="2026-02-02 10:48:43.739894011 +0000 UTC m=+1049.485994746" watchObservedRunningTime="2026-02-02 10:48:43.750364769 +0000 UTC m=+1049.496465504" Feb 02 10:48:44 crc kubenswrapper[4909]: I0202 10:48:44.683317 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-5kc7k" event={"ID":"dc7852b6-2562-4a40-8ba7-01764a270e45","Type":"ContainerStarted","Data":"f5635a007ad0f29abab1275b054af804de18e6184c7bae461bafc7d658e848e9"} Feb 02 
10:48:44 crc kubenswrapper[4909]: I0202 10:48:44.683415 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-5kc7k" Feb 02 10:48:44 crc kubenswrapper[4909]: I0202 10:48:44.685023 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-zfzjz" event={"ID":"939074c8-53f8-4574-8868-34d99851993d","Type":"ContainerStarted","Data":"49ae7e2297ba155dce88eb66992fbe27cd68c2cf5ad16264493bdae044e3918b"} Feb 02 10:48:44 crc kubenswrapper[4909]: I0202 10:48:44.685822 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-zfzjz" Feb 02 10:48:44 crc kubenswrapper[4909]: I0202 10:48:44.689708 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-r9p5r" event={"ID":"186cb127-7e1c-4edf-befe-5e8c89d5d819","Type":"ContainerStarted","Data":"83529beb4ca2ca38cef8600b05d1075b48f0e4f100044d6de3a33db636a71a7b"} Feb 02 10:48:44 crc kubenswrapper[4909]: I0202 10:48:44.689851 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-r9p5r" Feb 02 10:48:44 crc kubenswrapper[4909]: I0202 10:48:44.721691 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-prbwc" event={"ID":"beeaa5f8-617f-4486-86a0-122ab355e4de","Type":"ContainerStarted","Data":"c8203b16b6a6eae7641e1c2b90f3f6af5aba8ca7089700a522905084a0f9404f"} Feb 02 10:48:44 crc kubenswrapper[4909]: I0202 10:48:44.721851 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-prbwc" Feb 02 10:48:44 crc kubenswrapper[4909]: I0202 10:48:44.722497 4909 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-5kc7k" podStartSLOduration=3.7057790390000003 podStartE2EDuration="19.722482479s" podCreationTimestamp="2026-02-02 10:48:25 +0000 UTC" firstStartedPulling="2026-02-02 10:48:27.244042277 +0000 UTC m=+1032.990143012" lastFinishedPulling="2026-02-02 10:48:43.260745717 +0000 UTC m=+1049.006846452" observedRunningTime="2026-02-02 10:48:44.718228138 +0000 UTC m=+1050.464328873" watchObservedRunningTime="2026-02-02 10:48:44.722482479 +0000 UTC m=+1050.468583214" Feb 02 10:48:44 crc kubenswrapper[4909]: I0202 10:48:44.742978 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-r9p5r" podStartSLOduration=4.148980251 podStartE2EDuration="19.742957811s" podCreationTimestamp="2026-02-02 10:48:25 +0000 UTC" firstStartedPulling="2026-02-02 10:48:27.666928302 +0000 UTC m=+1033.413029037" lastFinishedPulling="2026-02-02 10:48:43.260905862 +0000 UTC m=+1049.007006597" observedRunningTime="2026-02-02 10:48:44.739993597 +0000 UTC m=+1050.486094342" watchObservedRunningTime="2026-02-02 10:48:44.742957811 +0000 UTC m=+1050.489058536" Feb 02 10:48:44 crc kubenswrapper[4909]: I0202 10:48:44.746924 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-ztcq9" event={"ID":"f5680c7a-ca86-40e8-b724-de63f5a24da2","Type":"ContainerStarted","Data":"b1f2114a8c59b0d07ac6cd3d60f228ec14e18cb2872fd3d8b7695028e3b0118b"} Feb 02 10:48:44 crc kubenswrapper[4909]: I0202 10:48:44.747544 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-ztcq9" Feb 02 10:48:44 crc kubenswrapper[4909]: I0202 10:48:44.754343 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-q85jz" 
event={"ID":"fd689c48-8bd5-4200-9602-2a9c82503585","Type":"ContainerStarted","Data":"7ba59a5ca6160da9897de30bc5e346ca13d59aef2ead2d6acbcda65810e09896"} Feb 02 10:48:44 crc kubenswrapper[4909]: I0202 10:48:44.754383 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-q85jz" Feb 02 10:48:44 crc kubenswrapper[4909]: I0202 10:48:44.755096 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-h65nm" Feb 02 10:48:44 crc kubenswrapper[4909]: I0202 10:48:44.766216 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-zfzjz" podStartSLOduration=3.824688567 podStartE2EDuration="19.766202391s" podCreationTimestamp="2026-02-02 10:48:25 +0000 UTC" firstStartedPulling="2026-02-02 10:48:27.243589814 +0000 UTC m=+1032.989690549" lastFinishedPulling="2026-02-02 10:48:43.185103638 +0000 UTC m=+1048.931204373" observedRunningTime="2026-02-02 10:48:44.76193657 +0000 UTC m=+1050.508037305" watchObservedRunningTime="2026-02-02 10:48:44.766202391 +0000 UTC m=+1050.512303116" Feb 02 10:48:44 crc kubenswrapper[4909]: I0202 10:48:44.779183 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-ztcq9" podStartSLOduration=3.328708425 podStartE2EDuration="19.77916726s" podCreationTimestamp="2026-02-02 10:48:25 +0000 UTC" firstStartedPulling="2026-02-02 10:48:26.80808995 +0000 UTC m=+1032.554190685" lastFinishedPulling="2026-02-02 10:48:43.258548785 +0000 UTC m=+1049.004649520" observedRunningTime="2026-02-02 10:48:44.778274754 +0000 UTC m=+1050.524375489" watchObservedRunningTime="2026-02-02 10:48:44.77916726 +0000 UTC m=+1050.525267995" Feb 02 10:48:44 crc kubenswrapper[4909]: I0202 10:48:44.803438 4909 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-prbwc" podStartSLOduration=3.357108652 podStartE2EDuration="19.803422189s" podCreationTimestamp="2026-02-02 10:48:25 +0000 UTC" firstStartedPulling="2026-02-02 10:48:26.814380219 +0000 UTC m=+1032.560480954" lastFinishedPulling="2026-02-02 10:48:43.260693756 +0000 UTC m=+1049.006794491" observedRunningTime="2026-02-02 10:48:44.79291848 +0000 UTC m=+1050.539019215" watchObservedRunningTime="2026-02-02 10:48:44.803422189 +0000 UTC m=+1050.549522924" Feb 02 10:48:44 crc kubenswrapper[4909]: I0202 10:48:44.830587 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-q85jz" podStartSLOduration=3.442959551 podStartE2EDuration="19.83056711s" podCreationTimestamp="2026-02-02 10:48:25 +0000 UTC" firstStartedPulling="2026-02-02 10:48:26.785146928 +0000 UTC m=+1032.531247663" lastFinishedPulling="2026-02-02 10:48:43.172754487 +0000 UTC m=+1048.918855222" observedRunningTime="2026-02-02 10:48:44.825947019 +0000 UTC m=+1050.572047754" watchObservedRunningTime="2026-02-02 10:48:44.83056711 +0000 UTC m=+1050.576667845" Feb 02 10:48:50 crc kubenswrapper[4909]: I0202 10:48:50.790466 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-dhq2f" event={"ID":"b3448119-ce0b-44b6-8491-2d2bc7a1352b","Type":"ContainerStarted","Data":"1c9aaa96ed518d19098851085ae8caec671e4c669568085e3fd5c31fd157a844"} Feb 02 10:48:50 crc kubenswrapper[4909]: I0202 10:48:50.791198 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-dhq2f" Feb 02 10:48:50 crc kubenswrapper[4909]: I0202 10:48:50.792071 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-k6628" 
event={"ID":"eceab652-e90e-4c17-a629-91fbd88492e6","Type":"ContainerStarted","Data":"5debfc48b0531dc0c2c93ba950b961fd6b06359a9ceddcbf1ede119869782ada"} Feb 02 10:48:50 crc kubenswrapper[4909]: I0202 10:48:50.792403 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-k6628" Feb 02 10:48:50 crc kubenswrapper[4909]: I0202 10:48:50.811210 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-dhq2f" podStartSLOduration=9.172081408 podStartE2EDuration="25.811189683s" podCreationTimestamp="2026-02-02 10:48:25 +0000 UTC" firstStartedPulling="2026-02-02 10:48:26.625585925 +0000 UTC m=+1032.371686660" lastFinishedPulling="2026-02-02 10:48:43.2646942 +0000 UTC m=+1049.010794935" observedRunningTime="2026-02-02 10:48:50.807762446 +0000 UTC m=+1056.553863181" watchObservedRunningTime="2026-02-02 10:48:50.811189683 +0000 UTC m=+1056.557290418" Feb 02 10:48:50 crc kubenswrapper[4909]: I0202 10:48:50.831406 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-k6628" podStartSLOduration=9.830427593 podStartE2EDuration="25.831388777s" podCreationTimestamp="2026-02-02 10:48:25 +0000 UTC" firstStartedPulling="2026-02-02 10:48:27.257867629 +0000 UTC m=+1033.003968354" lastFinishedPulling="2026-02-02 10:48:43.258828803 +0000 UTC m=+1049.004929538" observedRunningTime="2026-02-02 10:48:50.827327922 +0000 UTC m=+1056.573428667" watchObservedRunningTime="2026-02-02 10:48:50.831388777 +0000 UTC m=+1056.577489502" Feb 02 10:48:53 crc kubenswrapper[4909]: I0202 10:48:53.855011 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-kqmfw" 
event={"ID":"b2ab0041-b77e-4974-9ca4-7100b40c06e8","Type":"ContainerStarted","Data":"54a3671d5ea932a68dc2d38772462e7250c230fe015c1671ad088361c2d43fe8"} Feb 02 10:48:53 crc kubenswrapper[4909]: I0202 10:48:53.855576 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-kqmfw" Feb 02 10:48:53 crc kubenswrapper[4909]: I0202 10:48:53.856147 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-9v7l6" event={"ID":"74b20727-6371-480f-aece-9d33cfc2075a","Type":"ContainerStarted","Data":"a051337c6318003f0a022ff41df8af0ad7f28431873aab1481fc05e4dbcd3116"} Feb 02 10:48:53 crc kubenswrapper[4909]: I0202 10:48:53.856283 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-9v7l6" Feb 02 10:48:53 crc kubenswrapper[4909]: I0202 10:48:53.857275 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-28qg2" event={"ID":"621ec5cc-d8f5-4be2-9f9d-3fc7420d7e9c","Type":"ContainerStarted","Data":"bd787771968339a89392e21fb6a0210ace63efef2043a9e4513bc02ca7594194"} Feb 02 10:48:53 crc kubenswrapper[4909]: I0202 10:48:53.857596 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-28qg2" Feb 02 10:48:53 crc kubenswrapper[4909]: I0202 10:48:53.858766 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-zj4v5" event={"ID":"20dbebe4-193c-4657-a36f-45c3ec2c435c","Type":"ContainerStarted","Data":"f257a0c65dc7316acf5df2936b49aafa7cb6937d75b2804272d7741460c46057"} Feb 02 10:48:53 crc kubenswrapper[4909]: I0202 10:48:53.858887 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-zj4v5" Feb 02 10:48:53 crc kubenswrapper[4909]: I0202 10:48:53.860249 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-7bxgj" event={"ID":"f79c4c2a-6eb9-4883-a6bf-3955b44fad05","Type":"ContainerStarted","Data":"5a59d8fbbb839648372348c2e9733e42278304cc451fba2ada0eb0db8b41aa06"} Feb 02 10:48:53 crc kubenswrapper[4909]: I0202 10:48:53.860392 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-7bxgj" Feb 02 10:48:53 crc kubenswrapper[4909]: I0202 10:48:53.861460 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-564965969-7z4k5" event={"ID":"2ef2b6a8-5046-4a0c-82fd-b85d0e460c03","Type":"ContainerStarted","Data":"133c90fe09aade99b5675ff79d571f523e440de7e814f6f160dd93b95ffb46ad"} Feb 02 10:48:53 crc kubenswrapper[4909]: I0202 10:48:53.861779 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-564965969-7z4k5" Feb 02 10:48:53 crc kubenswrapper[4909]: I0202 10:48:53.872690 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-kqmfw" podStartSLOduration=3.135098702 podStartE2EDuration="28.872675017s" podCreationTimestamp="2026-02-02 10:48:25 +0000 UTC" firstStartedPulling="2026-02-02 10:48:27.233548148 +0000 UTC m=+1032.979648883" lastFinishedPulling="2026-02-02 10:48:52.971124463 +0000 UTC m=+1058.717225198" observedRunningTime="2026-02-02 10:48:53.869091935 +0000 UTC m=+1059.615192670" watchObservedRunningTime="2026-02-02 10:48:53.872675017 +0000 UTC m=+1059.618775752" Feb 02 10:48:53 crc kubenswrapper[4909]: I0202 10:48:53.886053 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/watcher-operator-controller-manager-564965969-7z4k5" podStartSLOduration=3.690851244 podStartE2EDuration="28.886035917s" podCreationTimestamp="2026-02-02 10:48:25 +0000 UTC" firstStartedPulling="2026-02-02 10:48:27.691673045 +0000 UTC m=+1033.437773780" lastFinishedPulling="2026-02-02 10:48:52.886857718 +0000 UTC m=+1058.632958453" observedRunningTime="2026-02-02 10:48:53.88158371 +0000 UTC m=+1059.627684445" watchObservedRunningTime="2026-02-02 10:48:53.886035917 +0000 UTC m=+1059.632136652" Feb 02 10:48:53 crc kubenswrapper[4909]: I0202 10:48:53.921309 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-9v7l6" podStartSLOduration=3.668063255 podStartE2EDuration="28.921295178s" podCreationTimestamp="2026-02-02 10:48:25 +0000 UTC" firstStartedPulling="2026-02-02 10:48:27.706175647 +0000 UTC m=+1033.452276382" lastFinishedPulling="2026-02-02 10:48:52.95940757 +0000 UTC m=+1058.705508305" observedRunningTime="2026-02-02 10:48:53.919167058 +0000 UTC m=+1059.665267793" watchObservedRunningTime="2026-02-02 10:48:53.921295178 +0000 UTC m=+1059.667395913" Feb 02 10:48:53 crc kubenswrapper[4909]: I0202 10:48:53.922212 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-28qg2" podStartSLOduration=3.745908827 podStartE2EDuration="28.922205764s" podCreationTimestamp="2026-02-02 10:48:25 +0000 UTC" firstStartedPulling="2026-02-02 10:48:27.706739483 +0000 UTC m=+1033.452840218" lastFinishedPulling="2026-02-02 10:48:52.88303642 +0000 UTC m=+1058.629137155" observedRunningTime="2026-02-02 10:48:53.895386452 +0000 UTC m=+1059.641487197" watchObservedRunningTime="2026-02-02 10:48:53.922205764 +0000 UTC m=+1059.668306499" Feb 02 10:48:53 crc kubenswrapper[4909]: I0202 10:48:53.944125 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-zj4v5" podStartSLOduration=3.753919166 podStartE2EDuration="28.944104177s" podCreationTimestamp="2026-02-02 10:48:25 +0000 UTC" firstStartedPulling="2026-02-02 10:48:27.691942543 +0000 UTC m=+1033.438043278" lastFinishedPulling="2026-02-02 10:48:52.882127554 +0000 UTC m=+1058.628228289" observedRunningTime="2026-02-02 10:48:53.938188388 +0000 UTC m=+1059.684289123" watchObservedRunningTime="2026-02-02 10:48:53.944104177 +0000 UTC m=+1059.690204902" Feb 02 10:48:53 crc kubenswrapper[4909]: I0202 10:48:53.957336 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-7bxgj" podStartSLOduration=3.692165231 podStartE2EDuration="28.957321562s" podCreationTimestamp="2026-02-02 10:48:25 +0000 UTC" firstStartedPulling="2026-02-02 10:48:27.707047182 +0000 UTC m=+1033.453147917" lastFinishedPulling="2026-02-02 10:48:52.972203513 +0000 UTC m=+1058.718304248" observedRunningTime="2026-02-02 10:48:53.954930194 +0000 UTC m=+1059.701030919" watchObservedRunningTime="2026-02-02 10:48:53.957321562 +0000 UTC m=+1059.703422297" Feb 02 10:48:54 crc kubenswrapper[4909]: I0202 10:48:54.868549 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ppnjd" event={"ID":"9002663c-c720-4fde-be9c-47008dfd15a6","Type":"ContainerStarted","Data":"aea350f9c839a848eb0a8226d4d5489c7ab1ac87231c218fb86393a057565e70"} Feb 02 10:48:54 crc kubenswrapper[4909]: I0202 10:48:54.893100 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ppnjd" podStartSLOduration=2.8000615939999998 podStartE2EDuration="29.893077599s" podCreationTimestamp="2026-02-02 10:48:25 +0000 UTC" firstStartedPulling="2026-02-02 10:48:27.333532829 +0000 UTC m=+1033.079633564" lastFinishedPulling="2026-02-02 
10:48:54.426548834 +0000 UTC m=+1060.172649569" observedRunningTime="2026-02-02 10:48:54.892135553 +0000 UTC m=+1060.638236288" watchObservedRunningTime="2026-02-02 10:48:54.893077599 +0000 UTC m=+1060.639178334" Feb 02 10:48:55 crc kubenswrapper[4909]: I0202 10:48:55.713801 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7b6c4d8c5f-dhq2f" Feb 02 10:48:55 crc kubenswrapper[4909]: I0202 10:48:55.734870 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d874c8fc-q85jz" Feb 02 10:48:55 crc kubenswrapper[4909]: I0202 10:48:55.786514 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d9697b7f4-ztcq9" Feb 02 10:48:55 crc kubenswrapper[4909]: I0202 10:48:55.839059 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-8886f4c47-prbwc" Feb 02 10:48:55 crc kubenswrapper[4909]: I0202 10:48:55.924320 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5fb775575f-h65nm" Feb 02 10:48:56 crc kubenswrapper[4909]: I0202 10:48:56.017183 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-84f48565d4-x9bhz" Feb 02 10:48:56 crc kubenswrapper[4909]: I0202 10:48:56.131124 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5f4b8bd54d-zfzjz" Feb 02 10:48:56 crc kubenswrapper[4909]: I0202 10:48:56.140670 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7dd968899f-k6628" Feb 02 10:48:56 crc kubenswrapper[4909]: I0202 
10:48:56.152957 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ppnjd" Feb 02 10:48:56 crc kubenswrapper[4909]: I0202 10:48:56.174650 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-585dbc889-5kc7k" Feb 02 10:48:56 crc kubenswrapper[4909]: I0202 10:48:56.226191 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-55bff696bd-vcrsp" Feb 02 10:48:56 crc kubenswrapper[4909]: I0202 10:48:56.267243 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-788c46999f-845mj" Feb 02 10:48:56 crc kubenswrapper[4909]: I0202 10:48:56.537549 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68fc8c869-r9p5r" Feb 02 10:48:56 crc kubenswrapper[4909]: I0202 10:48:56.887347 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mnk6q" event={"ID":"5d71df35-50ab-41fe-ad0a-cfbc9b06cc71","Type":"ContainerStarted","Data":"8eb34f0a5fe32937d51e81b2738fda74da6f939da4b9d1b4f52256293d4d7a90"} Feb 02 10:48:56 crc kubenswrapper[4909]: I0202 10:48:56.905686 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-mnk6q" podStartSLOduration=2.11913349 podStartE2EDuration="30.905665162s" podCreationTimestamp="2026-02-02 10:48:26 +0000 UTC" firstStartedPulling="2026-02-02 10:48:27.643707192 +0000 UTC m=+1033.389807927" lastFinishedPulling="2026-02-02 10:48:56.430238874 +0000 UTC m=+1062.176339599" observedRunningTime="2026-02-02 10:48:56.900962359 +0000 UTC m=+1062.647063094" watchObservedRunningTime="2026-02-02 10:48:56.905665162 +0000 
UTC m=+1062.651765897" Feb 02 10:48:57 crc kubenswrapper[4909]: I0202 10:48:57.601716 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b2f22c4-e784-4742-86fa-aef6d4e31970-cert\") pod \"infra-operator-controller-manager-79955696d6-2rl4b\" (UID: \"2b2f22c4-e784-4742-86fa-aef6d4e31970\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-2rl4b" Feb 02 10:48:57 crc kubenswrapper[4909]: I0202 10:48:57.610755 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b2f22c4-e784-4742-86fa-aef6d4e31970-cert\") pod \"infra-operator-controller-manager-79955696d6-2rl4b\" (UID: \"2b2f22c4-e784-4742-86fa-aef6d4e31970\") " pod="openstack-operators/infra-operator-controller-manager-79955696d6-2rl4b" Feb 02 10:48:57 crc kubenswrapper[4909]: I0202 10:48:57.790700 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79955696d6-2rl4b" Feb 02 10:48:57 crc kubenswrapper[4909]: I0202 10:48:57.806586 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12c45a89-12fb-4ce4-aa9a-35931b51407a-cert\") pod \"openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp\" (UID: \"12c45a89-12fb-4ce4-aa9a-35931b51407a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp" Feb 02 10:48:57 crc kubenswrapper[4909]: I0202 10:48:57.812944 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12c45a89-12fb-4ce4-aa9a-35931b51407a-cert\") pod \"openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp\" (UID: \"12c45a89-12fb-4ce4-aa9a-35931b51407a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp" Feb 02 10:48:58 crc kubenswrapper[4909]: 
I0202 10:48:58.107837 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp" Feb 02 10:48:58 crc kubenswrapper[4909]: I0202 10:48:58.243145 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79955696d6-2rl4b"] Feb 02 10:48:58 crc kubenswrapper[4909]: W0202 10:48:58.248089 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b2f22c4_e784_4742_86fa_aef6d4e31970.slice/crio-ed1a2ece3af2d13d2e4031b5f5d016c35fc3475f1c25e70b91755089fc3a03f6 WatchSource:0}: Error finding container ed1a2ece3af2d13d2e4031b5f5d016c35fc3475f1c25e70b91755089fc3a03f6: Status 404 returned error can't find the container with id ed1a2ece3af2d13d2e4031b5f5d016c35fc3475f1c25e70b91755089fc3a03f6 Feb 02 10:48:58 crc kubenswrapper[4909]: I0202 10:48:58.250953 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 10:48:58 crc kubenswrapper[4909]: I0202 10:48:58.417681 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-webhook-certs\") pod \"openstack-operator-controller-manager-646f757d77-z4bz9\" (UID: \"124911b8-e918-4360-8206-e9d72be4448f\") " pod="openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9" Feb 02 10:48:58 crc kubenswrapper[4909]: I0202 10:48:58.417757 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-metrics-certs\") pod \"openstack-operator-controller-manager-646f757d77-z4bz9\" (UID: \"124911b8-e918-4360-8206-e9d72be4448f\") " pod="openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9" Feb 02 10:48:58 crc 
kubenswrapper[4909]: I0202 10:48:58.422883 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-metrics-certs\") pod \"openstack-operator-controller-manager-646f757d77-z4bz9\" (UID: \"124911b8-e918-4360-8206-e9d72be4448f\") " pod="openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9" Feb 02 10:48:58 crc kubenswrapper[4909]: I0202 10:48:58.422989 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/124911b8-e918-4360-8206-e9d72be4448f-webhook-certs\") pod \"openstack-operator-controller-manager-646f757d77-z4bz9\" (UID: \"124911b8-e918-4360-8206-e9d72be4448f\") " pod="openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9" Feb 02 10:48:58 crc kubenswrapper[4909]: I0202 10:48:58.521906 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp"] Feb 02 10:48:58 crc kubenswrapper[4909]: I0202 10:48:58.563095 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9" Feb 02 10:48:58 crc kubenswrapper[4909]: I0202 10:48:58.904942 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp" event={"ID":"12c45a89-12fb-4ce4-aa9a-35931b51407a","Type":"ContainerStarted","Data":"4a412a75698c9d243d19abc68bc206d77443e84f793b7557d3f7bffb93ac662e"} Feb 02 10:48:58 crc kubenswrapper[4909]: I0202 10:48:58.906497 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-2rl4b" event={"ID":"2b2f22c4-e784-4742-86fa-aef6d4e31970","Type":"ContainerStarted","Data":"ed1a2ece3af2d13d2e4031b5f5d016c35fc3475f1c25e70b91755089fc3a03f6"} Feb 02 10:48:59 crc kubenswrapper[4909]: I0202 10:48:59.008850 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9"] Feb 02 10:48:59 crc kubenswrapper[4909]: I0202 10:48:59.913715 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9" event={"ID":"124911b8-e918-4360-8206-e9d72be4448f","Type":"ContainerStarted","Data":"310f67486978ad22e113afa0ad2cb5d8351e2dbc772a048274ecdfc802814b45"} Feb 02 10:48:59 crc kubenswrapper[4909]: I0202 10:48:59.914092 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9" event={"ID":"124911b8-e918-4360-8206-e9d72be4448f","Type":"ContainerStarted","Data":"83a213d91aa25263305bffd66a4321e74a1416b1d1440408837456f44a48453f"} Feb 02 10:48:59 crc kubenswrapper[4909]: I0202 10:48:59.914123 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9" Feb 02 10:48:59 crc kubenswrapper[4909]: I0202 10:48:59.949729 4909 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9" podStartSLOduration=33.949711562 podStartE2EDuration="33.949711562s" podCreationTimestamp="2026-02-02 10:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:48:59.937529236 +0000 UTC m=+1065.683629991" watchObservedRunningTime="2026-02-02 10:48:59.949711562 +0000 UTC m=+1065.695812297" Feb 02 10:49:01 crc kubenswrapper[4909]: I0202 10:49:01.928391 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp" event={"ID":"12c45a89-12fb-4ce4-aa9a-35931b51407a","Type":"ContainerStarted","Data":"496dc2cde648be454de868d41892eede28b32d2303cc1697e6a750e586cca910"} Feb 02 10:49:01 crc kubenswrapper[4909]: I0202 10:49:01.928936 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp" Feb 02 10:49:01 crc kubenswrapper[4909]: I0202 10:49:01.931208 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79955696d6-2rl4b" event={"ID":"2b2f22c4-e784-4742-86fa-aef6d4e31970","Type":"ContainerStarted","Data":"2daf2e74f73a2880f496363b22115a497e5f9f81f835b4b07d74597b01b1f071"} Feb 02 10:49:01 crc kubenswrapper[4909]: I0202 10:49:01.931614 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79955696d6-2rl4b" Feb 02 10:49:01 crc kubenswrapper[4909]: I0202 10:49:01.954525 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp" podStartSLOduration=34.448572073 podStartE2EDuration="36.954504893s" 
podCreationTimestamp="2026-02-02 10:48:25 +0000 UTC" firstStartedPulling="2026-02-02 10:48:58.530019274 +0000 UTC m=+1064.276120009" lastFinishedPulling="2026-02-02 10:49:01.035952094 +0000 UTC m=+1066.782052829" observedRunningTime="2026-02-02 10:49:01.950503329 +0000 UTC m=+1067.696604064" watchObservedRunningTime="2026-02-02 10:49:01.954504893 +0000 UTC m=+1067.700605628" Feb 02 10:49:01 crc kubenswrapper[4909]: I0202 10:49:01.973058 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79955696d6-2rl4b" podStartSLOduration=34.191155359 podStartE2EDuration="36.973025739s" podCreationTimestamp="2026-02-02 10:48:25 +0000 UTC" firstStartedPulling="2026-02-02 10:48:58.250641936 +0000 UTC m=+1063.996742671" lastFinishedPulling="2026-02-02 10:49:01.032512316 +0000 UTC m=+1066.778613051" observedRunningTime="2026-02-02 10:49:01.964470376 +0000 UTC m=+1067.710571151" watchObservedRunningTime="2026-02-02 10:49:01.973025739 +0000 UTC m=+1067.719126524" Feb 02 10:49:05 crc kubenswrapper[4909]: I0202 10:49:05.906892 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69d6db494d-kqmfw" Feb 02 10:49:06 crc kubenswrapper[4909]: I0202 10:49:06.154329 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67bf948998-ppnjd" Feb 02 10:49:06 crc kubenswrapper[4909]: I0202 10:49:06.264793 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6687f8d877-7bxgj" Feb 02 10:49:06 crc kubenswrapper[4909]: I0202 10:49:06.341201 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5b964cf4cd-9v7l6" Feb 02 10:49:06 crc kubenswrapper[4909]: I0202 10:49:06.566836 4909 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-64b5b76f97-28qg2" Feb 02 10:49:06 crc kubenswrapper[4909]: I0202 10:49:06.624484 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-56f8bfcd9f-zj4v5" Feb 02 10:49:06 crc kubenswrapper[4909]: I0202 10:49:06.728408 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-564965969-7z4k5" Feb 02 10:49:07 crc kubenswrapper[4909]: I0202 10:49:07.796352 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79955696d6-2rl4b" Feb 02 10:49:08 crc kubenswrapper[4909]: I0202 10:49:08.113397 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp" Feb 02 10:49:08 crc kubenswrapper[4909]: I0202 10:49:08.570248 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-646f757d77-z4bz9" Feb 02 10:49:19 crc kubenswrapper[4909]: I0202 10:49:19.510568 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:49:19 crc kubenswrapper[4909]: I0202 10:49:19.511139 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:49:21 crc kubenswrapper[4909]: 
I0202 10:49:21.497530 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-xvsxw"] Feb 02 10:49:21 crc kubenswrapper[4909]: I0202 10:49:21.499461 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-xvsxw" Feb 02 10:49:21 crc kubenswrapper[4909]: I0202 10:49:21.501835 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 02 10:49:21 crc kubenswrapper[4909]: I0202 10:49:21.502145 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-rrx4g" Feb 02 10:49:21 crc kubenswrapper[4909]: I0202 10:49:21.502285 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 02 10:49:21 crc kubenswrapper[4909]: I0202 10:49:21.509382 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 02 10:49:21 crc kubenswrapper[4909]: I0202 10:49:21.510842 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-xvsxw"] Feb 02 10:49:21 crc kubenswrapper[4909]: I0202 10:49:21.603550 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-x2xqq"] Feb 02 10:49:21 crc kubenswrapper[4909]: I0202 10:49:21.604885 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-x2xqq" Feb 02 10:49:21 crc kubenswrapper[4909]: I0202 10:49:21.610525 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 02 10:49:21 crc kubenswrapper[4909]: I0202 10:49:21.621422 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-x2xqq"] Feb 02 10:49:21 crc kubenswrapper[4909]: I0202 10:49:21.674425 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d567b63-4866-4b76-a00f-c6be088aede4-config\") pod \"dnsmasq-dns-855cbc58c5-xvsxw\" (UID: \"7d567b63-4866-4b76-a00f-c6be088aede4\") " pod="openstack/dnsmasq-dns-855cbc58c5-xvsxw" Feb 02 10:49:21 crc kubenswrapper[4909]: I0202 10:49:21.674519 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnhkk\" (UniqueName: \"kubernetes.io/projected/7d567b63-4866-4b76-a00f-c6be088aede4-kube-api-access-cnhkk\") pod \"dnsmasq-dns-855cbc58c5-xvsxw\" (UID: \"7d567b63-4866-4b76-a00f-c6be088aede4\") " pod="openstack/dnsmasq-dns-855cbc58c5-xvsxw" Feb 02 10:49:21 crc kubenswrapper[4909]: I0202 10:49:21.776268 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76-config\") pod \"dnsmasq-dns-6fcf94d689-x2xqq\" (UID: \"3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76\") " pod="openstack/dnsmasq-dns-6fcf94d689-x2xqq" Feb 02 10:49:21 crc kubenswrapper[4909]: I0202 10:49:21.776344 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wxtw\" (UniqueName: \"kubernetes.io/projected/3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76-kube-api-access-7wxtw\") pod \"dnsmasq-dns-6fcf94d689-x2xqq\" (UID: \"3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76\") " 
pod="openstack/dnsmasq-dns-6fcf94d689-x2xqq" Feb 02 10:49:21 crc kubenswrapper[4909]: I0202 10:49:21.776374 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d567b63-4866-4b76-a00f-c6be088aede4-config\") pod \"dnsmasq-dns-855cbc58c5-xvsxw\" (UID: \"7d567b63-4866-4b76-a00f-c6be088aede4\") " pod="openstack/dnsmasq-dns-855cbc58c5-xvsxw" Feb 02 10:49:21 crc kubenswrapper[4909]: I0202 10:49:21.776395 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-x2xqq\" (UID: \"3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76\") " pod="openstack/dnsmasq-dns-6fcf94d689-x2xqq" Feb 02 10:49:21 crc kubenswrapper[4909]: I0202 10:49:21.776632 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnhkk\" (UniqueName: \"kubernetes.io/projected/7d567b63-4866-4b76-a00f-c6be088aede4-kube-api-access-cnhkk\") pod \"dnsmasq-dns-855cbc58c5-xvsxw\" (UID: \"7d567b63-4866-4b76-a00f-c6be088aede4\") " pod="openstack/dnsmasq-dns-855cbc58c5-xvsxw" Feb 02 10:49:21 crc kubenswrapper[4909]: I0202 10:49:21.777279 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d567b63-4866-4b76-a00f-c6be088aede4-config\") pod \"dnsmasq-dns-855cbc58c5-xvsxw\" (UID: \"7d567b63-4866-4b76-a00f-c6be088aede4\") " pod="openstack/dnsmasq-dns-855cbc58c5-xvsxw" Feb 02 10:49:21 crc kubenswrapper[4909]: I0202 10:49:21.799839 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnhkk\" (UniqueName: \"kubernetes.io/projected/7d567b63-4866-4b76-a00f-c6be088aede4-kube-api-access-cnhkk\") pod \"dnsmasq-dns-855cbc58c5-xvsxw\" (UID: \"7d567b63-4866-4b76-a00f-c6be088aede4\") " pod="openstack/dnsmasq-dns-855cbc58c5-xvsxw" Feb 02 10:49:21 crc 
kubenswrapper[4909]: I0202 10:49:21.859280 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-xvsxw" Feb 02 10:49:21 crc kubenswrapper[4909]: I0202 10:49:21.877679 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76-config\") pod \"dnsmasq-dns-6fcf94d689-x2xqq\" (UID: \"3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76\") " pod="openstack/dnsmasq-dns-6fcf94d689-x2xqq" Feb 02 10:49:21 crc kubenswrapper[4909]: I0202 10:49:21.877782 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wxtw\" (UniqueName: \"kubernetes.io/projected/3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76-kube-api-access-7wxtw\") pod \"dnsmasq-dns-6fcf94d689-x2xqq\" (UID: \"3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76\") " pod="openstack/dnsmasq-dns-6fcf94d689-x2xqq" Feb 02 10:49:21 crc kubenswrapper[4909]: I0202 10:49:21.877827 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-x2xqq\" (UID: \"3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76\") " pod="openstack/dnsmasq-dns-6fcf94d689-x2xqq" Feb 02 10:49:21 crc kubenswrapper[4909]: I0202 10:49:21.878539 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76-config\") pod \"dnsmasq-dns-6fcf94d689-x2xqq\" (UID: \"3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76\") " pod="openstack/dnsmasq-dns-6fcf94d689-x2xqq" Feb 02 10:49:21 crc kubenswrapper[4909]: I0202 10:49:21.879282 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-x2xqq\" (UID: \"3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76\") 
" pod="openstack/dnsmasq-dns-6fcf94d689-x2xqq" Feb 02 10:49:21 crc kubenswrapper[4909]: I0202 10:49:21.897581 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wxtw\" (UniqueName: \"kubernetes.io/projected/3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76-kube-api-access-7wxtw\") pod \"dnsmasq-dns-6fcf94d689-x2xqq\" (UID: \"3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76\") " pod="openstack/dnsmasq-dns-6fcf94d689-x2xqq" Feb 02 10:49:21 crc kubenswrapper[4909]: I0202 10:49:21.922991 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-x2xqq" Feb 02 10:49:22 crc kubenswrapper[4909]: I0202 10:49:22.334183 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-xvsxw"] Feb 02 10:49:22 crc kubenswrapper[4909]: I0202 10:49:22.394897 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-x2xqq"] Feb 02 10:49:22 crc kubenswrapper[4909]: W0202 10:49:22.397749 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e3d62ba_8fec_4dcd_8c7c_878cb8e4de76.slice/crio-7123ef2ce8fcdd5414be44bf36d7605ef047d8314e600d9edf6b2a13d948f5f4 WatchSource:0}: Error finding container 7123ef2ce8fcdd5414be44bf36d7605ef047d8314e600d9edf6b2a13d948f5f4: Status 404 returned error can't find the container with id 7123ef2ce8fcdd5414be44bf36d7605ef047d8314e600d9edf6b2a13d948f5f4 Feb 02 10:49:23 crc kubenswrapper[4909]: I0202 10:49:23.071497 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-855cbc58c5-xvsxw" event={"ID":"7d567b63-4866-4b76-a00f-c6be088aede4","Type":"ContainerStarted","Data":"1b27ad41fdfde9818334399fb8e6c351c669f5ca8b83f744011375d574f37187"} Feb 02 10:49:23 crc kubenswrapper[4909]: I0202 10:49:23.072636 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fcf94d689-x2xqq" 
event={"ID":"3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76","Type":"ContainerStarted","Data":"7123ef2ce8fcdd5414be44bf36d7605ef047d8314e600d9edf6b2a13d948f5f4"} Feb 02 10:49:24 crc kubenswrapper[4909]: I0202 10:49:24.430780 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-x2xqq"] Feb 02 10:49:24 crc kubenswrapper[4909]: I0202 10:49:24.460291 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-4v7wk"] Feb 02 10:49:24 crc kubenswrapper[4909]: I0202 10:49:24.461658 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-4v7wk" Feb 02 10:49:24 crc kubenswrapper[4909]: I0202 10:49:24.475525 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-4v7wk"] Feb 02 10:49:24 crc kubenswrapper[4909]: I0202 10:49:24.623719 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9r9c\" (UniqueName: \"kubernetes.io/projected/32baf7b0-1fd7-4303-b78d-56c2a4e29388-kube-api-access-q9r9c\") pod \"dnsmasq-dns-f54874ffc-4v7wk\" (UID: \"32baf7b0-1fd7-4303-b78d-56c2a4e29388\") " pod="openstack/dnsmasq-dns-f54874ffc-4v7wk" Feb 02 10:49:24 crc kubenswrapper[4909]: I0202 10:49:24.623842 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32baf7b0-1fd7-4303-b78d-56c2a4e29388-dns-svc\") pod \"dnsmasq-dns-f54874ffc-4v7wk\" (UID: \"32baf7b0-1fd7-4303-b78d-56c2a4e29388\") " pod="openstack/dnsmasq-dns-f54874ffc-4v7wk" Feb 02 10:49:24 crc kubenswrapper[4909]: I0202 10:49:24.623899 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32baf7b0-1fd7-4303-b78d-56c2a4e29388-config\") pod \"dnsmasq-dns-f54874ffc-4v7wk\" (UID: \"32baf7b0-1fd7-4303-b78d-56c2a4e29388\") " 
pod="openstack/dnsmasq-dns-f54874ffc-4v7wk" Feb 02 10:49:24 crc kubenswrapper[4909]: I0202 10:49:24.730322 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32baf7b0-1fd7-4303-b78d-56c2a4e29388-dns-svc\") pod \"dnsmasq-dns-f54874ffc-4v7wk\" (UID: \"32baf7b0-1fd7-4303-b78d-56c2a4e29388\") " pod="openstack/dnsmasq-dns-f54874ffc-4v7wk" Feb 02 10:49:24 crc kubenswrapper[4909]: I0202 10:49:24.730369 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32baf7b0-1fd7-4303-b78d-56c2a4e29388-config\") pod \"dnsmasq-dns-f54874ffc-4v7wk\" (UID: \"32baf7b0-1fd7-4303-b78d-56c2a4e29388\") " pod="openstack/dnsmasq-dns-f54874ffc-4v7wk" Feb 02 10:49:24 crc kubenswrapper[4909]: I0202 10:49:24.730423 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9r9c\" (UniqueName: \"kubernetes.io/projected/32baf7b0-1fd7-4303-b78d-56c2a4e29388-kube-api-access-q9r9c\") pod \"dnsmasq-dns-f54874ffc-4v7wk\" (UID: \"32baf7b0-1fd7-4303-b78d-56c2a4e29388\") " pod="openstack/dnsmasq-dns-f54874ffc-4v7wk" Feb 02 10:49:24 crc kubenswrapper[4909]: I0202 10:49:24.731537 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32baf7b0-1fd7-4303-b78d-56c2a4e29388-dns-svc\") pod \"dnsmasq-dns-f54874ffc-4v7wk\" (UID: \"32baf7b0-1fd7-4303-b78d-56c2a4e29388\") " pod="openstack/dnsmasq-dns-f54874ffc-4v7wk" Feb 02 10:49:24 crc kubenswrapper[4909]: I0202 10:49:24.732135 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32baf7b0-1fd7-4303-b78d-56c2a4e29388-config\") pod \"dnsmasq-dns-f54874ffc-4v7wk\" (UID: \"32baf7b0-1fd7-4303-b78d-56c2a4e29388\") " pod="openstack/dnsmasq-dns-f54874ffc-4v7wk" Feb 02 10:49:24 crc kubenswrapper[4909]: I0202 10:49:24.773799 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9r9c\" (UniqueName: \"kubernetes.io/projected/32baf7b0-1fd7-4303-b78d-56c2a4e29388-kube-api-access-q9r9c\") pod \"dnsmasq-dns-f54874ffc-4v7wk\" (UID: \"32baf7b0-1fd7-4303-b78d-56c2a4e29388\") " pod="openstack/dnsmasq-dns-f54874ffc-4v7wk" Feb 02 10:49:24 crc kubenswrapper[4909]: I0202 10:49:24.799098 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-4v7wk" Feb 02 10:49:24 crc kubenswrapper[4909]: I0202 10:49:24.824573 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-xvsxw"] Feb 02 10:49:24 crc kubenswrapper[4909]: I0202 10:49:24.927923 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-8psfb"] Feb 02 10:49:24 crc kubenswrapper[4909]: I0202 10:49:24.932487 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-8psfb" Feb 02 10:49:24 crc kubenswrapper[4909]: I0202 10:49:24.938398 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-8psfb"] Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.039254 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4aa36cd-48dd-48cc-87b9-c6c7bf39c535-config\") pod \"dnsmasq-dns-67ff45466c-8psfb\" (UID: \"f4aa36cd-48dd-48cc-87b9-c6c7bf39c535\") " pod="openstack/dnsmasq-dns-67ff45466c-8psfb" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.039576 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4aa36cd-48dd-48cc-87b9-c6c7bf39c535-dns-svc\") pod \"dnsmasq-dns-67ff45466c-8psfb\" (UID: \"f4aa36cd-48dd-48cc-87b9-c6c7bf39c535\") " pod="openstack/dnsmasq-dns-67ff45466c-8psfb" Feb 02 10:49:25 crc 
kubenswrapper[4909]: I0202 10:49:25.039621 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69rsq\" (UniqueName: \"kubernetes.io/projected/f4aa36cd-48dd-48cc-87b9-c6c7bf39c535-kube-api-access-69rsq\") pod \"dnsmasq-dns-67ff45466c-8psfb\" (UID: \"f4aa36cd-48dd-48cc-87b9-c6c7bf39c535\") " pod="openstack/dnsmasq-dns-67ff45466c-8psfb" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.141978 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4aa36cd-48dd-48cc-87b9-c6c7bf39c535-config\") pod \"dnsmasq-dns-67ff45466c-8psfb\" (UID: \"f4aa36cd-48dd-48cc-87b9-c6c7bf39c535\") " pod="openstack/dnsmasq-dns-67ff45466c-8psfb" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.142095 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4aa36cd-48dd-48cc-87b9-c6c7bf39c535-dns-svc\") pod \"dnsmasq-dns-67ff45466c-8psfb\" (UID: \"f4aa36cd-48dd-48cc-87b9-c6c7bf39c535\") " pod="openstack/dnsmasq-dns-67ff45466c-8psfb" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.142157 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69rsq\" (UniqueName: \"kubernetes.io/projected/f4aa36cd-48dd-48cc-87b9-c6c7bf39c535-kube-api-access-69rsq\") pod \"dnsmasq-dns-67ff45466c-8psfb\" (UID: \"f4aa36cd-48dd-48cc-87b9-c6c7bf39c535\") " pod="openstack/dnsmasq-dns-67ff45466c-8psfb" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.142907 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4aa36cd-48dd-48cc-87b9-c6c7bf39c535-config\") pod \"dnsmasq-dns-67ff45466c-8psfb\" (UID: \"f4aa36cd-48dd-48cc-87b9-c6c7bf39c535\") " pod="openstack/dnsmasq-dns-67ff45466c-8psfb" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.143096 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4aa36cd-48dd-48cc-87b9-c6c7bf39c535-dns-svc\") pod \"dnsmasq-dns-67ff45466c-8psfb\" (UID: \"f4aa36cd-48dd-48cc-87b9-c6c7bf39c535\") " pod="openstack/dnsmasq-dns-67ff45466c-8psfb" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.161017 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-4v7wk"] Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.166445 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69rsq\" (UniqueName: \"kubernetes.io/projected/f4aa36cd-48dd-48cc-87b9-c6c7bf39c535-kube-api-access-69rsq\") pod \"dnsmasq-dns-67ff45466c-8psfb\" (UID: \"f4aa36cd-48dd-48cc-87b9-c6c7bf39c535\") " pod="openstack/dnsmasq-dns-67ff45466c-8psfb" Feb 02 10:49:25 crc kubenswrapper[4909]: W0202 10:49:25.180876 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32baf7b0_1fd7_4303_b78d_56c2a4e29388.slice/crio-94fa809a2ab739d4573ed96346e9c1028445a7598dbababbaa84c9d1c01397f5 WatchSource:0}: Error finding container 94fa809a2ab739d4573ed96346e9c1028445a7598dbababbaa84c9d1c01397f5: Status 404 returned error can't find the container with id 94fa809a2ab739d4573ed96346e9c1028445a7598dbababbaa84c9d1c01397f5 Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.249170 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-8psfb" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.639114 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.643426 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.646222 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.646757 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.646941 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.647082 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.647167 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-g9vrp" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.648189 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.650626 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.650691 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.731645 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-8psfb"] Feb 02 10:49:25 crc kubenswrapper[4909]: W0202 10:49:25.746427 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4aa36cd_48dd_48cc_87b9_c6c7bf39c535.slice/crio-8dd2a1c8a5aad984dded24285b57d01d5ad91dafb638535846811e562bbf624d WatchSource:0}: Error finding container 8dd2a1c8a5aad984dded24285b57d01d5ad91dafb638535846811e562bbf624d: Status 404 returned error 
can't find the container with id 8dd2a1c8a5aad984dded24285b57d01d5ad91dafb638535846811e562bbf624d Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.752376 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b441d32f-f76f-4e7b-b3fe-40e93b126567-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.752425 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b441d32f-f76f-4e7b-b3fe-40e93b126567-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.752489 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.752516 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b441d32f-f76f-4e7b-b3fe-40e93b126567-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.752558 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b441d32f-f76f-4e7b-b3fe-40e93b126567-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 
02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.752579 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b441d32f-f76f-4e7b-b3fe-40e93b126567-config-data\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.752595 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b441d32f-f76f-4e7b-b3fe-40e93b126567-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.752615 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlm2v\" (UniqueName: \"kubernetes.io/projected/b441d32f-f76f-4e7b-b3fe-40e93b126567-kube-api-access-rlm2v\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.752632 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b441d32f-f76f-4e7b-b3fe-40e93b126567-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.752657 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b441d32f-f76f-4e7b-b3fe-40e93b126567-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: 
I0202 10:49:25.753001 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b441d32f-f76f-4e7b-b3fe-40e93b126567-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.854757 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b441d32f-f76f-4e7b-b3fe-40e93b126567-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.854838 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b441d32f-f76f-4e7b-b3fe-40e93b126567-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.854876 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b441d32f-f76f-4e7b-b3fe-40e93b126567-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.854898 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.854921 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/b441d32f-f76f-4e7b-b3fe-40e93b126567-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.854962 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b441d32f-f76f-4e7b-b3fe-40e93b126567-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.854980 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b441d32f-f76f-4e7b-b3fe-40e93b126567-config-data\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.854994 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b441d32f-f76f-4e7b-b3fe-40e93b126567-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.855009 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlm2v\" (UniqueName: \"kubernetes.io/projected/b441d32f-f76f-4e7b-b3fe-40e93b126567-kube-api-access-rlm2v\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.855026 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b441d32f-f76f-4e7b-b3fe-40e93b126567-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.855047 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b441d32f-f76f-4e7b-b3fe-40e93b126567-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.855913 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b441d32f-f76f-4e7b-b3fe-40e93b126567-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.856352 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b441d32f-f76f-4e7b-b3fe-40e93b126567-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.856553 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b441d32f-f76f-4e7b-b3fe-40e93b126567-config-data\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.856833 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.856838 4909 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b441d32f-f76f-4e7b-b3fe-40e93b126567-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.857494 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b441d32f-f76f-4e7b-b3fe-40e93b126567-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.863556 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b441d32f-f76f-4e7b-b3fe-40e93b126567-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.864100 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b441d32f-f76f-4e7b-b3fe-40e93b126567-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.864221 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b441d32f-f76f-4e7b-b3fe-40e93b126567-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.867466 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b441d32f-f76f-4e7b-b3fe-40e93b126567-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: 
\"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.876200 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.877347 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlm2v\" (UniqueName: \"kubernetes.io/projected/b441d32f-f76f-4e7b-b3fe-40e93b126567-kube-api-access-rlm2v\") pod \"rabbitmq-server-0\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " pod="openstack/rabbitmq-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.967844 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.969010 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.974492 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.974628 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.975238 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.975290 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.975401 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.975459 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-679gz" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.975673 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 02 10:49:25 crc kubenswrapper[4909]: I0202 10:49:25.976475 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.003602 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.061545 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1ab15f72-b249-42d5-8698-273c5afc7758-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.061635 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1ab15f72-b249-42d5-8698-273c5afc7758-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.061664 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1ab15f72-b249-42d5-8698-273c5afc7758-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.061706 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1ab15f72-b249-42d5-8698-273c5afc7758-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.061730 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/1ab15f72-b249-42d5-8698-273c5afc7758-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.061765 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.061822 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2vfv\" (UniqueName: \"kubernetes.io/projected/1ab15f72-b249-42d5-8698-273c5afc7758-kube-api-access-d2vfv\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.061845 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ab15f72-b249-42d5-8698-273c5afc7758-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.061883 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1ab15f72-b249-42d5-8698-273c5afc7758-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.061909 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/1ab15f72-b249-42d5-8698-273c5afc7758-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.061940 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1ab15f72-b249-42d5-8698-273c5afc7758-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.115986 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-8psfb" event={"ID":"f4aa36cd-48dd-48cc-87b9-c6c7bf39c535","Type":"ContainerStarted","Data":"8dd2a1c8a5aad984dded24285b57d01d5ad91dafb638535846811e562bbf624d"} Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.120958 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f54874ffc-4v7wk" event={"ID":"32baf7b0-1fd7-4303-b78d-56c2a4e29388","Type":"ContainerStarted","Data":"94fa809a2ab739d4573ed96346e9c1028445a7598dbababbaa84c9d1c01397f5"} Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.163902 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2vfv\" (UniqueName: \"kubernetes.io/projected/1ab15f72-b249-42d5-8698-273c5afc7758-kube-api-access-d2vfv\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.163957 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ab15f72-b249-42d5-8698-273c5afc7758-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.164003 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1ab15f72-b249-42d5-8698-273c5afc7758-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.164054 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1ab15f72-b249-42d5-8698-273c5afc7758-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.164089 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1ab15f72-b249-42d5-8698-273c5afc7758-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.164121 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1ab15f72-b249-42d5-8698-273c5afc7758-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.164202 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1ab15f72-b249-42d5-8698-273c5afc7758-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 
10:49:26.164226 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1ab15f72-b249-42d5-8698-273c5afc7758-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.164267 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1ab15f72-b249-42d5-8698-273c5afc7758-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.164292 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1ab15f72-b249-42d5-8698-273c5afc7758-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.164329 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.164531 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.164884 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/1ab15f72-b249-42d5-8698-273c5afc7758-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.165154 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1ab15f72-b249-42d5-8698-273c5afc7758-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.165395 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1ab15f72-b249-42d5-8698-273c5afc7758-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.166174 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1ab15f72-b249-42d5-8698-273c5afc7758-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.166893 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1ab15f72-b249-42d5-8698-273c5afc7758-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.172301 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1ab15f72-b249-42d5-8698-273c5afc7758-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.172438 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1ab15f72-b249-42d5-8698-273c5afc7758-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.172706 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1ab15f72-b249-42d5-8698-273c5afc7758-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.176956 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1ab15f72-b249-42d5-8698-273c5afc7758-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.183949 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2vfv\" (UniqueName: \"kubernetes.io/projected/1ab15f72-b249-42d5-8698-273c5afc7758-kube-api-access-d2vfv\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.191045 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.292854 
4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.639065 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 10:49:26 crc kubenswrapper[4909]: I0202 10:49:26.930096 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.074368 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.075965 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.079631 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.079891 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.080204 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-ld7ld" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.087412 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.095308 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.106638 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.180971 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4r9g\" (UniqueName: 
\"kubernetes.io/projected/68e55a25-f51a-49a9-af91-ffbab9ad611e-kube-api-access-p4r9g\") pod \"openstack-galera-0\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " pod="openstack/openstack-galera-0" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.181460 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/68e55a25-f51a-49a9-af91-ffbab9ad611e-kolla-config\") pod \"openstack-galera-0\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " pod="openstack/openstack-galera-0" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.181545 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/68e55a25-f51a-49a9-af91-ffbab9ad611e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " pod="openstack/openstack-galera-0" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.181620 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68e55a25-f51a-49a9-af91-ffbab9ad611e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " pod="openstack/openstack-galera-0" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.181677 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/68e55a25-f51a-49a9-af91-ffbab9ad611e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " pod="openstack/openstack-galera-0" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.181703 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/68e55a25-f51a-49a9-af91-ffbab9ad611e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " pod="openstack/openstack-galera-0" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.181781 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/68e55a25-f51a-49a9-af91-ffbab9ad611e-config-data-default\") pod \"openstack-galera-0\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " pod="openstack/openstack-galera-0" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.181856 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " pod="openstack/openstack-galera-0" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.296834 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/68e55a25-f51a-49a9-af91-ffbab9ad611e-kolla-config\") pod \"openstack-galera-0\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " pod="openstack/openstack-galera-0" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.296917 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/68e55a25-f51a-49a9-af91-ffbab9ad611e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " pod="openstack/openstack-galera-0" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.296956 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68e55a25-f51a-49a9-af91-ffbab9ad611e-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " pod="openstack/openstack-galera-0" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.296989 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/68e55a25-f51a-49a9-af91-ffbab9ad611e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " pod="openstack/openstack-galera-0" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.297015 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68e55a25-f51a-49a9-af91-ffbab9ad611e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " pod="openstack/openstack-galera-0" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.297072 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/68e55a25-f51a-49a9-af91-ffbab9ad611e-config-data-default\") pod \"openstack-galera-0\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " pod="openstack/openstack-galera-0" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.297113 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " pod="openstack/openstack-galera-0" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.297153 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4r9g\" (UniqueName: \"kubernetes.io/projected/68e55a25-f51a-49a9-af91-ffbab9ad611e-kube-api-access-p4r9g\") pod \"openstack-galera-0\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " pod="openstack/openstack-galera-0" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 
10:49:27.298170 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/68e55a25-f51a-49a9-af91-ffbab9ad611e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " pod="openstack/openstack-galera-0" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.303024 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/68e55a25-f51a-49a9-af91-ffbab9ad611e-config-data-default\") pod \"openstack-galera-0\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " pod="openstack/openstack-galera-0" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.304788 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/68e55a25-f51a-49a9-af91-ffbab9ad611e-kolla-config\") pod \"openstack-galera-0\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " pod="openstack/openstack-galera-0" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.305084 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68e55a25-f51a-49a9-af91-ffbab9ad611e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " pod="openstack/openstack-galera-0" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.305126 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.309609 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/68e55a25-f51a-49a9-af91-ffbab9ad611e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " pod="openstack/openstack-galera-0" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.310257 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68e55a25-f51a-49a9-af91-ffbab9ad611e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " pod="openstack/openstack-galera-0" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.329115 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4r9g\" (UniqueName: \"kubernetes.io/projected/68e55a25-f51a-49a9-af91-ffbab9ad611e-kube-api-access-p4r9g\") pod \"openstack-galera-0\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " pod="openstack/openstack-galera-0" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.366914 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " pod="openstack/openstack-galera-0" Feb 02 10:49:27 crc kubenswrapper[4909]: I0202 10:49:27.431200 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.402875 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.404473 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.411550 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.411775 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.411896 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-7chst" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.411991 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.439316 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.533106 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.533180 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.533218 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") 
pod \"openstack-cell1-galera-0\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.533238 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.533292 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.533320 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.533352 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.533396 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2pfm\" (UniqueName: 
\"kubernetes.io/projected/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-kube-api-access-w2pfm\") pod \"openstack-cell1-galera-0\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.568106 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.575727 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.576130 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.578530 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.578863 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.579026 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-dh54t" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.634423 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.634482 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.634553 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.634586 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.634611 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.634681 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2pfm\" (UniqueName: \"kubernetes.io/projected/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-kube-api-access-w2pfm\") pod \"openstack-cell1-galera-0\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.634732 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.634763 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.635636 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-cell1-galera-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.636022 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.636545 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.637750 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.638201 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.654185 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.654685 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.659062 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2pfm\" (UniqueName: \"kubernetes.io/projected/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-kube-api-access-w2pfm\") pod \"openstack-cell1-galera-0\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.683973 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " pod="openstack/openstack-cell1-galera-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.736296 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dab4432-0762-45a8-88ab-3a99217a790f-combined-ca-bundle\") pod \"memcached-0\" 
(UID: \"9dab4432-0762-45a8-88ab-3a99217a790f\") " pod="openstack/memcached-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.736340 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpgnr\" (UniqueName: \"kubernetes.io/projected/9dab4432-0762-45a8-88ab-3a99217a790f-kube-api-access-wpgnr\") pod \"memcached-0\" (UID: \"9dab4432-0762-45a8-88ab-3a99217a790f\") " pod="openstack/memcached-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.736361 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dab4432-0762-45a8-88ab-3a99217a790f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"9dab4432-0762-45a8-88ab-3a99217a790f\") " pod="openstack/memcached-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.736427 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9dab4432-0762-45a8-88ab-3a99217a790f-config-data\") pod \"memcached-0\" (UID: \"9dab4432-0762-45a8-88ab-3a99217a790f\") " pod="openstack/memcached-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.736491 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9dab4432-0762-45a8-88ab-3a99217a790f-kolla-config\") pod \"memcached-0\" (UID: \"9dab4432-0762-45a8-88ab-3a99217a790f\") " pod="openstack/memcached-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.749312 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.838354 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dab4432-0762-45a8-88ab-3a99217a790f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"9dab4432-0762-45a8-88ab-3a99217a790f\") " pod="openstack/memcached-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.838436 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpgnr\" (UniqueName: \"kubernetes.io/projected/9dab4432-0762-45a8-88ab-3a99217a790f-kube-api-access-wpgnr\") pod \"memcached-0\" (UID: \"9dab4432-0762-45a8-88ab-3a99217a790f\") " pod="openstack/memcached-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.838487 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dab4432-0762-45a8-88ab-3a99217a790f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"9dab4432-0762-45a8-88ab-3a99217a790f\") " pod="openstack/memcached-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.838525 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9dab4432-0762-45a8-88ab-3a99217a790f-config-data\") pod \"memcached-0\" (UID: \"9dab4432-0762-45a8-88ab-3a99217a790f\") " pod="openstack/memcached-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.839274 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9dab4432-0762-45a8-88ab-3a99217a790f-kolla-config\") pod \"memcached-0\" (UID: \"9dab4432-0762-45a8-88ab-3a99217a790f\") " pod="openstack/memcached-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.839538 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/9dab4432-0762-45a8-88ab-3a99217a790f-config-data\") pod \"memcached-0\" (UID: \"9dab4432-0762-45a8-88ab-3a99217a790f\") " pod="openstack/memcached-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.839997 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9dab4432-0762-45a8-88ab-3a99217a790f-kolla-config\") pod \"memcached-0\" (UID: \"9dab4432-0762-45a8-88ab-3a99217a790f\") " pod="openstack/memcached-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.843797 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dab4432-0762-45a8-88ab-3a99217a790f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"9dab4432-0762-45a8-88ab-3a99217a790f\") " pod="openstack/memcached-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.844366 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dab4432-0762-45a8-88ab-3a99217a790f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"9dab4432-0762-45a8-88ab-3a99217a790f\") " pod="openstack/memcached-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.864275 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpgnr\" (UniqueName: \"kubernetes.io/projected/9dab4432-0762-45a8-88ab-3a99217a790f-kube-api-access-wpgnr\") pod \"memcached-0\" (UID: \"9dab4432-0762-45a8-88ab-3a99217a790f\") " pod="openstack/memcached-0" Feb 02 10:49:28 crc kubenswrapper[4909]: I0202 10:49:28.912273 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 02 10:49:30 crc kubenswrapper[4909]: I0202 10:49:30.628949 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:49:30 crc kubenswrapper[4909]: I0202 10:49:30.630141 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 10:49:30 crc kubenswrapper[4909]: I0202 10:49:30.632703 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-c6b6b" Feb 02 10:49:30 crc kubenswrapper[4909]: I0202 10:49:30.636712 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:49:30 crc kubenswrapper[4909]: I0202 10:49:30.778021 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbsfn\" (UniqueName: \"kubernetes.io/projected/6b8f1870-afe0-4ac5-a633-e87905ab1d5b-kube-api-access-zbsfn\") pod \"kube-state-metrics-0\" (UID: \"6b8f1870-afe0-4ac5-a633-e87905ab1d5b\") " pod="openstack/kube-state-metrics-0" Feb 02 10:49:30 crc kubenswrapper[4909]: I0202 10:49:30.879825 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbsfn\" (UniqueName: \"kubernetes.io/projected/6b8f1870-afe0-4ac5-a633-e87905ab1d5b-kube-api-access-zbsfn\") pod \"kube-state-metrics-0\" (UID: \"6b8f1870-afe0-4ac5-a633-e87905ab1d5b\") " pod="openstack/kube-state-metrics-0" Feb 02 10:49:30 crc kubenswrapper[4909]: I0202 10:49:30.907847 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbsfn\" (UniqueName: \"kubernetes.io/projected/6b8f1870-afe0-4ac5-a633-e87905ab1d5b-kube-api-access-zbsfn\") pod \"kube-state-metrics-0\" (UID: \"6b8f1870-afe0-4ac5-a633-e87905ab1d5b\") " pod="openstack/kube-state-metrics-0" Feb 02 10:49:30 crc kubenswrapper[4909]: I0202 10:49:30.946345 4909 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.133515 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-qpqvt"] Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.134995 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qpqvt" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.136321 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-fxdbc" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.136982 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.137079 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.140977 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-9hkv5"] Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.142370 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-9hkv5" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.155129 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qpqvt"] Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.184264 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-9hkv5"] Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.213178 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1ab15f72-b249-42d5-8698-273c5afc7758","Type":"ContainerStarted","Data":"b10b7913375dd80b6c0d930e6fa0ad13deb369bf183283f054af8abe9b8de69c"} Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.214522 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-combined-ca-bundle\") pod \"ovn-controller-qpqvt\" (UID: \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\") " pod="openstack/ovn-controller-qpqvt" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.214580 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-var-log-ovn\") pod \"ovn-controller-qpqvt\" (UID: \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\") " pod="openstack/ovn-controller-qpqvt" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.214625 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-scripts\") pod \"ovn-controller-qpqvt\" (UID: \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\") " pod="openstack/ovn-controller-qpqvt" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.214656 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-var-run\") pod \"ovn-controller-qpqvt\" (UID: \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\") " pod="openstack/ovn-controller-qpqvt" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.214691 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ts2l\" (UniqueName: \"kubernetes.io/projected/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-kube-api-access-7ts2l\") pod \"ovn-controller-qpqvt\" (UID: \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\") " pod="openstack/ovn-controller-qpqvt" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.214719 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-ovn-controller-tls-certs\") pod \"ovn-controller-qpqvt\" (UID: \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\") " pod="openstack/ovn-controller-qpqvt" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.214739 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-var-run-ovn\") pod \"ovn-controller-qpqvt\" (UID: \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\") " pod="openstack/ovn-controller-qpqvt" Feb 02 10:49:33 crc kubenswrapper[4909]: W0202 10:49:33.274253 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb441d32f_f76f_4e7b_b3fe_40e93b126567.slice/crio-9c883b677a4169ab8884d97967256980cb494afea7976d549974ccaea0a4f2cf WatchSource:0}: Error finding container 9c883b677a4169ab8884d97967256980cb494afea7976d549974ccaea0a4f2cf: Status 404 returned error can't find the container with id 9c883b677a4169ab8884d97967256980cb494afea7976d549974ccaea0a4f2cf Feb 02 10:49:33 crc 
kubenswrapper[4909]: I0202 10:49:33.319127 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ts2l\" (UniqueName: \"kubernetes.io/projected/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-kube-api-access-7ts2l\") pod \"ovn-controller-qpqvt\" (UID: \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\") " pod="openstack/ovn-controller-qpqvt" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.319193 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-ovn-controller-tls-certs\") pod \"ovn-controller-qpqvt\" (UID: \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\") " pod="openstack/ovn-controller-qpqvt" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.319222 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-var-run-ovn\") pod \"ovn-controller-qpqvt\" (UID: \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\") " pod="openstack/ovn-controller-qpqvt" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.319265 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef5a90ca-c133-400b-b869-becc0b1f60a0-scripts\") pod \"ovn-controller-ovs-9hkv5\" (UID: \"ef5a90ca-c133-400b-b869-becc0b1f60a0\") " pod="openstack/ovn-controller-ovs-9hkv5" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.319299 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ef5a90ca-c133-400b-b869-becc0b1f60a0-var-log\") pod \"ovn-controller-ovs-9hkv5\" (UID: \"ef5a90ca-c133-400b-b869-becc0b1f60a0\") " pod="openstack/ovn-controller-ovs-9hkv5" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.319329 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ef5a90ca-c133-400b-b869-becc0b1f60a0-var-lib\") pod \"ovn-controller-ovs-9hkv5\" (UID: \"ef5a90ca-c133-400b-b869-becc0b1f60a0\") " pod="openstack/ovn-controller-ovs-9hkv5" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.319366 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ef5a90ca-c133-400b-b869-becc0b1f60a0-var-run\") pod \"ovn-controller-ovs-9hkv5\" (UID: \"ef5a90ca-c133-400b-b869-becc0b1f60a0\") " pod="openstack/ovn-controller-ovs-9hkv5" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.319541 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-combined-ca-bundle\") pod \"ovn-controller-qpqvt\" (UID: \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\") " pod="openstack/ovn-controller-qpqvt" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.319589 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-var-log-ovn\") pod \"ovn-controller-qpqvt\" (UID: \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\") " pod="openstack/ovn-controller-qpqvt" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.319617 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ef5a90ca-c133-400b-b869-becc0b1f60a0-etc-ovs\") pod \"ovn-controller-ovs-9hkv5\" (UID: \"ef5a90ca-c133-400b-b869-becc0b1f60a0\") " pod="openstack/ovn-controller-ovs-9hkv5" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.319741 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-scripts\") pod \"ovn-controller-qpqvt\" (UID: \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\") " pod="openstack/ovn-controller-qpqvt" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.319851 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9npj5\" (UniqueName: \"kubernetes.io/projected/ef5a90ca-c133-400b-b869-becc0b1f60a0-kube-api-access-9npj5\") pod \"ovn-controller-ovs-9hkv5\" (UID: \"ef5a90ca-c133-400b-b869-becc0b1f60a0\") " pod="openstack/ovn-controller-ovs-9hkv5" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.319878 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-var-run\") pod \"ovn-controller-qpqvt\" (UID: \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\") " pod="openstack/ovn-controller-qpqvt" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.320129 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-var-run-ovn\") pod \"ovn-controller-qpqvt\" (UID: \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\") " pod="openstack/ovn-controller-qpqvt" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.320158 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-var-log-ovn\") pod \"ovn-controller-qpqvt\" (UID: \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\") " pod="openstack/ovn-controller-qpqvt" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.320195 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-var-run\") pod \"ovn-controller-qpqvt\" (UID: \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\") " 
pod="openstack/ovn-controller-qpqvt" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.323263 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-scripts\") pod \"ovn-controller-qpqvt\" (UID: \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\") " pod="openstack/ovn-controller-qpqvt" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.325009 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-combined-ca-bundle\") pod \"ovn-controller-qpqvt\" (UID: \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\") " pod="openstack/ovn-controller-qpqvt" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.325791 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-ovn-controller-tls-certs\") pod \"ovn-controller-qpqvt\" (UID: \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\") " pod="openstack/ovn-controller-qpqvt" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.337162 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ts2l\" (UniqueName: \"kubernetes.io/projected/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-kube-api-access-7ts2l\") pod \"ovn-controller-qpqvt\" (UID: \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\") " pod="openstack/ovn-controller-qpqvt" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.421798 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef5a90ca-c133-400b-b869-becc0b1f60a0-scripts\") pod \"ovn-controller-ovs-9hkv5\" (UID: \"ef5a90ca-c133-400b-b869-becc0b1f60a0\") " pod="openstack/ovn-controller-ovs-9hkv5" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.421888 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ef5a90ca-c133-400b-b869-becc0b1f60a0-var-log\") pod \"ovn-controller-ovs-9hkv5\" (UID: \"ef5a90ca-c133-400b-b869-becc0b1f60a0\") " pod="openstack/ovn-controller-ovs-9hkv5" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.421905 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ef5a90ca-c133-400b-b869-becc0b1f60a0-var-lib\") pod \"ovn-controller-ovs-9hkv5\" (UID: \"ef5a90ca-c133-400b-b869-becc0b1f60a0\") " pod="openstack/ovn-controller-ovs-9hkv5" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.421931 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ef5a90ca-c133-400b-b869-becc0b1f60a0-var-run\") pod \"ovn-controller-ovs-9hkv5\" (UID: \"ef5a90ca-c133-400b-b869-becc0b1f60a0\") " pod="openstack/ovn-controller-ovs-9hkv5" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.421967 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ef5a90ca-c133-400b-b869-becc0b1f60a0-etc-ovs\") pod \"ovn-controller-ovs-9hkv5\" (UID: \"ef5a90ca-c133-400b-b869-becc0b1f60a0\") " pod="openstack/ovn-controller-ovs-9hkv5" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.422004 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9npj5\" (UniqueName: \"kubernetes.io/projected/ef5a90ca-c133-400b-b869-becc0b1f60a0-kube-api-access-9npj5\") pod \"ovn-controller-ovs-9hkv5\" (UID: \"ef5a90ca-c133-400b-b869-becc0b1f60a0\") " pod="openstack/ovn-controller-ovs-9hkv5" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.422504 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/ef5a90ca-c133-400b-b869-becc0b1f60a0-var-log\") pod \"ovn-controller-ovs-9hkv5\" (UID: \"ef5a90ca-c133-400b-b869-becc0b1f60a0\") " pod="openstack/ovn-controller-ovs-9hkv5" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.422623 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ef5a90ca-c133-400b-b869-becc0b1f60a0-var-lib\") pod \"ovn-controller-ovs-9hkv5\" (UID: \"ef5a90ca-c133-400b-b869-becc0b1f60a0\") " pod="openstack/ovn-controller-ovs-9hkv5" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.422658 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ef5a90ca-c133-400b-b869-becc0b1f60a0-var-run\") pod \"ovn-controller-ovs-9hkv5\" (UID: \"ef5a90ca-c133-400b-b869-becc0b1f60a0\") " pod="openstack/ovn-controller-ovs-9hkv5" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.422767 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ef5a90ca-c133-400b-b869-becc0b1f60a0-etc-ovs\") pod \"ovn-controller-ovs-9hkv5\" (UID: \"ef5a90ca-c133-400b-b869-becc0b1f60a0\") " pod="openstack/ovn-controller-ovs-9hkv5" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.424184 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef5a90ca-c133-400b-b869-becc0b1f60a0-scripts\") pod \"ovn-controller-ovs-9hkv5\" (UID: \"ef5a90ca-c133-400b-b869-becc0b1f60a0\") " pod="openstack/ovn-controller-ovs-9hkv5" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.437463 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9npj5\" (UniqueName: \"kubernetes.io/projected/ef5a90ca-c133-400b-b869-becc0b1f60a0-kube-api-access-9npj5\") pod \"ovn-controller-ovs-9hkv5\" (UID: \"ef5a90ca-c133-400b-b869-becc0b1f60a0\") " 
pod="openstack/ovn-controller-ovs-9hkv5" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.462573 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qpqvt" Feb 02 10:49:33 crc kubenswrapper[4909]: I0202 10:49:33.474621 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-9hkv5" Feb 02 10:49:34 crc kubenswrapper[4909]: I0202 10:49:34.240519 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b441d32f-f76f-4e7b-b3fe-40e93b126567","Type":"ContainerStarted","Data":"9c883b677a4169ab8884d97967256980cb494afea7976d549974ccaea0a4f2cf"} Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.150902 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.152461 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.156132 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-nbpcj" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.158028 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.161512 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.162124 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.162258 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.165251 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovsdbserver-nb-0"] Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.190946 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0a30163-0b42-493b-b775-d88218bd1844-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.191028 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghzql\" (UniqueName: \"kubernetes.io/projected/f0a30163-0b42-493b-b775-d88218bd1844-kube-api-access-ghzql\") pod \"ovsdbserver-nb-0\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.191257 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0a30163-0b42-493b-b775-d88218bd1844-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.191379 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f0a30163-0b42-493b-b775-d88218bd1844-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.191418 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:49:37 crc 
kubenswrapper[4909]: I0202 10:49:37.191464 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0a30163-0b42-493b-b775-d88218bd1844-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.191556 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0a30163-0b42-493b-b775-d88218bd1844-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.191675 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0a30163-0b42-493b-b775-d88218bd1844-config\") pod \"ovsdbserver-nb-0\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.292866 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.292950 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0a30163-0b42-493b-b775-d88218bd1844-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.293024 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/f0a30163-0b42-493b-b775-d88218bd1844-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.293119 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0a30163-0b42-493b-b775-d88218bd1844-config\") pod \"ovsdbserver-nb-0\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.293155 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0a30163-0b42-493b-b775-d88218bd1844-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.293178 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghzql\" (UniqueName: \"kubernetes.io/projected/f0a30163-0b42-493b-b775-d88218bd1844-kube-api-access-ghzql\") pod \"ovsdbserver-nb-0\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.293224 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0a30163-0b42-493b-b775-d88218bd1844-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.293271 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f0a30163-0b42-493b-b775-d88218bd1844-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " 
pod="openstack/ovsdbserver-nb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.293362 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.294341 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f0a30163-0b42-493b-b775-d88218bd1844-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.294876 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0a30163-0b42-493b-b775-d88218bd1844-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.295732 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0a30163-0b42-493b-b775-d88218bd1844-config\") pod \"ovsdbserver-nb-0\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.301070 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0a30163-0b42-493b-b775-d88218bd1844-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.302407 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/f0a30163-0b42-493b-b775-d88218bd1844-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.303329 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0a30163-0b42-493b-b775-d88218bd1844-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.313038 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghzql\" (UniqueName: \"kubernetes.io/projected/f0a30163-0b42-493b-b775-d88218bd1844-kube-api-access-ghzql\") pod \"ovsdbserver-nb-0\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.321767 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " pod="openstack/ovsdbserver-nb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.351097 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.354325 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.358055 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.358380 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-fptqp" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.358490 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.358525 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.375154 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.491719 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.495391 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b658933-f66d-47df-8b75-a42cd55b9bf4-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.495440 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.495499 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b658933-f66d-47df-8b75-a42cd55b9bf4-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.495536 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b658933-f66d-47df-8b75-a42cd55b9bf4-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.495573 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv8g4\" (UniqueName: \"kubernetes.io/projected/7b658933-f66d-47df-8b75-a42cd55b9bf4-kube-api-access-dv8g4\") pod \"ovsdbserver-sb-0\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " 
pod="openstack/ovsdbserver-sb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.495592 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b658933-f66d-47df-8b75-a42cd55b9bf4-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.495645 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b658933-f66d-47df-8b75-a42cd55b9bf4-config\") pod \"ovsdbserver-sb-0\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.495664 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7b658933-f66d-47df-8b75-a42cd55b9bf4-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.597541 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv8g4\" (UniqueName: \"kubernetes.io/projected/7b658933-f66d-47df-8b75-a42cd55b9bf4-kube-api-access-dv8g4\") pod \"ovsdbserver-sb-0\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.597608 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b658933-f66d-47df-8b75-a42cd55b9bf4-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.597650 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b658933-f66d-47df-8b75-a42cd55b9bf4-config\") pod \"ovsdbserver-sb-0\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.597673 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7b658933-f66d-47df-8b75-a42cd55b9bf4-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.597746 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b658933-f66d-47df-8b75-a42cd55b9bf4-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.597782 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.597834 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b658933-f66d-47df-8b75-a42cd55b9bf4-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.597865 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b658933-f66d-47df-8b75-a42cd55b9bf4-combined-ca-bundle\") pod 
\"ovsdbserver-sb-0\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.598007 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-sb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.598335 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7b658933-f66d-47df-8b75-a42cd55b9bf4-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.598767 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b658933-f66d-47df-8b75-a42cd55b9bf4-config\") pod \"ovsdbserver-sb-0\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.598924 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b658933-f66d-47df-8b75-a42cd55b9bf4-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.611449 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b658933-f66d-47df-8b75-a42cd55b9bf4-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.613521 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b658933-f66d-47df-8b75-a42cd55b9bf4-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.619314 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.624484 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b658933-f66d-47df-8b75-a42cd55b9bf4-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.632963 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv8g4\" (UniqueName: \"kubernetes.io/projected/7b658933-f66d-47df-8b75-a42cd55b9bf4-kube-api-access-dv8g4\") pod \"ovsdbserver-sb-0\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " pod="openstack/ovsdbserver-sb-0" Feb 02 10:49:37 crc kubenswrapper[4909]: I0202 10:49:37.689339 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 02 10:49:42 crc kubenswrapper[4909]: I0202 10:49:42.112885 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 02 10:49:44 crc kubenswrapper[4909]: E0202 10:49:44.170885 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 02 10:49:44 crc kubenswrapper[4909]: E0202 10:49:44.171432 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q9r9c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-f54874ffc-4v7wk_openstack(32baf7b0-1fd7-4303-b78d-56c2a4e29388): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 10:49:44 crc kubenswrapper[4909]: E0202 10:49:44.172621 4909 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-f54874ffc-4v7wk" podUID="32baf7b0-1fd7-4303-b78d-56c2a4e29388" Feb 02 10:49:44 crc kubenswrapper[4909]: E0202 10:49:44.180654 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 02 10:49:44 crc kubenswrapper[4909]: E0202 10:49:44.180801 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7wxtw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6fcf94d689-x2xqq_openstack(3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 10:49:44 crc kubenswrapper[4909]: E0202 10:49:44.182023 4909 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6fcf94d689-x2xqq" podUID="3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76" Feb 02 10:49:44 crc kubenswrapper[4909]: E0202 10:49:44.184026 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 02 10:49:44 crc kubenswrapper[4909]: E0202 10:49:44.184171 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-69rsq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-67ff45466c-8psfb_openstack(f4aa36cd-48dd-48cc-87b9-c6c7bf39c535): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 10:49:44 crc kubenswrapper[4909]: E0202 10:49:44.186276 4909 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-67ff45466c-8psfb" podUID="f4aa36cd-48dd-48cc-87b9-c6c7bf39c535" Feb 02 10:49:44 crc kubenswrapper[4909]: E0202 10:49:44.186623 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 02 10:49:44 crc kubenswrapper[4909]: E0202 10:49:44.186832 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cnhkk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-855cbc58c5-xvsxw_openstack(7d567b63-4866-4b76-a00f-c6be088aede4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 10:49:44 crc kubenswrapper[4909]: E0202 10:49:44.187960 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-855cbc58c5-xvsxw" podUID="7d567b63-4866-4b76-a00f-c6be088aede4" Feb 02 10:49:44 crc kubenswrapper[4909]: E0202 10:49:44.331158 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2\\\"\"" pod="openstack/dnsmasq-dns-67ff45466c-8psfb" podUID="f4aa36cd-48dd-48cc-87b9-c6c7bf39c535" Feb 02 10:49:44 crc kubenswrapper[4909]: E0202 10:49:44.331763 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2\\\"\"" pod="openstack/dnsmasq-dns-f54874ffc-4v7wk" podUID="32baf7b0-1fd7-4303-b78d-56c2a4e29388" Feb 02 10:49:45 crc kubenswrapper[4909]: W0202 10:49:45.309082 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9dab4432_0762_45a8_88ab_3a99217a790f.slice/crio-9b7ddf81296dc46ea659b3831d6bab2f73b124d8cb5ba35a9e22af4b77f8a85b WatchSource:0}: Error finding container 9b7ddf81296dc46ea659b3831d6bab2f73b124d8cb5ba35a9e22af4b77f8a85b: Status 404 returned error can't find the container with id 9b7ddf81296dc46ea659b3831d6bab2f73b124d8cb5ba35a9e22af4b77f8a85b Feb 02 10:49:45 crc kubenswrapper[4909]: I0202 10:49:45.330756 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"9dab4432-0762-45a8-88ab-3a99217a790f","Type":"ContainerStarted","Data":"9b7ddf81296dc46ea659b3831d6bab2f73b124d8cb5ba35a9e22af4b77f8a85b"} Feb 02 10:49:45 crc kubenswrapper[4909]: I0202 10:49:45.503214 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-xvsxw" Feb 02 10:49:45 crc kubenswrapper[4909]: I0202 10:49:45.545157 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-x2xqq" Feb 02 10:49:45 crc kubenswrapper[4909]: I0202 10:49:45.662342 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76-config\") pod \"3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76\" (UID: \"3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76\") " Feb 02 10:49:45 crc kubenswrapper[4909]: I0202 10:49:45.662787 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wxtw\" (UniqueName: \"kubernetes.io/projected/3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76-kube-api-access-7wxtw\") pod \"3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76\" (UID: \"3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76\") " Feb 02 10:49:45 crc kubenswrapper[4909]: I0202 10:49:45.662853 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnhkk\" (UniqueName: \"kubernetes.io/projected/7d567b63-4866-4b76-a00f-c6be088aede4-kube-api-access-cnhkk\") pod \"7d567b63-4866-4b76-a00f-c6be088aede4\" (UID: \"7d567b63-4866-4b76-a00f-c6be088aede4\") " Feb 02 10:49:45 crc kubenswrapper[4909]: I0202 10:49:45.662894 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d567b63-4866-4b76-a00f-c6be088aede4-config\") pod \"7d567b63-4866-4b76-a00f-c6be088aede4\" (UID: \"7d567b63-4866-4b76-a00f-c6be088aede4\") " Feb 02 10:49:45 crc kubenswrapper[4909]: I0202 10:49:45.662987 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76-dns-svc\") pod \"3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76\" (UID: 
\"3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76\") " Feb 02 10:49:45 crc kubenswrapper[4909]: I0202 10:49:45.663170 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76-config" (OuterVolumeSpecName: "config") pod "3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76" (UID: "3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:49:45 crc kubenswrapper[4909]: I0202 10:49:45.663324 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:49:45 crc kubenswrapper[4909]: I0202 10:49:45.664011 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d567b63-4866-4b76-a00f-c6be088aede4-config" (OuterVolumeSpecName: "config") pod "7d567b63-4866-4b76-a00f-c6be088aede4" (UID: "7d567b63-4866-4b76-a00f-c6be088aede4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:49:45 crc kubenswrapper[4909]: I0202 10:49:45.664510 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76" (UID: "3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:49:45 crc kubenswrapper[4909]: I0202 10:49:45.666597 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76-kube-api-access-7wxtw" (OuterVolumeSpecName: "kube-api-access-7wxtw") pod "3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76" (UID: "3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76"). InnerVolumeSpecName "kube-api-access-7wxtw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:49:45 crc kubenswrapper[4909]: I0202 10:49:45.667178 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d567b63-4866-4b76-a00f-c6be088aede4-kube-api-access-cnhkk" (OuterVolumeSpecName: "kube-api-access-cnhkk") pod "7d567b63-4866-4b76-a00f-c6be088aede4" (UID: "7d567b63-4866-4b76-a00f-c6be088aede4"). InnerVolumeSpecName "kube-api-access-cnhkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:49:45 crc kubenswrapper[4909]: I0202 10:49:45.765850 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wxtw\" (UniqueName: \"kubernetes.io/projected/3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76-kube-api-access-7wxtw\") on node \"crc\" DevicePath \"\"" Feb 02 10:49:45 crc kubenswrapper[4909]: I0202 10:49:45.765913 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnhkk\" (UniqueName: \"kubernetes.io/projected/7d567b63-4866-4b76-a00f-c6be088aede4-kube-api-access-cnhkk\") on node \"crc\" DevicePath \"\"" Feb 02 10:49:45 crc kubenswrapper[4909]: I0202 10:49:45.765924 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d567b63-4866-4b76-a00f-c6be088aede4-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:49:45 crc kubenswrapper[4909]: I0202 10:49:45.765959 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:49:45 crc kubenswrapper[4909]: I0202 10:49:45.926142 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:49:45 crc kubenswrapper[4909]: I0202 10:49:45.932583 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 02 10:49:45 crc kubenswrapper[4909]: W0202 10:49:45.943962 4909 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68e55a25_f51a_49a9_af91_ffbab9ad611e.slice/crio-a7c5a4187e08328045571b7c3962f6647c59fbeb74ad545b12297f90a448ec37 WatchSource:0}: Error finding container a7c5a4187e08328045571b7c3962f6647c59fbeb74ad545b12297f90a448ec37: Status 404 returned error can't find the container with id a7c5a4187e08328045571b7c3962f6647c59fbeb74ad545b12297f90a448ec37 Feb 02 10:49:45 crc kubenswrapper[4909]: W0202 10:49:45.950879 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b8f1870_afe0_4ac5_a633_e87905ab1d5b.slice/crio-2debc0db52befda55cf65379c84a06e45bb718483eba379a79184a62c7c47797 WatchSource:0}: Error finding container 2debc0db52befda55cf65379c84a06e45bb718483eba379a79184a62c7c47797: Status 404 returned error can't find the container with id 2debc0db52befda55cf65379c84a06e45bb718483eba379a79184a62c7c47797 Feb 02 10:49:46 crc kubenswrapper[4909]: I0202 10:49:46.043636 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 02 10:49:46 crc kubenswrapper[4909]: I0202 10:49:46.070557 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-9hkv5"] Feb 02 10:49:46 crc kubenswrapper[4909]: I0202 10:49:46.093825 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qpqvt"] Feb 02 10:49:46 crc kubenswrapper[4909]: W0202 10:49:46.099294 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0af95ff_c608_4b73_92fa_d4a443a9eaaf.slice/crio-2ed57c154181aa9e3a9a049b34ab49ad76353cbbaf30f1aa14875cc0bdc23724 WatchSource:0}: Error finding container 2ed57c154181aa9e3a9a049b34ab49ad76353cbbaf30f1aa14875cc0bdc23724: Status 404 returned error can't find the container with id 
2ed57c154181aa9e3a9a049b34ab49ad76353cbbaf30f1aa14875cc0bdc23724 Feb 02 10:49:46 crc kubenswrapper[4909]: I0202 10:49:46.182038 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 02 10:49:46 crc kubenswrapper[4909]: W0202 10:49:46.198307 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef5a90ca_c133_400b_b869_becc0b1f60a0.slice/crio-4b7b9aa2d302731ab882bb6396e5cedb806ae9cfe1d2e855629879fcef6b91e8 WatchSource:0}: Error finding container 4b7b9aa2d302731ab882bb6396e5cedb806ae9cfe1d2e855629879fcef6b91e8: Status 404 returned error can't find the container with id 4b7b9aa2d302731ab882bb6396e5cedb806ae9cfe1d2e855629879fcef6b91e8 Feb 02 10:49:46 crc kubenswrapper[4909]: W0202 10:49:46.205925 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b8dd51c_207c_4fae_8d5a_7271a159f0ff.slice/crio-715516fcb25964318a24c5b3c3d1e5cae4bec401cbb1c61092a96208c284ff4f WatchSource:0}: Error finding container 715516fcb25964318a24c5b3c3d1e5cae4bec401cbb1c61092a96208c284ff4f: Status 404 returned error can't find the container with id 715516fcb25964318a24c5b3c3d1e5cae4bec401cbb1c61092a96208c284ff4f Feb 02 10:49:46 crc kubenswrapper[4909]: I0202 10:49:46.339224 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f0a30163-0b42-493b-b775-d88218bd1844","Type":"ContainerStarted","Data":"eda4514efdafb9d3edc2db4164d749803408bc43963f3b15c82cc113eca3e28a"} Feb 02 10:49:46 crc kubenswrapper[4909]: I0202 10:49:46.340701 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e0af95ff-c608-4b73-92fa-d4a443a9eaaf","Type":"ContainerStarted","Data":"2ed57c154181aa9e3a9a049b34ab49ad76353cbbaf30f1aa14875cc0bdc23724"} Feb 02 10:49:46 crc kubenswrapper[4909]: I0202 10:49:46.342014 4909 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-x2xqq" Feb 02 10:49:46 crc kubenswrapper[4909]: I0202 10:49:46.342009 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fcf94d689-x2xqq" event={"ID":"3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76","Type":"ContainerDied","Data":"7123ef2ce8fcdd5414be44bf36d7605ef047d8314e600d9edf6b2a13d948f5f4"} Feb 02 10:49:46 crc kubenswrapper[4909]: I0202 10:49:46.346206 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"68e55a25-f51a-49a9-af91-ffbab9ad611e","Type":"ContainerStarted","Data":"a7c5a4187e08328045571b7c3962f6647c59fbeb74ad545b12297f90a448ec37"} Feb 02 10:49:46 crc kubenswrapper[4909]: I0202 10:49:46.347766 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6b8f1870-afe0-4ac5-a633-e87905ab1d5b","Type":"ContainerStarted","Data":"2debc0db52befda55cf65379c84a06e45bb718483eba379a79184a62c7c47797"} Feb 02 10:49:46 crc kubenswrapper[4909]: I0202 10:49:46.348857 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9hkv5" event={"ID":"ef5a90ca-c133-400b-b869-becc0b1f60a0","Type":"ContainerStarted","Data":"4b7b9aa2d302731ab882bb6396e5cedb806ae9cfe1d2e855629879fcef6b91e8"} Feb 02 10:49:46 crc kubenswrapper[4909]: I0202 10:49:46.350032 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qpqvt" event={"ID":"6b8dd51c-207c-4fae-8d5a-7271a159f0ff","Type":"ContainerStarted","Data":"715516fcb25964318a24c5b3c3d1e5cae4bec401cbb1c61092a96208c284ff4f"} Feb 02 10:49:46 crc kubenswrapper[4909]: I0202 10:49:46.351522 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-xvsxw" Feb 02 10:49:46 crc kubenswrapper[4909]: I0202 10:49:46.351490 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-855cbc58c5-xvsxw" event={"ID":"7d567b63-4866-4b76-a00f-c6be088aede4","Type":"ContainerDied","Data":"1b27ad41fdfde9818334399fb8e6c351c669f5ca8b83f744011375d574f37187"} Feb 02 10:49:46 crc kubenswrapper[4909]: I0202 10:49:46.409450 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-x2xqq"] Feb 02 10:49:46 crc kubenswrapper[4909]: I0202 10:49:46.427752 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-x2xqq"] Feb 02 10:49:46 crc kubenswrapper[4909]: I0202 10:49:46.457558 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-xvsxw"] Feb 02 10:49:46 crc kubenswrapper[4909]: I0202 10:49:46.466781 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-xvsxw"] Feb 02 10:49:47 crc kubenswrapper[4909]: I0202 10:49:47.033960 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76" path="/var/lib/kubelet/pods/3e3d62ba-8fec-4dcd-8c7c-878cb8e4de76/volumes" Feb 02 10:49:47 crc kubenswrapper[4909]: I0202 10:49:47.034345 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d567b63-4866-4b76-a00f-c6be088aede4" path="/var/lib/kubelet/pods/7d567b63-4866-4b76-a00f-c6be088aede4/volumes" Feb 02 10:49:47 crc kubenswrapper[4909]: I0202 10:49:47.036277 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 02 10:49:47 crc kubenswrapper[4909]: I0202 10:49:47.361677 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b441d32f-f76f-4e7b-b3fe-40e93b126567","Type":"ContainerStarted","Data":"c73875b9010e1e509b2e9adfe296f5305133f400c8e52a90164f9f8d577e55df"} Feb 
02 10:49:47 crc kubenswrapper[4909]: I0202 10:49:47.364234 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1ab15f72-b249-42d5-8698-273c5afc7758","Type":"ContainerStarted","Data":"529dbc5ded5b168f357c98964ff8ca25a705fa2dcb89e66bab1ae2d9a409b360"} Feb 02 10:49:49 crc kubenswrapper[4909]: I0202 10:49:49.511383 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:49:49 crc kubenswrapper[4909]: I0202 10:49:49.511773 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:49:50 crc kubenswrapper[4909]: W0202 10:49:50.105599 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b658933_f66d_47df_8b75_a42cd55b9bf4.slice/crio-798aff6bed83c271b8c0d250d07fb63f55fe57fd4e6aecfa75d1eb15594d1b45 WatchSource:0}: Error finding container 798aff6bed83c271b8c0d250d07fb63f55fe57fd4e6aecfa75d1eb15594d1b45: Status 404 returned error can't find the container with id 798aff6bed83c271b8c0d250d07fb63f55fe57fd4e6aecfa75d1eb15594d1b45 Feb 02 10:49:50 crc kubenswrapper[4909]: I0202 10:49:50.401279 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7b658933-f66d-47df-8b75-a42cd55b9bf4","Type":"ContainerStarted","Data":"798aff6bed83c271b8c0d250d07fb63f55fe57fd4e6aecfa75d1eb15594d1b45"} Feb 02 10:49:52 crc kubenswrapper[4909]: I0202 10:49:52.419669 4909 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e0af95ff-c608-4b73-92fa-d4a443a9eaaf","Type":"ContainerStarted","Data":"4e8dc723a6916637428b6b4fcd605580da2f3e93fc64c90c5b30d9c96dc9f64d"} Feb 02 10:49:52 crc kubenswrapper[4909]: I0202 10:49:52.423079 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"9dab4432-0762-45a8-88ab-3a99217a790f","Type":"ContainerStarted","Data":"a5be7b718c825caf194dfbbef2df9f7def820d23f84184c159660c56fbb1b591"} Feb 02 10:49:52 crc kubenswrapper[4909]: I0202 10:49:52.423313 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 02 10:49:52 crc kubenswrapper[4909]: I0202 10:49:52.424669 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"68e55a25-f51a-49a9-af91-ffbab9ad611e","Type":"ContainerStarted","Data":"41d92ab63056ddb6217280c35f43928d9c56048d9b17b5873ed61e7a5221dda9"} Feb 02 10:49:52 crc kubenswrapper[4909]: I0202 10:49:52.473898 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=17.801459016 podStartE2EDuration="24.473879691s" podCreationTimestamp="2026-02-02 10:49:28 +0000 UTC" firstStartedPulling="2026-02-02 10:49:45.311520917 +0000 UTC m=+1111.057621642" lastFinishedPulling="2026-02-02 10:49:51.983941582 +0000 UTC m=+1117.730042317" observedRunningTime="2026-02-02 10:49:52.473072458 +0000 UTC m=+1118.219173193" watchObservedRunningTime="2026-02-02 10:49:52.473879691 +0000 UTC m=+1118.219980436" Feb 02 10:49:56 crc kubenswrapper[4909]: I0202 10:49:56.465428 4909 generic.go:334] "Generic (PLEG): container finished" podID="68e55a25-f51a-49a9-af91-ffbab9ad611e" containerID="41d92ab63056ddb6217280c35f43928d9c56048d9b17b5873ed61e7a5221dda9" exitCode=0 Feb 02 10:49:56 crc kubenswrapper[4909]: I0202 10:49:56.465932 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"68e55a25-f51a-49a9-af91-ffbab9ad611e","Type":"ContainerDied","Data":"41d92ab63056ddb6217280c35f43928d9c56048d9b17b5873ed61e7a5221dda9"} Feb 02 10:49:56 crc kubenswrapper[4909]: I0202 10:49:56.471035 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7b658933-f66d-47df-8b75-a42cd55b9bf4","Type":"ContainerStarted","Data":"b05bbe00b26a1b9593023419b5d4a14f74126dc62da00a8e37a5ac60df828f91"} Feb 02 10:49:56 crc kubenswrapper[4909]: I0202 10:49:56.474037 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6b8f1870-afe0-4ac5-a633-e87905ab1d5b","Type":"ContainerStarted","Data":"36feeb53bfb30eccb100ee6f2672b7eaa4b2f848c04254eb7c5e1d09d61b996a"} Feb 02 10:49:56 crc kubenswrapper[4909]: I0202 10:49:56.474368 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 02 10:49:56 crc kubenswrapper[4909]: I0202 10:49:56.485001 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9hkv5" event={"ID":"ef5a90ca-c133-400b-b869-becc0b1f60a0","Type":"ContainerStarted","Data":"9e1decd1c118f6f232ecbb6ee4515785b72d98434f58ddfbfecc9b53e994c7ac"} Feb 02 10:49:56 crc kubenswrapper[4909]: I0202 10:49:56.487426 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qpqvt" event={"ID":"6b8dd51c-207c-4fae-8d5a-7271a159f0ff","Type":"ContainerStarted","Data":"f70f93df07da4945068625ee668078a7ef350e8122cc19259daecb117441bbe3"} Feb 02 10:49:56 crc kubenswrapper[4909]: I0202 10:49:56.488161 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-qpqvt" Feb 02 10:49:56 crc kubenswrapper[4909]: I0202 10:49:56.490222 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"f0a30163-0b42-493b-b775-d88218bd1844","Type":"ContainerStarted","Data":"97f377e1596e6eafef005fa1e1978dccc7634e807d2ca9c148149e41e82b87df"} Feb 02 10:49:56 crc kubenswrapper[4909]: I0202 10:49:56.492999 4909 generic.go:334] "Generic (PLEG): container finished" podID="e0af95ff-c608-4b73-92fa-d4a443a9eaaf" containerID="4e8dc723a6916637428b6b4fcd605580da2f3e93fc64c90c5b30d9c96dc9f64d" exitCode=0 Feb 02 10:49:56 crc kubenswrapper[4909]: I0202 10:49:56.493034 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e0af95ff-c608-4b73-92fa-d4a443a9eaaf","Type":"ContainerDied","Data":"4e8dc723a6916637428b6b4fcd605580da2f3e93fc64c90c5b30d9c96dc9f64d"} Feb 02 10:49:56 crc kubenswrapper[4909]: I0202 10:49:56.512468 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-qpqvt" podStartSLOduration=14.393824873 podStartE2EDuration="23.512451074s" podCreationTimestamp="2026-02-02 10:49:33 +0000 UTC" firstStartedPulling="2026-02-02 10:49:46.205759054 +0000 UTC m=+1111.951859789" lastFinishedPulling="2026-02-02 10:49:55.324385245 +0000 UTC m=+1121.070485990" observedRunningTime="2026-02-02 10:49:56.51125514 +0000 UTC m=+1122.257355875" watchObservedRunningTime="2026-02-02 10:49:56.512451074 +0000 UTC m=+1122.258551809" Feb 02 10:49:56 crc kubenswrapper[4909]: I0202 10:49:56.579672 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=16.461605607 podStartE2EDuration="26.579644711s" podCreationTimestamp="2026-02-02 10:49:30 +0000 UTC" firstStartedPulling="2026-02-02 10:49:45.952955917 +0000 UTC m=+1111.699056652" lastFinishedPulling="2026-02-02 10:49:56.070995021 +0000 UTC m=+1121.817095756" observedRunningTime="2026-02-02 10:49:56.57820938 +0000 UTC m=+1122.324310115" watchObservedRunningTime="2026-02-02 10:49:56.579644711 +0000 UTC m=+1122.325745446" Feb 02 10:49:57 crc kubenswrapper[4909]: 
I0202 10:49:57.505651 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"68e55a25-f51a-49a9-af91-ffbab9ad611e","Type":"ContainerStarted","Data":"a40ccf8ab1f9299d55595a979c250007b22d501fa1ea6728aab9956000c6cce9"} Feb 02 10:49:57 crc kubenswrapper[4909]: I0202 10:49:57.508700 4909 generic.go:334] "Generic (PLEG): container finished" podID="ef5a90ca-c133-400b-b869-becc0b1f60a0" containerID="9e1decd1c118f6f232ecbb6ee4515785b72d98434f58ddfbfecc9b53e994c7ac" exitCode=0 Feb 02 10:49:57 crc kubenswrapper[4909]: I0202 10:49:57.508768 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9hkv5" event={"ID":"ef5a90ca-c133-400b-b869-becc0b1f60a0","Type":"ContainerDied","Data":"9e1decd1c118f6f232ecbb6ee4515785b72d98434f58ddfbfecc9b53e994c7ac"} Feb 02 10:49:57 crc kubenswrapper[4909]: I0202 10:49:57.511959 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e0af95ff-c608-4b73-92fa-d4a443a9eaaf","Type":"ContainerStarted","Data":"f24f9c93513df807a7e645eb277c34de34ac929d0c937520c5a166ce83f93d7c"} Feb 02 10:49:57 crc kubenswrapper[4909]: I0202 10:49:57.529510 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=25.47344801 podStartE2EDuration="31.529490827s" podCreationTimestamp="2026-02-02 10:49:26 +0000 UTC" firstStartedPulling="2026-02-02 10:49:45.946170474 +0000 UTC m=+1111.692271209" lastFinishedPulling="2026-02-02 10:49:52.002213271 +0000 UTC m=+1117.748314026" observedRunningTime="2026-02-02 10:49:57.529107266 +0000 UTC m=+1123.275208011" watchObservedRunningTime="2026-02-02 10:49:57.529490827 +0000 UTC m=+1123.275591562" Feb 02 10:49:57 crc kubenswrapper[4909]: I0202 10:49:57.570819 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=24.614568116 
podStartE2EDuration="30.570782129s" podCreationTimestamp="2026-02-02 10:49:27 +0000 UTC" firstStartedPulling="2026-02-02 10:49:46.102912894 +0000 UTC m=+1111.849013629" lastFinishedPulling="2026-02-02 10:49:52.059126907 +0000 UTC m=+1117.805227642" observedRunningTime="2026-02-02 10:49:57.570128811 +0000 UTC m=+1123.316229566" watchObservedRunningTime="2026-02-02 10:49:57.570782129 +0000 UTC m=+1123.316882864" Feb 02 10:49:58 crc kubenswrapper[4909]: I0202 10:49:58.523142 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9hkv5" event={"ID":"ef5a90ca-c133-400b-b869-becc0b1f60a0","Type":"ContainerStarted","Data":"8f5bb18b7321547b528f5af5b21f281e2f479493379bee20dc5095a16bc7f839"} Feb 02 10:49:58 crc kubenswrapper[4909]: I0202 10:49:58.523496 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9hkv5" event={"ID":"ef5a90ca-c133-400b-b869-becc0b1f60a0","Type":"ContainerStarted","Data":"4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18"} Feb 02 10:49:58 crc kubenswrapper[4909]: I0202 10:49:58.523512 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-9hkv5" Feb 02 10:49:58 crc kubenswrapper[4909]: I0202 10:49:58.526849 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f0a30163-0b42-493b-b775-d88218bd1844","Type":"ContainerStarted","Data":"960092178e862c3b77d8e3bf300b8c16cf49e7db4eb0520f595f9cef3f507a97"} Feb 02 10:49:58 crc kubenswrapper[4909]: I0202 10:49:58.529952 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7b658933-f66d-47df-8b75-a42cd55b9bf4","Type":"ContainerStarted","Data":"d2191dd6198a0fc9ecc9c919260563196adfa2a35bc25ae0ca32279cce086a5f"} Feb 02 10:49:58 crc kubenswrapper[4909]: I0202 10:49:58.545324 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-9hkv5" 
podStartSLOduration=16.542421239 podStartE2EDuration="25.545292644s" podCreationTimestamp="2026-02-02 10:49:33 +0000 UTC" firstStartedPulling="2026-02-02 10:49:46.200762732 +0000 UTC m=+1111.946863467" lastFinishedPulling="2026-02-02 10:49:55.203634137 +0000 UTC m=+1120.949734872" observedRunningTime="2026-02-02 10:49:58.539219842 +0000 UTC m=+1124.285320577" watchObservedRunningTime="2026-02-02 10:49:58.545292644 +0000 UTC m=+1124.291393379" Feb 02 10:49:58 crc kubenswrapper[4909]: I0202 10:49:58.560089 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=11.266366213 podStartE2EDuration="22.560074244s" podCreationTimestamp="2026-02-02 10:49:36 +0000 UTC" firstStartedPulling="2026-02-02 10:49:46.205256189 +0000 UTC m=+1111.951356924" lastFinishedPulling="2026-02-02 10:49:57.49896422 +0000 UTC m=+1123.245064955" observedRunningTime="2026-02-02 10:49:58.557523861 +0000 UTC m=+1124.303624596" watchObservedRunningTime="2026-02-02 10:49:58.560074244 +0000 UTC m=+1124.306174979" Feb 02 10:49:58 crc kubenswrapper[4909]: I0202 10:49:58.580815 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=15.197949938 podStartE2EDuration="22.580788532s" podCreationTimestamp="2026-02-02 10:49:36 +0000 UTC" firstStartedPulling="2026-02-02 10:49:50.10955571 +0000 UTC m=+1115.855656455" lastFinishedPulling="2026-02-02 10:49:57.492394314 +0000 UTC m=+1123.238495049" observedRunningTime="2026-02-02 10:49:58.575657286 +0000 UTC m=+1124.321758021" watchObservedRunningTime="2026-02-02 10:49:58.580788532 +0000 UTC m=+1124.326889267" Feb 02 10:49:58 crc kubenswrapper[4909]: I0202 10:49:58.690054 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 02 10:49:58 crc kubenswrapper[4909]: I0202 10:49:58.752294 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/openstack-cell1-galera-0" Feb 02 10:49:58 crc kubenswrapper[4909]: I0202 10:49:58.752668 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 02 10:49:58 crc kubenswrapper[4909]: I0202 10:49:58.914008 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 02 10:49:59 crc kubenswrapper[4909]: I0202 10:49:59.536451 4909 generic.go:334] "Generic (PLEG): container finished" podID="32baf7b0-1fd7-4303-b78d-56c2a4e29388" containerID="be5ef9c424cc5cfe77bf061f8467db4cf9034bee61bb836d5cc1d7c83db0ecef" exitCode=0 Feb 02 10:49:59 crc kubenswrapper[4909]: I0202 10:49:59.536546 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f54874ffc-4v7wk" event={"ID":"32baf7b0-1fd7-4303-b78d-56c2a4e29388","Type":"ContainerDied","Data":"be5ef9c424cc5cfe77bf061f8467db4cf9034bee61bb836d5cc1d7c83db0ecef"} Feb 02 10:49:59 crc kubenswrapper[4909]: I0202 10:49:59.538508 4909 generic.go:334] "Generic (PLEG): container finished" podID="f4aa36cd-48dd-48cc-87b9-c6c7bf39c535" containerID="fdec11025a3c2428126023022f9b655633949f8a2b68b14af9b30e3f576a270e" exitCode=0 Feb 02 10:49:59 crc kubenswrapper[4909]: I0202 10:49:59.538536 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-8psfb" event={"ID":"f4aa36cd-48dd-48cc-87b9-c6c7bf39c535","Type":"ContainerDied","Data":"fdec11025a3c2428126023022f9b655633949f8a2b68b14af9b30e3f576a270e"} Feb 02 10:49:59 crc kubenswrapper[4909]: I0202 10:49:59.539156 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-9hkv5" Feb 02 10:50:00 crc kubenswrapper[4909]: I0202 10:50:00.561982 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f54874ffc-4v7wk" 
event={"ID":"32baf7b0-1fd7-4303-b78d-56c2a4e29388","Type":"ContainerStarted","Data":"640fd9121fcea840614b6896e8edd26ceb65435a00d40430027bf253ba8d4c7e"} Feb 02 10:50:00 crc kubenswrapper[4909]: I0202 10:50:00.562978 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f54874ffc-4v7wk" Feb 02 10:50:00 crc kubenswrapper[4909]: I0202 10:50:00.568653 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-8psfb" event={"ID":"f4aa36cd-48dd-48cc-87b9-c6c7bf39c535","Type":"ContainerStarted","Data":"852a0e68559d3f3da6ba14cae3117561898088c94b9cd9d66cb953c12f9b6859"} Feb 02 10:50:00 crc kubenswrapper[4909]: I0202 10:50:00.584401 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f54874ffc-4v7wk" podStartSLOduration=-9223372000.270395 podStartE2EDuration="36.584381333s" podCreationTimestamp="2026-02-02 10:49:24 +0000 UTC" firstStartedPulling="2026-02-02 10:49:25.188857817 +0000 UTC m=+1090.934958552" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:50:00.58041496 +0000 UTC m=+1126.326515695" watchObservedRunningTime="2026-02-02 10:50:00.584381333 +0000 UTC m=+1126.330482068" Feb 02 10:50:00 crc kubenswrapper[4909]: I0202 10:50:00.600269 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67ff45466c-8psfb" podStartSLOduration=3.635557881 podStartE2EDuration="36.600251444s" podCreationTimestamp="2026-02-02 10:49:24 +0000 UTC" firstStartedPulling="2026-02-02 10:49:25.752731268 +0000 UTC m=+1091.498832003" lastFinishedPulling="2026-02-02 10:49:58.717424831 +0000 UTC m=+1124.463525566" observedRunningTime="2026-02-02 10:50:00.597358301 +0000 UTC m=+1126.343459036" watchObservedRunningTime="2026-02-02 10:50:00.600251444 +0000 UTC m=+1126.346352179" Feb 02 10:50:00 crc kubenswrapper[4909]: I0202 10:50:00.931601 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-67ff45466c-8psfb"] Feb 02 10:50:00 crc kubenswrapper[4909]: I0202 10:50:00.985723 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7477d666f-288f6"] Feb 02 10:50:00 crc kubenswrapper[4909]: I0202 10:50:00.989774 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7477d666f-288f6" Feb 02 10:50:00 crc kubenswrapper[4909]: I0202 10:50:00.999064 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7477d666f-288f6"] Feb 02 10:50:01 crc kubenswrapper[4909]: I0202 10:50:01.023256 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01e250e0-cd5f-4f1b-ba4b-e707941ffca3-dns-svc\") pod \"dnsmasq-dns-7477d666f-288f6\" (UID: \"01e250e0-cd5f-4f1b-ba4b-e707941ffca3\") " pod="openstack/dnsmasq-dns-7477d666f-288f6" Feb 02 10:50:01 crc kubenswrapper[4909]: I0202 10:50:01.023300 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01e250e0-cd5f-4f1b-ba4b-e707941ffca3-config\") pod \"dnsmasq-dns-7477d666f-288f6\" (UID: \"01e250e0-cd5f-4f1b-ba4b-e707941ffca3\") " pod="openstack/dnsmasq-dns-7477d666f-288f6" Feb 02 10:50:01 crc kubenswrapper[4909]: I0202 10:50:01.023407 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vkr5\" (UniqueName: \"kubernetes.io/projected/01e250e0-cd5f-4f1b-ba4b-e707941ffca3-kube-api-access-4vkr5\") pod \"dnsmasq-dns-7477d666f-288f6\" (UID: \"01e250e0-cd5f-4f1b-ba4b-e707941ffca3\") " pod="openstack/dnsmasq-dns-7477d666f-288f6" Feb 02 10:50:01 crc kubenswrapper[4909]: I0202 10:50:01.124905 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vkr5\" (UniqueName: 
\"kubernetes.io/projected/01e250e0-cd5f-4f1b-ba4b-e707941ffca3-kube-api-access-4vkr5\") pod \"dnsmasq-dns-7477d666f-288f6\" (UID: \"01e250e0-cd5f-4f1b-ba4b-e707941ffca3\") " pod="openstack/dnsmasq-dns-7477d666f-288f6" Feb 02 10:50:01 crc kubenswrapper[4909]: I0202 10:50:01.125028 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01e250e0-cd5f-4f1b-ba4b-e707941ffca3-dns-svc\") pod \"dnsmasq-dns-7477d666f-288f6\" (UID: \"01e250e0-cd5f-4f1b-ba4b-e707941ffca3\") " pod="openstack/dnsmasq-dns-7477d666f-288f6" Feb 02 10:50:01 crc kubenswrapper[4909]: I0202 10:50:01.125054 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01e250e0-cd5f-4f1b-ba4b-e707941ffca3-config\") pod \"dnsmasq-dns-7477d666f-288f6\" (UID: \"01e250e0-cd5f-4f1b-ba4b-e707941ffca3\") " pod="openstack/dnsmasq-dns-7477d666f-288f6" Feb 02 10:50:01 crc kubenswrapper[4909]: I0202 10:50:01.126260 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01e250e0-cd5f-4f1b-ba4b-e707941ffca3-config\") pod \"dnsmasq-dns-7477d666f-288f6\" (UID: \"01e250e0-cd5f-4f1b-ba4b-e707941ffca3\") " pod="openstack/dnsmasq-dns-7477d666f-288f6" Feb 02 10:50:01 crc kubenswrapper[4909]: I0202 10:50:01.126377 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01e250e0-cd5f-4f1b-ba4b-e707941ffca3-dns-svc\") pod \"dnsmasq-dns-7477d666f-288f6\" (UID: \"01e250e0-cd5f-4f1b-ba4b-e707941ffca3\") " pod="openstack/dnsmasq-dns-7477d666f-288f6" Feb 02 10:50:01 crc kubenswrapper[4909]: I0202 10:50:01.148933 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vkr5\" (UniqueName: \"kubernetes.io/projected/01e250e0-cd5f-4f1b-ba4b-e707941ffca3-kube-api-access-4vkr5\") pod \"dnsmasq-dns-7477d666f-288f6\" (UID: 
\"01e250e0-cd5f-4f1b-ba4b-e707941ffca3\") " pod="openstack/dnsmasq-dns-7477d666f-288f6" Feb 02 10:50:01 crc kubenswrapper[4909]: I0202 10:50:01.225412 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 02 10:50:01 crc kubenswrapper[4909]: I0202 10:50:01.292776 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 02 10:50:01 crc kubenswrapper[4909]: I0202 10:50:01.315257 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7477d666f-288f6" Feb 02 10:50:01 crc kubenswrapper[4909]: I0202 10:50:01.492206 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 02 10:50:01 crc kubenswrapper[4909]: I0202 10:50:01.547141 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 02 10:50:01 crc kubenswrapper[4909]: I0202 10:50:01.592149 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 02 10:50:01 crc kubenswrapper[4909]: I0202 10:50:01.592287 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67ff45466c-8psfb" podUID="f4aa36cd-48dd-48cc-87b9-c6c7bf39c535" containerName="dnsmasq-dns" containerID="cri-o://852a0e68559d3f3da6ba14cae3117561898088c94b9cd9d66cb953c12f9b6859" gracePeriod=10 Feb 02 10:50:01 crc kubenswrapper[4909]: I0202 10:50:01.592485 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67ff45466c-8psfb" Feb 02 10:50:01 crc kubenswrapper[4909]: I0202 10:50:01.639070 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 02 10:50:01 crc kubenswrapper[4909]: I0202 10:50:01.747557 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/ovsdbserver-sb-0" Feb 02 10:50:01 crc kubenswrapper[4909]: I0202 10:50:01.748135 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 02 10:50:01 crc kubenswrapper[4909]: I0202 10:50:01.794470 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 02 10:50:01 crc kubenswrapper[4909]: I0202 10:50:01.868112 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-4v7wk"] Feb 02 10:50:01 crc kubenswrapper[4909]: I0202 10:50:01.873664 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7477d666f-288f6"] Feb 02 10:50:01 crc kubenswrapper[4909]: I0202 10:50:01.917123 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77fc48cd97-hgklb"] Feb 02 10:50:01 crc kubenswrapper[4909]: I0202 10:50:01.918271 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77fc48cd97-hgklb" Feb 02 10:50:01 crc kubenswrapper[4909]: I0202 10:50:01.929329 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 02 10:50:01 crc kubenswrapper[4909]: I0202 10:50:01.944748 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77fc48cd97-hgklb"] Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.010036 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-pzcjh"] Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.014450 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-pzcjh" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.018326 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.021674 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-pzcjh"] Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.041143 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr89f\" (UniqueName: \"kubernetes.io/projected/c24055ff-2124-4f7d-9560-6b2effc2ba4d-kube-api-access-tr89f\") pod \"dnsmasq-dns-77fc48cd97-hgklb\" (UID: \"c24055ff-2124-4f7d-9560-6b2effc2ba4d\") " pod="openstack/dnsmasq-dns-77fc48cd97-hgklb" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.041257 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c24055ff-2124-4f7d-9560-6b2effc2ba4d-ovsdbserver-nb\") pod \"dnsmasq-dns-77fc48cd97-hgklb\" (UID: \"c24055ff-2124-4f7d-9560-6b2effc2ba4d\") " pod="openstack/dnsmasq-dns-77fc48cd97-hgklb" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.041328 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c24055ff-2124-4f7d-9560-6b2effc2ba4d-config\") pod \"dnsmasq-dns-77fc48cd97-hgklb\" (UID: \"c24055ff-2124-4f7d-9560-6b2effc2ba4d\") " pod="openstack/dnsmasq-dns-77fc48cd97-hgklb" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.041447 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c24055ff-2124-4f7d-9560-6b2effc2ba4d-dns-svc\") pod \"dnsmasq-dns-77fc48cd97-hgklb\" (UID: \"c24055ff-2124-4f7d-9560-6b2effc2ba4d\") " 
pod="openstack/dnsmasq-dns-77fc48cd97-hgklb" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.106681 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.111965 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.115071 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.115274 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.115430 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-pdpw7" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.115548 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.125823 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-8psfb" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.126720 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.143398 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c24055ff-2124-4f7d-9560-6b2effc2ba4d-ovsdbserver-nb\") pod \"dnsmasq-dns-77fc48cd97-hgklb\" (UID: \"c24055ff-2124-4f7d-9560-6b2effc2ba4d\") " pod="openstack/dnsmasq-dns-77fc48cd97-hgklb" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.143448 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm4ws\" (UniqueName: \"kubernetes.io/projected/c631ddad-ab3a-488f-9947-2f3385fd912c-kube-api-access-qm4ws\") pod \"ovn-controller-metrics-pzcjh\" (UID: \"c631ddad-ab3a-488f-9947-2f3385fd912c\") " pod="openstack/ovn-controller-metrics-pzcjh" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.143521 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c631ddad-ab3a-488f-9947-2f3385fd912c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-pzcjh\" (UID: \"c631ddad-ab3a-488f-9947-2f3385fd912c\") " pod="openstack/ovn-controller-metrics-pzcjh" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.143549 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c631ddad-ab3a-488f-9947-2f3385fd912c-combined-ca-bundle\") pod \"ovn-controller-metrics-pzcjh\" (UID: \"c631ddad-ab3a-488f-9947-2f3385fd912c\") " pod="openstack/ovn-controller-metrics-pzcjh" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.143578 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c24055ff-2124-4f7d-9560-6b2effc2ba4d-config\") pod \"dnsmasq-dns-77fc48cd97-hgklb\" (UID: \"c24055ff-2124-4f7d-9560-6b2effc2ba4d\") " pod="openstack/dnsmasq-dns-77fc48cd97-hgklb" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.143609 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c631ddad-ab3a-488f-9947-2f3385fd912c-ovn-rundir\") pod \"ovn-controller-metrics-pzcjh\" (UID: \"c631ddad-ab3a-488f-9947-2f3385fd912c\") " pod="openstack/ovn-controller-metrics-pzcjh" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.143737 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c631ddad-ab3a-488f-9947-2f3385fd912c-config\") pod \"ovn-controller-metrics-pzcjh\" (UID: \"c631ddad-ab3a-488f-9947-2f3385fd912c\") " pod="openstack/ovn-controller-metrics-pzcjh" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.143839 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c631ddad-ab3a-488f-9947-2f3385fd912c-ovs-rundir\") pod \"ovn-controller-metrics-pzcjh\" (UID: \"c631ddad-ab3a-488f-9947-2f3385fd912c\") " pod="openstack/ovn-controller-metrics-pzcjh" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.143932 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c24055ff-2124-4f7d-9560-6b2effc2ba4d-dns-svc\") pod \"dnsmasq-dns-77fc48cd97-hgklb\" (UID: \"c24055ff-2124-4f7d-9560-6b2effc2ba4d\") " pod="openstack/dnsmasq-dns-77fc48cd97-hgklb" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.143995 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr89f\" (UniqueName: 
\"kubernetes.io/projected/c24055ff-2124-4f7d-9560-6b2effc2ba4d-kube-api-access-tr89f\") pod \"dnsmasq-dns-77fc48cd97-hgklb\" (UID: \"c24055ff-2124-4f7d-9560-6b2effc2ba4d\") " pod="openstack/dnsmasq-dns-77fc48cd97-hgklb" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.144878 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c24055ff-2124-4f7d-9560-6b2effc2ba4d-config\") pod \"dnsmasq-dns-77fc48cd97-hgklb\" (UID: \"c24055ff-2124-4f7d-9560-6b2effc2ba4d\") " pod="openstack/dnsmasq-dns-77fc48cd97-hgklb" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.145265 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c24055ff-2124-4f7d-9560-6b2effc2ba4d-ovsdbserver-nb\") pod \"dnsmasq-dns-77fc48cd97-hgklb\" (UID: \"c24055ff-2124-4f7d-9560-6b2effc2ba4d\") " pod="openstack/dnsmasq-dns-77fc48cd97-hgklb" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.151532 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c24055ff-2124-4f7d-9560-6b2effc2ba4d-dns-svc\") pod \"dnsmasq-dns-77fc48cd97-hgklb\" (UID: \"c24055ff-2124-4f7d-9560-6b2effc2ba4d\") " pod="openstack/dnsmasq-dns-77fc48cd97-hgklb" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.188685 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr89f\" (UniqueName: \"kubernetes.io/projected/c24055ff-2124-4f7d-9560-6b2effc2ba4d-kube-api-access-tr89f\") pod \"dnsmasq-dns-77fc48cd97-hgklb\" (UID: \"c24055ff-2124-4f7d-9560-6b2effc2ba4d\") " pod="openstack/dnsmasq-dns-77fc48cd97-hgklb" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.207920 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7477d666f-288f6"] Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.218921 4909 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/ovn-northd-0"] Feb 02 10:50:02 crc kubenswrapper[4909]: E0202 10:50:02.226520 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4aa36cd-48dd-48cc-87b9-c6c7bf39c535" containerName="init" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.226535 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4aa36cd-48dd-48cc-87b9-c6c7bf39c535" containerName="init" Feb 02 10:50:02 crc kubenswrapper[4909]: E0202 10:50:02.226581 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4aa36cd-48dd-48cc-87b9-c6c7bf39c535" containerName="dnsmasq-dns" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.226594 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4aa36cd-48dd-48cc-87b9-c6c7bf39c535" containerName="dnsmasq-dns" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.227578 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4aa36cd-48dd-48cc-87b9-c6c7bf39c535" containerName="dnsmasq-dns" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.244921 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69rsq\" (UniqueName: \"kubernetes.io/projected/f4aa36cd-48dd-48cc-87b9-c6c7bf39c535-kube-api-access-69rsq\") pod \"f4aa36cd-48dd-48cc-87b9-c6c7bf39c535\" (UID: \"f4aa36cd-48dd-48cc-87b9-c6c7bf39c535\") " Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.245312 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4aa36cd-48dd-48cc-87b9-c6c7bf39c535-dns-svc\") pod \"f4aa36cd-48dd-48cc-87b9-c6c7bf39c535\" (UID: \"f4aa36cd-48dd-48cc-87b9-c6c7bf39c535\") " Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.245479 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4aa36cd-48dd-48cc-87b9-c6c7bf39c535-config\") pod 
\"f4aa36cd-48dd-48cc-87b9-c6c7bf39c535\" (UID: \"f4aa36cd-48dd-48cc-87b9-c6c7bf39c535\") " Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.245925 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c631ddad-ab3a-488f-9947-2f3385fd912c-ovn-rundir\") pod \"ovn-controller-metrics-pzcjh\" (UID: \"c631ddad-ab3a-488f-9947-2f3385fd912c\") " pod="openstack/ovn-controller-metrics-pzcjh" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.246006 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c631ddad-ab3a-488f-9947-2f3385fd912c-config\") pod \"ovn-controller-metrics-pzcjh\" (UID: \"c631ddad-ab3a-488f-9947-2f3385fd912c\") " pod="openstack/ovn-controller-metrics-pzcjh" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.246095 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c631ddad-ab3a-488f-9947-2f3385fd912c-ovs-rundir\") pod \"ovn-controller-metrics-pzcjh\" (UID: \"c631ddad-ab3a-488f-9947-2f3385fd912c\") " pod="openstack/ovn-controller-metrics-pzcjh" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.246194 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7c400ec0-7faf-4151-b34e-ee28044b89e7-lock\") pod \"swift-storage-0\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " pod="openstack/swift-storage-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.246305 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c400ec0-7faf-4151-b34e-ee28044b89e7-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " pod="openstack/swift-storage-0" Feb 02 10:50:02 crc 
kubenswrapper[4909]: I0202 10:50:02.246406 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7mxn\" (UniqueName: \"kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-kube-api-access-v7mxn\") pod \"swift-storage-0\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " pod="openstack/swift-storage-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.246485 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-etc-swift\") pod \"swift-storage-0\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " pod="openstack/swift-storage-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.246575 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " pod="openstack/swift-storage-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.247073 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm4ws\" (UniqueName: \"kubernetes.io/projected/c631ddad-ab3a-488f-9947-2f3385fd912c-kube-api-access-qm4ws\") pod \"ovn-controller-metrics-pzcjh\" (UID: \"c631ddad-ab3a-488f-9947-2f3385fd912c\") " pod="openstack/ovn-controller-metrics-pzcjh" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.247169 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7c400ec0-7faf-4151-b34e-ee28044b89e7-cache\") pod \"swift-storage-0\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " pod="openstack/swift-storage-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.247277 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c631ddad-ab3a-488f-9947-2f3385fd912c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-pzcjh\" (UID: \"c631ddad-ab3a-488f-9947-2f3385fd912c\") " pod="openstack/ovn-controller-metrics-pzcjh" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.247329 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c631ddad-ab3a-488f-9947-2f3385fd912c-combined-ca-bundle\") pod \"ovn-controller-metrics-pzcjh\" (UID: \"c631ddad-ab3a-488f-9947-2f3385fd912c\") " pod="openstack/ovn-controller-metrics-pzcjh" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.248903 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c631ddad-ab3a-488f-9947-2f3385fd912c-ovs-rundir\") pod \"ovn-controller-metrics-pzcjh\" (UID: \"c631ddad-ab3a-488f-9947-2f3385fd912c\") " pod="openstack/ovn-controller-metrics-pzcjh" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.250324 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c631ddad-ab3a-488f-9947-2f3385fd912c-ovn-rundir\") pod \"ovn-controller-metrics-pzcjh\" (UID: \"c631ddad-ab3a-488f-9947-2f3385fd912c\") " pod="openstack/ovn-controller-metrics-pzcjh" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.251749 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4aa36cd-48dd-48cc-87b9-c6c7bf39c535-kube-api-access-69rsq" (OuterVolumeSpecName: "kube-api-access-69rsq") pod "f4aa36cd-48dd-48cc-87b9-c6c7bf39c535" (UID: "f4aa36cd-48dd-48cc-87b9-c6c7bf39c535"). InnerVolumeSpecName "kube-api-access-69rsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.251791 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77fc48cd97-hgklb" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.257381 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.259224 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c631ddad-ab3a-488f-9947-2f3385fd912c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-pzcjh\" (UID: \"c631ddad-ab3a-488f-9947-2f3385fd912c\") " pod="openstack/ovn-controller-metrics-pzcjh" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.259728 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c631ddad-ab3a-488f-9947-2f3385fd912c-config\") pod \"ovn-controller-metrics-pzcjh\" (UID: \"c631ddad-ab3a-488f-9947-2f3385fd912c\") " pod="openstack/ovn-controller-metrics-pzcjh" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.260743 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-4zsd5" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.268572 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.268869 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.269308 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.277644 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm4ws\" (UniqueName: \"kubernetes.io/projected/c631ddad-ab3a-488f-9947-2f3385fd912c-kube-api-access-qm4ws\") pod \"ovn-controller-metrics-pzcjh\" (UID: 
\"c631ddad-ab3a-488f-9947-2f3385fd912c\") " pod="openstack/ovn-controller-metrics-pzcjh" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.282351 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c631ddad-ab3a-488f-9947-2f3385fd912c-combined-ca-bundle\") pod \"ovn-controller-metrics-pzcjh\" (UID: \"c631ddad-ab3a-488f-9947-2f3385fd912c\") " pod="openstack/ovn-controller-metrics-pzcjh" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.293143 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4aa36cd-48dd-48cc-87b9-c6c7bf39c535-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f4aa36cd-48dd-48cc-87b9-c6c7bf39c535" (UID: "f4aa36cd-48dd-48cc-87b9-c6c7bf39c535"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.304235 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66b577f8c-wx74v"] Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.305893 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66b577f8c-wx74v" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.307803 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.313968 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.320655 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66b577f8c-wx74v"] Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.349046 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7c400ec0-7faf-4151-b34e-ee28044b89e7-lock\") pod \"swift-storage-0\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " pod="openstack/swift-storage-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.349182 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c400ec0-7faf-4151-b34e-ee28044b89e7-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " pod="openstack/swift-storage-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.349212 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7mxn\" (UniqueName: \"kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-kube-api-access-v7mxn\") pod \"swift-storage-0\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " pod="openstack/swift-storage-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.349235 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-etc-swift\") pod \"swift-storage-0\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " pod="openstack/swift-storage-0" Feb 02 10:50:02 
crc kubenswrapper[4909]: I0202 10:50:02.349271 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " pod="openstack/swift-storage-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.349299 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87203850-864b-4fff-b340-25e4f5c6e7c9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"87203850-864b-4fff-b340-25e4f5c6e7c9\") " pod="openstack/ovn-northd-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.349412 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7c400ec0-7faf-4151-b34e-ee28044b89e7-cache\") pod \"swift-storage-0\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " pod="openstack/swift-storage-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.349436 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/87203850-864b-4fff-b340-25e4f5c6e7c9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"87203850-864b-4fff-b340-25e4f5c6e7c9\") " pod="openstack/ovn-northd-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.349457 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87203850-864b-4fff-b340-25e4f5c6e7c9-config\") pod \"ovn-northd-0\" (UID: \"87203850-864b-4fff-b340-25e4f5c6e7c9\") " pod="openstack/ovn-northd-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.349476 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/87203850-864b-4fff-b340-25e4f5c6e7c9-scripts\") pod \"ovn-northd-0\" (UID: \"87203850-864b-4fff-b340-25e4f5c6e7c9\") " pod="openstack/ovn-northd-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.349514 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/87203850-864b-4fff-b340-25e4f5c6e7c9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"87203850-864b-4fff-b340-25e4f5c6e7c9\") " pod="openstack/ovn-northd-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.349531 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m82mq\" (UniqueName: \"kubernetes.io/projected/87203850-864b-4fff-b340-25e4f5c6e7c9-kube-api-access-m82mq\") pod \"ovn-northd-0\" (UID: \"87203850-864b-4fff-b340-25e4f5c6e7c9\") " pod="openstack/ovn-northd-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.349553 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/87203850-864b-4fff-b340-25e4f5c6e7c9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"87203850-864b-4fff-b340-25e4f5c6e7c9\") " pod="openstack/ovn-northd-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.349590 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4aa36cd-48dd-48cc-87b9-c6c7bf39c535-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.349603 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69rsq\" (UniqueName: \"kubernetes.io/projected/f4aa36cd-48dd-48cc-87b9-c6c7bf39c535-kube-api-access-69rsq\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:02 crc kubenswrapper[4909]: E0202 10:50:02.349721 4909 projected.go:288] Couldn't get configMap 
openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 02 10:50:02 crc kubenswrapper[4909]: E0202 10:50:02.349733 4909 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 02 10:50:02 crc kubenswrapper[4909]: E0202 10:50:02.349771 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-etc-swift podName:7c400ec0-7faf-4151-b34e-ee28044b89e7 nodeName:}" failed. No retries permitted until 2026-02-02 10:50:02.84975485 +0000 UTC m=+1128.595855575 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-etc-swift") pod "swift-storage-0" (UID: "7c400ec0-7faf-4151-b34e-ee28044b89e7") : configmap "swift-ring-files" not found Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.350106 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7c400ec0-7faf-4151-b34e-ee28044b89e7-lock\") pod \"swift-storage-0\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " pod="openstack/swift-storage-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.350218 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/swift-storage-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.350589 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7c400ec0-7faf-4151-b34e-ee28044b89e7-cache\") pod \"swift-storage-0\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " pod="openstack/swift-storage-0" Feb 02 10:50:02 crc kubenswrapper[4909]: 
I0202 10:50:02.353237 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4aa36cd-48dd-48cc-87b9-c6c7bf39c535-config" (OuterVolumeSpecName: "config") pod "f4aa36cd-48dd-48cc-87b9-c6c7bf39c535" (UID: "f4aa36cd-48dd-48cc-87b9-c6c7bf39c535"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.357340 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c400ec0-7faf-4151-b34e-ee28044b89e7-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " pod="openstack/swift-storage-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.369695 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7mxn\" (UniqueName: \"kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-kube-api-access-v7mxn\") pod \"swift-storage-0\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " pod="openstack/swift-storage-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.379857 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " pod="openstack/swift-storage-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.416575 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-pzcjh" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.451065 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/87203850-864b-4fff-b340-25e4f5c6e7c9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"87203850-864b-4fff-b340-25e4f5c6e7c9\") " pod="openstack/ovn-northd-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.451108 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87203850-864b-4fff-b340-25e4f5c6e7c9-config\") pod \"ovn-northd-0\" (UID: \"87203850-864b-4fff-b340-25e4f5c6e7c9\") " pod="openstack/ovn-northd-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.451129 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87203850-864b-4fff-b340-25e4f5c6e7c9-scripts\") pod \"ovn-northd-0\" (UID: \"87203850-864b-4fff-b340-25e4f5c6e7c9\") " pod="openstack/ovn-northd-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.451169 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/87203850-864b-4fff-b340-25e4f5c6e7c9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"87203850-864b-4fff-b340-25e4f5c6e7c9\") " pod="openstack/ovn-northd-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.451185 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m82mq\" (UniqueName: \"kubernetes.io/projected/87203850-864b-4fff-b340-25e4f5c6e7c9-kube-api-access-m82mq\") pod \"ovn-northd-0\" (UID: \"87203850-864b-4fff-b340-25e4f5c6e7c9\") " pod="openstack/ovn-northd-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.451212 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/87203850-864b-4fff-b340-25e4f5c6e7c9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"87203850-864b-4fff-b340-25e4f5c6e7c9\") " pod="openstack/ovn-northd-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.451263 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04683aef-1e09-400c-a20d-29d191926c20-config\") pod \"dnsmasq-dns-66b577f8c-wx74v\" (UID: \"04683aef-1e09-400c-a20d-29d191926c20\") " pod="openstack/dnsmasq-dns-66b577f8c-wx74v" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.451292 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrt4p\" (UniqueName: \"kubernetes.io/projected/04683aef-1e09-400c-a20d-29d191926c20-kube-api-access-rrt4p\") pod \"dnsmasq-dns-66b577f8c-wx74v\" (UID: \"04683aef-1e09-400c-a20d-29d191926c20\") " pod="openstack/dnsmasq-dns-66b577f8c-wx74v" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.451313 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04683aef-1e09-400c-a20d-29d191926c20-ovsdbserver-nb\") pod \"dnsmasq-dns-66b577f8c-wx74v\" (UID: \"04683aef-1e09-400c-a20d-29d191926c20\") " pod="openstack/dnsmasq-dns-66b577f8c-wx74v" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.451327 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04683aef-1e09-400c-a20d-29d191926c20-ovsdbserver-sb\") pod \"dnsmasq-dns-66b577f8c-wx74v\" (UID: \"04683aef-1e09-400c-a20d-29d191926c20\") " pod="openstack/dnsmasq-dns-66b577f8c-wx74v" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.451385 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04683aef-1e09-400c-a20d-29d191926c20-dns-svc\") pod \"dnsmasq-dns-66b577f8c-wx74v\" (UID: \"04683aef-1e09-400c-a20d-29d191926c20\") " pod="openstack/dnsmasq-dns-66b577f8c-wx74v" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.451405 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87203850-864b-4fff-b340-25e4f5c6e7c9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"87203850-864b-4fff-b340-25e4f5c6e7c9\") " pod="openstack/ovn-northd-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.451448 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4aa36cd-48dd-48cc-87b9-c6c7bf39c535-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.451681 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/87203850-864b-4fff-b340-25e4f5c6e7c9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"87203850-864b-4fff-b340-25e4f5c6e7c9\") " pod="openstack/ovn-northd-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.453313 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87203850-864b-4fff-b340-25e4f5c6e7c9-config\") pod \"ovn-northd-0\" (UID: \"87203850-864b-4fff-b340-25e4f5c6e7c9\") " pod="openstack/ovn-northd-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.453534 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87203850-864b-4fff-b340-25e4f5c6e7c9-scripts\") pod \"ovn-northd-0\" (UID: \"87203850-864b-4fff-b340-25e4f5c6e7c9\") " pod="openstack/ovn-northd-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.456476 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87203850-864b-4fff-b340-25e4f5c6e7c9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"87203850-864b-4fff-b340-25e4f5c6e7c9\") " pod="openstack/ovn-northd-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.459102 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/87203850-864b-4fff-b340-25e4f5c6e7c9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"87203850-864b-4fff-b340-25e4f5c6e7c9\") " pod="openstack/ovn-northd-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.468030 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/87203850-864b-4fff-b340-25e4f5c6e7c9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"87203850-864b-4fff-b340-25e4f5c6e7c9\") " pod="openstack/ovn-northd-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.469631 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m82mq\" (UniqueName: \"kubernetes.io/projected/87203850-864b-4fff-b340-25e4f5c6e7c9-kube-api-access-m82mq\") pod \"ovn-northd-0\" (UID: \"87203850-864b-4fff-b340-25e4f5c6e7c9\") " pod="openstack/ovn-northd-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.553155 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04683aef-1e09-400c-a20d-29d191926c20-config\") pod \"dnsmasq-dns-66b577f8c-wx74v\" (UID: \"04683aef-1e09-400c-a20d-29d191926c20\") " pod="openstack/dnsmasq-dns-66b577f8c-wx74v" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.553219 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrt4p\" (UniqueName: \"kubernetes.io/projected/04683aef-1e09-400c-a20d-29d191926c20-kube-api-access-rrt4p\") pod \"dnsmasq-dns-66b577f8c-wx74v\" (UID: 
\"04683aef-1e09-400c-a20d-29d191926c20\") " pod="openstack/dnsmasq-dns-66b577f8c-wx74v" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.553252 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04683aef-1e09-400c-a20d-29d191926c20-ovsdbserver-nb\") pod \"dnsmasq-dns-66b577f8c-wx74v\" (UID: \"04683aef-1e09-400c-a20d-29d191926c20\") " pod="openstack/dnsmasq-dns-66b577f8c-wx74v" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.553273 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04683aef-1e09-400c-a20d-29d191926c20-ovsdbserver-sb\") pod \"dnsmasq-dns-66b577f8c-wx74v\" (UID: \"04683aef-1e09-400c-a20d-29d191926c20\") " pod="openstack/dnsmasq-dns-66b577f8c-wx74v" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.553346 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04683aef-1e09-400c-a20d-29d191926c20-dns-svc\") pod \"dnsmasq-dns-66b577f8c-wx74v\" (UID: \"04683aef-1e09-400c-a20d-29d191926c20\") " pod="openstack/dnsmasq-dns-66b577f8c-wx74v" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.554732 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04683aef-1e09-400c-a20d-29d191926c20-ovsdbserver-nb\") pod \"dnsmasq-dns-66b577f8c-wx74v\" (UID: \"04683aef-1e09-400c-a20d-29d191926c20\") " pod="openstack/dnsmasq-dns-66b577f8c-wx74v" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.556731 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04683aef-1e09-400c-a20d-29d191926c20-config\") pod \"dnsmasq-dns-66b577f8c-wx74v\" (UID: \"04683aef-1e09-400c-a20d-29d191926c20\") " pod="openstack/dnsmasq-dns-66b577f8c-wx74v" Feb 02 10:50:02 crc 
kubenswrapper[4909]: I0202 10:50:02.560314 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04683aef-1e09-400c-a20d-29d191926c20-dns-svc\") pod \"dnsmasq-dns-66b577f8c-wx74v\" (UID: \"04683aef-1e09-400c-a20d-29d191926c20\") " pod="openstack/dnsmasq-dns-66b577f8c-wx74v" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.561177 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04683aef-1e09-400c-a20d-29d191926c20-ovsdbserver-sb\") pod \"dnsmasq-dns-66b577f8c-wx74v\" (UID: \"04683aef-1e09-400c-a20d-29d191926c20\") " pod="openstack/dnsmasq-dns-66b577f8c-wx74v" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.571711 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrt4p\" (UniqueName: \"kubernetes.io/projected/04683aef-1e09-400c-a20d-29d191926c20-kube-api-access-rrt4p\") pod \"dnsmasq-dns-66b577f8c-wx74v\" (UID: \"04683aef-1e09-400c-a20d-29d191926c20\") " pod="openstack/dnsmasq-dns-66b577f8c-wx74v" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.598899 4909 generic.go:334] "Generic (PLEG): container finished" podID="f4aa36cd-48dd-48cc-87b9-c6c7bf39c535" containerID="852a0e68559d3f3da6ba14cae3117561898088c94b9cd9d66cb953c12f9b6859" exitCode=0 Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.598961 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-8psfb" event={"ID":"f4aa36cd-48dd-48cc-87b9-c6c7bf39c535","Type":"ContainerDied","Data":"852a0e68559d3f3da6ba14cae3117561898088c94b9cd9d66cb953c12f9b6859"} Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.598988 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-8psfb" event={"ID":"f4aa36cd-48dd-48cc-87b9-c6c7bf39c535","Type":"ContainerDied","Data":"8dd2a1c8a5aad984dded24285b57d01d5ad91dafb638535846811e562bbf624d"} 
Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.599005 4909 scope.go:117] "RemoveContainer" containerID="852a0e68559d3f3da6ba14cae3117561898088c94b9cd9d66cb953c12f9b6859" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.599135 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-8psfb" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.604001 4909 generic.go:334] "Generic (PLEG): container finished" podID="01e250e0-cd5f-4f1b-ba4b-e707941ffca3" containerID="39700dcc4070c782b00e073b8169f41113a5bb5226302e61220ded416f376be5" exitCode=0 Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.604066 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7477d666f-288f6" event={"ID":"01e250e0-cd5f-4f1b-ba4b-e707941ffca3","Type":"ContainerDied","Data":"39700dcc4070c782b00e073b8169f41113a5bb5226302e61220ded416f376be5"} Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.604118 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7477d666f-288f6" event={"ID":"01e250e0-cd5f-4f1b-ba4b-e707941ffca3","Type":"ContainerStarted","Data":"fed651e97c27b7c1dbe90603f762f915589379f837bba716702180ca2e6844cc"} Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.604186 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f54874ffc-4v7wk" podUID="32baf7b0-1fd7-4303-b78d-56c2a4e29388" containerName="dnsmasq-dns" containerID="cri-o://640fd9121fcea840614b6896e8edd26ceb65435a00d40430027bf253ba8d4c7e" gracePeriod=10 Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.631779 4909 scope.go:117] "RemoveContainer" containerID="fdec11025a3c2428126023022f9b655633949f8a2b68b14af9b30e3f576a270e" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.654727 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-8psfb"] Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 
10:50:02.669077 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-8psfb"] Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.708342 4909 scope.go:117] "RemoveContainer" containerID="852a0e68559d3f3da6ba14cae3117561898088c94b9cd9d66cb953c12f9b6859" Feb 02 10:50:02 crc kubenswrapper[4909]: E0202 10:50:02.708761 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"852a0e68559d3f3da6ba14cae3117561898088c94b9cd9d66cb953c12f9b6859\": container with ID starting with 852a0e68559d3f3da6ba14cae3117561898088c94b9cd9d66cb953c12f9b6859 not found: ID does not exist" containerID="852a0e68559d3f3da6ba14cae3117561898088c94b9cd9d66cb953c12f9b6859" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.708791 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"852a0e68559d3f3da6ba14cae3117561898088c94b9cd9d66cb953c12f9b6859"} err="failed to get container status \"852a0e68559d3f3da6ba14cae3117561898088c94b9cd9d66cb953c12f9b6859\": rpc error: code = NotFound desc = could not find container \"852a0e68559d3f3da6ba14cae3117561898088c94b9cd9d66cb953c12f9b6859\": container with ID starting with 852a0e68559d3f3da6ba14cae3117561898088c94b9cd9d66cb953c12f9b6859 not found: ID does not exist" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.708827 4909 scope.go:117] "RemoveContainer" containerID="fdec11025a3c2428126023022f9b655633949f8a2b68b14af9b30e3f576a270e" Feb 02 10:50:02 crc kubenswrapper[4909]: E0202 10:50:02.709064 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdec11025a3c2428126023022f9b655633949f8a2b68b14af9b30e3f576a270e\": container with ID starting with fdec11025a3c2428126023022f9b655633949f8a2b68b14af9b30e3f576a270e not found: ID does not exist" containerID="fdec11025a3c2428126023022f9b655633949f8a2b68b14af9b30e3f576a270e" 
Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.709092 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdec11025a3c2428126023022f9b655633949f8a2b68b14af9b30e3f576a270e"} err="failed to get container status \"fdec11025a3c2428126023022f9b655633949f8a2b68b14af9b30e3f576a270e\": rpc error: code = NotFound desc = could not find container \"fdec11025a3c2428126023022f9b655633949f8a2b68b14af9b30e3f576a270e\": container with ID starting with fdec11025a3c2428126023022f9b655633949f8a2b68b14af9b30e3f576a270e not found: ID does not exist" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.722897 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.731206 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66b577f8c-wx74v" Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.749208 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77fc48cd97-hgklb"] Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.858616 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-etc-swift\") pod \"swift-storage-0\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " pod="openstack/swift-storage-0" Feb 02 10:50:02 crc kubenswrapper[4909]: E0202 10:50:02.858847 4909 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 02 10:50:02 crc kubenswrapper[4909]: E0202 10:50:02.858869 4909 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 02 10:50:02 crc kubenswrapper[4909]: E0202 10:50:02.858913 4909 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-etc-swift podName:7c400ec0-7faf-4151-b34e-ee28044b89e7 nodeName:}" failed. No retries permitted until 2026-02-02 10:50:03.858899324 +0000 UTC m=+1129.605000049 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-etc-swift") pod "swift-storage-0" (UID: "7c400ec0-7faf-4151-b34e-ee28044b89e7") : configmap "swift-ring-files" not found Feb 02 10:50:02 crc kubenswrapper[4909]: I0202 10:50:02.886038 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-pzcjh"] Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.007067 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7477d666f-288f6" Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.033987 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4aa36cd-48dd-48cc-87b9-c6c7bf39c535" path="/var/lib/kubelet/pods/f4aa36cd-48dd-48cc-87b9-c6c7bf39c535/volumes" Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.061280 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01e250e0-cd5f-4f1b-ba4b-e707941ffca3-dns-svc\") pod \"01e250e0-cd5f-4f1b-ba4b-e707941ffca3\" (UID: \"01e250e0-cd5f-4f1b-ba4b-e707941ffca3\") " Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.061356 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01e250e0-cd5f-4f1b-ba4b-e707941ffca3-config\") pod \"01e250e0-cd5f-4f1b-ba4b-e707941ffca3\" (UID: \"01e250e0-cd5f-4f1b-ba4b-e707941ffca3\") " Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.061431 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vkr5\" (UniqueName: 
\"kubernetes.io/projected/01e250e0-cd5f-4f1b-ba4b-e707941ffca3-kube-api-access-4vkr5\") pod \"01e250e0-cd5f-4f1b-ba4b-e707941ffca3\" (UID: \"01e250e0-cd5f-4f1b-ba4b-e707941ffca3\") " Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.069627 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01e250e0-cd5f-4f1b-ba4b-e707941ffca3-kube-api-access-4vkr5" (OuterVolumeSpecName: "kube-api-access-4vkr5") pod "01e250e0-cd5f-4f1b-ba4b-e707941ffca3" (UID: "01e250e0-cd5f-4f1b-ba4b-e707941ffca3"). InnerVolumeSpecName "kube-api-access-4vkr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.093492 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01e250e0-cd5f-4f1b-ba4b-e707941ffca3-config" (OuterVolumeSpecName: "config") pod "01e250e0-cd5f-4f1b-ba4b-e707941ffca3" (UID: "01e250e0-cd5f-4f1b-ba4b-e707941ffca3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.094085 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01e250e0-cd5f-4f1b-ba4b-e707941ffca3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "01e250e0-cd5f-4f1b-ba4b-e707941ffca3" (UID: "01e250e0-cd5f-4f1b-ba4b-e707941ffca3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.136031 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-4v7wk" Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.168695 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/01e250e0-cd5f-4f1b-ba4b-e707941ffca3-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.168737 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01e250e0-cd5f-4f1b-ba4b-e707941ffca3-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.168755 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vkr5\" (UniqueName: \"kubernetes.io/projected/01e250e0-cd5f-4f1b-ba4b-e707941ffca3-kube-api-access-4vkr5\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.270271 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32baf7b0-1fd7-4303-b78d-56c2a4e29388-config\") pod \"32baf7b0-1fd7-4303-b78d-56c2a4e29388\" (UID: \"32baf7b0-1fd7-4303-b78d-56c2a4e29388\") " Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.270891 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32baf7b0-1fd7-4303-b78d-56c2a4e29388-dns-svc\") pod \"32baf7b0-1fd7-4303-b78d-56c2a4e29388\" (UID: \"32baf7b0-1fd7-4303-b78d-56c2a4e29388\") " Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.270922 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9r9c\" (UniqueName: \"kubernetes.io/projected/32baf7b0-1fd7-4303-b78d-56c2a4e29388-kube-api-access-q9r9c\") pod \"32baf7b0-1fd7-4303-b78d-56c2a4e29388\" (UID: \"32baf7b0-1fd7-4303-b78d-56c2a4e29388\") " Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.292757 4909 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32baf7b0-1fd7-4303-b78d-56c2a4e29388-kube-api-access-q9r9c" (OuterVolumeSpecName: "kube-api-access-q9r9c") pod "32baf7b0-1fd7-4303-b78d-56c2a4e29388" (UID: "32baf7b0-1fd7-4303-b78d-56c2a4e29388"). InnerVolumeSpecName "kube-api-access-q9r9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.312175 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32baf7b0-1fd7-4303-b78d-56c2a4e29388-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "32baf7b0-1fd7-4303-b78d-56c2a4e29388" (UID: "32baf7b0-1fd7-4303-b78d-56c2a4e29388"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.324310 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32baf7b0-1fd7-4303-b78d-56c2a4e29388-config" (OuterVolumeSpecName: "config") pod "32baf7b0-1fd7-4303-b78d-56c2a4e29388" (UID: "32baf7b0-1fd7-4303-b78d-56c2a4e29388"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.362123 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 02 10:50:03 crc kubenswrapper[4909]: W0202 10:50:03.362370 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87203850_864b_4fff_b340_25e4f5c6e7c9.slice/crio-e2fc144e387bccba8e52d0a45c65dc1984de3c4a640c768871b3c7e5c07b8b0f WatchSource:0}: Error finding container e2fc144e387bccba8e52d0a45c65dc1984de3c4a640c768871b3c7e5c07b8b0f: Status 404 returned error can't find the container with id e2fc144e387bccba8e52d0a45c65dc1984de3c4a640c768871b3c7e5c07b8b0f Feb 02 10:50:03 crc kubenswrapper[4909]: W0202 10:50:03.365334 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04683aef_1e09_400c_a20d_29d191926c20.slice/crio-f11ee355dd066308d1141cf0b520ed327e211b3f3bc1f990a60a3b1563ae92c2 WatchSource:0}: Error finding container f11ee355dd066308d1141cf0b520ed327e211b3f3bc1f990a60a3b1563ae92c2: Status 404 returned error can't find the container with id f11ee355dd066308d1141cf0b520ed327e211b3f3bc1f990a60a3b1563ae92c2 Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.368403 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66b577f8c-wx74v"] Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.372916 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9r9c\" (UniqueName: \"kubernetes.io/projected/32baf7b0-1fd7-4303-b78d-56c2a4e29388-kube-api-access-q9r9c\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.372945 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32baf7b0-1fd7-4303-b78d-56c2a4e29388-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:03 
crc kubenswrapper[4909]: I0202 10:50:03.372957 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32baf7b0-1fd7-4303-b78d-56c2a4e29388-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.612537 4909 generic.go:334] "Generic (PLEG): container finished" podID="04683aef-1e09-400c-a20d-29d191926c20" containerID="c4e6780362f55fd4568dbf235d90624d27eef7161ae83eef3eab04d7d32f586b" exitCode=0 Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.612599 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b577f8c-wx74v" event={"ID":"04683aef-1e09-400c-a20d-29d191926c20","Type":"ContainerDied","Data":"c4e6780362f55fd4568dbf235d90624d27eef7161ae83eef3eab04d7d32f586b"} Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.612626 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b577f8c-wx74v" event={"ID":"04683aef-1e09-400c-a20d-29d191926c20","Type":"ContainerStarted","Data":"f11ee355dd066308d1141cf0b520ed327e211b3f3bc1f990a60a3b1563ae92c2"} Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.618492 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-pzcjh" event={"ID":"c631ddad-ab3a-488f-9947-2f3385fd912c","Type":"ContainerStarted","Data":"e616e96154cc54a51e3703d10ef0dafd77cf698728a725fab58c048715a29bb9"} Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.618548 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-pzcjh" event={"ID":"c631ddad-ab3a-488f-9947-2f3385fd912c","Type":"ContainerStarted","Data":"54baae7c7b75f13bae33d983f2d0ddee0aa4501c4a00d2a1c1cf3c2c25652c45"} Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.627234 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7477d666f-288f6" 
event={"ID":"01e250e0-cd5f-4f1b-ba4b-e707941ffca3","Type":"ContainerDied","Data":"fed651e97c27b7c1dbe90603f762f915589379f837bba716702180ca2e6844cc"} Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.627286 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7477d666f-288f6" Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.627307 4909 scope.go:117] "RemoveContainer" containerID="39700dcc4070c782b00e073b8169f41113a5bb5226302e61220ded416f376be5" Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.636502 4909 generic.go:334] "Generic (PLEG): container finished" podID="32baf7b0-1fd7-4303-b78d-56c2a4e29388" containerID="640fd9121fcea840614b6896e8edd26ceb65435a00d40430027bf253ba8d4c7e" exitCode=0 Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.636631 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-4v7wk" Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.636646 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f54874ffc-4v7wk" event={"ID":"32baf7b0-1fd7-4303-b78d-56c2a4e29388","Type":"ContainerDied","Data":"640fd9121fcea840614b6896e8edd26ceb65435a00d40430027bf253ba8d4c7e"} Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.637010 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f54874ffc-4v7wk" event={"ID":"32baf7b0-1fd7-4303-b78d-56c2a4e29388","Type":"ContainerDied","Data":"94fa809a2ab739d4573ed96346e9c1028445a7598dbababbaa84c9d1c01397f5"} Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.649112 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"87203850-864b-4fff-b340-25e4f5c6e7c9","Type":"ContainerStarted","Data":"e2fc144e387bccba8e52d0a45c65dc1984de3c4a640c768871b3c7e5c07b8b0f"} Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.651472 4909 generic.go:334] "Generic (PLEG): container 
finished" podID="c24055ff-2124-4f7d-9560-6b2effc2ba4d" containerID="acc571dc1b0a84dd719974aaa298b63367c51caae2991325dd8b261e55564916" exitCode=0 Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.651610 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77fc48cd97-hgklb" event={"ID":"c24055ff-2124-4f7d-9560-6b2effc2ba4d","Type":"ContainerDied","Data":"acc571dc1b0a84dd719974aaa298b63367c51caae2991325dd8b261e55564916"} Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.651662 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77fc48cd97-hgklb" event={"ID":"c24055ff-2124-4f7d-9560-6b2effc2ba4d","Type":"ContainerStarted","Data":"a2005f33bee72f872f34f8cd37c317b54513edff862fbac86b54ba28207e42a2"} Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.668557 4909 scope.go:117] "RemoveContainer" containerID="640fd9121fcea840614b6896e8edd26ceb65435a00d40430027bf253ba8d4c7e" Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.671900 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-pzcjh" podStartSLOduration=2.671801422 podStartE2EDuration="2.671801422s" podCreationTimestamp="2026-02-02 10:50:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:50:03.654544472 +0000 UTC m=+1129.400645207" watchObservedRunningTime="2026-02-02 10:50:03.671801422 +0000 UTC m=+1129.417902167" Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.697524 4909 scope.go:117] "RemoveContainer" containerID="be5ef9c424cc5cfe77bf061f8467db4cf9034bee61bb836d5cc1d7c83db0ecef" Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.764418 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7477d666f-288f6"] Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.771922 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-7477d666f-288f6"] Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.776984 4909 scope.go:117] "RemoveContainer" containerID="640fd9121fcea840614b6896e8edd26ceb65435a00d40430027bf253ba8d4c7e" Feb 02 10:50:03 crc kubenswrapper[4909]: E0202 10:50:03.777733 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"640fd9121fcea840614b6896e8edd26ceb65435a00d40430027bf253ba8d4c7e\": container with ID starting with 640fd9121fcea840614b6896e8edd26ceb65435a00d40430027bf253ba8d4c7e not found: ID does not exist" containerID="640fd9121fcea840614b6896e8edd26ceb65435a00d40430027bf253ba8d4c7e" Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.777770 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"640fd9121fcea840614b6896e8edd26ceb65435a00d40430027bf253ba8d4c7e"} err="failed to get container status \"640fd9121fcea840614b6896e8edd26ceb65435a00d40430027bf253ba8d4c7e\": rpc error: code = NotFound desc = could not find container \"640fd9121fcea840614b6896e8edd26ceb65435a00d40430027bf253ba8d4c7e\": container with ID starting with 640fd9121fcea840614b6896e8edd26ceb65435a00d40430027bf253ba8d4c7e not found: ID does not exist" Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.777792 4909 scope.go:117] "RemoveContainer" containerID="be5ef9c424cc5cfe77bf061f8467db4cf9034bee61bb836d5cc1d7c83db0ecef" Feb 02 10:50:03 crc kubenswrapper[4909]: E0202 10:50:03.778028 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be5ef9c424cc5cfe77bf061f8467db4cf9034bee61bb836d5cc1d7c83db0ecef\": container with ID starting with be5ef9c424cc5cfe77bf061f8467db4cf9034bee61bb836d5cc1d7c83db0ecef not found: ID does not exist" containerID="be5ef9c424cc5cfe77bf061f8467db4cf9034bee61bb836d5cc1d7c83db0ecef" Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.778048 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be5ef9c424cc5cfe77bf061f8467db4cf9034bee61bb836d5cc1d7c83db0ecef"} err="failed to get container status \"be5ef9c424cc5cfe77bf061f8467db4cf9034bee61bb836d5cc1d7c83db0ecef\": rpc error: code = NotFound desc = could not find container \"be5ef9c424cc5cfe77bf061f8467db4cf9034bee61bb836d5cc1d7c83db0ecef\": container with ID starting with be5ef9c424cc5cfe77bf061f8467db4cf9034bee61bb836d5cc1d7c83db0ecef not found: ID does not exist" Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.780846 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-4v7wk"] Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.789634 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-4v7wk"] Feb 02 10:50:03 crc kubenswrapper[4909]: I0202 10:50:03.886930 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-etc-swift\") pod \"swift-storage-0\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " pod="openstack/swift-storage-0" Feb 02 10:50:03 crc kubenswrapper[4909]: E0202 10:50:03.887131 4909 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 02 10:50:03 crc kubenswrapper[4909]: E0202 10:50:03.887159 4909 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 02 10:50:03 crc kubenswrapper[4909]: E0202 10:50:03.887224 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-etc-swift podName:7c400ec0-7faf-4151-b34e-ee28044b89e7 nodeName:}" failed. No retries permitted until 2026-02-02 10:50:05.887205788 +0000 UTC m=+1131.633306523 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-etc-swift") pod "swift-storage-0" (UID: "7c400ec0-7faf-4151-b34e-ee28044b89e7") : configmap "swift-ring-files" not found Feb 02 10:50:04 crc kubenswrapper[4909]: I0202 10:50:04.663972 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"87203850-864b-4fff-b340-25e4f5c6e7c9","Type":"ContainerStarted","Data":"52e5a0b83ebdf2c32308bcba8bf2bcd22b63c5893664ccffa254ee3d4272e8b7"} Feb 02 10:50:04 crc kubenswrapper[4909]: I0202 10:50:04.667134 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77fc48cd97-hgklb" event={"ID":"c24055ff-2124-4f7d-9560-6b2effc2ba4d","Type":"ContainerStarted","Data":"a78064dd1831ceaadebaa99ba081255731751d336b266c3bf5383225a49616cf"} Feb 02 10:50:04 crc kubenswrapper[4909]: I0202 10:50:04.667242 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77fc48cd97-hgklb" Feb 02 10:50:04 crc kubenswrapper[4909]: I0202 10:50:04.670452 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b577f8c-wx74v" event={"ID":"04683aef-1e09-400c-a20d-29d191926c20","Type":"ContainerStarted","Data":"e93409660b5a9d436b70952c8f04f7832ca2759777d66ca326bb6d3fc52d1471"} Feb 02 10:50:04 crc kubenswrapper[4909]: I0202 10:50:04.670566 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66b577f8c-wx74v" Feb 02 10:50:04 crc kubenswrapper[4909]: I0202 10:50:04.694038 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77fc48cd97-hgklb" podStartSLOduration=3.694016493 podStartE2EDuration="3.694016493s" podCreationTimestamp="2026-02-02 10:50:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:50:04.686261592 +0000 UTC 
m=+1130.432362327" watchObservedRunningTime="2026-02-02 10:50:04.694016493 +0000 UTC m=+1130.440117228" Feb 02 10:50:04 crc kubenswrapper[4909]: I0202 10:50:04.714291 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66b577f8c-wx74v" podStartSLOduration=2.714272418 podStartE2EDuration="2.714272418s" podCreationTimestamp="2026-02-02 10:50:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:50:04.707530576 +0000 UTC m=+1130.453631311" watchObservedRunningTime="2026-02-02 10:50:04.714272418 +0000 UTC m=+1130.460373153" Feb 02 10:50:05 crc kubenswrapper[4909]: I0202 10:50:05.028704 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01e250e0-cd5f-4f1b-ba4b-e707941ffca3" path="/var/lib/kubelet/pods/01e250e0-cd5f-4f1b-ba4b-e707941ffca3/volumes" Feb 02 10:50:05 crc kubenswrapper[4909]: I0202 10:50:05.029353 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32baf7b0-1fd7-4303-b78d-56c2a4e29388" path="/var/lib/kubelet/pods/32baf7b0-1fd7-4303-b78d-56c2a4e29388/volumes" Feb 02 10:50:05 crc kubenswrapper[4909]: I0202 10:50:05.680053 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"87203850-864b-4fff-b340-25e4f5c6e7c9","Type":"ContainerStarted","Data":"c74c134d5ea53fc5d68abe3fe06e4cfbcfa8770404a1e48f3df9ace3dc76800d"} Feb 02 10:50:05 crc kubenswrapper[4909]: I0202 10:50:05.680610 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 02 10:50:05 crc kubenswrapper[4909]: I0202 10:50:05.708991 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.6858370689999997 podStartE2EDuration="3.708958475s" podCreationTimestamp="2026-02-02 10:50:02 +0000 UTC" firstStartedPulling="2026-02-02 10:50:03.368714438 +0000 UTC 
m=+1129.114815173" lastFinishedPulling="2026-02-02 10:50:04.391835844 +0000 UTC m=+1130.137936579" observedRunningTime="2026-02-02 10:50:05.700857615 +0000 UTC m=+1131.446958390" watchObservedRunningTime="2026-02-02 10:50:05.708958475 +0000 UTC m=+1131.455059210" Feb 02 10:50:05 crc kubenswrapper[4909]: I0202 10:50:05.929768 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-etc-swift\") pod \"swift-storage-0\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " pod="openstack/swift-storage-0" Feb 02 10:50:05 crc kubenswrapper[4909]: E0202 10:50:05.929959 4909 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 02 10:50:05 crc kubenswrapper[4909]: E0202 10:50:05.930169 4909 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 02 10:50:05 crc kubenswrapper[4909]: E0202 10:50:05.930265 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-etc-swift podName:7c400ec0-7faf-4151-b34e-ee28044b89e7 nodeName:}" failed. No retries permitted until 2026-02-02 10:50:09.930247328 +0000 UTC m=+1135.676348063 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-etc-swift") pod "swift-storage-0" (UID: "7c400ec0-7faf-4151-b34e-ee28044b89e7") : configmap "swift-ring-files" not found Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.142385 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-9l5qz"] Feb 02 10:50:06 crc kubenswrapper[4909]: E0202 10:50:06.142727 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01e250e0-cd5f-4f1b-ba4b-e707941ffca3" containerName="init" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.142744 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="01e250e0-cd5f-4f1b-ba4b-e707941ffca3" containerName="init" Feb 02 10:50:06 crc kubenswrapper[4909]: E0202 10:50:06.142764 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32baf7b0-1fd7-4303-b78d-56c2a4e29388" containerName="dnsmasq-dns" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.142771 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="32baf7b0-1fd7-4303-b78d-56c2a4e29388" containerName="dnsmasq-dns" Feb 02 10:50:06 crc kubenswrapper[4909]: E0202 10:50:06.142788 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32baf7b0-1fd7-4303-b78d-56c2a4e29388" containerName="init" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.142796 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="32baf7b0-1fd7-4303-b78d-56c2a4e29388" containerName="init" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.143029 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="01e250e0-cd5f-4f1b-ba4b-e707941ffca3" containerName="init" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.143056 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="32baf7b0-1fd7-4303-b78d-56c2a4e29388" containerName="dnsmasq-dns" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 
10:50:06.143572 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-9l5qz" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.147012 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.147195 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.147350 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.156068 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-9l5qz"] Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.164024 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-9l5qz"] Feb 02 10:50:06 crc kubenswrapper[4909]: E0202 10:50:06.164635 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-bthlp ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-bthlp ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-9l5qz" podUID="af1f4d80-90d2-42c0-81dd-cde854a72f01" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.189295 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-vxm4q"] Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.190349 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vxm4q" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.205339 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vxm4q"] Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.235169 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq9g4\" (UniqueName: \"kubernetes.io/projected/bf7b0419-b32f-44ab-b35d-eb06765be89d-kube-api-access-mq9g4\") pod \"swift-ring-rebalance-vxm4q\" (UID: \"bf7b0419-b32f-44ab-b35d-eb06765be89d\") " pod="openstack/swift-ring-rebalance-vxm4q" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.235221 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/af1f4d80-90d2-42c0-81dd-cde854a72f01-ring-data-devices\") pod \"swift-ring-rebalance-9l5qz\" (UID: \"af1f4d80-90d2-42c0-81dd-cde854a72f01\") " pod="openstack/swift-ring-rebalance-9l5qz" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.235279 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bthlp\" (UniqueName: \"kubernetes.io/projected/af1f4d80-90d2-42c0-81dd-cde854a72f01-kube-api-access-bthlp\") pod \"swift-ring-rebalance-9l5qz\" (UID: \"af1f4d80-90d2-42c0-81dd-cde854a72f01\") " pod="openstack/swift-ring-rebalance-9l5qz" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.235305 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af1f4d80-90d2-42c0-81dd-cde854a72f01-combined-ca-bundle\") pod \"swift-ring-rebalance-9l5qz\" (UID: \"af1f4d80-90d2-42c0-81dd-cde854a72f01\") " pod="openstack/swift-ring-rebalance-9l5qz" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.235355 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bf7b0419-b32f-44ab-b35d-eb06765be89d-etc-swift\") pod \"swift-ring-rebalance-vxm4q\" (UID: \"bf7b0419-b32f-44ab-b35d-eb06765be89d\") " pod="openstack/swift-ring-rebalance-vxm4q" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.235377 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bf7b0419-b32f-44ab-b35d-eb06765be89d-dispersionconf\") pod \"swift-ring-rebalance-vxm4q\" (UID: \"bf7b0419-b32f-44ab-b35d-eb06765be89d\") " pod="openstack/swift-ring-rebalance-vxm4q" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.235427 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bf7b0419-b32f-44ab-b35d-eb06765be89d-swiftconf\") pod \"swift-ring-rebalance-vxm4q\" (UID: \"bf7b0419-b32f-44ab-b35d-eb06765be89d\") " pod="openstack/swift-ring-rebalance-vxm4q" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.235454 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf7b0419-b32f-44ab-b35d-eb06765be89d-scripts\") pod \"swift-ring-rebalance-vxm4q\" (UID: \"bf7b0419-b32f-44ab-b35d-eb06765be89d\") " pod="openstack/swift-ring-rebalance-vxm4q" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.235472 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/af1f4d80-90d2-42c0-81dd-cde854a72f01-swiftconf\") pod \"swift-ring-rebalance-9l5qz\" (UID: \"af1f4d80-90d2-42c0-81dd-cde854a72f01\") " pod="openstack/swift-ring-rebalance-9l5qz" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.235490 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af1f4d80-90d2-42c0-81dd-cde854a72f01-scripts\") pod \"swift-ring-rebalance-9l5qz\" (UID: \"af1f4d80-90d2-42c0-81dd-cde854a72f01\") " pod="openstack/swift-ring-rebalance-9l5qz" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.235592 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf7b0419-b32f-44ab-b35d-eb06765be89d-combined-ca-bundle\") pod \"swift-ring-rebalance-vxm4q\" (UID: \"bf7b0419-b32f-44ab-b35d-eb06765be89d\") " pod="openstack/swift-ring-rebalance-vxm4q" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.235637 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/af1f4d80-90d2-42c0-81dd-cde854a72f01-etc-swift\") pod \"swift-ring-rebalance-9l5qz\" (UID: \"af1f4d80-90d2-42c0-81dd-cde854a72f01\") " pod="openstack/swift-ring-rebalance-9l5qz" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.235659 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bf7b0419-b32f-44ab-b35d-eb06765be89d-ring-data-devices\") pod \"swift-ring-rebalance-vxm4q\" (UID: \"bf7b0419-b32f-44ab-b35d-eb06765be89d\") " pod="openstack/swift-ring-rebalance-vxm4q" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.235694 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/af1f4d80-90d2-42c0-81dd-cde854a72f01-dispersionconf\") pod \"swift-ring-rebalance-9l5qz\" (UID: \"af1f4d80-90d2-42c0-81dd-cde854a72f01\") " pod="openstack/swift-ring-rebalance-9l5qz" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.337484 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/af1f4d80-90d2-42c0-81dd-cde854a72f01-swiftconf\") pod \"swift-ring-rebalance-9l5qz\" (UID: \"af1f4d80-90d2-42c0-81dd-cde854a72f01\") " pod="openstack/swift-ring-rebalance-9l5qz" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.337540 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af1f4d80-90d2-42c0-81dd-cde854a72f01-scripts\") pod \"swift-ring-rebalance-9l5qz\" (UID: \"af1f4d80-90d2-42c0-81dd-cde854a72f01\") " pod="openstack/swift-ring-rebalance-9l5qz" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.337594 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf7b0419-b32f-44ab-b35d-eb06765be89d-combined-ca-bundle\") pod \"swift-ring-rebalance-vxm4q\" (UID: \"bf7b0419-b32f-44ab-b35d-eb06765be89d\") " pod="openstack/swift-ring-rebalance-vxm4q" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.337614 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/af1f4d80-90d2-42c0-81dd-cde854a72f01-etc-swift\") pod \"swift-ring-rebalance-9l5qz\" (UID: \"af1f4d80-90d2-42c0-81dd-cde854a72f01\") " pod="openstack/swift-ring-rebalance-9l5qz" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.337632 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bf7b0419-b32f-44ab-b35d-eb06765be89d-ring-data-devices\") pod \"swift-ring-rebalance-vxm4q\" (UID: \"bf7b0419-b32f-44ab-b35d-eb06765be89d\") " pod="openstack/swift-ring-rebalance-vxm4q" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.337663 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" 
(UniqueName: \"kubernetes.io/secret/af1f4d80-90d2-42c0-81dd-cde854a72f01-dispersionconf\") pod \"swift-ring-rebalance-9l5qz\" (UID: \"af1f4d80-90d2-42c0-81dd-cde854a72f01\") " pod="openstack/swift-ring-rebalance-9l5qz" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.337688 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq9g4\" (UniqueName: \"kubernetes.io/projected/bf7b0419-b32f-44ab-b35d-eb06765be89d-kube-api-access-mq9g4\") pod \"swift-ring-rebalance-vxm4q\" (UID: \"bf7b0419-b32f-44ab-b35d-eb06765be89d\") " pod="openstack/swift-ring-rebalance-vxm4q" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.337711 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/af1f4d80-90d2-42c0-81dd-cde854a72f01-ring-data-devices\") pod \"swift-ring-rebalance-9l5qz\" (UID: \"af1f4d80-90d2-42c0-81dd-cde854a72f01\") " pod="openstack/swift-ring-rebalance-9l5qz" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.337750 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bthlp\" (UniqueName: \"kubernetes.io/projected/af1f4d80-90d2-42c0-81dd-cde854a72f01-kube-api-access-bthlp\") pod \"swift-ring-rebalance-9l5qz\" (UID: \"af1f4d80-90d2-42c0-81dd-cde854a72f01\") " pod="openstack/swift-ring-rebalance-9l5qz" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.337771 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af1f4d80-90d2-42c0-81dd-cde854a72f01-combined-ca-bundle\") pod \"swift-ring-rebalance-9l5qz\" (UID: \"af1f4d80-90d2-42c0-81dd-cde854a72f01\") " pod="openstack/swift-ring-rebalance-9l5qz" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.337799 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/bf7b0419-b32f-44ab-b35d-eb06765be89d-etc-swift\") pod \"swift-ring-rebalance-vxm4q\" (UID: \"bf7b0419-b32f-44ab-b35d-eb06765be89d\") " pod="openstack/swift-ring-rebalance-vxm4q" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.337832 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bf7b0419-b32f-44ab-b35d-eb06765be89d-dispersionconf\") pod \"swift-ring-rebalance-vxm4q\" (UID: \"bf7b0419-b32f-44ab-b35d-eb06765be89d\") " pod="openstack/swift-ring-rebalance-vxm4q" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.337853 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bf7b0419-b32f-44ab-b35d-eb06765be89d-swiftconf\") pod \"swift-ring-rebalance-vxm4q\" (UID: \"bf7b0419-b32f-44ab-b35d-eb06765be89d\") " pod="openstack/swift-ring-rebalance-vxm4q" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.337869 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf7b0419-b32f-44ab-b35d-eb06765be89d-scripts\") pod \"swift-ring-rebalance-vxm4q\" (UID: \"bf7b0419-b32f-44ab-b35d-eb06765be89d\") " pod="openstack/swift-ring-rebalance-vxm4q" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.338779 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf7b0419-b32f-44ab-b35d-eb06765be89d-scripts\") pod \"swift-ring-rebalance-vxm4q\" (UID: \"bf7b0419-b32f-44ab-b35d-eb06765be89d\") " pod="openstack/swift-ring-rebalance-vxm4q" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.338799 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bf7b0419-b32f-44ab-b35d-eb06765be89d-etc-swift\") pod \"swift-ring-rebalance-vxm4q\" (UID: 
\"bf7b0419-b32f-44ab-b35d-eb06765be89d\") " pod="openstack/swift-ring-rebalance-vxm4q" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.338947 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/af1f4d80-90d2-42c0-81dd-cde854a72f01-ring-data-devices\") pod \"swift-ring-rebalance-9l5qz\" (UID: \"af1f4d80-90d2-42c0-81dd-cde854a72f01\") " pod="openstack/swift-ring-rebalance-9l5qz" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.339046 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af1f4d80-90d2-42c0-81dd-cde854a72f01-scripts\") pod \"swift-ring-rebalance-9l5qz\" (UID: \"af1f4d80-90d2-42c0-81dd-cde854a72f01\") " pod="openstack/swift-ring-rebalance-9l5qz" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.339218 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/af1f4d80-90d2-42c0-81dd-cde854a72f01-etc-swift\") pod \"swift-ring-rebalance-9l5qz\" (UID: \"af1f4d80-90d2-42c0-81dd-cde854a72f01\") " pod="openstack/swift-ring-rebalance-9l5qz" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.339864 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bf7b0419-b32f-44ab-b35d-eb06765be89d-ring-data-devices\") pod \"swift-ring-rebalance-vxm4q\" (UID: \"bf7b0419-b32f-44ab-b35d-eb06765be89d\") " pod="openstack/swift-ring-rebalance-vxm4q" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.344304 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/af1f4d80-90d2-42c0-81dd-cde854a72f01-swiftconf\") pod \"swift-ring-rebalance-9l5qz\" (UID: \"af1f4d80-90d2-42c0-81dd-cde854a72f01\") " pod="openstack/swift-ring-rebalance-9l5qz" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 
10:50:06.344472 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/af1f4d80-90d2-42c0-81dd-cde854a72f01-dispersionconf\") pod \"swift-ring-rebalance-9l5qz\" (UID: \"af1f4d80-90d2-42c0-81dd-cde854a72f01\") " pod="openstack/swift-ring-rebalance-9l5qz" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.344876 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf7b0419-b32f-44ab-b35d-eb06765be89d-combined-ca-bundle\") pod \"swift-ring-rebalance-vxm4q\" (UID: \"bf7b0419-b32f-44ab-b35d-eb06765be89d\") " pod="openstack/swift-ring-rebalance-vxm4q" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.345542 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af1f4d80-90d2-42c0-81dd-cde854a72f01-combined-ca-bundle\") pod \"swift-ring-rebalance-9l5qz\" (UID: \"af1f4d80-90d2-42c0-81dd-cde854a72f01\") " pod="openstack/swift-ring-rebalance-9l5qz" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.350251 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bf7b0419-b32f-44ab-b35d-eb06765be89d-swiftconf\") pod \"swift-ring-rebalance-vxm4q\" (UID: \"bf7b0419-b32f-44ab-b35d-eb06765be89d\") " pod="openstack/swift-ring-rebalance-vxm4q" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.350563 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bf7b0419-b32f-44ab-b35d-eb06765be89d-dispersionconf\") pod \"swift-ring-rebalance-vxm4q\" (UID: \"bf7b0419-b32f-44ab-b35d-eb06765be89d\") " pod="openstack/swift-ring-rebalance-vxm4q" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.356768 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bthlp\" 
(UniqueName: \"kubernetes.io/projected/af1f4d80-90d2-42c0-81dd-cde854a72f01-kube-api-access-bthlp\") pod \"swift-ring-rebalance-9l5qz\" (UID: \"af1f4d80-90d2-42c0-81dd-cde854a72f01\") " pod="openstack/swift-ring-rebalance-9l5qz" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.359157 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq9g4\" (UniqueName: \"kubernetes.io/projected/bf7b0419-b32f-44ab-b35d-eb06765be89d-kube-api-access-mq9g4\") pod \"swift-ring-rebalance-vxm4q\" (UID: \"bf7b0419-b32f-44ab-b35d-eb06765be89d\") " pod="openstack/swift-ring-rebalance-vxm4q" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.510115 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vxm4q" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.686220 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-9l5qz" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.703647 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-9l5qz" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.743460 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af1f4d80-90d2-42c0-81dd-cde854a72f01-combined-ca-bundle\") pod \"af1f4d80-90d2-42c0-81dd-cde854a72f01\" (UID: \"af1f4d80-90d2-42c0-81dd-cde854a72f01\") " Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.743524 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/af1f4d80-90d2-42c0-81dd-cde854a72f01-ring-data-devices\") pod \"af1f4d80-90d2-42c0-81dd-cde854a72f01\" (UID: \"af1f4d80-90d2-42c0-81dd-cde854a72f01\") " Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.743547 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/af1f4d80-90d2-42c0-81dd-cde854a72f01-etc-swift\") pod \"af1f4d80-90d2-42c0-81dd-cde854a72f01\" (UID: \"af1f4d80-90d2-42c0-81dd-cde854a72f01\") " Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.743615 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af1f4d80-90d2-42c0-81dd-cde854a72f01-scripts\") pod \"af1f4d80-90d2-42c0-81dd-cde854a72f01\" (UID: \"af1f4d80-90d2-42c0-81dd-cde854a72f01\") " Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.743674 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/af1f4d80-90d2-42c0-81dd-cde854a72f01-swiftconf\") pod \"af1f4d80-90d2-42c0-81dd-cde854a72f01\" (UID: \"af1f4d80-90d2-42c0-81dd-cde854a72f01\") " Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.743696 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/af1f4d80-90d2-42c0-81dd-cde854a72f01-dispersionconf\") pod \"af1f4d80-90d2-42c0-81dd-cde854a72f01\" (UID: \"af1f4d80-90d2-42c0-81dd-cde854a72f01\") " Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.743719 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bthlp\" (UniqueName: \"kubernetes.io/projected/af1f4d80-90d2-42c0-81dd-cde854a72f01-kube-api-access-bthlp\") pod \"af1f4d80-90d2-42c0-81dd-cde854a72f01\" (UID: \"af1f4d80-90d2-42c0-81dd-cde854a72f01\") " Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.744052 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af1f4d80-90d2-42c0-81dd-cde854a72f01-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "af1f4d80-90d2-42c0-81dd-cde854a72f01" (UID: "af1f4d80-90d2-42c0-81dd-cde854a72f01"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.744386 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af1f4d80-90d2-42c0-81dd-cde854a72f01-scripts" (OuterVolumeSpecName: "scripts") pod "af1f4d80-90d2-42c0-81dd-cde854a72f01" (UID: "af1f4d80-90d2-42c0-81dd-cde854a72f01"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.744401 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af1f4d80-90d2-42c0-81dd-cde854a72f01-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "af1f4d80-90d2-42c0-81dd-cde854a72f01" (UID: "af1f4d80-90d2-42c0-81dd-cde854a72f01"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.744685 4909 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/af1f4d80-90d2-42c0-81dd-cde854a72f01-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.748751 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af1f4d80-90d2-42c0-81dd-cde854a72f01-kube-api-access-bthlp" (OuterVolumeSpecName: "kube-api-access-bthlp") pod "af1f4d80-90d2-42c0-81dd-cde854a72f01" (UID: "af1f4d80-90d2-42c0-81dd-cde854a72f01"). InnerVolumeSpecName "kube-api-access-bthlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.749280 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af1f4d80-90d2-42c0-81dd-cde854a72f01-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "af1f4d80-90d2-42c0-81dd-cde854a72f01" (UID: "af1f4d80-90d2-42c0-81dd-cde854a72f01"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.751016 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af1f4d80-90d2-42c0-81dd-cde854a72f01-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "af1f4d80-90d2-42c0-81dd-cde854a72f01" (UID: "af1f4d80-90d2-42c0-81dd-cde854a72f01"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.753263 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af1f4d80-90d2-42c0-81dd-cde854a72f01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af1f4d80-90d2-42c0-81dd-cde854a72f01" (UID: "af1f4d80-90d2-42c0-81dd-cde854a72f01"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.845272 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af1f4d80-90d2-42c0-81dd-cde854a72f01-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.845313 4909 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/af1f4d80-90d2-42c0-81dd-cde854a72f01-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.845325 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af1f4d80-90d2-42c0-81dd-cde854a72f01-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.845336 4909 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/af1f4d80-90d2-42c0-81dd-cde854a72f01-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.845344 4909 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/af1f4d80-90d2-42c0-81dd-cde854a72f01-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.845354 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bthlp\" (UniqueName: \"kubernetes.io/projected/af1f4d80-90d2-42c0-81dd-cde854a72f01-kube-api-access-bthlp\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:06 crc kubenswrapper[4909]: I0202 10:50:06.979616 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vxm4q"] Feb 02 10:50:06 crc kubenswrapper[4909]: W0202 10:50:06.982906 4909 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf7b0419_b32f_44ab_b35d_eb06765be89d.slice/crio-0ae2c139794feea29be34af77ba28df51230b0f58c7246b6bf8fafa8940f6348 WatchSource:0}: Error finding container 0ae2c139794feea29be34af77ba28df51230b0f58c7246b6bf8fafa8940f6348: Status 404 returned error can't find the container with id 0ae2c139794feea29be34af77ba28df51230b0f58c7246b6bf8fafa8940f6348 Feb 02 10:50:07 crc kubenswrapper[4909]: I0202 10:50:07.434592 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 02 10:50:07 crc kubenswrapper[4909]: I0202 10:50:07.434934 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 02 10:50:07 crc kubenswrapper[4909]: I0202 10:50:07.444957 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-tkd66"] Feb 02 10:50:07 crc kubenswrapper[4909]: I0202 10:50:07.446175 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-tkd66" Feb 02 10:50:07 crc kubenswrapper[4909]: I0202 10:50:07.448134 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 02 10:50:07 crc kubenswrapper[4909]: I0202 10:50:07.448779 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-tkd66"] Feb 02 10:50:07 crc kubenswrapper[4909]: I0202 10:50:07.453670 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d6zl\" (UniqueName: \"kubernetes.io/projected/b68aca0f-b5c9-411a-b8d2-b79e272b5bdb-kube-api-access-9d6zl\") pod \"root-account-create-update-tkd66\" (UID: \"b68aca0f-b5c9-411a-b8d2-b79e272b5bdb\") " pod="openstack/root-account-create-update-tkd66" Feb 02 10:50:07 crc kubenswrapper[4909]: I0202 10:50:07.453719 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b68aca0f-b5c9-411a-b8d2-b79e272b5bdb-operator-scripts\") pod \"root-account-create-update-tkd66\" (UID: \"b68aca0f-b5c9-411a-b8d2-b79e272b5bdb\") " pod="openstack/root-account-create-update-tkd66" Feb 02 10:50:07 crc kubenswrapper[4909]: I0202 10:50:07.515097 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 02 10:50:07 crc kubenswrapper[4909]: I0202 10:50:07.555278 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d6zl\" (UniqueName: \"kubernetes.io/projected/b68aca0f-b5c9-411a-b8d2-b79e272b5bdb-kube-api-access-9d6zl\") pod \"root-account-create-update-tkd66\" (UID: \"b68aca0f-b5c9-411a-b8d2-b79e272b5bdb\") " pod="openstack/root-account-create-update-tkd66" Feb 02 10:50:07 crc kubenswrapper[4909]: I0202 10:50:07.555330 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b68aca0f-b5c9-411a-b8d2-b79e272b5bdb-operator-scripts\") pod \"root-account-create-update-tkd66\" (UID: \"b68aca0f-b5c9-411a-b8d2-b79e272b5bdb\") " pod="openstack/root-account-create-update-tkd66" Feb 02 10:50:07 crc kubenswrapper[4909]: I0202 10:50:07.556590 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b68aca0f-b5c9-411a-b8d2-b79e272b5bdb-operator-scripts\") pod \"root-account-create-update-tkd66\" (UID: \"b68aca0f-b5c9-411a-b8d2-b79e272b5bdb\") " pod="openstack/root-account-create-update-tkd66" Feb 02 10:50:07 crc kubenswrapper[4909]: I0202 10:50:07.571378 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d6zl\" (UniqueName: \"kubernetes.io/projected/b68aca0f-b5c9-411a-b8d2-b79e272b5bdb-kube-api-access-9d6zl\") pod \"root-account-create-update-tkd66\" (UID: \"b68aca0f-b5c9-411a-b8d2-b79e272b5bdb\") " pod="openstack/root-account-create-update-tkd66" Feb 02 10:50:07 crc kubenswrapper[4909]: I0202 10:50:07.695474 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vxm4q" event={"ID":"bf7b0419-b32f-44ab-b35d-eb06765be89d","Type":"ContainerStarted","Data":"0ae2c139794feea29be34af77ba28df51230b0f58c7246b6bf8fafa8940f6348"} Feb 02 10:50:07 crc kubenswrapper[4909]: I0202 10:50:07.695530 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-9l5qz" Feb 02 10:50:07 crc kubenswrapper[4909]: I0202 10:50:07.742556 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-9l5qz"] Feb 02 10:50:07 crc kubenswrapper[4909]: I0202 10:50:07.742610 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-9l5qz"] Feb 02 10:50:07 crc kubenswrapper[4909]: I0202 10:50:07.770973 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-tkd66" Feb 02 10:50:07 crc kubenswrapper[4909]: I0202 10:50:07.850646 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.249022 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-tkd66"] Feb 02 10:50:08 crc kubenswrapper[4909]: W0202 10:50:08.257985 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb68aca0f_b5c9_411a_b8d2_b79e272b5bdb.slice/crio-be03416dcaf7b366ba3eecb0e72d87b903844d23096212bc83eb7a53bae88660 WatchSource:0}: Error finding container be03416dcaf7b366ba3eecb0e72d87b903844d23096212bc83eb7a53bae88660: Status 404 returned error can't find the container with id be03416dcaf7b366ba3eecb0e72d87b903844d23096212bc83eb7a53bae88660 Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.482049 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-h4czn"] Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.483736 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-h4czn" Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.516720 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-h4czn"] Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.575408 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7lrk\" (UniqueName: \"kubernetes.io/projected/9b651773-9cf9-4458-9c39-37b9104ff41e-kube-api-access-h7lrk\") pod \"keystone-db-create-h4czn\" (UID: \"9b651773-9cf9-4458-9c39-37b9104ff41e\") " pod="openstack/keystone-db-create-h4czn" Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.575577 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b651773-9cf9-4458-9c39-37b9104ff41e-operator-scripts\") pod \"keystone-db-create-h4czn\" (UID: \"9b651773-9cf9-4458-9c39-37b9104ff41e\") " pod="openstack/keystone-db-create-h4czn" Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.576816 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-910a-account-create-update-lvntk"] Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.578018 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-910a-account-create-update-lvntk" Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.580744 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.598046 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-910a-account-create-update-lvntk"] Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.677671 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7lrk\" (UniqueName: \"kubernetes.io/projected/9b651773-9cf9-4458-9c39-37b9104ff41e-kube-api-access-h7lrk\") pod \"keystone-db-create-h4czn\" (UID: \"9b651773-9cf9-4458-9c39-37b9104ff41e\") " pod="openstack/keystone-db-create-h4czn" Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.677814 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmffj\" (UniqueName: \"kubernetes.io/projected/3aa83e5f-8a9a-4275-989a-105cf6370d74-kube-api-access-lmffj\") pod \"keystone-910a-account-create-update-lvntk\" (UID: \"3aa83e5f-8a9a-4275-989a-105cf6370d74\") " pod="openstack/keystone-910a-account-create-update-lvntk" Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.677992 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b651773-9cf9-4458-9c39-37b9104ff41e-operator-scripts\") pod \"keystone-db-create-h4czn\" (UID: \"9b651773-9cf9-4458-9c39-37b9104ff41e\") " pod="openstack/keystone-db-create-h4czn" Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.678048 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3aa83e5f-8a9a-4275-989a-105cf6370d74-operator-scripts\") pod \"keystone-910a-account-create-update-lvntk\" (UID: 
\"3aa83e5f-8a9a-4275-989a-105cf6370d74\") " pod="openstack/keystone-910a-account-create-update-lvntk" Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.678967 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b651773-9cf9-4458-9c39-37b9104ff41e-operator-scripts\") pod \"keystone-db-create-h4czn\" (UID: \"9b651773-9cf9-4458-9c39-37b9104ff41e\") " pod="openstack/keystone-db-create-h4czn" Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.703286 4909 generic.go:334] "Generic (PLEG): container finished" podID="b68aca0f-b5c9-411a-b8d2-b79e272b5bdb" containerID="32bfe0b197161f5feef754bfc301d5f4c34647785352315e9754fe2f1b3e279c" exitCode=0 Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.705017 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tkd66" event={"ID":"b68aca0f-b5c9-411a-b8d2-b79e272b5bdb","Type":"ContainerDied","Data":"32bfe0b197161f5feef754bfc301d5f4c34647785352315e9754fe2f1b3e279c"} Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.705170 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tkd66" event={"ID":"b68aca0f-b5c9-411a-b8d2-b79e272b5bdb","Type":"ContainerStarted","Data":"be03416dcaf7b366ba3eecb0e72d87b903844d23096212bc83eb7a53bae88660"} Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.705685 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7lrk\" (UniqueName: \"kubernetes.io/projected/9b651773-9cf9-4458-9c39-37b9104ff41e-kube-api-access-h7lrk\") pod \"keystone-db-create-h4czn\" (UID: \"9b651773-9cf9-4458-9c39-37b9104ff41e\") " pod="openstack/keystone-db-create-h4czn" Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.780764 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmffj\" (UniqueName: 
\"kubernetes.io/projected/3aa83e5f-8a9a-4275-989a-105cf6370d74-kube-api-access-lmffj\") pod \"keystone-910a-account-create-update-lvntk\" (UID: \"3aa83e5f-8a9a-4275-989a-105cf6370d74\") " pod="openstack/keystone-910a-account-create-update-lvntk" Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.780895 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3aa83e5f-8a9a-4275-989a-105cf6370d74-operator-scripts\") pod \"keystone-910a-account-create-update-lvntk\" (UID: \"3aa83e5f-8a9a-4275-989a-105cf6370d74\") " pod="openstack/keystone-910a-account-create-update-lvntk" Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.781829 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3aa83e5f-8a9a-4275-989a-105cf6370d74-operator-scripts\") pod \"keystone-910a-account-create-update-lvntk\" (UID: \"3aa83e5f-8a9a-4275-989a-105cf6370d74\") " pod="openstack/keystone-910a-account-create-update-lvntk" Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.805759 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmffj\" (UniqueName: \"kubernetes.io/projected/3aa83e5f-8a9a-4275-989a-105cf6370d74-kube-api-access-lmffj\") pod \"keystone-910a-account-create-update-lvntk\" (UID: \"3aa83e5f-8a9a-4275-989a-105cf6370d74\") " pod="openstack/keystone-910a-account-create-update-lvntk" Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.818025 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-h4czn" Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.831624 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-fgzzd"] Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.833000 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-fgzzd" Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.838505 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-fgzzd"] Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.899954 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d833-account-create-update-dbfhx"] Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.900953 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d833-account-create-update-dbfhx" Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.903550 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.905305 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-910a-account-create-update-lvntk" Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.908322 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d833-account-create-update-dbfhx"] Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.984046 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq8qh\" (UniqueName: \"kubernetes.io/projected/0834981f-5c9e-48dc-a18a-4108e2eb24f4-kube-api-access-rq8qh\") pod \"placement-d833-account-create-update-dbfhx\" (UID: \"0834981f-5c9e-48dc-a18a-4108e2eb24f4\") " pod="openstack/placement-d833-account-create-update-dbfhx" Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.984126 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vqcb\" (UniqueName: \"kubernetes.io/projected/359391a8-8d1e-47a2-b849-e7d574bd0613-kube-api-access-9vqcb\") pod \"placement-db-create-fgzzd\" (UID: \"359391a8-8d1e-47a2-b849-e7d574bd0613\") " 
pod="openstack/placement-db-create-fgzzd" Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.984189 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/359391a8-8d1e-47a2-b849-e7d574bd0613-operator-scripts\") pod \"placement-db-create-fgzzd\" (UID: \"359391a8-8d1e-47a2-b849-e7d574bd0613\") " pod="openstack/placement-db-create-fgzzd" Feb 02 10:50:08 crc kubenswrapper[4909]: I0202 10:50:08.984282 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0834981f-5c9e-48dc-a18a-4108e2eb24f4-operator-scripts\") pod \"placement-d833-account-create-update-dbfhx\" (UID: \"0834981f-5c9e-48dc-a18a-4108e2eb24f4\") " pod="openstack/placement-d833-account-create-update-dbfhx" Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.025588 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af1f4d80-90d2-42c0-81dd-cde854a72f01" path="/var/lib/kubelet/pods/af1f4d80-90d2-42c0-81dd-cde854a72f01/volumes" Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.090366 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/359391a8-8d1e-47a2-b849-e7d574bd0613-operator-scripts\") pod \"placement-db-create-fgzzd\" (UID: \"359391a8-8d1e-47a2-b849-e7d574bd0613\") " pod="openstack/placement-db-create-fgzzd" Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.090477 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0834981f-5c9e-48dc-a18a-4108e2eb24f4-operator-scripts\") pod \"placement-d833-account-create-update-dbfhx\" (UID: \"0834981f-5c9e-48dc-a18a-4108e2eb24f4\") " pod="openstack/placement-d833-account-create-update-dbfhx" Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 
10:50:09.091463 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq8qh\" (UniqueName: \"kubernetes.io/projected/0834981f-5c9e-48dc-a18a-4108e2eb24f4-kube-api-access-rq8qh\") pod \"placement-d833-account-create-update-dbfhx\" (UID: \"0834981f-5c9e-48dc-a18a-4108e2eb24f4\") " pod="openstack/placement-d833-account-create-update-dbfhx" Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.091555 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vqcb\" (UniqueName: \"kubernetes.io/projected/359391a8-8d1e-47a2-b849-e7d574bd0613-kube-api-access-9vqcb\") pod \"placement-db-create-fgzzd\" (UID: \"359391a8-8d1e-47a2-b849-e7d574bd0613\") " pod="openstack/placement-db-create-fgzzd" Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.092671 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0834981f-5c9e-48dc-a18a-4108e2eb24f4-operator-scripts\") pod \"placement-d833-account-create-update-dbfhx\" (UID: \"0834981f-5c9e-48dc-a18a-4108e2eb24f4\") " pod="openstack/placement-d833-account-create-update-dbfhx" Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.092725 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/359391a8-8d1e-47a2-b849-e7d574bd0613-operator-scripts\") pod \"placement-db-create-fgzzd\" (UID: \"359391a8-8d1e-47a2-b849-e7d574bd0613\") " pod="openstack/placement-db-create-fgzzd" Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.101980 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-p67xw"] Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.103043 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-p67xw" Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.105726 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-p67xw"] Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.119496 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq8qh\" (UniqueName: \"kubernetes.io/projected/0834981f-5c9e-48dc-a18a-4108e2eb24f4-kube-api-access-rq8qh\") pod \"placement-d833-account-create-update-dbfhx\" (UID: \"0834981f-5c9e-48dc-a18a-4108e2eb24f4\") " pod="openstack/placement-d833-account-create-update-dbfhx" Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.120161 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vqcb\" (UniqueName: \"kubernetes.io/projected/359391a8-8d1e-47a2-b849-e7d574bd0613-kube-api-access-9vqcb\") pod \"placement-db-create-fgzzd\" (UID: \"359391a8-8d1e-47a2-b849-e7d574bd0613\") " pod="openstack/placement-db-create-fgzzd" Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.185408 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-fgzzd" Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.193202 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csjc6\" (UniqueName: \"kubernetes.io/projected/daee344f-d74a-43d0-9fc4-f651011ef32f-kube-api-access-csjc6\") pod \"glance-db-create-p67xw\" (UID: \"daee344f-d74a-43d0-9fc4-f651011ef32f\") " pod="openstack/glance-db-create-p67xw" Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.193421 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/daee344f-d74a-43d0-9fc4-f651011ef32f-operator-scripts\") pod \"glance-db-create-p67xw\" (UID: \"daee344f-d74a-43d0-9fc4-f651011ef32f\") " pod="openstack/glance-db-create-p67xw" Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.199946 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-a6cd-account-create-update-kvdwx"] Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.201475 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a6cd-account-create-update-kvdwx" Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.206186 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.214325 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a6cd-account-create-update-kvdwx"] Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.217165 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d833-account-create-update-dbfhx" Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.294753 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwqjl\" (UniqueName: \"kubernetes.io/projected/81b16bcd-5191-4481-bb9a-a1762143bc89-kube-api-access-bwqjl\") pod \"glance-a6cd-account-create-update-kvdwx\" (UID: \"81b16bcd-5191-4481-bb9a-a1762143bc89\") " pod="openstack/glance-a6cd-account-create-update-kvdwx" Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.294944 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csjc6\" (UniqueName: \"kubernetes.io/projected/daee344f-d74a-43d0-9fc4-f651011ef32f-kube-api-access-csjc6\") pod \"glance-db-create-p67xw\" (UID: \"daee344f-d74a-43d0-9fc4-f651011ef32f\") " pod="openstack/glance-db-create-p67xw" Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.295000 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/daee344f-d74a-43d0-9fc4-f651011ef32f-operator-scripts\") pod \"glance-db-create-p67xw\" (UID: \"daee344f-d74a-43d0-9fc4-f651011ef32f\") " pod="openstack/glance-db-create-p67xw" Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.295034 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81b16bcd-5191-4481-bb9a-a1762143bc89-operator-scripts\") pod \"glance-a6cd-account-create-update-kvdwx\" (UID: \"81b16bcd-5191-4481-bb9a-a1762143bc89\") " pod="openstack/glance-a6cd-account-create-update-kvdwx" Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.300218 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/daee344f-d74a-43d0-9fc4-f651011ef32f-operator-scripts\") pod 
\"glance-db-create-p67xw\" (UID: \"daee344f-d74a-43d0-9fc4-f651011ef32f\") " pod="openstack/glance-db-create-p67xw" Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.310027 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csjc6\" (UniqueName: \"kubernetes.io/projected/daee344f-d74a-43d0-9fc4-f651011ef32f-kube-api-access-csjc6\") pod \"glance-db-create-p67xw\" (UID: \"daee344f-d74a-43d0-9fc4-f651011ef32f\") " pod="openstack/glance-db-create-p67xw" Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.396993 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwqjl\" (UniqueName: \"kubernetes.io/projected/81b16bcd-5191-4481-bb9a-a1762143bc89-kube-api-access-bwqjl\") pod \"glance-a6cd-account-create-update-kvdwx\" (UID: \"81b16bcd-5191-4481-bb9a-a1762143bc89\") " pod="openstack/glance-a6cd-account-create-update-kvdwx" Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.397138 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81b16bcd-5191-4481-bb9a-a1762143bc89-operator-scripts\") pod \"glance-a6cd-account-create-update-kvdwx\" (UID: \"81b16bcd-5191-4481-bb9a-a1762143bc89\") " pod="openstack/glance-a6cd-account-create-update-kvdwx" Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.397972 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81b16bcd-5191-4481-bb9a-a1762143bc89-operator-scripts\") pod \"glance-a6cd-account-create-update-kvdwx\" (UID: \"81b16bcd-5191-4481-bb9a-a1762143bc89\") " pod="openstack/glance-a6cd-account-create-update-kvdwx" Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.412332 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwqjl\" (UniqueName: 
\"kubernetes.io/projected/81b16bcd-5191-4481-bb9a-a1762143bc89-kube-api-access-bwqjl\") pod \"glance-a6cd-account-create-update-kvdwx\" (UID: \"81b16bcd-5191-4481-bb9a-a1762143bc89\") " pod="openstack/glance-a6cd-account-create-update-kvdwx" Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.499094 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-p67xw" Feb 02 10:50:09 crc kubenswrapper[4909]: I0202 10:50:09.522483 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a6cd-account-create-update-kvdwx" Feb 02 10:50:10 crc kubenswrapper[4909]: I0202 10:50:10.030077 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-etc-swift\") pod \"swift-storage-0\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " pod="openstack/swift-storage-0" Feb 02 10:50:10 crc kubenswrapper[4909]: E0202 10:50:10.030330 4909 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 02 10:50:10 crc kubenswrapper[4909]: E0202 10:50:10.030347 4909 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 02 10:50:10 crc kubenswrapper[4909]: E0202 10:50:10.030395 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-etc-swift podName:7c400ec0-7faf-4151-b34e-ee28044b89e7 nodeName:}" failed. No retries permitted until 2026-02-02 10:50:18.030379349 +0000 UTC m=+1143.776480084 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-etc-swift") pod "swift-storage-0" (UID: "7c400ec0-7faf-4151-b34e-ee28044b89e7") : configmap "swift-ring-files" not found Feb 02 10:50:10 crc kubenswrapper[4909]: I0202 10:50:10.812081 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-tkd66" Feb 02 10:50:10 crc kubenswrapper[4909]: I0202 10:50:10.947233 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9d6zl\" (UniqueName: \"kubernetes.io/projected/b68aca0f-b5c9-411a-b8d2-b79e272b5bdb-kube-api-access-9d6zl\") pod \"b68aca0f-b5c9-411a-b8d2-b79e272b5bdb\" (UID: \"b68aca0f-b5c9-411a-b8d2-b79e272b5bdb\") " Feb 02 10:50:10 crc kubenswrapper[4909]: I0202 10:50:10.947800 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b68aca0f-b5c9-411a-b8d2-b79e272b5bdb-operator-scripts\") pod \"b68aca0f-b5c9-411a-b8d2-b79e272b5bdb\" (UID: \"b68aca0f-b5c9-411a-b8d2-b79e272b5bdb\") " Feb 02 10:50:10 crc kubenswrapper[4909]: I0202 10:50:10.948762 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b68aca0f-b5c9-411a-b8d2-b79e272b5bdb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b68aca0f-b5c9-411a-b8d2-b79e272b5bdb" (UID: "b68aca0f-b5c9-411a-b8d2-b79e272b5bdb"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:10 crc kubenswrapper[4909]: I0202 10:50:10.956655 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 02 10:50:10 crc kubenswrapper[4909]: I0202 10:50:10.960027 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b68aca0f-b5c9-411a-b8d2-b79e272b5bdb-kube-api-access-9d6zl" (OuterVolumeSpecName: "kube-api-access-9d6zl") pod "b68aca0f-b5c9-411a-b8d2-b79e272b5bdb" (UID: "b68aca0f-b5c9-411a-b8d2-b79e272b5bdb"). InnerVolumeSpecName "kube-api-access-9d6zl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:50:11 crc kubenswrapper[4909]: I0202 10:50:11.051513 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9d6zl\" (UniqueName: \"kubernetes.io/projected/b68aca0f-b5c9-411a-b8d2-b79e272b5bdb-kube-api-access-9d6zl\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:11 crc kubenswrapper[4909]: I0202 10:50:11.051536 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b68aca0f-b5c9-411a-b8d2-b79e272b5bdb-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:11 crc kubenswrapper[4909]: I0202 10:50:11.363615 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-910a-account-create-update-lvntk"] Feb 02 10:50:11 crc kubenswrapper[4909]: W0202 10:50:11.367850 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3aa83e5f_8a9a_4275_989a_105cf6370d74.slice/crio-918acb7ff24ff94384e3cef57c907bb6f73e9c4740d8aa219c1842d1ff62e6a0 WatchSource:0}: Error finding container 918acb7ff24ff94384e3cef57c907bb6f73e9c4740d8aa219c1842d1ff62e6a0: Status 404 returned error can't find the container with id 918acb7ff24ff94384e3cef57c907bb6f73e9c4740d8aa219c1842d1ff62e6a0 Feb 02 10:50:11 crc 
kubenswrapper[4909]: I0202 10:50:11.464773 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-h4czn"] Feb 02 10:50:11 crc kubenswrapper[4909]: I0202 10:50:11.477671 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a6cd-account-create-update-kvdwx"] Feb 02 10:50:11 crc kubenswrapper[4909]: I0202 10:50:11.484920 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-fgzzd"] Feb 02 10:50:11 crc kubenswrapper[4909]: I0202 10:50:11.622222 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-p67xw"] Feb 02 10:50:11 crc kubenswrapper[4909]: W0202 10:50:11.628713 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddaee344f_d74a_43d0_9fc4_f651011ef32f.slice/crio-45808c597d99d28f81bb5756a424918dcfce990f1af1177f4a0bec1e37a7cf54 WatchSource:0}: Error finding container 45808c597d99d28f81bb5756a424918dcfce990f1af1177f4a0bec1e37a7cf54: Status 404 returned error can't find the container with id 45808c597d99d28f81bb5756a424918dcfce990f1af1177f4a0bec1e37a7cf54 Feb 02 10:50:11 crc kubenswrapper[4909]: I0202 10:50:11.636775 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d833-account-create-update-dbfhx"] Feb 02 10:50:11 crc kubenswrapper[4909]: W0202 10:50:11.647735 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0834981f_5c9e_48dc_a18a_4108e2eb24f4.slice/crio-ff7a098c0dc878efcb59b73e6fe90f5e1ae02454364beb8277f6cdafa702bc1f WatchSource:0}: Error finding container ff7a098c0dc878efcb59b73e6fe90f5e1ae02454364beb8277f6cdafa702bc1f: Status 404 returned error can't find the container with id ff7a098c0dc878efcb59b73e6fe90f5e1ae02454364beb8277f6cdafa702bc1f Feb 02 10:50:11 crc kubenswrapper[4909]: I0202 10:50:11.728754 4909 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/keystone-db-create-h4czn" event={"ID":"9b651773-9cf9-4458-9c39-37b9104ff41e","Type":"ContainerStarted","Data":"4f99a6d61521e8ea4cc86c6b2d1f469b35483740c9efb1d2feef012e700ba6d8"} Feb 02 10:50:11 crc kubenswrapper[4909]: I0202 10:50:11.729126 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-h4czn" event={"ID":"9b651773-9cf9-4458-9c39-37b9104ff41e","Type":"ContainerStarted","Data":"a9de098fe994ed0d0095b8132d5db3d3c8929b25eaf2151e556dc8b2356f5968"} Feb 02 10:50:11 crc kubenswrapper[4909]: I0202 10:50:11.730825 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-910a-account-create-update-lvntk" event={"ID":"3aa83e5f-8a9a-4275-989a-105cf6370d74","Type":"ContainerStarted","Data":"5892737acb052e6a3f93ad5723bf350253f3421da0cd429a24a03c2315a6b594"} Feb 02 10:50:11 crc kubenswrapper[4909]: I0202 10:50:11.730852 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-910a-account-create-update-lvntk" event={"ID":"3aa83e5f-8a9a-4275-989a-105cf6370d74","Type":"ContainerStarted","Data":"918acb7ff24ff94384e3cef57c907bb6f73e9c4740d8aa219c1842d1ff62e6a0"} Feb 02 10:50:11 crc kubenswrapper[4909]: I0202 10:50:11.743783 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-h4czn" podStartSLOduration=3.743769692 podStartE2EDuration="3.743769692s" podCreationTimestamp="2026-02-02 10:50:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:50:11.741174289 +0000 UTC m=+1137.487275024" watchObservedRunningTime="2026-02-02 10:50:11.743769692 +0000 UTC m=+1137.489870427" Feb 02 10:50:11 crc kubenswrapper[4909]: I0202 10:50:11.751211 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a6cd-account-create-update-kvdwx" 
event={"ID":"81b16bcd-5191-4481-bb9a-a1762143bc89","Type":"ContainerStarted","Data":"1852ba1ba50c691b49413bcf9313ce9948ed4b0019ad1761ebd8f1c4dc0d19b4"} Feb 02 10:50:11 crc kubenswrapper[4909]: I0202 10:50:11.751258 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a6cd-account-create-update-kvdwx" event={"ID":"81b16bcd-5191-4481-bb9a-a1762143bc89","Type":"ContainerStarted","Data":"2b2dab6bdd0c08629afcf6b53bf9e665bab8dbf01047a6ad5bbef954e9cc1d2f"} Feb 02 10:50:11 crc kubenswrapper[4909]: I0202 10:50:11.762913 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-tkd66" Feb 02 10:50:11 crc kubenswrapper[4909]: I0202 10:50:11.762963 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tkd66" event={"ID":"b68aca0f-b5c9-411a-b8d2-b79e272b5bdb","Type":"ContainerDied","Data":"be03416dcaf7b366ba3eecb0e72d87b903844d23096212bc83eb7a53bae88660"} Feb 02 10:50:11 crc kubenswrapper[4909]: I0202 10:50:11.763021 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be03416dcaf7b366ba3eecb0e72d87b903844d23096212bc83eb7a53bae88660" Feb 02 10:50:11 crc kubenswrapper[4909]: I0202 10:50:11.766312 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-p67xw" event={"ID":"daee344f-d74a-43d0-9fc4-f651011ef32f","Type":"ContainerStarted","Data":"45808c597d99d28f81bb5756a424918dcfce990f1af1177f4a0bec1e37a7cf54"} Feb 02 10:50:11 crc kubenswrapper[4909]: I0202 10:50:11.771332 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vxm4q" event={"ID":"bf7b0419-b32f-44ab-b35d-eb06765be89d","Type":"ContainerStarted","Data":"c7b0f3850dc7a45c5f81260746518a4b0fb79c1686ae003921f7039e6da8ea56"} Feb 02 10:50:11 crc kubenswrapper[4909]: I0202 10:50:11.776067 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-fgzzd" 
event={"ID":"359391a8-8d1e-47a2-b849-e7d574bd0613","Type":"ContainerStarted","Data":"cb02a83bbb4061df53b24177176c6e857d4ccc4c9e403fa8c36ecca2742b0d60"} Feb 02 10:50:11 crc kubenswrapper[4909]: I0202 10:50:11.776115 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-fgzzd" event={"ID":"359391a8-8d1e-47a2-b849-e7d574bd0613","Type":"ContainerStarted","Data":"6f993d40335ddd562231761f5ca8da697bb66967137d599b1ff1080185caf0b5"} Feb 02 10:50:11 crc kubenswrapper[4909]: I0202 10:50:11.777977 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d833-account-create-update-dbfhx" event={"ID":"0834981f-5c9e-48dc-a18a-4108e2eb24f4","Type":"ContainerStarted","Data":"ff7a098c0dc878efcb59b73e6fe90f5e1ae02454364beb8277f6cdafa702bc1f"} Feb 02 10:50:11 crc kubenswrapper[4909]: I0202 10:50:11.787523 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-910a-account-create-update-lvntk" podStartSLOduration=3.787505504 podStartE2EDuration="3.787505504s" podCreationTimestamp="2026-02-02 10:50:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:50:11.759470468 +0000 UTC m=+1137.505571203" watchObservedRunningTime="2026-02-02 10:50:11.787505504 +0000 UTC m=+1137.533606239" Feb 02 10:50:11 crc kubenswrapper[4909]: I0202 10:50:11.788878 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-a6cd-account-create-update-kvdwx" podStartSLOduration=2.788871993 podStartE2EDuration="2.788871993s" podCreationTimestamp="2026-02-02 10:50:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:50:11.77326458 +0000 UTC m=+1137.519365315" watchObservedRunningTime="2026-02-02 10:50:11.788871993 +0000 UTC m=+1137.534972728" Feb 02 10:50:11 crc kubenswrapper[4909]: 
I0202 10:50:11.798216 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-vxm4q" podStartSLOduration=1.90921023 podStartE2EDuration="5.798199467s" podCreationTimestamp="2026-02-02 10:50:06 +0000 UTC" firstStartedPulling="2026-02-02 10:50:06.985136386 +0000 UTC m=+1132.731237121" lastFinishedPulling="2026-02-02 10:50:10.874125623 +0000 UTC m=+1136.620226358" observedRunningTime="2026-02-02 10:50:11.796977773 +0000 UTC m=+1137.543078498" watchObservedRunningTime="2026-02-02 10:50:11.798199467 +0000 UTC m=+1137.544300202" Feb 02 10:50:11 crc kubenswrapper[4909]: I0202 10:50:11.817723 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-fgzzd" podStartSLOduration=3.81767859 podStartE2EDuration="3.81767859s" podCreationTimestamp="2026-02-02 10:50:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:50:11.813925914 +0000 UTC m=+1137.560026659" watchObservedRunningTime="2026-02-02 10:50:11.81767859 +0000 UTC m=+1137.563779325" Feb 02 10:50:12 crc kubenswrapper[4909]: I0202 10:50:12.254855 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77fc48cd97-hgklb" Feb 02 10:50:12 crc kubenswrapper[4909]: I0202 10:50:12.733022 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66b577f8c-wx74v" Feb 02 10:50:12 crc kubenswrapper[4909]: I0202 10:50:12.799026 4909 generic.go:334] "Generic (PLEG): container finished" podID="0834981f-5c9e-48dc-a18a-4108e2eb24f4" containerID="ad4f42df5133abae699ddc4b8cf0d5ba9d9b30b6dca926531a75eee4db6df541" exitCode=0 Feb 02 10:50:12 crc kubenswrapper[4909]: I0202 10:50:12.799110 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d833-account-create-update-dbfhx" 
event={"ID":"0834981f-5c9e-48dc-a18a-4108e2eb24f4","Type":"ContainerDied","Data":"ad4f42df5133abae699ddc4b8cf0d5ba9d9b30b6dca926531a75eee4db6df541"} Feb 02 10:50:12 crc kubenswrapper[4909]: I0202 10:50:12.804504 4909 generic.go:334] "Generic (PLEG): container finished" podID="9b651773-9cf9-4458-9c39-37b9104ff41e" containerID="4f99a6d61521e8ea4cc86c6b2d1f469b35483740c9efb1d2feef012e700ba6d8" exitCode=0 Feb 02 10:50:12 crc kubenswrapper[4909]: I0202 10:50:12.804632 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-h4czn" event={"ID":"9b651773-9cf9-4458-9c39-37b9104ff41e","Type":"ContainerDied","Data":"4f99a6d61521e8ea4cc86c6b2d1f469b35483740c9efb1d2feef012e700ba6d8"} Feb 02 10:50:12 crc kubenswrapper[4909]: I0202 10:50:12.808172 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77fc48cd97-hgklb"] Feb 02 10:50:12 crc kubenswrapper[4909]: I0202 10:50:12.809879 4909 generic.go:334] "Generic (PLEG): container finished" podID="3aa83e5f-8a9a-4275-989a-105cf6370d74" containerID="5892737acb052e6a3f93ad5723bf350253f3421da0cd429a24a03c2315a6b594" exitCode=0 Feb 02 10:50:12 crc kubenswrapper[4909]: I0202 10:50:12.809936 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-910a-account-create-update-lvntk" event={"ID":"3aa83e5f-8a9a-4275-989a-105cf6370d74","Type":"ContainerDied","Data":"5892737acb052e6a3f93ad5723bf350253f3421da0cd429a24a03c2315a6b594"} Feb 02 10:50:12 crc kubenswrapper[4909]: I0202 10:50:12.813411 4909 generic.go:334] "Generic (PLEG): container finished" podID="81b16bcd-5191-4481-bb9a-a1762143bc89" containerID="1852ba1ba50c691b49413bcf9313ce9948ed4b0019ad1761ebd8f1c4dc0d19b4" exitCode=0 Feb 02 10:50:12 crc kubenswrapper[4909]: I0202 10:50:12.813464 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a6cd-account-create-update-kvdwx" 
event={"ID":"81b16bcd-5191-4481-bb9a-a1762143bc89","Type":"ContainerDied","Data":"1852ba1ba50c691b49413bcf9313ce9948ed4b0019ad1761ebd8f1c4dc0d19b4"} Feb 02 10:50:12 crc kubenswrapper[4909]: I0202 10:50:12.818801 4909 generic.go:334] "Generic (PLEG): container finished" podID="daee344f-d74a-43d0-9fc4-f651011ef32f" containerID="f85aea9a9630fd571f870636f8d0a16bc08fbc31cf3802093c4a240a71c26962" exitCode=0 Feb 02 10:50:12 crc kubenswrapper[4909]: I0202 10:50:12.818948 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-p67xw" event={"ID":"daee344f-d74a-43d0-9fc4-f651011ef32f","Type":"ContainerDied","Data":"f85aea9a9630fd571f870636f8d0a16bc08fbc31cf3802093c4a240a71c26962"} Feb 02 10:50:12 crc kubenswrapper[4909]: I0202 10:50:12.820386 4909 generic.go:334] "Generic (PLEG): container finished" podID="359391a8-8d1e-47a2-b849-e7d574bd0613" containerID="cb02a83bbb4061df53b24177176c6e857d4ccc4c9e403fa8c36ecca2742b0d60" exitCode=0 Feb 02 10:50:12 crc kubenswrapper[4909]: I0202 10:50:12.821197 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-fgzzd" event={"ID":"359391a8-8d1e-47a2-b849-e7d574bd0613","Type":"ContainerDied","Data":"cb02a83bbb4061df53b24177176c6e857d4ccc4c9e403fa8c36ecca2742b0d60"} Feb 02 10:50:12 crc kubenswrapper[4909]: I0202 10:50:12.821331 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77fc48cd97-hgklb" podUID="c24055ff-2124-4f7d-9560-6b2effc2ba4d" containerName="dnsmasq-dns" containerID="cri-o://a78064dd1831ceaadebaa99ba081255731751d336b266c3bf5383225a49616cf" gracePeriod=10 Feb 02 10:50:13 crc kubenswrapper[4909]: I0202 10:50:13.334137 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77fc48cd97-hgklb" Feb 02 10:50:13 crc kubenswrapper[4909]: I0202 10:50:13.505776 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr89f\" (UniqueName: \"kubernetes.io/projected/c24055ff-2124-4f7d-9560-6b2effc2ba4d-kube-api-access-tr89f\") pod \"c24055ff-2124-4f7d-9560-6b2effc2ba4d\" (UID: \"c24055ff-2124-4f7d-9560-6b2effc2ba4d\") " Feb 02 10:50:13 crc kubenswrapper[4909]: I0202 10:50:13.505879 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c24055ff-2124-4f7d-9560-6b2effc2ba4d-config\") pod \"c24055ff-2124-4f7d-9560-6b2effc2ba4d\" (UID: \"c24055ff-2124-4f7d-9560-6b2effc2ba4d\") " Feb 02 10:50:13 crc kubenswrapper[4909]: I0202 10:50:13.505922 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c24055ff-2124-4f7d-9560-6b2effc2ba4d-ovsdbserver-nb\") pod \"c24055ff-2124-4f7d-9560-6b2effc2ba4d\" (UID: \"c24055ff-2124-4f7d-9560-6b2effc2ba4d\") " Feb 02 10:50:13 crc kubenswrapper[4909]: I0202 10:50:13.505985 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c24055ff-2124-4f7d-9560-6b2effc2ba4d-dns-svc\") pod \"c24055ff-2124-4f7d-9560-6b2effc2ba4d\" (UID: \"c24055ff-2124-4f7d-9560-6b2effc2ba4d\") " Feb 02 10:50:13 crc kubenswrapper[4909]: I0202 10:50:13.514126 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c24055ff-2124-4f7d-9560-6b2effc2ba4d-kube-api-access-tr89f" (OuterVolumeSpecName: "kube-api-access-tr89f") pod "c24055ff-2124-4f7d-9560-6b2effc2ba4d" (UID: "c24055ff-2124-4f7d-9560-6b2effc2ba4d"). InnerVolumeSpecName "kube-api-access-tr89f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:50:13 crc kubenswrapper[4909]: I0202 10:50:13.553956 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c24055ff-2124-4f7d-9560-6b2effc2ba4d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c24055ff-2124-4f7d-9560-6b2effc2ba4d" (UID: "c24055ff-2124-4f7d-9560-6b2effc2ba4d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:13 crc kubenswrapper[4909]: I0202 10:50:13.560672 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c24055ff-2124-4f7d-9560-6b2effc2ba4d-config" (OuterVolumeSpecName: "config") pod "c24055ff-2124-4f7d-9560-6b2effc2ba4d" (UID: "c24055ff-2124-4f7d-9560-6b2effc2ba4d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:13 crc kubenswrapper[4909]: I0202 10:50:13.569276 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c24055ff-2124-4f7d-9560-6b2effc2ba4d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c24055ff-2124-4f7d-9560-6b2effc2ba4d" (UID: "c24055ff-2124-4f7d-9560-6b2effc2ba4d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:13 crc kubenswrapper[4909]: I0202 10:50:13.607852 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c24055ff-2124-4f7d-9560-6b2effc2ba4d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:13 crc kubenswrapper[4909]: I0202 10:50:13.607882 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr89f\" (UniqueName: \"kubernetes.io/projected/c24055ff-2124-4f7d-9560-6b2effc2ba4d-kube-api-access-tr89f\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:13 crc kubenswrapper[4909]: I0202 10:50:13.607891 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c24055ff-2124-4f7d-9560-6b2effc2ba4d-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:13 crc kubenswrapper[4909]: I0202 10:50:13.607899 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c24055ff-2124-4f7d-9560-6b2effc2ba4d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:13 crc kubenswrapper[4909]: I0202 10:50:13.829630 4909 generic.go:334] "Generic (PLEG): container finished" podID="c24055ff-2124-4f7d-9560-6b2effc2ba4d" containerID="a78064dd1831ceaadebaa99ba081255731751d336b266c3bf5383225a49616cf" exitCode=0 Feb 02 10:50:13 crc kubenswrapper[4909]: I0202 10:50:13.829710 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77fc48cd97-hgklb" Feb 02 10:50:13 crc kubenswrapper[4909]: I0202 10:50:13.829837 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77fc48cd97-hgklb" event={"ID":"c24055ff-2124-4f7d-9560-6b2effc2ba4d","Type":"ContainerDied","Data":"a78064dd1831ceaadebaa99ba081255731751d336b266c3bf5383225a49616cf"} Feb 02 10:50:13 crc kubenswrapper[4909]: I0202 10:50:13.829870 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77fc48cd97-hgklb" event={"ID":"c24055ff-2124-4f7d-9560-6b2effc2ba4d","Type":"ContainerDied","Data":"a2005f33bee72f872f34f8cd37c317b54513edff862fbac86b54ba28207e42a2"} Feb 02 10:50:13 crc kubenswrapper[4909]: I0202 10:50:13.829899 4909 scope.go:117] "RemoveContainer" containerID="a78064dd1831ceaadebaa99ba081255731751d336b266c3bf5383225a49616cf" Feb 02 10:50:13 crc kubenswrapper[4909]: I0202 10:50:13.868340 4909 scope.go:117] "RemoveContainer" containerID="acc571dc1b0a84dd719974aaa298b63367c51caae2991325dd8b261e55564916" Feb 02 10:50:13 crc kubenswrapper[4909]: I0202 10:50:13.880838 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77fc48cd97-hgklb"] Feb 02 10:50:13 crc kubenswrapper[4909]: I0202 10:50:13.890673 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77fc48cd97-hgklb"] Feb 02 10:50:13 crc kubenswrapper[4909]: I0202 10:50:13.904555 4909 scope.go:117] "RemoveContainer" containerID="a78064dd1831ceaadebaa99ba081255731751d336b266c3bf5383225a49616cf" Feb 02 10:50:13 crc kubenswrapper[4909]: E0202 10:50:13.904952 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a78064dd1831ceaadebaa99ba081255731751d336b266c3bf5383225a49616cf\": container with ID starting with a78064dd1831ceaadebaa99ba081255731751d336b266c3bf5383225a49616cf not found: ID does not exist" 
containerID="a78064dd1831ceaadebaa99ba081255731751d336b266c3bf5383225a49616cf" Feb 02 10:50:13 crc kubenswrapper[4909]: I0202 10:50:13.904990 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a78064dd1831ceaadebaa99ba081255731751d336b266c3bf5383225a49616cf"} err="failed to get container status \"a78064dd1831ceaadebaa99ba081255731751d336b266c3bf5383225a49616cf\": rpc error: code = NotFound desc = could not find container \"a78064dd1831ceaadebaa99ba081255731751d336b266c3bf5383225a49616cf\": container with ID starting with a78064dd1831ceaadebaa99ba081255731751d336b266c3bf5383225a49616cf not found: ID does not exist" Feb 02 10:50:13 crc kubenswrapper[4909]: I0202 10:50:13.905017 4909 scope.go:117] "RemoveContainer" containerID="acc571dc1b0a84dd719974aaa298b63367c51caae2991325dd8b261e55564916" Feb 02 10:50:13 crc kubenswrapper[4909]: E0202 10:50:13.905280 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acc571dc1b0a84dd719974aaa298b63367c51caae2991325dd8b261e55564916\": container with ID starting with acc571dc1b0a84dd719974aaa298b63367c51caae2991325dd8b261e55564916 not found: ID does not exist" containerID="acc571dc1b0a84dd719974aaa298b63367c51caae2991325dd8b261e55564916" Feb 02 10:50:13 crc kubenswrapper[4909]: I0202 10:50:13.905314 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acc571dc1b0a84dd719974aaa298b63367c51caae2991325dd8b261e55564916"} err="failed to get container status \"acc571dc1b0a84dd719974aaa298b63367c51caae2991325dd8b261e55564916\": rpc error: code = NotFound desc = could not find container \"acc571dc1b0a84dd719974aaa298b63367c51caae2991325dd8b261e55564916\": container with ID starting with acc571dc1b0a84dd719974aaa298b63367c51caae2991325dd8b261e55564916 not found: ID does not exist" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.244915 4909 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d833-account-create-update-dbfhx" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.325890 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0834981f-5c9e-48dc-a18a-4108e2eb24f4-operator-scripts\") pod \"0834981f-5c9e-48dc-a18a-4108e2eb24f4\" (UID: \"0834981f-5c9e-48dc-a18a-4108e2eb24f4\") " Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.325940 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rq8qh\" (UniqueName: \"kubernetes.io/projected/0834981f-5c9e-48dc-a18a-4108e2eb24f4-kube-api-access-rq8qh\") pod \"0834981f-5c9e-48dc-a18a-4108e2eb24f4\" (UID: \"0834981f-5c9e-48dc-a18a-4108e2eb24f4\") " Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.326545 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0834981f-5c9e-48dc-a18a-4108e2eb24f4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0834981f-5c9e-48dc-a18a-4108e2eb24f4" (UID: "0834981f-5c9e-48dc-a18a-4108e2eb24f4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.333943 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0834981f-5c9e-48dc-a18a-4108e2eb24f4-kube-api-access-rq8qh" (OuterVolumeSpecName: "kube-api-access-rq8qh") pod "0834981f-5c9e-48dc-a18a-4108e2eb24f4" (UID: "0834981f-5c9e-48dc-a18a-4108e2eb24f4"). InnerVolumeSpecName "kube-api-access-rq8qh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.427545 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0834981f-5c9e-48dc-a18a-4108e2eb24f4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.427580 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rq8qh\" (UniqueName: \"kubernetes.io/projected/0834981f-5c9e-48dc-a18a-4108e2eb24f4-kube-api-access-rq8qh\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.434026 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-h4czn" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.440732 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-910a-account-create-update-lvntk" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.464612 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-fgzzd" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.478396 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-p67xw" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.488893 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-a6cd-account-create-update-kvdwx" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.534068 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b651773-9cf9-4458-9c39-37b9104ff41e-operator-scripts\") pod \"9b651773-9cf9-4458-9c39-37b9104ff41e\" (UID: \"9b651773-9cf9-4458-9c39-37b9104ff41e\") " Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.534211 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmffj\" (UniqueName: \"kubernetes.io/projected/3aa83e5f-8a9a-4275-989a-105cf6370d74-kube-api-access-lmffj\") pod \"3aa83e5f-8a9a-4275-989a-105cf6370d74\" (UID: \"3aa83e5f-8a9a-4275-989a-105cf6370d74\") " Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.534248 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7lrk\" (UniqueName: \"kubernetes.io/projected/9b651773-9cf9-4458-9c39-37b9104ff41e-kube-api-access-h7lrk\") pod \"9b651773-9cf9-4458-9c39-37b9104ff41e\" (UID: \"9b651773-9cf9-4458-9c39-37b9104ff41e\") " Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.534748 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b651773-9cf9-4458-9c39-37b9104ff41e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9b651773-9cf9-4458-9c39-37b9104ff41e" (UID: "9b651773-9cf9-4458-9c39-37b9104ff41e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.534801 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3aa83e5f-8a9a-4275-989a-105cf6370d74-operator-scripts\") pod \"3aa83e5f-8a9a-4275-989a-105cf6370d74\" (UID: \"3aa83e5f-8a9a-4275-989a-105cf6370d74\") " Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.535482 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9b651773-9cf9-4458-9c39-37b9104ff41e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.536387 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3aa83e5f-8a9a-4275-989a-105cf6370d74-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3aa83e5f-8a9a-4275-989a-105cf6370d74" (UID: "3aa83e5f-8a9a-4275-989a-105cf6370d74"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.540029 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b651773-9cf9-4458-9c39-37b9104ff41e-kube-api-access-h7lrk" (OuterVolumeSpecName: "kube-api-access-h7lrk") pod "9b651773-9cf9-4458-9c39-37b9104ff41e" (UID: "9b651773-9cf9-4458-9c39-37b9104ff41e"). InnerVolumeSpecName "kube-api-access-h7lrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.553114 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aa83e5f-8a9a-4275-989a-105cf6370d74-kube-api-access-lmffj" (OuterVolumeSpecName: "kube-api-access-lmffj") pod "3aa83e5f-8a9a-4275-989a-105cf6370d74" (UID: "3aa83e5f-8a9a-4275-989a-105cf6370d74"). 
InnerVolumeSpecName "kube-api-access-lmffj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.636475 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81b16bcd-5191-4481-bb9a-a1762143bc89-operator-scripts\") pod \"81b16bcd-5191-4481-bb9a-a1762143bc89\" (UID: \"81b16bcd-5191-4481-bb9a-a1762143bc89\") " Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.636610 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/daee344f-d74a-43d0-9fc4-f651011ef32f-operator-scripts\") pod \"daee344f-d74a-43d0-9fc4-f651011ef32f\" (UID: \"daee344f-d74a-43d0-9fc4-f651011ef32f\") " Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.636642 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/359391a8-8d1e-47a2-b849-e7d574bd0613-operator-scripts\") pod \"359391a8-8d1e-47a2-b849-e7d574bd0613\" (UID: \"359391a8-8d1e-47a2-b849-e7d574bd0613\") " Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.636670 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwqjl\" (UniqueName: \"kubernetes.io/projected/81b16bcd-5191-4481-bb9a-a1762143bc89-kube-api-access-bwqjl\") pod \"81b16bcd-5191-4481-bb9a-a1762143bc89\" (UID: \"81b16bcd-5191-4481-bb9a-a1762143bc89\") " Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.636722 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vqcb\" (UniqueName: \"kubernetes.io/projected/359391a8-8d1e-47a2-b849-e7d574bd0613-kube-api-access-9vqcb\") pod \"359391a8-8d1e-47a2-b849-e7d574bd0613\" (UID: \"359391a8-8d1e-47a2-b849-e7d574bd0613\") " Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.636793 4909 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csjc6\" (UniqueName: \"kubernetes.io/projected/daee344f-d74a-43d0-9fc4-f651011ef32f-kube-api-access-csjc6\") pod \"daee344f-d74a-43d0-9fc4-f651011ef32f\" (UID: \"daee344f-d74a-43d0-9fc4-f651011ef32f\") " Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.637246 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmffj\" (UniqueName: \"kubernetes.io/projected/3aa83e5f-8a9a-4275-989a-105cf6370d74-kube-api-access-lmffj\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.637270 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7lrk\" (UniqueName: \"kubernetes.io/projected/9b651773-9cf9-4458-9c39-37b9104ff41e-kube-api-access-h7lrk\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.637287 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3aa83e5f-8a9a-4275-989a-105cf6370d74-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.637408 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81b16bcd-5191-4481-bb9a-a1762143bc89-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "81b16bcd-5191-4481-bb9a-a1762143bc89" (UID: "81b16bcd-5191-4481-bb9a-a1762143bc89"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.637516 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/359391a8-8d1e-47a2-b849-e7d574bd0613-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "359391a8-8d1e-47a2-b849-e7d574bd0613" (UID: "359391a8-8d1e-47a2-b849-e7d574bd0613"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.637614 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/daee344f-d74a-43d0-9fc4-f651011ef32f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "daee344f-d74a-43d0-9fc4-f651011ef32f" (UID: "daee344f-d74a-43d0-9fc4-f651011ef32f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.640039 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daee344f-d74a-43d0-9fc4-f651011ef32f-kube-api-access-csjc6" (OuterVolumeSpecName: "kube-api-access-csjc6") pod "daee344f-d74a-43d0-9fc4-f651011ef32f" (UID: "daee344f-d74a-43d0-9fc4-f651011ef32f"). InnerVolumeSpecName "kube-api-access-csjc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.640516 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81b16bcd-5191-4481-bb9a-a1762143bc89-kube-api-access-bwqjl" (OuterVolumeSpecName: "kube-api-access-bwqjl") pod "81b16bcd-5191-4481-bb9a-a1762143bc89" (UID: "81b16bcd-5191-4481-bb9a-a1762143bc89"). InnerVolumeSpecName "kube-api-access-bwqjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.642061 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/359391a8-8d1e-47a2-b849-e7d574bd0613-kube-api-access-9vqcb" (OuterVolumeSpecName: "kube-api-access-9vqcb") pod "359391a8-8d1e-47a2-b849-e7d574bd0613" (UID: "359391a8-8d1e-47a2-b849-e7d574bd0613"). InnerVolumeSpecName "kube-api-access-9vqcb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.738822 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/daee344f-d74a-43d0-9fc4-f651011ef32f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.738858 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/359391a8-8d1e-47a2-b849-e7d574bd0613-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.738871 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwqjl\" (UniqueName: \"kubernetes.io/projected/81b16bcd-5191-4481-bb9a-a1762143bc89-kube-api-access-bwqjl\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.738885 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vqcb\" (UniqueName: \"kubernetes.io/projected/359391a8-8d1e-47a2-b849-e7d574bd0613-kube-api-access-9vqcb\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.738896 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csjc6\" (UniqueName: \"kubernetes.io/projected/daee344f-d74a-43d0-9fc4-f651011ef32f-kube-api-access-csjc6\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.738909 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81b16bcd-5191-4481-bb9a-a1762143bc89-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.838307 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-p67xw" 
event={"ID":"daee344f-d74a-43d0-9fc4-f651011ef32f","Type":"ContainerDied","Data":"45808c597d99d28f81bb5756a424918dcfce990f1af1177f4a0bec1e37a7cf54"} Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.838353 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45808c597d99d28f81bb5756a424918dcfce990f1af1177f4a0bec1e37a7cf54" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.838450 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-p67xw" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.841972 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-fgzzd" event={"ID":"359391a8-8d1e-47a2-b849-e7d574bd0613","Type":"ContainerDied","Data":"6f993d40335ddd562231761f5ca8da697bb66967137d599b1ff1080185caf0b5"} Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.842027 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f993d40335ddd562231761f5ca8da697bb66967137d599b1ff1080185caf0b5" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.841985 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-fgzzd" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.843483 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d833-account-create-update-dbfhx" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.843496 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d833-account-create-update-dbfhx" event={"ID":"0834981f-5c9e-48dc-a18a-4108e2eb24f4","Type":"ContainerDied","Data":"ff7a098c0dc878efcb59b73e6fe90f5e1ae02454364beb8277f6cdafa702bc1f"} Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.843681 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff7a098c0dc878efcb59b73e6fe90f5e1ae02454364beb8277f6cdafa702bc1f" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.846508 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-h4czn" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.846532 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-h4czn" event={"ID":"9b651773-9cf9-4458-9c39-37b9104ff41e","Type":"ContainerDied","Data":"a9de098fe994ed0d0095b8132d5db3d3c8929b25eaf2151e556dc8b2356f5968"} Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.847004 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9de098fe994ed0d0095b8132d5db3d3c8929b25eaf2151e556dc8b2356f5968" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.848379 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-910a-account-create-update-lvntk" event={"ID":"3aa83e5f-8a9a-4275-989a-105cf6370d74","Type":"ContainerDied","Data":"918acb7ff24ff94384e3cef57c907bb6f73e9c4740d8aa219c1842d1ff62e6a0"} Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.848703 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="918acb7ff24ff94384e3cef57c907bb6f73e9c4740d8aa219c1842d1ff62e6a0" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.848459 4909 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/keystone-910a-account-create-update-lvntk" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.849846 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a6cd-account-create-update-kvdwx" event={"ID":"81b16bcd-5191-4481-bb9a-a1762143bc89","Type":"ContainerDied","Data":"2b2dab6bdd0c08629afcf6b53bf9e665bab8dbf01047a6ad5bbef954e9cc1d2f"} Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.849873 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b2dab6bdd0c08629afcf6b53bf9e665bab8dbf01047a6ad5bbef954e9cc1d2f" Feb 02 10:50:14 crc kubenswrapper[4909]: I0202 10:50:14.849918 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a6cd-account-create-update-kvdwx" Feb 02 10:50:15 crc kubenswrapper[4909]: I0202 10:50:15.027193 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c24055ff-2124-4f7d-9560-6b2effc2ba4d" path="/var/lib/kubelet/pods/c24055ff-2124-4f7d-9560-6b2effc2ba4d/volumes" Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.028994 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-tkd66"] Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.035363 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-tkd66"] Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.129368 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-mhgf5"] Feb 02 10:50:16 crc kubenswrapper[4909]: E0202 10:50:16.129833 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81b16bcd-5191-4481-bb9a-a1762143bc89" containerName="mariadb-account-create-update" Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.129855 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="81b16bcd-5191-4481-bb9a-a1762143bc89" 
containerName="mariadb-account-create-update" Feb 02 10:50:16 crc kubenswrapper[4909]: E0202 10:50:16.129878 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="359391a8-8d1e-47a2-b849-e7d574bd0613" containerName="mariadb-database-create" Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.129887 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="359391a8-8d1e-47a2-b849-e7d574bd0613" containerName="mariadb-database-create" Feb 02 10:50:16 crc kubenswrapper[4909]: E0202 10:50:16.129903 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aa83e5f-8a9a-4275-989a-105cf6370d74" containerName="mariadb-account-create-update" Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.129911 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aa83e5f-8a9a-4275-989a-105cf6370d74" containerName="mariadb-account-create-update" Feb 02 10:50:16 crc kubenswrapper[4909]: E0202 10:50:16.129924 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24055ff-2124-4f7d-9560-6b2effc2ba4d" containerName="dnsmasq-dns" Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.129932 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24055ff-2124-4f7d-9560-6b2effc2ba4d" containerName="dnsmasq-dns" Feb 02 10:50:16 crc kubenswrapper[4909]: E0202 10:50:16.129941 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b651773-9cf9-4458-9c39-37b9104ff41e" containerName="mariadb-database-create" Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.129948 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b651773-9cf9-4458-9c39-37b9104ff41e" containerName="mariadb-database-create" Feb 02 10:50:16 crc kubenswrapper[4909]: E0202 10:50:16.129960 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b68aca0f-b5c9-411a-b8d2-b79e272b5bdb" containerName="mariadb-account-create-update" Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.129968 4909 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="b68aca0f-b5c9-411a-b8d2-b79e272b5bdb" containerName="mariadb-account-create-update" Feb 02 10:50:16 crc kubenswrapper[4909]: E0202 10:50:16.129982 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24055ff-2124-4f7d-9560-6b2effc2ba4d" containerName="init" Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.129990 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24055ff-2124-4f7d-9560-6b2effc2ba4d" containerName="init" Feb 02 10:50:16 crc kubenswrapper[4909]: E0202 10:50:16.130005 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0834981f-5c9e-48dc-a18a-4108e2eb24f4" containerName="mariadb-account-create-update" Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.130016 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0834981f-5c9e-48dc-a18a-4108e2eb24f4" containerName="mariadb-account-create-update" Feb 02 10:50:16 crc kubenswrapper[4909]: E0202 10:50:16.130034 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daee344f-d74a-43d0-9fc4-f651011ef32f" containerName="mariadb-database-create" Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.130042 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="daee344f-d74a-43d0-9fc4-f651011ef32f" containerName="mariadb-database-create" Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.130223 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="0834981f-5c9e-48dc-a18a-4108e2eb24f4" containerName="mariadb-account-create-update" Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.130237 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="daee344f-d74a-43d0-9fc4-f651011ef32f" containerName="mariadb-database-create" Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.130251 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="359391a8-8d1e-47a2-b849-e7d574bd0613" containerName="mariadb-database-create" Feb 02 10:50:16 crc 
kubenswrapper[4909]: I0202 10:50:16.130264 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="81b16bcd-5191-4481-bb9a-a1762143bc89" containerName="mariadb-account-create-update" Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.130276 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c24055ff-2124-4f7d-9560-6b2effc2ba4d" containerName="dnsmasq-dns" Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.130289 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aa83e5f-8a9a-4275-989a-105cf6370d74" containerName="mariadb-account-create-update" Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.130298 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="b68aca0f-b5c9-411a-b8d2-b79e272b5bdb" containerName="mariadb-account-create-update" Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.130311 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b651773-9cf9-4458-9c39-37b9104ff41e" containerName="mariadb-database-create" Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.131009 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-mhgf5" Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.133663 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.139850 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-mhgf5"] Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.261076 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ec9c7cf-acf0-454c-9267-a87b03460d6b-operator-scripts\") pod \"root-account-create-update-mhgf5\" (UID: \"4ec9c7cf-acf0-454c-9267-a87b03460d6b\") " pod="openstack/root-account-create-update-mhgf5" Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.261228 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5sxl\" (UniqueName: \"kubernetes.io/projected/4ec9c7cf-acf0-454c-9267-a87b03460d6b-kube-api-access-k5sxl\") pod \"root-account-create-update-mhgf5\" (UID: \"4ec9c7cf-acf0-454c-9267-a87b03460d6b\") " pod="openstack/root-account-create-update-mhgf5" Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.362734 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ec9c7cf-acf0-454c-9267-a87b03460d6b-operator-scripts\") pod \"root-account-create-update-mhgf5\" (UID: \"4ec9c7cf-acf0-454c-9267-a87b03460d6b\") " pod="openstack/root-account-create-update-mhgf5" Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.363107 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5sxl\" (UniqueName: \"kubernetes.io/projected/4ec9c7cf-acf0-454c-9267-a87b03460d6b-kube-api-access-k5sxl\") pod \"root-account-create-update-mhgf5\" (UID: 
\"4ec9c7cf-acf0-454c-9267-a87b03460d6b\") " pod="openstack/root-account-create-update-mhgf5" Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.363459 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ec9c7cf-acf0-454c-9267-a87b03460d6b-operator-scripts\") pod \"root-account-create-update-mhgf5\" (UID: \"4ec9c7cf-acf0-454c-9267-a87b03460d6b\") " pod="openstack/root-account-create-update-mhgf5" Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.389015 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5sxl\" (UniqueName: \"kubernetes.io/projected/4ec9c7cf-acf0-454c-9267-a87b03460d6b-kube-api-access-k5sxl\") pod \"root-account-create-update-mhgf5\" (UID: \"4ec9c7cf-acf0-454c-9267-a87b03460d6b\") " pod="openstack/root-account-create-update-mhgf5" Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.453062 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-mhgf5" Feb 02 10:50:16 crc kubenswrapper[4909]: I0202 10:50:16.855032 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-mhgf5"] Feb 02 10:50:17 crc kubenswrapper[4909]: I0202 10:50:17.028876 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b68aca0f-b5c9-411a-b8d2-b79e272b5bdb" path="/var/lib/kubelet/pods/b68aca0f-b5c9-411a-b8d2-b79e272b5bdb/volumes" Feb 02 10:50:17 crc kubenswrapper[4909]: I0202 10:50:17.874234 4909 generic.go:334] "Generic (PLEG): container finished" podID="4ec9c7cf-acf0-454c-9267-a87b03460d6b" containerID="f7f1646054ee4622c5aa78a07f3d9c809ee0e9476cf483d1d536142af4148580" exitCode=0 Feb 02 10:50:17 crc kubenswrapper[4909]: I0202 10:50:17.874275 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mhgf5" event={"ID":"4ec9c7cf-acf0-454c-9267-a87b03460d6b","Type":"ContainerDied","Data":"f7f1646054ee4622c5aa78a07f3d9c809ee0e9476cf483d1d536142af4148580"} Feb 02 10:50:17 crc kubenswrapper[4909]: I0202 10:50:17.874319 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mhgf5" event={"ID":"4ec9c7cf-acf0-454c-9267-a87b03460d6b","Type":"ContainerStarted","Data":"776f932ef7b00f60de023dbc9987d0a322ced87c48f465fe5f5090590ae05cee"} Feb 02 10:50:18 crc kubenswrapper[4909]: I0202 10:50:18.091172 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-etc-swift\") pod \"swift-storage-0\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " pod="openstack/swift-storage-0" Feb 02 10:50:18 crc kubenswrapper[4909]: E0202 10:50:18.091885 4909 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 02 10:50:18 crc kubenswrapper[4909]: E0202 10:50:18.092001 4909 projected.go:194] 
Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 02 10:50:18 crc kubenswrapper[4909]: E0202 10:50:18.092118 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-etc-swift podName:7c400ec0-7faf-4151-b34e-ee28044b89e7 nodeName:}" failed. No retries permitted until 2026-02-02 10:50:34.092103957 +0000 UTC m=+1159.838204682 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-etc-swift") pod "swift-storage-0" (UID: "7c400ec0-7faf-4151-b34e-ee28044b89e7") : configmap "swift-ring-files" not found Feb 02 10:50:18 crc kubenswrapper[4909]: E0202 10:50:18.857059 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb441d32f_f76f_4e7b_b3fe_40e93b126567.slice/crio-conmon-c73875b9010e1e509b2e9adfe296f5305133f400c8e52a90164f9f8d577e55df.scope\": RecentStats: unable to find data in memory cache]" Feb 02 10:50:18 crc kubenswrapper[4909]: I0202 10:50:18.882227 4909 generic.go:334] "Generic (PLEG): container finished" podID="b441d32f-f76f-4e7b-b3fe-40e93b126567" containerID="c73875b9010e1e509b2e9adfe296f5305133f400c8e52a90164f9f8d577e55df" exitCode=0 Feb 02 10:50:18 crc kubenswrapper[4909]: I0202 10:50:18.882308 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b441d32f-f76f-4e7b-b3fe-40e93b126567","Type":"ContainerDied","Data":"c73875b9010e1e509b2e9adfe296f5305133f400c8e52a90164f9f8d577e55df"} Feb 02 10:50:18 crc kubenswrapper[4909]: I0202 10:50:18.885780 4909 generic.go:334] "Generic (PLEG): container finished" podID="bf7b0419-b32f-44ab-b35d-eb06765be89d" containerID="c7b0f3850dc7a45c5f81260746518a4b0fb79c1686ae003921f7039e6da8ea56" exitCode=0 Feb 02 
10:50:18 crc kubenswrapper[4909]: I0202 10:50:18.885850 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vxm4q" event={"ID":"bf7b0419-b32f-44ab-b35d-eb06765be89d","Type":"ContainerDied","Data":"c7b0f3850dc7a45c5f81260746518a4b0fb79c1686ae003921f7039e6da8ea56"} Feb 02 10:50:18 crc kubenswrapper[4909]: I0202 10:50:18.887978 4909 generic.go:334] "Generic (PLEG): container finished" podID="1ab15f72-b249-42d5-8698-273c5afc7758" containerID="529dbc5ded5b168f357c98964ff8ca25a705fa2dcb89e66bab1ae2d9a409b360" exitCode=0 Feb 02 10:50:18 crc kubenswrapper[4909]: I0202 10:50:18.888154 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1ab15f72-b249-42d5-8698-273c5afc7758","Type":"ContainerDied","Data":"529dbc5ded5b168f357c98964ff8ca25a705fa2dcb89e66bab1ae2d9a409b360"} Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.176904 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-mhgf5" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.312429 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5sxl\" (UniqueName: \"kubernetes.io/projected/4ec9c7cf-acf0-454c-9267-a87b03460d6b-kube-api-access-k5sxl\") pod \"4ec9c7cf-acf0-454c-9267-a87b03460d6b\" (UID: \"4ec9c7cf-acf0-454c-9267-a87b03460d6b\") " Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.312487 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ec9c7cf-acf0-454c-9267-a87b03460d6b-operator-scripts\") pod \"4ec9c7cf-acf0-454c-9267-a87b03460d6b\" (UID: \"4ec9c7cf-acf0-454c-9267-a87b03460d6b\") " Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.313011 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/4ec9c7cf-acf0-454c-9267-a87b03460d6b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4ec9c7cf-acf0-454c-9267-a87b03460d6b" (UID: "4ec9c7cf-acf0-454c-9267-a87b03460d6b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.316670 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ec9c7cf-acf0-454c-9267-a87b03460d6b-kube-api-access-k5sxl" (OuterVolumeSpecName: "kube-api-access-k5sxl") pod "4ec9c7cf-acf0-454c-9267-a87b03460d6b" (UID: "4ec9c7cf-acf0-454c-9267-a87b03460d6b"). InnerVolumeSpecName "kube-api-access-k5sxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.413949 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5sxl\" (UniqueName: \"kubernetes.io/projected/4ec9c7cf-acf0-454c-9267-a87b03460d6b-kube-api-access-k5sxl\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.413992 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4ec9c7cf-acf0-454c-9267-a87b03460d6b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.500547 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-wn6km"] Feb 02 10:50:19 crc kubenswrapper[4909]: E0202 10:50:19.501579 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ec9c7cf-acf0-454c-9267-a87b03460d6b" containerName="mariadb-account-create-update" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.501599 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ec9c7cf-acf0-454c-9267-a87b03460d6b" containerName="mariadb-account-create-update" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.501750 4909 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="4ec9c7cf-acf0-454c-9267-a87b03460d6b" containerName="mariadb-account-create-update" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.502356 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wn6km" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.504050 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.510835 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.510893 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.510940 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.511799 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8890ae7c7b5156c4d584d7bd5581da3d2b944d91026e2fa8dff7d54c88b8b78c"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.511885 4909 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://8890ae7c7b5156c4d584d7bd5581da3d2b944d91026e2fa8dff7d54c88b8b78c" gracePeriod=600 Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.513599 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-c2scs" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.515526 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wn6km"] Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.617644 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db6d95b6-71f4-47be-90e2-64ebcf72442c-config-data\") pod \"glance-db-sync-wn6km\" (UID: \"db6d95b6-71f4-47be-90e2-64ebcf72442c\") " pod="openstack/glance-db-sync-wn6km" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.618038 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdgzj\" (UniqueName: \"kubernetes.io/projected/db6d95b6-71f4-47be-90e2-64ebcf72442c-kube-api-access-fdgzj\") pod \"glance-db-sync-wn6km\" (UID: \"db6d95b6-71f4-47be-90e2-64ebcf72442c\") " pod="openstack/glance-db-sync-wn6km" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.618070 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db6d95b6-71f4-47be-90e2-64ebcf72442c-combined-ca-bundle\") pod \"glance-db-sync-wn6km\" (UID: \"db6d95b6-71f4-47be-90e2-64ebcf72442c\") " pod="openstack/glance-db-sync-wn6km" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.618098 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/db6d95b6-71f4-47be-90e2-64ebcf72442c-db-sync-config-data\") pod \"glance-db-sync-wn6km\" (UID: \"db6d95b6-71f4-47be-90e2-64ebcf72442c\") " pod="openstack/glance-db-sync-wn6km" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.719394 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db6d95b6-71f4-47be-90e2-64ebcf72442c-config-data\") pod \"glance-db-sync-wn6km\" (UID: \"db6d95b6-71f4-47be-90e2-64ebcf72442c\") " pod="openstack/glance-db-sync-wn6km" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.719531 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdgzj\" (UniqueName: \"kubernetes.io/projected/db6d95b6-71f4-47be-90e2-64ebcf72442c-kube-api-access-fdgzj\") pod \"glance-db-sync-wn6km\" (UID: \"db6d95b6-71f4-47be-90e2-64ebcf72442c\") " pod="openstack/glance-db-sync-wn6km" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.719583 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db6d95b6-71f4-47be-90e2-64ebcf72442c-combined-ca-bundle\") pod \"glance-db-sync-wn6km\" (UID: \"db6d95b6-71f4-47be-90e2-64ebcf72442c\") " pod="openstack/glance-db-sync-wn6km" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.719603 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/db6d95b6-71f4-47be-90e2-64ebcf72442c-db-sync-config-data\") pod \"glance-db-sync-wn6km\" (UID: \"db6d95b6-71f4-47be-90e2-64ebcf72442c\") " pod="openstack/glance-db-sync-wn6km" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.724480 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db6d95b6-71f4-47be-90e2-64ebcf72442c-config-data\") pod \"glance-db-sync-wn6km\" (UID: 
\"db6d95b6-71f4-47be-90e2-64ebcf72442c\") " pod="openstack/glance-db-sync-wn6km" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.724752 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db6d95b6-71f4-47be-90e2-64ebcf72442c-combined-ca-bundle\") pod \"glance-db-sync-wn6km\" (UID: \"db6d95b6-71f4-47be-90e2-64ebcf72442c\") " pod="openstack/glance-db-sync-wn6km" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.724866 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/db6d95b6-71f4-47be-90e2-64ebcf72442c-db-sync-config-data\") pod \"glance-db-sync-wn6km\" (UID: \"db6d95b6-71f4-47be-90e2-64ebcf72442c\") " pod="openstack/glance-db-sync-wn6km" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.736720 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdgzj\" (UniqueName: \"kubernetes.io/projected/db6d95b6-71f4-47be-90e2-64ebcf72442c-kube-api-access-fdgzj\") pod \"glance-db-sync-wn6km\" (UID: \"db6d95b6-71f4-47be-90e2-64ebcf72442c\") " pod="openstack/glance-db-sync-wn6km" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.815640 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wn6km" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.915564 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="8890ae7c7b5156c4d584d7bd5581da3d2b944d91026e2fa8dff7d54c88b8b78c" exitCode=0 Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.915872 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"8890ae7c7b5156c4d584d7bd5581da3d2b944d91026e2fa8dff7d54c88b8b78c"} Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.915905 4909 scope.go:117] "RemoveContainer" containerID="e0831b6285fe2493141946d0a4e8629f9b6b1551f717985b11e1d8a63f78fa44" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.927413 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b441d32f-f76f-4e7b-b3fe-40e93b126567","Type":"ContainerStarted","Data":"7fce69eec287d7c5a7114d96e223cf91ca30b188e5b132915844731ca8a68ec2"} Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.927920 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.929859 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-mhgf5" event={"ID":"4ec9c7cf-acf0-454c-9267-a87b03460d6b","Type":"ContainerDied","Data":"776f932ef7b00f60de023dbc9987d0a322ced87c48f465fe5f5090590ae05cee"} Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.929877 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="776f932ef7b00f60de023dbc9987d0a322ced87c48f465fe5f5090590ae05cee" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.929925 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-mhgf5" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.945278 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1ab15f72-b249-42d5-8698-273c5afc7758","Type":"ContainerStarted","Data":"13eac5d6e05d47fc8abeb3f859fa00ba42d29ef4a5dae2782f3325bd81acbda9"} Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.945726 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.962062 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=43.79842846 podStartE2EDuration="55.962040693s" podCreationTimestamp="2026-02-02 10:49:24 +0000 UTC" firstStartedPulling="2026-02-02 10:49:33.276868605 +0000 UTC m=+1099.022969330" lastFinishedPulling="2026-02-02 10:49:45.440480828 +0000 UTC m=+1111.186581563" observedRunningTime="2026-02-02 10:50:19.957753951 +0000 UTC m=+1145.703854686" watchObservedRunningTime="2026-02-02 10:50:19.962040693 +0000 UTC m=+1145.708141448" Feb 02 10:50:19 crc kubenswrapper[4909]: I0202 10:50:19.991591 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=43.024359537 podStartE2EDuration="55.991572361s" podCreationTimestamp="2026-02-02 10:49:24 +0000 UTC" firstStartedPulling="2026-02-02 10:49:32.459292407 +0000 UTC m=+1098.205393142" lastFinishedPulling="2026-02-02 10:49:45.426505231 +0000 UTC m=+1111.172605966" observedRunningTime="2026-02-02 10:50:19.984909012 +0000 UTC m=+1145.731009747" watchObservedRunningTime="2026-02-02 10:50:19.991572361 +0000 UTC m=+1145.737673096" Feb 02 10:50:20 crc kubenswrapper[4909]: I0202 10:50:20.388963 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vxm4q" Feb 02 10:50:20 crc kubenswrapper[4909]: I0202 10:50:20.432308 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bf7b0419-b32f-44ab-b35d-eb06765be89d-etc-swift\") pod \"bf7b0419-b32f-44ab-b35d-eb06765be89d\" (UID: \"bf7b0419-b32f-44ab-b35d-eb06765be89d\") " Feb 02 10:50:20 crc kubenswrapper[4909]: I0202 10:50:20.432421 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bf7b0419-b32f-44ab-b35d-eb06765be89d-swiftconf\") pod \"bf7b0419-b32f-44ab-b35d-eb06765be89d\" (UID: \"bf7b0419-b32f-44ab-b35d-eb06765be89d\") " Feb 02 10:50:20 crc kubenswrapper[4909]: I0202 10:50:20.432463 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bf7b0419-b32f-44ab-b35d-eb06765be89d-dispersionconf\") pod \"bf7b0419-b32f-44ab-b35d-eb06765be89d\" (UID: \"bf7b0419-b32f-44ab-b35d-eb06765be89d\") " Feb 02 10:50:20 crc kubenswrapper[4909]: I0202 10:50:20.432488 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf7b0419-b32f-44ab-b35d-eb06765be89d-scripts\") pod \"bf7b0419-b32f-44ab-b35d-eb06765be89d\" (UID: \"bf7b0419-b32f-44ab-b35d-eb06765be89d\") " Feb 02 10:50:20 crc kubenswrapper[4909]: I0202 10:50:20.432507 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf7b0419-b32f-44ab-b35d-eb06765be89d-combined-ca-bundle\") pod \"bf7b0419-b32f-44ab-b35d-eb06765be89d\" (UID: \"bf7b0419-b32f-44ab-b35d-eb06765be89d\") " Feb 02 10:50:20 crc kubenswrapper[4909]: I0202 10:50:20.432539 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq9g4\" (UniqueName: 
\"kubernetes.io/projected/bf7b0419-b32f-44ab-b35d-eb06765be89d-kube-api-access-mq9g4\") pod \"bf7b0419-b32f-44ab-b35d-eb06765be89d\" (UID: \"bf7b0419-b32f-44ab-b35d-eb06765be89d\") " Feb 02 10:50:20 crc kubenswrapper[4909]: I0202 10:50:20.432579 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bf7b0419-b32f-44ab-b35d-eb06765be89d-ring-data-devices\") pod \"bf7b0419-b32f-44ab-b35d-eb06765be89d\" (UID: \"bf7b0419-b32f-44ab-b35d-eb06765be89d\") " Feb 02 10:50:20 crc kubenswrapper[4909]: I0202 10:50:20.433474 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf7b0419-b32f-44ab-b35d-eb06765be89d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "bf7b0419-b32f-44ab-b35d-eb06765be89d" (UID: "bf7b0419-b32f-44ab-b35d-eb06765be89d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:20 crc kubenswrapper[4909]: I0202 10:50:20.434277 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf7b0419-b32f-44ab-b35d-eb06765be89d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "bf7b0419-b32f-44ab-b35d-eb06765be89d" (UID: "bf7b0419-b32f-44ab-b35d-eb06765be89d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:50:20 crc kubenswrapper[4909]: I0202 10:50:20.438337 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf7b0419-b32f-44ab-b35d-eb06765be89d-kube-api-access-mq9g4" (OuterVolumeSpecName: "kube-api-access-mq9g4") pod "bf7b0419-b32f-44ab-b35d-eb06765be89d" (UID: "bf7b0419-b32f-44ab-b35d-eb06765be89d"). InnerVolumeSpecName "kube-api-access-mq9g4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:50:20 crc kubenswrapper[4909]: I0202 10:50:20.455693 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf7b0419-b32f-44ab-b35d-eb06765be89d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "bf7b0419-b32f-44ab-b35d-eb06765be89d" (UID: "bf7b0419-b32f-44ab-b35d-eb06765be89d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:50:20 crc kubenswrapper[4909]: I0202 10:50:20.456083 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf7b0419-b32f-44ab-b35d-eb06765be89d-scripts" (OuterVolumeSpecName: "scripts") pod "bf7b0419-b32f-44ab-b35d-eb06765be89d" (UID: "bf7b0419-b32f-44ab-b35d-eb06765be89d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:20 crc kubenswrapper[4909]: I0202 10:50:20.459909 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf7b0419-b32f-44ab-b35d-eb06765be89d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf7b0419-b32f-44ab-b35d-eb06765be89d" (UID: "bf7b0419-b32f-44ab-b35d-eb06765be89d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:50:20 crc kubenswrapper[4909]: I0202 10:50:20.471537 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf7b0419-b32f-44ab-b35d-eb06765be89d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "bf7b0419-b32f-44ab-b35d-eb06765be89d" (UID: "bf7b0419-b32f-44ab-b35d-eb06765be89d"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:50:20 crc kubenswrapper[4909]: I0202 10:50:20.518566 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wn6km"] Feb 02 10:50:20 crc kubenswrapper[4909]: W0202 10:50:20.523717 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb6d95b6_71f4_47be_90e2_64ebcf72442c.slice/crio-881c9b582aa0a555ca889c481e82710dd15b4f1bad373cc5e41d2e398dd4826e WatchSource:0}: Error finding container 881c9b582aa0a555ca889c481e82710dd15b4f1bad373cc5e41d2e398dd4826e: Status 404 returned error can't find the container with id 881c9b582aa0a555ca889c481e82710dd15b4f1bad373cc5e41d2e398dd4826e Feb 02 10:50:20 crc kubenswrapper[4909]: I0202 10:50:20.535712 4909 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bf7b0419-b32f-44ab-b35d-eb06765be89d-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:20 crc kubenswrapper[4909]: I0202 10:50:20.536059 4909 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bf7b0419-b32f-44ab-b35d-eb06765be89d-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:20 crc kubenswrapper[4909]: I0202 10:50:20.536069 4909 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bf7b0419-b32f-44ab-b35d-eb06765be89d-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:20 crc kubenswrapper[4909]: I0202 10:50:20.536079 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf7b0419-b32f-44ab-b35d-eb06765be89d-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:20 crc kubenswrapper[4909]: I0202 10:50:20.536087 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bf7b0419-b32f-44ab-b35d-eb06765be89d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:20 crc kubenswrapper[4909]: I0202 10:50:20.536098 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq9g4\" (UniqueName: \"kubernetes.io/projected/bf7b0419-b32f-44ab-b35d-eb06765be89d-kube-api-access-mq9g4\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:20 crc kubenswrapper[4909]: I0202 10:50:20.536106 4909 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bf7b0419-b32f-44ab-b35d-eb06765be89d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:20 crc kubenswrapper[4909]: I0202 10:50:20.954638 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"ad73216e79d924c2922053100514e06765aa5c63e49cfea0b056d73eebae4d59"} Feb 02 10:50:20 crc kubenswrapper[4909]: I0202 10:50:20.956387 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vxm4q" event={"ID":"bf7b0419-b32f-44ab-b35d-eb06765be89d","Type":"ContainerDied","Data":"0ae2c139794feea29be34af77ba28df51230b0f58c7246b6bf8fafa8940f6348"} Feb 02 10:50:20 crc kubenswrapper[4909]: I0202 10:50:20.956412 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vxm4q" Feb 02 10:50:20 crc kubenswrapper[4909]: I0202 10:50:20.956423 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ae2c139794feea29be34af77ba28df51230b0f58c7246b6bf8fafa8940f6348" Feb 02 10:50:20 crc kubenswrapper[4909]: I0202 10:50:20.957500 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wn6km" event={"ID":"db6d95b6-71f4-47be-90e2-64ebcf72442c","Type":"ContainerStarted","Data":"881c9b582aa0a555ca889c481e82710dd15b4f1bad373cc5e41d2e398dd4826e"} Feb 02 10:50:22 crc kubenswrapper[4909]: I0202 10:50:22.798225 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 02 10:50:28 crc kubenswrapper[4909]: I0202 10:50:28.516252 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-qpqvt" podUID="6b8dd51c-207c-4fae-8d5a-7271a159f0ff" containerName="ovn-controller" probeResult="failure" output=< Feb 02 10:50:28 crc kubenswrapper[4909]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 02 10:50:28 crc kubenswrapper[4909]: > Feb 02 10:50:28 crc kubenswrapper[4909]: I0202 10:50:28.522309 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-9hkv5" Feb 02 10:50:28 crc kubenswrapper[4909]: I0202 10:50:28.527797 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-9hkv5" Feb 02 10:50:28 crc kubenswrapper[4909]: I0202 10:50:28.748936 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-qpqvt-config-w8f2c"] Feb 02 10:50:28 crc kubenswrapper[4909]: E0202 10:50:28.749566 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf7b0419-b32f-44ab-b35d-eb06765be89d" containerName="swift-ring-rebalance" Feb 02 10:50:28 crc kubenswrapper[4909]: I0202 10:50:28.749585 4909 
state_mem.go:107] "Deleted CPUSet assignment" podUID="bf7b0419-b32f-44ab-b35d-eb06765be89d" containerName="swift-ring-rebalance" Feb 02 10:50:28 crc kubenswrapper[4909]: I0202 10:50:28.749746 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf7b0419-b32f-44ab-b35d-eb06765be89d" containerName="swift-ring-rebalance" Feb 02 10:50:28 crc kubenswrapper[4909]: I0202 10:50:28.750325 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qpqvt-config-w8f2c" Feb 02 10:50:28 crc kubenswrapper[4909]: I0202 10:50:28.753015 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 02 10:50:28 crc kubenswrapper[4909]: I0202 10:50:28.757172 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qpqvt-config-w8f2c"] Feb 02 10:50:28 crc kubenswrapper[4909]: I0202 10:50:28.784707 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-var-run\") pod \"ovn-controller-qpqvt-config-w8f2c\" (UID: \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\") " pod="openstack/ovn-controller-qpqvt-config-w8f2c" Feb 02 10:50:28 crc kubenswrapper[4909]: I0202 10:50:28.784774 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-scripts\") pod \"ovn-controller-qpqvt-config-w8f2c\" (UID: \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\") " pod="openstack/ovn-controller-qpqvt-config-w8f2c" Feb 02 10:50:28 crc kubenswrapper[4909]: I0202 10:50:28.784797 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn97f\" (UniqueName: \"kubernetes.io/projected/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-kube-api-access-rn97f\") pod 
\"ovn-controller-qpqvt-config-w8f2c\" (UID: \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\") " pod="openstack/ovn-controller-qpqvt-config-w8f2c" Feb 02 10:50:28 crc kubenswrapper[4909]: I0202 10:50:28.784920 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-var-log-ovn\") pod \"ovn-controller-qpqvt-config-w8f2c\" (UID: \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\") " pod="openstack/ovn-controller-qpqvt-config-w8f2c" Feb 02 10:50:28 crc kubenswrapper[4909]: I0202 10:50:28.784993 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-var-run-ovn\") pod \"ovn-controller-qpqvt-config-w8f2c\" (UID: \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\") " pod="openstack/ovn-controller-qpqvt-config-w8f2c" Feb 02 10:50:28 crc kubenswrapper[4909]: I0202 10:50:28.785080 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-additional-scripts\") pod \"ovn-controller-qpqvt-config-w8f2c\" (UID: \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\") " pod="openstack/ovn-controller-qpqvt-config-w8f2c" Feb 02 10:50:28 crc kubenswrapper[4909]: I0202 10:50:28.886946 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-additional-scripts\") pod \"ovn-controller-qpqvt-config-w8f2c\" (UID: \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\") " pod="openstack/ovn-controller-qpqvt-config-w8f2c" Feb 02 10:50:28 crc kubenswrapper[4909]: I0202 10:50:28.887016 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-var-run\") pod \"ovn-controller-qpqvt-config-w8f2c\" (UID: \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\") " pod="openstack/ovn-controller-qpqvt-config-w8f2c" Feb 02 10:50:28 crc kubenswrapper[4909]: I0202 10:50:28.887046 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-scripts\") pod \"ovn-controller-qpqvt-config-w8f2c\" (UID: \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\") " pod="openstack/ovn-controller-qpqvt-config-w8f2c" Feb 02 10:50:28 crc kubenswrapper[4909]: I0202 10:50:28.887063 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn97f\" (UniqueName: \"kubernetes.io/projected/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-kube-api-access-rn97f\") pod \"ovn-controller-qpqvt-config-w8f2c\" (UID: \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\") " pod="openstack/ovn-controller-qpqvt-config-w8f2c" Feb 02 10:50:28 crc kubenswrapper[4909]: I0202 10:50:28.887090 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-var-log-ovn\") pod \"ovn-controller-qpqvt-config-w8f2c\" (UID: \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\") " pod="openstack/ovn-controller-qpqvt-config-w8f2c" Feb 02 10:50:28 crc kubenswrapper[4909]: I0202 10:50:28.887126 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-var-run-ovn\") pod \"ovn-controller-qpqvt-config-w8f2c\" (UID: \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\") " pod="openstack/ovn-controller-qpqvt-config-w8f2c" Feb 02 10:50:28 crc kubenswrapper[4909]: I0202 10:50:28.887318 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-var-run\") pod \"ovn-controller-qpqvt-config-w8f2c\" (UID: \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\") " pod="openstack/ovn-controller-qpqvt-config-w8f2c" Feb 02 10:50:28 crc kubenswrapper[4909]: I0202 10:50:28.887632 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-var-log-ovn\") pod \"ovn-controller-qpqvt-config-w8f2c\" (UID: \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\") " pod="openstack/ovn-controller-qpqvt-config-w8f2c" Feb 02 10:50:28 crc kubenswrapper[4909]: I0202 10:50:28.887336 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-var-run-ovn\") pod \"ovn-controller-qpqvt-config-w8f2c\" (UID: \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\") " pod="openstack/ovn-controller-qpqvt-config-w8f2c" Feb 02 10:50:28 crc kubenswrapper[4909]: I0202 10:50:28.887930 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-additional-scripts\") pod \"ovn-controller-qpqvt-config-w8f2c\" (UID: \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\") " pod="openstack/ovn-controller-qpqvt-config-w8f2c" Feb 02 10:50:28 crc kubenswrapper[4909]: I0202 10:50:28.889404 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-scripts\") pod \"ovn-controller-qpqvt-config-w8f2c\" (UID: \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\") " pod="openstack/ovn-controller-qpqvt-config-w8f2c" Feb 02 10:50:28 crc kubenswrapper[4909]: I0202 10:50:28.914240 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn97f\" (UniqueName: 
\"kubernetes.io/projected/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-kube-api-access-rn97f\") pod \"ovn-controller-qpqvt-config-w8f2c\" (UID: \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\") " pod="openstack/ovn-controller-qpqvt-config-w8f2c" Feb 02 10:50:29 crc kubenswrapper[4909]: I0202 10:50:29.083541 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qpqvt-config-w8f2c" Feb 02 10:50:33 crc kubenswrapper[4909]: I0202 10:50:33.504674 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-qpqvt" podUID="6b8dd51c-207c-4fae-8d5a-7271a159f0ff" containerName="ovn-controller" probeResult="failure" output=< Feb 02 10:50:33 crc kubenswrapper[4909]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 02 10:50:33 crc kubenswrapper[4909]: > Feb 02 10:50:34 crc kubenswrapper[4909]: I0202 10:50:34.164983 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-etc-swift\") pod \"swift-storage-0\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " pod="openstack/swift-storage-0" Feb 02 10:50:34 crc kubenswrapper[4909]: I0202 10:50:34.169111 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-etc-swift\") pod \"swift-storage-0\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " pod="openstack/swift-storage-0" Feb 02 10:50:34 crc kubenswrapper[4909]: I0202 10:50:34.271933 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-qpqvt-config-w8f2c"] Feb 02 10:50:34 crc kubenswrapper[4909]: W0202 10:50:34.272899 4909 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9b1ee43_97f6_4b9c_9abe_ba79f5694c48.slice/crio-ba55d395ed2afadbd5cc97722cccf046077fdec712bf599a6ac37715bf1cd5c2 WatchSource:0}: Error finding container ba55d395ed2afadbd5cc97722cccf046077fdec712bf599a6ac37715bf1cd5c2: Status 404 returned error can't find the container with id ba55d395ed2afadbd5cc97722cccf046077fdec712bf599a6ac37715bf1cd5c2 Feb 02 10:50:34 crc kubenswrapper[4909]: I0202 10:50:34.344882 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 02 10:50:34 crc kubenswrapper[4909]: I0202 10:50:34.949578 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 02 10:50:35 crc kubenswrapper[4909]: I0202 10:50:35.079393 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerStarted","Data":"360dc222948004e550a1dca18a33e231919a25bfefa73d0be631cf2a93456e82"} Feb 02 10:50:35 crc kubenswrapper[4909]: I0202 10:50:35.081280 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qpqvt-config-w8f2c" event={"ID":"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48","Type":"ContainerStarted","Data":"7ec049f0d04a0e397b9675c802ee25f2308b6c17de78026626e4edae11e67077"} Feb 02 10:50:35 crc kubenswrapper[4909]: I0202 10:50:35.081315 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qpqvt-config-w8f2c" event={"ID":"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48","Type":"ContainerStarted","Data":"ba55d395ed2afadbd5cc97722cccf046077fdec712bf599a6ac37715bf1cd5c2"} Feb 02 10:50:35 crc kubenswrapper[4909]: I0202 10:50:35.085739 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wn6km" event={"ID":"db6d95b6-71f4-47be-90e2-64ebcf72442c","Type":"ContainerStarted","Data":"aa704490c8fecb94ae45cd2a957887fc69b4f0520c85a9a5d292b5f18c60a975"} Feb 02 10:50:35 
crc kubenswrapper[4909]: I0202 10:50:35.097978 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-qpqvt-config-w8f2c" podStartSLOduration=7.097962051 podStartE2EDuration="7.097962051s" podCreationTimestamp="2026-02-02 10:50:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:50:35.096663904 +0000 UTC m=+1160.842764639" watchObservedRunningTime="2026-02-02 10:50:35.097962051 +0000 UTC m=+1160.844062786" Feb 02 10:50:35 crc kubenswrapper[4909]: I0202 10:50:35.115062 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-wn6km" podStartSLOduration=2.687137157 podStartE2EDuration="16.115043846s" podCreationTimestamp="2026-02-02 10:50:19 +0000 UTC" firstStartedPulling="2026-02-02 10:50:20.525711055 +0000 UTC m=+1146.271811800" lastFinishedPulling="2026-02-02 10:50:33.953617754 +0000 UTC m=+1159.699718489" observedRunningTime="2026-02-02 10:50:35.108862851 +0000 UTC m=+1160.854963606" watchObservedRunningTime="2026-02-02 10:50:35.115043846 +0000 UTC m=+1160.861144581" Feb 02 10:50:35 crc kubenswrapper[4909]: I0202 10:50:35.979925 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.100613 4909 generic.go:334] "Generic (PLEG): container finished" podID="c9b1ee43-97f6-4b9c-9abe-ba79f5694c48" containerID="7ec049f0d04a0e397b9675c802ee25f2308b6c17de78026626e4edae11e67077" exitCode=0 Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.100866 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qpqvt-config-w8f2c" event={"ID":"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48","Type":"ContainerDied","Data":"7ec049f0d04a0e397b9675c802ee25f2308b6c17de78026626e4edae11e67077"} Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.117858 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerStarted","Data":"bdf18a1a02fe85f1bb6cf3ed370c1c5c18587a314b6cf14fea7469bf28031585"} Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.295054 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.574382 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-dgnzc"] Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.575328 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-dgnzc" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.589089 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-dgnzc"] Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.595171 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-dd3e-account-create-update-phmjt"] Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.596177 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-dd3e-account-create-update-phmjt" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.606095 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.648320 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-dd3e-account-create-update-phmjt"] Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.705672 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd9c5\" (UniqueName: \"kubernetes.io/projected/784c4213-ccd0-4c0a-9205-9f251e470297-kube-api-access-nd9c5\") pod \"cinder-db-create-dgnzc\" (UID: \"784c4213-ccd0-4c0a-9205-9f251e470297\") " pod="openstack/cinder-db-create-dgnzc" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.705821 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85mlh\" (UniqueName: \"kubernetes.io/projected/5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c-kube-api-access-85mlh\") pod \"barbican-dd3e-account-create-update-phmjt\" (UID: \"5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c\") " pod="openstack/barbican-dd3e-account-create-update-phmjt" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.705854 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/784c4213-ccd0-4c0a-9205-9f251e470297-operator-scripts\") pod \"cinder-db-create-dgnzc\" (UID: \"784c4213-ccd0-4c0a-9205-9f251e470297\") " pod="openstack/cinder-db-create-dgnzc" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.705892 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c-operator-scripts\") pod 
\"barbican-dd3e-account-create-update-phmjt\" (UID: \"5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c\") " pod="openstack/barbican-dd3e-account-create-update-phmjt" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.757894 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-082b-account-create-update-wx89t"] Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.758844 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-082b-account-create-update-wx89t" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.762859 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.763482 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-rx2rn"] Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.764799 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-rx2rn" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.772775 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-082b-account-create-update-wx89t"] Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.788999 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-rx2rn"] Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.807989 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85mlh\" (UniqueName: \"kubernetes.io/projected/5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c-kube-api-access-85mlh\") pod \"barbican-dd3e-account-create-update-phmjt\" (UID: \"5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c\") " pod="openstack/barbican-dd3e-account-create-update-phmjt" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.808033 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/784c4213-ccd0-4c0a-9205-9f251e470297-operator-scripts\") pod \"cinder-db-create-dgnzc\" (UID: \"784c4213-ccd0-4c0a-9205-9f251e470297\") " pod="openstack/cinder-db-create-dgnzc" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.808059 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c-operator-scripts\") pod \"barbican-dd3e-account-create-update-phmjt\" (UID: \"5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c\") " pod="openstack/barbican-dd3e-account-create-update-phmjt" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.808156 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd9c5\" (UniqueName: \"kubernetes.io/projected/784c4213-ccd0-4c0a-9205-9f251e470297-kube-api-access-nd9c5\") pod \"cinder-db-create-dgnzc\" (UID: \"784c4213-ccd0-4c0a-9205-9f251e470297\") " pod="openstack/cinder-db-create-dgnzc" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.808966 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c-operator-scripts\") pod \"barbican-dd3e-account-create-update-phmjt\" (UID: \"5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c\") " pod="openstack/barbican-dd3e-account-create-update-phmjt" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.809095 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/784c4213-ccd0-4c0a-9205-9f251e470297-operator-scripts\") pod \"cinder-db-create-dgnzc\" (UID: \"784c4213-ccd0-4c0a-9205-9f251e470297\") " pod="openstack/cinder-db-create-dgnzc" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.832179 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85mlh\" (UniqueName: 
\"kubernetes.io/projected/5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c-kube-api-access-85mlh\") pod \"barbican-dd3e-account-create-update-phmjt\" (UID: \"5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c\") " pod="openstack/barbican-dd3e-account-create-update-phmjt" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.846008 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd9c5\" (UniqueName: \"kubernetes.io/projected/784c4213-ccd0-4c0a-9205-9f251e470297-kube-api-access-nd9c5\") pod \"cinder-db-create-dgnzc\" (UID: \"784c4213-ccd0-4c0a-9205-9f251e470297\") " pod="openstack/cinder-db-create-dgnzc" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.864961 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-smqnr"] Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.868700 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-smqnr" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.872053 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.872173 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-l4g7z" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.872347 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.874177 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.881497 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-smqnr"] Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.897906 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-dgnzc" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.909194 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72af2fcd-83e7-4adf-bbf6-399f49b07e5b-operator-scripts\") pod \"cinder-082b-account-create-update-wx89t\" (UID: \"72af2fcd-83e7-4adf-bbf6-399f49b07e5b\") " pod="openstack/cinder-082b-account-create-update-wx89t" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.909265 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bad7649-e463-477c-b9d9-b317be65e8d1-operator-scripts\") pod \"barbican-db-create-rx2rn\" (UID: \"6bad7649-e463-477c-b9d9-b317be65e8d1\") " pod="openstack/barbican-db-create-rx2rn" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.909338 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6vp5\" (UniqueName: \"kubernetes.io/projected/72af2fcd-83e7-4adf-bbf6-399f49b07e5b-kube-api-access-t6vp5\") pod \"cinder-082b-account-create-update-wx89t\" (UID: \"72af2fcd-83e7-4adf-bbf6-399f49b07e5b\") " pod="openstack/cinder-082b-account-create-update-wx89t" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.909363 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq8cz\" (UniqueName: \"kubernetes.io/projected/6bad7649-e463-477c-b9d9-b317be65e8d1-kube-api-access-lq8cz\") pod \"barbican-db-create-rx2rn\" (UID: \"6bad7649-e463-477c-b9d9-b317be65e8d1\") " pod="openstack/barbican-db-create-rx2rn" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.920566 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-dd3e-account-create-update-phmjt" Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.978871 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-nlvbk"] Feb 02 10:50:36 crc kubenswrapper[4909]: I0202 10:50:36.986555 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-nlvbk" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.003527 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-nlvbk"] Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.010539 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72af2fcd-83e7-4adf-bbf6-399f49b07e5b-operator-scripts\") pod \"cinder-082b-account-create-update-wx89t\" (UID: \"72af2fcd-83e7-4adf-bbf6-399f49b07e5b\") " pod="openstack/cinder-082b-account-create-update-wx89t" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.010595 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bad7649-e463-477c-b9d9-b317be65e8d1-operator-scripts\") pod \"barbican-db-create-rx2rn\" (UID: \"6bad7649-e463-477c-b9d9-b317be65e8d1\") " pod="openstack/barbican-db-create-rx2rn" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.010625 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvmlv\" (UniqueName: \"kubernetes.io/projected/588438b1-3078-46cb-a08a-8f1f215ee5f4-kube-api-access-vvmlv\") pod \"keystone-db-sync-smqnr\" (UID: \"588438b1-3078-46cb-a08a-8f1f215ee5f4\") " pod="openstack/keystone-db-sync-smqnr" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.010661 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/588438b1-3078-46cb-a08a-8f1f215ee5f4-combined-ca-bundle\") pod \"keystone-db-sync-smqnr\" (UID: \"588438b1-3078-46cb-a08a-8f1f215ee5f4\") " pod="openstack/keystone-db-sync-smqnr" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.010697 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq8cz\" (UniqueName: \"kubernetes.io/projected/6bad7649-e463-477c-b9d9-b317be65e8d1-kube-api-access-lq8cz\") pod \"barbican-db-create-rx2rn\" (UID: \"6bad7649-e463-477c-b9d9-b317be65e8d1\") " pod="openstack/barbican-db-create-rx2rn" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.010718 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6vp5\" (UniqueName: \"kubernetes.io/projected/72af2fcd-83e7-4adf-bbf6-399f49b07e5b-kube-api-access-t6vp5\") pod \"cinder-082b-account-create-update-wx89t\" (UID: \"72af2fcd-83e7-4adf-bbf6-399f49b07e5b\") " pod="openstack/cinder-082b-account-create-update-wx89t" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.010744 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/588438b1-3078-46cb-a08a-8f1f215ee5f4-config-data\") pod \"keystone-db-sync-smqnr\" (UID: \"588438b1-3078-46cb-a08a-8f1f215ee5f4\") " pod="openstack/keystone-db-sync-smqnr" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.011462 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72af2fcd-83e7-4adf-bbf6-399f49b07e5b-operator-scripts\") pod \"cinder-082b-account-create-update-wx89t\" (UID: \"72af2fcd-83e7-4adf-bbf6-399f49b07e5b\") " pod="openstack/cinder-082b-account-create-update-wx89t" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.015376 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/6bad7649-e463-477c-b9d9-b317be65e8d1-operator-scripts\") pod \"barbican-db-create-rx2rn\" (UID: \"6bad7649-e463-477c-b9d9-b317be65e8d1\") " pod="openstack/barbican-db-create-rx2rn" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.045302 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq8cz\" (UniqueName: \"kubernetes.io/projected/6bad7649-e463-477c-b9d9-b317be65e8d1-kube-api-access-lq8cz\") pod \"barbican-db-create-rx2rn\" (UID: \"6bad7649-e463-477c-b9d9-b317be65e8d1\") " pod="openstack/barbican-db-create-rx2rn" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.050701 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-c8ec-account-create-update-cphbx"] Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.053254 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c8ec-account-create-update-cphbx"] Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.053464 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c8ec-account-create-update-cphbx" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.056078 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.058393 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6vp5\" (UniqueName: \"kubernetes.io/projected/72af2fcd-83e7-4adf-bbf6-399f49b07e5b-kube-api-access-t6vp5\") pod \"cinder-082b-account-create-update-wx89t\" (UID: \"72af2fcd-83e7-4adf-bbf6-399f49b07e5b\") " pod="openstack/cinder-082b-account-create-update-wx89t" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.078527 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-082b-account-create-update-wx89t" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.089348 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-rx2rn" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.113662 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dks2\" (UniqueName: \"kubernetes.io/projected/05519046-6a41-47f7-9247-03cff29382a5-kube-api-access-7dks2\") pod \"neutron-db-create-nlvbk\" (UID: \"05519046-6a41-47f7-9247-03cff29382a5\") " pod="openstack/neutron-db-create-nlvbk" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.113856 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/588438b1-3078-46cb-a08a-8f1f215ee5f4-config-data\") pod \"keystone-db-sync-smqnr\" (UID: \"588438b1-3078-46cb-a08a-8f1f215ee5f4\") " pod="openstack/keystone-db-sync-smqnr" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.116328 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05519046-6a41-47f7-9247-03cff29382a5-operator-scripts\") pod \"neutron-db-create-nlvbk\" (UID: \"05519046-6a41-47f7-9247-03cff29382a5\") " pod="openstack/neutron-db-create-nlvbk" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.116470 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvmlv\" (UniqueName: \"kubernetes.io/projected/588438b1-3078-46cb-a08a-8f1f215ee5f4-kube-api-access-vvmlv\") pod \"keystone-db-sync-smqnr\" (UID: \"588438b1-3078-46cb-a08a-8f1f215ee5f4\") " pod="openstack/keystone-db-sync-smqnr" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.116586 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/588438b1-3078-46cb-a08a-8f1f215ee5f4-combined-ca-bundle\") pod \"keystone-db-sync-smqnr\" (UID: \"588438b1-3078-46cb-a08a-8f1f215ee5f4\") " pod="openstack/keystone-db-sync-smqnr" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.120304 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/588438b1-3078-46cb-a08a-8f1f215ee5f4-config-data\") pod \"keystone-db-sync-smqnr\" (UID: \"588438b1-3078-46cb-a08a-8f1f215ee5f4\") " pod="openstack/keystone-db-sync-smqnr" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.121073 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/588438b1-3078-46cb-a08a-8f1f215ee5f4-combined-ca-bundle\") pod \"keystone-db-sync-smqnr\" (UID: \"588438b1-3078-46cb-a08a-8f1f215ee5f4\") " pod="openstack/keystone-db-sync-smqnr" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.143117 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvmlv\" (UniqueName: \"kubernetes.io/projected/588438b1-3078-46cb-a08a-8f1f215ee5f4-kube-api-access-vvmlv\") pod \"keystone-db-sync-smqnr\" (UID: \"588438b1-3078-46cb-a08a-8f1f215ee5f4\") " pod="openstack/keystone-db-sync-smqnr" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.202579 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-smqnr" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.212313 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerStarted","Data":"bb0826ef7db615ea25639f354b4e3d66a6b4b7a4c615c65933bc47849ffdcf70"} Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.212370 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerStarted","Data":"688f9c6773a03040cd5ec3fcaab21a892ba396e408fdcffda51c117011469111"} Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.217996 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2be7766e-5d21-45ce-8d1f-23b264467c79-operator-scripts\") pod \"neutron-c8ec-account-create-update-cphbx\" (UID: \"2be7766e-5d21-45ce-8d1f-23b264467c79\") " pod="openstack/neutron-c8ec-account-create-update-cphbx" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.218049 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnbfv\" (UniqueName: \"kubernetes.io/projected/2be7766e-5d21-45ce-8d1f-23b264467c79-kube-api-access-mnbfv\") pod \"neutron-c8ec-account-create-update-cphbx\" (UID: \"2be7766e-5d21-45ce-8d1f-23b264467c79\") " pod="openstack/neutron-c8ec-account-create-update-cphbx" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.218085 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05519046-6a41-47f7-9247-03cff29382a5-operator-scripts\") pod \"neutron-db-create-nlvbk\" (UID: \"05519046-6a41-47f7-9247-03cff29382a5\") " pod="openstack/neutron-db-create-nlvbk" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.218136 
4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dks2\" (UniqueName: \"kubernetes.io/projected/05519046-6a41-47f7-9247-03cff29382a5-kube-api-access-7dks2\") pod \"neutron-db-create-nlvbk\" (UID: \"05519046-6a41-47f7-9247-03cff29382a5\") " pod="openstack/neutron-db-create-nlvbk" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.218886 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05519046-6a41-47f7-9247-03cff29382a5-operator-scripts\") pod \"neutron-db-create-nlvbk\" (UID: \"05519046-6a41-47f7-9247-03cff29382a5\") " pod="openstack/neutron-db-create-nlvbk" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.259737 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dks2\" (UniqueName: \"kubernetes.io/projected/05519046-6a41-47f7-9247-03cff29382a5-kube-api-access-7dks2\") pod \"neutron-db-create-nlvbk\" (UID: \"05519046-6a41-47f7-9247-03cff29382a5\") " pod="openstack/neutron-db-create-nlvbk" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.319689 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2be7766e-5d21-45ce-8d1f-23b264467c79-operator-scripts\") pod \"neutron-c8ec-account-create-update-cphbx\" (UID: \"2be7766e-5d21-45ce-8d1f-23b264467c79\") " pod="openstack/neutron-c8ec-account-create-update-cphbx" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.319733 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnbfv\" (UniqueName: \"kubernetes.io/projected/2be7766e-5d21-45ce-8d1f-23b264467c79-kube-api-access-mnbfv\") pod \"neutron-c8ec-account-create-update-cphbx\" (UID: \"2be7766e-5d21-45ce-8d1f-23b264467c79\") " pod="openstack/neutron-c8ec-account-create-update-cphbx" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.320961 
4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2be7766e-5d21-45ce-8d1f-23b264467c79-operator-scripts\") pod \"neutron-c8ec-account-create-update-cphbx\" (UID: \"2be7766e-5d21-45ce-8d1f-23b264467c79\") " pod="openstack/neutron-c8ec-account-create-update-cphbx" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.345874 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnbfv\" (UniqueName: \"kubernetes.io/projected/2be7766e-5d21-45ce-8d1f-23b264467c79-kube-api-access-mnbfv\") pod \"neutron-c8ec-account-create-update-cphbx\" (UID: \"2be7766e-5d21-45ce-8d1f-23b264467c79\") " pod="openstack/neutron-c8ec-account-create-update-cphbx" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.512329 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-nlvbk" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.522992 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c8ec-account-create-update-cphbx" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.635210 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-dgnzc"] Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.791110 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-qpqvt-config-w8f2c" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.810671 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-dd3e-account-create-update-phmjt"] Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.914246 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-smqnr"] Feb 02 10:50:37 crc kubenswrapper[4909]: W0202 10:50:37.916695 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72af2fcd_83e7_4adf_bbf6_399f49b07e5b.slice/crio-cd818de7840a641fc1bad85c6fa8b017db6dc04d2ff3608a41dd6e7391826179 WatchSource:0}: Error finding container cd818de7840a641fc1bad85c6fa8b017db6dc04d2ff3608a41dd6e7391826179: Status 404 returned error can't find the container with id cd818de7840a641fc1bad85c6fa8b017db6dc04d2ff3608a41dd6e7391826179 Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.923721 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-082b-account-create-update-wx89t"] Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.936303 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-var-run\") pod \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\" (UID: \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\") " Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.936341 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn97f\" (UniqueName: \"kubernetes.io/projected/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-kube-api-access-rn97f\") pod \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\" (UID: \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\") " Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.936364 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-var-run-ovn\") pod \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\" (UID: \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\") " Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.936409 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-var-log-ovn\") pod \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\" (UID: \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\") " Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.936469 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-scripts\") pod \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\" (UID: \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\") " Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.936526 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-additional-scripts\") pod \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\" (UID: \"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48\") " Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.937775 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c9b1ee43-97f6-4b9c-9abe-ba79f5694c48" (UID: "c9b1ee43-97f6-4b9c-9abe-ba79f5694c48"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.937831 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c9b1ee43-97f6-4b9c-9abe-ba79f5694c48" (UID: "c9b1ee43-97f6-4b9c-9abe-ba79f5694c48"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.937832 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-var-run" (OuterVolumeSpecName: "var-run") pod "c9b1ee43-97f6-4b9c-9abe-ba79f5694c48" (UID: "c9b1ee43-97f6-4b9c-9abe-ba79f5694c48"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.938274 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c9b1ee43-97f6-4b9c-9abe-ba79f5694c48" (UID: "c9b1ee43-97f6-4b9c-9abe-ba79f5694c48"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.938662 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-scripts" (OuterVolumeSpecName: "scripts") pod "c9b1ee43-97f6-4b9c-9abe-ba79f5694c48" (UID: "c9b1ee43-97f6-4b9c-9abe-ba79f5694c48"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:37 crc kubenswrapper[4909]: I0202 10:50:37.947708 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-kube-api-access-rn97f" (OuterVolumeSpecName: "kube-api-access-rn97f") pod "c9b1ee43-97f6-4b9c-9abe-ba79f5694c48" (UID: "c9b1ee43-97f6-4b9c-9abe-ba79f5694c48"). InnerVolumeSpecName "kube-api-access-rn97f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:50:38 crc kubenswrapper[4909]: I0202 10:50:38.039011 4909 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-var-run\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:38 crc kubenswrapper[4909]: I0202 10:50:38.039056 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn97f\" (UniqueName: \"kubernetes.io/projected/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-kube-api-access-rn97f\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:38 crc kubenswrapper[4909]: I0202 10:50:38.039065 4909 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:38 crc kubenswrapper[4909]: I0202 10:50:38.039074 4909 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:38 crc kubenswrapper[4909]: I0202 10:50:38.039083 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:38 crc kubenswrapper[4909]: I0202 10:50:38.039092 4909 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:38 crc kubenswrapper[4909]: I0202 10:50:38.088918 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-rx2rn"] Feb 02 10:50:38 crc kubenswrapper[4909]: I0202 10:50:38.212518 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-nlvbk"] Feb 02 10:50:38 crc kubenswrapper[4909]: I0202 10:50:38.220407 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-082b-account-create-update-wx89t" event={"ID":"72af2fcd-83e7-4adf-bbf6-399f49b07e5b","Type":"ContainerStarted","Data":"45eafabca7b2a34226244d7ec3e8248a5f7fb392fbe4f2ca423e285105b0e989"} Feb 02 10:50:38 crc kubenswrapper[4909]: I0202 10:50:38.220457 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-082b-account-create-update-wx89t" event={"ID":"72af2fcd-83e7-4adf-bbf6-399f49b07e5b","Type":"ContainerStarted","Data":"cd818de7840a641fc1bad85c6fa8b017db6dc04d2ff3608a41dd6e7391826179"} Feb 02 10:50:38 crc kubenswrapper[4909]: I0202 10:50:38.223386 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerStarted","Data":"f4ba97736c676ea7105906bcc956b64814d9063116ac1ae4c7dd355c1861de1a"} Feb 02 10:50:38 crc kubenswrapper[4909]: I0202 10:50:38.228021 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qpqvt-config-w8f2c" event={"ID":"c9b1ee43-97f6-4b9c-9abe-ba79f5694c48","Type":"ContainerDied","Data":"ba55d395ed2afadbd5cc97722cccf046077fdec712bf599a6ac37715bf1cd5c2"} Feb 02 10:50:38 crc kubenswrapper[4909]: I0202 10:50:38.228063 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba55d395ed2afadbd5cc97722cccf046077fdec712bf599a6ac37715bf1cd5c2" Feb 02 10:50:38 crc kubenswrapper[4909]: I0202 10:50:38.228169 4909 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qpqvt-config-w8f2c" Feb 02 10:50:38 crc kubenswrapper[4909]: I0202 10:50:38.229874 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-dd3e-account-create-update-phmjt" event={"ID":"5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c","Type":"ContainerStarted","Data":"a76a4b10b76a12071750070b74105df86f63621b5651becc758ed36c8b2db6f5"} Feb 02 10:50:38 crc kubenswrapper[4909]: I0202 10:50:38.229912 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-dd3e-account-create-update-phmjt" event={"ID":"5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c","Type":"ContainerStarted","Data":"f9ccf384aac3f5769c9c8678dc5b7a5e2f1f002b96ba0f8f3f03309a704153a3"} Feb 02 10:50:38 crc kubenswrapper[4909]: I0202 10:50:38.231263 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-smqnr" event={"ID":"588438b1-3078-46cb-a08a-8f1f215ee5f4","Type":"ContainerStarted","Data":"142d1251734acd34c247ff0c8e39fb2801bc55b3f004355276a0ce5478f98b0a"} Feb 02 10:50:38 crc kubenswrapper[4909]: I0202 10:50:38.233058 4909 generic.go:334] "Generic (PLEG): container finished" podID="784c4213-ccd0-4c0a-9205-9f251e470297" containerID="0395a622a8f92c7a1198e417ae8f5716c8ea8837376e2548a497195b64fc2694" exitCode=0 Feb 02 10:50:38 crc kubenswrapper[4909]: I0202 10:50:38.233096 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-dgnzc" event={"ID":"784c4213-ccd0-4c0a-9205-9f251e470297","Type":"ContainerDied","Data":"0395a622a8f92c7a1198e417ae8f5716c8ea8837376e2548a497195b64fc2694"} Feb 02 10:50:38 crc kubenswrapper[4909]: I0202 10:50:38.233180 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-dgnzc" event={"ID":"784c4213-ccd0-4c0a-9205-9f251e470297","Type":"ContainerStarted","Data":"47193905a68142216152262425fc600e466892295a23357434f96955397cb84a"} Feb 02 10:50:38 crc 
kubenswrapper[4909]: I0202 10:50:38.234960 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-rx2rn" event={"ID":"6bad7649-e463-477c-b9d9-b317be65e8d1","Type":"ContainerStarted","Data":"ceebe850da0f7fb41c257783ea54a447e4548790c38908664532cf16613dde78"} Feb 02 10:50:38 crc kubenswrapper[4909]: I0202 10:50:38.260734 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-082b-account-create-update-wx89t" podStartSLOduration=2.260712799 podStartE2EDuration="2.260712799s" podCreationTimestamp="2026-02-02 10:50:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:50:38.250434147 +0000 UTC m=+1163.996534882" watchObservedRunningTime="2026-02-02 10:50:38.260712799 +0000 UTC m=+1164.006813534" Feb 02 10:50:38 crc kubenswrapper[4909]: I0202 10:50:38.318018 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c8ec-account-create-update-cphbx"] Feb 02 10:50:38 crc kubenswrapper[4909]: I0202 10:50:38.318981 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-dd3e-account-create-update-phmjt" podStartSLOduration=2.318963883 podStartE2EDuration="2.318963883s" podCreationTimestamp="2026-02-02 10:50:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:50:38.296553396 +0000 UTC m=+1164.042654131" watchObservedRunningTime="2026-02-02 10:50:38.318963883 +0000 UTC m=+1164.065064618" Feb 02 10:50:38 crc kubenswrapper[4909]: I0202 10:50:38.551028 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-qpqvt" Feb 02 10:50:38 crc kubenswrapper[4909]: I0202 10:50:38.900399 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-qpqvt-config-w8f2c"] Feb 02 10:50:38 crc 
kubenswrapper[4909]: I0202 10:50:38.913824 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-qpqvt-config-w8f2c"] Feb 02 10:50:39 crc kubenswrapper[4909]: I0202 10:50:39.031275 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9b1ee43-97f6-4b9c-9abe-ba79f5694c48" path="/var/lib/kubelet/pods/c9b1ee43-97f6-4b9c-9abe-ba79f5694c48/volumes" Feb 02 10:50:39 crc kubenswrapper[4909]: I0202 10:50:39.247033 4909 generic.go:334] "Generic (PLEG): container finished" podID="2be7766e-5d21-45ce-8d1f-23b264467c79" containerID="a11c0ce704f2fc246b98371f13e40088f3fdc710ddd0a844e54c8d408f25ead1" exitCode=0 Feb 02 10:50:39 crc kubenswrapper[4909]: I0202 10:50:39.247123 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c8ec-account-create-update-cphbx" event={"ID":"2be7766e-5d21-45ce-8d1f-23b264467c79","Type":"ContainerDied","Data":"a11c0ce704f2fc246b98371f13e40088f3fdc710ddd0a844e54c8d408f25ead1"} Feb 02 10:50:39 crc kubenswrapper[4909]: I0202 10:50:39.247474 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c8ec-account-create-update-cphbx" event={"ID":"2be7766e-5d21-45ce-8d1f-23b264467c79","Type":"ContainerStarted","Data":"dd9093e2774d44a641ba4958a52756ccfdb18f98035e8e04cfd25c3af11b58ed"} Feb 02 10:50:39 crc kubenswrapper[4909]: I0202 10:50:39.249685 4909 generic.go:334] "Generic (PLEG): container finished" podID="05519046-6a41-47f7-9247-03cff29382a5" containerID="b8c6530aa64d5e6ef2bc02008781ac63633b0840657292c0f1c40fded0ce080d" exitCode=0 Feb 02 10:50:39 crc kubenswrapper[4909]: I0202 10:50:39.249743 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-nlvbk" event={"ID":"05519046-6a41-47f7-9247-03cff29382a5","Type":"ContainerDied","Data":"b8c6530aa64d5e6ef2bc02008781ac63633b0840657292c0f1c40fded0ce080d"} Feb 02 10:50:39 crc kubenswrapper[4909]: I0202 10:50:39.249764 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-create-nlvbk" event={"ID":"05519046-6a41-47f7-9247-03cff29382a5","Type":"ContainerStarted","Data":"344b0716db4dcf1a18eec274ebadc5df95f77e8ddb1e0df372122d4f86e7f571"} Feb 02 10:50:39 crc kubenswrapper[4909]: I0202 10:50:39.251255 4909 generic.go:334] "Generic (PLEG): container finished" podID="6bad7649-e463-477c-b9d9-b317be65e8d1" containerID="9cf5e3332653530749e0dd6aada31d63b66c5777b51173a45a0c7aa3c4572f39" exitCode=0 Feb 02 10:50:39 crc kubenswrapper[4909]: I0202 10:50:39.251326 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-rx2rn" event={"ID":"6bad7649-e463-477c-b9d9-b317be65e8d1","Type":"ContainerDied","Data":"9cf5e3332653530749e0dd6aada31d63b66c5777b51173a45a0c7aa3c4572f39"} Feb 02 10:50:39 crc kubenswrapper[4909]: I0202 10:50:39.252506 4909 generic.go:334] "Generic (PLEG): container finished" podID="72af2fcd-83e7-4adf-bbf6-399f49b07e5b" containerID="45eafabca7b2a34226244d7ec3e8248a5f7fb392fbe4f2ca423e285105b0e989" exitCode=0 Feb 02 10:50:39 crc kubenswrapper[4909]: I0202 10:50:39.252560 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-082b-account-create-update-wx89t" event={"ID":"72af2fcd-83e7-4adf-bbf6-399f49b07e5b","Type":"ContainerDied","Data":"45eafabca7b2a34226244d7ec3e8248a5f7fb392fbe4f2ca423e285105b0e989"} Feb 02 10:50:39 crc kubenswrapper[4909]: I0202 10:50:39.267692 4909 generic.go:334] "Generic (PLEG): container finished" podID="5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c" containerID="a76a4b10b76a12071750070b74105df86f63621b5651becc758ed36c8b2db6f5" exitCode=0 Feb 02 10:50:39 crc kubenswrapper[4909]: I0202 10:50:39.267943 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-dd3e-account-create-update-phmjt" event={"ID":"5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c","Type":"ContainerDied","Data":"a76a4b10b76a12071750070b74105df86f63621b5651becc758ed36c8b2db6f5"} Feb 02 10:50:39 crc kubenswrapper[4909]: I0202 10:50:39.614517 4909 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-dgnzc" Feb 02 10:50:39 crc kubenswrapper[4909]: I0202 10:50:39.774186 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/784c4213-ccd0-4c0a-9205-9f251e470297-operator-scripts\") pod \"784c4213-ccd0-4c0a-9205-9f251e470297\" (UID: \"784c4213-ccd0-4c0a-9205-9f251e470297\") " Feb 02 10:50:39 crc kubenswrapper[4909]: I0202 10:50:39.774366 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nd9c5\" (UniqueName: \"kubernetes.io/projected/784c4213-ccd0-4c0a-9205-9f251e470297-kube-api-access-nd9c5\") pod \"784c4213-ccd0-4c0a-9205-9f251e470297\" (UID: \"784c4213-ccd0-4c0a-9205-9f251e470297\") " Feb 02 10:50:39 crc kubenswrapper[4909]: I0202 10:50:39.775116 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/784c4213-ccd0-4c0a-9205-9f251e470297-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "784c4213-ccd0-4c0a-9205-9f251e470297" (UID: "784c4213-ccd0-4c0a-9205-9f251e470297"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:39 crc kubenswrapper[4909]: I0202 10:50:39.780982 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/784c4213-ccd0-4c0a-9205-9f251e470297-kube-api-access-nd9c5" (OuterVolumeSpecName: "kube-api-access-nd9c5") pod "784c4213-ccd0-4c0a-9205-9f251e470297" (UID: "784c4213-ccd0-4c0a-9205-9f251e470297"). InnerVolumeSpecName "kube-api-access-nd9c5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:50:39 crc kubenswrapper[4909]: I0202 10:50:39.880555 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nd9c5\" (UniqueName: \"kubernetes.io/projected/784c4213-ccd0-4c0a-9205-9f251e470297-kube-api-access-nd9c5\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:39 crc kubenswrapper[4909]: I0202 10:50:39.880647 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/784c4213-ccd0-4c0a-9205-9f251e470297-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:40 crc kubenswrapper[4909]: I0202 10:50:40.277936 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerStarted","Data":"cbfed4f1069ee483c0b5622d010bb4050eb703f82417b83b9be6a69f91f6416e"} Feb 02 10:50:40 crc kubenswrapper[4909]: I0202 10:50:40.278261 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerStarted","Data":"15ada96398d5e422dc198589740e54091d46278fea5a1976da718efa78d1aea0"} Feb 02 10:50:40 crc kubenswrapper[4909]: I0202 10:50:40.278274 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerStarted","Data":"1a36d372a28220791f7d96800ea8889bd6721cfb062a1df523373e1709e54828"} Feb 02 10:50:40 crc kubenswrapper[4909]: I0202 10:50:40.278283 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerStarted","Data":"75e96eb7331791bb4bb7609ae1be6e72027c31244a81358a0312c1fe5e88d79b"} Feb 02 10:50:40 crc kubenswrapper[4909]: I0202 10:50:40.280441 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-dgnzc" Feb 02 10:50:40 crc kubenswrapper[4909]: I0202 10:50:40.280548 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-dgnzc" event={"ID":"784c4213-ccd0-4c0a-9205-9f251e470297","Type":"ContainerDied","Data":"47193905a68142216152262425fc600e466892295a23357434f96955397cb84a"} Feb 02 10:50:40 crc kubenswrapper[4909]: I0202 10:50:40.280636 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47193905a68142216152262425fc600e466892295a23357434f96955397cb84a" Feb 02 10:50:40 crc kubenswrapper[4909]: I0202 10:50:40.690325 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-rx2rn" Feb 02 10:50:40 crc kubenswrapper[4909]: I0202 10:50:40.811911 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq8cz\" (UniqueName: \"kubernetes.io/projected/6bad7649-e463-477c-b9d9-b317be65e8d1-kube-api-access-lq8cz\") pod \"6bad7649-e463-477c-b9d9-b317be65e8d1\" (UID: \"6bad7649-e463-477c-b9d9-b317be65e8d1\") " Feb 02 10:50:40 crc kubenswrapper[4909]: I0202 10:50:40.812000 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bad7649-e463-477c-b9d9-b317be65e8d1-operator-scripts\") pod \"6bad7649-e463-477c-b9d9-b317be65e8d1\" (UID: \"6bad7649-e463-477c-b9d9-b317be65e8d1\") " Feb 02 10:50:40 crc kubenswrapper[4909]: I0202 10:50:40.813134 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bad7649-e463-477c-b9d9-b317be65e8d1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6bad7649-e463-477c-b9d9-b317be65e8d1" (UID: "6bad7649-e463-477c-b9d9-b317be65e8d1"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:40 crc kubenswrapper[4909]: I0202 10:50:40.825126 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bad7649-e463-477c-b9d9-b317be65e8d1-kube-api-access-lq8cz" (OuterVolumeSpecName: "kube-api-access-lq8cz") pod "6bad7649-e463-477c-b9d9-b317be65e8d1" (UID: "6bad7649-e463-477c-b9d9-b317be65e8d1"). InnerVolumeSpecName "kube-api-access-lq8cz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:50:40 crc kubenswrapper[4909]: I0202 10:50:40.853575 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-nlvbk" Feb 02 10:50:40 crc kubenswrapper[4909]: I0202 10:50:40.861219 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c8ec-account-create-update-cphbx" Feb 02 10:50:40 crc kubenswrapper[4909]: I0202 10:50:40.881394 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-dd3e-account-create-update-phmjt" Feb 02 10:50:40 crc kubenswrapper[4909]: I0202 10:50:40.901454 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-082b-account-create-update-wx89t" Feb 02 10:50:40 crc kubenswrapper[4909]: I0202 10:50:40.919508 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq8cz\" (UniqueName: \"kubernetes.io/projected/6bad7649-e463-477c-b9d9-b317be65e8d1-kube-api-access-lq8cz\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:40 crc kubenswrapper[4909]: I0202 10:50:40.919556 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bad7649-e463-477c-b9d9-b317be65e8d1-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.020663 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2be7766e-5d21-45ce-8d1f-23b264467c79-operator-scripts\") pod \"2be7766e-5d21-45ce-8d1f-23b264467c79\" (UID: \"2be7766e-5d21-45ce-8d1f-23b264467c79\") " Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.021308 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2be7766e-5d21-45ce-8d1f-23b264467c79-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2be7766e-5d21-45ce-8d1f-23b264467c79" (UID: "2be7766e-5d21-45ce-8d1f-23b264467c79"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.021879 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dks2\" (UniqueName: \"kubernetes.io/projected/05519046-6a41-47f7-9247-03cff29382a5-kube-api-access-7dks2\") pod \"05519046-6a41-47f7-9247-03cff29382a5\" (UID: \"05519046-6a41-47f7-9247-03cff29382a5\") " Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.022337 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnbfv\" (UniqueName: \"kubernetes.io/projected/2be7766e-5d21-45ce-8d1f-23b264467c79-kube-api-access-mnbfv\") pod \"2be7766e-5d21-45ce-8d1f-23b264467c79\" (UID: \"2be7766e-5d21-45ce-8d1f-23b264467c79\") " Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.022363 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6vp5\" (UniqueName: \"kubernetes.io/projected/72af2fcd-83e7-4adf-bbf6-399f49b07e5b-kube-api-access-t6vp5\") pod \"72af2fcd-83e7-4adf-bbf6-399f49b07e5b\" (UID: \"72af2fcd-83e7-4adf-bbf6-399f49b07e5b\") " Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.022435 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72af2fcd-83e7-4adf-bbf6-399f49b07e5b-operator-scripts\") pod \"72af2fcd-83e7-4adf-bbf6-399f49b07e5b\" (UID: \"72af2fcd-83e7-4adf-bbf6-399f49b07e5b\") " Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.022559 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c-operator-scripts\") pod \"5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c\" (UID: \"5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c\") " Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.022627 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-85mlh\" (UniqueName: \"kubernetes.io/projected/5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c-kube-api-access-85mlh\") pod \"5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c\" (UID: \"5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c\") " Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.022653 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05519046-6a41-47f7-9247-03cff29382a5-operator-scripts\") pod \"05519046-6a41-47f7-9247-03cff29382a5\" (UID: \"05519046-6a41-47f7-9247-03cff29382a5\") " Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.023143 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2be7766e-5d21-45ce-8d1f-23b264467c79-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.023582 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72af2fcd-83e7-4adf-bbf6-399f49b07e5b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "72af2fcd-83e7-4adf-bbf6-399f49b07e5b" (UID: "72af2fcd-83e7-4adf-bbf6-399f49b07e5b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.023745 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05519046-6a41-47f7-9247-03cff29382a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "05519046-6a41-47f7-9247-03cff29382a5" (UID: "05519046-6a41-47f7-9247-03cff29382a5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.024164 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c" (UID: "5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.030130 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72af2fcd-83e7-4adf-bbf6-399f49b07e5b-kube-api-access-t6vp5" (OuterVolumeSpecName: "kube-api-access-t6vp5") pod "72af2fcd-83e7-4adf-bbf6-399f49b07e5b" (UID: "72af2fcd-83e7-4adf-bbf6-399f49b07e5b"). InnerVolumeSpecName "kube-api-access-t6vp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.030196 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05519046-6a41-47f7-9247-03cff29382a5-kube-api-access-7dks2" (OuterVolumeSpecName: "kube-api-access-7dks2") pod "05519046-6a41-47f7-9247-03cff29382a5" (UID: "05519046-6a41-47f7-9247-03cff29382a5"). InnerVolumeSpecName "kube-api-access-7dks2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.030226 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2be7766e-5d21-45ce-8d1f-23b264467c79-kube-api-access-mnbfv" (OuterVolumeSpecName: "kube-api-access-mnbfv") pod "2be7766e-5d21-45ce-8d1f-23b264467c79" (UID: "2be7766e-5d21-45ce-8d1f-23b264467c79"). InnerVolumeSpecName "kube-api-access-mnbfv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.030252 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c-kube-api-access-85mlh" (OuterVolumeSpecName: "kube-api-access-85mlh") pod "5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c" (UID: "5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c"). InnerVolumeSpecName "kube-api-access-85mlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.124686 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.124717 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85mlh\" (UniqueName: \"kubernetes.io/projected/5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c-kube-api-access-85mlh\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.124730 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/05519046-6a41-47f7-9247-03cff29382a5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.124740 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dks2\" (UniqueName: \"kubernetes.io/projected/05519046-6a41-47f7-9247-03cff29382a5-kube-api-access-7dks2\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.124795 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnbfv\" (UniqueName: \"kubernetes.io/projected/2be7766e-5d21-45ce-8d1f-23b264467c79-kube-api-access-mnbfv\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.124815 4909 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6vp5\" (UniqueName: \"kubernetes.io/projected/72af2fcd-83e7-4adf-bbf6-399f49b07e5b-kube-api-access-t6vp5\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.124824 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72af2fcd-83e7-4adf-bbf6-399f49b07e5b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.290786 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-nlvbk" event={"ID":"05519046-6a41-47f7-9247-03cff29382a5","Type":"ContainerDied","Data":"344b0716db4dcf1a18eec274ebadc5df95f77e8ddb1e0df372122d4f86e7f571"} Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.291188 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="344b0716db4dcf1a18eec274ebadc5df95f77e8ddb1e0df372122d4f86e7f571" Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.291284 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-nlvbk" Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.297894 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-rx2rn" event={"ID":"6bad7649-e463-477c-b9d9-b317be65e8d1","Type":"ContainerDied","Data":"ceebe850da0f7fb41c257783ea54a447e4548790c38908664532cf16613dde78"} Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.297938 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ceebe850da0f7fb41c257783ea54a447e4548790c38908664532cf16613dde78" Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.297953 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-rx2rn" Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.301035 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-082b-account-create-update-wx89t" event={"ID":"72af2fcd-83e7-4adf-bbf6-399f49b07e5b","Type":"ContainerDied","Data":"cd818de7840a641fc1bad85c6fa8b017db6dc04d2ff3608a41dd6e7391826179"} Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.301081 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd818de7840a641fc1bad85c6fa8b017db6dc04d2ff3608a41dd6e7391826179" Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.301052 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-082b-account-create-update-wx89t" Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.303279 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-dd3e-account-create-update-phmjt" Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.303280 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-dd3e-account-create-update-phmjt" event={"ID":"5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c","Type":"ContainerDied","Data":"f9ccf384aac3f5769c9c8678dc5b7a5e2f1f002b96ba0f8f3f03309a704153a3"} Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.303381 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9ccf384aac3f5769c9c8678dc5b7a5e2f1f002b96ba0f8f3f03309a704153a3" Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.306891 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c8ec-account-create-update-cphbx" event={"ID":"2be7766e-5d21-45ce-8d1f-23b264467c79","Type":"ContainerDied","Data":"dd9093e2774d44a641ba4958a52756ccfdb18f98035e8e04cfd25c3af11b58ed"} Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.306921 4909 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="dd9093e2774d44a641ba4958a52756ccfdb18f98035e8e04cfd25c3af11b58ed" Feb 02 10:50:41 crc kubenswrapper[4909]: I0202 10:50:41.306986 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c8ec-account-create-update-cphbx" Feb 02 10:50:45 crc kubenswrapper[4909]: I0202 10:50:45.351472 4909 generic.go:334] "Generic (PLEG): container finished" podID="db6d95b6-71f4-47be-90e2-64ebcf72442c" containerID="aa704490c8fecb94ae45cd2a957887fc69b4f0520c85a9a5d292b5f18c60a975" exitCode=0 Feb 02 10:50:45 crc kubenswrapper[4909]: I0202 10:50:45.351554 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wn6km" event={"ID":"db6d95b6-71f4-47be-90e2-64ebcf72442c","Type":"ContainerDied","Data":"aa704490c8fecb94ae45cd2a957887fc69b4f0520c85a9a5d292b5f18c60a975"} Feb 02 10:50:46 crc kubenswrapper[4909]: I0202 10:50:46.363705 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-smqnr" event={"ID":"588438b1-3078-46cb-a08a-8f1f215ee5f4","Type":"ContainerStarted","Data":"65512ab62f349dbb8f62b135a336a5935d43e474385b936dc740477c5a0ddcb9"} Feb 02 10:50:46 crc kubenswrapper[4909]: I0202 10:50:46.392618 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-smqnr" podStartSLOduration=2.963383908 podStartE2EDuration="10.392596539s" podCreationTimestamp="2026-02-02 10:50:36 +0000 UTC" firstStartedPulling="2026-02-02 10:50:37.94021758 +0000 UTC m=+1163.686318315" lastFinishedPulling="2026-02-02 10:50:45.369430211 +0000 UTC m=+1171.115530946" observedRunningTime="2026-02-02 10:50:46.384237731 +0000 UTC m=+1172.130338476" watchObservedRunningTime="2026-02-02 10:50:46.392596539 +0000 UTC m=+1172.138697294" Feb 02 10:50:46 crc kubenswrapper[4909]: I0202 10:50:46.790159 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wn6km" Feb 02 10:50:46 crc kubenswrapper[4909]: I0202 10:50:46.927457 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/db6d95b6-71f4-47be-90e2-64ebcf72442c-db-sync-config-data\") pod \"db6d95b6-71f4-47be-90e2-64ebcf72442c\" (UID: \"db6d95b6-71f4-47be-90e2-64ebcf72442c\") " Feb 02 10:50:46 crc kubenswrapper[4909]: I0202 10:50:46.927538 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdgzj\" (UniqueName: \"kubernetes.io/projected/db6d95b6-71f4-47be-90e2-64ebcf72442c-kube-api-access-fdgzj\") pod \"db6d95b6-71f4-47be-90e2-64ebcf72442c\" (UID: \"db6d95b6-71f4-47be-90e2-64ebcf72442c\") " Feb 02 10:50:46 crc kubenswrapper[4909]: I0202 10:50:46.927576 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db6d95b6-71f4-47be-90e2-64ebcf72442c-combined-ca-bundle\") pod \"db6d95b6-71f4-47be-90e2-64ebcf72442c\" (UID: \"db6d95b6-71f4-47be-90e2-64ebcf72442c\") " Feb 02 10:50:46 crc kubenswrapper[4909]: I0202 10:50:46.927659 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db6d95b6-71f4-47be-90e2-64ebcf72442c-config-data\") pod \"db6d95b6-71f4-47be-90e2-64ebcf72442c\" (UID: \"db6d95b6-71f4-47be-90e2-64ebcf72442c\") " Feb 02 10:50:46 crc kubenswrapper[4909]: I0202 10:50:46.942511 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db6d95b6-71f4-47be-90e2-64ebcf72442c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "db6d95b6-71f4-47be-90e2-64ebcf72442c" (UID: "db6d95b6-71f4-47be-90e2-64ebcf72442c"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:50:46 crc kubenswrapper[4909]: I0202 10:50:46.943078 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db6d95b6-71f4-47be-90e2-64ebcf72442c-kube-api-access-fdgzj" (OuterVolumeSpecName: "kube-api-access-fdgzj") pod "db6d95b6-71f4-47be-90e2-64ebcf72442c" (UID: "db6d95b6-71f4-47be-90e2-64ebcf72442c"). InnerVolumeSpecName "kube-api-access-fdgzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:50:46 crc kubenswrapper[4909]: I0202 10:50:46.959846 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db6d95b6-71f4-47be-90e2-64ebcf72442c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db6d95b6-71f4-47be-90e2-64ebcf72442c" (UID: "db6d95b6-71f4-47be-90e2-64ebcf72442c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:50:46 crc kubenswrapper[4909]: I0202 10:50:46.976470 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db6d95b6-71f4-47be-90e2-64ebcf72442c-config-data" (OuterVolumeSpecName: "config-data") pod "db6d95b6-71f4-47be-90e2-64ebcf72442c" (UID: "db6d95b6-71f4-47be-90e2-64ebcf72442c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.030506 4909 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/db6d95b6-71f4-47be-90e2-64ebcf72442c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.030541 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdgzj\" (UniqueName: \"kubernetes.io/projected/db6d95b6-71f4-47be-90e2-64ebcf72442c-kube-api-access-fdgzj\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.030555 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db6d95b6-71f4-47be-90e2-64ebcf72442c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.030566 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db6d95b6-71f4-47be-90e2-64ebcf72442c-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.378730 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerStarted","Data":"4769dc8794676e2cda42fe1e1591dca654572c12d01ef7df149e868e62a3a604"} Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.378795 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerStarted","Data":"858adda0671141db145aafebaea7fb2c8fe86cafddc9fe78b0569ebbb13f0012"} Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.378829 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerStarted","Data":"3c20875f1de9fb350baed752230d656b218f7820f84b09af0fae63228ac55300"} Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.381068 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wn6km" Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.381071 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wn6km" event={"ID":"db6d95b6-71f4-47be-90e2-64ebcf72442c","Type":"ContainerDied","Data":"881c9b582aa0a555ca889c481e82710dd15b4f1bad373cc5e41d2e398dd4826e"} Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.381103 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="881c9b582aa0a555ca889c481e82710dd15b4f1bad373cc5e41d2e398dd4826e" Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.807407 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d489f5d97-4skz9"] Feb 02 10:50:47 crc kubenswrapper[4909]: E0202 10:50:47.808048 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784c4213-ccd0-4c0a-9205-9f251e470297" containerName="mariadb-database-create" Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.808064 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="784c4213-ccd0-4c0a-9205-9f251e470297" containerName="mariadb-database-create" Feb 02 10:50:47 crc kubenswrapper[4909]: E0202 10:50:47.808077 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9b1ee43-97f6-4b9c-9abe-ba79f5694c48" containerName="ovn-config" Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.808083 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9b1ee43-97f6-4b9c-9abe-ba79f5694c48" containerName="ovn-config" Feb 02 10:50:47 crc kubenswrapper[4909]: E0202 10:50:47.808095 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72af2fcd-83e7-4adf-bbf6-399f49b07e5b" 
containerName="mariadb-account-create-update" Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.808104 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="72af2fcd-83e7-4adf-bbf6-399f49b07e5b" containerName="mariadb-account-create-update" Feb 02 10:50:47 crc kubenswrapper[4909]: E0202 10:50:47.808120 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c" containerName="mariadb-account-create-update" Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.808127 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c" containerName="mariadb-account-create-update" Feb 02 10:50:47 crc kubenswrapper[4909]: E0202 10:50:47.808148 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bad7649-e463-477c-b9d9-b317be65e8d1" containerName="mariadb-database-create" Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.808155 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bad7649-e463-477c-b9d9-b317be65e8d1" containerName="mariadb-database-create" Feb 02 10:50:47 crc kubenswrapper[4909]: E0202 10:50:47.808166 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05519046-6a41-47f7-9247-03cff29382a5" containerName="mariadb-database-create" Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.808174 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="05519046-6a41-47f7-9247-03cff29382a5" containerName="mariadb-database-create" Feb 02 10:50:47 crc kubenswrapper[4909]: E0202 10:50:47.808186 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2be7766e-5d21-45ce-8d1f-23b264467c79" containerName="mariadb-account-create-update" Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.808193 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2be7766e-5d21-45ce-8d1f-23b264467c79" containerName="mariadb-account-create-update" Feb 02 10:50:47 crc kubenswrapper[4909]: E0202 10:50:47.808202 4909 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db6d95b6-71f4-47be-90e2-64ebcf72442c" containerName="glance-db-sync" Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.808207 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="db6d95b6-71f4-47be-90e2-64ebcf72442c" containerName="glance-db-sync" Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.808382 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="db6d95b6-71f4-47be-90e2-64ebcf72442c" containerName="glance-db-sync" Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.808403 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c" containerName="mariadb-account-create-update" Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.808412 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2be7766e-5d21-45ce-8d1f-23b264467c79" containerName="mariadb-account-create-update" Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.808423 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="05519046-6a41-47f7-9247-03cff29382a5" containerName="mariadb-database-create" Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.808434 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="72af2fcd-83e7-4adf-bbf6-399f49b07e5b" containerName="mariadb-account-create-update" Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.808441 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="784c4213-ccd0-4c0a-9205-9f251e470297" containerName="mariadb-database-create" Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.808448 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9b1ee43-97f6-4b9c-9abe-ba79f5694c48" containerName="ovn-config" Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.808459 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bad7649-e463-477c-b9d9-b317be65e8d1" 
containerName="mariadb-database-create" Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.809307 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d489f5d97-4skz9" Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.831098 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d489f5d97-4skz9"] Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.952692 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7gdp\" (UniqueName: \"kubernetes.io/projected/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-kube-api-access-g7gdp\") pod \"dnsmasq-dns-d489f5d97-4skz9\" (UID: \"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a\") " pod="openstack/dnsmasq-dns-d489f5d97-4skz9" Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.952786 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-config\") pod \"dnsmasq-dns-d489f5d97-4skz9\" (UID: \"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a\") " pod="openstack/dnsmasq-dns-d489f5d97-4skz9" Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.952876 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-ovsdbserver-sb\") pod \"dnsmasq-dns-d489f5d97-4skz9\" (UID: \"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a\") " pod="openstack/dnsmasq-dns-d489f5d97-4skz9" Feb 02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.952993 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-dns-svc\") pod \"dnsmasq-dns-d489f5d97-4skz9\" (UID: \"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a\") " pod="openstack/dnsmasq-dns-d489f5d97-4skz9" Feb 
02 10:50:47 crc kubenswrapper[4909]: I0202 10:50:47.953164 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-ovsdbserver-nb\") pod \"dnsmasq-dns-d489f5d97-4skz9\" (UID: \"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a\") " pod="openstack/dnsmasq-dns-d489f5d97-4skz9" Feb 02 10:50:48 crc kubenswrapper[4909]: I0202 10:50:48.055035 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7gdp\" (UniqueName: \"kubernetes.io/projected/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-kube-api-access-g7gdp\") pod \"dnsmasq-dns-d489f5d97-4skz9\" (UID: \"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a\") " pod="openstack/dnsmasq-dns-d489f5d97-4skz9" Feb 02 10:50:48 crc kubenswrapper[4909]: I0202 10:50:48.055090 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-config\") pod \"dnsmasq-dns-d489f5d97-4skz9\" (UID: \"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a\") " pod="openstack/dnsmasq-dns-d489f5d97-4skz9" Feb 02 10:50:48 crc kubenswrapper[4909]: I0202 10:50:48.055155 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-ovsdbserver-sb\") pod \"dnsmasq-dns-d489f5d97-4skz9\" (UID: \"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a\") " pod="openstack/dnsmasq-dns-d489f5d97-4skz9" Feb 02 10:50:48 crc kubenswrapper[4909]: I0202 10:50:48.055199 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-dns-svc\") pod \"dnsmasq-dns-d489f5d97-4skz9\" (UID: \"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a\") " pod="openstack/dnsmasq-dns-d489f5d97-4skz9" Feb 02 10:50:48 crc kubenswrapper[4909]: I0202 10:50:48.055260 
4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-ovsdbserver-nb\") pod \"dnsmasq-dns-d489f5d97-4skz9\" (UID: \"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a\") " pod="openstack/dnsmasq-dns-d489f5d97-4skz9" Feb 02 10:50:48 crc kubenswrapper[4909]: I0202 10:50:48.056688 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-config\") pod \"dnsmasq-dns-d489f5d97-4skz9\" (UID: \"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a\") " pod="openstack/dnsmasq-dns-d489f5d97-4skz9" Feb 02 10:50:48 crc kubenswrapper[4909]: I0202 10:50:48.056725 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-ovsdbserver-nb\") pod \"dnsmasq-dns-d489f5d97-4skz9\" (UID: \"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a\") " pod="openstack/dnsmasq-dns-d489f5d97-4skz9" Feb 02 10:50:48 crc kubenswrapper[4909]: I0202 10:50:48.056776 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-ovsdbserver-sb\") pod \"dnsmasq-dns-d489f5d97-4skz9\" (UID: \"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a\") " pod="openstack/dnsmasq-dns-d489f5d97-4skz9" Feb 02 10:50:48 crc kubenswrapper[4909]: I0202 10:50:48.056954 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-dns-svc\") pod \"dnsmasq-dns-d489f5d97-4skz9\" (UID: \"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a\") " pod="openstack/dnsmasq-dns-d489f5d97-4skz9" Feb 02 10:50:48 crc kubenswrapper[4909]: I0202 10:50:48.074910 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7gdp\" (UniqueName: 
\"kubernetes.io/projected/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-kube-api-access-g7gdp\") pod \"dnsmasq-dns-d489f5d97-4skz9\" (UID: \"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a\") " pod="openstack/dnsmasq-dns-d489f5d97-4skz9" Feb 02 10:50:48 crc kubenswrapper[4909]: I0202 10:50:48.188517 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d489f5d97-4skz9" Feb 02 10:50:48 crc kubenswrapper[4909]: I0202 10:50:48.400646 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerStarted","Data":"403544c94cece49fecf3e837a0ceb79d6332a055fdfa13f162fa0295add4bbb7"} Feb 02 10:50:48 crc kubenswrapper[4909]: I0202 10:50:48.401023 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerStarted","Data":"e7bf2be62b7e97abc8524922264d489ae88403ad72a536c23b1a2c1f6278395b"} Feb 02 10:50:48 crc kubenswrapper[4909]: I0202 10:50:48.401036 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerStarted","Data":"2e0f7d484d5647dce2a899dda4783d32b2783471d37c33f12abbe7c324c5a495"} Feb 02 10:50:48 crc kubenswrapper[4909]: I0202 10:50:48.476372 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d489f5d97-4skz9"] Feb 02 10:50:48 crc kubenswrapper[4909]: W0202 10:50:48.498239 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod136d29c0_9fdd_4ae0_a6ca_53d4eab8006a.slice/crio-31efa8592125fc4c83556ed581bda1a8ff17f3178b28f49e2348f4187482a303 WatchSource:0}: Error finding container 31efa8592125fc4c83556ed581bda1a8ff17f3178b28f49e2348f4187482a303: Status 404 returned error can't find the container with id 
31efa8592125fc4c83556ed581bda1a8ff17f3178b28f49e2348f4187482a303 Feb 02 10:50:49 crc kubenswrapper[4909]: I0202 10:50:49.412824 4909 generic.go:334] "Generic (PLEG): container finished" podID="588438b1-3078-46cb-a08a-8f1f215ee5f4" containerID="65512ab62f349dbb8f62b135a336a5935d43e474385b936dc740477c5a0ddcb9" exitCode=0 Feb 02 10:50:49 crc kubenswrapper[4909]: I0202 10:50:49.412888 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-smqnr" event={"ID":"588438b1-3078-46cb-a08a-8f1f215ee5f4","Type":"ContainerDied","Data":"65512ab62f349dbb8f62b135a336a5935d43e474385b936dc740477c5a0ddcb9"} Feb 02 10:50:49 crc kubenswrapper[4909]: I0202 10:50:49.416867 4909 generic.go:334] "Generic (PLEG): container finished" podID="136d29c0-9fdd-4ae0-a6ca-53d4eab8006a" containerID="8d9214d95ff9ebbcca7ac9ef6d196ba4e7574df86686e43e2e078508cb8d4808" exitCode=0 Feb 02 10:50:49 crc kubenswrapper[4909]: I0202 10:50:49.416932 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d489f5d97-4skz9" event={"ID":"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a","Type":"ContainerDied","Data":"8d9214d95ff9ebbcca7ac9ef6d196ba4e7574df86686e43e2e078508cb8d4808"} Feb 02 10:50:49 crc kubenswrapper[4909]: I0202 10:50:49.416952 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d489f5d97-4skz9" event={"ID":"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a","Type":"ContainerStarted","Data":"31efa8592125fc4c83556ed581bda1a8ff17f3178b28f49e2348f4187482a303"} Feb 02 10:50:49 crc kubenswrapper[4909]: I0202 10:50:49.435651 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerStarted","Data":"5e88d5dc47ad71a01580306942e0f3a9a30eb6a37d0332e786a59186dadbb937"} Feb 02 10:50:49 crc kubenswrapper[4909]: I0202 10:50:49.755517 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" 
podStartSLOduration=37.051948573 podStartE2EDuration="48.755498229s" podCreationTimestamp="2026-02-02 10:50:01 +0000 UTC" firstStartedPulling="2026-02-02 10:50:34.962375752 +0000 UTC m=+1160.708476487" lastFinishedPulling="2026-02-02 10:50:46.665925408 +0000 UTC m=+1172.412026143" observedRunningTime="2026-02-02 10:50:49.497426343 +0000 UTC m=+1175.243527108" watchObservedRunningTime="2026-02-02 10:50:49.755498229 +0000 UTC m=+1175.501598964" Feb 02 10:50:49 crc kubenswrapper[4909]: I0202 10:50:49.762464 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d489f5d97-4skz9"] Feb 02 10:50:49 crc kubenswrapper[4909]: I0202 10:50:49.791792 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-96fb4d4c9-c4gld"] Feb 02 10:50:49 crc kubenswrapper[4909]: I0202 10:50:49.793196 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld" Feb 02 10:50:49 crc kubenswrapper[4909]: I0202 10:50:49.795601 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 02 10:50:49 crc kubenswrapper[4909]: I0202 10:50:49.808369 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-96fb4d4c9-c4gld"] Feb 02 10:50:49 crc kubenswrapper[4909]: I0202 10:50:49.888566 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-ovsdbserver-sb\") pod \"dnsmasq-dns-96fb4d4c9-c4gld\" (UID: \"02ef6e61-480d-4840-a976-0c2567811583\") " pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld" Feb 02 10:50:49 crc kubenswrapper[4909]: I0202 10:50:49.888668 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-ovsdbserver-nb\") pod 
\"dnsmasq-dns-96fb4d4c9-c4gld\" (UID: \"02ef6e61-480d-4840-a976-0c2567811583\") " pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld" Feb 02 10:50:49 crc kubenswrapper[4909]: I0202 10:50:49.888709 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bm77\" (UniqueName: \"kubernetes.io/projected/02ef6e61-480d-4840-a976-0c2567811583-kube-api-access-8bm77\") pod \"dnsmasq-dns-96fb4d4c9-c4gld\" (UID: \"02ef6e61-480d-4840-a976-0c2567811583\") " pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld" Feb 02 10:50:49 crc kubenswrapper[4909]: I0202 10:50:49.888762 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-config\") pod \"dnsmasq-dns-96fb4d4c9-c4gld\" (UID: \"02ef6e61-480d-4840-a976-0c2567811583\") " pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld" Feb 02 10:50:49 crc kubenswrapper[4909]: I0202 10:50:49.888847 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-dns-swift-storage-0\") pod \"dnsmasq-dns-96fb4d4c9-c4gld\" (UID: \"02ef6e61-480d-4840-a976-0c2567811583\") " pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld" Feb 02 10:50:49 crc kubenswrapper[4909]: I0202 10:50:49.888953 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-dns-svc\") pod \"dnsmasq-dns-96fb4d4c9-c4gld\" (UID: \"02ef6e61-480d-4840-a976-0c2567811583\") " pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld" Feb 02 10:50:49 crc kubenswrapper[4909]: I0202 10:50:49.990151 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-config\") pod 
\"dnsmasq-dns-96fb4d4c9-c4gld\" (UID: \"02ef6e61-480d-4840-a976-0c2567811583\") " pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld" Feb 02 10:50:49 crc kubenswrapper[4909]: I0202 10:50:49.990204 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-dns-swift-storage-0\") pod \"dnsmasq-dns-96fb4d4c9-c4gld\" (UID: \"02ef6e61-480d-4840-a976-0c2567811583\") " pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld" Feb 02 10:50:49 crc kubenswrapper[4909]: I0202 10:50:49.990310 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-dns-svc\") pod \"dnsmasq-dns-96fb4d4c9-c4gld\" (UID: \"02ef6e61-480d-4840-a976-0c2567811583\") " pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld" Feb 02 10:50:49 crc kubenswrapper[4909]: I0202 10:50:49.990350 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-ovsdbserver-sb\") pod \"dnsmasq-dns-96fb4d4c9-c4gld\" (UID: \"02ef6e61-480d-4840-a976-0c2567811583\") " pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld" Feb 02 10:50:49 crc kubenswrapper[4909]: I0202 10:50:49.990386 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-ovsdbserver-nb\") pod \"dnsmasq-dns-96fb4d4c9-c4gld\" (UID: \"02ef6e61-480d-4840-a976-0c2567811583\") " pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld" Feb 02 10:50:49 crc kubenswrapper[4909]: I0202 10:50:49.990416 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bm77\" (UniqueName: \"kubernetes.io/projected/02ef6e61-480d-4840-a976-0c2567811583-kube-api-access-8bm77\") pod \"dnsmasq-dns-96fb4d4c9-c4gld\" (UID: 
\"02ef6e61-480d-4840-a976-0c2567811583\") " pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld" Feb 02 10:50:49 crc kubenswrapper[4909]: I0202 10:50:49.991420 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-config\") pod \"dnsmasq-dns-96fb4d4c9-c4gld\" (UID: \"02ef6e61-480d-4840-a976-0c2567811583\") " pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld" Feb 02 10:50:49 crc kubenswrapper[4909]: I0202 10:50:49.991463 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-dns-swift-storage-0\") pod \"dnsmasq-dns-96fb4d4c9-c4gld\" (UID: \"02ef6e61-480d-4840-a976-0c2567811583\") " pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld" Feb 02 10:50:49 crc kubenswrapper[4909]: I0202 10:50:49.991496 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-ovsdbserver-sb\") pod \"dnsmasq-dns-96fb4d4c9-c4gld\" (UID: \"02ef6e61-480d-4840-a976-0c2567811583\") " pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld" Feb 02 10:50:49 crc kubenswrapper[4909]: I0202 10:50:49.991463 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-dns-svc\") pod \"dnsmasq-dns-96fb4d4c9-c4gld\" (UID: \"02ef6e61-480d-4840-a976-0c2567811583\") " pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld" Feb 02 10:50:49 crc kubenswrapper[4909]: I0202 10:50:49.991624 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-ovsdbserver-nb\") pod \"dnsmasq-dns-96fb4d4c9-c4gld\" (UID: \"02ef6e61-480d-4840-a976-0c2567811583\") " pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld" Feb 02 10:50:50 crc 
kubenswrapper[4909]: I0202 10:50:50.007199 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bm77\" (UniqueName: \"kubernetes.io/projected/02ef6e61-480d-4840-a976-0c2567811583-kube-api-access-8bm77\") pod \"dnsmasq-dns-96fb4d4c9-c4gld\" (UID: \"02ef6e61-480d-4840-a976-0c2567811583\") " pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld"
Feb 02 10:50:50 crc kubenswrapper[4909]: I0202 10:50:50.116550 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld"
Feb 02 10:50:50 crc kubenswrapper[4909]: I0202 10:50:50.448161 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d489f5d97-4skz9" event={"ID":"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a","Type":"ContainerStarted","Data":"651fdd21cf78d17cb0d6926d96fd7e9a37ab292f50d9d5001862e14925e7237b"}
Feb 02 10:50:50 crc kubenswrapper[4909]: I0202 10:50:50.448379 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d489f5d97-4skz9"
Feb 02 10:50:50 crc kubenswrapper[4909]: I0202 10:50:50.478994 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d489f5d97-4skz9" podStartSLOduration=3.478971418 podStartE2EDuration="3.478971418s" podCreationTimestamp="2026-02-02 10:50:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:50:50.465913248 +0000 UTC m=+1176.212013983" watchObservedRunningTime="2026-02-02 10:50:50.478971418 +0000 UTC m=+1176.225072153"
Feb 02 10:50:50 crc kubenswrapper[4909]: I0202 10:50:50.601467 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-96fb4d4c9-c4gld"]
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.183773 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-smqnr"
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.310587 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/588438b1-3078-46cb-a08a-8f1f215ee5f4-config-data\") pod \"588438b1-3078-46cb-a08a-8f1f215ee5f4\" (UID: \"588438b1-3078-46cb-a08a-8f1f215ee5f4\") "
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.310745 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvmlv\" (UniqueName: \"kubernetes.io/projected/588438b1-3078-46cb-a08a-8f1f215ee5f4-kube-api-access-vvmlv\") pod \"588438b1-3078-46cb-a08a-8f1f215ee5f4\" (UID: \"588438b1-3078-46cb-a08a-8f1f215ee5f4\") "
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.311378 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/588438b1-3078-46cb-a08a-8f1f215ee5f4-combined-ca-bundle\") pod \"588438b1-3078-46cb-a08a-8f1f215ee5f4\" (UID: \"588438b1-3078-46cb-a08a-8f1f215ee5f4\") "
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.316991 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/588438b1-3078-46cb-a08a-8f1f215ee5f4-kube-api-access-vvmlv" (OuterVolumeSpecName: "kube-api-access-vvmlv") pod "588438b1-3078-46cb-a08a-8f1f215ee5f4" (UID: "588438b1-3078-46cb-a08a-8f1f215ee5f4"). InnerVolumeSpecName "kube-api-access-vvmlv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.336366 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/588438b1-3078-46cb-a08a-8f1f215ee5f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "588438b1-3078-46cb-a08a-8f1f215ee5f4" (UID: "588438b1-3078-46cb-a08a-8f1f215ee5f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.351154 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/588438b1-3078-46cb-a08a-8f1f215ee5f4-config-data" (OuterVolumeSpecName: "config-data") pod "588438b1-3078-46cb-a08a-8f1f215ee5f4" (UID: "588438b1-3078-46cb-a08a-8f1f215ee5f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.413806 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/588438b1-3078-46cb-a08a-8f1f215ee5f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.413862 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/588438b1-3078-46cb-a08a-8f1f215ee5f4-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.413876 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvmlv\" (UniqueName: \"kubernetes.io/projected/588438b1-3078-46cb-a08a-8f1f215ee5f4-kube-api-access-vvmlv\") on node \"crc\" DevicePath \"\""
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.458201 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-smqnr" event={"ID":"588438b1-3078-46cb-a08a-8f1f215ee5f4","Type":"ContainerDied","Data":"142d1251734acd34c247ff0c8e39fb2801bc55b3f004355276a0ce5478f98b0a"}
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.458264 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="142d1251734acd34c247ff0c8e39fb2801bc55b3f004355276a0ce5478f98b0a"
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.458223 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-smqnr"
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.459558 4909 generic.go:334] "Generic (PLEG): container finished" podID="02ef6e61-480d-4840-a976-0c2567811583" containerID="507b111c07a330c7f94282dafbeed8b419e129d1ebca22307f2fc29b39d034cc" exitCode=0
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.459701 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d489f5d97-4skz9" podUID="136d29c0-9fdd-4ae0-a6ca-53d4eab8006a" containerName="dnsmasq-dns" containerID="cri-o://651fdd21cf78d17cb0d6926d96fd7e9a37ab292f50d9d5001862e14925e7237b" gracePeriod=10
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.459682 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld" event={"ID":"02ef6e61-480d-4840-a976-0c2567811583","Type":"ContainerDied","Data":"507b111c07a330c7f94282dafbeed8b419e129d1ebca22307f2fc29b39d034cc"}
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.459788 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld" event={"ID":"02ef6e61-480d-4840-a976-0c2567811583","Type":"ContainerStarted","Data":"c18026af8f07fc3e52c75914998c0e403304ee5b9ebdfc3bd1e9f57b6807eccd"}
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.730723 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-x8ptv"]
Feb 02 10:50:51 crc kubenswrapper[4909]: E0202 10:50:51.734745 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="588438b1-3078-46cb-a08a-8f1f215ee5f4" containerName="keystone-db-sync"
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.734784 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="588438b1-3078-46cb-a08a-8f1f215ee5f4" containerName="keystone-db-sync"
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.735069 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="588438b1-3078-46cb-a08a-8f1f215ee5f4" containerName="keystone-db-sync"
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.735663 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x8ptv"
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.739094 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.739399 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-l4g7z"
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.739412 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.739582 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.739869 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.755179 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-x8ptv"]
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.775578 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-96fb4d4c9-c4gld"]
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.823379 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-scripts\") pod \"keystone-bootstrap-x8ptv\" (UID: \"53774ce4-0f24-484b-afdf-f6023d8498c0\") " pod="openstack/keystone-bootstrap-x8ptv"
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.823423 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5w5z\" (UniqueName: \"kubernetes.io/projected/53774ce4-0f24-484b-afdf-f6023d8498c0-kube-api-access-g5w5z\") pod \"keystone-bootstrap-x8ptv\" (UID: \"53774ce4-0f24-484b-afdf-f6023d8498c0\") " pod="openstack/keystone-bootstrap-x8ptv"
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.823461 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-credential-keys\") pod \"keystone-bootstrap-x8ptv\" (UID: \"53774ce4-0f24-484b-afdf-f6023d8498c0\") " pod="openstack/keystone-bootstrap-x8ptv"
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.823502 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-combined-ca-bundle\") pod \"keystone-bootstrap-x8ptv\" (UID: \"53774ce4-0f24-484b-afdf-f6023d8498c0\") " pod="openstack/keystone-bootstrap-x8ptv"
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.823529 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-config-data\") pod \"keystone-bootstrap-x8ptv\" (UID: \"53774ce4-0f24-484b-afdf-f6023d8498c0\") " pod="openstack/keystone-bootstrap-x8ptv"
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.823558 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-fernet-keys\") pod \"keystone-bootstrap-x8ptv\" (UID: \"53774ce4-0f24-484b-afdf-f6023d8498c0\") " pod="openstack/keystone-bootstrap-x8ptv"
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.903150 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c4fdd6b7-wlhg2"]
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.923735 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c4fdd6b7-wlhg2"]
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.923915 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c4fdd6b7-wlhg2"
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.926997 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-fernet-keys\") pod \"keystone-bootstrap-x8ptv\" (UID: \"53774ce4-0f24-484b-afdf-f6023d8498c0\") " pod="openstack/keystone-bootstrap-x8ptv"
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.927094 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-scripts\") pod \"keystone-bootstrap-x8ptv\" (UID: \"53774ce4-0f24-484b-afdf-f6023d8498c0\") " pod="openstack/keystone-bootstrap-x8ptv"
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.927326 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5w5z\" (UniqueName: \"kubernetes.io/projected/53774ce4-0f24-484b-afdf-f6023d8498c0-kube-api-access-g5w5z\") pod \"keystone-bootstrap-x8ptv\" (UID: \"53774ce4-0f24-484b-afdf-f6023d8498c0\") " pod="openstack/keystone-bootstrap-x8ptv"
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.927428 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-credential-keys\") pod \"keystone-bootstrap-x8ptv\" (UID: \"53774ce4-0f24-484b-afdf-f6023d8498c0\") " pod="openstack/keystone-bootstrap-x8ptv"
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.927478 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-combined-ca-bundle\") pod \"keystone-bootstrap-x8ptv\" (UID: \"53774ce4-0f24-484b-afdf-f6023d8498c0\") " pod="openstack/keystone-bootstrap-x8ptv"
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.927556 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-config-data\") pod \"keystone-bootstrap-x8ptv\" (UID: \"53774ce4-0f24-484b-afdf-f6023d8498c0\") " pod="openstack/keystone-bootstrap-x8ptv"
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.958654 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-fernet-keys\") pod \"keystone-bootstrap-x8ptv\" (UID: \"53774ce4-0f24-484b-afdf-f6023d8498c0\") " pod="openstack/keystone-bootstrap-x8ptv"
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.958971 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-combined-ca-bundle\") pod \"keystone-bootstrap-x8ptv\" (UID: \"53774ce4-0f24-484b-afdf-f6023d8498c0\") " pod="openstack/keystone-bootstrap-x8ptv"
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.959128 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-scripts\") pod \"keystone-bootstrap-x8ptv\" (UID: \"53774ce4-0f24-484b-afdf-f6023d8498c0\") " pod="openstack/keystone-bootstrap-x8ptv"
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.959484 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-credential-keys\") pod \"keystone-bootstrap-x8ptv\" (UID: \"53774ce4-0f24-484b-afdf-f6023d8498c0\") " pod="openstack/keystone-bootstrap-x8ptv"
Feb 02 10:50:51 crc kubenswrapper[4909]: I0202 10:50:51.961000 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-config-data\") pod \"keystone-bootstrap-x8ptv\" (UID: \"53774ce4-0f24-484b-afdf-f6023d8498c0\") " pod="openstack/keystone-bootstrap-x8ptv"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.021416 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5w5z\" (UniqueName: \"kubernetes.io/projected/53774ce4-0f24-484b-afdf-f6023d8498c0-kube-api-access-g5w5z\") pod \"keystone-bootstrap-x8ptv\" (UID: \"53774ce4-0f24-484b-afdf-f6023d8498c0\") " pod="openstack/keystone-bootstrap-x8ptv"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.030780 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-ovsdbserver-nb\") pod \"dnsmasq-dns-c4fdd6b7-wlhg2\" (UID: \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\") " pod="openstack/dnsmasq-dns-c4fdd6b7-wlhg2"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.030934 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-ovsdbserver-sb\") pod \"dnsmasq-dns-c4fdd6b7-wlhg2\" (UID: \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\") " pod="openstack/dnsmasq-dns-c4fdd6b7-wlhg2"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.031007 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-config\") pod \"dnsmasq-dns-c4fdd6b7-wlhg2\" (UID: \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\") " pod="openstack/dnsmasq-dns-c4fdd6b7-wlhg2"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.031051 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-dns-svc\") pod \"dnsmasq-dns-c4fdd6b7-wlhg2\" (UID: \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\") " pod="openstack/dnsmasq-dns-c4fdd6b7-wlhg2"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.031127 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87c4m\" (UniqueName: \"kubernetes.io/projected/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-kube-api-access-87c4m\") pod \"dnsmasq-dns-c4fdd6b7-wlhg2\" (UID: \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\") " pod="openstack/dnsmasq-dns-c4fdd6b7-wlhg2"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.031172 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-dns-swift-storage-0\") pod \"dnsmasq-dns-c4fdd6b7-wlhg2\" (UID: \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\") " pod="openstack/dnsmasq-dns-c4fdd6b7-wlhg2"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.072182 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x8ptv"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.137770 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87c4m\" (UniqueName: \"kubernetes.io/projected/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-kube-api-access-87c4m\") pod \"dnsmasq-dns-c4fdd6b7-wlhg2\" (UID: \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\") " pod="openstack/dnsmasq-dns-c4fdd6b7-wlhg2"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.137844 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-dns-swift-storage-0\") pod \"dnsmasq-dns-c4fdd6b7-wlhg2\" (UID: \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\") " pod="openstack/dnsmasq-dns-c4fdd6b7-wlhg2"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.137900 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-ovsdbserver-nb\") pod \"dnsmasq-dns-c4fdd6b7-wlhg2\" (UID: \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\") " pod="openstack/dnsmasq-dns-c4fdd6b7-wlhg2"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.137927 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-ovsdbserver-sb\") pod \"dnsmasq-dns-c4fdd6b7-wlhg2\" (UID: \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\") " pod="openstack/dnsmasq-dns-c4fdd6b7-wlhg2"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.137989 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-config\") pod \"dnsmasq-dns-c4fdd6b7-wlhg2\" (UID: \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\") " pod="openstack/dnsmasq-dns-c4fdd6b7-wlhg2"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.138029 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-dns-svc\") pod \"dnsmasq-dns-c4fdd6b7-wlhg2\" (UID: \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\") " pod="openstack/dnsmasq-dns-c4fdd6b7-wlhg2"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.138975 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-dns-svc\") pod \"dnsmasq-dns-c4fdd6b7-wlhg2\" (UID: \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\") " pod="openstack/dnsmasq-dns-c4fdd6b7-wlhg2"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.141849 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-ovsdbserver-sb\") pod \"dnsmasq-dns-c4fdd6b7-wlhg2\" (UID: \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\") " pod="openstack/dnsmasq-dns-c4fdd6b7-wlhg2"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.142633 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-config\") pod \"dnsmasq-dns-c4fdd6b7-wlhg2\" (UID: \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\") " pod="openstack/dnsmasq-dns-c4fdd6b7-wlhg2"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.143554 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-ovsdbserver-nb\") pod \"dnsmasq-dns-c4fdd6b7-wlhg2\" (UID: \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\") " pod="openstack/dnsmasq-dns-c4fdd6b7-wlhg2"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.154005 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-dns-swift-storage-0\") pod \"dnsmasq-dns-c4fdd6b7-wlhg2\" (UID: \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\") " pod="openstack/dnsmasq-dns-c4fdd6b7-wlhg2"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.165886 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-jpqtq"]
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.166903 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-jpqtq"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.194575 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.195610 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-sst8c"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.195809 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.217842 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-xqjmc"]
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.219230 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xqjmc"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.226387 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.227066 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-hsdjf"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.235439 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.238214 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87c4m\" (UniqueName: \"kubernetes.io/projected/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-kube-api-access-87c4m\") pod \"dnsmasq-dns-c4fdd6b7-wlhg2\" (UID: \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\") " pod="openstack/dnsmasq-dns-c4fdd6b7-wlhg2"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.247422 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d489f5d97-4skz9"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.278047 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-xqjmc"]
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.278091 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-jpqtq"]
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.284936 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c4fdd6b7-wlhg2"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.344056 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-ovsdbserver-nb\") pod \"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a\" (UID: \"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a\") "
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.344219 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-dns-svc\") pod \"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a\" (UID: \"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a\") "
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.344290 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-config\") pod \"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a\" (UID: \"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a\") "
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.344325 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-ovsdbserver-sb\") pod \"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a\" (UID: \"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a\") "
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.344383 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7gdp\" (UniqueName: \"kubernetes.io/projected/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-kube-api-access-g7gdp\") pod \"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a\" (UID: \"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a\") "
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.344584 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d68a9e4e-b453-459f-b397-9c6d7c221dda-etc-machine-id\") pod \"cinder-db-sync-xqjmc\" (UID: \"d68a9e4e-b453-459f-b397-9c6d7c221dda\") " pod="openstack/cinder-db-sync-xqjmc"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.344608 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q7g4\" (UniqueName: \"kubernetes.io/projected/d68a9e4e-b453-459f-b397-9c6d7c221dda-kube-api-access-5q7g4\") pod \"cinder-db-sync-xqjmc\" (UID: \"d68a9e4e-b453-459f-b397-9c6d7c221dda\") " pod="openstack/cinder-db-sync-xqjmc"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.344646 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d68a9e4e-b453-459f-b397-9c6d7c221dda-scripts\") pod \"cinder-db-sync-xqjmc\" (UID: \"d68a9e4e-b453-459f-b397-9c6d7c221dda\") " pod="openstack/cinder-db-sync-xqjmc"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.344697 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d68a9e4e-b453-459f-b397-9c6d7c221dda-db-sync-config-data\") pod \"cinder-db-sync-xqjmc\" (UID: \"d68a9e4e-b453-459f-b397-9c6d7c221dda\") " pod="openstack/cinder-db-sync-xqjmc"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.344729 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-259wr\" (UniqueName: \"kubernetes.io/projected/ca377ade-e972-41f6-add9-a0b491d86bbf-kube-api-access-259wr\") pod \"neutron-db-sync-jpqtq\" (UID: \"ca377ade-e972-41f6-add9-a0b491d86bbf\") " pod="openstack/neutron-db-sync-jpqtq"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.344759 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca377ade-e972-41f6-add9-a0b491d86bbf-combined-ca-bundle\") pod \"neutron-db-sync-jpqtq\" (UID: \"ca377ade-e972-41f6-add9-a0b491d86bbf\") " pod="openstack/neutron-db-sync-jpqtq"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.344793 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d68a9e4e-b453-459f-b397-9c6d7c221dda-config-data\") pod \"cinder-db-sync-xqjmc\" (UID: \"d68a9e4e-b453-459f-b397-9c6d7c221dda\") " pod="openstack/cinder-db-sync-xqjmc"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.344937 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ca377ade-e972-41f6-add9-a0b491d86bbf-config\") pod \"neutron-db-sync-jpqtq\" (UID: \"ca377ade-e972-41f6-add9-a0b491d86bbf\") " pod="openstack/neutron-db-sync-jpqtq"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.344963 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d68a9e4e-b453-459f-b397-9c6d7c221dda-combined-ca-bundle\") pod \"cinder-db-sync-xqjmc\" (UID: \"d68a9e4e-b453-459f-b397-9c6d7c221dda\") " pod="openstack/cinder-db-sync-xqjmc"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.348331 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-kube-api-access-g7gdp" (OuterVolumeSpecName: "kube-api-access-g7gdp") pod "136d29c0-9fdd-4ae0-a6ca-53d4eab8006a" (UID: "136d29c0-9fdd-4ae0-a6ca-53d4eab8006a"). InnerVolumeSpecName "kube-api-access-g7gdp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.418877 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-lzdlk"]
Feb 02 10:50:52 crc kubenswrapper[4909]: E0202 10:50:52.419274 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="136d29c0-9fdd-4ae0-a6ca-53d4eab8006a" containerName="init"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.419287 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="136d29c0-9fdd-4ae0-a6ca-53d4eab8006a" containerName="init"
Feb 02 10:50:52 crc kubenswrapper[4909]: E0202 10:50:52.419308 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="136d29c0-9fdd-4ae0-a6ca-53d4eab8006a" containerName="dnsmasq-dns"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.419314 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="136d29c0-9fdd-4ae0-a6ca-53d4eab8006a" containerName="dnsmasq-dns"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.419477 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="136d29c0-9fdd-4ae0-a6ca-53d4eab8006a" containerName="dnsmasq-dns"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.420022 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-lzdlk"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.435361 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-87r8w"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.435551 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.439971 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-c27bb"]
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.441072 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-c27bb"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.446598 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d68a9e4e-b453-459f-b397-9c6d7c221dda-etc-machine-id\") pod \"cinder-db-sync-xqjmc\" (UID: \"d68a9e4e-b453-459f-b397-9c6d7c221dda\") " pod="openstack/cinder-db-sync-xqjmc"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.446645 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q7g4\" (UniqueName: \"kubernetes.io/projected/d68a9e4e-b453-459f-b397-9c6d7c221dda-kube-api-access-5q7g4\") pod \"cinder-db-sync-xqjmc\" (UID: \"d68a9e4e-b453-459f-b397-9c6d7c221dda\") " pod="openstack/cinder-db-sync-xqjmc"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.446687 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d68a9e4e-b453-459f-b397-9c6d7c221dda-scripts\") pod \"cinder-db-sync-xqjmc\" (UID: \"d68a9e4e-b453-459f-b397-9c6d7c221dda\") " pod="openstack/cinder-db-sync-xqjmc"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.446735 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d68a9e4e-b453-459f-b397-9c6d7c221dda-db-sync-config-data\") pod \"cinder-db-sync-xqjmc\" (UID: \"d68a9e4e-b453-459f-b397-9c6d7c221dda\") " pod="openstack/cinder-db-sync-xqjmc"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.446763 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-259wr\" (UniqueName: \"kubernetes.io/projected/ca377ade-e972-41f6-add9-a0b491d86bbf-kube-api-access-259wr\") pod \"neutron-db-sync-jpqtq\" (UID: \"ca377ade-e972-41f6-add9-a0b491d86bbf\") " pod="openstack/neutron-db-sync-jpqtq"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.446793 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca377ade-e972-41f6-add9-a0b491d86bbf-combined-ca-bundle\") pod \"neutron-db-sync-jpqtq\" (UID: \"ca377ade-e972-41f6-add9-a0b491d86bbf\") " pod="openstack/neutron-db-sync-jpqtq"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.446840 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d68a9e4e-b453-459f-b397-9c6d7c221dda-config-data\") pod \"cinder-db-sync-xqjmc\" (UID: \"d68a9e4e-b453-459f-b397-9c6d7c221dda\") " pod="openstack/cinder-db-sync-xqjmc"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.446862 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ca377ade-e972-41f6-add9-a0b491d86bbf-config\") pod \"neutron-db-sync-jpqtq\" (UID: \"ca377ade-e972-41f6-add9-a0b491d86bbf\") " pod="openstack/neutron-db-sync-jpqtq"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.446881 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d68a9e4e-b453-459f-b397-9c6d7c221dda-combined-ca-bundle\") pod \"cinder-db-sync-xqjmc\" (UID: \"d68a9e4e-b453-459f-b397-9c6d7c221dda\") " pod="openstack/cinder-db-sync-xqjmc"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.446926 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7gdp\" (UniqueName: \"kubernetes.io/projected/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-kube-api-access-g7gdp\") on node \"crc\" DevicePath \"\""
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.453751 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.453996 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-kqfqc"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.454174 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.454951 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d68a9e4e-b453-459f-b397-9c6d7c221dda-combined-ca-bundle\") pod \"cinder-db-sync-xqjmc\" (UID: \"d68a9e4e-b453-459f-b397-9c6d7c221dda\") " pod="openstack/cinder-db-sync-xqjmc"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.457071 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c4fdd6b7-wlhg2"]
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.457741 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d68a9e4e-b453-459f-b397-9c6d7c221dda-etc-machine-id\") pod \"cinder-db-sync-xqjmc\" (UID: \"d68a9e4e-b453-459f-b397-9c6d7c221dda\") " pod="openstack/cinder-db-sync-xqjmc"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.482403 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d68a9e4e-b453-459f-b397-9c6d7c221dda-scripts\") pod \"cinder-db-sync-xqjmc\" (UID: \"d68a9e4e-b453-459f-b397-9c6d7c221dda\") " pod="openstack/cinder-db-sync-xqjmc"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.483221 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d68a9e4e-b453-459f-b397-9c6d7c221dda-config-data\") pod \"cinder-db-sync-xqjmc\" (UID: \"d68a9e4e-b453-459f-b397-9c6d7c221dda\") " pod="openstack/cinder-db-sync-xqjmc"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.488378 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d68a9e4e-b453-459f-b397-9c6d7c221dda-db-sync-config-data\") pod \"cinder-db-sync-xqjmc\" (UID: \"d68a9e4e-b453-459f-b397-9c6d7c221dda\") " pod="openstack/cinder-db-sync-xqjmc"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.489084 4909 generic.go:334] "Generic (PLEG): container finished" podID="136d29c0-9fdd-4ae0-a6ca-53d4eab8006a" containerID="651fdd21cf78d17cb0d6926d96fd7e9a37ab292f50d9d5001862e14925e7237b" exitCode=0
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.489167 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d489f5d97-4skz9" event={"ID":"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a","Type":"ContainerDied","Data":"651fdd21cf78d17cb0d6926d96fd7e9a37ab292f50d9d5001862e14925e7237b"}
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.489203 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d489f5d97-4skz9" event={"ID":"136d29c0-9fdd-4ae0-a6ca-53d4eab8006a","Type":"ContainerDied","Data":"31efa8592125fc4c83556ed581bda1a8ff17f3178b28f49e2348f4187482a303"}
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.489221 4909 scope.go:117] "RemoveContainer" containerID="651fdd21cf78d17cb0d6926d96fd7e9a37ab292f50d9d5001862e14925e7237b"
Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.489356 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d489f5d97-4skz9" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.497358 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-259wr\" (UniqueName: \"kubernetes.io/projected/ca377ade-e972-41f6-add9-a0b491d86bbf-kube-api-access-259wr\") pod \"neutron-db-sync-jpqtq\" (UID: \"ca377ade-e972-41f6-add9-a0b491d86bbf\") " pod="openstack/neutron-db-sync-jpqtq" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.500205 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ca377ade-e972-41f6-add9-a0b491d86bbf-config\") pod \"neutron-db-sync-jpqtq\" (UID: \"ca377ade-e972-41f6-add9-a0b491d86bbf\") " pod="openstack/neutron-db-sync-jpqtq" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.500501 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld" event={"ID":"02ef6e61-480d-4840-a976-0c2567811583","Type":"ContainerStarted","Data":"ebfa71ac16a75a99594dbd1d2e726b3e82524fdb6afa2fe6cadd5c31b5d41b2e"} Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.501014 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.503056 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-lzdlk"] Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.511735 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q7g4\" (UniqueName: \"kubernetes.io/projected/d68a9e4e-b453-459f-b397-9c6d7c221dda-kube-api-access-5q7g4\") pod \"cinder-db-sync-xqjmc\" (UID: \"d68a9e4e-b453-459f-b397-9c6d7c221dda\") " pod="openstack/cinder-db-sync-xqjmc" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.517221 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ca377ade-e972-41f6-add9-a0b491d86bbf-combined-ca-bundle\") pod \"neutron-db-sync-jpqtq\" (UID: \"ca377ade-e972-41f6-add9-a0b491d86bbf\") " pod="openstack/neutron-db-sync-jpqtq" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.538689 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.541029 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.547533 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.547722 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.549124 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5d450e8-b58b-423c-afae-ed534a2d65ed-config-data\") pod \"placement-db-sync-c27bb\" (UID: \"d5d450e8-b58b-423c-afae-ed534a2d65ed\") " pod="openstack/placement-db-sync-c27bb" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.549177 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5d450e8-b58b-423c-afae-ed534a2d65ed-scripts\") pod \"placement-db-sync-c27bb\" (UID: \"d5d450e8-b58b-423c-afae-ed534a2d65ed\") " pod="openstack/placement-db-sync-c27bb" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.549240 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d68a2056-e886-4135-a63a-3755df0703af-combined-ca-bundle\") pod \"barbican-db-sync-lzdlk\" (UID: \"d68a2056-e886-4135-a63a-3755df0703af\") " 
pod="openstack/barbican-db-sync-lzdlk" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.549260 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wrv5\" (UniqueName: \"kubernetes.io/projected/d68a2056-e886-4135-a63a-3755df0703af-kube-api-access-6wrv5\") pod \"barbican-db-sync-lzdlk\" (UID: \"d68a2056-e886-4135-a63a-3755df0703af\") " pod="openstack/barbican-db-sync-lzdlk" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.549292 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5d450e8-b58b-423c-afae-ed534a2d65ed-logs\") pod \"placement-db-sync-c27bb\" (UID: \"d5d450e8-b58b-423c-afae-ed534a2d65ed\") " pod="openstack/placement-db-sync-c27bb" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.549309 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d68a2056-e886-4135-a63a-3755df0703af-db-sync-config-data\") pod \"barbican-db-sync-lzdlk\" (UID: \"d68a2056-e886-4135-a63a-3755df0703af\") " pod="openstack/barbican-db-sync-lzdlk" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.549338 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5vqc\" (UniqueName: \"kubernetes.io/projected/d5d450e8-b58b-423c-afae-ed534a2d65ed-kube-api-access-r5vqc\") pod \"placement-db-sync-c27bb\" (UID: \"d5d450e8-b58b-423c-afae-ed534a2d65ed\") " pod="openstack/placement-db-sync-c27bb" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.549362 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5d450e8-b58b-423c-afae-ed534a2d65ed-combined-ca-bundle\") pod \"placement-db-sync-c27bb\" (UID: 
\"d5d450e8-b58b-423c-afae-ed534a2d65ed\") " pod="openstack/placement-db-sync-c27bb" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.549591 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-c27bb"] Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.569168 4909 scope.go:117] "RemoveContainer" containerID="8d9214d95ff9ebbcca7ac9ef6d196ba4e7574df86686e43e2e078508cb8d4808" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.571070 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "136d29c0-9fdd-4ae0-a6ca-53d4eab8006a" (UID: "136d29c0-9fdd-4ae0-a6ca-53d4eab8006a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.572702 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-config" (OuterVolumeSpecName: "config") pod "136d29c0-9fdd-4ae0-a6ca-53d4eab8006a" (UID: "136d29c0-9fdd-4ae0-a6ca-53d4eab8006a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.581139 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "136d29c0-9fdd-4ae0-a6ca-53d4eab8006a" (UID: "136d29c0-9fdd-4ae0-a6ca-53d4eab8006a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.582211 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69c85d5ff7-sgcx6"] Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.584603 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.586934 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "136d29c0-9fdd-4ae0-a6ca-53d4eab8006a" (UID: "136d29c0-9fdd-4ae0-a6ca-53d4eab8006a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.604595 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.613245 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xqjmc" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.623303 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-jpqtq" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.651282 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5d450e8-b58b-423c-afae-ed534a2d65ed-logs\") pod \"placement-db-sync-c27bb\" (UID: \"d5d450e8-b58b-423c-afae-ed534a2d65ed\") " pod="openstack/placement-db-sync-c27bb" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.651329 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d68a2056-e886-4135-a63a-3755df0703af-db-sync-config-data\") pod \"barbican-db-sync-lzdlk\" (UID: \"d68a2056-e886-4135-a63a-3755df0703af\") " pod="openstack/barbican-db-sync-lzdlk" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.651349 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5vqc\" (UniqueName: 
\"kubernetes.io/projected/d5d450e8-b58b-423c-afae-ed534a2d65ed-kube-api-access-r5vqc\") pod \"placement-db-sync-c27bb\" (UID: \"d5d450e8-b58b-423c-afae-ed534a2d65ed\") " pod="openstack/placement-db-sync-c27bb" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.651383 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4235b4-44a0-4238-9b70-ad3ea946f729-config-data\") pod \"ceilometer-0\" (UID: \"9b4235b4-44a0-4238-9b70-ad3ea946f729\") " pod="openstack/ceilometer-0" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.651404 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5d450e8-b58b-423c-afae-ed534a2d65ed-combined-ca-bundle\") pod \"placement-db-sync-c27bb\" (UID: \"d5d450e8-b58b-423c-afae-ed534a2d65ed\") " pod="openstack/placement-db-sync-c27bb" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.651431 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b4235b4-44a0-4238-9b70-ad3ea946f729-run-httpd\") pod \"ceilometer-0\" (UID: \"9b4235b4-44a0-4238-9b70-ad3ea946f729\") " pod="openstack/ceilometer-0" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.651457 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4235b4-44a0-4238-9b70-ad3ea946f729-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9b4235b4-44a0-4238-9b70-ad3ea946f729\") " pod="openstack/ceilometer-0" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.651505 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b4235b4-44a0-4238-9b70-ad3ea946f729-scripts\") pod \"ceilometer-0\" (UID: 
\"9b4235b4-44a0-4238-9b70-ad3ea946f729\") " pod="openstack/ceilometer-0" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.651527 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm6x8\" (UniqueName: \"kubernetes.io/projected/9b4235b4-44a0-4238-9b70-ad3ea946f729-kube-api-access-rm6x8\") pod \"ceilometer-0\" (UID: \"9b4235b4-44a0-4238-9b70-ad3ea946f729\") " pod="openstack/ceilometer-0" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.651548 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5d450e8-b58b-423c-afae-ed534a2d65ed-config-data\") pod \"placement-db-sync-c27bb\" (UID: \"d5d450e8-b58b-423c-afae-ed534a2d65ed\") " pod="openstack/placement-db-sync-c27bb" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.651575 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5d450e8-b58b-423c-afae-ed534a2d65ed-scripts\") pod \"placement-db-sync-c27bb\" (UID: \"d5d450e8-b58b-423c-afae-ed534a2d65ed\") " pod="openstack/placement-db-sync-c27bb" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.651630 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b4235b4-44a0-4238-9b70-ad3ea946f729-log-httpd\") pod \"ceilometer-0\" (UID: \"9b4235b4-44a0-4238-9b70-ad3ea946f729\") " pod="openstack/ceilometer-0" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.651669 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d68a2056-e886-4135-a63a-3755df0703af-combined-ca-bundle\") pod \"barbican-db-sync-lzdlk\" (UID: \"d68a2056-e886-4135-a63a-3755df0703af\") " pod="openstack/barbican-db-sync-lzdlk" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 
10:50:52.651686 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9b4235b4-44a0-4238-9b70-ad3ea946f729-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9b4235b4-44a0-4238-9b70-ad3ea946f729\") " pod="openstack/ceilometer-0" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.651704 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wrv5\" (UniqueName: \"kubernetes.io/projected/d68a2056-e886-4135-a63a-3755df0703af-kube-api-access-6wrv5\") pod \"barbican-db-sync-lzdlk\" (UID: \"d68a2056-e886-4135-a63a-3755df0703af\") " pod="openstack/barbican-db-sync-lzdlk" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.651761 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.651771 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.651779 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.651788 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.652510 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5d450e8-b58b-423c-afae-ed534a2d65ed-logs\") pod 
\"placement-db-sync-c27bb\" (UID: \"d5d450e8-b58b-423c-afae-ed534a2d65ed\") " pod="openstack/placement-db-sync-c27bb" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.657078 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69c85d5ff7-sgcx6"] Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.662018 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5d450e8-b58b-423c-afae-ed534a2d65ed-config-data\") pod \"placement-db-sync-c27bb\" (UID: \"d5d450e8-b58b-423c-afae-ed534a2d65ed\") " pod="openstack/placement-db-sync-c27bb" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.662228 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d68a2056-e886-4135-a63a-3755df0703af-db-sync-config-data\") pod \"barbican-db-sync-lzdlk\" (UID: \"d68a2056-e886-4135-a63a-3755df0703af\") " pod="openstack/barbican-db-sync-lzdlk" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.662774 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5d450e8-b58b-423c-afae-ed534a2d65ed-combined-ca-bundle\") pod \"placement-db-sync-c27bb\" (UID: \"d5d450e8-b58b-423c-afae-ed534a2d65ed\") " pod="openstack/placement-db-sync-c27bb" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.664650 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5d450e8-b58b-423c-afae-ed534a2d65ed-scripts\") pod \"placement-db-sync-c27bb\" (UID: \"d5d450e8-b58b-423c-afae-ed534a2d65ed\") " pod="openstack/placement-db-sync-c27bb" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.672306 4909 scope.go:117] "RemoveContainer" containerID="651fdd21cf78d17cb0d6926d96fd7e9a37ab292f50d9d5001862e14925e7237b" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.672930 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d68a2056-e886-4135-a63a-3755df0703af-combined-ca-bundle\") pod \"barbican-db-sync-lzdlk\" (UID: \"d68a2056-e886-4135-a63a-3755df0703af\") " pod="openstack/barbican-db-sync-lzdlk" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.673731 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld" podStartSLOduration=3.672978644 podStartE2EDuration="3.672978644s" podCreationTimestamp="2026-02-02 10:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:50:52.540265486 +0000 UTC m=+1178.286366221" watchObservedRunningTime="2026-02-02 10:50:52.672978644 +0000 UTC m=+1178.419079379" Feb 02 10:50:52 crc kubenswrapper[4909]: E0202 10:50:52.676033 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"651fdd21cf78d17cb0d6926d96fd7e9a37ab292f50d9d5001862e14925e7237b\": container with ID starting with 651fdd21cf78d17cb0d6926d96fd7e9a37ab292f50d9d5001862e14925e7237b not found: ID does not exist" containerID="651fdd21cf78d17cb0d6926d96fd7e9a37ab292f50d9d5001862e14925e7237b" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.676089 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"651fdd21cf78d17cb0d6926d96fd7e9a37ab292f50d9d5001862e14925e7237b"} err="failed to get container status \"651fdd21cf78d17cb0d6926d96fd7e9a37ab292f50d9d5001862e14925e7237b\": rpc error: code = NotFound desc = could not find container \"651fdd21cf78d17cb0d6926d96fd7e9a37ab292f50d9d5001862e14925e7237b\": container with ID starting with 651fdd21cf78d17cb0d6926d96fd7e9a37ab292f50d9d5001862e14925e7237b not found: ID does not exist" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.676122 
4909 scope.go:117] "RemoveContainer" containerID="8d9214d95ff9ebbcca7ac9ef6d196ba4e7574df86686e43e2e078508cb8d4808" Feb 02 10:50:52 crc kubenswrapper[4909]: E0202 10:50:52.680027 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d9214d95ff9ebbcca7ac9ef6d196ba4e7574df86686e43e2e078508cb8d4808\": container with ID starting with 8d9214d95ff9ebbcca7ac9ef6d196ba4e7574df86686e43e2e078508cb8d4808 not found: ID does not exist" containerID="8d9214d95ff9ebbcca7ac9ef6d196ba4e7574df86686e43e2e078508cb8d4808" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.680081 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d9214d95ff9ebbcca7ac9ef6d196ba4e7574df86686e43e2e078508cb8d4808"} err="failed to get container status \"8d9214d95ff9ebbcca7ac9ef6d196ba4e7574df86686e43e2e078508cb8d4808\": rpc error: code = NotFound desc = could not find container \"8d9214d95ff9ebbcca7ac9ef6d196ba4e7574df86686e43e2e078508cb8d4808\": container with ID starting with 8d9214d95ff9ebbcca7ac9ef6d196ba4e7574df86686e43e2e078508cb8d4808 not found: ID does not exist" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.682456 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5vqc\" (UniqueName: \"kubernetes.io/projected/d5d450e8-b58b-423c-afae-ed534a2d65ed-kube-api-access-r5vqc\") pod \"placement-db-sync-c27bb\" (UID: \"d5d450e8-b58b-423c-afae-ed534a2d65ed\") " pod="openstack/placement-db-sync-c27bb" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.693071 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wrv5\" (UniqueName: \"kubernetes.io/projected/d68a2056-e886-4135-a63a-3755df0703af-kube-api-access-6wrv5\") pod \"barbican-db-sync-lzdlk\" (UID: \"d68a2056-e886-4135-a63a-3755df0703af\") " pod="openstack/barbican-db-sync-lzdlk" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 
10:50:52.752917 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b4235b4-44a0-4238-9b70-ad3ea946f729-scripts\") pod \"ceilometer-0\" (UID: \"9b4235b4-44a0-4238-9b70-ad3ea946f729\") " pod="openstack/ceilometer-0" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.752971 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm6x8\" (UniqueName: \"kubernetes.io/projected/9b4235b4-44a0-4238-9b70-ad3ea946f729-kube-api-access-rm6x8\") pod \"ceilometer-0\" (UID: \"9b4235b4-44a0-4238-9b70-ad3ea946f729\") " pod="openstack/ceilometer-0" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.753000 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-config\") pod \"dnsmasq-dns-69c85d5ff7-sgcx6\" (UID: \"19d20c70-a055-4ecf-b593-95697717de45\") " pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.753061 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-dns-svc\") pod \"dnsmasq-dns-69c85d5ff7-sgcx6\" (UID: \"19d20c70-a055-4ecf-b593-95697717de45\") " pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.753105 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-ovsdbserver-nb\") pod \"dnsmasq-dns-69c85d5ff7-sgcx6\" (UID: \"19d20c70-a055-4ecf-b593-95697717de45\") " pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.753134 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b4235b4-44a0-4238-9b70-ad3ea946f729-log-httpd\") pod \"ceilometer-0\" (UID: \"9b4235b4-44a0-4238-9b70-ad3ea946f729\") " pod="openstack/ceilometer-0" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.753167 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-ovsdbserver-sb\") pod \"dnsmasq-dns-69c85d5ff7-sgcx6\" (UID: \"19d20c70-a055-4ecf-b593-95697717de45\") " pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.753202 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9b4235b4-44a0-4238-9b70-ad3ea946f729-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9b4235b4-44a0-4238-9b70-ad3ea946f729\") " pod="openstack/ceilometer-0" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.753250 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4235b4-44a0-4238-9b70-ad3ea946f729-config-data\") pod \"ceilometer-0\" (UID: \"9b4235b4-44a0-4238-9b70-ad3ea946f729\") " pod="openstack/ceilometer-0" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.753278 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b4235b4-44a0-4238-9b70-ad3ea946f729-run-httpd\") pod \"ceilometer-0\" (UID: \"9b4235b4-44a0-4238-9b70-ad3ea946f729\") " pod="openstack/ceilometer-0" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.753315 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4235b4-44a0-4238-9b70-ad3ea946f729-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"9b4235b4-44a0-4238-9b70-ad3ea946f729\") " pod="openstack/ceilometer-0" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.753344 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-dns-swift-storage-0\") pod \"dnsmasq-dns-69c85d5ff7-sgcx6\" (UID: \"19d20c70-a055-4ecf-b593-95697717de45\") " pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.753367 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trj2m\" (UniqueName: \"kubernetes.io/projected/19d20c70-a055-4ecf-b593-95697717de45-kube-api-access-trj2m\") pod \"dnsmasq-dns-69c85d5ff7-sgcx6\" (UID: \"19d20c70-a055-4ecf-b593-95697717de45\") " pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.755089 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b4235b4-44a0-4238-9b70-ad3ea946f729-log-httpd\") pod \"ceilometer-0\" (UID: \"9b4235b4-44a0-4238-9b70-ad3ea946f729\") " pod="openstack/ceilometer-0" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.755436 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b4235b4-44a0-4238-9b70-ad3ea946f729-run-httpd\") pod \"ceilometer-0\" (UID: \"9b4235b4-44a0-4238-9b70-ad3ea946f729\") " pod="openstack/ceilometer-0" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.765592 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9b4235b4-44a0-4238-9b70-ad3ea946f729-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9b4235b4-44a0-4238-9b70-ad3ea946f729\") " pod="openstack/ceilometer-0" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 
10:50:52.767794 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4235b4-44a0-4238-9b70-ad3ea946f729-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9b4235b4-44a0-4238-9b70-ad3ea946f729\") " pod="openstack/ceilometer-0" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.780251 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4235b4-44a0-4238-9b70-ad3ea946f729-config-data\") pod \"ceilometer-0\" (UID: \"9b4235b4-44a0-4238-9b70-ad3ea946f729\") " pod="openstack/ceilometer-0" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.781097 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b4235b4-44a0-4238-9b70-ad3ea946f729-scripts\") pod \"ceilometer-0\" (UID: \"9b4235b4-44a0-4238-9b70-ad3ea946f729\") " pod="openstack/ceilometer-0" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.788076 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm6x8\" (UniqueName: \"kubernetes.io/projected/9b4235b4-44a0-4238-9b70-ad3ea946f729-kube-api-access-rm6x8\") pod \"ceilometer-0\" (UID: \"9b4235b4-44a0-4238-9b70-ad3ea946f729\") " pod="openstack/ceilometer-0" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.804452 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-lzdlk" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.843832 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d489f5d97-4skz9"] Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.844444 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-c27bb" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.854935 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-dns-swift-storage-0\") pod \"dnsmasq-dns-69c85d5ff7-sgcx6\" (UID: \"19d20c70-a055-4ecf-b593-95697717de45\") " pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.854983 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trj2m\" (UniqueName: \"kubernetes.io/projected/19d20c70-a055-4ecf-b593-95697717de45-kube-api-access-trj2m\") pod \"dnsmasq-dns-69c85d5ff7-sgcx6\" (UID: \"19d20c70-a055-4ecf-b593-95697717de45\") " pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.855021 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-config\") pod \"dnsmasq-dns-69c85d5ff7-sgcx6\" (UID: \"19d20c70-a055-4ecf-b593-95697717de45\") " pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.855064 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-dns-svc\") pod \"dnsmasq-dns-69c85d5ff7-sgcx6\" (UID: \"19d20c70-a055-4ecf-b593-95697717de45\") " pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.855099 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-ovsdbserver-nb\") pod \"dnsmasq-dns-69c85d5ff7-sgcx6\" (UID: \"19d20c70-a055-4ecf-b593-95697717de45\") " 
pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.855124 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-ovsdbserver-sb\") pod \"dnsmasq-dns-69c85d5ff7-sgcx6\" (UID: \"19d20c70-a055-4ecf-b593-95697717de45\") " pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.855897 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-ovsdbserver-sb\") pod \"dnsmasq-dns-69c85d5ff7-sgcx6\" (UID: \"19d20c70-a055-4ecf-b593-95697717de45\") " pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.856393 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-dns-swift-storage-0\") pod \"dnsmasq-dns-69c85d5ff7-sgcx6\" (UID: \"19d20c70-a055-4ecf-b593-95697717de45\") " pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.857122 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-config\") pod \"dnsmasq-dns-69c85d5ff7-sgcx6\" (UID: \"19d20c70-a055-4ecf-b593-95697717de45\") " pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.857663 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-dns-svc\") pod \"dnsmasq-dns-69c85d5ff7-sgcx6\" (UID: \"19d20c70-a055-4ecf-b593-95697717de45\") " pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 
10:50:52.860042 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d489f5d97-4skz9"] Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.875976 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-ovsdbserver-nb\") pod \"dnsmasq-dns-69c85d5ff7-sgcx6\" (UID: \"19d20c70-a055-4ecf-b593-95697717de45\") " pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.881982 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trj2m\" (UniqueName: \"kubernetes.io/projected/19d20c70-a055-4ecf-b593-95697717de45-kube-api-access-trj2m\") pod \"dnsmasq-dns-69c85d5ff7-sgcx6\" (UID: \"19d20c70-a055-4ecf-b593-95697717de45\") " pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.894417 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.915362 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" Feb 02 10:50:52 crc kubenswrapper[4909]: W0202 10:50:52.923122 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53774ce4_0f24_484b_afdf_f6023d8498c0.slice/crio-0b83a53f74f32548cba3670182f2aebdcf8c078bb96c3a20cb1a1d554bedb160 WatchSource:0}: Error finding container 0b83a53f74f32548cba3670182f2aebdcf8c078bb96c3a20cb1a1d554bedb160: Status 404 returned error can't find the container with id 0b83a53f74f32548cba3670182f2aebdcf8c078bb96c3a20cb1a1d554bedb160 Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.926403 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-x8ptv"] Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.943957 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.945771 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.947744 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.948320 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.948445 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.948554 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-c2scs" Feb 02 10:50:52 crc kubenswrapper[4909]: I0202 10:50:52.971349 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.062561 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.062666 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.062689 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.062960 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.063009 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-logs\") pod \"glance-default-external-api-0\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.063037 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.063060 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-scripts\") pod \"glance-default-external-api-0\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.063089 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f6m8\" (UniqueName: 
\"kubernetes.io/projected/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-kube-api-access-8f6m8\") pod \"glance-default-external-api-0\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.089210 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="136d29c0-9fdd-4ae0-a6ca-53d4eab8006a" path="/var/lib/kubelet/pods/136d29c0-9fdd-4ae0-a6ca-53d4eab8006a/volumes" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.090543 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c4fdd6b7-wlhg2"] Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.112947 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.114326 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.117189 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.117369 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.152442 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.170569 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.171551 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-config-data\") pod \"glance-default-external-api-0\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.171381 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.171609 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.171641 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-logs\") pod \"glance-default-external-api-0\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.171664 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.171687 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-scripts\") pod \"glance-default-external-api-0\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.171709 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f6m8\" (UniqueName: \"kubernetes.io/projected/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-kube-api-access-8f6m8\") pod \"glance-default-external-api-0\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.171788 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.174570 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-logs\") pod \"glance-default-external-api-0\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.175031 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.179857 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.180924 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-scripts\") pod \"glance-default-external-api-0\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.185723 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.194400 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-config-data\") pod \"glance-default-external-api-0\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.195457 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f6m8\" (UniqueName: \"kubernetes.io/projected/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-kube-api-access-8f6m8\") pod \"glance-default-external-api-0\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") " pod="openstack/glance-default-external-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.215000 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") " 
pod="openstack/glance-default-external-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.276974 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a09a0ccc-3e9a-4f40-8237-8bf342976a24-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.277024 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a09a0ccc-3e9a-4f40-8237-8bf342976a24-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.277059 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a09a0ccc-3e9a-4f40-8237-8bf342976a24-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.277120 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a09a0ccc-3e9a-4f40-8237-8bf342976a24-logs\") pod \"glance-default-internal-api-0\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.277140 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8w54\" (UniqueName: \"kubernetes.io/projected/a09a0ccc-3e9a-4f40-8237-8bf342976a24-kube-api-access-n8w54\") pod \"glance-default-internal-api-0\" (UID: 
\"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.277199 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a09a0ccc-3e9a-4f40-8237-8bf342976a24-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.277231 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a09a0ccc-3e9a-4f40-8237-8bf342976a24-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.277295 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.283341 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.298465 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-jpqtq"] Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.361480 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-xqjmc"] Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.423618 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a09a0ccc-3e9a-4f40-8237-8bf342976a24-logs\") pod \"glance-default-internal-api-0\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.423668 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8w54\" (UniqueName: \"kubernetes.io/projected/a09a0ccc-3e9a-4f40-8237-8bf342976a24-kube-api-access-n8w54\") pod \"glance-default-internal-api-0\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.423817 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a09a0ccc-3e9a-4f40-8237-8bf342976a24-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.423907 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a09a0ccc-3e9a-4f40-8237-8bf342976a24-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 
10:50:53.423990 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.424321 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a09a0ccc-3e9a-4f40-8237-8bf342976a24-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.424353 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a09a0ccc-3e9a-4f40-8237-8bf342976a24-logs\") pod \"glance-default-internal-api-0\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.424374 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a09a0ccc-3e9a-4f40-8237-8bf342976a24-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.436717 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a09a0ccc-3e9a-4f40-8237-8bf342976a24-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.437106 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.438739 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a09a0ccc-3e9a-4f40-8237-8bf342976a24-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.438737 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a09a0ccc-3e9a-4f40-8237-8bf342976a24-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.442416 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a09a0ccc-3e9a-4f40-8237-8bf342976a24-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.454382 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a09a0ccc-3e9a-4f40-8237-8bf342976a24-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.455405 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a09a0ccc-3e9a-4f40-8237-8bf342976a24-config-data\") 
pod \"glance-default-internal-api-0\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.459763 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8w54\" (UniqueName: \"kubernetes.io/projected/a09a0ccc-3e9a-4f40-8237-8bf342976a24-kube-api-access-n8w54\") pod \"glance-default-internal-api-0\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.529236 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.540229 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x8ptv" event={"ID":"53774ce4-0f24-484b-afdf-f6023d8498c0","Type":"ContainerStarted","Data":"0b83a53f74f32548cba3670182f2aebdcf8c078bb96c3a20cb1a1d554bedb160"} Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.541431 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xqjmc" event={"ID":"d68a9e4e-b453-459f-b397-9c6d7c221dda","Type":"ContainerStarted","Data":"43d7baa7d79263b0fb9547aab2fd421f5505c60aaf5047133ad8dcac3587eff1"} Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.542338 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c4fdd6b7-wlhg2" event={"ID":"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917","Type":"ContainerStarted","Data":"8bedcd5232206c8095d0d87e74eaed865a84f4422d57886f89f8bb95fa530964"} Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.543436 4909 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld" podUID="02ef6e61-480d-4840-a976-0c2567811583" containerName="dnsmasq-dns" containerID="cri-o://ebfa71ac16a75a99594dbd1d2e726b3e82524fdb6afa2fe6cadd5c31b5d41b2e" gracePeriod=10 Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.543707 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jpqtq" event={"ID":"ca377ade-e972-41f6-add9-a0b491d86bbf","Type":"ContainerStarted","Data":"3b0b93c077764d3486ddcdcd6ce07399ac39d664ab3dc519057e20c0184f8ee6"} Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.735172 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-lzdlk"] Feb 02 10:50:53 crc kubenswrapper[4909]: W0202 10:50:53.781586 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd68a2056_e886_4135_a63a_3755df0703af.slice/crio-63c903f7836c76f2f2b2ad18e9317934d24d5901bb8d7e9ddf4e72f6a3ec84dd WatchSource:0}: Error finding container 63c903f7836c76f2f2b2ad18e9317934d24d5901bb8d7e9ddf4e72f6a3ec84dd: Status 404 returned error can't find the container with id 63c903f7836c76f2f2b2ad18e9317934d24d5901bb8d7e9ddf4e72f6a3ec84dd Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.796449 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.877201 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69c85d5ff7-sgcx6"] Feb 02 10:50:53 crc kubenswrapper[4909]: W0202 10:50:53.902643 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b4235b4_44a0_4238_9b70_ad3ea946f729.slice/crio-25aa449b8475eafbb0131dd8a6cdef0c5099c9f3bf1580e96d895b093a05f36e WatchSource:0}: Error finding container 25aa449b8475eafbb0131dd8a6cdef0c5099c9f3bf1580e96d895b093a05f36e: Status 404 returned error can't find the container with id 25aa449b8475eafbb0131dd8a6cdef0c5099c9f3bf1580e96d895b093a05f36e Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.902808 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:50:53 crc kubenswrapper[4909]: I0202 10:50:53.947243 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-c27bb"] Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.263312 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.466988 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld" Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.539510 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.563688 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-dns-svc\") pod \"02ef6e61-480d-4840-a976-0c2567811583\" (UID: \"02ef6e61-480d-4840-a976-0c2567811583\") " Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.563767 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bm77\" (UniqueName: \"kubernetes.io/projected/02ef6e61-480d-4840-a976-0c2567811583-kube-api-access-8bm77\") pod \"02ef6e61-480d-4840-a976-0c2567811583\" (UID: \"02ef6e61-480d-4840-a976-0c2567811583\") " Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.563803 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-ovsdbserver-nb\") pod \"02ef6e61-480d-4840-a976-0c2567811583\" (UID: \"02ef6e61-480d-4840-a976-0c2567811583\") " Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.563924 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-config\") pod \"02ef6e61-480d-4840-a976-0c2567811583\" (UID: \"02ef6e61-480d-4840-a976-0c2567811583\") " Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.563995 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-dns-swift-storage-0\") pod \"02ef6e61-480d-4840-a976-0c2567811583\" (UID: \"02ef6e61-480d-4840-a976-0c2567811583\") " 
Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.564014 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-ovsdbserver-sb\") pod \"02ef6e61-480d-4840-a976-0c2567811583\" (UID: \"02ef6e61-480d-4840-a976-0c2567811583\") " Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.574310 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02ef6e61-480d-4840-a976-0c2567811583-kube-api-access-8bm77" (OuterVolumeSpecName: "kube-api-access-8bm77") pod "02ef6e61-480d-4840-a976-0c2567811583" (UID: "02ef6e61-480d-4840-a976-0c2567811583"). InnerVolumeSpecName "kube-api-access-8bm77". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.609084 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.637121 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x8ptv" event={"ID":"53774ce4-0f24-484b-afdf-f6023d8498c0","Type":"ContainerStarted","Data":"9605881602e8e9802371144790cb626f296ca0ea0fffdab156c40c6988d7021c"} Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.683174 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bm77\" (UniqueName: \"kubernetes.io/projected/02ef6e61-480d-4840-a976-0c2567811583-kube-api-access-8bm77\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.684792 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.685074 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lzdlk" 
event={"ID":"d68a2056-e886-4135-a63a-3755df0703af","Type":"ContainerStarted","Data":"63c903f7836c76f2f2b2ad18e9317934d24d5901bb8d7e9ddf4e72f6a3ec84dd"} Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.690978 4909 generic.go:334] "Generic (PLEG): container finished" podID="74fe0bf2-a4bf-43fa-807a-1bafdc4e2917" containerID="56d22107bd21b8fd396ed25e5915b210b746513b51b9050ec6d0f81d639c8fcf" exitCode=0 Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.691210 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c4fdd6b7-wlhg2" event={"ID":"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917","Type":"ContainerDied","Data":"56d22107bd21b8fd396ed25e5915b210b746513b51b9050ec6d0f81d639c8fcf"} Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.696596 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6d9df75c-c412-4c4c-a3fb-54de31dfe30a","Type":"ContainerStarted","Data":"011625849d41a6d3345a8da8a3b2a0ddbab269d7bda07656de0f40f8bd00a2bd"} Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.705969 4909 generic.go:334] "Generic (PLEG): container finished" podID="19d20c70-a055-4ecf-b593-95697717de45" containerID="2d1e6a3040ccc31833bc3c562714f9ba5294b8c70b6272aa234f6df8b8d99c0e" exitCode=0 Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.706063 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" event={"ID":"19d20c70-a055-4ecf-b593-95697717de45","Type":"ContainerDied","Data":"2d1e6a3040ccc31833bc3c562714f9ba5294b8c70b6272aa234f6df8b8d99c0e"} Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.706090 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" event={"ID":"19d20c70-a055-4ecf-b593-95697717de45","Type":"ContainerStarted","Data":"7c3f59a930d809d2a9aa9b11e44559181d0a3f6b11ddd97283ccf32cb4b64799"} Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.717256 4909 
generic.go:334] "Generic (PLEG): container finished" podID="02ef6e61-480d-4840-a976-0c2567811583" containerID="ebfa71ac16a75a99594dbd1d2e726b3e82524fdb6afa2fe6cadd5c31b5d41b2e" exitCode=0 Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.717341 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld" event={"ID":"02ef6e61-480d-4840-a976-0c2567811583","Type":"ContainerDied","Data":"ebfa71ac16a75a99594dbd1d2e726b3e82524fdb6afa2fe6cadd5c31b5d41b2e"} Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.717377 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld" event={"ID":"02ef6e61-480d-4840-a976-0c2567811583","Type":"ContainerDied","Data":"c18026af8f07fc3e52c75914998c0e403304ee5b9ebdfc3bd1e9f57b6807eccd"} Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.717395 4909 scope.go:117] "RemoveContainer" containerID="ebfa71ac16a75a99594dbd1d2e726b3e82524fdb6afa2fe6cadd5c31b5d41b2e" Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.717851 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-96fb4d4c9-c4gld" Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.737153 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b4235b4-44a0-4238-9b70-ad3ea946f729","Type":"ContainerStarted","Data":"25aa449b8475eafbb0131dd8a6cdef0c5099c9f3bf1580e96d895b093a05f36e"} Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.745337 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "02ef6e61-480d-4840-a976-0c2567811583" (UID: "02ef6e61-480d-4840-a976-0c2567811583"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.749665 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "02ef6e61-480d-4840-a976-0c2567811583" (UID: "02ef6e61-480d-4840-a976-0c2567811583"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.773520 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-c27bb" event={"ID":"d5d450e8-b58b-423c-afae-ed534a2d65ed","Type":"ContainerStarted","Data":"aeb7f4815af35af7824c2d5ec4b427b66fb43ab01eabcbc5b5b383989e34478e"} Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.773993 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "02ef6e61-480d-4840-a976-0c2567811583" (UID: "02ef6e61-480d-4840-a976-0c2567811583"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.781121 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jpqtq" event={"ID":"ca377ade-e972-41f6-add9-a0b491d86bbf","Type":"ContainerStarted","Data":"311a7b10742f33d16e743d03813d2f4a91673f1da23226395322a7b04d1bbfb8"} Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.786301 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.787958 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.787988 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.787999 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.816070 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-config" (OuterVolumeSpecName: "config") pod "02ef6e61-480d-4840-a976-0c2567811583" (UID: "02ef6e61-480d-4840-a976-0c2567811583"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.816293 4909 scope.go:117] "RemoveContainer" containerID="507b111c07a330c7f94282dafbeed8b419e129d1ebca22307f2fc29b39d034cc" Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.817114 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-x8ptv" podStartSLOduration=3.817089665 podStartE2EDuration="3.817089665s" podCreationTimestamp="2026-02-02 10:50:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:50:54.667551479 +0000 UTC m=+1180.413652214" watchObservedRunningTime="2026-02-02 10:50:54.817089665 +0000 UTC m=+1180.563190400" Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.847095 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "02ef6e61-480d-4840-a976-0c2567811583" (UID: "02ef6e61-480d-4840-a976-0c2567811583"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.875088 4909 scope.go:117] "RemoveContainer" containerID="ebfa71ac16a75a99594dbd1d2e726b3e82524fdb6afa2fe6cadd5c31b5d41b2e" Feb 02 10:50:54 crc kubenswrapper[4909]: E0202 10:50:54.880958 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebfa71ac16a75a99594dbd1d2e726b3e82524fdb6afa2fe6cadd5c31b5d41b2e\": container with ID starting with ebfa71ac16a75a99594dbd1d2e726b3e82524fdb6afa2fe6cadd5c31b5d41b2e not found: ID does not exist" containerID="ebfa71ac16a75a99594dbd1d2e726b3e82524fdb6afa2fe6cadd5c31b5d41b2e" Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.880999 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebfa71ac16a75a99594dbd1d2e726b3e82524fdb6afa2fe6cadd5c31b5d41b2e"} err="failed to get container status \"ebfa71ac16a75a99594dbd1d2e726b3e82524fdb6afa2fe6cadd5c31b5d41b2e\": rpc error: code = NotFound desc = could not find container \"ebfa71ac16a75a99594dbd1d2e726b3e82524fdb6afa2fe6cadd5c31b5d41b2e\": container with ID starting with ebfa71ac16a75a99594dbd1d2e726b3e82524fdb6afa2fe6cadd5c31b5d41b2e not found: ID does not exist" Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.881023 4909 scope.go:117] "RemoveContainer" containerID="507b111c07a330c7f94282dafbeed8b419e129d1ebca22307f2fc29b39d034cc" Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.890139 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.890180 4909 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02ef6e61-480d-4840-a976-0c2567811583-dns-swift-storage-0\") on node \"crc\" DevicePath 
\"\"" Feb 02 10:50:54 crc kubenswrapper[4909]: E0202 10:50:54.895220 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"507b111c07a330c7f94282dafbeed8b419e129d1ebca22307f2fc29b39d034cc\": container with ID starting with 507b111c07a330c7f94282dafbeed8b419e129d1ebca22307f2fc29b39d034cc not found: ID does not exist" containerID="507b111c07a330c7f94282dafbeed8b419e129d1ebca22307f2fc29b39d034cc" Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.895281 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"507b111c07a330c7f94282dafbeed8b419e129d1ebca22307f2fc29b39d034cc"} err="failed to get container status \"507b111c07a330c7f94282dafbeed8b419e129d1ebca22307f2fc29b39d034cc\": rpc error: code = NotFound desc = could not find container \"507b111c07a330c7f94282dafbeed8b419e129d1ebca22307f2fc29b39d034cc\": container with ID starting with 507b111c07a330c7f94282dafbeed8b419e129d1ebca22307f2fc29b39d034cc not found: ID does not exist" Feb 02 10:50:54 crc kubenswrapper[4909]: I0202 10:50:54.907670 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-jpqtq" podStartSLOduration=2.907646906 podStartE2EDuration="2.907646906s" podCreationTimestamp="2026-02-02 10:50:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:50:54.816448277 +0000 UTC m=+1180.562549012" watchObservedRunningTime="2026-02-02 10:50:54.907646906 +0000 UTC m=+1180.653747641" Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.213302 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-96fb4d4c9-c4gld"] Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.220774 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-96fb4d4c9-c4gld"] Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 
10:50:55.323464 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c4fdd6b7-wlhg2" Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.511236 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-config\") pod \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\" (UID: \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\") " Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.511275 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-dns-svc\") pod \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\" (UID: \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\") " Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.511344 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-dns-swift-storage-0\") pod \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\" (UID: \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\") " Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.511375 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-ovsdbserver-nb\") pod \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\" (UID: \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\") " Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.511529 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-ovsdbserver-sb\") pod \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\" (UID: \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\") " Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.511569 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-87c4m\" (UniqueName: \"kubernetes.io/projected/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-kube-api-access-87c4m\") pod \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\" (UID: \"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917\") " Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.517836 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-kube-api-access-87c4m" (OuterVolumeSpecName: "kube-api-access-87c4m") pod "74fe0bf2-a4bf-43fa-807a-1bafdc4e2917" (UID: "74fe0bf2-a4bf-43fa-807a-1bafdc4e2917"). InnerVolumeSpecName "kube-api-access-87c4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.557431 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "74fe0bf2-a4bf-43fa-807a-1bafdc4e2917" (UID: "74fe0bf2-a4bf-43fa-807a-1bafdc4e2917"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.562400 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "74fe0bf2-a4bf-43fa-807a-1bafdc4e2917" (UID: "74fe0bf2-a4bf-43fa-807a-1bafdc4e2917"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.563351 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "74fe0bf2-a4bf-43fa-807a-1bafdc4e2917" (UID: "74fe0bf2-a4bf-43fa-807a-1bafdc4e2917"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.564696 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "74fe0bf2-a4bf-43fa-807a-1bafdc4e2917" (UID: "74fe0bf2-a4bf-43fa-807a-1bafdc4e2917"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.587697 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-config" (OuterVolumeSpecName: "config") pod "74fe0bf2-a4bf-43fa-807a-1bafdc4e2917" (UID: "74fe0bf2-a4bf-43fa-807a-1bafdc4e2917"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.613328 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.613360 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87c4m\" (UniqueName: \"kubernetes.io/projected/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-kube-api-access-87c4m\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.613376 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.613386 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 
10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.613397 4909 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.613407 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.798859 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c4fdd6b7-wlhg2" event={"ID":"74fe0bf2-a4bf-43fa-807a-1bafdc4e2917","Type":"ContainerDied","Data":"8bedcd5232206c8095d0d87e74eaed865a84f4422d57886f89f8bb95fa530964"} Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.798883 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c4fdd6b7-wlhg2" Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.798909 4909 scope.go:117] "RemoveContainer" containerID="56d22107bd21b8fd396ed25e5915b210b746513b51b9050ec6d0f81d639c8fcf" Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.802224 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6d9df75c-c412-4c4c-a3fb-54de31dfe30a","Type":"ContainerStarted","Data":"62f9f7bbeccb13a0393942be064823929d9c96cd168c1d37e1d90e123c0a3e2f"} Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.811022 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a09a0ccc-3e9a-4f40-8237-8bf342976a24","Type":"ContainerStarted","Data":"651d0d37ea02430af395e9060d8e2c3ea89b194ee9b0688d253c803bf11fa13a"} Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.818295 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" event={"ID":"19d20c70-a055-4ecf-b593-95697717de45","Type":"ContainerStarted","Data":"d27bc9973e507de4ff445be7075821941e661aaf7dacfe8ec4a1ac24d6405616"} Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.818337 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.845053 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" podStartSLOduration=3.845034687 podStartE2EDuration="3.845034687s" podCreationTimestamp="2026-02-02 10:50:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:50:55.837279647 +0000 UTC m=+1181.583380372" watchObservedRunningTime="2026-02-02 10:50:55.845034687 +0000 UTC m=+1181.591135422" Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.891679 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c4fdd6b7-wlhg2"] Feb 02 10:50:55 crc kubenswrapper[4909]: I0202 10:50:55.901987 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c4fdd6b7-wlhg2"] Feb 02 10:50:56 crc kubenswrapper[4909]: I0202 10:50:56.829637 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a09a0ccc-3e9a-4f40-8237-8bf342976a24","Type":"ContainerStarted","Data":"fd34d4698ba922a7dc3df1469905bb3e5d1c8860d6ef18784d597822849b01bb"} Feb 02 10:50:57 crc kubenswrapper[4909]: I0202 10:50:57.031778 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02ef6e61-480d-4840-a976-0c2567811583" path="/var/lib/kubelet/pods/02ef6e61-480d-4840-a976-0c2567811583/volumes" Feb 02 10:50:57 crc kubenswrapper[4909]: I0202 10:50:57.032639 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="74fe0bf2-a4bf-43fa-807a-1bafdc4e2917" path="/var/lib/kubelet/pods/74fe0bf2-a4bf-43fa-807a-1bafdc4e2917/volumes" Feb 02 10:50:57 crc kubenswrapper[4909]: I0202 10:50:57.841183 4909 generic.go:334] "Generic (PLEG): container finished" podID="53774ce4-0f24-484b-afdf-f6023d8498c0" containerID="9605881602e8e9802371144790cb626f296ca0ea0fffdab156c40c6988d7021c" exitCode=0 Feb 02 10:50:57 crc kubenswrapper[4909]: I0202 10:50:57.841230 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x8ptv" event={"ID":"53774ce4-0f24-484b-afdf-f6023d8498c0","Type":"ContainerDied","Data":"9605881602e8e9802371144790cb626f296ca0ea0fffdab156c40c6988d7021c"} Feb 02 10:50:57 crc kubenswrapper[4909]: I0202 10:50:57.845200 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6d9df75c-c412-4c4c-a3fb-54de31dfe30a","Type":"ContainerStarted","Data":"c3519ce82eae4829da84fa84986ced64c1aaa40d9bf7d1b5e97f8552a52ddc4f"} Feb 02 10:50:57 crc kubenswrapper[4909]: I0202 10:50:57.845363 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6d9df75c-c412-4c4c-a3fb-54de31dfe30a" containerName="glance-log" containerID="cri-o://62f9f7bbeccb13a0393942be064823929d9c96cd168c1d37e1d90e123c0a3e2f" gracePeriod=30 Feb 02 10:50:57 crc kubenswrapper[4909]: I0202 10:50:57.845396 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6d9df75c-c412-4c4c-a3fb-54de31dfe30a" containerName="glance-httpd" containerID="cri-o://c3519ce82eae4829da84fa84986ced64c1aaa40d9bf7d1b5e97f8552a52ddc4f" gracePeriod=30 Feb 02 10:50:57 crc kubenswrapper[4909]: I0202 10:50:57.848672 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"a09a0ccc-3e9a-4f40-8237-8bf342976a24","Type":"ContainerStarted","Data":"11b58ad41c03eb53e383b77ac44737f5152eae3ed415ee5b59c574d411f4e674"} Feb 02 10:50:57 crc kubenswrapper[4909]: I0202 10:50:57.848834 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a09a0ccc-3e9a-4f40-8237-8bf342976a24" containerName="glance-log" containerID="cri-o://fd34d4698ba922a7dc3df1469905bb3e5d1c8860d6ef18784d597822849b01bb" gracePeriod=30 Feb 02 10:50:57 crc kubenswrapper[4909]: I0202 10:50:57.848913 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a09a0ccc-3e9a-4f40-8237-8bf342976a24" containerName="glance-httpd" containerID="cri-o://11b58ad41c03eb53e383b77ac44737f5152eae3ed415ee5b59c574d411f4e674" gracePeriod=30 Feb 02 10:50:57 crc kubenswrapper[4909]: I0202 10:50:57.898748 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.89872535 podStartE2EDuration="6.89872535s" podCreationTimestamp="2026-02-02 10:50:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:50:57.881482281 +0000 UTC m=+1183.627583016" watchObservedRunningTime="2026-02-02 10:50:57.89872535 +0000 UTC m=+1183.644826085" Feb 02 10:50:58 crc kubenswrapper[4909]: I0202 10:50:58.880591 4909 generic.go:334] "Generic (PLEG): container finished" podID="a09a0ccc-3e9a-4f40-8237-8bf342976a24" containerID="11b58ad41c03eb53e383b77ac44737f5152eae3ed415ee5b59c574d411f4e674" exitCode=0 Feb 02 10:50:58 crc kubenswrapper[4909]: I0202 10:50:58.880990 4909 generic.go:334] "Generic (PLEG): container finished" podID="a09a0ccc-3e9a-4f40-8237-8bf342976a24" containerID="fd34d4698ba922a7dc3df1469905bb3e5d1c8860d6ef18784d597822849b01bb" exitCode=143 Feb 02 10:50:58 crc kubenswrapper[4909]: 
I0202 10:50:58.880677 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a09a0ccc-3e9a-4f40-8237-8bf342976a24","Type":"ContainerDied","Data":"11b58ad41c03eb53e383b77ac44737f5152eae3ed415ee5b59c574d411f4e674"}
Feb 02 10:50:58 crc kubenswrapper[4909]: I0202 10:50:58.881085 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a09a0ccc-3e9a-4f40-8237-8bf342976a24","Type":"ContainerDied","Data":"fd34d4698ba922a7dc3df1469905bb3e5d1c8860d6ef18784d597822849b01bb"}
Feb 02 10:50:58 crc kubenswrapper[4909]: I0202 10:50:58.884220 4909 generic.go:334] "Generic (PLEG): container finished" podID="6d9df75c-c412-4c4c-a3fb-54de31dfe30a" containerID="c3519ce82eae4829da84fa84986ced64c1aaa40d9bf7d1b5e97f8552a52ddc4f" exitCode=0
Feb 02 10:50:58 crc kubenswrapper[4909]: I0202 10:50:58.884245 4909 generic.go:334] "Generic (PLEG): container finished" podID="6d9df75c-c412-4c4c-a3fb-54de31dfe30a" containerID="62f9f7bbeccb13a0393942be064823929d9c96cd168c1d37e1d90e123c0a3e2f" exitCode=143
Feb 02 10:50:58 crc kubenswrapper[4909]: I0202 10:50:58.884542 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6d9df75c-c412-4c4c-a3fb-54de31dfe30a","Type":"ContainerDied","Data":"c3519ce82eae4829da84fa84986ced64c1aaa40d9bf7d1b5e97f8552a52ddc4f"}
Feb 02 10:50:58 crc kubenswrapper[4909]: I0202 10:50:58.884569 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6d9df75c-c412-4c4c-a3fb-54de31dfe30a","Type":"ContainerDied","Data":"62f9f7bbeccb13a0393942be064823929d9c96cd168c1d37e1d90e123c0a3e2f"}
Feb 02 10:51:01 crc kubenswrapper[4909]: I0202 10:51:01.573857 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x8ptv"
Feb 02 10:51:01 crc kubenswrapper[4909]: I0202 10:51:01.601544 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.60152499 podStartE2EDuration="9.60152499s" podCreationTimestamp="2026-02-02 10:50:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:50:57.912751618 +0000 UTC m=+1183.658852373" watchObservedRunningTime="2026-02-02 10:51:01.60152499 +0000 UTC m=+1187.347625725"
Feb 02 10:51:01 crc kubenswrapper[4909]: I0202 10:51:01.732675 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-credential-keys\") pod \"53774ce4-0f24-484b-afdf-f6023d8498c0\" (UID: \"53774ce4-0f24-484b-afdf-f6023d8498c0\") "
Feb 02 10:51:01 crc kubenswrapper[4909]: I0202 10:51:01.732751 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-config-data\") pod \"53774ce4-0f24-484b-afdf-f6023d8498c0\" (UID: \"53774ce4-0f24-484b-afdf-f6023d8498c0\") "
Feb 02 10:51:01 crc kubenswrapper[4909]: I0202 10:51:01.732829 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-combined-ca-bundle\") pod \"53774ce4-0f24-484b-afdf-f6023d8498c0\" (UID: \"53774ce4-0f24-484b-afdf-f6023d8498c0\") "
Feb 02 10:51:01 crc kubenswrapper[4909]: I0202 10:51:01.732881 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5w5z\" (UniqueName: \"kubernetes.io/projected/53774ce4-0f24-484b-afdf-f6023d8498c0-kube-api-access-g5w5z\") pod \"53774ce4-0f24-484b-afdf-f6023d8498c0\" (UID: \"53774ce4-0f24-484b-afdf-f6023d8498c0\") "
Feb 02 10:51:01 crc kubenswrapper[4909]: I0202 10:51:01.732903 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-fernet-keys\") pod \"53774ce4-0f24-484b-afdf-f6023d8498c0\" (UID: \"53774ce4-0f24-484b-afdf-f6023d8498c0\") "
Feb 02 10:51:01 crc kubenswrapper[4909]: I0202 10:51:01.733713 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-scripts\") pod \"53774ce4-0f24-484b-afdf-f6023d8498c0\" (UID: \"53774ce4-0f24-484b-afdf-f6023d8498c0\") "
Feb 02 10:51:01 crc kubenswrapper[4909]: I0202 10:51:01.749338 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53774ce4-0f24-484b-afdf-f6023d8498c0-kube-api-access-g5w5z" (OuterVolumeSpecName: "kube-api-access-g5w5z") pod "53774ce4-0f24-484b-afdf-f6023d8498c0" (UID: "53774ce4-0f24-484b-afdf-f6023d8498c0"). InnerVolumeSpecName "kube-api-access-g5w5z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:51:01 crc kubenswrapper[4909]: I0202 10:51:01.750047 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "53774ce4-0f24-484b-afdf-f6023d8498c0" (UID: "53774ce4-0f24-484b-afdf-f6023d8498c0"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:51:01 crc kubenswrapper[4909]: I0202 10:51:01.750920 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-scripts" (OuterVolumeSpecName: "scripts") pod "53774ce4-0f24-484b-afdf-f6023d8498c0" (UID: "53774ce4-0f24-484b-afdf-f6023d8498c0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:51:01 crc kubenswrapper[4909]: I0202 10:51:01.750965 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "53774ce4-0f24-484b-afdf-f6023d8498c0" (UID: "53774ce4-0f24-484b-afdf-f6023d8498c0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:51:01 crc kubenswrapper[4909]: I0202 10:51:01.775712 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-config-data" (OuterVolumeSpecName: "config-data") pod "53774ce4-0f24-484b-afdf-f6023d8498c0" (UID: "53774ce4-0f24-484b-afdf-f6023d8498c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:51:01 crc kubenswrapper[4909]: I0202 10:51:01.785787 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53774ce4-0f24-484b-afdf-f6023d8498c0" (UID: "53774ce4-0f24-484b-afdf-f6023d8498c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:51:01 crc kubenswrapper[4909]: I0202 10:51:01.835924 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 10:51:01 crc kubenswrapper[4909]: I0202 10:51:01.835958 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 10:51:01 crc kubenswrapper[4909]: I0202 10:51:01.835970 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5w5z\" (UniqueName: \"kubernetes.io/projected/53774ce4-0f24-484b-afdf-f6023d8498c0-kube-api-access-g5w5z\") on node \"crc\" DevicePath \"\""
Feb 02 10:51:01 crc kubenswrapper[4909]: I0202 10:51:01.835981 4909 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 02 10:51:01 crc kubenswrapper[4909]: I0202 10:51:01.835991 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 10:51:01 crc kubenswrapper[4909]: I0202 10:51:01.836000 4909 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/53774ce4-0f24-484b-afdf-f6023d8498c0-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 02 10:51:01 crc kubenswrapper[4909]: I0202 10:51:01.913359 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-x8ptv" event={"ID":"53774ce4-0f24-484b-afdf-f6023d8498c0","Type":"ContainerDied","Data":"0b83a53f74f32548cba3670182f2aebdcf8c078bb96c3a20cb1a1d554bedb160"}
Feb 02 10:51:01 crc kubenswrapper[4909]: I0202 10:51:01.913901 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b83a53f74f32548cba3670182f2aebdcf8c078bb96c3a20cb1a1d554bedb160"
Feb 02 10:51:01 crc kubenswrapper[4909]: I0202 10:51:01.913473 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-x8ptv"
Feb 02 10:51:02 crc kubenswrapper[4909]: I0202 10:51:02.666353 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-x8ptv"]
Feb 02 10:51:02 crc kubenswrapper[4909]: I0202 10:51:02.672915 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-x8ptv"]
Feb 02 10:51:02 crc kubenswrapper[4909]: I0202 10:51:02.754444 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-8sv84"]
Feb 02 10:51:02 crc kubenswrapper[4909]: E0202 10:51:02.754774 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74fe0bf2-a4bf-43fa-807a-1bafdc4e2917" containerName="init"
Feb 02 10:51:02 crc kubenswrapper[4909]: I0202 10:51:02.754787 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="74fe0bf2-a4bf-43fa-807a-1bafdc4e2917" containerName="init"
Feb 02 10:51:02 crc kubenswrapper[4909]: E0202 10:51:02.754799 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02ef6e61-480d-4840-a976-0c2567811583" containerName="init"
Feb 02 10:51:02 crc kubenswrapper[4909]: I0202 10:51:02.755166 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="02ef6e61-480d-4840-a976-0c2567811583" containerName="init"
Feb 02 10:51:02 crc kubenswrapper[4909]: E0202 10:51:02.755208 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53774ce4-0f24-484b-afdf-f6023d8498c0" containerName="keystone-bootstrap"
Feb 02 10:51:02 crc kubenswrapper[4909]: I0202 10:51:02.755215 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="53774ce4-0f24-484b-afdf-f6023d8498c0" containerName="keystone-bootstrap"
Feb 02 10:51:02 crc kubenswrapper[4909]: E0202 10:51:02.755226 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02ef6e61-480d-4840-a976-0c2567811583" containerName="dnsmasq-dns"
Feb 02 10:51:02 crc kubenswrapper[4909]: I0202 10:51:02.755232 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="02ef6e61-480d-4840-a976-0c2567811583" containerName="dnsmasq-dns"
Feb 02 10:51:02 crc kubenswrapper[4909]: I0202 10:51:02.755377 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="53774ce4-0f24-484b-afdf-f6023d8498c0" containerName="keystone-bootstrap"
Feb 02 10:51:02 crc kubenswrapper[4909]: I0202 10:51:02.755392 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="02ef6e61-480d-4840-a976-0c2567811583" containerName="dnsmasq-dns"
Feb 02 10:51:02 crc kubenswrapper[4909]: I0202 10:51:02.755403 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="74fe0bf2-a4bf-43fa-807a-1bafdc4e2917" containerName="init"
Feb 02 10:51:02 crc kubenswrapper[4909]: I0202 10:51:02.755935 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8sv84"
Feb 02 10:51:02 crc kubenswrapper[4909]: I0202 10:51:02.758648 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Feb 02 10:51:02 crc kubenswrapper[4909]: I0202 10:51:02.759119 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 02 10:51:02 crc kubenswrapper[4909]: I0202 10:51:02.759210 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 02 10:51:02 crc kubenswrapper[4909]: I0202 10:51:02.759821 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-l4g7z"
Feb 02 10:51:02 crc kubenswrapper[4909]: I0202 10:51:02.763749 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 02 10:51:02 crc kubenswrapper[4909]: I0202 10:51:02.783963 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8sv84"]
Feb 02 10:51:02 crc kubenswrapper[4909]: I0202 10:51:02.916997 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6"
Feb 02 10:51:02 crc kubenswrapper[4909]: I0202 10:51:02.957006 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-config-data\") pod \"keystone-bootstrap-8sv84\" (UID: \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\") " pod="openstack/keystone-bootstrap-8sv84"
Feb 02 10:51:02 crc kubenswrapper[4909]: I0202 10:51:02.957074 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-credential-keys\") pod \"keystone-bootstrap-8sv84\" (UID: \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\") " pod="openstack/keystone-bootstrap-8sv84"
Feb 02 10:51:02 crc kubenswrapper[4909]: I0202 10:51:02.957125 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-combined-ca-bundle\") pod \"keystone-bootstrap-8sv84\" (UID: \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\") " pod="openstack/keystone-bootstrap-8sv84"
Feb 02 10:51:02 crc kubenswrapper[4909]: I0202 10:51:02.957163 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-scripts\") pod \"keystone-bootstrap-8sv84\" (UID: \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\") " pod="openstack/keystone-bootstrap-8sv84"
Feb 02 10:51:02 crc kubenswrapper[4909]: I0202 10:51:02.957254 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn5hs\" (UniqueName: \"kubernetes.io/projected/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-kube-api-access-vn5hs\") pod \"keystone-bootstrap-8sv84\" (UID: \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\") " pod="openstack/keystone-bootstrap-8sv84"
Feb 02 10:51:02 crc kubenswrapper[4909]: I0202 10:51:02.957327 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-fernet-keys\") pod \"keystone-bootstrap-8sv84\" (UID: \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\") " pod="openstack/keystone-bootstrap-8sv84"
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.030395 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53774ce4-0f24-484b-afdf-f6023d8498c0" path="/var/lib/kubelet/pods/53774ce4-0f24-484b-afdf-f6023d8498c0/volumes"
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.032002 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66b577f8c-wx74v"]
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.032334 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-66b577f8c-wx74v" podUID="04683aef-1e09-400c-a20d-29d191926c20" containerName="dnsmasq-dns" containerID="cri-o://e93409660b5a9d436b70952c8f04f7832ca2759777d66ca326bb6d3fc52d1471" gracePeriod=10
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.063206 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-fernet-keys\") pod \"keystone-bootstrap-8sv84\" (UID: \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\") " pod="openstack/keystone-bootstrap-8sv84"
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.063419 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-config-data\") pod \"keystone-bootstrap-8sv84\" (UID: \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\") " pod="openstack/keystone-bootstrap-8sv84"
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.063528 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-credential-keys\") pod \"keystone-bootstrap-8sv84\" (UID: \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\") " pod="openstack/keystone-bootstrap-8sv84"
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.063670 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-combined-ca-bundle\") pod \"keystone-bootstrap-8sv84\" (UID: \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\") " pod="openstack/keystone-bootstrap-8sv84"
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.063787 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-scripts\") pod \"keystone-bootstrap-8sv84\" (UID: \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\") " pod="openstack/keystone-bootstrap-8sv84"
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.063951 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn5hs\" (UniqueName: \"kubernetes.io/projected/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-kube-api-access-vn5hs\") pod \"keystone-bootstrap-8sv84\" (UID: \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\") " pod="openstack/keystone-bootstrap-8sv84"
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.076079 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-scripts\") pod \"keystone-bootstrap-8sv84\" (UID: \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\") " pod="openstack/keystone-bootstrap-8sv84"
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.080784 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-credential-keys\") pod \"keystone-bootstrap-8sv84\" (UID: \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\") " pod="openstack/keystone-bootstrap-8sv84"
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.081119 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-fernet-keys\") pod \"keystone-bootstrap-8sv84\" (UID: \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\") " pod="openstack/keystone-bootstrap-8sv84"
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.082942 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-config-data\") pod \"keystone-bootstrap-8sv84\" (UID: \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\") " pod="openstack/keystone-bootstrap-8sv84"
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.083064 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-combined-ca-bundle\") pod \"keystone-bootstrap-8sv84\" (UID: \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\") " pod="openstack/keystone-bootstrap-8sv84"
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.086698 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn5hs\" (UniqueName: \"kubernetes.io/projected/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-kube-api-access-vn5hs\") pod \"keystone-bootstrap-8sv84\" (UID: \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\") " pod="openstack/keystone-bootstrap-8sv84"
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.376265 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8sv84"
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.535890 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.672733 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") "
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.672861 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-scripts\") pod \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") "
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.672962 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-httpd-run\") pod \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") "
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.672992 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-public-tls-certs\") pod \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") "
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.673017 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-logs\") pod \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") "
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.674125 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-logs" (OuterVolumeSpecName: "logs") pod "6d9df75c-c412-4c4c-a3fb-54de31dfe30a" (UID: "6d9df75c-c412-4c4c-a3fb-54de31dfe30a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.674014 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6d9df75c-c412-4c4c-a3fb-54de31dfe30a" (UID: "6d9df75c-c412-4c4c-a3fb-54de31dfe30a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.673052 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f6m8\" (UniqueName: \"kubernetes.io/projected/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-kube-api-access-8f6m8\") pod \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") "
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.675371 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-config-data\") pod \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") "
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.675393 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-combined-ca-bundle\") pod \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\" (UID: \"6d9df75c-c412-4c4c-a3fb-54de31dfe30a\") "
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.676098 4909 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.676114 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-logs\") on node \"crc\" DevicePath \"\""
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.677941 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-scripts" (OuterVolumeSpecName: "scripts") pod "6d9df75c-c412-4c4c-a3fb-54de31dfe30a" (UID: "6d9df75c-c412-4c4c-a3fb-54de31dfe30a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.678914 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "6d9df75c-c412-4c4c-a3fb-54de31dfe30a" (UID: "6d9df75c-c412-4c4c-a3fb-54de31dfe30a"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.691285 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-kube-api-access-8f6m8" (OuterVolumeSpecName: "kube-api-access-8f6m8") pod "6d9df75c-c412-4c4c-a3fb-54de31dfe30a" (UID: "6d9df75c-c412-4c4c-a3fb-54de31dfe30a"). InnerVolumeSpecName "kube-api-access-8f6m8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.701884 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d9df75c-c412-4c4c-a3fb-54de31dfe30a" (UID: "6d9df75c-c412-4c4c-a3fb-54de31dfe30a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.736961 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-config-data" (OuterVolumeSpecName: "config-data") pod "6d9df75c-c412-4c4c-a3fb-54de31dfe30a" (UID: "6d9df75c-c412-4c4c-a3fb-54de31dfe30a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.736995 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6d9df75c-c412-4c4c-a3fb-54de31dfe30a" (UID: "6d9df75c-c412-4c4c-a3fb-54de31dfe30a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.778232 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f6m8\" (UniqueName: \"kubernetes.io/projected/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-kube-api-access-8f6m8\") on node \"crc\" DevicePath \"\""
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.778267 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.778277 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.778313 4909 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" "
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.778324 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.778336 4909 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d9df75c-c412-4c4c-a3fb-54de31dfe30a-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.803435 4909 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.880368 4909 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.946929 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6d9df75c-c412-4c4c-a3fb-54de31dfe30a","Type":"ContainerDied","Data":"011625849d41a6d3345a8da8a3b2a0ddbab269d7bda07656de0f40f8bd00a2bd"}
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.946979 4909 scope.go:117] "RemoveContainer" containerID="c3519ce82eae4829da84fa84986ced64c1aaa40d9bf7d1b5e97f8552a52ddc4f"
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.947007 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.954881 4909 generic.go:334] "Generic (PLEG): container finished" podID="04683aef-1e09-400c-a20d-29d191926c20" containerID="e93409660b5a9d436b70952c8f04f7832ca2759777d66ca326bb6d3fc52d1471" exitCode=0
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.954931 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b577f8c-wx74v" event={"ID":"04683aef-1e09-400c-a20d-29d191926c20","Type":"ContainerDied","Data":"e93409660b5a9d436b70952c8f04f7832ca2759777d66ca326bb6d3fc52d1471"}
Feb 02 10:51:03 crc kubenswrapper[4909]: I0202 10:51:03.988169 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.010729 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.062300 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 02 10:51:04 crc kubenswrapper[4909]: E0202 10:51:04.062949 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d9df75c-c412-4c4c-a3fb-54de31dfe30a" containerName="glance-log"
Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.062975 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d9df75c-c412-4c4c-a3fb-54de31dfe30a" containerName="glance-log"
Feb 02 10:51:04 crc kubenswrapper[4909]: E0202 10:51:04.063003 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d9df75c-c412-4c4c-a3fb-54de31dfe30a" containerName="glance-httpd"
Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.063013 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d9df75c-c412-4c4c-a3fb-54de31dfe30a" containerName="glance-httpd"
Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.063229 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d9df75c-c412-4c4c-a3fb-54de31dfe30a" containerName="glance-httpd"
Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.063265 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d9df75c-c412-4c4c-a3fb-54de31dfe30a" containerName="glance-log"
Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.064650 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.067263 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.067395 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.068413 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.187565 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb15af0b-4954-4dba-b189-c193408924f3-scripts\") pod \"glance-default-external-api-0\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.187608 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb15af0b-4954-4dba-b189-c193408924f3-config-data\") pod \"glance-default-external-api-0\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.187634 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.187685 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb15af0b-4954-4dba-b189-c193408924f3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.187706 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb15af0b-4954-4dba-b189-c193408924f3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.187744 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpqmq\" (UniqueName: \"kubernetes.io/projected/fb15af0b-4954-4dba-b189-c193408924f3-kube-api-access-tpqmq\") pod \"glance-default-external-api-0\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.187782 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb15af0b-4954-4dba-b189-c193408924f3-logs\") pod \"glance-default-external-api-0\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.187845 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb15af0b-4954-4dba-b189-c193408924f3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.290273 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb15af0b-4954-4dba-b189-c193408924f3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.290348 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb15af0b-4954-4dba-b189-c193408924f3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.290434 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpqmq\" (UniqueName: \"kubernetes.io/projected/fb15af0b-4954-4dba-b189-c193408924f3-kube-api-access-tpqmq\") pod \"glance-default-external-api-0\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.290509 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb15af0b-4954-4dba-b189-c193408924f3-logs\") pod \"glance-default-external-api-0\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.290590 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb15af0b-4954-4dba-b189-c193408924f3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.290666 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb15af0b-4954-4dba-b189-c193408924f3-scripts\") pod \"glance-default-external-api-0\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.290725 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb15af0b-4954-4dba-b189-c193408924f3-config-data\") pod \"glance-default-external-api-0\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.290782 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " pod="openstack/glance-default-external-api-0"
Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.291196 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0"
Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.292031 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb15af0b-4954-4dba-b189-c193408924f3-logs\") pod
\"glance-default-external-api-0\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " pod="openstack/glance-default-external-api-0" Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.294552 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb15af0b-4954-4dba-b189-c193408924f3-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " pod="openstack/glance-default-external-api-0" Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.299376 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb15af0b-4954-4dba-b189-c193408924f3-scripts\") pod \"glance-default-external-api-0\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " pod="openstack/glance-default-external-api-0" Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.300973 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb15af0b-4954-4dba-b189-c193408924f3-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " pod="openstack/glance-default-external-api-0" Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.302823 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb15af0b-4954-4dba-b189-c193408924f3-config-data\") pod \"glance-default-external-api-0\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " pod="openstack/glance-default-external-api-0" Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.306003 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb15af0b-4954-4dba-b189-c193408924f3-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " 
pod="openstack/glance-default-external-api-0" Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.310430 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpqmq\" (UniqueName: \"kubernetes.io/projected/fb15af0b-4954-4dba-b189-c193408924f3-kube-api-access-tpqmq\") pod \"glance-default-external-api-0\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " pod="openstack/glance-default-external-api-0" Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.318431 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " pod="openstack/glance-default-external-api-0" Feb 02 10:51:04 crc kubenswrapper[4909]: I0202 10:51:04.439859 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 10:51:05 crc kubenswrapper[4909]: I0202 10:51:05.035691 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d9df75c-c412-4c4c-a3fb-54de31dfe30a" path="/var/lib/kubelet/pods/6d9df75c-c412-4c4c-a3fb-54de31dfe30a/volumes" Feb 02 10:51:07 crc kubenswrapper[4909]: I0202 10:51:07.731763 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-66b577f8c-wx74v" podUID="04683aef-1e09-400c-a20d-29d191926c20" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Feb 02 10:51:11 crc kubenswrapper[4909]: E0202 10:51:11.804037 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:5ac8ede62671a3b3695cf29bd3a6f124f27c93d1730f9030cc3daa05034d4af4" Feb 02 10:51:11 crc kubenswrapper[4909]: E0202 10:51:11.804428 4909 kuberuntime_manager.go:1274] 
"Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:5ac8ede62671a3b3695cf29bd3a6f124f27c93d1730f9030cc3daa05034d4af4,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n77h5d9h5cch5chdh97h64h59dh646h646h59fh589h9bh55h68h5b6h5bbh5b8h5ch5f5h584h585h5c8h685h584h9fh54h5b9h568h548h676h5b7q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rm6x8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(9b4235b4-44a0-4238-9b70-ad3ea946f729): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 10:51:11 crc kubenswrapper[4909]: I0202 10:51:11.890099 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.033949 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a09a0ccc-3e9a-4f40-8237-8bf342976a24","Type":"ContainerDied","Data":"651d0d37ea02430af395e9060d8e2c3ea89b194ee9b0688d253c803bf11fa13a"} Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.034053 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.036661 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a09a0ccc-3e9a-4f40-8237-8bf342976a24-combined-ca-bundle\") pod \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.036762 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a09a0ccc-3e9a-4f40-8237-8bf342976a24-config-data\") pod \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.036802 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a09a0ccc-3e9a-4f40-8237-8bf342976a24-scripts\") pod \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.036877 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.036920 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a09a0ccc-3e9a-4f40-8237-8bf342976a24-internal-tls-certs\") pod \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.036968 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/a09a0ccc-3e9a-4f40-8237-8bf342976a24-httpd-run\") pod \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.037036 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a09a0ccc-3e9a-4f40-8237-8bf342976a24-logs\") pod \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.037111 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8w54\" (UniqueName: \"kubernetes.io/projected/a09a0ccc-3e9a-4f40-8237-8bf342976a24-kube-api-access-n8w54\") pod \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\" (UID: \"a09a0ccc-3e9a-4f40-8237-8bf342976a24\") " Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.038158 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a09a0ccc-3e9a-4f40-8237-8bf342976a24-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a09a0ccc-3e9a-4f40-8237-8bf342976a24" (UID: "a09a0ccc-3e9a-4f40-8237-8bf342976a24"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.038443 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a09a0ccc-3e9a-4f40-8237-8bf342976a24-logs" (OuterVolumeSpecName: "logs") pod "a09a0ccc-3e9a-4f40-8237-8bf342976a24" (UID: "a09a0ccc-3e9a-4f40-8237-8bf342976a24"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.042920 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a09a0ccc-3e9a-4f40-8237-8bf342976a24-scripts" (OuterVolumeSpecName: "scripts") pod "a09a0ccc-3e9a-4f40-8237-8bf342976a24" (UID: "a09a0ccc-3e9a-4f40-8237-8bf342976a24"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.055621 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a09a0ccc-3e9a-4f40-8237-8bf342976a24-kube-api-access-n8w54" (OuterVolumeSpecName: "kube-api-access-n8w54") pod "a09a0ccc-3e9a-4f40-8237-8bf342976a24" (UID: "a09a0ccc-3e9a-4f40-8237-8bf342976a24"). InnerVolumeSpecName "kube-api-access-n8w54". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.062208 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a09a0ccc-3e9a-4f40-8237-8bf342976a24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a09a0ccc-3e9a-4f40-8237-8bf342976a24" (UID: "a09a0ccc-3e9a-4f40-8237-8bf342976a24"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.064659 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "a09a0ccc-3e9a-4f40-8237-8bf342976a24" (UID: "a09a0ccc-3e9a-4f40-8237-8bf342976a24"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.088832 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a09a0ccc-3e9a-4f40-8237-8bf342976a24-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a09a0ccc-3e9a-4f40-8237-8bf342976a24" (UID: "a09a0ccc-3e9a-4f40-8237-8bf342976a24"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.094363 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a09a0ccc-3e9a-4f40-8237-8bf342976a24-config-data" (OuterVolumeSpecName: "config-data") pod "a09a0ccc-3e9a-4f40-8237-8bf342976a24" (UID: "a09a0ccc-3e9a-4f40-8237-8bf342976a24"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.138750 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a09a0ccc-3e9a-4f40-8237-8bf342976a24-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.138782 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a09a0ccc-3e9a-4f40-8237-8bf342976a24-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.138792 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a09a0ccc-3e9a-4f40-8237-8bf342976a24-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.138831 4909 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Feb 02 10:51:12 crc 
kubenswrapper[4909]: I0202 10:51:12.138842 4909 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a09a0ccc-3e9a-4f40-8237-8bf342976a24-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.138851 4909 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a09a0ccc-3e9a-4f40-8237-8bf342976a24-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.138860 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a09a0ccc-3e9a-4f40-8237-8bf342976a24-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.138868 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8w54\" (UniqueName: \"kubernetes.io/projected/a09a0ccc-3e9a-4f40-8237-8bf342976a24-kube-api-access-n8w54\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.158759 4909 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.240553 4909 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.372625 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.384427 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.414524 4909 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-default-internal-api-0"] Feb 02 10:51:12 crc kubenswrapper[4909]: E0202 10:51:12.415141 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a09a0ccc-3e9a-4f40-8237-8bf342976a24" containerName="glance-httpd" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.415165 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a09a0ccc-3e9a-4f40-8237-8bf342976a24" containerName="glance-httpd" Feb 02 10:51:12 crc kubenswrapper[4909]: E0202 10:51:12.415176 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a09a0ccc-3e9a-4f40-8237-8bf342976a24" containerName="glance-log" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.415182 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a09a0ccc-3e9a-4f40-8237-8bf342976a24" containerName="glance-log" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.415468 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a09a0ccc-3e9a-4f40-8237-8bf342976a24" containerName="glance-log" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.415499 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a09a0ccc-3e9a-4f40-8237-8bf342976a24" containerName="glance-httpd" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.416828 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.419402 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.421084 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.438744 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.547109 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.547240 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-logs\") pod \"glance-default-internal-api-0\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.547338 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.547388 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.547419 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.547478 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.547521 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jph9d\" (UniqueName: \"kubernetes.io/projected/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-kube-api-access-jph9d\") pod \"glance-default-internal-api-0\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.547551 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.648961 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-logs\") pod \"glance-default-internal-api-0\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.649037 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.649071 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.649090 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.649131 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.649605 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.649676 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-logs\") pod \"glance-default-internal-api-0\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.649864 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.650275 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jph9d\" (UniqueName: \"kubernetes.io/projected/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-kube-api-access-jph9d\") pod \"glance-default-internal-api-0\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.650646 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.650689 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.654940 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.655594 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.656244 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.658148 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.672272 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jph9d\" (UniqueName: \"kubernetes.io/projected/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-kube-api-access-jph9d\") pod \"glance-default-internal-api-0\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " 
pod="openstack/glance-default-internal-api-0" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.674297 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:51:12 crc kubenswrapper[4909]: I0202 10:51:12.740599 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 10:51:13 crc kubenswrapper[4909]: I0202 10:51:13.026139 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a09a0ccc-3e9a-4f40-8237-8bf342976a24" path="/var/lib/kubelet/pods/a09a0ccc-3e9a-4f40-8237-8bf342976a24/volumes" Feb 02 10:51:13 crc kubenswrapper[4909]: I0202 10:51:13.054158 4909 generic.go:334] "Generic (PLEG): container finished" podID="ca377ade-e972-41f6-add9-a0b491d86bbf" containerID="311a7b10742f33d16e743d03813d2f4a91673f1da23226395322a7b04d1bbfb8" exitCode=0 Feb 02 10:51:13 crc kubenswrapper[4909]: I0202 10:51:13.054218 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jpqtq" event={"ID":"ca377ade-e972-41f6-add9-a0b491d86bbf","Type":"ContainerDied","Data":"311a7b10742f33d16e743d03813d2f4a91673f1da23226395322a7b04d1bbfb8"} Feb 02 10:51:13 crc kubenswrapper[4909]: E0202 10:51:13.080527 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b" Feb 02 10:51:13 crc kubenswrapper[4909]: E0202 10:51:13.080921 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5q7g4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Ca
pabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-xqjmc_openstack(d68a9e4e-b453-459f-b397-9c6d7c221dda): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 02 10:51:13 crc kubenswrapper[4909]: E0202 10:51:13.083091 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-xqjmc" podUID="d68a9e4e-b453-459f-b397-9c6d7c221dda" Feb 02 10:51:13 crc kubenswrapper[4909]: I0202 10:51:13.086792 4909 scope.go:117] "RemoveContainer" containerID="62f9f7bbeccb13a0393942be064823929d9c96cd168c1d37e1d90e123c0a3e2f" Feb 02 10:51:13 crc kubenswrapper[4909]: I0202 10:51:13.223191 4909 scope.go:117] "RemoveContainer" containerID="11b58ad41c03eb53e383b77ac44737f5152eae3ed415ee5b59c574d411f4e674" Feb 02 10:51:13 crc kubenswrapper[4909]: I0202 10:51:13.316493 4909 scope.go:117] "RemoveContainer" containerID="fd34d4698ba922a7dc3df1469905bb3e5d1c8860d6ef18784d597822849b01bb" Feb 02 10:51:13 crc kubenswrapper[4909]: I0202 10:51:13.361736 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66b577f8c-wx74v" Feb 02 10:51:13 crc kubenswrapper[4909]: I0202 10:51:13.471685 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04683aef-1e09-400c-a20d-29d191926c20-dns-svc\") pod \"04683aef-1e09-400c-a20d-29d191926c20\" (UID: \"04683aef-1e09-400c-a20d-29d191926c20\") " Feb 02 10:51:13 crc kubenswrapper[4909]: I0202 10:51:13.472240 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04683aef-1e09-400c-a20d-29d191926c20-ovsdbserver-nb\") pod \"04683aef-1e09-400c-a20d-29d191926c20\" (UID: \"04683aef-1e09-400c-a20d-29d191926c20\") " Feb 02 10:51:13 crc kubenswrapper[4909]: I0202 10:51:13.472266 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04683aef-1e09-400c-a20d-29d191926c20-ovsdbserver-sb\") pod \"04683aef-1e09-400c-a20d-29d191926c20\" (UID: \"04683aef-1e09-400c-a20d-29d191926c20\") " Feb 02 10:51:13 crc kubenswrapper[4909]: I0202 10:51:13.472300 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04683aef-1e09-400c-a20d-29d191926c20-config\") pod \"04683aef-1e09-400c-a20d-29d191926c20\" (UID: \"04683aef-1e09-400c-a20d-29d191926c20\") " Feb 02 10:51:13 crc kubenswrapper[4909]: I0202 10:51:13.472371 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrt4p\" (UniqueName: \"kubernetes.io/projected/04683aef-1e09-400c-a20d-29d191926c20-kube-api-access-rrt4p\") pod \"04683aef-1e09-400c-a20d-29d191926c20\" (UID: \"04683aef-1e09-400c-a20d-29d191926c20\") " Feb 02 10:51:13 crc kubenswrapper[4909]: I0202 10:51:13.480589 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/04683aef-1e09-400c-a20d-29d191926c20-kube-api-access-rrt4p" (OuterVolumeSpecName: "kube-api-access-rrt4p") pod "04683aef-1e09-400c-a20d-29d191926c20" (UID: "04683aef-1e09-400c-a20d-29d191926c20"). InnerVolumeSpecName "kube-api-access-rrt4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:13 crc kubenswrapper[4909]: I0202 10:51:13.551686 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04683aef-1e09-400c-a20d-29d191926c20-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "04683aef-1e09-400c-a20d-29d191926c20" (UID: "04683aef-1e09-400c-a20d-29d191926c20"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:13 crc kubenswrapper[4909]: I0202 10:51:13.552429 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04683aef-1e09-400c-a20d-29d191926c20-config" (OuterVolumeSpecName: "config") pod "04683aef-1e09-400c-a20d-29d191926c20" (UID: "04683aef-1e09-400c-a20d-29d191926c20"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:13 crc kubenswrapper[4909]: I0202 10:51:13.560415 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04683aef-1e09-400c-a20d-29d191926c20-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "04683aef-1e09-400c-a20d-29d191926c20" (UID: "04683aef-1e09-400c-a20d-29d191926c20"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:13 crc kubenswrapper[4909]: I0202 10:51:13.564601 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04683aef-1e09-400c-a20d-29d191926c20-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "04683aef-1e09-400c-a20d-29d191926c20" (UID: "04683aef-1e09-400c-a20d-29d191926c20"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:13 crc kubenswrapper[4909]: I0202 10:51:13.574278 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04683aef-1e09-400c-a20d-29d191926c20-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:13 crc kubenswrapper[4909]: I0202 10:51:13.574320 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04683aef-1e09-400c-a20d-29d191926c20-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:13 crc kubenswrapper[4909]: I0202 10:51:13.574337 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04683aef-1e09-400c-a20d-29d191926c20-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:13 crc kubenswrapper[4909]: I0202 10:51:13.574353 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrt4p\" (UniqueName: \"kubernetes.io/projected/04683aef-1e09-400c-a20d-29d191926c20-kube-api-access-rrt4p\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:13 crc kubenswrapper[4909]: I0202 10:51:13.574366 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04683aef-1e09-400c-a20d-29d191926c20-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:13 crc kubenswrapper[4909]: I0202 10:51:13.580082 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8sv84"] Feb 02 10:51:13 crc kubenswrapper[4909]: I0202 10:51:13.779072 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:51:13 crc kubenswrapper[4909]: I0202 10:51:13.891671 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:51:13 crc kubenswrapper[4909]: W0202 10:51:13.929151 4909 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb15af0b_4954_4dba_b189_c193408924f3.slice/crio-f7ef12baf139de721318198c27daa0a8a8c9f344dd87883d50457ca4353e8ce7 WatchSource:0}: Error finding container f7ef12baf139de721318198c27daa0a8a8c9f344dd87883d50457ca4353e8ce7: Status 404 returned error can't find the container with id f7ef12baf139de721318198c27daa0a8a8c9f344dd87883d50457ca4353e8ce7 Feb 02 10:51:13 crc kubenswrapper[4909]: W0202 10:51:13.938515 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2391e0d_9f13_489b_8e71_15a8da6cfbfe.slice/crio-ec3a34f1cc494771447890260d6faef67f89105eb03322c56d11e165e74ae577 WatchSource:0}: Error finding container ec3a34f1cc494771447890260d6faef67f89105eb03322c56d11e165e74ae577: Status 404 returned error can't find the container with id ec3a34f1cc494771447890260d6faef67f89105eb03322c56d11e165e74ae577 Feb 02 10:51:14 crc kubenswrapper[4909]: I0202 10:51:14.065909 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-c27bb" event={"ID":"d5d450e8-b58b-423c-afae-ed534a2d65ed","Type":"ContainerStarted","Data":"ab7cb2bbb65c4d7634f9a5e8e479f9fc0694a5bfd80908df979d2152969dd51d"} Feb 02 10:51:14 crc kubenswrapper[4909]: I0202 10:51:14.094819 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-c27bb" podStartSLOduration=2.946837397 podStartE2EDuration="22.094757454s" podCreationTimestamp="2026-02-02 10:50:52 +0000 UTC" firstStartedPulling="2026-02-02 10:50:53.911906407 +0000 UTC m=+1179.658007142" lastFinishedPulling="2026-02-02 10:51:13.059826414 +0000 UTC m=+1198.805927199" observedRunningTime="2026-02-02 10:51:14.092940822 +0000 UTC m=+1199.839041557" watchObservedRunningTime="2026-02-02 10:51:14.094757454 +0000 UTC m=+1199.840858189" Feb 02 10:51:14 crc kubenswrapper[4909]: I0202 10:51:14.101683 4909 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-66b577f8c-wx74v" Feb 02 10:51:14 crc kubenswrapper[4909]: I0202 10:51:14.101883 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b577f8c-wx74v" event={"ID":"04683aef-1e09-400c-a20d-29d191926c20","Type":"ContainerDied","Data":"f11ee355dd066308d1141cf0b520ed327e211b3f3bc1f990a60a3b1563ae92c2"} Feb 02 10:51:14 crc kubenswrapper[4909]: I0202 10:51:14.101985 4909 scope.go:117] "RemoveContainer" containerID="e93409660b5a9d436b70952c8f04f7832ca2759777d66ca326bb6d3fc52d1471" Feb 02 10:51:14 crc kubenswrapper[4909]: I0202 10:51:14.109158 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3f4bc941-d43d-4e64-b0b6-a677ab0374f8","Type":"ContainerStarted","Data":"b86f86d1a6d89fa117c2c872986406a245dda9f89fcc9b2e998480c4365277de"} Feb 02 10:51:14 crc kubenswrapper[4909]: I0202 10:51:14.110513 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fb15af0b-4954-4dba-b189-c193408924f3","Type":"ContainerStarted","Data":"f7ef12baf139de721318198c27daa0a8a8c9f344dd87883d50457ca4353e8ce7"} Feb 02 10:51:14 crc kubenswrapper[4909]: I0202 10:51:14.112447 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8sv84" event={"ID":"d2391e0d-9f13-489b-8e71-15a8da6cfbfe","Type":"ContainerStarted","Data":"ec3a34f1cc494771447890260d6faef67f89105eb03322c56d11e165e74ae577"} Feb 02 10:51:14 crc kubenswrapper[4909]: I0202 10:51:14.118085 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lzdlk" event={"ID":"d68a2056-e886-4135-a63a-3755df0703af","Type":"ContainerStarted","Data":"6548681ec6a0ff65f9e2a7c5a769dc2781eeae61756fd4aedf64e11796338ff8"} Feb 02 10:51:14 crc kubenswrapper[4909]: E0202 10:51:14.120322 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b\\\"\"" pod="openstack/cinder-db-sync-xqjmc" podUID="d68a9e4e-b453-459f-b397-9c6d7c221dda" Feb 02 10:51:14 crc kubenswrapper[4909]: I0202 10:51:14.163931 4909 scope.go:117] "RemoveContainer" containerID="c4e6780362f55fd4568dbf235d90624d27eef7161ae83eef3eab04d7d32f586b" Feb 02 10:51:14 crc kubenswrapper[4909]: I0202 10:51:14.165714 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-lzdlk" podStartSLOduration=2.892468814 podStartE2EDuration="22.165613175s" podCreationTimestamp="2026-02-02 10:50:52 +0000 UTC" firstStartedPulling="2026-02-02 10:50:53.789914284 +0000 UTC m=+1179.536015019" lastFinishedPulling="2026-02-02 10:51:13.063058645 +0000 UTC m=+1198.809159380" observedRunningTime="2026-02-02 10:51:14.161508429 +0000 UTC m=+1199.907609174" watchObservedRunningTime="2026-02-02 10:51:14.165613175 +0000 UTC m=+1199.911713910" Feb 02 10:51:14 crc kubenswrapper[4909]: I0202 10:51:14.178876 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66b577f8c-wx74v"] Feb 02 10:51:14 crc kubenswrapper[4909]: I0202 10:51:14.186552 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66b577f8c-wx74v"] Feb 02 10:51:14 crc kubenswrapper[4909]: I0202 10:51:14.505105 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jpqtq" Feb 02 10:51:14 crc kubenswrapper[4909]: I0202 10:51:14.594318 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca377ade-e972-41f6-add9-a0b491d86bbf-combined-ca-bundle\") pod \"ca377ade-e972-41f6-add9-a0b491d86bbf\" (UID: \"ca377ade-e972-41f6-add9-a0b491d86bbf\") " Feb 02 10:51:14 crc kubenswrapper[4909]: I0202 10:51:14.594487 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-259wr\" (UniqueName: \"kubernetes.io/projected/ca377ade-e972-41f6-add9-a0b491d86bbf-kube-api-access-259wr\") pod \"ca377ade-e972-41f6-add9-a0b491d86bbf\" (UID: \"ca377ade-e972-41f6-add9-a0b491d86bbf\") " Feb 02 10:51:14 crc kubenswrapper[4909]: I0202 10:51:14.594639 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ca377ade-e972-41f6-add9-a0b491d86bbf-config\") pod \"ca377ade-e972-41f6-add9-a0b491d86bbf\" (UID: \"ca377ade-e972-41f6-add9-a0b491d86bbf\") " Feb 02 10:51:14 crc kubenswrapper[4909]: I0202 10:51:14.600098 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca377ade-e972-41f6-add9-a0b491d86bbf-kube-api-access-259wr" (OuterVolumeSpecName: "kube-api-access-259wr") pod "ca377ade-e972-41f6-add9-a0b491d86bbf" (UID: "ca377ade-e972-41f6-add9-a0b491d86bbf"). InnerVolumeSpecName "kube-api-access-259wr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:14 crc kubenswrapper[4909]: I0202 10:51:14.631098 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca377ade-e972-41f6-add9-a0b491d86bbf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca377ade-e972-41f6-add9-a0b491d86bbf" (UID: "ca377ade-e972-41f6-add9-a0b491d86bbf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:14 crc kubenswrapper[4909]: I0202 10:51:14.645147 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca377ade-e972-41f6-add9-a0b491d86bbf-config" (OuterVolumeSpecName: "config") pod "ca377ade-e972-41f6-add9-a0b491d86bbf" (UID: "ca377ade-e972-41f6-add9-a0b491d86bbf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:14 crc kubenswrapper[4909]: I0202 10:51:14.697074 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca377ade-e972-41f6-add9-a0b491d86bbf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:14 crc kubenswrapper[4909]: I0202 10:51:14.697133 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-259wr\" (UniqueName: \"kubernetes.io/projected/ca377ade-e972-41f6-add9-a0b491d86bbf-kube-api-access-259wr\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:14 crc kubenswrapper[4909]: I0202 10:51:14.697150 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ca377ade-e972-41f6-add9-a0b491d86bbf-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.045041 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04683aef-1e09-400c-a20d-29d191926c20" path="/var/lib/kubelet/pods/04683aef-1e09-400c-a20d-29d191926c20/volumes" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.138384 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8sv84" event={"ID":"d2391e0d-9f13-489b-8e71-15a8da6cfbfe","Type":"ContainerStarted","Data":"53c671e3885cedcf174cef191a0d515bd5fef3560bab4d01f99eaa50ccab077c"} Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.147767 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"9b4235b4-44a0-4238-9b70-ad3ea946f729","Type":"ContainerStarted","Data":"5cdd443f73cd51e100bceb14caae9ac75a69bda8061677bfa6c49185d8fd7f82"} Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.152174 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jpqtq" event={"ID":"ca377ade-e972-41f6-add9-a0b491d86bbf","Type":"ContainerDied","Data":"3b0b93c077764d3486ddcdcd6ce07399ac39d664ab3dc519057e20c0184f8ee6"} Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.152215 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b0b93c077764d3486ddcdcd6ce07399ac39d664ab3dc519057e20c0184f8ee6" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.152325 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-jpqtq" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.156027 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3f4bc941-d43d-4e64-b0b6-a677ab0374f8","Type":"ContainerStarted","Data":"60e7a89b43e404c571abf85b133ea6e9e4167764ad23f1e40b9ac64785177e7f"} Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.181613 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fb15af0b-4954-4dba-b189-c193408924f3","Type":"ContainerStarted","Data":"288ab4ff8642e013b75d14b82ca3c9040c1f5bffb9921cc325eee31940359f37"} Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.194017 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-8sv84" podStartSLOduration=13.19398347 podStartE2EDuration="13.19398347s" podCreationTimestamp="2026-02-02 10:51:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:51:15.162570538 +0000 UTC m=+1200.908671273" 
watchObservedRunningTime="2026-02-02 10:51:15.19398347 +0000 UTC m=+1200.940084205" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.385222 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f455b5fc7-dnppd"] Feb 02 10:51:15 crc kubenswrapper[4909]: E0202 10:51:15.385803 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca377ade-e972-41f6-add9-a0b491d86bbf" containerName="neutron-db-sync" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.391921 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca377ade-e972-41f6-add9-a0b491d86bbf" containerName="neutron-db-sync" Feb 02 10:51:15 crc kubenswrapper[4909]: E0202 10:51:15.391972 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04683aef-1e09-400c-a20d-29d191926c20" containerName="dnsmasq-dns" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.391979 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="04683aef-1e09-400c-a20d-29d191926c20" containerName="dnsmasq-dns" Feb 02 10:51:15 crc kubenswrapper[4909]: E0202 10:51:15.392010 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04683aef-1e09-400c-a20d-29d191926c20" containerName="init" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.392016 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="04683aef-1e09-400c-a20d-29d191926c20" containerName="init" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.392484 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="04683aef-1e09-400c-a20d-29d191926c20" containerName="dnsmasq-dns" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.392521 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca377ade-e972-41f6-add9-a0b491d86bbf" containerName="neutron-db-sync" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.393797 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.418699 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f455b5fc7-dnppd"] Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.512562 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5d95b455f4-xnd62"] Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.520239 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5d95b455f4-xnd62" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.525398 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.525619 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-sst8c" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.525831 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.526020 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.536721 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-ovsdbserver-nb\") pod \"dnsmasq-dns-6f455b5fc7-dnppd\" (UID: \"7b681927-4a4d-4177-97ec-eee1e9d43cb8\") " pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.536773 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-ovsdbserver-sb\") pod \"dnsmasq-dns-6f455b5fc7-dnppd\" (UID: 
\"7b681927-4a4d-4177-97ec-eee1e9d43cb8\") " pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.536830 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-dns-swift-storage-0\") pod \"dnsmasq-dns-6f455b5fc7-dnppd\" (UID: \"7b681927-4a4d-4177-97ec-eee1e9d43cb8\") " pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.536865 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-dns-svc\") pod \"dnsmasq-dns-6f455b5fc7-dnppd\" (UID: \"7b681927-4a4d-4177-97ec-eee1e9d43cb8\") " pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.537002 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-config\") pod \"dnsmasq-dns-6f455b5fc7-dnppd\" (UID: \"7b681927-4a4d-4177-97ec-eee1e9d43cb8\") " pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.537118 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htxnd\" (UniqueName: \"kubernetes.io/projected/7b681927-4a4d-4177-97ec-eee1e9d43cb8-kube-api-access-htxnd\") pod \"dnsmasq-dns-6f455b5fc7-dnppd\" (UID: \"7b681927-4a4d-4177-97ec-eee1e9d43cb8\") " pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.549089 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5d95b455f4-xnd62"] Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.638879 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-ovsdbserver-sb\") pod \"dnsmasq-dns-6f455b5fc7-dnppd\" (UID: \"7b681927-4a4d-4177-97ec-eee1e9d43cb8\") " pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.638946 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp84k\" (UniqueName: \"kubernetes.io/projected/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-kube-api-access-mp84k\") pod \"neutron-5d95b455f4-xnd62\" (UID: \"ff044864-ba16-4d8f-86bc-7677e7d4f8ad\") " pod="openstack/neutron-5d95b455f4-xnd62" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.638992 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-dns-swift-storage-0\") pod \"dnsmasq-dns-6f455b5fc7-dnppd\" (UID: \"7b681927-4a4d-4177-97ec-eee1e9d43cb8\") " pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.639034 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-dns-svc\") pod \"dnsmasq-dns-6f455b5fc7-dnppd\" (UID: \"7b681927-4a4d-4177-97ec-eee1e9d43cb8\") " pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.639092 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-config\") pod \"neutron-5d95b455f4-xnd62\" (UID: \"ff044864-ba16-4d8f-86bc-7677e7d4f8ad\") " pod="openstack/neutron-5d95b455f4-xnd62" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.639123 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-config\") pod \"dnsmasq-dns-6f455b5fc7-dnppd\" (UID: \"7b681927-4a4d-4177-97ec-eee1e9d43cb8\") " pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.639145 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-combined-ca-bundle\") pod \"neutron-5d95b455f4-xnd62\" (UID: \"ff044864-ba16-4d8f-86bc-7677e7d4f8ad\") " pod="openstack/neutron-5d95b455f4-xnd62" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.639178 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htxnd\" (UniqueName: \"kubernetes.io/projected/7b681927-4a4d-4177-97ec-eee1e9d43cb8-kube-api-access-htxnd\") pod \"dnsmasq-dns-6f455b5fc7-dnppd\" (UID: \"7b681927-4a4d-4177-97ec-eee1e9d43cb8\") " pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.639220 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-ovndb-tls-certs\") pod \"neutron-5d95b455f4-xnd62\" (UID: \"ff044864-ba16-4d8f-86bc-7677e7d4f8ad\") " pod="openstack/neutron-5d95b455f4-xnd62" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.639254 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-httpd-config\") pod \"neutron-5d95b455f4-xnd62\" (UID: \"ff044864-ba16-4d8f-86bc-7677e7d4f8ad\") " pod="openstack/neutron-5d95b455f4-xnd62" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.639302 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-ovsdbserver-nb\") pod \"dnsmasq-dns-6f455b5fc7-dnppd\" (UID: \"7b681927-4a4d-4177-97ec-eee1e9d43cb8\") " pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.640559 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-ovsdbserver-nb\") pod \"dnsmasq-dns-6f455b5fc7-dnppd\" (UID: \"7b681927-4a4d-4177-97ec-eee1e9d43cb8\") " pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.641396 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-config\") pod \"dnsmasq-dns-6f455b5fc7-dnppd\" (UID: \"7b681927-4a4d-4177-97ec-eee1e9d43cb8\") " pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.641867 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-dns-svc\") pod \"dnsmasq-dns-6f455b5fc7-dnppd\" (UID: \"7b681927-4a4d-4177-97ec-eee1e9d43cb8\") " pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.644320 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-dns-swift-storage-0\") pod \"dnsmasq-dns-6f455b5fc7-dnppd\" (UID: \"7b681927-4a4d-4177-97ec-eee1e9d43cb8\") " pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.645153 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6f455b5fc7-dnppd\" (UID: \"7b681927-4a4d-4177-97ec-eee1e9d43cb8\") " pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.665877 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htxnd\" (UniqueName: \"kubernetes.io/projected/7b681927-4a4d-4177-97ec-eee1e9d43cb8-kube-api-access-htxnd\") pod \"dnsmasq-dns-6f455b5fc7-dnppd\" (UID: \"7b681927-4a4d-4177-97ec-eee1e9d43cb8\") " pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.743536 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp84k\" (UniqueName: \"kubernetes.io/projected/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-kube-api-access-mp84k\") pod \"neutron-5d95b455f4-xnd62\" (UID: \"ff044864-ba16-4d8f-86bc-7677e7d4f8ad\") " pod="openstack/neutron-5d95b455f4-xnd62" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.743907 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-config\") pod \"neutron-5d95b455f4-xnd62\" (UID: \"ff044864-ba16-4d8f-86bc-7677e7d4f8ad\") " pod="openstack/neutron-5d95b455f4-xnd62" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.743934 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-combined-ca-bundle\") pod \"neutron-5d95b455f4-xnd62\" (UID: \"ff044864-ba16-4d8f-86bc-7677e7d4f8ad\") " pod="openstack/neutron-5d95b455f4-xnd62" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.743972 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-ovndb-tls-certs\") pod \"neutron-5d95b455f4-xnd62\" (UID: 
\"ff044864-ba16-4d8f-86bc-7677e7d4f8ad\") " pod="openstack/neutron-5d95b455f4-xnd62" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.743999 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-httpd-config\") pod \"neutron-5d95b455f4-xnd62\" (UID: \"ff044864-ba16-4d8f-86bc-7677e7d4f8ad\") " pod="openstack/neutron-5d95b455f4-xnd62" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.746915 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.754297 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-config\") pod \"neutron-5d95b455f4-xnd62\" (UID: \"ff044864-ba16-4d8f-86bc-7677e7d4f8ad\") " pod="openstack/neutron-5d95b455f4-xnd62" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.758679 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-combined-ca-bundle\") pod \"neutron-5d95b455f4-xnd62\" (UID: \"ff044864-ba16-4d8f-86bc-7677e7d4f8ad\") " pod="openstack/neutron-5d95b455f4-xnd62" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.760854 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-httpd-config\") pod \"neutron-5d95b455f4-xnd62\" (UID: \"ff044864-ba16-4d8f-86bc-7677e7d4f8ad\") " pod="openstack/neutron-5d95b455f4-xnd62" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.761532 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-ovndb-tls-certs\") pod 
\"neutron-5d95b455f4-xnd62\" (UID: \"ff044864-ba16-4d8f-86bc-7677e7d4f8ad\") " pod="openstack/neutron-5d95b455f4-xnd62" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.767478 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp84k\" (UniqueName: \"kubernetes.io/projected/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-kube-api-access-mp84k\") pod \"neutron-5d95b455f4-xnd62\" (UID: \"ff044864-ba16-4d8f-86bc-7677e7d4f8ad\") " pod="openstack/neutron-5d95b455f4-xnd62" Feb 02 10:51:15 crc kubenswrapper[4909]: I0202 10:51:15.846980 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5d95b455f4-xnd62" Feb 02 10:51:16 crc kubenswrapper[4909]: I0202 10:51:16.226569 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3f4bc941-d43d-4e64-b0b6-a677ab0374f8","Type":"ContainerStarted","Data":"8567952a983ebcc3a59a1c216958b61dc8e42144faabd6a76f371350ce4e5467"} Feb 02 10:51:16 crc kubenswrapper[4909]: I0202 10:51:16.248077 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fb15af0b-4954-4dba-b189-c193408924f3","Type":"ContainerStarted","Data":"2d2a240aae0aedcdbd22559ad9807a9f22789e0f73a8c28a1081b7787a008e44"} Feb 02 10:51:16 crc kubenswrapper[4909]: I0202 10:51:16.342331 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.342307531 podStartE2EDuration="4.342307531s" podCreationTimestamp="2026-02-02 10:51:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:51:16.276130432 +0000 UTC m=+1202.022231177" watchObservedRunningTime="2026-02-02 10:51:16.342307531 +0000 UTC m=+1202.088408266" Feb 02 10:51:16 crc kubenswrapper[4909]: I0202 10:51:16.376735 4909 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=13.376719688 podStartE2EDuration="13.376719688s" podCreationTimestamp="2026-02-02 10:51:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:51:16.316336463 +0000 UTC m=+1202.062437198" watchObservedRunningTime="2026-02-02 10:51:16.376719688 +0000 UTC m=+1202.122820423" Feb 02 10:51:16 crc kubenswrapper[4909]: I0202 10:51:16.435483 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f455b5fc7-dnppd"] Feb 02 10:51:16 crc kubenswrapper[4909]: I0202 10:51:16.894102 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5d95b455f4-xnd62"] Feb 02 10:51:16 crc kubenswrapper[4909]: W0202 10:51:16.895276 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff044864_ba16_4d8f_86bc_7677e7d4f8ad.slice/crio-822edfe230c2fb83dc9f0ef06223974e9da2cd8e9c2730ddf58421ccee590ce3 WatchSource:0}: Error finding container 822edfe230c2fb83dc9f0ef06223974e9da2cd8e9c2730ddf58421ccee590ce3: Status 404 returned error can't find the container with id 822edfe230c2fb83dc9f0ef06223974e9da2cd8e9c2730ddf58421ccee590ce3 Feb 02 10:51:17 crc kubenswrapper[4909]: I0202 10:51:17.267686 4909 generic.go:334] "Generic (PLEG): container finished" podID="d5d450e8-b58b-423c-afae-ed534a2d65ed" containerID="ab7cb2bbb65c4d7634f9a5e8e479f9fc0694a5bfd80908df979d2152969dd51d" exitCode=0 Feb 02 10:51:17 crc kubenswrapper[4909]: I0202 10:51:17.267761 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-c27bb" event={"ID":"d5d450e8-b58b-423c-afae-ed534a2d65ed","Type":"ContainerDied","Data":"ab7cb2bbb65c4d7634f9a5e8e479f9fc0694a5bfd80908df979d2152969dd51d"} Feb 02 10:51:17 crc kubenswrapper[4909]: I0202 10:51:17.270257 4909 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/neutron-5d95b455f4-xnd62" event={"ID":"ff044864-ba16-4d8f-86bc-7677e7d4f8ad","Type":"ContainerStarted","Data":"6cbde56032c9246f4b60d498066a55f7ba8a171303e707d077f2323fcbf55008"} Feb 02 10:51:17 crc kubenswrapper[4909]: I0202 10:51:17.270279 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d95b455f4-xnd62" event={"ID":"ff044864-ba16-4d8f-86bc-7677e7d4f8ad","Type":"ContainerStarted","Data":"822edfe230c2fb83dc9f0ef06223974e9da2cd8e9c2730ddf58421ccee590ce3"} Feb 02 10:51:17 crc kubenswrapper[4909]: I0202 10:51:17.273193 4909 generic.go:334] "Generic (PLEG): container finished" podID="7b681927-4a4d-4177-97ec-eee1e9d43cb8" containerID="86698faabd6bab4bda58b445b66eb68041ad49d0e26b314280cf2dee0e8599d5" exitCode=0 Feb 02 10:51:17 crc kubenswrapper[4909]: I0202 10:51:17.273493 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" event={"ID":"7b681927-4a4d-4177-97ec-eee1e9d43cb8","Type":"ContainerDied","Data":"86698faabd6bab4bda58b445b66eb68041ad49d0e26b314280cf2dee0e8599d5"} Feb 02 10:51:17 crc kubenswrapper[4909]: I0202 10:51:17.273548 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" event={"ID":"7b681927-4a4d-4177-97ec-eee1e9d43cb8","Type":"ContainerStarted","Data":"93d4c675b5569ec4ea6abd41c19ad5ae0b832bb81497c4ae6710cd1896f187e3"} Feb 02 10:51:17 crc kubenswrapper[4909]: I0202 10:51:17.732692 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-66b577f8c-wx74v" podUID="04683aef-1e09-400c-a20d-29d191926c20" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout" Feb 02 10:51:18 crc kubenswrapper[4909]: I0202 10:51:18.283890 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d95b455f4-xnd62" 
event={"ID":"ff044864-ba16-4d8f-86bc-7677e7d4f8ad","Type":"ContainerStarted","Data":"1720ade48c28093480d355eaa93a6320aa69180f5c15d2c0424eab0e255eefda"} Feb 02 10:51:18 crc kubenswrapper[4909]: I0202 10:51:18.284203 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5d95b455f4-xnd62" Feb 02 10:51:18 crc kubenswrapper[4909]: I0202 10:51:18.296260 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" event={"ID":"7b681927-4a4d-4177-97ec-eee1e9d43cb8","Type":"ContainerStarted","Data":"96841beac53340f127b385adf0a063a9b0c1245d7d50c79437e842a511862338"} Feb 02 10:51:18 crc kubenswrapper[4909]: I0202 10:51:18.297577 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" Feb 02 10:51:18 crc kubenswrapper[4909]: I0202 10:51:18.303663 4909 generic.go:334] "Generic (PLEG): container finished" podID="d2391e0d-9f13-489b-8e71-15a8da6cfbfe" containerID="53c671e3885cedcf174cef191a0d515bd5fef3560bab4d01f99eaa50ccab077c" exitCode=0 Feb 02 10:51:18 crc kubenswrapper[4909]: I0202 10:51:18.303954 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8sv84" event={"ID":"d2391e0d-9f13-489b-8e71-15a8da6cfbfe","Type":"ContainerDied","Data":"53c671e3885cedcf174cef191a0d515bd5fef3560bab4d01f99eaa50ccab077c"} Feb 02 10:51:18 crc kubenswrapper[4909]: I0202 10:51:18.307128 4909 generic.go:334] "Generic (PLEG): container finished" podID="d68a2056-e886-4135-a63a-3755df0703af" containerID="6548681ec6a0ff65f9e2a7c5a769dc2781eeae61756fd4aedf64e11796338ff8" exitCode=0 Feb 02 10:51:18 crc kubenswrapper[4909]: I0202 10:51:18.307309 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lzdlk" event={"ID":"d68a2056-e886-4135-a63a-3755df0703af","Type":"ContainerDied","Data":"6548681ec6a0ff65f9e2a7c5a769dc2781eeae61756fd4aedf64e11796338ff8"} Feb 02 10:51:18 crc kubenswrapper[4909]: I0202 
10:51:18.314855 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5d95b455f4-xnd62" podStartSLOduration=3.314835499 podStartE2EDuration="3.314835499s" podCreationTimestamp="2026-02-02 10:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:51:18.303122946 +0000 UTC m=+1204.049223681" watchObservedRunningTime="2026-02-02 10:51:18.314835499 +0000 UTC m=+1204.060936234" Feb 02 10:51:18 crc kubenswrapper[4909]: I0202 10:51:18.335329 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" podStartSLOduration=3.33531004 podStartE2EDuration="3.33531004s" podCreationTimestamp="2026-02-02 10:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:51:18.327124548 +0000 UTC m=+1204.073225293" watchObservedRunningTime="2026-02-02 10:51:18.33531004 +0000 UTC m=+1204.081410775" Feb 02 10:51:18 crc kubenswrapper[4909]: I0202 10:51:18.858357 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-744cf8b8bf-vxfhd"] Feb 02 10:51:18 crc kubenswrapper[4909]: I0202 10:51:18.863259 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-744cf8b8bf-vxfhd" Feb 02 10:51:18 crc kubenswrapper[4909]: I0202 10:51:18.868858 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 02 10:51:18 crc kubenswrapper[4909]: I0202 10:51:18.868917 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 02 10:51:18 crc kubenswrapper[4909]: I0202 10:51:18.874073 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-744cf8b8bf-vxfhd"] Feb 02 10:51:19 crc kubenswrapper[4909]: I0202 10:51:19.011259 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-ovndb-tls-certs\") pod \"neutron-744cf8b8bf-vxfhd\" (UID: \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\") " pod="openstack/neutron-744cf8b8bf-vxfhd" Feb 02 10:51:19 crc kubenswrapper[4909]: I0202 10:51:19.011322 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-public-tls-certs\") pod \"neutron-744cf8b8bf-vxfhd\" (UID: \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\") " pod="openstack/neutron-744cf8b8bf-vxfhd" Feb 02 10:51:19 crc kubenswrapper[4909]: I0202 10:51:19.011378 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-config\") pod \"neutron-744cf8b8bf-vxfhd\" (UID: \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\") " pod="openstack/neutron-744cf8b8bf-vxfhd" Feb 02 10:51:19 crc kubenswrapper[4909]: I0202 10:51:19.011396 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-combined-ca-bundle\") pod \"neutron-744cf8b8bf-vxfhd\" (UID: \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\") " pod="openstack/neutron-744cf8b8bf-vxfhd" Feb 02 10:51:19 crc kubenswrapper[4909]: I0202 10:51:19.011436 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-internal-tls-certs\") pod \"neutron-744cf8b8bf-vxfhd\" (UID: \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\") " pod="openstack/neutron-744cf8b8bf-vxfhd" Feb 02 10:51:19 crc kubenswrapper[4909]: I0202 10:51:19.011463 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-httpd-config\") pod \"neutron-744cf8b8bf-vxfhd\" (UID: \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\") " pod="openstack/neutron-744cf8b8bf-vxfhd" Feb 02 10:51:19 crc kubenswrapper[4909]: I0202 10:51:19.011487 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9r2d\" (UniqueName: \"kubernetes.io/projected/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-kube-api-access-z9r2d\") pod \"neutron-744cf8b8bf-vxfhd\" (UID: \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\") " pod="openstack/neutron-744cf8b8bf-vxfhd" Feb 02 10:51:19 crc kubenswrapper[4909]: I0202 10:51:19.112867 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-config\") pod \"neutron-744cf8b8bf-vxfhd\" (UID: \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\") " pod="openstack/neutron-744cf8b8bf-vxfhd" Feb 02 10:51:19 crc kubenswrapper[4909]: I0202 10:51:19.112926 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-combined-ca-bundle\") pod \"neutron-744cf8b8bf-vxfhd\" (UID: \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\") " pod="openstack/neutron-744cf8b8bf-vxfhd" Feb 02 10:51:19 crc kubenswrapper[4909]: I0202 10:51:19.112972 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-internal-tls-certs\") pod \"neutron-744cf8b8bf-vxfhd\" (UID: \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\") " pod="openstack/neutron-744cf8b8bf-vxfhd" Feb 02 10:51:19 crc kubenswrapper[4909]: I0202 10:51:19.113010 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-httpd-config\") pod \"neutron-744cf8b8bf-vxfhd\" (UID: \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\") " pod="openstack/neutron-744cf8b8bf-vxfhd" Feb 02 10:51:19 crc kubenswrapper[4909]: I0202 10:51:19.113037 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9r2d\" (UniqueName: \"kubernetes.io/projected/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-kube-api-access-z9r2d\") pod \"neutron-744cf8b8bf-vxfhd\" (UID: \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\") " pod="openstack/neutron-744cf8b8bf-vxfhd" Feb 02 10:51:19 crc kubenswrapper[4909]: I0202 10:51:19.113077 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-ovndb-tls-certs\") pod \"neutron-744cf8b8bf-vxfhd\" (UID: \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\") " pod="openstack/neutron-744cf8b8bf-vxfhd" Feb 02 10:51:19 crc kubenswrapper[4909]: I0202 10:51:19.113117 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-public-tls-certs\") 
pod \"neutron-744cf8b8bf-vxfhd\" (UID: \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\") " pod="openstack/neutron-744cf8b8bf-vxfhd" Feb 02 10:51:19 crc kubenswrapper[4909]: I0202 10:51:19.119259 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-combined-ca-bundle\") pod \"neutron-744cf8b8bf-vxfhd\" (UID: \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\") " pod="openstack/neutron-744cf8b8bf-vxfhd" Feb 02 10:51:19 crc kubenswrapper[4909]: I0202 10:51:19.119305 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-config\") pod \"neutron-744cf8b8bf-vxfhd\" (UID: \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\") " pod="openstack/neutron-744cf8b8bf-vxfhd" Feb 02 10:51:19 crc kubenswrapper[4909]: I0202 10:51:19.121601 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-ovndb-tls-certs\") pod \"neutron-744cf8b8bf-vxfhd\" (UID: \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\") " pod="openstack/neutron-744cf8b8bf-vxfhd" Feb 02 10:51:19 crc kubenswrapper[4909]: I0202 10:51:19.121756 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-internal-tls-certs\") pod \"neutron-744cf8b8bf-vxfhd\" (UID: \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\") " pod="openstack/neutron-744cf8b8bf-vxfhd" Feb 02 10:51:19 crc kubenswrapper[4909]: I0202 10:51:19.121898 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-public-tls-certs\") pod \"neutron-744cf8b8bf-vxfhd\" (UID: \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\") " pod="openstack/neutron-744cf8b8bf-vxfhd" Feb 02 10:51:19 crc 
kubenswrapper[4909]: I0202 10:51:19.130104 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-httpd-config\") pod \"neutron-744cf8b8bf-vxfhd\" (UID: \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\") " pod="openstack/neutron-744cf8b8bf-vxfhd" Feb 02 10:51:19 crc kubenswrapper[4909]: I0202 10:51:19.136249 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9r2d\" (UniqueName: \"kubernetes.io/projected/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-kube-api-access-z9r2d\") pod \"neutron-744cf8b8bf-vxfhd\" (UID: \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\") " pod="openstack/neutron-744cf8b8bf-vxfhd" Feb 02 10:51:19 crc kubenswrapper[4909]: I0202 10:51:19.197545 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-744cf8b8bf-vxfhd" Feb 02 10:51:21 crc kubenswrapper[4909]: I0202 10:51:21.933951 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-c27bb" Feb 02 10:51:21 crc kubenswrapper[4909]: I0202 10:51:21.967633 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8sv84" Feb 02 10:51:21 crc kubenswrapper[4909]: I0202 10:51:21.992739 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-lzdlk" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.084352 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5d450e8-b58b-423c-afae-ed534a2d65ed-scripts\") pod \"d5d450e8-b58b-423c-afae-ed534a2d65ed\" (UID: \"d5d450e8-b58b-423c-afae-ed534a2d65ed\") " Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.084742 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-credential-keys\") pod \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\" (UID: \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\") " Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.084773 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-config-data\") pod \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\" (UID: \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\") " Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.084871 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d68a2056-e886-4135-a63a-3755df0703af-combined-ca-bundle\") pod \"d68a2056-e886-4135-a63a-3755df0703af\" (UID: \"d68a2056-e886-4135-a63a-3755df0703af\") " Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.084896 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d68a2056-e886-4135-a63a-3755df0703af-db-sync-config-data\") pod \"d68a2056-e886-4135-a63a-3755df0703af\" (UID: \"d68a2056-e886-4135-a63a-3755df0703af\") " Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.084921 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wrv5\" 
(UniqueName: \"kubernetes.io/projected/d68a2056-e886-4135-a63a-3755df0703af-kube-api-access-6wrv5\") pod \"d68a2056-e886-4135-a63a-3755df0703af\" (UID: \"d68a2056-e886-4135-a63a-3755df0703af\") " Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.084960 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-scripts\") pod \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\" (UID: \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\") " Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.084990 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5d450e8-b58b-423c-afae-ed534a2d65ed-logs\") pod \"d5d450e8-b58b-423c-afae-ed534a2d65ed\" (UID: \"d5d450e8-b58b-423c-afae-ed534a2d65ed\") " Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.085022 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5d450e8-b58b-423c-afae-ed534a2d65ed-config-data\") pod \"d5d450e8-b58b-423c-afae-ed534a2d65ed\" (UID: \"d5d450e8-b58b-423c-afae-ed534a2d65ed\") " Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.085054 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5d450e8-b58b-423c-afae-ed534a2d65ed-combined-ca-bundle\") pod \"d5d450e8-b58b-423c-afae-ed534a2d65ed\" (UID: \"d5d450e8-b58b-423c-afae-ed534a2d65ed\") " Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.085084 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5vqc\" (UniqueName: \"kubernetes.io/projected/d5d450e8-b58b-423c-afae-ed534a2d65ed-kube-api-access-r5vqc\") pod \"d5d450e8-b58b-423c-afae-ed534a2d65ed\" (UID: \"d5d450e8-b58b-423c-afae-ed534a2d65ed\") " Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 
10:51:22.085143 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-combined-ca-bundle\") pod \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\" (UID: \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\") " Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.085164 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn5hs\" (UniqueName: \"kubernetes.io/projected/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-kube-api-access-vn5hs\") pod \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\" (UID: \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\") " Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.085189 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-fernet-keys\") pod \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\" (UID: \"d2391e0d-9f13-489b-8e71-15a8da6cfbfe\") " Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.086291 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5d450e8-b58b-423c-afae-ed534a2d65ed-logs" (OuterVolumeSpecName: "logs") pod "d5d450e8-b58b-423c-afae-ed534a2d65ed" (UID: "d5d450e8-b58b-423c-afae-ed534a2d65ed"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.089957 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5d450e8-b58b-423c-afae-ed534a2d65ed-kube-api-access-r5vqc" (OuterVolumeSpecName: "kube-api-access-r5vqc") pod "d5d450e8-b58b-423c-afae-ed534a2d65ed" (UID: "d5d450e8-b58b-423c-afae-ed534a2d65ed"). InnerVolumeSpecName "kube-api-access-r5vqc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.090589 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d2391e0d-9f13-489b-8e71-15a8da6cfbfe" (UID: "d2391e0d-9f13-489b-8e71-15a8da6cfbfe"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.090616 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d2391e0d-9f13-489b-8e71-15a8da6cfbfe" (UID: "d2391e0d-9f13-489b-8e71-15a8da6cfbfe"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.091532 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d68a2056-e886-4135-a63a-3755df0703af-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d68a2056-e886-4135-a63a-3755df0703af" (UID: "d68a2056-e886-4135-a63a-3755df0703af"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.092617 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d68a2056-e886-4135-a63a-3755df0703af-kube-api-access-6wrv5" (OuterVolumeSpecName: "kube-api-access-6wrv5") pod "d68a2056-e886-4135-a63a-3755df0703af" (UID: "d68a2056-e886-4135-a63a-3755df0703af"). InnerVolumeSpecName "kube-api-access-6wrv5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.093455 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5d450e8-b58b-423c-afae-ed534a2d65ed-scripts" (OuterVolumeSpecName: "scripts") pod "d5d450e8-b58b-423c-afae-ed534a2d65ed" (UID: "d5d450e8-b58b-423c-afae-ed534a2d65ed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.101399 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-scripts" (OuterVolumeSpecName: "scripts") pod "d2391e0d-9f13-489b-8e71-15a8da6cfbfe" (UID: "d2391e0d-9f13-489b-8e71-15a8da6cfbfe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.101655 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-kube-api-access-vn5hs" (OuterVolumeSpecName: "kube-api-access-vn5hs") pod "d2391e0d-9f13-489b-8e71-15a8da6cfbfe" (UID: "d2391e0d-9f13-489b-8e71-15a8da6cfbfe"). InnerVolumeSpecName "kube-api-access-vn5hs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.115983 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d68a2056-e886-4135-a63a-3755df0703af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d68a2056-e886-4135-a63a-3755df0703af" (UID: "d68a2056-e886-4135-a63a-3755df0703af"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.118017 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5d450e8-b58b-423c-afae-ed534a2d65ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5d450e8-b58b-423c-afae-ed534a2d65ed" (UID: "d5d450e8-b58b-423c-afae-ed534a2d65ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.127150 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5d450e8-b58b-423c-afae-ed534a2d65ed-config-data" (OuterVolumeSpecName: "config-data") pod "d5d450e8-b58b-423c-afae-ed534a2d65ed" (UID: "d5d450e8-b58b-423c-afae-ed534a2d65ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.129429 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2391e0d-9f13-489b-8e71-15a8da6cfbfe" (UID: "d2391e0d-9f13-489b-8e71-15a8da6cfbfe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.132100 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-config-data" (OuterVolumeSpecName: "config-data") pod "d2391e0d-9f13-489b-8e71-15a8da6cfbfe" (UID: "d2391e0d-9f13-489b-8e71-15a8da6cfbfe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.191049 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.191091 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vn5hs\" (UniqueName: \"kubernetes.io/projected/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-kube-api-access-vn5hs\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.191109 4909 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.191117 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5d450e8-b58b-423c-afae-ed534a2d65ed-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.191129 4909 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.191141 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.191151 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d68a2056-e886-4135-a63a-3755df0703af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.191160 4909 
reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d68a2056-e886-4135-a63a-3755df0703af-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.191169 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wrv5\" (UniqueName: \"kubernetes.io/projected/d68a2056-e886-4135-a63a-3755df0703af-kube-api-access-6wrv5\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.191181 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2391e0d-9f13-489b-8e71-15a8da6cfbfe-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.191191 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5d450e8-b58b-423c-afae-ed534a2d65ed-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.191201 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5d450e8-b58b-423c-afae-ed534a2d65ed-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.191209 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5d450e8-b58b-423c-afae-ed534a2d65ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.191220 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5vqc\" (UniqueName: \"kubernetes.io/projected/d5d450e8-b58b-423c-afae-ed534a2d65ed-kube-api-access-r5vqc\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.328199 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-744cf8b8bf-vxfhd"] Feb 02 10:51:22 crc 
kubenswrapper[4909]: W0202 10:51:22.330398 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e0c3b2f_546a_403b_9dee_bda4c14ab84d.slice/crio-02c7bbeecb89a33f46ebbe19f782d307e40f5060a30bd0e150f918b3bb551bed WatchSource:0}: Error finding container 02c7bbeecb89a33f46ebbe19f782d307e40f5060a30bd0e150f918b3bb551bed: Status 404 returned error can't find the container with id 02c7bbeecb89a33f46ebbe19f782d307e40f5060a30bd0e150f918b3bb551bed Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.338839 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8sv84" event={"ID":"d2391e0d-9f13-489b-8e71-15a8da6cfbfe","Type":"ContainerDied","Data":"ec3a34f1cc494771447890260d6faef67f89105eb03322c56d11e165e74ae577"} Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.339083 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec3a34f1cc494771447890260d6faef67f89105eb03322c56d11e165e74ae577" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.339976 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8sv84" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.341171 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b4235b4-44a0-4238-9b70-ad3ea946f729","Type":"ContainerStarted","Data":"ec6c43b06eb3c0e8cc9e57843ebecb998a72a14920fbaa5f98469840ab83d3e9"} Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.342901 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-lzdlk" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.342900 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lzdlk" event={"ID":"d68a2056-e886-4135-a63a-3755df0703af","Type":"ContainerDied","Data":"63c903f7836c76f2f2b2ad18e9317934d24d5901bb8d7e9ddf4e72f6a3ec84dd"} Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.342974 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63c903f7836c76f2f2b2ad18e9317934d24d5901bb8d7e9ddf4e72f6a3ec84dd" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.345040 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-c27bb" event={"ID":"d5d450e8-b58b-423c-afae-ed534a2d65ed","Type":"ContainerDied","Data":"aeb7f4815af35af7824c2d5ec4b427b66fb43ab01eabcbc5b5b383989e34478e"} Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.345075 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aeb7f4815af35af7824c2d5ec4b427b66fb43ab01eabcbc5b5b383989e34478e" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.345138 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-c27bb" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.742218 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.742591 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.781605 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 02 10:51:22 crc kubenswrapper[4909]: I0202 10:51:22.813399 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.081748 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-54c465c874-5jkf8"] Feb 02 10:51:23 crc kubenswrapper[4909]: E0202 10:51:23.082349 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d68a2056-e886-4135-a63a-3755df0703af" containerName="barbican-db-sync" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.082368 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d68a2056-e886-4135-a63a-3755df0703af" containerName="barbican-db-sync" Feb 02 10:51:23 crc kubenswrapper[4909]: E0202 10:51:23.082387 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2391e0d-9f13-489b-8e71-15a8da6cfbfe" containerName="keystone-bootstrap" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.082395 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2391e0d-9f13-489b-8e71-15a8da6cfbfe" containerName="keystone-bootstrap" Feb 02 10:51:23 crc kubenswrapper[4909]: E0202 10:51:23.082424 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5d450e8-b58b-423c-afae-ed534a2d65ed" containerName="placement-db-sync" Feb 02 10:51:23 crc 
kubenswrapper[4909]: I0202 10:51:23.082430 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5d450e8-b58b-423c-afae-ed534a2d65ed" containerName="placement-db-sync" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.082625 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5d450e8-b58b-423c-afae-ed534a2d65ed" containerName="placement-db-sync" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.082638 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d68a2056-e886-4135-a63a-3755df0703af" containerName="barbican-db-sync" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.082651 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2391e0d-9f13-489b-8e71-15a8da6cfbfe" containerName="keystone-bootstrap" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.083923 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-54c465c874-5jkf8" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.091461 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.092551 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.092783 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.093066 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.093298 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-kqfqc" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.098407 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-54c465c874-5jkf8"] Feb 02 10:51:23 
crc kubenswrapper[4909]: I0202 10:51:23.186194 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7bb594654d-prg2q"] Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.190027 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.192725 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.195699 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.195847 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.195907 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.196123 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.196502 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-l4g7z" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.204559 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7bb594654d-prg2q"] Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.212234 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-config-data\") pod \"placement-54c465c874-5jkf8\" (UID: \"a4a64068-38da-44e0-99a6-93aa570aef32\") " pod="openstack/placement-54c465c874-5jkf8" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.212308 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-public-tls-certs\") pod \"placement-54c465c874-5jkf8\" (UID: \"a4a64068-38da-44e0-99a6-93aa570aef32\") " pod="openstack/placement-54c465c874-5jkf8" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.212368 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a64068-38da-44e0-99a6-93aa570aef32-logs\") pod \"placement-54c465c874-5jkf8\" (UID: \"a4a64068-38da-44e0-99a6-93aa570aef32\") " pod="openstack/placement-54c465c874-5jkf8" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.212574 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-internal-tls-certs\") pod \"placement-54c465c874-5jkf8\" (UID: \"a4a64068-38da-44e0-99a6-93aa570aef32\") " pod="openstack/placement-54c465c874-5jkf8" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.212601 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-combined-ca-bundle\") pod \"placement-54c465c874-5jkf8\" (UID: \"a4a64068-38da-44e0-99a6-93aa570aef32\") " pod="openstack/placement-54c465c874-5jkf8" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.212653 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-scripts\") pod \"placement-54c465c874-5jkf8\" (UID: \"a4a64068-38da-44e0-99a6-93aa570aef32\") " pod="openstack/placement-54c465c874-5jkf8" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.212705 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnn8s\" (UniqueName: \"kubernetes.io/projected/a4a64068-38da-44e0-99a6-93aa570aef32-kube-api-access-lnn8s\") pod \"placement-54c465c874-5jkf8\" (UID: \"a4a64068-38da-44e0-99a6-93aa570aef32\") " pod="openstack/placement-54c465c874-5jkf8" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.291205 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-658647b45-s5s8w"] Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.293105 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-658647b45-s5s8w" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.299482 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.299667 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.299939 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-87r8w" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.314355 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-fernet-keys\") pod \"keystone-7bb594654d-prg2q\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.314398 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-scripts\") pod \"keystone-7bb594654d-prg2q\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:51:23 crc 
kubenswrapper[4909]: I0202 10:51:23.314434 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-internal-tls-certs\") pod \"placement-54c465c874-5jkf8\" (UID: \"a4a64068-38da-44e0-99a6-93aa570aef32\") " pod="openstack/placement-54c465c874-5jkf8" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.314454 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-combined-ca-bundle\") pod \"placement-54c465c874-5jkf8\" (UID: \"a4a64068-38da-44e0-99a6-93aa570aef32\") " pod="openstack/placement-54c465c874-5jkf8" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.314471 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-scripts\") pod \"placement-54c465c874-5jkf8\" (UID: \"a4a64068-38da-44e0-99a6-93aa570aef32\") " pod="openstack/placement-54c465c874-5jkf8" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.314724 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-combined-ca-bundle\") pod \"keystone-7bb594654d-prg2q\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.314768 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-credential-keys\") pod \"keystone-7bb594654d-prg2q\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.314785 
4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-config-data\") pod \"keystone-7bb594654d-prg2q\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.314831 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnn8s\" (UniqueName: \"kubernetes.io/projected/a4a64068-38da-44e0-99a6-93aa570aef32-kube-api-access-lnn8s\") pod \"placement-54c465c874-5jkf8\" (UID: \"a4a64068-38da-44e0-99a6-93aa570aef32\") " pod="openstack/placement-54c465c874-5jkf8" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.315467 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-658647b45-s5s8w"] Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.315490 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-internal-tls-certs\") pod \"keystone-7bb594654d-prg2q\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.315601 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-config-data\") pod \"placement-54c465c874-5jkf8\" (UID: \"a4a64068-38da-44e0-99a6-93aa570aef32\") " pod="openstack/placement-54c465c874-5jkf8" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.315676 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-public-tls-certs\") pod \"placement-54c465c874-5jkf8\" (UID: 
\"a4a64068-38da-44e0-99a6-93aa570aef32\") " pod="openstack/placement-54c465c874-5jkf8" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.315706 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a64068-38da-44e0-99a6-93aa570aef32-logs\") pod \"placement-54c465c874-5jkf8\" (UID: \"a4a64068-38da-44e0-99a6-93aa570aef32\") " pod="openstack/placement-54c465c874-5jkf8" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.315758 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glpp4\" (UniqueName: \"kubernetes.io/projected/86bc749f-73e5-4bcc-8079-7c9b053e0318-kube-api-access-glpp4\") pod \"keystone-7bb594654d-prg2q\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.315883 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-public-tls-certs\") pod \"keystone-7bb594654d-prg2q\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.316431 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a64068-38da-44e0-99a6-93aa570aef32-logs\") pod \"placement-54c465c874-5jkf8\" (UID: \"a4a64068-38da-44e0-99a6-93aa570aef32\") " pod="openstack/placement-54c465c874-5jkf8" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.326129 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-public-tls-certs\") pod \"placement-54c465c874-5jkf8\" (UID: \"a4a64068-38da-44e0-99a6-93aa570aef32\") " 
pod="openstack/placement-54c465c874-5jkf8" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.332058 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-combined-ca-bundle\") pod \"placement-54c465c874-5jkf8\" (UID: \"a4a64068-38da-44e0-99a6-93aa570aef32\") " pod="openstack/placement-54c465c874-5jkf8" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.332908 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-internal-tls-certs\") pod \"placement-54c465c874-5jkf8\" (UID: \"a4a64068-38da-44e0-99a6-93aa570aef32\") " pod="openstack/placement-54c465c874-5jkf8" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.333819 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-config-data\") pod \"placement-54c465c874-5jkf8\" (UID: \"a4a64068-38da-44e0-99a6-93aa570aef32\") " pod="openstack/placement-54c465c874-5jkf8" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.353486 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-866f4c5954-tljg7"] Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.370350 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-866f4c5954-tljg7" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.371182 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnn8s\" (UniqueName: \"kubernetes.io/projected/a4a64068-38da-44e0-99a6-93aa570aef32-kube-api-access-lnn8s\") pod \"placement-54c465c874-5jkf8\" (UID: \"a4a64068-38da-44e0-99a6-93aa570aef32\") " pod="openstack/placement-54c465c874-5jkf8" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.373576 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.380568 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-744cf8b8bf-vxfhd" event={"ID":"2e0c3b2f-546a-403b-9dee-bda4c14ab84d","Type":"ContainerStarted","Data":"e492024f10acf7ea7ef2c1ebfebff32d8036b6ed9959ea1ae9ac17b82834ea3c"} Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.380619 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-744cf8b8bf-vxfhd" event={"ID":"2e0c3b2f-546a-403b-9dee-bda4c14ab84d","Type":"ContainerStarted","Data":"d9b7e7c2d82ea6bc4b113faefe37915f651c5dc9bd63468cbb8600d907ba119e"} Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.380633 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-744cf8b8bf-vxfhd" event={"ID":"2e0c3b2f-546a-403b-9dee-bda4c14ab84d","Type":"ContainerStarted","Data":"02c7bbeecb89a33f46ebbe19f782d307e40f5060a30bd0e150f918b3bb551bed"} Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.380652 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.380670 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 
10:51:23.380682 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-744cf8b8bf-vxfhd" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.389505 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-scripts\") pod \"placement-54c465c874-5jkf8\" (UID: \"a4a64068-38da-44e0-99a6-93aa570aef32\") " pod="openstack/placement-54c465c874-5jkf8" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.416625 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-866f4c5954-tljg7"] Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.417482 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-fernet-keys\") pod \"keystone-7bb594654d-prg2q\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.417507 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-scripts\") pod \"keystone-7bb594654d-prg2q\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.417538 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m45nc\" (UniqueName: \"kubernetes.io/projected/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-kube-api-access-m45nc\") pod \"barbican-worker-658647b45-s5s8w\" (UID: \"ade61e3a-3fd8-40af-8891-46c51e7a9d1b\") " pod="openstack/barbican-worker-658647b45-s5s8w" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.417556 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-logs\") pod \"barbican-worker-658647b45-s5s8w\" (UID: \"ade61e3a-3fd8-40af-8891-46c51e7a9d1b\") " pod="openstack/barbican-worker-658647b45-s5s8w" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.417577 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-combined-ca-bundle\") pod \"keystone-7bb594654d-prg2q\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.417593 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-config-data-custom\") pod \"barbican-worker-658647b45-s5s8w\" (UID: \"ade61e3a-3fd8-40af-8891-46c51e7a9d1b\") " pod="openstack/barbican-worker-658647b45-s5s8w" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.417634 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-config-data\") pod \"barbican-worker-658647b45-s5s8w\" (UID: \"ade61e3a-3fd8-40af-8891-46c51e7a9d1b\") " pod="openstack/barbican-worker-658647b45-s5s8w" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.417650 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-credential-keys\") pod \"keystone-7bb594654d-prg2q\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.417665 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-config-data\") pod \"keystone-7bb594654d-prg2q\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.417687 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-internal-tls-certs\") pod \"keystone-7bb594654d-prg2q\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.417702 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-combined-ca-bundle\") pod \"barbican-worker-658647b45-s5s8w\" (UID: \"ade61e3a-3fd8-40af-8891-46c51e7a9d1b\") " pod="openstack/barbican-worker-658647b45-s5s8w" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.417749 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glpp4\" (UniqueName: \"kubernetes.io/projected/86bc749f-73e5-4bcc-8079-7c9b053e0318-kube-api-access-glpp4\") pod \"keystone-7bb594654d-prg2q\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.417799 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-public-tls-certs\") pod \"keystone-7bb594654d-prg2q\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.424070 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-public-tls-certs\") pod \"keystone-7bb594654d-prg2q\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.428125 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-fernet-keys\") pod \"keystone-7bb594654d-prg2q\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.428293 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-54c465c874-5jkf8" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.435644 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-scripts\") pod \"keystone-7bb594654d-prg2q\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.444874 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-combined-ca-bundle\") pod \"keystone-7bb594654d-prg2q\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.445533 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-config-data\") pod \"keystone-7bb594654d-prg2q\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.475515 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-credential-keys\") pod \"keystone-7bb594654d-prg2q\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.527028 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m45nc\" (UniqueName: \"kubernetes.io/projected/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-kube-api-access-m45nc\") pod \"barbican-worker-658647b45-s5s8w\" (UID: \"ade61e3a-3fd8-40af-8891-46c51e7a9d1b\") " pod="openstack/barbican-worker-658647b45-s5s8w" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.527075 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-logs\") pod \"barbican-worker-658647b45-s5s8w\" (UID: \"ade61e3a-3fd8-40af-8891-46c51e7a9d1b\") " pod="openstack/barbican-worker-658647b45-s5s8w" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.527103 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-config-data-custom\") pod \"barbican-worker-658647b45-s5s8w\" (UID: \"ade61e3a-3fd8-40af-8891-46c51e7a9d1b\") " pod="openstack/barbican-worker-658647b45-s5s8w" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.527138 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-config-data\") pod \"barbican-worker-658647b45-s5s8w\" (UID: \"ade61e3a-3fd8-40af-8891-46c51e7a9d1b\") " pod="openstack/barbican-worker-658647b45-s5s8w" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.527177 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-combined-ca-bundle\") pod \"barbican-worker-658647b45-s5s8w\" (UID: \"ade61e3a-3fd8-40af-8891-46c51e7a9d1b\") " pod="openstack/barbican-worker-658647b45-s5s8w" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.531046 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-logs\") pod \"barbican-worker-658647b45-s5s8w\" (UID: \"ade61e3a-3fd8-40af-8891-46c51e7a9d1b\") " pod="openstack/barbican-worker-658647b45-s5s8w" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.534485 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f455b5fc7-dnppd"] Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.534922 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" podUID="7b681927-4a4d-4177-97ec-eee1e9d43cb8" containerName="dnsmasq-dns" containerID="cri-o://96841beac53340f127b385adf0a063a9b0c1245d7d50c79437e842a511862338" gracePeriod=10 Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.555173 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-internal-tls-certs\") pod \"keystone-7bb594654d-prg2q\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.556231 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.561616 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glpp4\" (UniqueName: \"kubernetes.io/projected/86bc749f-73e5-4bcc-8079-7c9b053e0318-kube-api-access-glpp4\") pod \"keystone-7bb594654d-prg2q\" (UID: 
\"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.570986 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-combined-ca-bundle\") pod \"barbican-worker-658647b45-s5s8w\" (UID: \"ade61e3a-3fd8-40af-8891-46c51e7a9d1b\") " pod="openstack/barbican-worker-658647b45-s5s8w" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.572442 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m45nc\" (UniqueName: \"kubernetes.io/projected/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-kube-api-access-m45nc\") pod \"barbican-worker-658647b45-s5s8w\" (UID: \"ade61e3a-3fd8-40af-8891-46c51e7a9d1b\") " pod="openstack/barbican-worker-658647b45-s5s8w" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.583030 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-config-data-custom\") pod \"barbican-worker-658647b45-s5s8w\" (UID: \"ade61e3a-3fd8-40af-8891-46c51e7a9d1b\") " pod="openstack/barbican-worker-658647b45-s5s8w" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.602966 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-config-data\") pod \"barbican-worker-658647b45-s5s8w\" (UID: \"ade61e3a-3fd8-40af-8891-46c51e7a9d1b\") " pod="openstack/barbican-worker-658647b45-s5s8w" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.608217 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-744cf8b8bf-vxfhd" podStartSLOduration=5.608198725 podStartE2EDuration="5.608198725s" podCreationTimestamp="2026-02-02 10:51:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:51:23.465831733 +0000 UTC m=+1209.211932468" watchObservedRunningTime="2026-02-02 10:51:23.608198725 +0000 UTC m=+1209.354299460" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.629782 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82e7f090-1c8a-419e-95da-4d6c82bcde8d-config-data\") pod \"barbican-keystone-listener-866f4c5954-tljg7\" (UID: \"82e7f090-1c8a-419e-95da-4d6c82bcde8d\") " pod="openstack/barbican-keystone-listener-866f4c5954-tljg7" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.629873 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82e7f090-1c8a-419e-95da-4d6c82bcde8d-logs\") pod \"barbican-keystone-listener-866f4c5954-tljg7\" (UID: \"82e7f090-1c8a-419e-95da-4d6c82bcde8d\") " pod="openstack/barbican-keystone-listener-866f4c5954-tljg7" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.629960 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5sph\" (UniqueName: \"kubernetes.io/projected/82e7f090-1c8a-419e-95da-4d6c82bcde8d-kube-api-access-k5sph\") pod \"barbican-keystone-listener-866f4c5954-tljg7\" (UID: \"82e7f090-1c8a-419e-95da-4d6c82bcde8d\") " pod="openstack/barbican-keystone-listener-866f4c5954-tljg7" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.630084 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82e7f090-1c8a-419e-95da-4d6c82bcde8d-config-data-custom\") pod \"barbican-keystone-listener-866f4c5954-tljg7\" (UID: \"82e7f090-1c8a-419e-95da-4d6c82bcde8d\") " pod="openstack/barbican-keystone-listener-866f4c5954-tljg7" Feb 02 10:51:23 crc kubenswrapper[4909]: 
I0202 10:51:23.630153 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e7f090-1c8a-419e-95da-4d6c82bcde8d-combined-ca-bundle\") pod \"barbican-keystone-listener-866f4c5954-tljg7\" (UID: \"82e7f090-1c8a-419e-95da-4d6c82bcde8d\") " pod="openstack/barbican-keystone-listener-866f4c5954-tljg7" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.691772 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-75589bd9c8-npg4p"] Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.694167 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-75589bd9c8-npg4p" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.715366 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6d4cb98fc-6tg42"] Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.716739 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6d4cb98fc-6tg42" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.731972 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e7f090-1c8a-419e-95da-4d6c82bcde8d-combined-ca-bundle\") pod \"barbican-keystone-listener-866f4c5954-tljg7\" (UID: \"82e7f090-1c8a-419e-95da-4d6c82bcde8d\") " pod="openstack/barbican-keystone-listener-866f4c5954-tljg7" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.732362 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82e7f090-1c8a-419e-95da-4d6c82bcde8d-config-data\") pod \"barbican-keystone-listener-866f4c5954-tljg7\" (UID: \"82e7f090-1c8a-419e-95da-4d6c82bcde8d\") " pod="openstack/barbican-keystone-listener-866f4c5954-tljg7" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.732393 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82e7f090-1c8a-419e-95da-4d6c82bcde8d-logs\") pod \"barbican-keystone-listener-866f4c5954-tljg7\" (UID: \"82e7f090-1c8a-419e-95da-4d6c82bcde8d\") " pod="openstack/barbican-keystone-listener-866f4c5954-tljg7" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.732429 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5sph\" (UniqueName: \"kubernetes.io/projected/82e7f090-1c8a-419e-95da-4d6c82bcde8d-kube-api-access-k5sph\") pod \"barbican-keystone-listener-866f4c5954-tljg7\" (UID: \"82e7f090-1c8a-419e-95da-4d6c82bcde8d\") " pod="openstack/barbican-keystone-listener-866f4c5954-tljg7" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.732469 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/82e7f090-1c8a-419e-95da-4d6c82bcde8d-config-data-custom\") pod \"barbican-keystone-listener-866f4c5954-tljg7\" (UID: \"82e7f090-1c8a-419e-95da-4d6c82bcde8d\") " pod="openstack/barbican-keystone-listener-866f4c5954-tljg7" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.739870 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82e7f090-1c8a-419e-95da-4d6c82bcde8d-logs\") pod \"barbican-keystone-listener-866f4c5954-tljg7\" (UID: \"82e7f090-1c8a-419e-95da-4d6c82bcde8d\") " pod="openstack/barbican-keystone-listener-866f4c5954-tljg7" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.768880 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b55f48d49-pswzg"] Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.770284 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.787619 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6d4cb98fc-6tg42"] Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.802883 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-75589bd9c8-npg4p"] Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.813794 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-658647b45-s5s8w" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.814316 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.835239 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-config\") pod \"dnsmasq-dns-6b55f48d49-pswzg\" (UID: \"113b4dcd-ae8d-4c1e-af62-07441b8665a9\") " pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.835280 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-logs\") pod \"barbican-keystone-listener-75589bd9c8-npg4p\" (UID: \"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a\") " pod="openstack/barbican-keystone-listener-75589bd9c8-npg4p" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.835312 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-dns-svc\") pod \"dnsmasq-dns-6b55f48d49-pswzg\" (UID: \"113b4dcd-ae8d-4c1e-af62-07441b8665a9\") " pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.835338 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4589304-68d2-48c9-a691-e34a9cb4c75b-logs\") pod \"barbican-worker-6d4cb98fc-6tg42\" (UID: \"d4589304-68d2-48c9-a691-e34a9cb4c75b\") " pod="openstack/barbican-worker-6d4cb98fc-6tg42" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.835361 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-ovsdbserver-nb\") pod \"dnsmasq-dns-6b55f48d49-pswzg\" (UID: 
\"113b4dcd-ae8d-4c1e-af62-07441b8665a9\") " pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.835407 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-ovsdbserver-sb\") pod \"dnsmasq-dns-6b55f48d49-pswzg\" (UID: \"113b4dcd-ae8d-4c1e-af62-07441b8665a9\") " pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.835457 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4589304-68d2-48c9-a691-e34a9cb4c75b-combined-ca-bundle\") pod \"barbican-worker-6d4cb98fc-6tg42\" (UID: \"d4589304-68d2-48c9-a691-e34a9cb4c75b\") " pod="openstack/barbican-worker-6d4cb98fc-6tg42" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.835524 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4589304-68d2-48c9-a691-e34a9cb4c75b-config-data\") pod \"barbican-worker-6d4cb98fc-6tg42\" (UID: \"d4589304-68d2-48c9-a691-e34a9cb4c75b\") " pod="openstack/barbican-worker-6d4cb98fc-6tg42" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.835567 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkwsb\" (UniqueName: \"kubernetes.io/projected/113b4dcd-ae8d-4c1e-af62-07441b8665a9-kube-api-access-fkwsb\") pod \"dnsmasq-dns-6b55f48d49-pswzg\" (UID: \"113b4dcd-ae8d-4c1e-af62-07441b8665a9\") " pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.835600 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-config-data\") pod \"barbican-keystone-listener-75589bd9c8-npg4p\" (UID: \"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a\") " pod="openstack/barbican-keystone-listener-75589bd9c8-npg4p" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.835627 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-config-data-custom\") pod \"barbican-keystone-listener-75589bd9c8-npg4p\" (UID: \"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a\") " pod="openstack/barbican-keystone-listener-75589bd9c8-npg4p" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.835659 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nw9q\" (UniqueName: \"kubernetes.io/projected/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-kube-api-access-6nw9q\") pod \"barbican-keystone-listener-75589bd9c8-npg4p\" (UID: \"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a\") " pod="openstack/barbican-keystone-listener-75589bd9c8-npg4p" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.835689 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-dns-swift-storage-0\") pod \"dnsmasq-dns-6b55f48d49-pswzg\" (UID: \"113b4dcd-ae8d-4c1e-af62-07441b8665a9\") " pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.835719 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4589304-68d2-48c9-a691-e34a9cb4c75b-config-data-custom\") pod \"barbican-worker-6d4cb98fc-6tg42\" (UID: \"d4589304-68d2-48c9-a691-e34a9cb4c75b\") " pod="openstack/barbican-worker-6d4cb98fc-6tg42" Feb 02 10:51:23 crc 
kubenswrapper[4909]: I0202 10:51:23.835745 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd56h\" (UniqueName: \"kubernetes.io/projected/d4589304-68d2-48c9-a691-e34a9cb4c75b-kube-api-access-jd56h\") pod \"barbican-worker-6d4cb98fc-6tg42\" (UID: \"d4589304-68d2-48c9-a691-e34a9cb4c75b\") " pod="openstack/barbican-worker-6d4cb98fc-6tg42" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.835781 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-combined-ca-bundle\") pod \"barbican-keystone-listener-75589bd9c8-npg4p\" (UID: \"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a\") " pod="openstack/barbican-keystone-listener-75589bd9c8-npg4p" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.836914 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5sph\" (UniqueName: \"kubernetes.io/projected/82e7f090-1c8a-419e-95da-4d6c82bcde8d-kube-api-access-k5sph\") pod \"barbican-keystone-listener-866f4c5954-tljg7\" (UID: \"82e7f090-1c8a-419e-95da-4d6c82bcde8d\") " pod="openstack/barbican-keystone-listener-866f4c5954-tljg7" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.837362 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82e7f090-1c8a-419e-95da-4d6c82bcde8d-config-data-custom\") pod \"barbican-keystone-listener-866f4c5954-tljg7\" (UID: \"82e7f090-1c8a-419e-95da-4d6c82bcde8d\") " pod="openstack/barbican-keystone-listener-866f4c5954-tljg7" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.839295 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82e7f090-1c8a-419e-95da-4d6c82bcde8d-config-data\") pod \"barbican-keystone-listener-866f4c5954-tljg7\" (UID: 
\"82e7f090-1c8a-419e-95da-4d6c82bcde8d\") " pod="openstack/barbican-keystone-listener-866f4c5954-tljg7" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.841968 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b55f48d49-pswzg"] Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.859492 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e7f090-1c8a-419e-95da-4d6c82bcde8d-combined-ca-bundle\") pod \"barbican-keystone-listener-866f4c5954-tljg7\" (UID: \"82e7f090-1c8a-419e-95da-4d6c82bcde8d\") " pod="openstack/barbican-keystone-listener-866f4c5954-tljg7" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.937414 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-ovsdbserver-sb\") pod \"dnsmasq-dns-6b55f48d49-pswzg\" (UID: \"113b4dcd-ae8d-4c1e-af62-07441b8665a9\") " pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.937451 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4589304-68d2-48c9-a691-e34a9cb4c75b-combined-ca-bundle\") pod \"barbican-worker-6d4cb98fc-6tg42\" (UID: \"d4589304-68d2-48c9-a691-e34a9cb4c75b\") " pod="openstack/barbican-worker-6d4cb98fc-6tg42" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.937497 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4589304-68d2-48c9-a691-e34a9cb4c75b-config-data\") pod \"barbican-worker-6d4cb98fc-6tg42\" (UID: \"d4589304-68d2-48c9-a691-e34a9cb4c75b\") " pod="openstack/barbican-worker-6d4cb98fc-6tg42" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.937536 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-fkwsb\" (UniqueName: \"kubernetes.io/projected/113b4dcd-ae8d-4c1e-af62-07441b8665a9-kube-api-access-fkwsb\") pod \"dnsmasq-dns-6b55f48d49-pswzg\" (UID: \"113b4dcd-ae8d-4c1e-af62-07441b8665a9\") " pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.937557 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-config-data\") pod \"barbican-keystone-listener-75589bd9c8-npg4p\" (UID: \"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a\") " pod="openstack/barbican-keystone-listener-75589bd9c8-npg4p" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.937587 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-config-data-custom\") pod \"barbican-keystone-listener-75589bd9c8-npg4p\" (UID: \"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a\") " pod="openstack/barbican-keystone-listener-75589bd9c8-npg4p" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.937618 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nw9q\" (UniqueName: \"kubernetes.io/projected/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-kube-api-access-6nw9q\") pod \"barbican-keystone-listener-75589bd9c8-npg4p\" (UID: \"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a\") " pod="openstack/barbican-keystone-listener-75589bd9c8-npg4p" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.937656 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-dns-swift-storage-0\") pod \"dnsmasq-dns-6b55f48d49-pswzg\" (UID: \"113b4dcd-ae8d-4c1e-af62-07441b8665a9\") " pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.937675 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4589304-68d2-48c9-a691-e34a9cb4c75b-config-data-custom\") pod \"barbican-worker-6d4cb98fc-6tg42\" (UID: \"d4589304-68d2-48c9-a691-e34a9cb4c75b\") " pod="openstack/barbican-worker-6d4cb98fc-6tg42" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.937694 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd56h\" (UniqueName: \"kubernetes.io/projected/d4589304-68d2-48c9-a691-e34a9cb4c75b-kube-api-access-jd56h\") pod \"barbican-worker-6d4cb98fc-6tg42\" (UID: \"d4589304-68d2-48c9-a691-e34a9cb4c75b\") " pod="openstack/barbican-worker-6d4cb98fc-6tg42" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.937715 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-combined-ca-bundle\") pod \"barbican-keystone-listener-75589bd9c8-npg4p\" (UID: \"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a\") " pod="openstack/barbican-keystone-listener-75589bd9c8-npg4p" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.937755 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-config\") pod \"dnsmasq-dns-6b55f48d49-pswzg\" (UID: \"113b4dcd-ae8d-4c1e-af62-07441b8665a9\") " pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.937771 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-logs\") pod \"barbican-keystone-listener-75589bd9c8-npg4p\" (UID: \"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a\") " pod="openstack/barbican-keystone-listener-75589bd9c8-npg4p" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.937789 
4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-dns-svc\") pod \"dnsmasq-dns-6b55f48d49-pswzg\" (UID: \"113b4dcd-ae8d-4c1e-af62-07441b8665a9\") " pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.937820 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4589304-68d2-48c9-a691-e34a9cb4c75b-logs\") pod \"barbican-worker-6d4cb98fc-6tg42\" (UID: \"d4589304-68d2-48c9-a691-e34a9cb4c75b\") " pod="openstack/barbican-worker-6d4cb98fc-6tg42" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.937840 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-ovsdbserver-nb\") pod \"dnsmasq-dns-6b55f48d49-pswzg\" (UID: \"113b4dcd-ae8d-4c1e-af62-07441b8665a9\") " pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.938886 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-ovsdbserver-nb\") pod \"dnsmasq-dns-6b55f48d49-pswzg\" (UID: \"113b4dcd-ae8d-4c1e-af62-07441b8665a9\") " pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.948783 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-dns-swift-storage-0\") pod \"dnsmasq-dns-6b55f48d49-pswzg\" (UID: \"113b4dcd-ae8d-4c1e-af62-07441b8665a9\") " pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.949943 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-ovsdbserver-sb\") pod \"dnsmasq-dns-6b55f48d49-pswzg\" (UID: \"113b4dcd-ae8d-4c1e-af62-07441b8665a9\") " pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.950227 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-logs\") pod \"barbican-keystone-listener-75589bd9c8-npg4p\" (UID: \"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a\") " pod="openstack/barbican-keystone-listener-75589bd9c8-npg4p" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.950584 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-config\") pod \"dnsmasq-dns-6b55f48d49-pswzg\" (UID: \"113b4dcd-ae8d-4c1e-af62-07441b8665a9\") " pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.951708 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4589304-68d2-48c9-a691-e34a9cb4c75b-logs\") pod \"barbican-worker-6d4cb98fc-6tg42\" (UID: \"d4589304-68d2-48c9-a691-e34a9cb4c75b\") " pod="openstack/barbican-worker-6d4cb98fc-6tg42" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.965317 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-dns-svc\") pod \"dnsmasq-dns-6b55f48d49-pswzg\" (UID: \"113b4dcd-ae8d-4c1e-af62-07441b8665a9\") " pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.985517 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-config-data\") pod 
\"barbican-keystone-listener-75589bd9c8-npg4p\" (UID: \"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a\") " pod="openstack/barbican-keystone-listener-75589bd9c8-npg4p" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.986993 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4589304-68d2-48c9-a691-e34a9cb4c75b-config-data\") pod \"barbican-worker-6d4cb98fc-6tg42\" (UID: \"d4589304-68d2-48c9-a691-e34a9cb4c75b\") " pod="openstack/barbican-worker-6d4cb98fc-6tg42" Feb 02 10:51:23 crc kubenswrapper[4909]: I0202 10:51:23.997588 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4589304-68d2-48c9-a691-e34a9cb4c75b-combined-ca-bundle\") pod \"barbican-worker-6d4cb98fc-6tg42\" (UID: \"d4589304-68d2-48c9-a691-e34a9cb4c75b\") " pod="openstack/barbican-worker-6d4cb98fc-6tg42" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.000199 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4589304-68d2-48c9-a691-e34a9cb4c75b-config-data-custom\") pod \"barbican-worker-6d4cb98fc-6tg42\" (UID: \"d4589304-68d2-48c9-a691-e34a9cb4c75b\") " pod="openstack/barbican-worker-6d4cb98fc-6tg42" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.001507 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-config-data-custom\") pod \"barbican-keystone-listener-75589bd9c8-npg4p\" (UID: \"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a\") " pod="openstack/barbican-keystone-listener-75589bd9c8-npg4p" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.002504 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-combined-ca-bundle\") pod 
\"barbican-keystone-listener-75589bd9c8-npg4p\" (UID: \"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a\") " pod="openstack/barbican-keystone-listener-75589bd9c8-npg4p" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.004324 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-866f4c5954-tljg7" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.059391 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-bc5db8d96-8gwft"] Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.061038 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd56h\" (UniqueName: \"kubernetes.io/projected/d4589304-68d2-48c9-a691-e34a9cb4c75b-kube-api-access-jd56h\") pod \"barbican-worker-6d4cb98fc-6tg42\" (UID: \"d4589304-68d2-48c9-a691-e34a9cb4c75b\") " pod="openstack/barbican-worker-6d4cb98fc-6tg42" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.061949 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nw9q\" (UniqueName: \"kubernetes.io/projected/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-kube-api-access-6nw9q\") pod \"barbican-keystone-listener-75589bd9c8-npg4p\" (UID: \"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a\") " pod="openstack/barbican-keystone-listener-75589bd9c8-npg4p" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.062791 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkwsb\" (UniqueName: \"kubernetes.io/projected/113b4dcd-ae8d-4c1e-af62-07441b8665a9-kube-api-access-fkwsb\") pod \"dnsmasq-dns-6b55f48d49-pswzg\" (UID: \"113b4dcd-ae8d-4c1e-af62-07441b8665a9\") " pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.064425 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-bc5db8d96-8gwft" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.073115 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.087509 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-bc5db8d96-8gwft"] Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.100033 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-75589bd9c8-npg4p" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.122673 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6d4cb98fc-6tg42" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.165997 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.198463 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78q8n\" (UniqueName: \"kubernetes.io/projected/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-kube-api-access-78q8n\") pod \"barbican-api-bc5db8d96-8gwft\" (UID: \"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa\") " pod="openstack/barbican-api-bc5db8d96-8gwft" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.198535 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-combined-ca-bundle\") pod \"barbican-api-bc5db8d96-8gwft\" (UID: \"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa\") " pod="openstack/barbican-api-bc5db8d96-8gwft" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.198884 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-config-data\") pod \"barbican-api-bc5db8d96-8gwft\" (UID: \"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa\") " pod="openstack/barbican-api-bc5db8d96-8gwft" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.198948 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-logs\") pod \"barbican-api-bc5db8d96-8gwft\" (UID: \"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa\") " pod="openstack/barbican-api-bc5db8d96-8gwft" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.198984 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-config-data-custom\") pod \"barbican-api-bc5db8d96-8gwft\" (UID: \"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa\") " pod="openstack/barbican-api-bc5db8d96-8gwft" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.301781 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-logs\") pod \"barbican-api-bc5db8d96-8gwft\" (UID: \"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa\") " pod="openstack/barbican-api-bc5db8d96-8gwft" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.301842 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-config-data-custom\") pod \"barbican-api-bc5db8d96-8gwft\" (UID: \"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa\") " pod="openstack/barbican-api-bc5db8d96-8gwft" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.301998 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78q8n\" (UniqueName: 
\"kubernetes.io/projected/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-kube-api-access-78q8n\") pod \"barbican-api-bc5db8d96-8gwft\" (UID: \"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa\") " pod="openstack/barbican-api-bc5db8d96-8gwft" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.302046 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-combined-ca-bundle\") pod \"barbican-api-bc5db8d96-8gwft\" (UID: \"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa\") " pod="openstack/barbican-api-bc5db8d96-8gwft" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.302110 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-config-data\") pod \"barbican-api-bc5db8d96-8gwft\" (UID: \"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa\") " pod="openstack/barbican-api-bc5db8d96-8gwft" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.302822 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-logs\") pod \"barbican-api-bc5db8d96-8gwft\" (UID: \"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa\") " pod="openstack/barbican-api-bc5db8d96-8gwft" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.312601 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-config-data-custom\") pod \"barbican-api-bc5db8d96-8gwft\" (UID: \"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa\") " pod="openstack/barbican-api-bc5db8d96-8gwft" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.324228 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-combined-ca-bundle\") pod 
\"barbican-api-bc5db8d96-8gwft\" (UID: \"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa\") " pod="openstack/barbican-api-bc5db8d96-8gwft" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.326306 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-config-data\") pod \"barbican-api-bc5db8d96-8gwft\" (UID: \"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa\") " pod="openstack/barbican-api-bc5db8d96-8gwft" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.333922 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78q8n\" (UniqueName: \"kubernetes.io/projected/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-kube-api-access-78q8n\") pod \"barbican-api-bc5db8d96-8gwft\" (UID: \"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa\") " pod="openstack/barbican-api-bc5db8d96-8gwft" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.373742 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-54c465c874-5jkf8"] Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.430599 4909 generic.go:334] "Generic (PLEG): container finished" podID="7b681927-4a4d-4177-97ec-eee1e9d43cb8" containerID="96841beac53340f127b385adf0a063a9b0c1245d7d50c79437e842a511862338" exitCode=0 Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.430690 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" event={"ID":"7b681927-4a4d-4177-97ec-eee1e9d43cb8","Type":"ContainerDied","Data":"96841beac53340f127b385adf0a063a9b0c1245d7d50c79437e842a511862338"} Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.441445 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.442687 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 
02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.512422 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.531500 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-bc5db8d96-8gwft" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.564900 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 02 10:51:24 crc kubenswrapper[4909]: I0202 10:51:24.918044 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.044937 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-ovsdbserver-nb\") pod \"7b681927-4a4d-4177-97ec-eee1e9d43cb8\" (UID: \"7b681927-4a4d-4177-97ec-eee1e9d43cb8\") " Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.045001 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-config\") pod \"7b681927-4a4d-4177-97ec-eee1e9d43cb8\" (UID: \"7b681927-4a4d-4177-97ec-eee1e9d43cb8\") " Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.045066 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htxnd\" (UniqueName: \"kubernetes.io/projected/7b681927-4a4d-4177-97ec-eee1e9d43cb8-kube-api-access-htxnd\") pod \"7b681927-4a4d-4177-97ec-eee1e9d43cb8\" (UID: \"7b681927-4a4d-4177-97ec-eee1e9d43cb8\") " Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.045174 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-ovsdbserver-sb\") pod \"7b681927-4a4d-4177-97ec-eee1e9d43cb8\" (UID: \"7b681927-4a4d-4177-97ec-eee1e9d43cb8\") " Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.045250 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-dns-svc\") pod \"7b681927-4a4d-4177-97ec-eee1e9d43cb8\" (UID: \"7b681927-4a4d-4177-97ec-eee1e9d43cb8\") " Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.045394 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-dns-swift-storage-0\") pod \"7b681927-4a4d-4177-97ec-eee1e9d43cb8\" (UID: \"7b681927-4a4d-4177-97ec-eee1e9d43cb8\") " Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.087574 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b681927-4a4d-4177-97ec-eee1e9d43cb8-kube-api-access-htxnd" (OuterVolumeSpecName: "kube-api-access-htxnd") pod "7b681927-4a4d-4177-97ec-eee1e9d43cb8" (UID: "7b681927-4a4d-4177-97ec-eee1e9d43cb8"). InnerVolumeSpecName "kube-api-access-htxnd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.157522 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htxnd\" (UniqueName: \"kubernetes.io/projected/7b681927-4a4d-4177-97ec-eee1e9d43cb8-kube-api-access-htxnd\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:25 crc kubenswrapper[4909]: W0202 10:51:25.246976 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podade61e3a_3fd8_40af_8891_46c51e7a9d1b.slice/crio-40b2b657e93c64f51fc572ce95f28496a337ccfd3b9acbdaee96dfe00e5bc40e WatchSource:0}: Error finding container 40b2b657e93c64f51fc572ce95f28496a337ccfd3b9acbdaee96dfe00e5bc40e: Status 404 returned error can't find the container with id 40b2b657e93c64f51fc572ce95f28496a337ccfd3b9acbdaee96dfe00e5bc40e Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.303256 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7b681927-4a4d-4177-97ec-eee1e9d43cb8" (UID: "7b681927-4a4d-4177-97ec-eee1e9d43cb8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.341748 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7b681927-4a4d-4177-97ec-eee1e9d43cb8" (UID: "7b681927-4a4d-4177-97ec-eee1e9d43cb8"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.372885 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-75589bd9c8-npg4p"] Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.373415 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7b681927-4a4d-4177-97ec-eee1e9d43cb8" (UID: "7b681927-4a4d-4177-97ec-eee1e9d43cb8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.374463 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-config" (OuterVolumeSpecName: "config") pod "7b681927-4a4d-4177-97ec-eee1e9d43cb8" (UID: "7b681927-4a4d-4177-97ec-eee1e9d43cb8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.378950 4909 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.378971 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.378980 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.378992 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.412903 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7b681927-4a4d-4177-97ec-eee1e9d43cb8" (UID: "7b681927-4a4d-4177-97ec-eee1e9d43cb8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.463642 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-658647b45-s5s8w"] Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.466056 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-658647b45-s5s8w" event={"ID":"ade61e3a-3fd8-40af-8891-46c51e7a9d1b","Type":"ContainerStarted","Data":"40b2b657e93c64f51fc572ce95f28496a337ccfd3b9acbdaee96dfe00e5bc40e"} Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.478132 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" event={"ID":"7b681927-4a4d-4177-97ec-eee1e9d43cb8","Type":"ContainerDied","Data":"93d4c675b5569ec4ea6abd41c19ad5ae0b832bb81497c4ae6710cd1896f187e3"} Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.478219 4909 scope.go:117] "RemoveContainer" containerID="96841beac53340f127b385adf0a063a9b0c1245d7d50c79437e842a511862338" Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.478537 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f455b5fc7-dnppd" Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.490377 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-866f4c5954-tljg7"] Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.501219 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7b681927-4a4d-4177-97ec-eee1e9d43cb8-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.516653 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54c465c874-5jkf8" event={"ID":"a4a64068-38da-44e0-99a6-93aa570aef32","Type":"ContainerStarted","Data":"6b3c5fad03e8dcf1048a837c474f22709297535e0f64315bda7d7454a41aba76"} Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.516783 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54c465c874-5jkf8" event={"ID":"a4a64068-38da-44e0-99a6-93aa570aef32","Type":"ContainerStarted","Data":"2d3c9a4a799772b956833d2b6dfc19d9d1e4f23863940d5d9272286683b6a923"} Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.519324 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7bb594654d-prg2q"] Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.531843 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-866f4c5954-tljg7" event={"ID":"82e7f090-1c8a-419e-95da-4d6c82bcde8d","Type":"ContainerStarted","Data":"c9a181f009668f93fba5b865d2843aa04914cf2e8eafcad2dd22ad44112a2191"} Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.542268 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d4cb98fc-6tg42" event={"ID":"d4589304-68d2-48c9-a691-e34a9cb4c75b","Type":"ContainerStarted","Data":"d45f2f677836f43b339590ff0f90600a91d9a2459322b376e7c3797133b37fbf"} Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 
10:51:25.548906 4909 scope.go:117] "RemoveContainer" containerID="86698faabd6bab4bda58b445b66eb68041ad49d0e26b314280cf2dee0e8599d5" Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.559881 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b55f48d49-pswzg"] Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.562961 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" event={"ID":"113b4dcd-ae8d-4c1e-af62-07441b8665a9","Type":"ContainerStarted","Data":"1decab96899cf3074facf4fdaa1e6b21e99696e00f9a83539714f2dd9df32f6c"} Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.572996 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6d4cb98fc-6tg42"] Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.578555 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-bc5db8d96-8gwft"] Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.584102 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75589bd9c8-npg4p" event={"ID":"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a","Type":"ContainerStarted","Data":"3666c9dd742bb2e660a2e5f09ee5ad6371c8f5a2cea7da3cc530291b2492447c"} Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.588310 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7bb594654d-prg2q" event={"ID":"86bc749f-73e5-4bcc-8079-7c9b053e0318","Type":"ContainerStarted","Data":"970657d27ac1802c3277eb374c96858a64ea9afab783dcccf3e14553e78df636"} Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.588768 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.589662 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 
10:51:25.592644 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f455b5fc7-dnppd"] Feb 02 10:51:25 crc kubenswrapper[4909]: I0202 10:51:25.601987 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f455b5fc7-dnppd"] Feb 02 10:51:26 crc kubenswrapper[4909]: I0202 10:51:26.604210 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54c465c874-5jkf8" event={"ID":"a4a64068-38da-44e0-99a6-93aa570aef32","Type":"ContainerStarted","Data":"68e3348a64665854c90ff9f1880c72e41e44fbceb2cafe25c97f0867483d6f6a"} Feb 02 10:51:26 crc kubenswrapper[4909]: I0202 10:51:26.605017 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-54c465c874-5jkf8" Feb 02 10:51:26 crc kubenswrapper[4909]: I0202 10:51:26.605200 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-54c465c874-5jkf8" Feb 02 10:51:26 crc kubenswrapper[4909]: I0202 10:51:26.620011 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bc5db8d96-8gwft" event={"ID":"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa","Type":"ContainerStarted","Data":"337a70a6ab943486d81b6c0242ebbff8615162d6b8377690d8386e631787317c"} Feb 02 10:51:26 crc kubenswrapper[4909]: I0202 10:51:26.620056 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bc5db8d96-8gwft" event={"ID":"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa","Type":"ContainerStarted","Data":"93d9f1086be2acc5ec3ecbf84647a6da95e29f9c01f730db52a0d526107b8511"} Feb 02 10:51:26 crc kubenswrapper[4909]: I0202 10:51:26.620089 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bc5db8d96-8gwft" event={"ID":"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa","Type":"ContainerStarted","Data":"367a26312055d5b54f70f855eb94b467868288a062b7990c70a3b38e3fb37df8"} Feb 02 10:51:26 crc kubenswrapper[4909]: I0202 10:51:26.620643 4909 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/barbican-api-bc5db8d96-8gwft" Feb 02 10:51:26 crc kubenswrapper[4909]: I0202 10:51:26.620734 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-bc5db8d96-8gwft" Feb 02 10:51:26 crc kubenswrapper[4909]: I0202 10:51:26.625120 4909 generic.go:334] "Generic (PLEG): container finished" podID="113b4dcd-ae8d-4c1e-af62-07441b8665a9" containerID="255c8085517d0198a016ce037cdd64428033c1a53fd6af7181d018c4060182b9" exitCode=0 Feb 02 10:51:26 crc kubenswrapper[4909]: I0202 10:51:26.625227 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" event={"ID":"113b4dcd-ae8d-4c1e-af62-07441b8665a9","Type":"ContainerDied","Data":"255c8085517d0198a016ce037cdd64428033c1a53fd6af7181d018c4060182b9"} Feb 02 10:51:26 crc kubenswrapper[4909]: I0202 10:51:26.627766 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7bb594654d-prg2q" event={"ID":"86bc749f-73e5-4bcc-8079-7c9b053e0318","Type":"ContainerStarted","Data":"0a7a142102f29fedad6139ea592267e9ca537d4410527980039f2daeffcde737"} Feb 02 10:51:26 crc kubenswrapper[4909]: I0202 10:51:26.629152 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:51:26 crc kubenswrapper[4909]: I0202 10:51:26.667682 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-54c465c874-5jkf8" podStartSLOduration=3.652305214 podStartE2EDuration="3.652305214s" podCreationTimestamp="2026-02-02 10:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:51:26.64300426 +0000 UTC m=+1212.389104995" watchObservedRunningTime="2026-02-02 10:51:26.652305214 +0000 UTC m=+1212.398405949" Feb 02 10:51:26 crc kubenswrapper[4909]: I0202 10:51:26.708676 4909 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/barbican-api-bc5db8d96-8gwft" podStartSLOduration=3.7086566039999997 podStartE2EDuration="3.708656604s" podCreationTimestamp="2026-02-02 10:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:51:26.700088121 +0000 UTC m=+1212.446188856" watchObservedRunningTime="2026-02-02 10:51:26.708656604 +0000 UTC m=+1212.454757339" Feb 02 10:51:26 crc kubenswrapper[4909]: I0202 10:51:26.734763 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7bb594654d-prg2q" podStartSLOduration=3.734746955 podStartE2EDuration="3.734746955s" podCreationTimestamp="2026-02-02 10:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:51:26.732897422 +0000 UTC m=+1212.478998157" watchObservedRunningTime="2026-02-02 10:51:26.734746955 +0000 UTC m=+1212.480847690" Feb 02 10:51:27 crc kubenswrapper[4909]: I0202 10:51:27.045993 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b681927-4a4d-4177-97ec-eee1e9d43cb8" path="/var/lib/kubelet/pods/7b681927-4a4d-4177-97ec-eee1e9d43cb8/volumes" Feb 02 10:51:27 crc kubenswrapper[4909]: I0202 10:51:27.673085 4909 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 10:51:27 crc kubenswrapper[4909]: I0202 10:51:27.673333 4909 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 10:51:27 crc kubenswrapper[4909]: I0202 10:51:27.676490 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" event={"ID":"113b4dcd-ae8d-4c1e-af62-07441b8665a9","Type":"ContainerStarted","Data":"56186756abfad7bb6caca95c951c31c1886f2be3bd236a5f0c7de3f7a6271eee"} Feb 02 10:51:27 crc kubenswrapper[4909]: I0202 10:51:27.676587 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" Feb 02 10:51:27 crc kubenswrapper[4909]: I0202 10:51:27.697332 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" podStartSLOduration=4.697313661 podStartE2EDuration="4.697313661s" podCreationTimestamp="2026-02-02 10:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:51:27.69622372 +0000 UTC m=+1213.442324455" watchObservedRunningTime="2026-02-02 10:51:27.697313661 +0000 UTC m=+1213.443414396" Feb 02 10:51:27 crc kubenswrapper[4909]: I0202 10:51:27.882180 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5d56d8dff8-fh9sw"] Feb 02 10:51:27 crc kubenswrapper[4909]: E0202 10:51:27.884181 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b681927-4a4d-4177-97ec-eee1e9d43cb8" containerName="init" Feb 02 10:51:27 crc kubenswrapper[4909]: I0202 10:51:27.884225 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b681927-4a4d-4177-97ec-eee1e9d43cb8" containerName="init" Feb 02 10:51:27 crc kubenswrapper[4909]: E0202 10:51:27.884243 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b681927-4a4d-4177-97ec-eee1e9d43cb8" containerName="dnsmasq-dns" Feb 02 10:51:27 crc kubenswrapper[4909]: I0202 10:51:27.884249 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b681927-4a4d-4177-97ec-eee1e9d43cb8" containerName="dnsmasq-dns" Feb 02 10:51:27 crc kubenswrapper[4909]: I0202 10:51:27.884675 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b681927-4a4d-4177-97ec-eee1e9d43cb8" containerName="dnsmasq-dns" Feb 02 10:51:27 crc kubenswrapper[4909]: I0202 10:51:27.892692 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5d56d8dff8-fh9sw" Feb 02 10:51:27 crc kubenswrapper[4909]: I0202 10:51:27.903917 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5d56d8dff8-fh9sw"] Feb 02 10:51:27 crc kubenswrapper[4909]: I0202 10:51:27.948900 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6c7fbb5f9b-4fbwp"] Feb 02 10:51:27 crc kubenswrapper[4909]: I0202 10:51:27.950894 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" Feb 02 10:51:27 crc kubenswrapper[4909]: I0202 10:51:27.956381 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 02 10:51:27 crc kubenswrapper[4909]: I0202 10:51:27.957047 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 02 10:51:27 crc kubenswrapper[4909]: I0202 10:51:27.974722 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6c7fbb5f9b-4fbwp"] Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.019305 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-combined-ca-bundle\") pod \"placement-5d56d8dff8-fh9sw\" (UID: \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\") " pod="openstack/placement-5d56d8dff8-fh9sw" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.020459 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-public-tls-certs\") pod \"placement-5d56d8dff8-fh9sw\" (UID: \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\") " pod="openstack/placement-5d56d8dff8-fh9sw" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.020505 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-scripts\") pod \"placement-5d56d8dff8-fh9sw\" (UID: \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\") " pod="openstack/placement-5d56d8dff8-fh9sw" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.020552 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-logs\") pod \"placement-5d56d8dff8-fh9sw\" (UID: \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\") " pod="openstack/placement-5d56d8dff8-fh9sw" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.031551 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-internal-tls-certs\") pod \"placement-5d56d8dff8-fh9sw\" (UID: \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\") " pod="openstack/placement-5d56d8dff8-fh9sw" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.031688 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krr4b\" (UniqueName: \"kubernetes.io/projected/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-kube-api-access-krr4b\") pod \"placement-5d56d8dff8-fh9sw\" (UID: \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\") " pod="openstack/placement-5d56d8dff8-fh9sw" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.031846 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-config-data\") pod \"placement-5d56d8dff8-fh9sw\" (UID: \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\") " pod="openstack/placement-5d56d8dff8-fh9sw" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.058631 4909 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.058741 4909 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.133531 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-internal-tls-certs\") pod \"placement-5d56d8dff8-fh9sw\" (UID: \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\") " pod="openstack/placement-5d56d8dff8-fh9sw" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.133588 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-internal-tls-certs\") pod \"barbican-api-6c7fbb5f9b-4fbwp\" (UID: \"04232dcc-dda5-4774-b999-5104335f2da0\") " pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.133622 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-combined-ca-bundle\") pod \"barbican-api-6c7fbb5f9b-4fbwp\" (UID: \"04232dcc-dda5-4774-b999-5104335f2da0\") " pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.133644 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krr4b\" (UniqueName: \"kubernetes.io/projected/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-kube-api-access-krr4b\") pod \"placement-5d56d8dff8-fh9sw\" (UID: \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\") " pod="openstack/placement-5d56d8dff8-fh9sw" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.133663 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04232dcc-dda5-4774-b999-5104335f2da0-logs\") pod \"barbican-api-6c7fbb5f9b-4fbwp\" (UID: \"04232dcc-dda5-4774-b999-5104335f2da0\") " pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.133684 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-config-data\") pod \"barbican-api-6c7fbb5f9b-4fbwp\" (UID: \"04232dcc-dda5-4774-b999-5104335f2da0\") " pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.133730 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-config-data\") pod \"placement-5d56d8dff8-fh9sw\" (UID: \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\") " pod="openstack/placement-5d56d8dff8-fh9sw" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.133767 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-public-tls-certs\") pod \"barbican-api-6c7fbb5f9b-4fbwp\" (UID: \"04232dcc-dda5-4774-b999-5104335f2da0\") " pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.133884 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-combined-ca-bundle\") pod \"placement-5d56d8dff8-fh9sw\" (UID: \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\") " pod="openstack/placement-5d56d8dff8-fh9sw" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.133982 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-config-data-custom\") pod \"barbican-api-6c7fbb5f9b-4fbwp\" (UID: \"04232dcc-dda5-4774-b999-5104335f2da0\") " pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.134084 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-public-tls-certs\") pod \"placement-5d56d8dff8-fh9sw\" (UID: \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\") " pod="openstack/placement-5d56d8dff8-fh9sw" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.134177 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-scripts\") pod \"placement-5d56d8dff8-fh9sw\" (UID: \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\") " pod="openstack/placement-5d56d8dff8-fh9sw" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.134226 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtjfn\" (UniqueName: \"kubernetes.io/projected/04232dcc-dda5-4774-b999-5104335f2da0-kube-api-access-rtjfn\") pod \"barbican-api-6c7fbb5f9b-4fbwp\" (UID: \"04232dcc-dda5-4774-b999-5104335f2da0\") " pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.134320 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-logs\") pod \"placement-5d56d8dff8-fh9sw\" (UID: \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\") " pod="openstack/placement-5d56d8dff8-fh9sw" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.137683 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-logs\") pod \"placement-5d56d8dff8-fh9sw\" (UID: \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\") " pod="openstack/placement-5d56d8dff8-fh9sw" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.140077 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-config-data\") pod \"placement-5d56d8dff8-fh9sw\" (UID: \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\") " pod="openstack/placement-5d56d8dff8-fh9sw" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.142050 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-combined-ca-bundle\") pod \"placement-5d56d8dff8-fh9sw\" (UID: \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\") " pod="openstack/placement-5d56d8dff8-fh9sw" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.143507 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-public-tls-certs\") pod \"placement-5d56d8dff8-fh9sw\" (UID: \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\") " pod="openstack/placement-5d56d8dff8-fh9sw" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.149222 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-scripts\") pod \"placement-5d56d8dff8-fh9sw\" (UID: \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\") " pod="openstack/placement-5d56d8dff8-fh9sw" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.152294 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-internal-tls-certs\") pod \"placement-5d56d8dff8-fh9sw\" (UID: 
\"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\") " pod="openstack/placement-5d56d8dff8-fh9sw" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.180510 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.185310 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krr4b\" (UniqueName: \"kubernetes.io/projected/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-kube-api-access-krr4b\") pod \"placement-5d56d8dff8-fh9sw\" (UID: \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\") " pod="openstack/placement-5d56d8dff8-fh9sw" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.236087 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-public-tls-certs\") pod \"barbican-api-6c7fbb5f9b-4fbwp\" (UID: \"04232dcc-dda5-4774-b999-5104335f2da0\") " pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.236139 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-config-data-custom\") pod \"barbican-api-6c7fbb5f9b-4fbwp\" (UID: \"04232dcc-dda5-4774-b999-5104335f2da0\") " pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.236225 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtjfn\" (UniqueName: \"kubernetes.io/projected/04232dcc-dda5-4774-b999-5104335f2da0-kube-api-access-rtjfn\") pod \"barbican-api-6c7fbb5f9b-4fbwp\" (UID: \"04232dcc-dda5-4774-b999-5104335f2da0\") " pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.236333 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-internal-tls-certs\") pod \"barbican-api-6c7fbb5f9b-4fbwp\" (UID: \"04232dcc-dda5-4774-b999-5104335f2da0\") " pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.236370 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-combined-ca-bundle\") pod \"barbican-api-6c7fbb5f9b-4fbwp\" (UID: \"04232dcc-dda5-4774-b999-5104335f2da0\") " pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.236402 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04232dcc-dda5-4774-b999-5104335f2da0-logs\") pod \"barbican-api-6c7fbb5f9b-4fbwp\" (UID: \"04232dcc-dda5-4774-b999-5104335f2da0\") " pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.236423 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-config-data\") pod \"barbican-api-6c7fbb5f9b-4fbwp\" (UID: \"04232dcc-dda5-4774-b999-5104335f2da0\") " pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.238411 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04232dcc-dda5-4774-b999-5104335f2da0-logs\") pod \"barbican-api-6c7fbb5f9b-4fbwp\" (UID: \"04232dcc-dda5-4774-b999-5104335f2da0\") " pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.242420 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-internal-tls-certs\") pod \"barbican-api-6c7fbb5f9b-4fbwp\" (UID: \"04232dcc-dda5-4774-b999-5104335f2da0\") " pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.245359 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-config-data-custom\") pod \"barbican-api-6c7fbb5f9b-4fbwp\" (UID: \"04232dcc-dda5-4774-b999-5104335f2da0\") " pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.245424 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-config-data\") pod \"barbican-api-6c7fbb5f9b-4fbwp\" (UID: \"04232dcc-dda5-4774-b999-5104335f2da0\") " pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.246382 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-public-tls-certs\") pod \"barbican-api-6c7fbb5f9b-4fbwp\" (UID: \"04232dcc-dda5-4774-b999-5104335f2da0\") " pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.255384 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-combined-ca-bundle\") pod \"barbican-api-6c7fbb5f9b-4fbwp\" (UID: \"04232dcc-dda5-4774-b999-5104335f2da0\") " pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.256882 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5d56d8dff8-fh9sw" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.264776 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtjfn\" (UniqueName: \"kubernetes.io/projected/04232dcc-dda5-4774-b999-5104335f2da0-kube-api-access-rtjfn\") pod \"barbican-api-6c7fbb5f9b-4fbwp\" (UID: \"04232dcc-dda5-4774-b999-5104335f2da0\") " pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.293052 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.694245 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xqjmc" event={"ID":"d68a9e4e-b453-459f-b397-9c6d7c221dda","Type":"ContainerStarted","Data":"453b20a20c9169f3f80a085b62a851af4591bf9be5cc667f73032d0f885d7114"} Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.712997 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-xqjmc" podStartSLOduration=3.52850345 podStartE2EDuration="36.712979715s" podCreationTimestamp="2026-02-02 10:50:52 +0000 UTC" firstStartedPulling="2026-02-02 10:50:53.461112139 +0000 UTC m=+1179.207212874" lastFinishedPulling="2026-02-02 10:51:26.645588404 +0000 UTC m=+1212.391689139" observedRunningTime="2026-02-02 10:51:28.711519753 +0000 UTC m=+1214.457620488" watchObservedRunningTime="2026-02-02 10:51:28.712979715 +0000 UTC m=+1214.459080450" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.863408 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 02 10:51:28 crc kubenswrapper[4909]: I0202 10:51:28.863459 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 02 10:51:29 crc kubenswrapper[4909]: I0202 
10:51:29.844791 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6c7fbb5f9b-4fbwp"] Feb 02 10:51:29 crc kubenswrapper[4909]: W0202 10:51:29.866390 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04232dcc_dda5_4774_b999_5104335f2da0.slice/crio-943333632efbfbf6251d2da0a6f3592d760b3a1be6b03002e16b1751ad7c126d WatchSource:0}: Error finding container 943333632efbfbf6251d2da0a6f3592d760b3a1be6b03002e16b1751ad7c126d: Status 404 returned error can't find the container with id 943333632efbfbf6251d2da0a6f3592d760b3a1be6b03002e16b1751ad7c126d Feb 02 10:51:30 crc kubenswrapper[4909]: I0202 10:51:30.148126 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5d56d8dff8-fh9sw"] Feb 02 10:51:30 crc kubenswrapper[4909]: I0202 10:51:30.728269 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d4cb98fc-6tg42" event={"ID":"d4589304-68d2-48c9-a691-e34a9cb4c75b","Type":"ContainerStarted","Data":"397ade4dfec5a408d839ad6fb1e26085b9b09d9f93c56399c0d7b99f36c1aded"} Feb 02 10:51:30 crc kubenswrapper[4909]: I0202 10:51:30.728726 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d4cb98fc-6tg42" event={"ID":"d4589304-68d2-48c9-a691-e34a9cb4c75b","Type":"ContainerStarted","Data":"766cdd0ecebd2c830eb9eb4207f7f04fbf9e06724a13a143f2ffa40073116503"} Feb 02 10:51:30 crc kubenswrapper[4909]: I0202 10:51:30.742653 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5d56d8dff8-fh9sw" event={"ID":"b6be13cc-01ed-441f-b2c9-dc024fcb4b18","Type":"ContainerStarted","Data":"39650f1fdde2fdbf3bd917a0f731e9a813c0c766ece71f265bc33c3787265ed1"} Feb 02 10:51:30 crc kubenswrapper[4909]: I0202 10:51:30.742702 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5d56d8dff8-fh9sw" 
event={"ID":"b6be13cc-01ed-441f-b2c9-dc024fcb4b18","Type":"ContainerStarted","Data":"ca078b1d864dcc02751079f14dd3708fe78bce7b544c8de78b4bb2d92c23e89e"} Feb 02 10:51:30 crc kubenswrapper[4909]: I0202 10:51:30.756478 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75589bd9c8-npg4p" event={"ID":"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a","Type":"ContainerStarted","Data":"b85d85d06b59a99a123c0147182bc23c0f8782f8baf6bdc50f6e3c5737f2292a"} Feb 02 10:51:30 crc kubenswrapper[4909]: I0202 10:51:30.756530 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75589bd9c8-npg4p" event={"ID":"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a","Type":"ContainerStarted","Data":"cdf003d472b80a2499008207df29bc8a7fe1a4f1fbca8fa6aa522ad599f3f1ea"} Feb 02 10:51:30 crc kubenswrapper[4909]: I0202 10:51:30.768901 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-658647b45-s5s8w" event={"ID":"ade61e3a-3fd8-40af-8891-46c51e7a9d1b","Type":"ContainerStarted","Data":"f0e6e01e183780aa6ddd90bcada53fd5724041fd1de37ec76059d32a3f84f455"} Feb 02 10:51:30 crc kubenswrapper[4909]: I0202 10:51:30.768950 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-658647b45-s5s8w" event={"ID":"ade61e3a-3fd8-40af-8891-46c51e7a9d1b","Type":"ContainerStarted","Data":"fcdf51be789bd6a6d4d7395610d676379719ae55063b0bf3dc47c6a71ded5169"} Feb 02 10:51:30 crc kubenswrapper[4909]: I0202 10:51:30.802253 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-866f4c5954-tljg7" event={"ID":"82e7f090-1c8a-419e-95da-4d6c82bcde8d","Type":"ContainerStarted","Data":"058be1382e5717fb1e57cdf585e0153bea98324cf6703943f27f9e9a5d4a650c"} Feb 02 10:51:30 crc kubenswrapper[4909]: I0202 10:51:30.802288 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-866f4c5954-tljg7" 
event={"ID":"82e7f090-1c8a-419e-95da-4d6c82bcde8d","Type":"ContainerStarted","Data":"2aac88e4f9abc70098ab49a36bed3602ad9256a56ae95b31d40125b608196fa8"} Feb 02 10:51:30 crc kubenswrapper[4909]: I0202 10:51:30.838257 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6d4cb98fc-6tg42" podStartSLOduration=3.954283447 podStartE2EDuration="7.83824074s" podCreationTimestamp="2026-02-02 10:51:23 +0000 UTC" firstStartedPulling="2026-02-02 10:51:25.416980994 +0000 UTC m=+1211.163081729" lastFinishedPulling="2026-02-02 10:51:29.300938287 +0000 UTC m=+1215.047039022" observedRunningTime="2026-02-02 10:51:30.749732988 +0000 UTC m=+1216.495833723" watchObservedRunningTime="2026-02-02 10:51:30.83824074 +0000 UTC m=+1216.584341475" Feb 02 10:51:30 crc kubenswrapper[4909]: I0202 10:51:30.851608 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" event={"ID":"04232dcc-dda5-4774-b999-5104335f2da0","Type":"ContainerStarted","Data":"806aaa0ff1451eccc6cc80605c3b759d86a7c3041d5b3685c9a6764ed0b9dda2"} Feb 02 10:51:30 crc kubenswrapper[4909]: I0202 10:51:30.851648 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" event={"ID":"04232dcc-dda5-4774-b999-5104335f2da0","Type":"ContainerStarted","Data":"7573e7b7356e01328ac6964e4639c84a155f497ab30f7801c09c8069b1ba1175"} Feb 02 10:51:30 crc kubenswrapper[4909]: I0202 10:51:30.851659 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" event={"ID":"04232dcc-dda5-4774-b999-5104335f2da0","Type":"ContainerStarted","Data":"943333632efbfbf6251d2da0a6f3592d760b3a1be6b03002e16b1751ad7c126d"} Feb 02 10:51:30 crc kubenswrapper[4909]: I0202 10:51:30.852445 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-658647b45-s5s8w"] Feb 02 10:51:30 crc kubenswrapper[4909]: I0202 10:51:30.852464 4909 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" Feb 02 10:51:30 crc kubenswrapper[4909]: I0202 10:51:30.852474 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" Feb 02 10:51:30 crc kubenswrapper[4909]: I0202 10:51:30.868203 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-75589bd9c8-npg4p" podStartSLOduration=3.887600225 podStartE2EDuration="7.868137479s" podCreationTimestamp="2026-02-02 10:51:23 +0000 UTC" firstStartedPulling="2026-02-02 10:51:25.306300862 +0000 UTC m=+1211.052401607" lastFinishedPulling="2026-02-02 10:51:29.286838126 +0000 UTC m=+1215.032938861" observedRunningTime="2026-02-02 10:51:30.789511797 +0000 UTC m=+1216.535612532" watchObservedRunningTime="2026-02-02 10:51:30.868137479 +0000 UTC m=+1216.614238214" Feb 02 10:51:30 crc kubenswrapper[4909]: I0202 10:51:30.890079 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-866f4c5954-tljg7"] Feb 02 10:51:30 crc kubenswrapper[4909]: I0202 10:51:30.896137 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-658647b45-s5s8w" podStartSLOduration=3.8614755929999998 podStartE2EDuration="7.896113103s" podCreationTimestamp="2026-02-02 10:51:23 +0000 UTC" firstStartedPulling="2026-02-02 10:51:25.284924425 +0000 UTC m=+1211.031025160" lastFinishedPulling="2026-02-02 10:51:29.319561935 +0000 UTC m=+1215.065662670" observedRunningTime="2026-02-02 10:51:30.835692688 +0000 UTC m=+1216.581793423" watchObservedRunningTime="2026-02-02 10:51:30.896113103 +0000 UTC m=+1216.642213838" Feb 02 10:51:30 crc kubenswrapper[4909]: I0202 10:51:30.913298 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-866f4c5954-tljg7" podStartSLOduration=3.909139685 podStartE2EDuration="7.91327351s" 
podCreationTimestamp="2026-02-02 10:51:23 +0000 UTC" firstStartedPulling="2026-02-02 10:51:25.290111252 +0000 UTC m=+1211.036211987" lastFinishedPulling="2026-02-02 10:51:29.294245077 +0000 UTC m=+1215.040345812" observedRunningTime="2026-02-02 10:51:30.856940161 +0000 UTC m=+1216.603040896" watchObservedRunningTime="2026-02-02 10:51:30.91327351 +0000 UTC m=+1216.659374245" Feb 02 10:51:30 crc kubenswrapper[4909]: I0202 10:51:30.922015 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" podStartSLOduration=3.9219919880000003 podStartE2EDuration="3.921991988s" podCreationTimestamp="2026-02-02 10:51:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:51:30.878179174 +0000 UTC m=+1216.624279909" watchObservedRunningTime="2026-02-02 10:51:30.921991988 +0000 UTC m=+1216.668092723" Feb 02 10:51:31 crc kubenswrapper[4909]: I0202 10:51:31.893263 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5d56d8dff8-fh9sw" event={"ID":"b6be13cc-01ed-441f-b2c9-dc024fcb4b18","Type":"ContainerStarted","Data":"a0b641eb35c253bddcb998df3a1e8bb281e92fcbf9ed7e43633609aff1d97578"} Feb 02 10:51:31 crc kubenswrapper[4909]: I0202 10:51:31.898151 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5d56d8dff8-fh9sw" Feb 02 10:51:31 crc kubenswrapper[4909]: I0202 10:51:31.937554 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5d56d8dff8-fh9sw" podStartSLOduration=4.937507897 podStartE2EDuration="4.937507897s" podCreationTimestamp="2026-02-02 10:51:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:51:31.918653542 +0000 UTC m=+1217.664754287" watchObservedRunningTime="2026-02-02 10:51:31.937507897 +0000 UTC 
m=+1217.683608632" Feb 02 10:51:32 crc kubenswrapper[4909]: I0202 10:51:32.901638 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-866f4c5954-tljg7" podUID="82e7f090-1c8a-419e-95da-4d6c82bcde8d" containerName="barbican-keystone-listener-log" containerID="cri-o://2aac88e4f9abc70098ab49a36bed3602ad9256a56ae95b31d40125b608196fa8" gracePeriod=30 Feb 02 10:51:32 crc kubenswrapper[4909]: I0202 10:51:32.902099 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-658647b45-s5s8w" podUID="ade61e3a-3fd8-40af-8891-46c51e7a9d1b" containerName="barbican-worker-log" containerID="cri-o://fcdf51be789bd6a6d4d7395610d676379719ae55063b0bf3dc47c6a71ded5169" gracePeriod=30 Feb 02 10:51:32 crc kubenswrapper[4909]: I0202 10:51:32.902148 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-866f4c5954-tljg7" podUID="82e7f090-1c8a-419e-95da-4d6c82bcde8d" containerName="barbican-keystone-listener" containerID="cri-o://058be1382e5717fb1e57cdf585e0153bea98324cf6703943f27f9e9a5d4a650c" gracePeriod=30 Feb 02 10:51:32 crc kubenswrapper[4909]: I0202 10:51:32.902714 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5d56d8dff8-fh9sw" Feb 02 10:51:32 crc kubenswrapper[4909]: I0202 10:51:32.902751 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-658647b45-s5s8w" podUID="ade61e3a-3fd8-40af-8891-46c51e7a9d1b" containerName="barbican-worker" containerID="cri-o://f0e6e01e183780aa6ddd90bcada53fd5724041fd1de37ec76059d32a3f84f455" gracePeriod=30 Feb 02 10:51:33 crc kubenswrapper[4909]: I0202 10:51:33.912531 4909 generic.go:334] "Generic (PLEG): container finished" podID="82e7f090-1c8a-419e-95da-4d6c82bcde8d" containerID="058be1382e5717fb1e57cdf585e0153bea98324cf6703943f27f9e9a5d4a650c" exitCode=0 Feb 02 10:51:33 crc 
kubenswrapper[4909]: I0202 10:51:33.912927 4909 generic.go:334] "Generic (PLEG): container finished" podID="82e7f090-1c8a-419e-95da-4d6c82bcde8d" containerID="2aac88e4f9abc70098ab49a36bed3602ad9256a56ae95b31d40125b608196fa8" exitCode=143 Feb 02 10:51:33 crc kubenswrapper[4909]: I0202 10:51:33.912736 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-866f4c5954-tljg7" event={"ID":"82e7f090-1c8a-419e-95da-4d6c82bcde8d","Type":"ContainerDied","Data":"058be1382e5717fb1e57cdf585e0153bea98324cf6703943f27f9e9a5d4a650c"} Feb 02 10:51:33 crc kubenswrapper[4909]: I0202 10:51:33.913010 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-866f4c5954-tljg7" event={"ID":"82e7f090-1c8a-419e-95da-4d6c82bcde8d","Type":"ContainerDied","Data":"2aac88e4f9abc70098ab49a36bed3602ad9256a56ae95b31d40125b608196fa8"} Feb 02 10:51:33 crc kubenswrapper[4909]: I0202 10:51:33.915279 4909 generic.go:334] "Generic (PLEG): container finished" podID="d68a9e4e-b453-459f-b397-9c6d7c221dda" containerID="453b20a20c9169f3f80a085b62a851af4591bf9be5cc667f73032d0f885d7114" exitCode=0 Feb 02 10:51:33 crc kubenswrapper[4909]: I0202 10:51:33.915324 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xqjmc" event={"ID":"d68a9e4e-b453-459f-b397-9c6d7c221dda","Type":"ContainerDied","Data":"453b20a20c9169f3f80a085b62a851af4591bf9be5cc667f73032d0f885d7114"} Feb 02 10:51:33 crc kubenswrapper[4909]: I0202 10:51:33.917642 4909 generic.go:334] "Generic (PLEG): container finished" podID="ade61e3a-3fd8-40af-8891-46c51e7a9d1b" containerID="f0e6e01e183780aa6ddd90bcada53fd5724041fd1de37ec76059d32a3f84f455" exitCode=0 Feb 02 10:51:33 crc kubenswrapper[4909]: I0202 10:51:33.917665 4909 generic.go:334] "Generic (PLEG): container finished" podID="ade61e3a-3fd8-40af-8891-46c51e7a9d1b" containerID="fcdf51be789bd6a6d4d7395610d676379719ae55063b0bf3dc47c6a71ded5169" exitCode=143 Feb 02 10:51:33 crc 
kubenswrapper[4909]: I0202 10:51:33.917686 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-658647b45-s5s8w" event={"ID":"ade61e3a-3fd8-40af-8891-46c51e7a9d1b","Type":"ContainerDied","Data":"f0e6e01e183780aa6ddd90bcada53fd5724041fd1de37ec76059d32a3f84f455"} Feb 02 10:51:33 crc kubenswrapper[4909]: I0202 10:51:33.917769 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-658647b45-s5s8w" event={"ID":"ade61e3a-3fd8-40af-8891-46c51e7a9d1b","Type":"ContainerDied","Data":"fcdf51be789bd6a6d4d7395610d676379719ae55063b0bf3dc47c6a71ded5169"} Feb 02 10:51:34 crc kubenswrapper[4909]: I0202 10:51:34.169035 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" Feb 02 10:51:34 crc kubenswrapper[4909]: I0202 10:51:34.243082 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69c85d5ff7-sgcx6"] Feb 02 10:51:34 crc kubenswrapper[4909]: I0202 10:51:34.246180 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" podUID="19d20c70-a055-4ecf-b593-95697717de45" containerName="dnsmasq-dns" containerID="cri-o://d27bc9973e507de4ff445be7075821941e661aaf7dacfe8ec4a1ac24d6405616" gracePeriod=10 Feb 02 10:51:34 crc kubenswrapper[4909]: I0202 10:51:34.943507 4909 generic.go:334] "Generic (PLEG): container finished" podID="19d20c70-a055-4ecf-b593-95697717de45" containerID="d27bc9973e507de4ff445be7075821941e661aaf7dacfe8ec4a1ac24d6405616" exitCode=0 Feb 02 10:51:34 crc kubenswrapper[4909]: I0202 10:51:34.943590 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" event={"ID":"19d20c70-a055-4ecf-b593-95697717de45","Type":"ContainerDied","Data":"d27bc9973e507de4ff445be7075821941e661aaf7dacfe8ec4a1ac24d6405616"} Feb 02 10:51:36 crc kubenswrapper[4909]: I0202 10:51:36.090318 4909 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/barbican-api-bc5db8d96-8gwft" Feb 02 10:51:36 crc kubenswrapper[4909]: I0202 10:51:36.209706 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-bc5db8d96-8gwft" Feb 02 10:51:36 crc kubenswrapper[4909]: I0202 10:51:36.410881 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xqjmc" Feb 02 10:51:36 crc kubenswrapper[4909]: I0202 10:51:36.520758 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5q7g4\" (UniqueName: \"kubernetes.io/projected/d68a9e4e-b453-459f-b397-9c6d7c221dda-kube-api-access-5q7g4\") pod \"d68a9e4e-b453-459f-b397-9c6d7c221dda\" (UID: \"d68a9e4e-b453-459f-b397-9c6d7c221dda\") " Feb 02 10:51:36 crc kubenswrapper[4909]: I0202 10:51:36.520834 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d68a9e4e-b453-459f-b397-9c6d7c221dda-db-sync-config-data\") pod \"d68a9e4e-b453-459f-b397-9c6d7c221dda\" (UID: \"d68a9e4e-b453-459f-b397-9c6d7c221dda\") " Feb 02 10:51:36 crc kubenswrapper[4909]: I0202 10:51:36.520903 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d68a9e4e-b453-459f-b397-9c6d7c221dda-config-data\") pod \"d68a9e4e-b453-459f-b397-9c6d7c221dda\" (UID: \"d68a9e4e-b453-459f-b397-9c6d7c221dda\") " Feb 02 10:51:36 crc kubenswrapper[4909]: I0202 10:51:36.520950 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d68a9e4e-b453-459f-b397-9c6d7c221dda-combined-ca-bundle\") pod \"d68a9e4e-b453-459f-b397-9c6d7c221dda\" (UID: \"d68a9e4e-b453-459f-b397-9c6d7c221dda\") " Feb 02 10:51:36 crc kubenswrapper[4909]: I0202 10:51:36.521014 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d68a9e4e-b453-459f-b397-9c6d7c221dda-scripts\") pod \"d68a9e4e-b453-459f-b397-9c6d7c221dda\" (UID: \"d68a9e4e-b453-459f-b397-9c6d7c221dda\") " Feb 02 10:51:36 crc kubenswrapper[4909]: I0202 10:51:36.521080 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d68a9e4e-b453-459f-b397-9c6d7c221dda-etc-machine-id\") pod \"d68a9e4e-b453-459f-b397-9c6d7c221dda\" (UID: \"d68a9e4e-b453-459f-b397-9c6d7c221dda\") " Feb 02 10:51:36 crc kubenswrapper[4909]: I0202 10:51:36.521486 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d68a9e4e-b453-459f-b397-9c6d7c221dda-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d68a9e4e-b453-459f-b397-9c6d7c221dda" (UID: "d68a9e4e-b453-459f-b397-9c6d7c221dda"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:51:36 crc kubenswrapper[4909]: I0202 10:51:36.529118 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d68a9e4e-b453-459f-b397-9c6d7c221dda-kube-api-access-5q7g4" (OuterVolumeSpecName: "kube-api-access-5q7g4") pod "d68a9e4e-b453-459f-b397-9c6d7c221dda" (UID: "d68a9e4e-b453-459f-b397-9c6d7c221dda"). InnerVolumeSpecName "kube-api-access-5q7g4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:36 crc kubenswrapper[4909]: I0202 10:51:36.529826 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d68a9e4e-b453-459f-b397-9c6d7c221dda-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d68a9e4e-b453-459f-b397-9c6d7c221dda" (UID: "d68a9e4e-b453-459f-b397-9c6d7c221dda"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:36 crc kubenswrapper[4909]: I0202 10:51:36.543993 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d68a9e4e-b453-459f-b397-9c6d7c221dda-scripts" (OuterVolumeSpecName: "scripts") pod "d68a9e4e-b453-459f-b397-9c6d7c221dda" (UID: "d68a9e4e-b453-459f-b397-9c6d7c221dda"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:36 crc kubenswrapper[4909]: I0202 10:51:36.555244 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d68a9e4e-b453-459f-b397-9c6d7c221dda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d68a9e4e-b453-459f-b397-9c6d7c221dda" (UID: "d68a9e4e-b453-459f-b397-9c6d7c221dda"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:36 crc kubenswrapper[4909]: I0202 10:51:36.580039 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d68a9e4e-b453-459f-b397-9c6d7c221dda-config-data" (OuterVolumeSpecName: "config-data") pod "d68a9e4e-b453-459f-b397-9c6d7c221dda" (UID: "d68a9e4e-b453-459f-b397-9c6d7c221dda"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:36 crc kubenswrapper[4909]: I0202 10:51:36.623785 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5q7g4\" (UniqueName: \"kubernetes.io/projected/d68a9e4e-b453-459f-b397-9c6d7c221dda-kube-api-access-5q7g4\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:36 crc kubenswrapper[4909]: I0202 10:51:36.623835 4909 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d68a9e4e-b453-459f-b397-9c6d7c221dda-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:36 crc kubenswrapper[4909]: I0202 10:51:36.623845 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d68a9e4e-b453-459f-b397-9c6d7c221dda-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:36 crc kubenswrapper[4909]: I0202 10:51:36.623854 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d68a9e4e-b453-459f-b397-9c6d7c221dda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:36 crc kubenswrapper[4909]: I0202 10:51:36.623862 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d68a9e4e-b453-459f-b397-9c6d7c221dda-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:36 crc kubenswrapper[4909]: I0202 10:51:36.623870 4909 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d68a9e4e-b453-459f-b397-9c6d7c221dda-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:36 crc kubenswrapper[4909]: I0202 10:51:36.964395 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-xqjmc" Feb 02 10:51:36 crc kubenswrapper[4909]: I0202 10:51:36.971870 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xqjmc" event={"ID":"d68a9e4e-b453-459f-b397-9c6d7c221dda","Type":"ContainerDied","Data":"43d7baa7d79263b0fb9547aab2fd421f5505c60aaf5047133ad8dcac3587eff1"} Feb 02 10:51:36 crc kubenswrapper[4909]: I0202 10:51:36.971906 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43d7baa7d79263b0fb9547aab2fd421f5505c60aaf5047133ad8dcac3587eff1" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.756855 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 10:51:37 crc kubenswrapper[4909]: E0202 10:51:37.757618 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d68a9e4e-b453-459f-b397-9c6d7c221dda" containerName="cinder-db-sync" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.757634 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d68a9e4e-b453-459f-b397-9c6d7c221dda" containerName="cinder-db-sync" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.757949 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d68a9e4e-b453-459f-b397-9c6d7c221dda" containerName="cinder-db-sync" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.759112 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.762352 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-hsdjf" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.762631 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.763466 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.763770 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.796196 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.830250 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6dc67df487-rl62l"] Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.832467 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dc67df487-rl62l" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.847639 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-config-data\") pod \"cinder-scheduler-0\" (UID: \"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.847707 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.847729 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-config\") pod \"dnsmasq-dns-6dc67df487-rl62l\" (UID: \"17452cea-74ae-4e13-8369-431a2062addc\") " pod="openstack/dnsmasq-dns-6dc67df487-rl62l" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.847751 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-dns-swift-storage-0\") pod \"dnsmasq-dns-6dc67df487-rl62l\" (UID: \"17452cea-74ae-4e13-8369-431a2062addc\") " pod="openstack/dnsmasq-dns-6dc67df487-rl62l" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.847779 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j477p\" (UniqueName: \"kubernetes.io/projected/17452cea-74ae-4e13-8369-431a2062addc-kube-api-access-j477p\") pod \"dnsmasq-dns-6dc67df487-rl62l\" (UID: 
\"17452cea-74ae-4e13-8369-431a2062addc\") " pod="openstack/dnsmasq-dns-6dc67df487-rl62l" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.847818 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-ovsdbserver-sb\") pod \"dnsmasq-dns-6dc67df487-rl62l\" (UID: \"17452cea-74ae-4e13-8369-431a2062addc\") " pod="openstack/dnsmasq-dns-6dc67df487-rl62l" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.847841 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-ovsdbserver-nb\") pod \"dnsmasq-dns-6dc67df487-rl62l\" (UID: \"17452cea-74ae-4e13-8369-431a2062addc\") " pod="openstack/dnsmasq-dns-6dc67df487-rl62l" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.847858 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.847890 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.847919 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-dns-svc\") pod \"dnsmasq-dns-6dc67df487-rl62l\" (UID: 
\"17452cea-74ae-4e13-8369-431a2062addc\") " pod="openstack/dnsmasq-dns-6dc67df487-rl62l" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.847939 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-scripts\") pod \"cinder-scheduler-0\" (UID: \"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.847954 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p282m\" (UniqueName: \"kubernetes.io/projected/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-kube-api-access-p282m\") pod \"cinder-scheduler-0\" (UID: \"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.857730 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dc67df487-rl62l"] Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.956438 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j477p\" (UniqueName: \"kubernetes.io/projected/17452cea-74ae-4e13-8369-431a2062addc-kube-api-access-j477p\") pod \"dnsmasq-dns-6dc67df487-rl62l\" (UID: \"17452cea-74ae-4e13-8369-431a2062addc\") " pod="openstack/dnsmasq-dns-6dc67df487-rl62l" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.956498 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-ovsdbserver-sb\") pod \"dnsmasq-dns-6dc67df487-rl62l\" (UID: \"17452cea-74ae-4e13-8369-431a2062addc\") " pod="openstack/dnsmasq-dns-6dc67df487-rl62l" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.956541 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-ovsdbserver-nb\") pod \"dnsmasq-dns-6dc67df487-rl62l\" (UID: \"17452cea-74ae-4e13-8369-431a2062addc\") " pod="openstack/dnsmasq-dns-6dc67df487-rl62l" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.956564 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.956619 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.956656 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-dns-svc\") pod \"dnsmasq-dns-6dc67df487-rl62l\" (UID: \"17452cea-74ae-4e13-8369-431a2062addc\") " pod="openstack/dnsmasq-dns-6dc67df487-rl62l" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.956685 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-scripts\") pod \"cinder-scheduler-0\" (UID: \"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.956707 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p282m\" (UniqueName: \"kubernetes.io/projected/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-kube-api-access-p282m\") pod \"cinder-scheduler-0\" (UID: 
\"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.956775 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-config-data\") pod \"cinder-scheduler-0\" (UID: \"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.956878 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.956906 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-config\") pod \"dnsmasq-dns-6dc67df487-rl62l\" (UID: \"17452cea-74ae-4e13-8369-431a2062addc\") " pod="openstack/dnsmasq-dns-6dc67df487-rl62l" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.956934 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-dns-swift-storage-0\") pod \"dnsmasq-dns-6dc67df487-rl62l\" (UID: \"17452cea-74ae-4e13-8369-431a2062addc\") " pod="openstack/dnsmasq-dns-6dc67df487-rl62l" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.957614 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-ovsdbserver-sb\") pod \"dnsmasq-dns-6dc67df487-rl62l\" (UID: \"17452cea-74ae-4e13-8369-431a2062addc\") " pod="openstack/dnsmasq-dns-6dc67df487-rl62l" Feb 02 10:51:37 crc 
kubenswrapper[4909]: I0202 10:51:37.958158 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.961739 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-dns-swift-storage-0\") pod \"dnsmasq-dns-6dc67df487-rl62l\" (UID: \"17452cea-74ae-4e13-8369-431a2062addc\") " pod="openstack/dnsmasq-dns-6dc67df487-rl62l" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.964353 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-dns-svc\") pod \"dnsmasq-dns-6dc67df487-rl62l\" (UID: \"17452cea-74ae-4e13-8369-431a2062addc\") " pod="openstack/dnsmasq-dns-6dc67df487-rl62l" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.964466 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-config\") pod \"dnsmasq-dns-6dc67df487-rl62l\" (UID: \"17452cea-74ae-4e13-8369-431a2062addc\") " pod="openstack/dnsmasq-dns-6dc67df487-rl62l" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.964761 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-ovsdbserver-nb\") pod \"dnsmasq-dns-6dc67df487-rl62l\" (UID: \"17452cea-74ae-4e13-8369-431a2062addc\") " pod="openstack/dnsmasq-dns-6dc67df487-rl62l" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.981885 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.981953 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-scripts\") pod \"cinder-scheduler-0\" (UID: \"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.982188 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.985139 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-config-data\") pod \"cinder-scheduler-0\" (UID: \"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.985670 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p282m\" (UniqueName: \"kubernetes.io/projected/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-kube-api-access-p282m\") pod \"cinder-scheduler-0\" (UID: \"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:37 crc kubenswrapper[4909]: I0202 10:51:37.997572 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j477p\" (UniqueName: \"kubernetes.io/projected/17452cea-74ae-4e13-8369-431a2062addc-kube-api-access-j477p\") pod \"dnsmasq-dns-6dc67df487-rl62l\" (UID: 
\"17452cea-74ae-4e13-8369-431a2062addc\") " pod="openstack/dnsmasq-dns-6dc67df487-rl62l" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.021701 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-866f4c5954-tljg7" event={"ID":"82e7f090-1c8a-419e-95da-4d6c82bcde8d","Type":"ContainerDied","Data":"c9a181f009668f93fba5b865d2843aa04914cf2e8eafcad2dd22ad44112a2191"} Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.021773 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9a181f009668f93fba5b865d2843aa04914cf2e8eafcad2dd22ad44112a2191" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.064585 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.066711 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.075677 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.079390 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.079968 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.087827 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-866f4c5954-tljg7" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.150938 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dc67df487-rl62l" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.164265 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5sph\" (UniqueName: \"kubernetes.io/projected/82e7f090-1c8a-419e-95da-4d6c82bcde8d-kube-api-access-k5sph\") pod \"82e7f090-1c8a-419e-95da-4d6c82bcde8d\" (UID: \"82e7f090-1c8a-419e-95da-4d6c82bcde8d\") " Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.164423 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82e7f090-1c8a-419e-95da-4d6c82bcde8d-logs\") pod \"82e7f090-1c8a-419e-95da-4d6c82bcde8d\" (UID: \"82e7f090-1c8a-419e-95da-4d6c82bcde8d\") " Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.164507 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82e7f090-1c8a-419e-95da-4d6c82bcde8d-config-data\") pod \"82e7f090-1c8a-419e-95da-4d6c82bcde8d\" (UID: \"82e7f090-1c8a-419e-95da-4d6c82bcde8d\") " Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.164547 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82e7f090-1c8a-419e-95da-4d6c82bcde8d-config-data-custom\") pod \"82e7f090-1c8a-419e-95da-4d6c82bcde8d\" (UID: \"82e7f090-1c8a-419e-95da-4d6c82bcde8d\") " Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.164585 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e7f090-1c8a-419e-95da-4d6c82bcde8d-combined-ca-bundle\") pod \"82e7f090-1c8a-419e-95da-4d6c82bcde8d\" (UID: \"82e7f090-1c8a-419e-95da-4d6c82bcde8d\") " Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.165515 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e462e484-c2ad-4d5f-85b8-7042663617ff-config-data-custom\") pod \"cinder-api-0\" (UID: \"e462e484-c2ad-4d5f-85b8-7042663617ff\") " pod="openstack/cinder-api-0" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.165562 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ldlw\" (UniqueName: \"kubernetes.io/projected/e462e484-c2ad-4d5f-85b8-7042663617ff-kube-api-access-9ldlw\") pod \"cinder-api-0\" (UID: \"e462e484-c2ad-4d5f-85b8-7042663617ff\") " pod="openstack/cinder-api-0" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.165581 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e462e484-c2ad-4d5f-85b8-7042663617ff-scripts\") pod \"cinder-api-0\" (UID: \"e462e484-c2ad-4d5f-85b8-7042663617ff\") " pod="openstack/cinder-api-0" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.165660 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e462e484-c2ad-4d5f-85b8-7042663617ff-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e462e484-c2ad-4d5f-85b8-7042663617ff\") " pod="openstack/cinder-api-0" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.165722 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e462e484-c2ad-4d5f-85b8-7042663617ff-config-data\") pod \"cinder-api-0\" (UID: \"e462e484-c2ad-4d5f-85b8-7042663617ff\") " pod="openstack/cinder-api-0" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.165800 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e462e484-c2ad-4d5f-85b8-7042663617ff-logs\") pod \"cinder-api-0\" (UID: 
\"e462e484-c2ad-4d5f-85b8-7042663617ff\") " pod="openstack/cinder-api-0" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.165859 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e462e484-c2ad-4d5f-85b8-7042663617ff-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e462e484-c2ad-4d5f-85b8-7042663617ff\") " pod="openstack/cinder-api-0" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.167845 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82e7f090-1c8a-419e-95da-4d6c82bcde8d-logs" (OuterVolumeSpecName: "logs") pod "82e7f090-1c8a-419e-95da-4d6c82bcde8d" (UID: "82e7f090-1c8a-419e-95da-4d6c82bcde8d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.190334 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82e7f090-1c8a-419e-95da-4d6c82bcde8d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "82e7f090-1c8a-419e-95da-4d6c82bcde8d" (UID: "82e7f090-1c8a-419e-95da-4d6c82bcde8d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.190436 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82e7f090-1c8a-419e-95da-4d6c82bcde8d-kube-api-access-k5sph" (OuterVolumeSpecName: "kube-api-access-k5sph") pod "82e7f090-1c8a-419e-95da-4d6c82bcde8d" (UID: "82e7f090-1c8a-419e-95da-4d6c82bcde8d"). InnerVolumeSpecName "kube-api-access-k5sph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.205379 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82e7f090-1c8a-419e-95da-4d6c82bcde8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82e7f090-1c8a-419e-95da-4d6c82bcde8d" (UID: "82e7f090-1c8a-419e-95da-4d6c82bcde8d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.229514 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82e7f090-1c8a-419e-95da-4d6c82bcde8d-config-data" (OuterVolumeSpecName: "config-data") pod "82e7f090-1c8a-419e-95da-4d6c82bcde8d" (UID: "82e7f090-1c8a-419e-95da-4d6c82bcde8d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.267632 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e462e484-c2ad-4d5f-85b8-7042663617ff-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e462e484-c2ad-4d5f-85b8-7042663617ff\") " pod="openstack/cinder-api-0" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.267953 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e462e484-c2ad-4d5f-85b8-7042663617ff-config-data\") pod \"cinder-api-0\" (UID: \"e462e484-c2ad-4d5f-85b8-7042663617ff\") " pod="openstack/cinder-api-0" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.268584 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e462e484-c2ad-4d5f-85b8-7042663617ff-logs\") pod \"cinder-api-0\" (UID: \"e462e484-c2ad-4d5f-85b8-7042663617ff\") " pod="openstack/cinder-api-0" Feb 02 10:51:38 crc 
kubenswrapper[4909]: I0202 10:51:38.268791 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e462e484-c2ad-4d5f-85b8-7042663617ff-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e462e484-c2ad-4d5f-85b8-7042663617ff\") " pod="openstack/cinder-api-0" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.268971 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e462e484-c2ad-4d5f-85b8-7042663617ff-config-data-custom\") pod \"cinder-api-0\" (UID: \"e462e484-c2ad-4d5f-85b8-7042663617ff\") " pod="openstack/cinder-api-0" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.269093 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ldlw\" (UniqueName: \"kubernetes.io/projected/e462e484-c2ad-4d5f-85b8-7042663617ff-kube-api-access-9ldlw\") pod \"cinder-api-0\" (UID: \"e462e484-c2ad-4d5f-85b8-7042663617ff\") " pod="openstack/cinder-api-0" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.269219 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e462e484-c2ad-4d5f-85b8-7042663617ff-scripts\") pod \"cinder-api-0\" (UID: \"e462e484-c2ad-4d5f-85b8-7042663617ff\") " pod="openstack/cinder-api-0" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.269390 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5sph\" (UniqueName: \"kubernetes.io/projected/82e7f090-1c8a-419e-95da-4d6c82bcde8d-kube-api-access-k5sph\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.269486 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82e7f090-1c8a-419e-95da-4d6c82bcde8d-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 
10:51:38.269587 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82e7f090-1c8a-419e-95da-4d6c82bcde8d-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.269691 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82e7f090-1c8a-419e-95da-4d6c82bcde8d-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.269785 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82e7f090-1c8a-419e-95da-4d6c82bcde8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.273282 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e462e484-c2ad-4d5f-85b8-7042663617ff-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e462e484-c2ad-4d5f-85b8-7042663617ff\") " pod="openstack/cinder-api-0" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.267898 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e462e484-c2ad-4d5f-85b8-7042663617ff-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e462e484-c2ad-4d5f-85b8-7042663617ff\") " pod="openstack/cinder-api-0" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.273558 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e462e484-c2ad-4d5f-85b8-7042663617ff-logs\") pod \"cinder-api-0\" (UID: \"e462e484-c2ad-4d5f-85b8-7042663617ff\") " pod="openstack/cinder-api-0" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.276612 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e462e484-c2ad-4d5f-85b8-7042663617ff-config-data\") pod \"cinder-api-0\" (UID: \"e462e484-c2ad-4d5f-85b8-7042663617ff\") " pod="openstack/cinder-api-0" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.277068 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e462e484-c2ad-4d5f-85b8-7042663617ff-scripts\") pod \"cinder-api-0\" (UID: \"e462e484-c2ad-4d5f-85b8-7042663617ff\") " pod="openstack/cinder-api-0" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.277348 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e462e484-c2ad-4d5f-85b8-7042663617ff-config-data-custom\") pod \"cinder-api-0\" (UID: \"e462e484-c2ad-4d5f-85b8-7042663617ff\") " pod="openstack/cinder-api-0" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.291779 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ldlw\" (UniqueName: \"kubernetes.io/projected/e462e484-c2ad-4d5f-85b8-7042663617ff-kube-api-access-9ldlw\") pod \"cinder-api-0\" (UID: \"e462e484-c2ad-4d5f-85b8-7042663617ff\") " pod="openstack/cinder-api-0" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.396731 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.883348 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-658647b45-s5s8w" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.887891 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.982618 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-ovsdbserver-nb\") pod \"19d20c70-a055-4ecf-b593-95697717de45\" (UID: \"19d20c70-a055-4ecf-b593-95697717de45\") " Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.982688 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-config-data\") pod \"ade61e3a-3fd8-40af-8891-46c51e7a9d1b\" (UID: \"ade61e3a-3fd8-40af-8891-46c51e7a9d1b\") " Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.982848 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trj2m\" (UniqueName: \"kubernetes.io/projected/19d20c70-a055-4ecf-b593-95697717de45-kube-api-access-trj2m\") pod \"19d20c70-a055-4ecf-b593-95697717de45\" (UID: \"19d20c70-a055-4ecf-b593-95697717de45\") " Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.982887 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-config\") pod \"19d20c70-a055-4ecf-b593-95697717de45\" (UID: \"19d20c70-a055-4ecf-b593-95697717de45\") " Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.982924 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-config-data-custom\") pod \"ade61e3a-3fd8-40af-8891-46c51e7a9d1b\" (UID: \"ade61e3a-3fd8-40af-8891-46c51e7a9d1b\") " Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.982959 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-ovsdbserver-sb\") pod \"19d20c70-a055-4ecf-b593-95697717de45\" (UID: \"19d20c70-a055-4ecf-b593-95697717de45\") " Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.982988 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-logs\") pod \"ade61e3a-3fd8-40af-8891-46c51e7a9d1b\" (UID: \"ade61e3a-3fd8-40af-8891-46c51e7a9d1b\") " Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.983054 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-combined-ca-bundle\") pod \"ade61e3a-3fd8-40af-8891-46c51e7a9d1b\" (UID: \"ade61e3a-3fd8-40af-8891-46c51e7a9d1b\") " Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.983091 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m45nc\" (UniqueName: \"kubernetes.io/projected/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-kube-api-access-m45nc\") pod \"ade61e3a-3fd8-40af-8891-46c51e7a9d1b\" (UID: \"ade61e3a-3fd8-40af-8891-46c51e7a9d1b\") " Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.983170 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-dns-svc\") pod \"19d20c70-a055-4ecf-b593-95697717de45\" (UID: \"19d20c70-a055-4ecf-b593-95697717de45\") " Feb 02 10:51:38 crc kubenswrapper[4909]: I0202 10:51:38.983212 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-dns-swift-storage-0\") pod \"19d20c70-a055-4ecf-b593-95697717de45\" (UID: \"19d20c70-a055-4ecf-b593-95697717de45\") " Feb 02 10:51:38 crc kubenswrapper[4909]: 
I0202 10:51:38.984605 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-logs" (OuterVolumeSpecName: "logs") pod "ade61e3a-3fd8-40af-8891-46c51e7a9d1b" (UID: "ade61e3a-3fd8-40af-8891-46c51e7a9d1b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.008992 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19d20c70-a055-4ecf-b593-95697717de45-kube-api-access-trj2m" (OuterVolumeSpecName: "kube-api-access-trj2m") pod "19d20c70-a055-4ecf-b593-95697717de45" (UID: "19d20c70-a055-4ecf-b593-95697717de45"). InnerVolumeSpecName "kube-api-access-trj2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.009446 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-kube-api-access-m45nc" (OuterVolumeSpecName: "kube-api-access-m45nc") pod "ade61e3a-3fd8-40af-8891-46c51e7a9d1b" (UID: "ade61e3a-3fd8-40af-8891-46c51e7a9d1b"). InnerVolumeSpecName "kube-api-access-m45nc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.024414 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ade61e3a-3fd8-40af-8891-46c51e7a9d1b" (UID: "ade61e3a-3fd8-40af-8891-46c51e7a9d1b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.058962 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-658647b45-s5s8w" Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.061824 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-866f4c5954-tljg7" Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.062998 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.099471 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m45nc\" (UniqueName: \"kubernetes.io/projected/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-kube-api-access-m45nc\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.101443 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trj2m\" (UniqueName: \"kubernetes.io/projected/19d20c70-a055-4ecf-b593-95697717de45-kube-api-access-trj2m\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.101459 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.101468 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.172739 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "19d20c70-a055-4ecf-b593-95697717de45" (UID: "19d20c70-a055-4ecf-b593-95697717de45"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.181489 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ade61e3a-3fd8-40af-8891-46c51e7a9d1b" (UID: "ade61e3a-3fd8-40af-8891-46c51e7a9d1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.182427 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-config" (OuterVolumeSpecName: "config") pod "19d20c70-a055-4ecf-b593-95697717de45" (UID: "19d20c70-a055-4ecf-b593-95697717de45"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.196678 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "19d20c70-a055-4ecf-b593-95697717de45" (UID: "19d20c70-a055-4ecf-b593-95697717de45"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.197239 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "19d20c70-a055-4ecf-b593-95697717de45" (UID: "19d20c70-a055-4ecf-b593-95697717de45"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.211188 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.211224 4909 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.211233 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.211244 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.211255 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.231417 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "19d20c70-a055-4ecf-b593-95697717de45" (UID: "19d20c70-a055-4ecf-b593-95697717de45"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.315102 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/19d20c70-a055-4ecf-b593-95697717de45-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.317026 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-config-data" (OuterVolumeSpecName: "config-data") pod "ade61e3a-3fd8-40af-8891-46c51e7a9d1b" (UID: "ade61e3a-3fd8-40af-8891-46c51e7a9d1b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.385858 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-658647b45-s5s8w" event={"ID":"ade61e3a-3fd8-40af-8891-46c51e7a9d1b","Type":"ContainerDied","Data":"40b2b657e93c64f51fc572ce95f28496a337ccfd3b9acbdaee96dfe00e5bc40e"} Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.385902 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" event={"ID":"19d20c70-a055-4ecf-b593-95697717de45","Type":"ContainerDied","Data":"7c3f59a930d809d2a9aa9b11e44559181d0a3f6b11ddd97283ccf32cb4b64799"} Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.385926 4909 scope.go:117] "RemoveContainer" containerID="f0e6e01e183780aa6ddd90bcada53fd5724041fd1de37ec76059d32a3f84f455" Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.416691 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ade61e3a-3fd8-40af-8891-46c51e7a9d1b-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.545214 4909 scope.go:117] "RemoveContainer" containerID="fcdf51be789bd6a6d4d7395610d676379719ae55063b0bf3dc47c6a71ded5169" Feb 
02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.573303 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69c85d5ff7-sgcx6"] Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.581239 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69c85d5ff7-sgcx6"] Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.587467 4909 scope.go:117] "RemoveContainer" containerID="d27bc9973e507de4ff445be7075821941e661aaf7dacfe8ec4a1ac24d6405616" Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.599708 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-866f4c5954-tljg7"] Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.616000 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-866f4c5954-tljg7"] Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.624552 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-658647b45-s5s8w"] Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.632973 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-658647b45-s5s8w"] Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.641006 4909 scope.go:117] "RemoveContainer" containerID="2d1e6a3040ccc31833bc3c562714f9ba5294b8c70b6272aa234f6df8b8d99c0e" Feb 02 10:51:39 crc kubenswrapper[4909]: E0202 10:51:39.641218 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="9b4235b4-44a0-4238-9b70-ad3ea946f729" Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.719068 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.728608 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-scheduler-0"] Feb 02 10:51:39 crc kubenswrapper[4909]: I0202 10:51:39.834877 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dc67df487-rl62l"] Feb 02 10:51:40 crc kubenswrapper[4909]: I0202 10:51:40.074429 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e462e484-c2ad-4d5f-85b8-7042663617ff","Type":"ContainerStarted","Data":"f0ad5cc9bb28005692be645cc967c4db160826a751f87b117252da1ad7fc6679"} Feb 02 10:51:40 crc kubenswrapper[4909]: I0202 10:51:40.075675 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" Feb 02 10:51:40 crc kubenswrapper[4909]: I0202 10:51:40.080238 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b4235b4-44a0-4238-9b70-ad3ea946f729","Type":"ContainerStarted","Data":"3d6d889cf5654de8573407f2359ea91d742a8ba74f142c89469ff361282b9916"} Feb 02 10:51:40 crc kubenswrapper[4909]: I0202 10:51:40.080346 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9b4235b4-44a0-4238-9b70-ad3ea946f729" containerName="ceilometer-notification-agent" containerID="cri-o://5cdd443f73cd51e100bceb14caae9ac75a69bda8061677bfa6c49185d8fd7f82" gracePeriod=30 Feb 02 10:51:40 crc kubenswrapper[4909]: I0202 10:51:40.080384 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9b4235b4-44a0-4238-9b70-ad3ea946f729" containerName="sg-core" containerID="cri-o://ec6c43b06eb3c0e8cc9e57843ebecb998a72a14920fbaa5f98469840ab83d3e9" gracePeriod=30 Feb 02 10:51:40 crc kubenswrapper[4909]: I0202 10:51:40.080545 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 02 10:51:40 crc kubenswrapper[4909]: I0202 10:51:40.080560 4909 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="9b4235b4-44a0-4238-9b70-ad3ea946f729" containerName="proxy-httpd" containerID="cri-o://3d6d889cf5654de8573407f2359ea91d742a8ba74f142c89469ff361282b9916" gracePeriod=30 Feb 02 10:51:40 crc kubenswrapper[4909]: I0202 10:51:40.090141 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" Feb 02 10:51:40 crc kubenswrapper[4909]: I0202 10:51:40.090861 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e","Type":"ContainerStarted","Data":"97c3ea1eedc67c253593b6fcff9738e7ca264a654af0353f601543c037dfa731"} Feb 02 10:51:40 crc kubenswrapper[4909]: I0202 10:51:40.107134 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc67df487-rl62l" event={"ID":"17452cea-74ae-4e13-8369-431a2062addc","Type":"ContainerStarted","Data":"7cd572050039adbc36366d7ae04bb6867825f7aea830bab43fac184cd47d3c25"} Feb 02 10:51:40 crc kubenswrapper[4909]: I0202 10:51:40.210452 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 02 10:51:40 crc kubenswrapper[4909]: I0202 10:51:40.286971 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-bc5db8d96-8gwft"] Feb 02 10:51:40 crc kubenswrapper[4909]: I0202 10:51:40.287222 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-bc5db8d96-8gwft" podUID="ae36a22a-0d59-45ce-b94d-0e8657ad4aaa" containerName="barbican-api-log" containerID="cri-o://93d9f1086be2acc5ec3ecbf84647a6da95e29f9c01f730db52a0d526107b8511" gracePeriod=30 Feb 02 10:51:40 crc kubenswrapper[4909]: I0202 10:51:40.287644 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-bc5db8d96-8gwft" podUID="ae36a22a-0d59-45ce-b94d-0e8657ad4aaa" containerName="barbican-api" 
containerID="cri-o://337a70a6ab943486d81b6c0242ebbff8615162d6b8377690d8386e631787317c" gracePeriod=30 Feb 02 10:51:40 crc kubenswrapper[4909]: I0202 10:51:40.304222 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-bc5db8d96-8gwft" podUID="ae36a22a-0d59-45ce-b94d-0e8657ad4aaa" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": EOF" Feb 02 10:51:41 crc kubenswrapper[4909]: I0202 10:51:41.034873 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19d20c70-a055-4ecf-b593-95697717de45" path="/var/lib/kubelet/pods/19d20c70-a055-4ecf-b593-95697717de45/volumes" Feb 02 10:51:41 crc kubenswrapper[4909]: I0202 10:51:41.039603 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82e7f090-1c8a-419e-95da-4d6c82bcde8d" path="/var/lib/kubelet/pods/82e7f090-1c8a-419e-95da-4d6c82bcde8d/volumes" Feb 02 10:51:41 crc kubenswrapper[4909]: I0202 10:51:41.040330 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ade61e3a-3fd8-40af-8891-46c51e7a9d1b" path="/var/lib/kubelet/pods/ade61e3a-3fd8-40af-8891-46c51e7a9d1b/volumes" Feb 02 10:51:41 crc kubenswrapper[4909]: I0202 10:51:41.135347 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e462e484-c2ad-4d5f-85b8-7042663617ff","Type":"ContainerStarted","Data":"828c355e02a1496b0bcc5f2ff2fed34a28fa559bad719556214a379b96fa8585"} Feb 02 10:51:41 crc kubenswrapper[4909]: I0202 10:51:41.137134 4909 generic.go:334] "Generic (PLEG): container finished" podID="ae36a22a-0d59-45ce-b94d-0e8657ad4aaa" containerID="93d9f1086be2acc5ec3ecbf84647a6da95e29f9c01f730db52a0d526107b8511" exitCode=143 Feb 02 10:51:41 crc kubenswrapper[4909]: I0202 10:51:41.137177 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bc5db8d96-8gwft" 
event={"ID":"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa","Type":"ContainerDied","Data":"93d9f1086be2acc5ec3ecbf84647a6da95e29f9c01f730db52a0d526107b8511"} Feb 02 10:51:41 crc kubenswrapper[4909]: I0202 10:51:41.144153 4909 generic.go:334] "Generic (PLEG): container finished" podID="9b4235b4-44a0-4238-9b70-ad3ea946f729" containerID="3d6d889cf5654de8573407f2359ea91d742a8ba74f142c89469ff361282b9916" exitCode=0 Feb 02 10:51:41 crc kubenswrapper[4909]: I0202 10:51:41.144193 4909 generic.go:334] "Generic (PLEG): container finished" podID="9b4235b4-44a0-4238-9b70-ad3ea946f729" containerID="ec6c43b06eb3c0e8cc9e57843ebecb998a72a14920fbaa5f98469840ab83d3e9" exitCode=2 Feb 02 10:51:41 crc kubenswrapper[4909]: I0202 10:51:41.144291 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b4235b4-44a0-4238-9b70-ad3ea946f729","Type":"ContainerDied","Data":"3d6d889cf5654de8573407f2359ea91d742a8ba74f142c89469ff361282b9916"} Feb 02 10:51:41 crc kubenswrapper[4909]: I0202 10:51:41.144322 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b4235b4-44a0-4238-9b70-ad3ea946f729","Type":"ContainerDied","Data":"ec6c43b06eb3c0e8cc9e57843ebecb998a72a14920fbaa5f98469840ab83d3e9"} Feb 02 10:51:41 crc kubenswrapper[4909]: I0202 10:51:41.154976 4909 generic.go:334] "Generic (PLEG): container finished" podID="17452cea-74ae-4e13-8369-431a2062addc" containerID="02865c2c13ba1491c90611238a5f366850156749981c87da1bb4d977674ed09d" exitCode=0 Feb 02 10:51:41 crc kubenswrapper[4909]: I0202 10:51:41.155166 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc67df487-rl62l" event={"ID":"17452cea-74ae-4e13-8369-431a2062addc","Type":"ContainerDied","Data":"02865c2c13ba1491c90611238a5f366850156749981c87da1bb4d977674ed09d"} Feb 02 10:51:42 crc kubenswrapper[4909]: I0202 10:51:42.166923 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e","Type":"ContainerStarted","Data":"043f0ffd414bd3ab9ba36d82ab3bc0a2d6a8de6fc3367ae914aaede9bd56ae6c"} Feb 02 10:51:42 crc kubenswrapper[4909]: I0202 10:51:42.170739 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc67df487-rl62l" event={"ID":"17452cea-74ae-4e13-8369-431a2062addc","Type":"ContainerStarted","Data":"5fcb95bd792a909ea5b933458787933e9ca37ce2f1240323b32b4b2c86cc21ef"} Feb 02 10:51:42 crc kubenswrapper[4909]: I0202 10:51:42.170960 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6dc67df487-rl62l" Feb 02 10:51:42 crc kubenswrapper[4909]: I0202 10:51:42.178454 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e462e484-c2ad-4d5f-85b8-7042663617ff","Type":"ContainerStarted","Data":"183693e0814dace9a9bccdeddbdfcb5d0e5f35e7d4ea73adfd3615097c62b6ec"} Feb 02 10:51:42 crc kubenswrapper[4909]: I0202 10:51:42.178583 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e462e484-c2ad-4d5f-85b8-7042663617ff" containerName="cinder-api-log" containerID="cri-o://828c355e02a1496b0bcc5f2ff2fed34a28fa559bad719556214a379b96fa8585" gracePeriod=30 Feb 02 10:51:42 crc kubenswrapper[4909]: I0202 10:51:42.178750 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 02 10:51:42 crc kubenswrapper[4909]: I0202 10:51:42.178835 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e462e484-c2ad-4d5f-85b8-7042663617ff" containerName="cinder-api" containerID="cri-o://183693e0814dace9a9bccdeddbdfcb5d0e5f35e7d4ea73adfd3615097c62b6ec" gracePeriod=30 Feb 02 10:51:42 crc kubenswrapper[4909]: I0202 10:51:42.199626 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6dc67df487-rl62l" 
podStartSLOduration=5.199605161 podStartE2EDuration="5.199605161s" podCreationTimestamp="2026-02-02 10:51:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:51:42.195150874 +0000 UTC m=+1227.941251629" watchObservedRunningTime="2026-02-02 10:51:42.199605161 +0000 UTC m=+1227.945705896" Feb 02 10:51:42 crc kubenswrapper[4909]: I0202 10:51:42.219314 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.21929891 podStartE2EDuration="5.21929891s" podCreationTimestamp="2026-02-02 10:51:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:51:42.217017595 +0000 UTC m=+1227.963118330" watchObservedRunningTime="2026-02-02 10:51:42.21929891 +0000 UTC m=+1227.965399635" Feb 02 10:51:42 crc kubenswrapper[4909]: I0202 10:51:42.919154 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-69c85d5ff7-sgcx6" podUID="19d20c70-a055-4ecf-b593-95697717de45" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.144:5353: i/o timeout" Feb 02 10:51:43 crc kubenswrapper[4909]: I0202 10:51:43.189283 4909 generic.go:334] "Generic (PLEG): container finished" podID="e462e484-c2ad-4d5f-85b8-7042663617ff" containerID="828c355e02a1496b0bcc5f2ff2fed34a28fa559bad719556214a379b96fa8585" exitCode=143 Feb 02 10:51:43 crc kubenswrapper[4909]: I0202 10:51:43.189368 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e462e484-c2ad-4d5f-85b8-7042663617ff","Type":"ContainerDied","Data":"828c355e02a1496b0bcc5f2ff2fed34a28fa559bad719556214a379b96fa8585"} Feb 02 10:51:43 crc kubenswrapper[4909]: I0202 10:51:43.191788 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e","Type":"ContainerStarted","Data":"a2fabc53e818e0df147692b67903b56abc757df34656b800aa49da705c2f87d3"} Feb 02 10:51:43 crc kubenswrapper[4909]: I0202 10:51:43.218963 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.391634122 podStartE2EDuration="6.218944399s" podCreationTimestamp="2026-02-02 10:51:37 +0000 UTC" firstStartedPulling="2026-02-02 10:51:39.741935919 +0000 UTC m=+1225.488036644" lastFinishedPulling="2026-02-02 10:51:40.569246186 +0000 UTC m=+1226.315346921" observedRunningTime="2026-02-02 10:51:43.215120341 +0000 UTC m=+1228.961221086" watchObservedRunningTime="2026-02-02 10:51:43.218944399 +0000 UTC m=+1228.965045134" Feb 02 10:51:44 crc kubenswrapper[4909]: I0202 10:51:44.222755 4909 generic.go:334] "Generic (PLEG): container finished" podID="9b4235b4-44a0-4238-9b70-ad3ea946f729" containerID="5cdd443f73cd51e100bceb14caae9ac75a69bda8061677bfa6c49185d8fd7f82" exitCode=0 Feb 02 10:51:44 crc kubenswrapper[4909]: I0202 10:51:44.224247 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b4235b4-44a0-4238-9b70-ad3ea946f729","Type":"ContainerDied","Data":"5cdd443f73cd51e100bceb14caae9ac75a69bda8061677bfa6c49185d8fd7f82"} Feb 02 10:51:44 crc kubenswrapper[4909]: I0202 10:51:44.386374 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:51:44 crc kubenswrapper[4909]: I0202 10:51:44.425692 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b4235b4-44a0-4238-9b70-ad3ea946f729-scripts\") pod \"9b4235b4-44a0-4238-9b70-ad3ea946f729\" (UID: \"9b4235b4-44a0-4238-9b70-ad3ea946f729\") " Feb 02 10:51:44 crc kubenswrapper[4909]: I0202 10:51:44.425750 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm6x8\" (UniqueName: \"kubernetes.io/projected/9b4235b4-44a0-4238-9b70-ad3ea946f729-kube-api-access-rm6x8\") pod \"9b4235b4-44a0-4238-9b70-ad3ea946f729\" (UID: \"9b4235b4-44a0-4238-9b70-ad3ea946f729\") " Feb 02 10:51:44 crc kubenswrapper[4909]: I0202 10:51:44.425892 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4235b4-44a0-4238-9b70-ad3ea946f729-combined-ca-bundle\") pod \"9b4235b4-44a0-4238-9b70-ad3ea946f729\" (UID: \"9b4235b4-44a0-4238-9b70-ad3ea946f729\") " Feb 02 10:51:44 crc kubenswrapper[4909]: I0202 10:51:44.425921 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b4235b4-44a0-4238-9b70-ad3ea946f729-log-httpd\") pod \"9b4235b4-44a0-4238-9b70-ad3ea946f729\" (UID: \"9b4235b4-44a0-4238-9b70-ad3ea946f729\") " Feb 02 10:51:44 crc kubenswrapper[4909]: I0202 10:51:44.426097 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4235b4-44a0-4238-9b70-ad3ea946f729-config-data\") pod \"9b4235b4-44a0-4238-9b70-ad3ea946f729\" (UID: \"9b4235b4-44a0-4238-9b70-ad3ea946f729\") " Feb 02 10:51:44 crc kubenswrapper[4909]: I0202 10:51:44.426171 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/9b4235b4-44a0-4238-9b70-ad3ea946f729-run-httpd\") pod \"9b4235b4-44a0-4238-9b70-ad3ea946f729\" (UID: \"9b4235b4-44a0-4238-9b70-ad3ea946f729\") " Feb 02 10:51:44 crc kubenswrapper[4909]: I0202 10:51:44.426203 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9b4235b4-44a0-4238-9b70-ad3ea946f729-sg-core-conf-yaml\") pod \"9b4235b4-44a0-4238-9b70-ad3ea946f729\" (UID: \"9b4235b4-44a0-4238-9b70-ad3ea946f729\") " Feb 02 10:51:44 crc kubenswrapper[4909]: I0202 10:51:44.428517 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b4235b4-44a0-4238-9b70-ad3ea946f729-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9b4235b4-44a0-4238-9b70-ad3ea946f729" (UID: "9b4235b4-44a0-4238-9b70-ad3ea946f729"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:51:44 crc kubenswrapper[4909]: I0202 10:51:44.428607 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b4235b4-44a0-4238-9b70-ad3ea946f729-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9b4235b4-44a0-4238-9b70-ad3ea946f729" (UID: "9b4235b4-44a0-4238-9b70-ad3ea946f729"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:51:44 crc kubenswrapper[4909]: I0202 10:51:44.433395 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b4235b4-44a0-4238-9b70-ad3ea946f729-kube-api-access-rm6x8" (OuterVolumeSpecName: "kube-api-access-rm6x8") pod "9b4235b4-44a0-4238-9b70-ad3ea946f729" (UID: "9b4235b4-44a0-4238-9b70-ad3ea946f729"). InnerVolumeSpecName "kube-api-access-rm6x8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:44 crc kubenswrapper[4909]: I0202 10:51:44.435899 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b4235b4-44a0-4238-9b70-ad3ea946f729-scripts" (OuterVolumeSpecName: "scripts") pod "9b4235b4-44a0-4238-9b70-ad3ea946f729" (UID: "9b4235b4-44a0-4238-9b70-ad3ea946f729"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:44 crc kubenswrapper[4909]: I0202 10:51:44.462631 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b4235b4-44a0-4238-9b70-ad3ea946f729-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9b4235b4-44a0-4238-9b70-ad3ea946f729" (UID: "9b4235b4-44a0-4238-9b70-ad3ea946f729"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:44 crc kubenswrapper[4909]: I0202 10:51:44.519827 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b4235b4-44a0-4238-9b70-ad3ea946f729-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b4235b4-44a0-4238-9b70-ad3ea946f729" (UID: "9b4235b4-44a0-4238-9b70-ad3ea946f729"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:44 crc kubenswrapper[4909]: I0202 10:51:44.529506 4909 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b4235b4-44a0-4238-9b70-ad3ea946f729-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:44 crc kubenswrapper[4909]: I0202 10:51:44.529546 4909 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9b4235b4-44a0-4238-9b70-ad3ea946f729-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:44 crc kubenswrapper[4909]: I0202 10:51:44.529558 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b4235b4-44a0-4238-9b70-ad3ea946f729-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:44 crc kubenswrapper[4909]: I0202 10:51:44.529572 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm6x8\" (UniqueName: \"kubernetes.io/projected/9b4235b4-44a0-4238-9b70-ad3ea946f729-kube-api-access-rm6x8\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:44 crc kubenswrapper[4909]: I0202 10:51:44.529583 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b4235b4-44a0-4238-9b70-ad3ea946f729-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:44 crc kubenswrapper[4909]: I0202 10:51:44.529593 4909 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9b4235b4-44a0-4238-9b70-ad3ea946f729-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:44 crc kubenswrapper[4909]: I0202 10:51:44.534568 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b4235b4-44a0-4238-9b70-ad3ea946f729-config-data" (OuterVolumeSpecName: "config-data") pod "9b4235b4-44a0-4238-9b70-ad3ea946f729" (UID: "9b4235b4-44a0-4238-9b70-ad3ea946f729"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:44 crc kubenswrapper[4909]: I0202 10:51:44.631081 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b4235b4-44a0-4238-9b70-ad3ea946f729-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.234198 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9b4235b4-44a0-4238-9b70-ad3ea946f729","Type":"ContainerDied","Data":"25aa449b8475eafbb0131dd8a6cdef0c5099c9f3bf1580e96d895b093a05f36e"} Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.234514 4909 scope.go:117] "RemoveContainer" containerID="3d6d889cf5654de8573407f2359ea91d742a8ba74f142c89469ff361282b9916" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.234267 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.274433 4909 scope.go:117] "RemoveContainer" containerID="ec6c43b06eb3c0e8cc9e57843ebecb998a72a14920fbaa5f98469840ab83d3e9" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.279708 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.290172 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.314737 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:51:45 crc kubenswrapper[4909]: E0202 10:51:45.315331 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ade61e3a-3fd8-40af-8891-46c51e7a9d1b" containerName="barbican-worker-log" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.315436 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ade61e3a-3fd8-40af-8891-46c51e7a9d1b" 
containerName="barbican-worker-log" Feb 02 10:51:45 crc kubenswrapper[4909]: E0202 10:51:45.315503 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82e7f090-1c8a-419e-95da-4d6c82bcde8d" containerName="barbican-keystone-listener-log" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.315562 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="82e7f090-1c8a-419e-95da-4d6c82bcde8d" containerName="barbican-keystone-listener-log" Feb 02 10:51:45 crc kubenswrapper[4909]: E0202 10:51:45.315629 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4235b4-44a0-4238-9b70-ad3ea946f729" containerName="ceilometer-notification-agent" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.315684 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4235b4-44a0-4238-9b70-ad3ea946f729" containerName="ceilometer-notification-agent" Feb 02 10:51:45 crc kubenswrapper[4909]: E0202 10:51:45.315740 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4235b4-44a0-4238-9b70-ad3ea946f729" containerName="sg-core" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.315821 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4235b4-44a0-4238-9b70-ad3ea946f729" containerName="sg-core" Feb 02 10:51:45 crc kubenswrapper[4909]: E0202 10:51:45.315889 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d20c70-a055-4ecf-b593-95697717de45" containerName="dnsmasq-dns" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.315939 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d20c70-a055-4ecf-b593-95697717de45" containerName="dnsmasq-dns" Feb 02 10:51:45 crc kubenswrapper[4909]: E0202 10:51:45.316006 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ade61e3a-3fd8-40af-8891-46c51e7a9d1b" containerName="barbican-worker" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.316098 4909 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ade61e3a-3fd8-40af-8891-46c51e7a9d1b" containerName="barbican-worker" Feb 02 10:51:45 crc kubenswrapper[4909]: E0202 10:51:45.316198 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d20c70-a055-4ecf-b593-95697717de45" containerName="init" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.316258 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d20c70-a055-4ecf-b593-95697717de45" containerName="init" Feb 02 10:51:45 crc kubenswrapper[4909]: E0202 10:51:45.316313 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4235b4-44a0-4238-9b70-ad3ea946f729" containerName="proxy-httpd" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.316366 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4235b4-44a0-4238-9b70-ad3ea946f729" containerName="proxy-httpd" Feb 02 10:51:45 crc kubenswrapper[4909]: E0202 10:51:45.316418 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82e7f090-1c8a-419e-95da-4d6c82bcde8d" containerName="barbican-keystone-listener" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.316555 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="82e7f090-1c8a-419e-95da-4d6c82bcde8d" containerName="barbican-keystone-listener" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.316765 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b4235b4-44a0-4238-9b70-ad3ea946f729" containerName="proxy-httpd" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.316846 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ade61e3a-3fd8-40af-8891-46c51e7a9d1b" containerName="barbican-worker-log" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.316912 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ade61e3a-3fd8-40af-8891-46c51e7a9d1b" containerName="barbican-worker" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.316975 4909 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9b4235b4-44a0-4238-9b70-ad3ea946f729" containerName="sg-core" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.317035 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b4235b4-44a0-4238-9b70-ad3ea946f729" containerName="ceilometer-notification-agent" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.317087 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="82e7f090-1c8a-419e-95da-4d6c82bcde8d" containerName="barbican-keystone-listener-log" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.317139 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="82e7f090-1c8a-419e-95da-4d6c82bcde8d" containerName="barbican-keystone-listener" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.317198 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="19d20c70-a055-4ecf-b593-95697717de45" containerName="dnsmasq-dns" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.318758 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.329385 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.329717 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.334163 4909 scope.go:117] "RemoveContainer" containerID="5cdd443f73cd51e100bceb14caae9ac75a69bda8061677bfa6c49185d8fd7f82" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.342147 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.449031 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29af981b-c23d-4740-8a31-a99a24d8a72e-scripts\") pod \"ceilometer-0\" (UID: \"29af981b-c23d-4740-8a31-a99a24d8a72e\") " pod="openstack/ceilometer-0" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.449100 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9glsv\" (UniqueName: \"kubernetes.io/projected/29af981b-c23d-4740-8a31-a99a24d8a72e-kube-api-access-9glsv\") pod \"ceilometer-0\" (UID: \"29af981b-c23d-4740-8a31-a99a24d8a72e\") " pod="openstack/ceilometer-0" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.449174 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29af981b-c23d-4740-8a31-a99a24d8a72e-log-httpd\") pod \"ceilometer-0\" (UID: \"29af981b-c23d-4740-8a31-a99a24d8a72e\") " pod="openstack/ceilometer-0" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.449216 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/29af981b-c23d-4740-8a31-a99a24d8a72e-config-data\") pod \"ceilometer-0\" (UID: \"29af981b-c23d-4740-8a31-a99a24d8a72e\") " pod="openstack/ceilometer-0" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.449260 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29af981b-c23d-4740-8a31-a99a24d8a72e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"29af981b-c23d-4740-8a31-a99a24d8a72e\") " pod="openstack/ceilometer-0" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.449360 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29af981b-c23d-4740-8a31-a99a24d8a72e-run-httpd\") pod \"ceilometer-0\" (UID: \"29af981b-c23d-4740-8a31-a99a24d8a72e\") " pod="openstack/ceilometer-0" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.449421 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29af981b-c23d-4740-8a31-a99a24d8a72e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"29af981b-c23d-4740-8a31-a99a24d8a72e\") " pod="openstack/ceilometer-0" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.550484 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9glsv\" (UniqueName: \"kubernetes.io/projected/29af981b-c23d-4740-8a31-a99a24d8a72e-kube-api-access-9glsv\") pod \"ceilometer-0\" (UID: \"29af981b-c23d-4740-8a31-a99a24d8a72e\") " pod="openstack/ceilometer-0" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.550884 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29af981b-c23d-4740-8a31-a99a24d8a72e-log-httpd\") pod \"ceilometer-0\" (UID: 
\"29af981b-c23d-4740-8a31-a99a24d8a72e\") " pod="openstack/ceilometer-0" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.550989 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29af981b-c23d-4740-8a31-a99a24d8a72e-config-data\") pod \"ceilometer-0\" (UID: \"29af981b-c23d-4740-8a31-a99a24d8a72e\") " pod="openstack/ceilometer-0" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.551067 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29af981b-c23d-4740-8a31-a99a24d8a72e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"29af981b-c23d-4740-8a31-a99a24d8a72e\") " pod="openstack/ceilometer-0" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.551155 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29af981b-c23d-4740-8a31-a99a24d8a72e-run-httpd\") pod \"ceilometer-0\" (UID: \"29af981b-c23d-4740-8a31-a99a24d8a72e\") " pod="openstack/ceilometer-0" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.551273 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29af981b-c23d-4740-8a31-a99a24d8a72e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"29af981b-c23d-4740-8a31-a99a24d8a72e\") " pod="openstack/ceilometer-0" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.551354 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29af981b-c23d-4740-8a31-a99a24d8a72e-scripts\") pod \"ceilometer-0\" (UID: \"29af981b-c23d-4740-8a31-a99a24d8a72e\") " pod="openstack/ceilometer-0" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.551607 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/29af981b-c23d-4740-8a31-a99a24d8a72e-run-httpd\") pod \"ceilometer-0\" (UID: \"29af981b-c23d-4740-8a31-a99a24d8a72e\") " pod="openstack/ceilometer-0" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.551283 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29af981b-c23d-4740-8a31-a99a24d8a72e-log-httpd\") pod \"ceilometer-0\" (UID: \"29af981b-c23d-4740-8a31-a99a24d8a72e\") " pod="openstack/ceilometer-0" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.556219 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29af981b-c23d-4740-8a31-a99a24d8a72e-config-data\") pod \"ceilometer-0\" (UID: \"29af981b-c23d-4740-8a31-a99a24d8a72e\") " pod="openstack/ceilometer-0" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.556781 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29af981b-c23d-4740-8a31-a99a24d8a72e-scripts\") pod \"ceilometer-0\" (UID: \"29af981b-c23d-4740-8a31-a99a24d8a72e\") " pod="openstack/ceilometer-0" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.557486 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29af981b-c23d-4740-8a31-a99a24d8a72e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"29af981b-c23d-4740-8a31-a99a24d8a72e\") " pod="openstack/ceilometer-0" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.558026 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29af981b-c23d-4740-8a31-a99a24d8a72e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"29af981b-c23d-4740-8a31-a99a24d8a72e\") " pod="openstack/ceilometer-0" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.571294 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9glsv\" (UniqueName: \"kubernetes.io/projected/29af981b-c23d-4740-8a31-a99a24d8a72e-kube-api-access-9glsv\") pod \"ceilometer-0\" (UID: \"29af981b-c23d-4740-8a31-a99a24d8a72e\") " pod="openstack/ceilometer-0" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.652619 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.732091 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-bc5db8d96-8gwft" podUID="ae36a22a-0d59-45ce-b94d-0e8657ad4aaa" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:41380->10.217.0.160:9311: read: connection reset by peer" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.732160 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-bc5db8d96-8gwft" podUID="ae36a22a-0d59-45ce-b94d-0e8657ad4aaa" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:41384->10.217.0.160:9311: read: connection reset by peer" Feb 02 10:51:45 crc kubenswrapper[4909]: I0202 10:51:45.865345 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5d95b455f4-xnd62" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.116964 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-744cf8b8bf-vxfhd"] Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.117530 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-744cf8b8bf-vxfhd" podUID="2e0c3b2f-546a-403b-9dee-bda4c14ab84d" containerName="neutron-api" containerID="cri-o://d9b7e7c2d82ea6bc4b113faefe37915f651c5dc9bd63468cbb8600d907ba119e" gracePeriod=30 Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.118261 4909 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-744cf8b8bf-vxfhd" podUID="2e0c3b2f-546a-403b-9dee-bda4c14ab84d" containerName="neutron-httpd" containerID="cri-o://e492024f10acf7ea7ef2c1ebfebff32d8036b6ed9959ea1ae9ac17b82834ea3c" gracePeriod=30 Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.129478 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.166635 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-8588c46577-4cp8s"] Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.168523 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8588c46577-4cp8s" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.208346 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8588c46577-4cp8s"] Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.232311 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-744cf8b8bf-vxfhd" podUID="2e0c3b2f-546a-403b-9dee-bda4c14ab84d" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.152:9696/\": read tcp 10.217.0.2:48734->10.217.0.152:9696: read: connection reset by peer" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.252261 4909 generic.go:334] "Generic (PLEG): container finished" podID="ae36a22a-0d59-45ce-b94d-0e8657ad4aaa" containerID="337a70a6ab943486d81b6c0242ebbff8615162d6b8377690d8386e631787317c" exitCode=0 Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.252338 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bc5db8d96-8gwft" event={"ID":"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa","Type":"ContainerDied","Data":"337a70a6ab943486d81b6c0242ebbff8615162d6b8377690d8386e631787317c"} Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.254104 4909 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"29af981b-c23d-4740-8a31-a99a24d8a72e","Type":"ContainerStarted","Data":"da9eca87a510c4c7ccce7bb38c87d2b5f2051b9f01b432c6bd2371d65aa90ae0"} Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.281516 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-combined-ca-bundle\") pod \"neutron-8588c46577-4cp8s\" (UID: \"d1145da4-90e5-422b-917a-33473a9c5d6a\") " pod="openstack/neutron-8588c46577-4cp8s" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.281658 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-httpd-config\") pod \"neutron-8588c46577-4cp8s\" (UID: \"d1145da4-90e5-422b-917a-33473a9c5d6a\") " pod="openstack/neutron-8588c46577-4cp8s" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.281679 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5t64\" (UniqueName: \"kubernetes.io/projected/d1145da4-90e5-422b-917a-33473a9c5d6a-kube-api-access-v5t64\") pod \"neutron-8588c46577-4cp8s\" (UID: \"d1145da4-90e5-422b-917a-33473a9c5d6a\") " pod="openstack/neutron-8588c46577-4cp8s" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.281838 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-internal-tls-certs\") pod \"neutron-8588c46577-4cp8s\" (UID: \"d1145da4-90e5-422b-917a-33473a9c5d6a\") " pod="openstack/neutron-8588c46577-4cp8s" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.281861 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-ovndb-tls-certs\") pod \"neutron-8588c46577-4cp8s\" (UID: \"d1145da4-90e5-422b-917a-33473a9c5d6a\") " pod="openstack/neutron-8588c46577-4cp8s" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.281878 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-config\") pod \"neutron-8588c46577-4cp8s\" (UID: \"d1145da4-90e5-422b-917a-33473a9c5d6a\") " pod="openstack/neutron-8588c46577-4cp8s" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.281939 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-public-tls-certs\") pod \"neutron-8588c46577-4cp8s\" (UID: \"d1145da4-90e5-422b-917a-33473a9c5d6a\") " pod="openstack/neutron-8588c46577-4cp8s" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.383208 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-internal-tls-certs\") pod \"neutron-8588c46577-4cp8s\" (UID: \"d1145da4-90e5-422b-917a-33473a9c5d6a\") " pod="openstack/neutron-8588c46577-4cp8s" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.383362 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-ovndb-tls-certs\") pod \"neutron-8588c46577-4cp8s\" (UID: \"d1145da4-90e5-422b-917a-33473a9c5d6a\") " pod="openstack/neutron-8588c46577-4cp8s" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.383437 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-config\") pod 
\"neutron-8588c46577-4cp8s\" (UID: \"d1145da4-90e5-422b-917a-33473a9c5d6a\") " pod="openstack/neutron-8588c46577-4cp8s" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.383534 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-public-tls-certs\") pod \"neutron-8588c46577-4cp8s\" (UID: \"d1145da4-90e5-422b-917a-33473a9c5d6a\") " pod="openstack/neutron-8588c46577-4cp8s" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.383618 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-combined-ca-bundle\") pod \"neutron-8588c46577-4cp8s\" (UID: \"d1145da4-90e5-422b-917a-33473a9c5d6a\") " pod="openstack/neutron-8588c46577-4cp8s" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.383747 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-httpd-config\") pod \"neutron-8588c46577-4cp8s\" (UID: \"d1145da4-90e5-422b-917a-33473a9c5d6a\") " pod="openstack/neutron-8588c46577-4cp8s" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.383829 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5t64\" (UniqueName: \"kubernetes.io/projected/d1145da4-90e5-422b-917a-33473a9c5d6a-kube-api-access-v5t64\") pod \"neutron-8588c46577-4cp8s\" (UID: \"d1145da4-90e5-422b-917a-33473a9c5d6a\") " pod="openstack/neutron-8588c46577-4cp8s" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.391413 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-ovndb-tls-certs\") pod \"neutron-8588c46577-4cp8s\" (UID: \"d1145da4-90e5-422b-917a-33473a9c5d6a\") " 
pod="openstack/neutron-8588c46577-4cp8s" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.391708 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-combined-ca-bundle\") pod \"neutron-8588c46577-4cp8s\" (UID: \"d1145da4-90e5-422b-917a-33473a9c5d6a\") " pod="openstack/neutron-8588c46577-4cp8s" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.392793 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-internal-tls-certs\") pod \"neutron-8588c46577-4cp8s\" (UID: \"d1145da4-90e5-422b-917a-33473a9c5d6a\") " pod="openstack/neutron-8588c46577-4cp8s" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.393001 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-config\") pod \"neutron-8588c46577-4cp8s\" (UID: \"d1145da4-90e5-422b-917a-33473a9c5d6a\") " pod="openstack/neutron-8588c46577-4cp8s" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.393184 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-httpd-config\") pod \"neutron-8588c46577-4cp8s\" (UID: \"d1145da4-90e5-422b-917a-33473a9c5d6a\") " pod="openstack/neutron-8588c46577-4cp8s" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.398124 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-public-tls-certs\") pod \"neutron-8588c46577-4cp8s\" (UID: \"d1145da4-90e5-422b-917a-33473a9c5d6a\") " pod="openstack/neutron-8588c46577-4cp8s" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.408191 4909 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/barbican-api-bc5db8d96-8gwft" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.410179 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5t64\" (UniqueName: \"kubernetes.io/projected/d1145da4-90e5-422b-917a-33473a9c5d6a-kube-api-access-v5t64\") pod \"neutron-8588c46577-4cp8s\" (UID: \"d1145da4-90e5-422b-917a-33473a9c5d6a\") " pod="openstack/neutron-8588c46577-4cp8s" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.484986 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-config-data-custom\") pod \"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa\" (UID: \"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa\") " Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.485138 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-combined-ca-bundle\") pod \"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa\" (UID: \"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa\") " Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.485253 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78q8n\" (UniqueName: \"kubernetes.io/projected/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-kube-api-access-78q8n\") pod \"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa\" (UID: \"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa\") " Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.485333 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-logs\") pod \"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa\" (UID: \"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa\") " Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.485374 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-config-data\") pod \"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa\" (UID: \"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa\") " Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.486186 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-logs" (OuterVolumeSpecName: "logs") pod "ae36a22a-0d59-45ce-b94d-0e8657ad4aaa" (UID: "ae36a22a-0d59-45ce-b94d-0e8657ad4aaa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.488915 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-kube-api-access-78q8n" (OuterVolumeSpecName: "kube-api-access-78q8n") pod "ae36a22a-0d59-45ce-b94d-0e8657ad4aaa" (UID: "ae36a22a-0d59-45ce-b94d-0e8657ad4aaa"). InnerVolumeSpecName "kube-api-access-78q8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.489997 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ae36a22a-0d59-45ce-b94d-0e8657ad4aaa" (UID: "ae36a22a-0d59-45ce-b94d-0e8657ad4aaa"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.514616 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8588c46577-4cp8s" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.518233 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae36a22a-0d59-45ce-b94d-0e8657ad4aaa" (UID: "ae36a22a-0d59-45ce-b94d-0e8657ad4aaa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.547072 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-config-data" (OuterVolumeSpecName: "config-data") pod "ae36a22a-0d59-45ce-b94d-0e8657ad4aaa" (UID: "ae36a22a-0d59-45ce-b94d-0e8657ad4aaa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.587962 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78q8n\" (UniqueName: \"kubernetes.io/projected/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-kube-api-access-78q8n\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.588004 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.588017 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.588030 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-config-data-custom\") on node \"crc\" DevicePath 
\"\"" Feb 02 10:51:46 crc kubenswrapper[4909]: I0202 10:51:46.588042 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:47 crc kubenswrapper[4909]: I0202 10:51:47.029713 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b4235b4-44a0-4238-9b70-ad3ea946f729" path="/var/lib/kubelet/pods/9b4235b4-44a0-4238-9b70-ad3ea946f729/volumes" Feb 02 10:51:47 crc kubenswrapper[4909]: I0202 10:51:47.097494 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8588c46577-4cp8s"] Feb 02 10:51:47 crc kubenswrapper[4909]: I0202 10:51:47.264913 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-bc5db8d96-8gwft" event={"ID":"ae36a22a-0d59-45ce-b94d-0e8657ad4aaa","Type":"ContainerDied","Data":"367a26312055d5b54f70f855eb94b467868288a062b7990c70a3b38e3fb37df8"} Feb 02 10:51:47 crc kubenswrapper[4909]: I0202 10:51:47.264997 4909 scope.go:117] "RemoveContainer" containerID="337a70a6ab943486d81b6c0242ebbff8615162d6b8377690d8386e631787317c" Feb 02 10:51:47 crc kubenswrapper[4909]: I0202 10:51:47.265006 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-bc5db8d96-8gwft" Feb 02 10:51:47 crc kubenswrapper[4909]: I0202 10:51:47.269797 4909 generic.go:334] "Generic (PLEG): container finished" podID="2e0c3b2f-546a-403b-9dee-bda4c14ab84d" containerID="e492024f10acf7ea7ef2c1ebfebff32d8036b6ed9959ea1ae9ac17b82834ea3c" exitCode=0 Feb 02 10:51:47 crc kubenswrapper[4909]: I0202 10:51:47.269926 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-744cf8b8bf-vxfhd" event={"ID":"2e0c3b2f-546a-403b-9dee-bda4c14ab84d","Type":"ContainerDied","Data":"e492024f10acf7ea7ef2c1ebfebff32d8036b6ed9959ea1ae9ac17b82834ea3c"} Feb 02 10:51:47 crc kubenswrapper[4909]: I0202 10:51:47.274071 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8588c46577-4cp8s" event={"ID":"d1145da4-90e5-422b-917a-33473a9c5d6a","Type":"ContainerStarted","Data":"59b1d6f9825a899073b39bab4b3554e3b90a77cb33ecbb1602394e0b14fc8b49"} Feb 02 10:51:47 crc kubenswrapper[4909]: I0202 10:51:47.278728 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29af981b-c23d-4740-8a31-a99a24d8a72e","Type":"ContainerStarted","Data":"5aa33b9e58ad58f9e4a3c5511ff5e0cdd09b526c823272b15202f9aba1483f6b"} Feb 02 10:51:47 crc kubenswrapper[4909]: I0202 10:51:47.301440 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-bc5db8d96-8gwft"] Feb 02 10:51:47 crc kubenswrapper[4909]: I0202 10:51:47.302861 4909 scope.go:117] "RemoveContainer" containerID="93d9f1086be2acc5ec3ecbf84647a6da95e29f9c01f730db52a0d526107b8511" Feb 02 10:51:47 crc kubenswrapper[4909]: I0202 10:51:47.313645 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-bc5db8d96-8gwft"] Feb 02 10:51:48 crc kubenswrapper[4909]: I0202 10:51:48.080843 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 02 10:51:48 crc kubenswrapper[4909]: I0202 10:51:48.153043 4909 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6dc67df487-rl62l" Feb 02 10:51:48 crc kubenswrapper[4909]: I0202 10:51:48.219784 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b55f48d49-pswzg"] Feb 02 10:51:48 crc kubenswrapper[4909]: I0202 10:51:48.223864 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" podUID="113b4dcd-ae8d-4c1e-af62-07441b8665a9" containerName="dnsmasq-dns" containerID="cri-o://56186756abfad7bb6caca95c951c31c1886f2be3bd236a5f0c7de3f7a6271eee" gracePeriod=10 Feb 02 10:51:48 crc kubenswrapper[4909]: I0202 10:51:48.312699 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29af981b-c23d-4740-8a31-a99a24d8a72e","Type":"ContainerStarted","Data":"159e048449683cd43fd333c1e8d4750222f229b72e630edbdc65494e8979573b"} Feb 02 10:51:48 crc kubenswrapper[4909]: I0202 10:51:48.324304 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8588c46577-4cp8s" event={"ID":"d1145da4-90e5-422b-917a-33473a9c5d6a","Type":"ContainerStarted","Data":"07fee0cf291ae485ea00f0668c32744aaec34c29563b1aae25a47955a349b94d"} Feb 02 10:51:48 crc kubenswrapper[4909]: I0202 10:51:48.324353 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8588c46577-4cp8s" event={"ID":"d1145da4-90e5-422b-917a-33473a9c5d6a","Type":"ContainerStarted","Data":"8a5e2b95b89f07eeb7333cda5e4cb6d87b241046d11a832dbb17fcb90f90c063"} Feb 02 10:51:48 crc kubenswrapper[4909]: I0202 10:51:48.325527 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-8588c46577-4cp8s" Feb 02 10:51:48 crc kubenswrapper[4909]: I0202 10:51:48.332157 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 02 10:51:48 crc kubenswrapper[4909]: I0202 10:51:48.370352 4909 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-8588c46577-4cp8s" podStartSLOduration=2.370337197 podStartE2EDuration="2.370337197s" podCreationTimestamp="2026-02-02 10:51:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:51:48.369994738 +0000 UTC m=+1234.116095473" watchObservedRunningTime="2026-02-02 10:51:48.370337197 +0000 UTC m=+1234.116437932" Feb 02 10:51:48 crc kubenswrapper[4909]: I0202 10:51:48.426237 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 10:51:48 crc kubenswrapper[4909]: I0202 10:51:48.808856 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" Feb 02 10:51:48 crc kubenswrapper[4909]: I0202 10:51:48.850360 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-dns-swift-storage-0\") pod \"113b4dcd-ae8d-4c1e-af62-07441b8665a9\" (UID: \"113b4dcd-ae8d-4c1e-af62-07441b8665a9\") " Feb 02 10:51:48 crc kubenswrapper[4909]: I0202 10:51:48.850414 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-config\") pod \"113b4dcd-ae8d-4c1e-af62-07441b8665a9\" (UID: \"113b4dcd-ae8d-4c1e-af62-07441b8665a9\") " Feb 02 10:51:48 crc kubenswrapper[4909]: I0202 10:51:48.850431 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-dns-svc\") pod \"113b4dcd-ae8d-4c1e-af62-07441b8665a9\" (UID: \"113b4dcd-ae8d-4c1e-af62-07441b8665a9\") " Feb 02 10:51:48 crc kubenswrapper[4909]: I0202 10:51:48.850483 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-fkwsb\" (UniqueName: \"kubernetes.io/projected/113b4dcd-ae8d-4c1e-af62-07441b8665a9-kube-api-access-fkwsb\") pod \"113b4dcd-ae8d-4c1e-af62-07441b8665a9\" (UID: \"113b4dcd-ae8d-4c1e-af62-07441b8665a9\") " Feb 02 10:51:48 crc kubenswrapper[4909]: I0202 10:51:48.850505 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-ovsdbserver-nb\") pod \"113b4dcd-ae8d-4c1e-af62-07441b8665a9\" (UID: \"113b4dcd-ae8d-4c1e-af62-07441b8665a9\") " Feb 02 10:51:48 crc kubenswrapper[4909]: I0202 10:51:48.850574 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-ovsdbserver-sb\") pod \"113b4dcd-ae8d-4c1e-af62-07441b8665a9\" (UID: \"113b4dcd-ae8d-4c1e-af62-07441b8665a9\") " Feb 02 10:51:48 crc kubenswrapper[4909]: I0202 10:51:48.871034 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/113b4dcd-ae8d-4c1e-af62-07441b8665a9-kube-api-access-fkwsb" (OuterVolumeSpecName: "kube-api-access-fkwsb") pod "113b4dcd-ae8d-4c1e-af62-07441b8665a9" (UID: "113b4dcd-ae8d-4c1e-af62-07441b8665a9"). InnerVolumeSpecName "kube-api-access-fkwsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:48 crc kubenswrapper[4909]: I0202 10:51:48.903785 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "113b4dcd-ae8d-4c1e-af62-07441b8665a9" (UID: "113b4dcd-ae8d-4c1e-af62-07441b8665a9"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:48 crc kubenswrapper[4909]: I0202 10:51:48.910110 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "113b4dcd-ae8d-4c1e-af62-07441b8665a9" (UID: "113b4dcd-ae8d-4c1e-af62-07441b8665a9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:48 crc kubenswrapper[4909]: I0202 10:51:48.926481 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "113b4dcd-ae8d-4c1e-af62-07441b8665a9" (UID: "113b4dcd-ae8d-4c1e-af62-07441b8665a9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:48 crc kubenswrapper[4909]: I0202 10:51:48.941765 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "113b4dcd-ae8d-4c1e-af62-07441b8665a9" (UID: "113b4dcd-ae8d-4c1e-af62-07441b8665a9"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:48 crc kubenswrapper[4909]: I0202 10:51:48.956936 4909 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:48 crc kubenswrapper[4909]: I0202 10:51:48.956967 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:48 crc kubenswrapper[4909]: I0202 10:51:48.956976 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkwsb\" (UniqueName: \"kubernetes.io/projected/113b4dcd-ae8d-4c1e-af62-07441b8665a9-kube-api-access-fkwsb\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:48 crc kubenswrapper[4909]: I0202 10:51:48.956987 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:48 crc kubenswrapper[4909]: I0202 10:51:48.956998 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:48 crc kubenswrapper[4909]: I0202 10:51:48.984691 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-config" (OuterVolumeSpecName: "config") pod "113b4dcd-ae8d-4c1e-af62-07441b8665a9" (UID: "113b4dcd-ae8d-4c1e-af62-07441b8665a9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:51:49 crc kubenswrapper[4909]: I0202 10:51:49.026128 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae36a22a-0d59-45ce-b94d-0e8657ad4aaa" path="/var/lib/kubelet/pods/ae36a22a-0d59-45ce-b94d-0e8657ad4aaa/volumes" Feb 02 10:51:49 crc kubenswrapper[4909]: I0202 10:51:49.058729 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/113b4dcd-ae8d-4c1e-af62-07441b8665a9-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:49 crc kubenswrapper[4909]: I0202 10:51:49.199239 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-744cf8b8bf-vxfhd" podUID="2e0c3b2f-546a-403b-9dee-bda4c14ab84d" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.152:9696/\": dial tcp 10.217.0.152:9696: connect: connection refused" Feb 02 10:51:49 crc kubenswrapper[4909]: I0202 10:51:49.343302 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29af981b-c23d-4740-8a31-a99a24d8a72e","Type":"ContainerStarted","Data":"f7cfef1b94d4389d3faedb57d59992ae7a239ec96c234e73c093f201219d152e"} Feb 02 10:51:49 crc kubenswrapper[4909]: I0202 10:51:49.345471 4909 generic.go:334] "Generic (PLEG): container finished" podID="113b4dcd-ae8d-4c1e-af62-07441b8665a9" containerID="56186756abfad7bb6caca95c951c31c1886f2be3bd236a5f0c7de3f7a6271eee" exitCode=0 Feb 02 10:51:49 crc kubenswrapper[4909]: I0202 10:51:49.345541 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" Feb 02 10:51:49 crc kubenswrapper[4909]: I0202 10:51:49.345538 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" event={"ID":"113b4dcd-ae8d-4c1e-af62-07441b8665a9","Type":"ContainerDied","Data":"56186756abfad7bb6caca95c951c31c1886f2be3bd236a5f0c7de3f7a6271eee"} Feb 02 10:51:49 crc kubenswrapper[4909]: I0202 10:51:49.345602 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b55f48d49-pswzg" event={"ID":"113b4dcd-ae8d-4c1e-af62-07441b8665a9","Type":"ContainerDied","Data":"1decab96899cf3074facf4fdaa1e6b21e99696e00f9a83539714f2dd9df32f6c"} Feb 02 10:51:49 crc kubenswrapper[4909]: I0202 10:51:49.345625 4909 scope.go:117] "RemoveContainer" containerID="56186756abfad7bb6caca95c951c31c1886f2be3bd236a5f0c7de3f7a6271eee" Feb 02 10:51:49 crc kubenswrapper[4909]: I0202 10:51:49.345657 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e" containerName="cinder-scheduler" containerID="cri-o://043f0ffd414bd3ab9ba36d82ab3bc0a2d6a8de6fc3367ae914aaede9bd56ae6c" gracePeriod=30 Feb 02 10:51:49 crc kubenswrapper[4909]: I0202 10:51:49.345705 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e" containerName="probe" containerID="cri-o://a2fabc53e818e0df147692b67903b56abc757df34656b800aa49da705c2f87d3" gracePeriod=30 Feb 02 10:51:49 crc kubenswrapper[4909]: I0202 10:51:49.376182 4909 scope.go:117] "RemoveContainer" containerID="255c8085517d0198a016ce037cdd64428033c1a53fd6af7181d018c4060182b9" Feb 02 10:51:49 crc kubenswrapper[4909]: I0202 10:51:49.377605 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b55f48d49-pswzg"] Feb 02 10:51:49 crc kubenswrapper[4909]: I0202 10:51:49.386867 4909 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b55f48d49-pswzg"] Feb 02 10:51:49 crc kubenswrapper[4909]: I0202 10:51:49.404956 4909 scope.go:117] "RemoveContainer" containerID="56186756abfad7bb6caca95c951c31c1886f2be3bd236a5f0c7de3f7a6271eee" Feb 02 10:51:49 crc kubenswrapper[4909]: E0202 10:51:49.410286 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56186756abfad7bb6caca95c951c31c1886f2be3bd236a5f0c7de3f7a6271eee\": container with ID starting with 56186756abfad7bb6caca95c951c31c1886f2be3bd236a5f0c7de3f7a6271eee not found: ID does not exist" containerID="56186756abfad7bb6caca95c951c31c1886f2be3bd236a5f0c7de3f7a6271eee" Feb 02 10:51:49 crc kubenswrapper[4909]: I0202 10:51:49.410330 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56186756abfad7bb6caca95c951c31c1886f2be3bd236a5f0c7de3f7a6271eee"} err="failed to get container status \"56186756abfad7bb6caca95c951c31c1886f2be3bd236a5f0c7de3f7a6271eee\": rpc error: code = NotFound desc = could not find container \"56186756abfad7bb6caca95c951c31c1886f2be3bd236a5f0c7de3f7a6271eee\": container with ID starting with 56186756abfad7bb6caca95c951c31c1886f2be3bd236a5f0c7de3f7a6271eee not found: ID does not exist" Feb 02 10:51:49 crc kubenswrapper[4909]: I0202 10:51:49.410355 4909 scope.go:117] "RemoveContainer" containerID="255c8085517d0198a016ce037cdd64428033c1a53fd6af7181d018c4060182b9" Feb 02 10:51:49 crc kubenswrapper[4909]: E0202 10:51:49.410740 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"255c8085517d0198a016ce037cdd64428033c1a53fd6af7181d018c4060182b9\": container with ID starting with 255c8085517d0198a016ce037cdd64428033c1a53fd6af7181d018c4060182b9 not found: ID does not exist" containerID="255c8085517d0198a016ce037cdd64428033c1a53fd6af7181d018c4060182b9" Feb 02 10:51:49 crc 
kubenswrapper[4909]: I0202 10:51:49.410776 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"255c8085517d0198a016ce037cdd64428033c1a53fd6af7181d018c4060182b9"} err="failed to get container status \"255c8085517d0198a016ce037cdd64428033c1a53fd6af7181d018c4060182b9\": rpc error: code = NotFound desc = could not find container \"255c8085517d0198a016ce037cdd64428033c1a53fd6af7181d018c4060182b9\": container with ID starting with 255c8085517d0198a016ce037cdd64428033c1a53fd6af7181d018c4060182b9 not found: ID does not exist" Feb 02 10:51:50 crc kubenswrapper[4909]: I0202 10:51:50.370619 4909 generic.go:334] "Generic (PLEG): container finished" podID="0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e" containerID="a2fabc53e818e0df147692b67903b56abc757df34656b800aa49da705c2f87d3" exitCode=0 Feb 02 10:51:50 crc kubenswrapper[4909]: I0202 10:51:50.370932 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e","Type":"ContainerDied","Data":"a2fabc53e818e0df147692b67903b56abc757df34656b800aa49da705c2f87d3"} Feb 02 10:51:50 crc kubenswrapper[4909]: I0202 10:51:50.375932 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.038597 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="113b4dcd-ae8d-4c1e-af62-07441b8665a9" path="/var/lib/kubelet/pods/113b4dcd-ae8d-4c1e-af62-07441b8665a9/volumes" Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.249827 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-744cf8b8bf-vxfhd" Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.303114 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-internal-tls-certs\") pod \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\" (UID: \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\") " Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.303454 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-public-tls-certs\") pod \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\" (UID: \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\") " Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.303578 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-httpd-config\") pod \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\" (UID: \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\") " Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.303630 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-config\") pod \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\" (UID: \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\") " Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.303665 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-ovndb-tls-certs\") pod \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\" (UID: \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\") " Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.303736 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-combined-ca-bundle\") pod \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\" (UID: \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\") " Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.303777 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9r2d\" (UniqueName: \"kubernetes.io/projected/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-kube-api-access-z9r2d\") pod \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\" (UID: \"2e0c3b2f-546a-403b-9dee-bda4c14ab84d\") " Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.329739 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "2e0c3b2f-546a-403b-9dee-bda4c14ab84d" (UID: "2e0c3b2f-546a-403b-9dee-bda4c14ab84d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.330764 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-kube-api-access-z9r2d" (OuterVolumeSpecName: "kube-api-access-z9r2d") pod "2e0c3b2f-546a-403b-9dee-bda4c14ab84d" (UID: "2e0c3b2f-546a-403b-9dee-bda4c14ab84d"). InnerVolumeSpecName "kube-api-access-z9r2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.365426 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2e0c3b2f-546a-403b-9dee-bda4c14ab84d" (UID: "2e0c3b2f-546a-403b-9dee-bda4c14ab84d"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.366575 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e0c3b2f-546a-403b-9dee-bda4c14ab84d" (UID: "2e0c3b2f-546a-403b-9dee-bda4c14ab84d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.382255 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2e0c3b2f-546a-403b-9dee-bda4c14ab84d" (UID: "2e0c3b2f-546a-403b-9dee-bda4c14ab84d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.382668 4909 generic.go:334] "Generic (PLEG): container finished" podID="2e0c3b2f-546a-403b-9dee-bda4c14ab84d" containerID="d9b7e7c2d82ea6bc4b113faefe37915f651c5dc9bd63468cbb8600d907ba119e" exitCode=0 Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.382744 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-744cf8b8bf-vxfhd" event={"ID":"2e0c3b2f-546a-403b-9dee-bda4c14ab84d","Type":"ContainerDied","Data":"d9b7e7c2d82ea6bc4b113faefe37915f651c5dc9bd63468cbb8600d907ba119e"} Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.382778 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-744cf8b8bf-vxfhd" event={"ID":"2e0c3b2f-546a-403b-9dee-bda4c14ab84d","Type":"ContainerDied","Data":"02c7bbeecb89a33f46ebbe19f782d307e40f5060a30bd0e150f918b3bb551bed"} Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.382796 4909 scope.go:117] "RemoveContainer" containerID="e492024f10acf7ea7ef2c1ebfebff32d8036b6ed9959ea1ae9ac17b82834ea3c" 
Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.382837 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-744cf8b8bf-vxfhd" Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.387934 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-config" (OuterVolumeSpecName: "config") pod "2e0c3b2f-546a-403b-9dee-bda4c14ab84d" (UID: "2e0c3b2f-546a-403b-9dee-bda4c14ab84d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.390507 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29af981b-c23d-4740-8a31-a99a24d8a72e","Type":"ContainerStarted","Data":"adf6e523e8a897c955bc892fa4d418fd9055014bb080fcf7dfd32e6f684e6efd"} Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.391723 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.402191 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "2e0c3b2f-546a-403b-9dee-bda4c14ab84d" (UID: "2e0c3b2f-546a-403b-9dee-bda4c14ab84d"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.406644 4909 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.406683 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.406713 4909 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.406725 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.406735 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9r2d\" (UniqueName: \"kubernetes.io/projected/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-kube-api-access-z9r2d\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.406743 4909 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.406752 4909 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e0c3b2f-546a-403b-9dee-bda4c14ab84d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.423013 4909 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.097212638 podStartE2EDuration="6.422990244s" podCreationTimestamp="2026-02-02 10:51:45 +0000 UTC" firstStartedPulling="2026-02-02 10:51:46.1453316 +0000 UTC m=+1231.891432335" lastFinishedPulling="2026-02-02 10:51:50.471109206 +0000 UTC m=+1236.217209941" observedRunningTime="2026-02-02 10:51:51.414724919 +0000 UTC m=+1237.160825654" watchObservedRunningTime="2026-02-02 10:51:51.422990244 +0000 UTC m=+1237.169090979" Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.502031 4909 scope.go:117] "RemoveContainer" containerID="d9b7e7c2d82ea6bc4b113faefe37915f651c5dc9bd63468cbb8600d907ba119e" Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.535028 4909 scope.go:117] "RemoveContainer" containerID="e492024f10acf7ea7ef2c1ebfebff32d8036b6ed9959ea1ae9ac17b82834ea3c" Feb 02 10:51:51 crc kubenswrapper[4909]: E0202 10:51:51.535396 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e492024f10acf7ea7ef2c1ebfebff32d8036b6ed9959ea1ae9ac17b82834ea3c\": container with ID starting with e492024f10acf7ea7ef2c1ebfebff32d8036b6ed9959ea1ae9ac17b82834ea3c not found: ID does not exist" containerID="e492024f10acf7ea7ef2c1ebfebff32d8036b6ed9959ea1ae9ac17b82834ea3c" Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.535430 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e492024f10acf7ea7ef2c1ebfebff32d8036b6ed9959ea1ae9ac17b82834ea3c"} err="failed to get container status \"e492024f10acf7ea7ef2c1ebfebff32d8036b6ed9959ea1ae9ac17b82834ea3c\": rpc error: code = NotFound desc = could not find container \"e492024f10acf7ea7ef2c1ebfebff32d8036b6ed9959ea1ae9ac17b82834ea3c\": container with ID starting with e492024f10acf7ea7ef2c1ebfebff32d8036b6ed9959ea1ae9ac17b82834ea3c not found: ID does not exist" Feb 02 10:51:51 crc kubenswrapper[4909]: 
I0202 10:51:51.535457 4909 scope.go:117] "RemoveContainer" containerID="d9b7e7c2d82ea6bc4b113faefe37915f651c5dc9bd63468cbb8600d907ba119e" Feb 02 10:51:51 crc kubenswrapper[4909]: E0202 10:51:51.535922 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9b7e7c2d82ea6bc4b113faefe37915f651c5dc9bd63468cbb8600d907ba119e\": container with ID starting with d9b7e7c2d82ea6bc4b113faefe37915f651c5dc9bd63468cbb8600d907ba119e not found: ID does not exist" containerID="d9b7e7c2d82ea6bc4b113faefe37915f651c5dc9bd63468cbb8600d907ba119e" Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.535954 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9b7e7c2d82ea6bc4b113faefe37915f651c5dc9bd63468cbb8600d907ba119e"} err="failed to get container status \"d9b7e7c2d82ea6bc4b113faefe37915f651c5dc9bd63468cbb8600d907ba119e\": rpc error: code = NotFound desc = could not find container \"d9b7e7c2d82ea6bc4b113faefe37915f651c5dc9bd63468cbb8600d907ba119e\": container with ID starting with d9b7e7c2d82ea6bc4b113faefe37915f651c5dc9bd63468cbb8600d907ba119e not found: ID does not exist" Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.721591 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-744cf8b8bf-vxfhd"] Feb 02 10:51:51 crc kubenswrapper[4909]: I0202 10:51:51.724036 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-744cf8b8bf-vxfhd"] Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.038474 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e0c3b2f-546a-403b-9dee-bda4c14ab84d" path="/var/lib/kubelet/pods/2e0c3b2f-546a-403b-9dee-bda4c14ab84d/volumes" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.223857 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.345197 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-config-data-custom\") pod \"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\" (UID: \"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\") " Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.345327 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-combined-ca-bundle\") pod \"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\" (UID: \"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\") " Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.345363 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-scripts\") pod \"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\" (UID: \"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\") " Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.345439 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p282m\" (UniqueName: \"kubernetes.io/projected/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-kube-api-access-p282m\") pod \"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\" (UID: \"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\") " Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.345465 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-config-data\") pod \"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\" (UID: \"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\") " Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.345602 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-etc-machine-id\") pod \"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\" (UID: \"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e\") " Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.345833 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e" (UID: "0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.346078 4909 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.351437 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-scripts" (OuterVolumeSpecName: "scripts") pod "0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e" (UID: "0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.351482 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e" (UID: "0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.355854 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-kube-api-access-p282m" (OuterVolumeSpecName: "kube-api-access-p282m") pod "0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e" (UID: "0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e"). InnerVolumeSpecName "kube-api-access-p282m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.395587 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e" (UID: "0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.412557 4909 generic.go:334] "Generic (PLEG): container finished" podID="0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e" containerID="043f0ffd414bd3ab9ba36d82ab3bc0a2d6a8de6fc3367ae914aaede9bd56ae6c" exitCode=0 Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.412585 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.412633 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e","Type":"ContainerDied","Data":"043f0ffd414bd3ab9ba36d82ab3bc0a2d6a8de6fc3367ae914aaede9bd56ae6c"} Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.412663 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e","Type":"ContainerDied","Data":"97c3ea1eedc67c253593b6fcff9738e7ca264a654af0353f601543c037dfa731"} Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.412889 4909 scope.go:117] "RemoveContainer" containerID="a2fabc53e818e0df147692b67903b56abc757df34656b800aa49da705c2f87d3" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.447988 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.448938 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.449050 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.449152 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p282m\" (UniqueName: \"kubernetes.io/projected/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-kube-api-access-p282m\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.458672 4909 
scope.go:117] "RemoveContainer" containerID="043f0ffd414bd3ab9ba36d82ab3bc0a2d6a8de6fc3367ae914aaede9bd56ae6c" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.481077 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-config-data" (OuterVolumeSpecName: "config-data") pod "0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e" (UID: "0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.483263 4909 scope.go:117] "RemoveContainer" containerID="a2fabc53e818e0df147692b67903b56abc757df34656b800aa49da705c2f87d3" Feb 02 10:51:53 crc kubenswrapper[4909]: E0202 10:51:53.483756 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2fabc53e818e0df147692b67903b56abc757df34656b800aa49da705c2f87d3\": container with ID starting with a2fabc53e818e0df147692b67903b56abc757df34656b800aa49da705c2f87d3 not found: ID does not exist" containerID="a2fabc53e818e0df147692b67903b56abc757df34656b800aa49da705c2f87d3" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.483816 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2fabc53e818e0df147692b67903b56abc757df34656b800aa49da705c2f87d3"} err="failed to get container status \"a2fabc53e818e0df147692b67903b56abc757df34656b800aa49da705c2f87d3\": rpc error: code = NotFound desc = could not find container \"a2fabc53e818e0df147692b67903b56abc757df34656b800aa49da705c2f87d3\": container with ID starting with a2fabc53e818e0df147692b67903b56abc757df34656b800aa49da705c2f87d3 not found: ID does not exist" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.483849 4909 scope.go:117] "RemoveContainer" containerID="043f0ffd414bd3ab9ba36d82ab3bc0a2d6a8de6fc3367ae914aaede9bd56ae6c" Feb 02 10:51:53 crc kubenswrapper[4909]: 
E0202 10:51:53.484199 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"043f0ffd414bd3ab9ba36d82ab3bc0a2d6a8de6fc3367ae914aaede9bd56ae6c\": container with ID starting with 043f0ffd414bd3ab9ba36d82ab3bc0a2d6a8de6fc3367ae914aaede9bd56ae6c not found: ID does not exist" containerID="043f0ffd414bd3ab9ba36d82ab3bc0a2d6a8de6fc3367ae914aaede9bd56ae6c" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.484240 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"043f0ffd414bd3ab9ba36d82ab3bc0a2d6a8de6fc3367ae914aaede9bd56ae6c"} err="failed to get container status \"043f0ffd414bd3ab9ba36d82ab3bc0a2d6a8de6fc3367ae914aaede9bd56ae6c\": rpc error: code = NotFound desc = could not find container \"043f0ffd414bd3ab9ba36d82ab3bc0a2d6a8de6fc3367ae914aaede9bd56ae6c\": container with ID starting with 043f0ffd414bd3ab9ba36d82ab3bc0a2d6a8de6fc3367ae914aaede9bd56ae6c not found: ID does not exist" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.551010 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.746856 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.761038 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.772198 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 10:51:53 crc kubenswrapper[4909]: E0202 10:51:53.772557 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae36a22a-0d59-45ce-b94d-0e8657ad4aaa" containerName="barbican-api" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.772572 
4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae36a22a-0d59-45ce-b94d-0e8657ad4aaa" containerName="barbican-api" Feb 02 10:51:53 crc kubenswrapper[4909]: E0202 10:51:53.772590 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e" containerName="cinder-scheduler" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.772599 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e" containerName="cinder-scheduler" Feb 02 10:51:53 crc kubenswrapper[4909]: E0202 10:51:53.772615 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="113b4dcd-ae8d-4c1e-af62-07441b8665a9" containerName="init" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.772622 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="113b4dcd-ae8d-4c1e-af62-07441b8665a9" containerName="init" Feb 02 10:51:53 crc kubenswrapper[4909]: E0202 10:51:53.772635 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="113b4dcd-ae8d-4c1e-af62-07441b8665a9" containerName="dnsmasq-dns" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.772640 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="113b4dcd-ae8d-4c1e-af62-07441b8665a9" containerName="dnsmasq-dns" Feb 02 10:51:53 crc kubenswrapper[4909]: E0202 10:51:53.772652 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae36a22a-0d59-45ce-b94d-0e8657ad4aaa" containerName="barbican-api-log" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.772658 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae36a22a-0d59-45ce-b94d-0e8657ad4aaa" containerName="barbican-api-log" Feb 02 10:51:53 crc kubenswrapper[4909]: E0202 10:51:53.772665 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e0c3b2f-546a-403b-9dee-bda4c14ab84d" containerName="neutron-api" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.772670 4909 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="2e0c3b2f-546a-403b-9dee-bda4c14ab84d" containerName="neutron-api" Feb 02 10:51:53 crc kubenswrapper[4909]: E0202 10:51:53.772683 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e0c3b2f-546a-403b-9dee-bda4c14ab84d" containerName="neutron-httpd" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.772689 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e0c3b2f-546a-403b-9dee-bda4c14ab84d" containerName="neutron-httpd" Feb 02 10:51:53 crc kubenswrapper[4909]: E0202 10:51:53.772714 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e" containerName="probe" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.772719 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e" containerName="probe" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.772892 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae36a22a-0d59-45ce-b94d-0e8657ad4aaa" containerName="barbican-api" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.772902 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e" containerName="cinder-scheduler" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.772910 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e0c3b2f-546a-403b-9dee-bda4c14ab84d" containerName="neutron-api" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.772919 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e0c3b2f-546a-403b-9dee-bda4c14ab84d" containerName="neutron-httpd" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.772928 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae36a22a-0d59-45ce-b94d-0e8657ad4aaa" containerName="barbican-api-log" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.772949 4909 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="113b4dcd-ae8d-4c1e-af62-07441b8665a9" containerName="dnsmasq-dns" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.772962 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e" containerName="probe" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.773794 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.779353 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.790390 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.857870 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b24f572-8a70-4a46-b3cf-e50ae4859892-scripts\") pod \"cinder-scheduler-0\" (UID: \"9b24f572-8a70-4a46-b3cf-e50ae4859892\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.857952 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b24f572-8a70-4a46-b3cf-e50ae4859892-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9b24f572-8a70-4a46-b3cf-e50ae4859892\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.858046 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9b24f572-8a70-4a46-b3cf-e50ae4859892-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9b24f572-8a70-4a46-b3cf-e50ae4859892\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.858087 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b24f572-8a70-4a46-b3cf-e50ae4859892-config-data\") pod \"cinder-scheduler-0\" (UID: \"9b24f572-8a70-4a46-b3cf-e50ae4859892\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.858110 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsh84\" (UniqueName: \"kubernetes.io/projected/9b24f572-8a70-4a46-b3cf-e50ae4859892-kube-api-access-lsh84\") pod \"cinder-scheduler-0\" (UID: \"9b24f572-8a70-4a46-b3cf-e50ae4859892\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.858133 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b24f572-8a70-4a46-b3cf-e50ae4859892-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9b24f572-8a70-4a46-b3cf-e50ae4859892\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.959676 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b24f572-8a70-4a46-b3cf-e50ae4859892-config-data\") pod \"cinder-scheduler-0\" (UID: \"9b24f572-8a70-4a46-b3cf-e50ae4859892\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.959745 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsh84\" (UniqueName: \"kubernetes.io/projected/9b24f572-8a70-4a46-b3cf-e50ae4859892-kube-api-access-lsh84\") pod \"cinder-scheduler-0\" (UID: \"9b24f572-8a70-4a46-b3cf-e50ae4859892\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.959767 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b24f572-8a70-4a46-b3cf-e50ae4859892-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9b24f572-8a70-4a46-b3cf-e50ae4859892\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.959842 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b24f572-8a70-4a46-b3cf-e50ae4859892-scripts\") pod \"cinder-scheduler-0\" (UID: \"9b24f572-8a70-4a46-b3cf-e50ae4859892\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.959907 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b24f572-8a70-4a46-b3cf-e50ae4859892-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9b24f572-8a70-4a46-b3cf-e50ae4859892\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.959978 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9b24f572-8a70-4a46-b3cf-e50ae4859892-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9b24f572-8a70-4a46-b3cf-e50ae4859892\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.960069 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9b24f572-8a70-4a46-b3cf-e50ae4859892-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9b24f572-8a70-4a46-b3cf-e50ae4859892\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.965676 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b24f572-8a70-4a46-b3cf-e50ae4859892-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"9b24f572-8a70-4a46-b3cf-e50ae4859892\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.966742 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b24f572-8a70-4a46-b3cf-e50ae4859892-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9b24f572-8a70-4a46-b3cf-e50ae4859892\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.968868 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b24f572-8a70-4a46-b3cf-e50ae4859892-config-data\") pod \"cinder-scheduler-0\" (UID: \"9b24f572-8a70-4a46-b3cf-e50ae4859892\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.970754 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b24f572-8a70-4a46-b3cf-e50ae4859892-scripts\") pod \"cinder-scheduler-0\" (UID: \"9b24f572-8a70-4a46-b3cf-e50ae4859892\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:53 crc kubenswrapper[4909]: I0202 10:51:53.984266 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsh84\" (UniqueName: \"kubernetes.io/projected/9b24f572-8a70-4a46-b3cf-e50ae4859892-kube-api-access-lsh84\") pod \"cinder-scheduler-0\" (UID: \"9b24f572-8a70-4a46-b3cf-e50ae4859892\") " pod="openstack/cinder-scheduler-0" Feb 02 10:51:54 crc kubenswrapper[4909]: I0202 10:51:54.123497 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 02 10:51:54 crc kubenswrapper[4909]: I0202 10:51:54.652993 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 10:51:54 crc kubenswrapper[4909]: W0202 10:51:54.676774 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b24f572_8a70_4a46_b3cf_e50ae4859892.slice/crio-a5a62e112f12cc73a9b1a64250d6c437a2ab6a6453b3ed40c4fdc0c5c3f85c8e WatchSource:0}: Error finding container a5a62e112f12cc73a9b1a64250d6c437a2ab6a6453b3ed40c4fdc0c5c3f85c8e: Status 404 returned error can't find the container with id a5a62e112f12cc73a9b1a64250d6c437a2ab6a6453b3ed40c4fdc0c5c3f85c8e Feb 02 10:51:54 crc kubenswrapper[4909]: I0202 10:51:54.824470 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-54c465c874-5jkf8" Feb 02 10:51:54 crc kubenswrapper[4909]: I0202 10:51:54.825963 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-54c465c874-5jkf8" Feb 02 10:51:55 crc kubenswrapper[4909]: I0202 10:51:55.038138 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e" path="/var/lib/kubelet/pods/0a11ec40-dcc2-48a4-81a1-5f0e3baccf2e/volumes" Feb 02 10:51:55 crc kubenswrapper[4909]: I0202 10:51:55.435211 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9b24f572-8a70-4a46-b3cf-e50ae4859892","Type":"ContainerStarted","Data":"e29f67614fb9cccc18985f89282ec44019b367ff92f09a6e3769b0344cbb1193"} Feb 02 10:51:55 crc kubenswrapper[4909]: I0202 10:51:55.437332 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9b24f572-8a70-4a46-b3cf-e50ae4859892","Type":"ContainerStarted","Data":"a5a62e112f12cc73a9b1a64250d6c437a2ab6a6453b3ed40c4fdc0c5c3f85c8e"} Feb 02 10:51:55 crc 
kubenswrapper[4909]: I0202 10:51:55.671306 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:51:56 crc kubenswrapper[4909]: I0202 10:51:56.455135 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9b24f572-8a70-4a46-b3cf-e50ae4859892","Type":"ContainerStarted","Data":"5181b71ba760ed91aeb7b3915c3d7814e4e85e3d4983092599caaa351d56bd8b"} Feb 02 10:51:56 crc kubenswrapper[4909]: I0202 10:51:56.482277 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.482259065 podStartE2EDuration="3.482259065s" podCreationTimestamp="2026-02-02 10:51:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:51:56.477286744 +0000 UTC m=+1242.223387489" watchObservedRunningTime="2026-02-02 10:51:56.482259065 +0000 UTC m=+1242.228359800" Feb 02 10:51:58 crc kubenswrapper[4909]: I0202 10:51:58.897837 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-d64c85fd5-nns29"] Feb 02 10:51:58 crc kubenswrapper[4909]: I0202 10:51:58.899961 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:51:58 crc kubenswrapper[4909]: I0202 10:51:58.901751 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 02 10:51:58 crc kubenswrapper[4909]: I0202 10:51:58.903196 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 02 10:51:58 crc kubenswrapper[4909]: I0202 10:51:58.908502 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 02 10:51:58 crc kubenswrapper[4909]: I0202 10:51:58.915448 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-d64c85fd5-nns29"] Feb 02 10:51:58 crc kubenswrapper[4909]: I0202 10:51:58.977062 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40590ae8-432b-4d01-a586-f61f07a206b0-combined-ca-bundle\") pod \"swift-proxy-d64c85fd5-nns29\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:51:58 crc kubenswrapper[4909]: I0202 10:51:58.977158 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40590ae8-432b-4d01-a586-f61f07a206b0-run-httpd\") pod \"swift-proxy-d64c85fd5-nns29\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:51:58 crc kubenswrapper[4909]: I0202 10:51:58.977301 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40590ae8-432b-4d01-a586-f61f07a206b0-log-httpd\") pod \"swift-proxy-d64c85fd5-nns29\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:51:58 crc kubenswrapper[4909]: I0202 10:51:58.977338 
4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40590ae8-432b-4d01-a586-f61f07a206b0-public-tls-certs\") pod \"swift-proxy-d64c85fd5-nns29\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:51:58 crc kubenswrapper[4909]: I0202 10:51:58.977374 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fw6x\" (UniqueName: \"kubernetes.io/projected/40590ae8-432b-4d01-a586-f61f07a206b0-kube-api-access-8fw6x\") pod \"swift-proxy-d64c85fd5-nns29\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:51:58 crc kubenswrapper[4909]: I0202 10:51:58.977513 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40590ae8-432b-4d01-a586-f61f07a206b0-config-data\") pod \"swift-proxy-d64c85fd5-nns29\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:51:58 crc kubenswrapper[4909]: I0202 10:51:58.977547 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/40590ae8-432b-4d01-a586-f61f07a206b0-etc-swift\") pod \"swift-proxy-d64c85fd5-nns29\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:51:58 crc kubenswrapper[4909]: I0202 10:51:58.977578 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40590ae8-432b-4d01-a586-f61f07a206b0-internal-tls-certs\") pod \"swift-proxy-d64c85fd5-nns29\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:51:59 crc 
kubenswrapper[4909]: I0202 10:51:59.080955 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40590ae8-432b-4d01-a586-f61f07a206b0-public-tls-certs\") pod \"swift-proxy-d64c85fd5-nns29\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:51:59 crc kubenswrapper[4909]: I0202 10:51:59.081043 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fw6x\" (UniqueName: \"kubernetes.io/projected/40590ae8-432b-4d01-a586-f61f07a206b0-kube-api-access-8fw6x\") pod \"swift-proxy-d64c85fd5-nns29\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:51:59 crc kubenswrapper[4909]: I0202 10:51:59.081213 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40590ae8-432b-4d01-a586-f61f07a206b0-config-data\") pod \"swift-proxy-d64c85fd5-nns29\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:51:59 crc kubenswrapper[4909]: I0202 10:51:59.081246 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/40590ae8-432b-4d01-a586-f61f07a206b0-etc-swift\") pod \"swift-proxy-d64c85fd5-nns29\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:51:59 crc kubenswrapper[4909]: I0202 10:51:59.081283 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40590ae8-432b-4d01-a586-f61f07a206b0-internal-tls-certs\") pod \"swift-proxy-d64c85fd5-nns29\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:51:59 crc kubenswrapper[4909]: I0202 10:51:59.081337 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40590ae8-432b-4d01-a586-f61f07a206b0-combined-ca-bundle\") pod \"swift-proxy-d64c85fd5-nns29\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:51:59 crc kubenswrapper[4909]: I0202 10:51:59.081417 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40590ae8-432b-4d01-a586-f61f07a206b0-run-httpd\") pod \"swift-proxy-d64c85fd5-nns29\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:51:59 crc kubenswrapper[4909]: I0202 10:51:59.081568 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40590ae8-432b-4d01-a586-f61f07a206b0-log-httpd\") pod \"swift-proxy-d64c85fd5-nns29\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:51:59 crc kubenswrapper[4909]: I0202 10:51:59.082074 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40590ae8-432b-4d01-a586-f61f07a206b0-log-httpd\") pod \"swift-proxy-d64c85fd5-nns29\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:51:59 crc kubenswrapper[4909]: I0202 10:51:59.082368 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40590ae8-432b-4d01-a586-f61f07a206b0-run-httpd\") pod \"swift-proxy-d64c85fd5-nns29\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:51:59 crc kubenswrapper[4909]: I0202 10:51:59.087463 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/40590ae8-432b-4d01-a586-f61f07a206b0-combined-ca-bundle\") pod \"swift-proxy-d64c85fd5-nns29\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:51:59 crc kubenswrapper[4909]: I0202 10:51:59.090027 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40590ae8-432b-4d01-a586-f61f07a206b0-config-data\") pod \"swift-proxy-d64c85fd5-nns29\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:51:59 crc kubenswrapper[4909]: I0202 10:51:59.095144 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40590ae8-432b-4d01-a586-f61f07a206b0-public-tls-certs\") pod \"swift-proxy-d64c85fd5-nns29\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:51:59 crc kubenswrapper[4909]: I0202 10:51:59.095780 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40590ae8-432b-4d01-a586-f61f07a206b0-internal-tls-certs\") pod \"swift-proxy-d64c85fd5-nns29\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:51:59 crc kubenswrapper[4909]: I0202 10:51:59.096700 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/40590ae8-432b-4d01-a586-f61f07a206b0-etc-swift\") pod \"swift-proxy-d64c85fd5-nns29\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:51:59 crc kubenswrapper[4909]: I0202 10:51:59.107576 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fw6x\" (UniqueName: \"kubernetes.io/projected/40590ae8-432b-4d01-a586-f61f07a206b0-kube-api-access-8fw6x\") pod 
\"swift-proxy-d64c85fd5-nns29\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:51:59 crc kubenswrapper[4909]: I0202 10:51:59.124562 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 02 10:51:59 crc kubenswrapper[4909]: I0202 10:51:59.221575 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:51:59 crc kubenswrapper[4909]: I0202 10:51:59.368511 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5d56d8dff8-fh9sw" Feb 02 10:51:59 crc kubenswrapper[4909]: I0202 10:51:59.369323 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5d56d8dff8-fh9sw" Feb 02 10:51:59 crc kubenswrapper[4909]: I0202 10:51:59.456779 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-54c465c874-5jkf8"] Feb 02 10:51:59 crc kubenswrapper[4909]: I0202 10:51:59.457106 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-54c465c874-5jkf8" podUID="a4a64068-38da-44e0-99a6-93aa570aef32" containerName="placement-log" containerID="cri-o://6b3c5fad03e8dcf1048a837c474f22709297535e0f64315bda7d7454a41aba76" gracePeriod=30 Feb 02 10:51:59 crc kubenswrapper[4909]: I0202 10:51:59.457483 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-54c465c874-5jkf8" podUID="a4a64068-38da-44e0-99a6-93aa570aef32" containerName="placement-api" containerID="cri-o://68e3348a64665854c90ff9f1880c72e41e44fbceb2cafe25c97f0867483d6f6a" gracePeriod=30 Feb 02 10:51:59 crc kubenswrapper[4909]: I0202 10:51:59.823212 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-d64c85fd5-nns29"] Feb 02 10:51:59 crc kubenswrapper[4909]: W0202 10:51:59.826520 4909 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40590ae8_432b_4d01_a586_f61f07a206b0.slice/crio-b424138c0c1add112c48a249aab00d403a2069f30efdfff11dfbcc1d85d30b5d WatchSource:0}: Error finding container b424138c0c1add112c48a249aab00d403a2069f30efdfff11dfbcc1d85d30b5d: Status 404 returned error can't find the container with id b424138c0c1add112c48a249aab00d403a2069f30efdfff11dfbcc1d85d30b5d Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.161771 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.162998 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.170203 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-t55g6" Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.170477 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.170626 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.187443 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.215392 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c8dba959-faf4-4f15-96d3-e8f67ae00d62-openstack-config\") pod \"openstackclient\" (UID: \"c8dba959-faf4-4f15-96d3-e8f67ae00d62\") " pod="openstack/openstackclient" Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.215780 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-m6vpm\" (UniqueName: \"kubernetes.io/projected/c8dba959-faf4-4f15-96d3-e8f67ae00d62-kube-api-access-m6vpm\") pod \"openstackclient\" (UID: \"c8dba959-faf4-4f15-96d3-e8f67ae00d62\") " pod="openstack/openstackclient" Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.215891 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8dba959-faf4-4f15-96d3-e8f67ae00d62-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c8dba959-faf4-4f15-96d3-e8f67ae00d62\") " pod="openstack/openstackclient" Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.216064 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c8dba959-faf4-4f15-96d3-e8f67ae00d62-openstack-config-secret\") pod \"openstackclient\" (UID: \"c8dba959-faf4-4f15-96d3-e8f67ae00d62\") " pod="openstack/openstackclient" Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.218525 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.218993 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="29af981b-c23d-4740-8a31-a99a24d8a72e" containerName="ceilometer-central-agent" containerID="cri-o://5aa33b9e58ad58f9e4a3c5511ff5e0cdd09b526c823272b15202f9aba1483f6b" gracePeriod=30 Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.221061 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="29af981b-c23d-4740-8a31-a99a24d8a72e" containerName="proxy-httpd" containerID="cri-o://adf6e523e8a897c955bc892fa4d418fd9055014bb080fcf7dfd32e6f684e6efd" gracePeriod=30 Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.221412 4909 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="29af981b-c23d-4740-8a31-a99a24d8a72e" containerName="sg-core" containerID="cri-o://f7cfef1b94d4389d3faedb57d59992ae7a239ec96c234e73c093f201219d152e" gracePeriod=30 Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.221462 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="29af981b-c23d-4740-8a31-a99a24d8a72e" containerName="ceilometer-notification-agent" containerID="cri-o://159e048449683cd43fd333c1e8d4750222f229b72e630edbdc65494e8979573b" gracePeriod=30 Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.317482 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6vpm\" (UniqueName: \"kubernetes.io/projected/c8dba959-faf4-4f15-96d3-e8f67ae00d62-kube-api-access-m6vpm\") pod \"openstackclient\" (UID: \"c8dba959-faf4-4f15-96d3-e8f67ae00d62\") " pod="openstack/openstackclient" Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.317774 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8dba959-faf4-4f15-96d3-e8f67ae00d62-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c8dba959-faf4-4f15-96d3-e8f67ae00d62\") " pod="openstack/openstackclient" Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.317998 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c8dba959-faf4-4f15-96d3-e8f67ae00d62-openstack-config-secret\") pod \"openstackclient\" (UID: \"c8dba959-faf4-4f15-96d3-e8f67ae00d62\") " pod="openstack/openstackclient" Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.318175 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c8dba959-faf4-4f15-96d3-e8f67ae00d62-openstack-config\") pod \"openstackclient\" (UID: 
\"c8dba959-faf4-4f15-96d3-e8f67ae00d62\") " pod="openstack/openstackclient" Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.319077 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c8dba959-faf4-4f15-96d3-e8f67ae00d62-openstack-config\") pod \"openstackclient\" (UID: \"c8dba959-faf4-4f15-96d3-e8f67ae00d62\") " pod="openstack/openstackclient" Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.322708 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8dba959-faf4-4f15-96d3-e8f67ae00d62-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c8dba959-faf4-4f15-96d3-e8f67ae00d62\") " pod="openstack/openstackclient" Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.330424 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c8dba959-faf4-4f15-96d3-e8f67ae00d62-openstack-config-secret\") pod \"openstackclient\" (UID: \"c8dba959-faf4-4f15-96d3-e8f67ae00d62\") " pod="openstack/openstackclient" Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.335305 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="29af981b-c23d-4740-8a31-a99a24d8a72e" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.166:3000/\": read tcp 10.217.0.2:51290->10.217.0.166:3000: read: connection reset by peer" Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.339411 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6vpm\" (UniqueName: \"kubernetes.io/projected/c8dba959-faf4-4f15-96d3-e8f67ae00d62-kube-api-access-m6vpm\") pod \"openstackclient\" (UID: \"c8dba959-faf4-4f15-96d3-e8f67ae00d62\") " pod="openstack/openstackclient" Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.517578 4909 generic.go:334] "Generic 
(PLEG): container finished" podID="a4a64068-38da-44e0-99a6-93aa570aef32" containerID="6b3c5fad03e8dcf1048a837c474f22709297535e0f64315bda7d7454a41aba76" exitCode=143 Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.517878 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54c465c874-5jkf8" event={"ID":"a4a64068-38da-44e0-99a6-93aa570aef32","Type":"ContainerDied","Data":"6b3c5fad03e8dcf1048a837c474f22709297535e0f64315bda7d7454a41aba76"} Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.524755 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d64c85fd5-nns29" event={"ID":"40590ae8-432b-4d01-a586-f61f07a206b0","Type":"ContainerStarted","Data":"35a67f40f9a2ea8483465449c6307919ab0017c539599beb47d55125999c2929"} Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.524824 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d64c85fd5-nns29" event={"ID":"40590ae8-432b-4d01-a586-f61f07a206b0","Type":"ContainerStarted","Data":"811b1c8d9db9cb5f0ee50813644e170eb643be43ad3fd1cf8b8def4d133ce9a4"} Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.524838 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d64c85fd5-nns29" event={"ID":"40590ae8-432b-4d01-a586-f61f07a206b0","Type":"ContainerStarted","Data":"b424138c0c1add112c48a249aab00d403a2069f30efdfff11dfbcc1d85d30b5d"} Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.524883 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.524906 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.531619 4909 generic.go:334] "Generic (PLEG): container finished" podID="29af981b-c23d-4740-8a31-a99a24d8a72e" 
containerID="adf6e523e8a897c955bc892fa4d418fd9055014bb080fcf7dfd32e6f684e6efd" exitCode=0 Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.531658 4909 generic.go:334] "Generic (PLEG): container finished" podID="29af981b-c23d-4740-8a31-a99a24d8a72e" containerID="f7cfef1b94d4389d3faedb57d59992ae7a239ec96c234e73c093f201219d152e" exitCode=2 Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.531679 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29af981b-c23d-4740-8a31-a99a24d8a72e","Type":"ContainerDied","Data":"adf6e523e8a897c955bc892fa4d418fd9055014bb080fcf7dfd32e6f684e6efd"} Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.531756 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29af981b-c23d-4740-8a31-a99a24d8a72e","Type":"ContainerDied","Data":"f7cfef1b94d4389d3faedb57d59992ae7a239ec96c234e73c093f201219d152e"} Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.556660 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-d64c85fd5-nns29" podStartSLOduration=2.556642083 podStartE2EDuration="2.556642083s" podCreationTimestamp="2026-02-02 10:51:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:52:00.552386012 +0000 UTC m=+1246.298486747" watchObservedRunningTime="2026-02-02 10:52:00.556642083 +0000 UTC m=+1246.302742818" Feb 02 10:52:00 crc kubenswrapper[4909]: I0202 10:52:00.569880 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.033126 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.052437 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.140592 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29af981b-c23d-4740-8a31-a99a24d8a72e-log-httpd\") pod \"29af981b-c23d-4740-8a31-a99a24d8a72e\" (UID: \"29af981b-c23d-4740-8a31-a99a24d8a72e\") " Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.141065 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29af981b-c23d-4740-8a31-a99a24d8a72e-run-httpd\") pod \"29af981b-c23d-4740-8a31-a99a24d8a72e\" (UID: \"29af981b-c23d-4740-8a31-a99a24d8a72e\") " Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.141093 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29af981b-c23d-4740-8a31-a99a24d8a72e-scripts\") pod \"29af981b-c23d-4740-8a31-a99a24d8a72e\" (UID: \"29af981b-c23d-4740-8a31-a99a24d8a72e\") " Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.141154 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9glsv\" (UniqueName: \"kubernetes.io/projected/29af981b-c23d-4740-8a31-a99a24d8a72e-kube-api-access-9glsv\") pod \"29af981b-c23d-4740-8a31-a99a24d8a72e\" (UID: \"29af981b-c23d-4740-8a31-a99a24d8a72e\") " Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.141188 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29af981b-c23d-4740-8a31-a99a24d8a72e-config-data\") pod \"29af981b-c23d-4740-8a31-a99a24d8a72e\" (UID: \"29af981b-c23d-4740-8a31-a99a24d8a72e\") " Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.141221 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/29af981b-c23d-4740-8a31-a99a24d8a72e-sg-core-conf-yaml\") pod \"29af981b-c23d-4740-8a31-a99a24d8a72e\" (UID: \"29af981b-c23d-4740-8a31-a99a24d8a72e\") " Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.141269 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29af981b-c23d-4740-8a31-a99a24d8a72e-combined-ca-bundle\") pod \"29af981b-c23d-4740-8a31-a99a24d8a72e\" (UID: \"29af981b-c23d-4740-8a31-a99a24d8a72e\") " Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.141424 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29af981b-c23d-4740-8a31-a99a24d8a72e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "29af981b-c23d-4740-8a31-a99a24d8a72e" (UID: "29af981b-c23d-4740-8a31-a99a24d8a72e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.142254 4909 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29af981b-c23d-4740-8a31-a99a24d8a72e-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.145410 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29af981b-c23d-4740-8a31-a99a24d8a72e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "29af981b-c23d-4740-8a31-a99a24d8a72e" (UID: "29af981b-c23d-4740-8a31-a99a24d8a72e"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.148460 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29af981b-c23d-4740-8a31-a99a24d8a72e-kube-api-access-9glsv" (OuterVolumeSpecName: "kube-api-access-9glsv") pod "29af981b-c23d-4740-8a31-a99a24d8a72e" (UID: "29af981b-c23d-4740-8a31-a99a24d8a72e"). InnerVolumeSpecName "kube-api-access-9glsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.171236 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29af981b-c23d-4740-8a31-a99a24d8a72e-scripts" (OuterVolumeSpecName: "scripts") pod "29af981b-c23d-4740-8a31-a99a24d8a72e" (UID: "29af981b-c23d-4740-8a31-a99a24d8a72e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.194035 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29af981b-c23d-4740-8a31-a99a24d8a72e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "29af981b-c23d-4740-8a31-a99a24d8a72e" (UID: "29af981b-c23d-4740-8a31-a99a24d8a72e"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.246751 4909 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/29af981b-c23d-4740-8a31-a99a24d8a72e-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.246786 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29af981b-c23d-4740-8a31-a99a24d8a72e-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.246797 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9glsv\" (UniqueName: \"kubernetes.io/projected/29af981b-c23d-4740-8a31-a99a24d8a72e-kube-api-access-9glsv\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.246831 4909 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/29af981b-c23d-4740-8a31-a99a24d8a72e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.251829 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29af981b-c23d-4740-8a31-a99a24d8a72e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29af981b-c23d-4740-8a31-a99a24d8a72e" (UID: "29af981b-c23d-4740-8a31-a99a24d8a72e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.262877 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29af981b-c23d-4740-8a31-a99a24d8a72e-config-data" (OuterVolumeSpecName: "config-data") pod "29af981b-c23d-4740-8a31-a99a24d8a72e" (UID: "29af981b-c23d-4740-8a31-a99a24d8a72e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.347937 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29af981b-c23d-4740-8a31-a99a24d8a72e-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.347970 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29af981b-c23d-4740-8a31-a99a24d8a72e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.542022 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c8dba959-faf4-4f15-96d3-e8f67ae00d62","Type":"ContainerStarted","Data":"d003e414ebdd3e81941086944537d4049bb7fe6b4ac3ae65f410b666bec28020"} Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.544940 4909 generic.go:334] "Generic (PLEG): container finished" podID="29af981b-c23d-4740-8a31-a99a24d8a72e" containerID="159e048449683cd43fd333c1e8d4750222f229b72e630edbdc65494e8979573b" exitCode=0 Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.544972 4909 generic.go:334] "Generic (PLEG): container finished" podID="29af981b-c23d-4740-8a31-a99a24d8a72e" containerID="5aa33b9e58ad58f9e4a3c5511ff5e0cdd09b526c823272b15202f9aba1483f6b" exitCode=0 Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.544999 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.545068 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29af981b-c23d-4740-8a31-a99a24d8a72e","Type":"ContainerDied","Data":"159e048449683cd43fd333c1e8d4750222f229b72e630edbdc65494e8979573b"} Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.545127 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29af981b-c23d-4740-8a31-a99a24d8a72e","Type":"ContainerDied","Data":"5aa33b9e58ad58f9e4a3c5511ff5e0cdd09b526c823272b15202f9aba1483f6b"} Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.545140 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"29af981b-c23d-4740-8a31-a99a24d8a72e","Type":"ContainerDied","Data":"da9eca87a510c4c7ccce7bb38c87d2b5f2051b9f01b432c6bd2371d65aa90ae0"} Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.545156 4909 scope.go:117] "RemoveContainer" containerID="adf6e523e8a897c955bc892fa4d418fd9055014bb080fcf7dfd32e6f684e6efd" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.573833 4909 scope.go:117] "RemoveContainer" containerID="f7cfef1b94d4389d3faedb57d59992ae7a239ec96c234e73c093f201219d152e" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.594599 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.604168 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.608548 4909 scope.go:117] "RemoveContainer" containerID="159e048449683cd43fd333c1e8d4750222f229b72e630edbdc65494e8979573b" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.641980 4909 scope.go:117] "RemoveContainer" containerID="5aa33b9e58ad58f9e4a3c5511ff5e0cdd09b526c823272b15202f9aba1483f6b" Feb 02 10:52:01 crc kubenswrapper[4909]: 
I0202 10:52:01.720963 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:52:01 crc kubenswrapper[4909]: E0202 10:52:01.721616 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29af981b-c23d-4740-8a31-a99a24d8a72e" containerName="ceilometer-central-agent" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.721631 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="29af981b-c23d-4740-8a31-a99a24d8a72e" containerName="ceilometer-central-agent" Feb 02 10:52:01 crc kubenswrapper[4909]: E0202 10:52:01.721663 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29af981b-c23d-4740-8a31-a99a24d8a72e" containerName="proxy-httpd" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.721670 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="29af981b-c23d-4740-8a31-a99a24d8a72e" containerName="proxy-httpd" Feb 02 10:52:01 crc kubenswrapper[4909]: E0202 10:52:01.721684 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29af981b-c23d-4740-8a31-a99a24d8a72e" containerName="sg-core" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.721690 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="29af981b-c23d-4740-8a31-a99a24d8a72e" containerName="sg-core" Feb 02 10:52:01 crc kubenswrapper[4909]: E0202 10:52:01.721710 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29af981b-c23d-4740-8a31-a99a24d8a72e" containerName="ceilometer-notification-agent" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.721716 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="29af981b-c23d-4740-8a31-a99a24d8a72e" containerName="ceilometer-notification-agent" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.725078 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="29af981b-c23d-4740-8a31-a99a24d8a72e" containerName="sg-core" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.725131 4909 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="29af981b-c23d-4740-8a31-a99a24d8a72e" containerName="proxy-httpd" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.725156 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="29af981b-c23d-4740-8a31-a99a24d8a72e" containerName="ceilometer-central-agent" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.725167 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="29af981b-c23d-4740-8a31-a99a24d8a72e" containerName="ceilometer-notification-agent" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.744264 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.744399 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.751247 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.751566 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.833027 4909 scope.go:117] "RemoveContainer" containerID="adf6e523e8a897c955bc892fa4d418fd9055014bb080fcf7dfd32e6f684e6efd" Feb 02 10:52:01 crc kubenswrapper[4909]: E0202 10:52:01.844750 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adf6e523e8a897c955bc892fa4d418fd9055014bb080fcf7dfd32e6f684e6efd\": container with ID starting with adf6e523e8a897c955bc892fa4d418fd9055014bb080fcf7dfd32e6f684e6efd not found: ID does not exist" containerID="adf6e523e8a897c955bc892fa4d418fd9055014bb080fcf7dfd32e6f684e6efd" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.844826 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"adf6e523e8a897c955bc892fa4d418fd9055014bb080fcf7dfd32e6f684e6efd"} err="failed to get container status \"adf6e523e8a897c955bc892fa4d418fd9055014bb080fcf7dfd32e6f684e6efd\": rpc error: code = NotFound desc = could not find container \"adf6e523e8a897c955bc892fa4d418fd9055014bb080fcf7dfd32e6f684e6efd\": container with ID starting with adf6e523e8a897c955bc892fa4d418fd9055014bb080fcf7dfd32e6f684e6efd not found: ID does not exist" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.844860 4909 scope.go:117] "RemoveContainer" containerID="f7cfef1b94d4389d3faedb57d59992ae7a239ec96c234e73c093f201219d152e" Feb 02 10:52:01 crc kubenswrapper[4909]: E0202 10:52:01.845625 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7cfef1b94d4389d3faedb57d59992ae7a239ec96c234e73c093f201219d152e\": container with ID starting with f7cfef1b94d4389d3faedb57d59992ae7a239ec96c234e73c093f201219d152e not found: ID does not exist" containerID="f7cfef1b94d4389d3faedb57d59992ae7a239ec96c234e73c093f201219d152e" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.845680 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7cfef1b94d4389d3faedb57d59992ae7a239ec96c234e73c093f201219d152e"} err="failed to get container status \"f7cfef1b94d4389d3faedb57d59992ae7a239ec96c234e73c093f201219d152e\": rpc error: code = NotFound desc = could not find container \"f7cfef1b94d4389d3faedb57d59992ae7a239ec96c234e73c093f201219d152e\": container with ID starting with f7cfef1b94d4389d3faedb57d59992ae7a239ec96c234e73c093f201219d152e not found: ID does not exist" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.845707 4909 scope.go:117] "RemoveContainer" containerID="159e048449683cd43fd333c1e8d4750222f229b72e630edbdc65494e8979573b" Feb 02 10:52:01 crc kubenswrapper[4909]: E0202 10:52:01.847559 4909 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"159e048449683cd43fd333c1e8d4750222f229b72e630edbdc65494e8979573b\": container with ID starting with 159e048449683cd43fd333c1e8d4750222f229b72e630edbdc65494e8979573b not found: ID does not exist" containerID="159e048449683cd43fd333c1e8d4750222f229b72e630edbdc65494e8979573b" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.847593 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"159e048449683cd43fd333c1e8d4750222f229b72e630edbdc65494e8979573b"} err="failed to get container status \"159e048449683cd43fd333c1e8d4750222f229b72e630edbdc65494e8979573b\": rpc error: code = NotFound desc = could not find container \"159e048449683cd43fd333c1e8d4750222f229b72e630edbdc65494e8979573b\": container with ID starting with 159e048449683cd43fd333c1e8d4750222f229b72e630edbdc65494e8979573b not found: ID does not exist" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.847614 4909 scope.go:117] "RemoveContainer" containerID="5aa33b9e58ad58f9e4a3c5511ff5e0cdd09b526c823272b15202f9aba1483f6b" Feb 02 10:52:01 crc kubenswrapper[4909]: E0202 10:52:01.852946 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5aa33b9e58ad58f9e4a3c5511ff5e0cdd09b526c823272b15202f9aba1483f6b\": container with ID starting with 5aa33b9e58ad58f9e4a3c5511ff5e0cdd09b526c823272b15202f9aba1483f6b not found: ID does not exist" containerID="5aa33b9e58ad58f9e4a3c5511ff5e0cdd09b526c823272b15202f9aba1483f6b" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.852999 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aa33b9e58ad58f9e4a3c5511ff5e0cdd09b526c823272b15202f9aba1483f6b"} err="failed to get container status \"5aa33b9e58ad58f9e4a3c5511ff5e0cdd09b526c823272b15202f9aba1483f6b\": rpc error: code = NotFound desc = could not find container 
\"5aa33b9e58ad58f9e4a3c5511ff5e0cdd09b526c823272b15202f9aba1483f6b\": container with ID starting with 5aa33b9e58ad58f9e4a3c5511ff5e0cdd09b526c823272b15202f9aba1483f6b not found: ID does not exist" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.853023 4909 scope.go:117] "RemoveContainer" containerID="adf6e523e8a897c955bc892fa4d418fd9055014bb080fcf7dfd32e6f684e6efd" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.855969 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adf6e523e8a897c955bc892fa4d418fd9055014bb080fcf7dfd32e6f684e6efd"} err="failed to get container status \"adf6e523e8a897c955bc892fa4d418fd9055014bb080fcf7dfd32e6f684e6efd\": rpc error: code = NotFound desc = could not find container \"adf6e523e8a897c955bc892fa4d418fd9055014bb080fcf7dfd32e6f684e6efd\": container with ID starting with adf6e523e8a897c955bc892fa4d418fd9055014bb080fcf7dfd32e6f684e6efd not found: ID does not exist" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.855999 4909 scope.go:117] "RemoveContainer" containerID="f7cfef1b94d4389d3faedb57d59992ae7a239ec96c234e73c093f201219d152e" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.859698 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7cfef1b94d4389d3faedb57d59992ae7a239ec96c234e73c093f201219d152e"} err="failed to get container status \"f7cfef1b94d4389d3faedb57d59992ae7a239ec96c234e73c093f201219d152e\": rpc error: code = NotFound desc = could not find container \"f7cfef1b94d4389d3faedb57d59992ae7a239ec96c234e73c093f201219d152e\": container with ID starting with f7cfef1b94d4389d3faedb57d59992ae7a239ec96c234e73c093f201219d152e not found: ID does not exist" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.859734 4909 scope.go:117] "RemoveContainer" containerID="159e048449683cd43fd333c1e8d4750222f229b72e630edbdc65494e8979573b" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.860097 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"159e048449683cd43fd333c1e8d4750222f229b72e630edbdc65494e8979573b"} err="failed to get container status \"159e048449683cd43fd333c1e8d4750222f229b72e630edbdc65494e8979573b\": rpc error: code = NotFound desc = could not find container \"159e048449683cd43fd333c1e8d4750222f229b72e630edbdc65494e8979573b\": container with ID starting with 159e048449683cd43fd333c1e8d4750222f229b72e630edbdc65494e8979573b not found: ID does not exist" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.860131 4909 scope.go:117] "RemoveContainer" containerID="5aa33b9e58ad58f9e4a3c5511ff5e0cdd09b526c823272b15202f9aba1483f6b" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.860382 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aa33b9e58ad58f9e4a3c5511ff5e0cdd09b526c823272b15202f9aba1483f6b"} err="failed to get container status \"5aa33b9e58ad58f9e4a3c5511ff5e0cdd09b526c823272b15202f9aba1483f6b\": rpc error: code = NotFound desc = could not find container \"5aa33b9e58ad58f9e4a3c5511ff5e0cdd09b526c823272b15202f9aba1483f6b\": container with ID starting with 5aa33b9e58ad58f9e4a3c5511ff5e0cdd09b526c823272b15202f9aba1483f6b not found: ID does not exist" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.883046 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e687182-7508-4ff5-8138-dd582e11cdc5-run-httpd\") pod \"ceilometer-0\" (UID: \"9e687182-7508-4ff5-8138-dd582e11cdc5\") " pod="openstack/ceilometer-0" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.883109 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e687182-7508-4ff5-8138-dd582e11cdc5-config-data\") pod \"ceilometer-0\" (UID: \"9e687182-7508-4ff5-8138-dd582e11cdc5\") 
" pod="openstack/ceilometer-0" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.883133 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9e687182-7508-4ff5-8138-dd582e11cdc5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9e687182-7508-4ff5-8138-dd582e11cdc5\") " pod="openstack/ceilometer-0" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.883222 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6bkg\" (UniqueName: \"kubernetes.io/projected/9e687182-7508-4ff5-8138-dd582e11cdc5-kube-api-access-j6bkg\") pod \"ceilometer-0\" (UID: \"9e687182-7508-4ff5-8138-dd582e11cdc5\") " pod="openstack/ceilometer-0" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.883248 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e687182-7508-4ff5-8138-dd582e11cdc5-scripts\") pod \"ceilometer-0\" (UID: \"9e687182-7508-4ff5-8138-dd582e11cdc5\") " pod="openstack/ceilometer-0" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.883270 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e687182-7508-4ff5-8138-dd582e11cdc5-log-httpd\") pod \"ceilometer-0\" (UID: \"9e687182-7508-4ff5-8138-dd582e11cdc5\") " pod="openstack/ceilometer-0" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.883304 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e687182-7508-4ff5-8138-dd582e11cdc5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9e687182-7508-4ff5-8138-dd582e11cdc5\") " pod="openstack/ceilometer-0" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.984517 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9e687182-7508-4ff5-8138-dd582e11cdc5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9e687182-7508-4ff5-8138-dd582e11cdc5\") " pod="openstack/ceilometer-0" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.984617 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6bkg\" (UniqueName: \"kubernetes.io/projected/9e687182-7508-4ff5-8138-dd582e11cdc5-kube-api-access-j6bkg\") pod \"ceilometer-0\" (UID: \"9e687182-7508-4ff5-8138-dd582e11cdc5\") " pod="openstack/ceilometer-0" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.984648 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e687182-7508-4ff5-8138-dd582e11cdc5-scripts\") pod \"ceilometer-0\" (UID: \"9e687182-7508-4ff5-8138-dd582e11cdc5\") " pod="openstack/ceilometer-0" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.984668 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e687182-7508-4ff5-8138-dd582e11cdc5-log-httpd\") pod \"ceilometer-0\" (UID: \"9e687182-7508-4ff5-8138-dd582e11cdc5\") " pod="openstack/ceilometer-0" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.984692 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e687182-7508-4ff5-8138-dd582e11cdc5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9e687182-7508-4ff5-8138-dd582e11cdc5\") " pod="openstack/ceilometer-0" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.984779 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e687182-7508-4ff5-8138-dd582e11cdc5-run-httpd\") pod \"ceilometer-0\" (UID: 
\"9e687182-7508-4ff5-8138-dd582e11cdc5\") " pod="openstack/ceilometer-0" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.984836 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e687182-7508-4ff5-8138-dd582e11cdc5-config-data\") pod \"ceilometer-0\" (UID: \"9e687182-7508-4ff5-8138-dd582e11cdc5\") " pod="openstack/ceilometer-0" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.985868 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e687182-7508-4ff5-8138-dd582e11cdc5-log-httpd\") pod \"ceilometer-0\" (UID: \"9e687182-7508-4ff5-8138-dd582e11cdc5\") " pod="openstack/ceilometer-0" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.985897 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e687182-7508-4ff5-8138-dd582e11cdc5-run-httpd\") pod \"ceilometer-0\" (UID: \"9e687182-7508-4ff5-8138-dd582e11cdc5\") " pod="openstack/ceilometer-0" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.990762 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e687182-7508-4ff5-8138-dd582e11cdc5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9e687182-7508-4ff5-8138-dd582e11cdc5\") " pod="openstack/ceilometer-0" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.991171 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9e687182-7508-4ff5-8138-dd582e11cdc5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9e687182-7508-4ff5-8138-dd582e11cdc5\") " pod="openstack/ceilometer-0" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.991675 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9e687182-7508-4ff5-8138-dd582e11cdc5-scripts\") pod \"ceilometer-0\" (UID: \"9e687182-7508-4ff5-8138-dd582e11cdc5\") " pod="openstack/ceilometer-0" Feb 02 10:52:01 crc kubenswrapper[4909]: I0202 10:52:01.993744 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e687182-7508-4ff5-8138-dd582e11cdc5-config-data\") pod \"ceilometer-0\" (UID: \"9e687182-7508-4ff5-8138-dd582e11cdc5\") " pod="openstack/ceilometer-0" Feb 02 10:52:02 crc kubenswrapper[4909]: I0202 10:52:02.004697 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6bkg\" (UniqueName: \"kubernetes.io/projected/9e687182-7508-4ff5-8138-dd582e11cdc5-kube-api-access-j6bkg\") pod \"ceilometer-0\" (UID: \"9e687182-7508-4ff5-8138-dd582e11cdc5\") " pod="openstack/ceilometer-0" Feb 02 10:52:02 crc kubenswrapper[4909]: I0202 10:52:02.112098 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:52:02 crc kubenswrapper[4909]: I0202 10:52:02.580348 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:52:02 crc kubenswrapper[4909]: W0202 10:52:02.596844 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e687182_7508_4ff5_8138_dd582e11cdc5.slice/crio-15ece1fb1ebec9f996c35bda29a185687eeceabfec80722248f3d700f36fb913 WatchSource:0}: Error finding container 15ece1fb1ebec9f996c35bda29a185687eeceabfec80722248f3d700f36fb913: Status 404 returned error can't find the container with id 15ece1fb1ebec9f996c35bda29a185687eeceabfec80722248f3d700f36fb913 Feb 02 10:52:02 crc kubenswrapper[4909]: I0202 10:52:02.952985 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-54c465c874-5jkf8" Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.054841 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29af981b-c23d-4740-8a31-a99a24d8a72e" path="/var/lib/kubelet/pods/29af981b-c23d-4740-8a31-a99a24d8a72e/volumes" Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.109348 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-internal-tls-certs\") pod \"a4a64068-38da-44e0-99a6-93aa570aef32\" (UID: \"a4a64068-38da-44e0-99a6-93aa570aef32\") " Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.110046 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a64068-38da-44e0-99a6-93aa570aef32-logs\") pod \"a4a64068-38da-44e0-99a6-93aa570aef32\" (UID: \"a4a64068-38da-44e0-99a6-93aa570aef32\") " Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.110141 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-config-data\") pod \"a4a64068-38da-44e0-99a6-93aa570aef32\" (UID: \"a4a64068-38da-44e0-99a6-93aa570aef32\") " Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.110233 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-public-tls-certs\") pod \"a4a64068-38da-44e0-99a6-93aa570aef32\" (UID: \"a4a64068-38da-44e0-99a6-93aa570aef32\") " Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.110375 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-combined-ca-bundle\") pod 
\"a4a64068-38da-44e0-99a6-93aa570aef32\" (UID: \"a4a64068-38da-44e0-99a6-93aa570aef32\") " Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.110448 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-scripts\") pod \"a4a64068-38da-44e0-99a6-93aa570aef32\" (UID: \"a4a64068-38da-44e0-99a6-93aa570aef32\") " Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.110542 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnn8s\" (UniqueName: \"kubernetes.io/projected/a4a64068-38da-44e0-99a6-93aa570aef32-kube-api-access-lnn8s\") pod \"a4a64068-38da-44e0-99a6-93aa570aef32\" (UID: \"a4a64068-38da-44e0-99a6-93aa570aef32\") " Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.111067 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4a64068-38da-44e0-99a6-93aa570aef32-logs" (OuterVolumeSpecName: "logs") pod "a4a64068-38da-44e0-99a6-93aa570aef32" (UID: "a4a64068-38da-44e0-99a6-93aa570aef32"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.114842 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-scripts" (OuterVolumeSpecName: "scripts") pod "a4a64068-38da-44e0-99a6-93aa570aef32" (UID: "a4a64068-38da-44e0-99a6-93aa570aef32"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.120933 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4a64068-38da-44e0-99a6-93aa570aef32-kube-api-access-lnn8s" (OuterVolumeSpecName: "kube-api-access-lnn8s") pod "a4a64068-38da-44e0-99a6-93aa570aef32" (UID: "a4a64068-38da-44e0-99a6-93aa570aef32"). 
InnerVolumeSpecName "kube-api-access-lnn8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.172996 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-config-data" (OuterVolumeSpecName: "config-data") pod "a4a64068-38da-44e0-99a6-93aa570aef32" (UID: "a4a64068-38da-44e0-99a6-93aa570aef32"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.175115 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4a64068-38da-44e0-99a6-93aa570aef32" (UID: "a4a64068-38da-44e0-99a6-93aa570aef32"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.212972 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.213009 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.213022 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnn8s\" (UniqueName: \"kubernetes.io/projected/a4a64068-38da-44e0-99a6-93aa570aef32-kube-api-access-lnn8s\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.213035 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a64068-38da-44e0-99a6-93aa570aef32-logs\") on 
node \"crc\" DevicePath \"\"" Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.213045 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.234945 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a4a64068-38da-44e0-99a6-93aa570aef32" (UID: "a4a64068-38da-44e0-99a6-93aa570aef32"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.237543 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a4a64068-38da-44e0-99a6-93aa570aef32" (UID: "a4a64068-38da-44e0-99a6-93aa570aef32"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.314882 4909 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.314922 4909 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4a64068-38da-44e0-99a6-93aa570aef32-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.571963 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e687182-7508-4ff5-8138-dd582e11cdc5","Type":"ContainerStarted","Data":"e0a153c99271be1d041656c8daa8fd2c6d759ed21c667da0084ba87594f2985f"} Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.572522 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e687182-7508-4ff5-8138-dd582e11cdc5","Type":"ContainerStarted","Data":"15ece1fb1ebec9f996c35bda29a185687eeceabfec80722248f3d700f36fb913"} Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.576970 4909 generic.go:334] "Generic (PLEG): container finished" podID="a4a64068-38da-44e0-99a6-93aa570aef32" containerID="68e3348a64665854c90ff9f1880c72e41e44fbceb2cafe25c97f0867483d6f6a" exitCode=0 Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.577029 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54c465c874-5jkf8" event={"ID":"a4a64068-38da-44e0-99a6-93aa570aef32","Type":"ContainerDied","Data":"68e3348a64665854c90ff9f1880c72e41e44fbceb2cafe25c97f0867483d6f6a"} Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.577056 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54c465c874-5jkf8" 
event={"ID":"a4a64068-38da-44e0-99a6-93aa570aef32","Type":"ContainerDied","Data":"2d3c9a4a799772b956833d2b6dfc19d9d1e4f23863940d5d9272286683b6a923"} Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.577072 4909 scope.go:117] "RemoveContainer" containerID="68e3348a64665854c90ff9f1880c72e41e44fbceb2cafe25c97f0867483d6f6a" Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.577220 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-54c465c874-5jkf8" Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.618405 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-54c465c874-5jkf8"] Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.621239 4909 scope.go:117] "RemoveContainer" containerID="6b3c5fad03e8dcf1048a837c474f22709297535e0f64315bda7d7454a41aba76" Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.626175 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-54c465c874-5jkf8"] Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.645954 4909 scope.go:117] "RemoveContainer" containerID="68e3348a64665854c90ff9f1880c72e41e44fbceb2cafe25c97f0867483d6f6a" Feb 02 10:52:03 crc kubenswrapper[4909]: E0202 10:52:03.646618 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68e3348a64665854c90ff9f1880c72e41e44fbceb2cafe25c97f0867483d6f6a\": container with ID starting with 68e3348a64665854c90ff9f1880c72e41e44fbceb2cafe25c97f0867483d6f6a not found: ID does not exist" containerID="68e3348a64665854c90ff9f1880c72e41e44fbceb2cafe25c97f0867483d6f6a" Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.646771 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68e3348a64665854c90ff9f1880c72e41e44fbceb2cafe25c97f0867483d6f6a"} err="failed to get container status \"68e3348a64665854c90ff9f1880c72e41e44fbceb2cafe25c97f0867483d6f6a\": 
rpc error: code = NotFound desc = could not find container \"68e3348a64665854c90ff9f1880c72e41e44fbceb2cafe25c97f0867483d6f6a\": container with ID starting with 68e3348a64665854c90ff9f1880c72e41e44fbceb2cafe25c97f0867483d6f6a not found: ID does not exist" Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.646903 4909 scope.go:117] "RemoveContainer" containerID="6b3c5fad03e8dcf1048a837c474f22709297535e0f64315bda7d7454a41aba76" Feb 02 10:52:03 crc kubenswrapper[4909]: E0202 10:52:03.651001 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b3c5fad03e8dcf1048a837c474f22709297535e0f64315bda7d7454a41aba76\": container with ID starting with 6b3c5fad03e8dcf1048a837c474f22709297535e0f64315bda7d7454a41aba76 not found: ID does not exist" containerID="6b3c5fad03e8dcf1048a837c474f22709297535e0f64315bda7d7454a41aba76" Feb 02 10:52:03 crc kubenswrapper[4909]: I0202 10:52:03.651063 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b3c5fad03e8dcf1048a837c474f22709297535e0f64315bda7d7454a41aba76"} err="failed to get container status \"6b3c5fad03e8dcf1048a837c474f22709297535e0f64315bda7d7454a41aba76\": rpc error: code = NotFound desc = could not find container \"6b3c5fad03e8dcf1048a837c474f22709297535e0f64315bda7d7454a41aba76\": container with ID starting with 6b3c5fad03e8dcf1048a837c474f22709297535e0f64315bda7d7454a41aba76 not found: ID does not exist" Feb 02 10:52:04 crc kubenswrapper[4909]: I0202 10:52:04.438294 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 02 10:52:04 crc kubenswrapper[4909]: I0202 10:52:04.588442 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e687182-7508-4ff5-8138-dd582e11cdc5","Type":"ContainerStarted","Data":"23ddc9046c50e2fdd5d5b05e3f9c550a643f31014c6f4b418a8b08f7bc758bc6"} Feb 02 10:52:05 crc 
kubenswrapper[4909]: I0202 10:52:05.030073 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4a64068-38da-44e0-99a6-93aa570aef32" path="/var/lib/kubelet/pods/a4a64068-38da-44e0-99a6-93aa570aef32/volumes" Feb 02 10:52:05 crc kubenswrapper[4909]: I0202 10:52:05.606126 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e687182-7508-4ff5-8138-dd582e11cdc5","Type":"ContainerStarted","Data":"5e99a7e3dee6342ef19032476169ddbbf0e3de2356cda6bcda71a199577770fb"} Feb 02 10:52:09 crc kubenswrapper[4909]: I0202 10:52:09.231133 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:52:09 crc kubenswrapper[4909]: I0202 10:52:09.234149 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:52:09 crc kubenswrapper[4909]: I0202 10:52:09.503847 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:52:12 crc kubenswrapper[4909]: I0202 10:52:12.666042 4909 generic.go:334] "Generic (PLEG): container finished" podID="e462e484-c2ad-4d5f-85b8-7042663617ff" containerID="183693e0814dace9a9bccdeddbdfcb5d0e5f35e7d4ea73adfd3615097c62b6ec" exitCode=137 Feb 02 10:52:12 crc kubenswrapper[4909]: I0202 10:52:12.666114 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e462e484-c2ad-4d5f-85b8-7042663617ff","Type":"ContainerDied","Data":"183693e0814dace9a9bccdeddbdfcb5d0e5f35e7d4ea73adfd3615097c62b6ec"} Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.438363 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.507999 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ldlw\" (UniqueName: \"kubernetes.io/projected/e462e484-c2ad-4d5f-85b8-7042663617ff-kube-api-access-9ldlw\") pod \"e462e484-c2ad-4d5f-85b8-7042663617ff\" (UID: \"e462e484-c2ad-4d5f-85b8-7042663617ff\") " Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.508155 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e462e484-c2ad-4d5f-85b8-7042663617ff-config-data-custom\") pod \"e462e484-c2ad-4d5f-85b8-7042663617ff\" (UID: \"e462e484-c2ad-4d5f-85b8-7042663617ff\") " Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.508202 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e462e484-c2ad-4d5f-85b8-7042663617ff-combined-ca-bundle\") pod \"e462e484-c2ad-4d5f-85b8-7042663617ff\" (UID: \"e462e484-c2ad-4d5f-85b8-7042663617ff\") " Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.508295 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e462e484-c2ad-4d5f-85b8-7042663617ff-etc-machine-id\") pod \"e462e484-c2ad-4d5f-85b8-7042663617ff\" (UID: \"e462e484-c2ad-4d5f-85b8-7042663617ff\") " Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.508328 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e462e484-c2ad-4d5f-85b8-7042663617ff-scripts\") pod \"e462e484-c2ad-4d5f-85b8-7042663617ff\" (UID: \"e462e484-c2ad-4d5f-85b8-7042663617ff\") " Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.508457 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/e462e484-c2ad-4d5f-85b8-7042663617ff-config-data\") pod \"e462e484-c2ad-4d5f-85b8-7042663617ff\" (UID: \"e462e484-c2ad-4d5f-85b8-7042663617ff\") " Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.508506 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e462e484-c2ad-4d5f-85b8-7042663617ff-logs\") pod \"e462e484-c2ad-4d5f-85b8-7042663617ff\" (UID: \"e462e484-c2ad-4d5f-85b8-7042663617ff\") " Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.508623 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e462e484-c2ad-4d5f-85b8-7042663617ff-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e462e484-c2ad-4d5f-85b8-7042663617ff" (UID: "e462e484-c2ad-4d5f-85b8-7042663617ff"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.508988 4909 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e462e484-c2ad-4d5f-85b8-7042663617ff-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.509532 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e462e484-c2ad-4d5f-85b8-7042663617ff-logs" (OuterVolumeSpecName: "logs") pod "e462e484-c2ad-4d5f-85b8-7042663617ff" (UID: "e462e484-c2ad-4d5f-85b8-7042663617ff"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.516384 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e462e484-c2ad-4d5f-85b8-7042663617ff-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e462e484-c2ad-4d5f-85b8-7042663617ff" (UID: "e462e484-c2ad-4d5f-85b8-7042663617ff"). 
InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.541400 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e462e484-c2ad-4d5f-85b8-7042663617ff-scripts" (OuterVolumeSpecName: "scripts") pod "e462e484-c2ad-4d5f-85b8-7042663617ff" (UID: "e462e484-c2ad-4d5f-85b8-7042663617ff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.541946 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e462e484-c2ad-4d5f-85b8-7042663617ff-kube-api-access-9ldlw" (OuterVolumeSpecName: "kube-api-access-9ldlw") pod "e462e484-c2ad-4d5f-85b8-7042663617ff" (UID: "e462e484-c2ad-4d5f-85b8-7042663617ff"). InnerVolumeSpecName "kube-api-access-9ldlw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.610971 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e462e484-c2ad-4d5f-85b8-7042663617ff-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.611002 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ldlw\" (UniqueName: \"kubernetes.io/projected/e462e484-c2ad-4d5f-85b8-7042663617ff-kube-api-access-9ldlw\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.611014 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e462e484-c2ad-4d5f-85b8-7042663617ff-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.611023 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e462e484-c2ad-4d5f-85b8-7042663617ff-scripts\") on 
node \"crc\" DevicePath \"\"" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.666067 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e462e484-c2ad-4d5f-85b8-7042663617ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e462e484-c2ad-4d5f-85b8-7042663617ff" (UID: "e462e484-c2ad-4d5f-85b8-7042663617ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.683957 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"c8dba959-faf4-4f15-96d3-e8f67ae00d62","Type":"ContainerStarted","Data":"908ec2618be6ce0ccb5553a70d2b7d51ef915f04d269be9c3d7f53480f657d99"} Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.693992 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e462e484-c2ad-4d5f-85b8-7042663617ff-config-data" (OuterVolumeSpecName: "config-data") pod "e462e484-c2ad-4d5f-85b8-7042663617ff" (UID: "e462e484-c2ad-4d5f-85b8-7042663617ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.698757 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e462e484-c2ad-4d5f-85b8-7042663617ff","Type":"ContainerDied","Data":"f0ad5cc9bb28005692be645cc967c4db160826a751f87b117252da1ad7fc6679"} Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.698825 4909 scope.go:117] "RemoveContainer" containerID="183693e0814dace9a9bccdeddbdfcb5d0e5f35e7d4ea73adfd3615097c62b6ec" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.698957 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.707128 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.564626783 podStartE2EDuration="13.707110566s" podCreationTimestamp="2026-02-02 10:52:00 +0000 UTC" firstStartedPulling="2026-02-02 10:52:01.040177272 +0000 UTC m=+1246.786278007" lastFinishedPulling="2026-02-02 10:52:13.182661055 +0000 UTC m=+1258.928761790" observedRunningTime="2026-02-02 10:52:13.703350129 +0000 UTC m=+1259.449450864" watchObservedRunningTime="2026-02-02 10:52:13.707110566 +0000 UTC m=+1259.453211301" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.714226 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e462e484-c2ad-4d5f-85b8-7042663617ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.714270 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e462e484-c2ad-4d5f-85b8-7042663617ff-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.717542 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9e687182-7508-4ff5-8138-dd582e11cdc5" containerName="ceilometer-central-agent" containerID="cri-o://e0a153c99271be1d041656c8daa8fd2c6d759ed21c667da0084ba87594f2985f" gracePeriod=30 Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.717862 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.718110 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9e687182-7508-4ff5-8138-dd582e11cdc5" containerName="proxy-httpd" 
containerID="cri-o://5dcf5a9224650780abadd0fb2e13f43e6896ba6ebd00dd773cc07732b98a7435" gracePeriod=30 Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.718166 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9e687182-7508-4ff5-8138-dd582e11cdc5" containerName="sg-core" containerID="cri-o://5e99a7e3dee6342ef19032476169ddbbf0e3de2356cda6bcda71a199577770fb" gracePeriod=30 Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.718204 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9e687182-7508-4ff5-8138-dd582e11cdc5" containerName="ceilometer-notification-agent" containerID="cri-o://23ddc9046c50e2fdd5d5b05e3f9c550a643f31014c6f4b418a8b08f7bc758bc6" gracePeriod=30 Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.744673 4909 scope.go:117] "RemoveContainer" containerID="828c355e02a1496b0bcc5f2ff2fed34a28fa559bad719556214a379b96fa8585" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.754992 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.18067722 podStartE2EDuration="12.754970085s" podCreationTimestamp="2026-02-02 10:52:01 +0000 UTC" firstStartedPulling="2026-02-02 10:52:02.606146507 +0000 UTC m=+1248.352247242" lastFinishedPulling="2026-02-02 10:52:13.180439372 +0000 UTC m=+1258.926540107" observedRunningTime="2026-02-02 10:52:13.738383694 +0000 UTC m=+1259.484484439" watchObservedRunningTime="2026-02-02 10:52:13.754970085 +0000 UTC m=+1259.501070820" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.776113 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.791872 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.800432 4909 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cinder-api-0"] Feb 02 10:52:13 crc kubenswrapper[4909]: E0202 10:52:13.800881 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a64068-38da-44e0-99a6-93aa570aef32" containerName="placement-log" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.800904 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a64068-38da-44e0-99a6-93aa570aef32" containerName="placement-log" Feb 02 10:52:13 crc kubenswrapper[4909]: E0202 10:52:13.800935 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a64068-38da-44e0-99a6-93aa570aef32" containerName="placement-api" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.800943 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a64068-38da-44e0-99a6-93aa570aef32" containerName="placement-api" Feb 02 10:52:13 crc kubenswrapper[4909]: E0202 10:52:13.800963 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e462e484-c2ad-4d5f-85b8-7042663617ff" containerName="cinder-api-log" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.800971 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e462e484-c2ad-4d5f-85b8-7042663617ff" containerName="cinder-api-log" Feb 02 10:52:13 crc kubenswrapper[4909]: E0202 10:52:13.800996 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e462e484-c2ad-4d5f-85b8-7042663617ff" containerName="cinder-api" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.801003 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e462e484-c2ad-4d5f-85b8-7042663617ff" containerName="cinder-api" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.801202 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e462e484-c2ad-4d5f-85b8-7042663617ff" containerName="cinder-api-log" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.801226 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a64068-38da-44e0-99a6-93aa570aef32" 
containerName="placement-log" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.801239 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a64068-38da-44e0-99a6-93aa570aef32" containerName="placement-api" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.801259 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e462e484-c2ad-4d5f-85b8-7042663617ff" containerName="cinder-api" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.802323 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.807545 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.809218 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.815760 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.815917 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.919774 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-config-data-custom\") pod \"cinder-api-0\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " pod="openstack/cinder-api-0" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.919858 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " 
pod="openstack/cinder-api-0" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.919933 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14a8699a-66db-48c6-8834-bda4e21ef1d9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " pod="openstack/cinder-api-0" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.919966 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-config-data\") pod \"cinder-api-0\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " pod="openstack/cinder-api-0" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.919984 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-scripts\") pod \"cinder-api-0\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " pod="openstack/cinder-api-0" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.920019 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-public-tls-certs\") pod \"cinder-api-0\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " pod="openstack/cinder-api-0" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.920057 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvnvx\" (UniqueName: \"kubernetes.io/projected/14a8699a-66db-48c6-8834-bda4e21ef1d9-kube-api-access-vvnvx\") pod \"cinder-api-0\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " pod="openstack/cinder-api-0" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.920076 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a8699a-66db-48c6-8834-bda4e21ef1d9-logs\") pod \"cinder-api-0\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " pod="openstack/cinder-api-0" Feb 02 10:52:13 crc kubenswrapper[4909]: I0202 10:52:13.920099 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " pod="openstack/cinder-api-0" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.022464 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " pod="openstack/cinder-api-0" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.022598 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14a8699a-66db-48c6-8834-bda4e21ef1d9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " pod="openstack/cinder-api-0" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.022866 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-config-data\") pod \"cinder-api-0\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " pod="openstack/cinder-api-0" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.022889 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-scripts\") pod \"cinder-api-0\" (UID: 
\"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " pod="openstack/cinder-api-0" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.022908 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14a8699a-66db-48c6-8834-bda4e21ef1d9-etc-machine-id\") pod \"cinder-api-0\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " pod="openstack/cinder-api-0" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.022938 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-public-tls-certs\") pod \"cinder-api-0\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " pod="openstack/cinder-api-0" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.022974 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvnvx\" (UniqueName: \"kubernetes.io/projected/14a8699a-66db-48c6-8834-bda4e21ef1d9-kube-api-access-vvnvx\") pod \"cinder-api-0\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " pod="openstack/cinder-api-0" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.023003 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a8699a-66db-48c6-8834-bda4e21ef1d9-logs\") pod \"cinder-api-0\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " pod="openstack/cinder-api-0" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.023034 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " pod="openstack/cinder-api-0" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.023057 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-config-data-custom\") pod \"cinder-api-0\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " pod="openstack/cinder-api-0" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.023467 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a8699a-66db-48c6-8834-bda4e21ef1d9-logs\") pod \"cinder-api-0\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " pod="openstack/cinder-api-0" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.028798 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-public-tls-certs\") pod \"cinder-api-0\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " pod="openstack/cinder-api-0" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.029701 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-config-data-custom\") pod \"cinder-api-0\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " pod="openstack/cinder-api-0" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.030448 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-scripts\") pod \"cinder-api-0\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " pod="openstack/cinder-api-0" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.032107 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " pod="openstack/cinder-api-0" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.032937 
4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " pod="openstack/cinder-api-0" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.038472 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-config-data\") pod \"cinder-api-0\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " pod="openstack/cinder-api-0" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.043841 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvnvx\" (UniqueName: \"kubernetes.io/projected/14a8699a-66db-48c6-8834-bda4e21ef1d9-kube-api-access-vvnvx\") pod \"cinder-api-0\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " pod="openstack/cinder-api-0" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.134646 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.676195 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.690757 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-fz2h7"] Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.692407 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-fz2h7" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.724129 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-fz2h7"] Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.747258 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6x4s\" (UniqueName: \"kubernetes.io/projected/d38222ad-04a2-4a42-8f8f-0789dc7c4c49-kube-api-access-g6x4s\") pod \"nova-api-db-create-fz2h7\" (UID: \"d38222ad-04a2-4a42-8f8f-0789dc7c4c49\") " pod="openstack/nova-api-db-create-fz2h7" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.747305 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d38222ad-04a2-4a42-8f8f-0789dc7c4c49-operator-scripts\") pod \"nova-api-db-create-fz2h7\" (UID: \"d38222ad-04a2-4a42-8f8f-0789dc7c4c49\") " pod="openstack/nova-api-db-create-fz2h7" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.758870 4909 generic.go:334] "Generic (PLEG): container finished" podID="9e687182-7508-4ff5-8138-dd582e11cdc5" containerID="5e99a7e3dee6342ef19032476169ddbbf0e3de2356cda6bcda71a199577770fb" exitCode=2 Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.758907 4909 generic.go:334] "Generic (PLEG): container finished" podID="9e687182-7508-4ff5-8138-dd582e11cdc5" containerID="e0a153c99271be1d041656c8daa8fd2c6d759ed21c667da0084ba87594f2985f" exitCode=0 Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.758952 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e687182-7508-4ff5-8138-dd582e11cdc5","Type":"ContainerStarted","Data":"5dcf5a9224650780abadd0fb2e13f43e6896ba6ebd00dd773cc07732b98a7435"} Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.758979 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"9e687182-7508-4ff5-8138-dd582e11cdc5","Type":"ContainerDied","Data":"5e99a7e3dee6342ef19032476169ddbbf0e3de2356cda6bcda71a199577770fb"} Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.758991 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e687182-7508-4ff5-8138-dd582e11cdc5","Type":"ContainerDied","Data":"e0a153c99271be1d041656c8daa8fd2c6d759ed21c667da0084ba87594f2985f"} Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.760339 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"14a8699a-66db-48c6-8834-bda4e21ef1d9","Type":"ContainerStarted","Data":"c689550316c47545c7dfb54660b66e0d29cc015454099c5186b37d663c0c183c"} Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.805075 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-ac14-account-create-update-6nxwp"] Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.806221 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-ac14-account-create-update-6nxwp" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.819082 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.825745 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-ac14-account-create-update-6nxwp"] Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.849682 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6x4s\" (UniqueName: \"kubernetes.io/projected/d38222ad-04a2-4a42-8f8f-0789dc7c4c49-kube-api-access-g6x4s\") pod \"nova-api-db-create-fz2h7\" (UID: \"d38222ad-04a2-4a42-8f8f-0789dc7c4c49\") " pod="openstack/nova-api-db-create-fz2h7" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.849744 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d38222ad-04a2-4a42-8f8f-0789dc7c4c49-operator-scripts\") pod \"nova-api-db-create-fz2h7\" (UID: \"d38222ad-04a2-4a42-8f8f-0789dc7c4c49\") " pod="openstack/nova-api-db-create-fz2h7" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.853881 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d38222ad-04a2-4a42-8f8f-0789dc7c4c49-operator-scripts\") pod \"nova-api-db-create-fz2h7\" (UID: \"d38222ad-04a2-4a42-8f8f-0789dc7c4c49\") " pod="openstack/nova-api-db-create-fz2h7" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.880475 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6x4s\" (UniqueName: \"kubernetes.io/projected/d38222ad-04a2-4a42-8f8f-0789dc7c4c49-kube-api-access-g6x4s\") pod \"nova-api-db-create-fz2h7\" (UID: \"d38222ad-04a2-4a42-8f8f-0789dc7c4c49\") " pod="openstack/nova-api-db-create-fz2h7" Feb 02 10:52:14 crc 
kubenswrapper[4909]: I0202 10:52:14.888616 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-5m44k"] Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.890257 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5m44k" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.897797 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-5m44k"] Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.955709 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6xj2\" (UniqueName: \"kubernetes.io/projected/aee6c054-0ce0-44c7-96aa-1722b6339cfe-kube-api-access-s6xj2\") pod \"nova-cell0-db-create-5m44k\" (UID: \"aee6c054-0ce0-44c7-96aa-1722b6339cfe\") " pod="openstack/nova-cell0-db-create-5m44k" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.955772 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45nzp\" (UniqueName: \"kubernetes.io/projected/f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451-kube-api-access-45nzp\") pod \"nova-api-ac14-account-create-update-6nxwp\" (UID: \"f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451\") " pod="openstack/nova-api-ac14-account-create-update-6nxwp" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.955856 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451-operator-scripts\") pod \"nova-api-ac14-account-create-update-6nxwp\" (UID: \"f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451\") " pod="openstack/nova-api-ac14-account-create-update-6nxwp" Feb 02 10:52:14 crc kubenswrapper[4909]: I0202 10:52:14.955931 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/aee6c054-0ce0-44c7-96aa-1722b6339cfe-operator-scripts\") pod \"nova-cell0-db-create-5m44k\" (UID: \"aee6c054-0ce0-44c7-96aa-1722b6339cfe\") " pod="openstack/nova-cell0-db-create-5m44k" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.013344 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-5fks9"] Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.037514 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5fks9" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.062753 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451-operator-scripts\") pod \"nova-api-ac14-account-create-update-6nxwp\" (UID: \"f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451\") " pod="openstack/nova-api-ac14-account-create-update-6nxwp" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.062960 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aee6c054-0ce0-44c7-96aa-1722b6339cfe-operator-scripts\") pod \"nova-cell0-db-create-5m44k\" (UID: \"aee6c054-0ce0-44c7-96aa-1722b6339cfe\") " pod="openstack/nova-cell0-db-create-5m44k" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.063093 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb-operator-scripts\") pod \"nova-cell1-db-create-5fks9\" (UID: \"1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb\") " pod="openstack/nova-cell1-db-create-5fks9" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.063168 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plnb7\" (UniqueName: 
\"kubernetes.io/projected/1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb-kube-api-access-plnb7\") pod \"nova-cell1-db-create-5fks9\" (UID: \"1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb\") " pod="openstack/nova-cell1-db-create-5fks9" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.063261 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6xj2\" (UniqueName: \"kubernetes.io/projected/aee6c054-0ce0-44c7-96aa-1722b6339cfe-kube-api-access-s6xj2\") pod \"nova-cell0-db-create-5m44k\" (UID: \"aee6c054-0ce0-44c7-96aa-1722b6339cfe\") " pod="openstack/nova-cell0-db-create-5m44k" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.063744 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45nzp\" (UniqueName: \"kubernetes.io/projected/f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451-kube-api-access-45nzp\") pod \"nova-api-ac14-account-create-update-6nxwp\" (UID: \"f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451\") " pod="openstack/nova-api-ac14-account-create-update-6nxwp" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.064690 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e462e484-c2ad-4d5f-85b8-7042663617ff" path="/var/lib/kubelet/pods/e462e484-c2ad-4d5f-85b8-7042663617ff/volumes" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.065016 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451-operator-scripts\") pod \"nova-api-ac14-account-create-update-6nxwp\" (UID: \"f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451\") " pod="openstack/nova-api-ac14-account-create-update-6nxwp" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.065615 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aee6c054-0ce0-44c7-96aa-1722b6339cfe-operator-scripts\") pod \"nova-cell0-db-create-5m44k\" (UID: 
\"aee6c054-0ce0-44c7-96aa-1722b6339cfe\") " pod="openstack/nova-cell0-db-create-5m44k" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.078943 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-abe3-account-create-update-47nm8"] Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.082082 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-abe3-account-create-update-47nm8" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.084199 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.085258 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6xj2\" (UniqueName: \"kubernetes.io/projected/aee6c054-0ce0-44c7-96aa-1722b6339cfe-kube-api-access-s6xj2\") pod \"nova-cell0-db-create-5m44k\" (UID: \"aee6c054-0ce0-44c7-96aa-1722b6339cfe\") " pod="openstack/nova-cell0-db-create-5m44k" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.087338 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-fz2h7" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.089276 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45nzp\" (UniqueName: \"kubernetes.io/projected/f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451-kube-api-access-45nzp\") pod \"nova-api-ac14-account-create-update-6nxwp\" (UID: \"f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451\") " pod="openstack/nova-api-ac14-account-create-update-6nxwp" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.103848 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-5fks9"] Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.114405 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-abe3-account-create-update-47nm8"] Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.141218 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-ac14-account-create-update-6nxwp" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.165853 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd533d6b-9338-4896-b52c-c3123a5e0467-operator-scripts\") pod \"nova-cell0-abe3-account-create-update-47nm8\" (UID: \"dd533d6b-9338-4896-b52c-c3123a5e0467\") " pod="openstack/nova-cell0-abe3-account-create-update-47nm8" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.166008 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb-operator-scripts\") pod \"nova-cell1-db-create-5fks9\" (UID: \"1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb\") " pod="openstack/nova-cell1-db-create-5fks9" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.166034 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-plnb7\" (UniqueName: \"kubernetes.io/projected/1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb-kube-api-access-plnb7\") pod \"nova-cell1-db-create-5fks9\" (UID: \"1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb\") " pod="openstack/nova-cell1-db-create-5fks9" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.166118 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jtrm\" (UniqueName: \"kubernetes.io/projected/dd533d6b-9338-4896-b52c-c3123a5e0467-kube-api-access-8jtrm\") pod \"nova-cell0-abe3-account-create-update-47nm8\" (UID: \"dd533d6b-9338-4896-b52c-c3123a5e0467\") " pod="openstack/nova-cell0-abe3-account-create-update-47nm8" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.168967 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb-operator-scripts\") pod \"nova-cell1-db-create-5fks9\" (UID: \"1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb\") " pod="openstack/nova-cell1-db-create-5fks9" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.194329 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plnb7\" (UniqueName: \"kubernetes.io/projected/1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb-kube-api-access-plnb7\") pod \"nova-cell1-db-create-5fks9\" (UID: \"1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb\") " pod="openstack/nova-cell1-db-create-5fks9" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.215883 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-5m44k" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.269235 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jtrm\" (UniqueName: \"kubernetes.io/projected/dd533d6b-9338-4896-b52c-c3123a5e0467-kube-api-access-8jtrm\") pod \"nova-cell0-abe3-account-create-update-47nm8\" (UID: \"dd533d6b-9338-4896-b52c-c3123a5e0467\") " pod="openstack/nova-cell0-abe3-account-create-update-47nm8" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.269302 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd533d6b-9338-4896-b52c-c3123a5e0467-operator-scripts\") pod \"nova-cell0-abe3-account-create-update-47nm8\" (UID: \"dd533d6b-9338-4896-b52c-c3123a5e0467\") " pod="openstack/nova-cell0-abe3-account-create-update-47nm8" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.270092 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd533d6b-9338-4896-b52c-c3123a5e0467-operator-scripts\") pod \"nova-cell0-abe3-account-create-update-47nm8\" (UID: \"dd533d6b-9338-4896-b52c-c3123a5e0467\") " pod="openstack/nova-cell0-abe3-account-create-update-47nm8" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.295497 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jtrm\" (UniqueName: \"kubernetes.io/projected/dd533d6b-9338-4896-b52c-c3123a5e0467-kube-api-access-8jtrm\") pod \"nova-cell0-abe3-account-create-update-47nm8\" (UID: \"dd533d6b-9338-4896-b52c-c3123a5e0467\") " pod="openstack/nova-cell0-abe3-account-create-update-47nm8" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.315536 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-90c9-account-create-update-6nkzl"] Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 
10:52:15.316853 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-90c9-account-create-update-6nkzl" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.319098 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.327316 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-90c9-account-create-update-6nkzl"] Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.371330 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32e77d63-a0db-4845-80c5-9815b46a9e21-operator-scripts\") pod \"nova-cell1-90c9-account-create-update-6nkzl\" (UID: \"32e77d63-a0db-4845-80c5-9815b46a9e21\") " pod="openstack/nova-cell1-90c9-account-create-update-6nkzl" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.371479 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh6s9\" (UniqueName: \"kubernetes.io/projected/32e77d63-a0db-4845-80c5-9815b46a9e21-kube-api-access-wh6s9\") pod \"nova-cell1-90c9-account-create-update-6nkzl\" (UID: \"32e77d63-a0db-4845-80c5-9815b46a9e21\") " pod="openstack/nova-cell1-90c9-account-create-update-6nkzl" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.474406 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32e77d63-a0db-4845-80c5-9815b46a9e21-operator-scripts\") pod \"nova-cell1-90c9-account-create-update-6nkzl\" (UID: \"32e77d63-a0db-4845-80c5-9815b46a9e21\") " pod="openstack/nova-cell1-90c9-account-create-update-6nkzl" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.475199 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh6s9\" (UniqueName: 
\"kubernetes.io/projected/32e77d63-a0db-4845-80c5-9815b46a9e21-kube-api-access-wh6s9\") pod \"nova-cell1-90c9-account-create-update-6nkzl\" (UID: \"32e77d63-a0db-4845-80c5-9815b46a9e21\") " pod="openstack/nova-cell1-90c9-account-create-update-6nkzl" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.475519 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32e77d63-a0db-4845-80c5-9815b46a9e21-operator-scripts\") pod \"nova-cell1-90c9-account-create-update-6nkzl\" (UID: \"32e77d63-a0db-4845-80c5-9815b46a9e21\") " pod="openstack/nova-cell1-90c9-account-create-update-6nkzl" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.477391 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5fks9" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.486522 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-abe3-account-create-update-47nm8" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.490590 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh6s9\" (UniqueName: \"kubernetes.io/projected/32e77d63-a0db-4845-80c5-9815b46a9e21-kube-api-access-wh6s9\") pod \"nova-cell1-90c9-account-create-update-6nkzl\" (UID: \"32e77d63-a0db-4845-80c5-9815b46a9e21\") " pod="openstack/nova-cell1-90c9-account-create-update-6nkzl" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.641987 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-fz2h7"] Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.645200 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-90c9-account-create-update-6nkzl" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.733137 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-ac14-account-create-update-6nxwp"] Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.758451 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.789647 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-fz2h7" event={"ID":"d38222ad-04a2-4a42-8f8f-0789dc7c4c49","Type":"ContainerStarted","Data":"3f8e383075be8fb92a6cd5bb3112bd381d328da81d8a6358ee4c3182164ab01a"} Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.798831 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"14a8699a-66db-48c6-8834-bda4e21ef1d9","Type":"ContainerStarted","Data":"e41df562f1a41e47706274614d03c13faaba64903b9ae50a415bad9f51e06946"} Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.802625 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ac14-account-create-update-6nxwp" event={"ID":"f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451","Type":"ContainerStarted","Data":"b4b31a02de1bf3eeadcac3a5e29c25b56d57105b52e3c61910d1652a9b1f7973"} Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.832268 4909 generic.go:334] "Generic (PLEG): container finished" podID="9e687182-7508-4ff5-8138-dd582e11cdc5" containerID="23ddc9046c50e2fdd5d5b05e3f9c550a643f31014c6f4b418a8b08f7bc758bc6" exitCode=0 Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.832317 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e687182-7508-4ff5-8138-dd582e11cdc5","Type":"ContainerDied","Data":"23ddc9046c50e2fdd5d5b05e3f9c550a643f31014c6f4b418a8b08f7bc758bc6"} Feb 02 10:52:15 crc kubenswrapper[4909]: I0202 10:52:15.868858 4909 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-5m44k"] Feb 02 10:52:16 crc kubenswrapper[4909]: I0202 10:52:16.002910 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-5fks9"] Feb 02 10:52:16 crc kubenswrapper[4909]: I0202 10:52:16.167323 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-abe3-account-create-update-47nm8"] Feb 02 10:52:16 crc kubenswrapper[4909]: I0202 10:52:16.310233 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-90c9-account-create-update-6nkzl"] Feb 02 10:52:16 crc kubenswrapper[4909]: I0202 10:52:16.533778 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-8588c46577-4cp8s" Feb 02 10:52:16 crc kubenswrapper[4909]: I0202 10:52:16.621444 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5d95b455f4-xnd62"] Feb 02 10:52:16 crc kubenswrapper[4909]: I0202 10:52:16.622131 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5d95b455f4-xnd62" podUID="ff044864-ba16-4d8f-86bc-7677e7d4f8ad" containerName="neutron-api" containerID="cri-o://6cbde56032c9246f4b60d498066a55f7ba8a171303e707d077f2323fcbf55008" gracePeriod=30 Feb 02 10:52:16 crc kubenswrapper[4909]: I0202 10:52:16.622655 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5d95b455f4-xnd62" podUID="ff044864-ba16-4d8f-86bc-7677e7d4f8ad" containerName="neutron-httpd" containerID="cri-o://1720ade48c28093480d355eaa93a6320aa69180f5c15d2c0424eab0e255eefda" gracePeriod=30 Feb 02 10:52:16 crc kubenswrapper[4909]: I0202 10:52:16.845276 4909 generic.go:334] "Generic (PLEG): container finished" podID="1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb" containerID="ce78989884566e501936aa0f9c402244d1dd8a9250194989d7173d57f7bcf1f8" exitCode=0 Feb 02 10:52:16 crc kubenswrapper[4909]: I0202 10:52:16.845335 4909 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell1-db-create-5fks9" event={"ID":"1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb","Type":"ContainerDied","Data":"ce78989884566e501936aa0f9c402244d1dd8a9250194989d7173d57f7bcf1f8"} Feb 02 10:52:16 crc kubenswrapper[4909]: I0202 10:52:16.845400 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5fks9" event={"ID":"1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb","Type":"ContainerStarted","Data":"b49b5624cf971dafaa560b1407615972e9128ace1a9f081d1b7a5ccad0d9c560"} Feb 02 10:52:16 crc kubenswrapper[4909]: I0202 10:52:16.846768 4909 generic.go:334] "Generic (PLEG): container finished" podID="f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451" containerID="aa861435346ae6420a59ce2b05c25881fc8f592ed7abec3f230f2dfe7c08cf09" exitCode=0 Feb 02 10:52:16 crc kubenswrapper[4909]: I0202 10:52:16.846847 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ac14-account-create-update-6nxwp" event={"ID":"f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451","Type":"ContainerDied","Data":"aa861435346ae6420a59ce2b05c25881fc8f592ed7abec3f230f2dfe7c08cf09"} Feb 02 10:52:16 crc kubenswrapper[4909]: I0202 10:52:16.852175 4909 generic.go:334] "Generic (PLEG): container finished" podID="ff044864-ba16-4d8f-86bc-7677e7d4f8ad" containerID="1720ade48c28093480d355eaa93a6320aa69180f5c15d2c0424eab0e255eefda" exitCode=0 Feb 02 10:52:16 crc kubenswrapper[4909]: I0202 10:52:16.852212 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d95b455f4-xnd62" event={"ID":"ff044864-ba16-4d8f-86bc-7677e7d4f8ad","Type":"ContainerDied","Data":"1720ade48c28093480d355eaa93a6320aa69180f5c15d2c0424eab0e255eefda"} Feb 02 10:52:16 crc kubenswrapper[4909]: I0202 10:52:16.857607 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-abe3-account-create-update-47nm8" 
event={"ID":"dd533d6b-9338-4896-b52c-c3123a5e0467","Type":"ContainerStarted","Data":"abe9b83cee5ffbc3e19cf902b840a1d2bf660fa5d7600b5b0657196354a95639"} Feb 02 10:52:16 crc kubenswrapper[4909]: I0202 10:52:16.857650 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-abe3-account-create-update-47nm8" event={"ID":"dd533d6b-9338-4896-b52c-c3123a5e0467","Type":"ContainerStarted","Data":"a9e300ace75a7df031cb5f45a61cd0b6ba4974111a0520bd0c0bcd9525d1ad29"} Feb 02 10:52:16 crc kubenswrapper[4909]: I0202 10:52:16.868478 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-90c9-account-create-update-6nkzl" event={"ID":"32e77d63-a0db-4845-80c5-9815b46a9e21","Type":"ContainerStarted","Data":"54beb5c8ca4e401e90a1f311f21072bcb862ee38775a25ba04703ad4a2386559"} Feb 02 10:52:16 crc kubenswrapper[4909]: I0202 10:52:16.868525 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-90c9-account-create-update-6nkzl" event={"ID":"32e77d63-a0db-4845-80c5-9815b46a9e21","Type":"ContainerStarted","Data":"bc24b82c72520772f6ab645be1ccc4ea1f931b445bb201f30237f1393e3dcc25"} Feb 02 10:52:16 crc kubenswrapper[4909]: I0202 10:52:16.871344 4909 generic.go:334] "Generic (PLEG): container finished" podID="aee6c054-0ce0-44c7-96aa-1722b6339cfe" containerID="d673f64542822bb2e56ba036a91f3532349c8a50f5cec241f137474be541104b" exitCode=0 Feb 02 10:52:16 crc kubenswrapper[4909]: I0202 10:52:16.871421 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5m44k" event={"ID":"aee6c054-0ce0-44c7-96aa-1722b6339cfe","Type":"ContainerDied","Data":"d673f64542822bb2e56ba036a91f3532349c8a50f5cec241f137474be541104b"} Feb 02 10:52:16 crc kubenswrapper[4909]: I0202 10:52:16.871710 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5m44k" 
event={"ID":"aee6c054-0ce0-44c7-96aa-1722b6339cfe","Type":"ContainerStarted","Data":"f08d4ebed6353e34eddcab7d11365f4366e2e4df743d3a147f6a6b4a69f12891"} Feb 02 10:52:16 crc kubenswrapper[4909]: I0202 10:52:16.874151 4909 generic.go:334] "Generic (PLEG): container finished" podID="d38222ad-04a2-4a42-8f8f-0789dc7c4c49" containerID="0de2c93e8b03385a8e5852fd9c213a372e98af76b96beea748839bfdc6ed33cd" exitCode=0 Feb 02 10:52:16 crc kubenswrapper[4909]: I0202 10:52:16.874249 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-fz2h7" event={"ID":"d38222ad-04a2-4a42-8f8f-0789dc7c4c49","Type":"ContainerDied","Data":"0de2c93e8b03385a8e5852fd9c213a372e98af76b96beea748839bfdc6ed33cd"} Feb 02 10:52:16 crc kubenswrapper[4909]: I0202 10:52:16.885434 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"14a8699a-66db-48c6-8834-bda4e21ef1d9","Type":"ContainerStarted","Data":"9b97792ee0643776ed45dd9bce3742b6344f3f3eb329b31a1d413958f371ef6e"} Feb 02 10:52:16 crc kubenswrapper[4909]: I0202 10:52:16.886081 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 02 10:52:16 crc kubenswrapper[4909]: I0202 10:52:16.908324 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-abe3-account-create-update-47nm8" podStartSLOduration=2.90830547 podStartE2EDuration="2.90830547s" podCreationTimestamp="2026-02-02 10:52:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:52:16.899753597 +0000 UTC m=+1262.645854332" watchObservedRunningTime="2026-02-02 10:52:16.90830547 +0000 UTC m=+1262.654406195" Feb 02 10:52:16 crc kubenswrapper[4909]: I0202 10:52:16.977078 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-90c9-account-create-update-6nkzl" podStartSLOduration=1.977055972 
podStartE2EDuration="1.977055972s" podCreationTimestamp="2026-02-02 10:52:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:52:16.967273534 +0000 UTC m=+1262.713374269" watchObservedRunningTime="2026-02-02 10:52:16.977055972 +0000 UTC m=+1262.723156707" Feb 02 10:52:17 crc kubenswrapper[4909]: I0202 10:52:17.001394 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.001370193 podStartE2EDuration="4.001370193s" podCreationTimestamp="2026-02-02 10:52:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:52:16.999365986 +0000 UTC m=+1262.745466721" watchObservedRunningTime="2026-02-02 10:52:17.001370193 +0000 UTC m=+1262.747470938" Feb 02 10:52:17 crc kubenswrapper[4909]: I0202 10:52:17.894561 4909 generic.go:334] "Generic (PLEG): container finished" podID="dd533d6b-9338-4896-b52c-c3123a5e0467" containerID="abe9b83cee5ffbc3e19cf902b840a1d2bf660fa5d7600b5b0657196354a95639" exitCode=0 Feb 02 10:52:17 crc kubenswrapper[4909]: I0202 10:52:17.894619 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-abe3-account-create-update-47nm8" event={"ID":"dd533d6b-9338-4896-b52c-c3123a5e0467","Type":"ContainerDied","Data":"abe9b83cee5ffbc3e19cf902b840a1d2bf660fa5d7600b5b0657196354a95639"} Feb 02 10:52:17 crc kubenswrapper[4909]: I0202 10:52:17.895874 4909 generic.go:334] "Generic (PLEG): container finished" podID="32e77d63-a0db-4845-80c5-9815b46a9e21" containerID="54beb5c8ca4e401e90a1f311f21072bcb862ee38775a25ba04703ad4a2386559" exitCode=0 Feb 02 10:52:17 crc kubenswrapper[4909]: I0202 10:52:17.895930 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-90c9-account-create-update-6nkzl" 
event={"ID":"32e77d63-a0db-4845-80c5-9815b46a9e21","Type":"ContainerDied","Data":"54beb5c8ca4e401e90a1f311f21072bcb862ee38775a25ba04703ad4a2386559"} Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.334628 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5fks9" Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.350114 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plnb7\" (UniqueName: \"kubernetes.io/projected/1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb-kube-api-access-plnb7\") pod \"1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb\" (UID: \"1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb\") " Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.350188 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb-operator-scripts\") pod \"1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb\" (UID: \"1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb\") " Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.350927 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb" (UID: "1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.371774 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb-kube-api-access-plnb7" (OuterVolumeSpecName: "kube-api-access-plnb7") pod "1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb" (UID: "1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb"). InnerVolumeSpecName "kube-api-access-plnb7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.397882 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="e462e484-c2ad-4d5f-85b8-7042663617ff" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.165:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.451797 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.451855 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plnb7\" (UniqueName: \"kubernetes.io/projected/1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb-kube-api-access-plnb7\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.565714 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-fz2h7" Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.572114 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-ac14-account-create-update-6nxwp" Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.582490 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-5m44k" Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.757227 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aee6c054-0ce0-44c7-96aa-1722b6339cfe-operator-scripts\") pod \"aee6c054-0ce0-44c7-96aa-1722b6339cfe\" (UID: \"aee6c054-0ce0-44c7-96aa-1722b6339cfe\") " Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.757283 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6xj2\" (UniqueName: \"kubernetes.io/projected/aee6c054-0ce0-44c7-96aa-1722b6339cfe-kube-api-access-s6xj2\") pod \"aee6c054-0ce0-44c7-96aa-1722b6339cfe\" (UID: \"aee6c054-0ce0-44c7-96aa-1722b6339cfe\") " Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.757378 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451-operator-scripts\") pod \"f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451\" (UID: \"f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451\") " Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.757470 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6x4s\" (UniqueName: \"kubernetes.io/projected/d38222ad-04a2-4a42-8f8f-0789dc7c4c49-kube-api-access-g6x4s\") pod \"d38222ad-04a2-4a42-8f8f-0789dc7c4c49\" (UID: \"d38222ad-04a2-4a42-8f8f-0789dc7c4c49\") " Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.757526 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d38222ad-04a2-4a42-8f8f-0789dc7c4c49-operator-scripts\") pod \"d38222ad-04a2-4a42-8f8f-0789dc7c4c49\" (UID: \"d38222ad-04a2-4a42-8f8f-0789dc7c4c49\") " Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.757659 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-45nzp\" (UniqueName: \"kubernetes.io/projected/f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451-kube-api-access-45nzp\") pod \"f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451\" (UID: \"f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451\") " Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.757754 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aee6c054-0ce0-44c7-96aa-1722b6339cfe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aee6c054-0ce0-44c7-96aa-1722b6339cfe" (UID: "aee6c054-0ce0-44c7-96aa-1722b6339cfe"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.758157 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aee6c054-0ce0-44c7-96aa-1722b6339cfe-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.758170 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451" (UID: "f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.758215 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d38222ad-04a2-4a42-8f8f-0789dc7c4c49-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d38222ad-04a2-4a42-8f8f-0789dc7c4c49" (UID: "d38222ad-04a2-4a42-8f8f-0789dc7c4c49"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.761346 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aee6c054-0ce0-44c7-96aa-1722b6339cfe-kube-api-access-s6xj2" (OuterVolumeSpecName: "kube-api-access-s6xj2") pod "aee6c054-0ce0-44c7-96aa-1722b6339cfe" (UID: "aee6c054-0ce0-44c7-96aa-1722b6339cfe"). InnerVolumeSpecName "kube-api-access-s6xj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.761758 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451-kube-api-access-45nzp" (OuterVolumeSpecName: "kube-api-access-45nzp") pod "f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451" (UID: "f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451"). InnerVolumeSpecName "kube-api-access-45nzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.763149 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d38222ad-04a2-4a42-8f8f-0789dc7c4c49-kube-api-access-g6x4s" (OuterVolumeSpecName: "kube-api-access-g6x4s") pod "d38222ad-04a2-4a42-8f8f-0789dc7c4c49" (UID: "d38222ad-04a2-4a42-8f8f-0789dc7c4c49"). InnerVolumeSpecName "kube-api-access-g6x4s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.860076 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45nzp\" (UniqueName: \"kubernetes.io/projected/f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451-kube-api-access-45nzp\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.860120 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6xj2\" (UniqueName: \"kubernetes.io/projected/aee6c054-0ce0-44c7-96aa-1722b6339cfe-kube-api-access-s6xj2\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.860134 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.860146 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6x4s\" (UniqueName: \"kubernetes.io/projected/d38222ad-04a2-4a42-8f8f-0789dc7c4c49-kube-api-access-g6x4s\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.860158 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d38222ad-04a2-4a42-8f8f-0789dc7c4c49-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.905531 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-fz2h7" event={"ID":"d38222ad-04a2-4a42-8f8f-0789dc7c4c49","Type":"ContainerDied","Data":"3f8e383075be8fb92a6cd5bb3112bd381d328da81d8a6358ee4c3182164ab01a"} Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.905575 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f8e383075be8fb92a6cd5bb3112bd381d328da81d8a6358ee4c3182164ab01a" Feb 02 10:52:18 crc 
kubenswrapper[4909]: I0202 10:52:18.905572 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-fz2h7" Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.907105 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-ac14-account-create-update-6nxwp" event={"ID":"f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451","Type":"ContainerDied","Data":"b4b31a02de1bf3eeadcac3a5e29c25b56d57105b52e3c61910d1652a9b1f7973"} Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.907152 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4b31a02de1bf3eeadcac3a5e29c25b56d57105b52e3c61910d1652a9b1f7973" Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.907162 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-ac14-account-create-update-6nxwp" Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.908831 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5fks9" event={"ID":"1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb","Type":"ContainerDied","Data":"b49b5624cf971dafaa560b1407615972e9128ace1a9f081d1b7a5ccad0d9c560"} Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.908862 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5fks9" Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.908868 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b49b5624cf971dafaa560b1407615972e9128ace1a9f081d1b7a5ccad0d9c560" Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.910219 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-5m44k" Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.917932 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5m44k" event={"ID":"aee6c054-0ce0-44c7-96aa-1722b6339cfe","Type":"ContainerDied","Data":"f08d4ebed6353e34eddcab7d11365f4366e2e4df743d3a147f6a6b4a69f12891"} Feb 02 10:52:18 crc kubenswrapper[4909]: I0202 10:52:18.917971 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f08d4ebed6353e34eddcab7d11365f4366e2e4df743d3a147f6a6b4a69f12891" Feb 02 10:52:19 crc kubenswrapper[4909]: I0202 10:52:19.400372 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-90c9-account-create-update-6nkzl" Feb 02 10:52:19 crc kubenswrapper[4909]: I0202 10:52:19.405900 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-abe3-account-create-update-47nm8" Feb 02 10:52:19 crc kubenswrapper[4909]: I0202 10:52:19.476424 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32e77d63-a0db-4845-80c5-9815b46a9e21-operator-scripts\") pod \"32e77d63-a0db-4845-80c5-9815b46a9e21\" (UID: \"32e77d63-a0db-4845-80c5-9815b46a9e21\") " Feb 02 10:52:19 crc kubenswrapper[4909]: I0202 10:52:19.476602 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh6s9\" (UniqueName: \"kubernetes.io/projected/32e77d63-a0db-4845-80c5-9815b46a9e21-kube-api-access-wh6s9\") pod \"32e77d63-a0db-4845-80c5-9815b46a9e21\" (UID: \"32e77d63-a0db-4845-80c5-9815b46a9e21\") " Feb 02 10:52:19 crc kubenswrapper[4909]: I0202 10:52:19.476963 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32e77d63-a0db-4845-80c5-9815b46a9e21-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"32e77d63-a0db-4845-80c5-9815b46a9e21" (UID: "32e77d63-a0db-4845-80c5-9815b46a9e21"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:52:19 crc kubenswrapper[4909]: I0202 10:52:19.476989 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd533d6b-9338-4896-b52c-c3123a5e0467-operator-scripts\") pod \"dd533d6b-9338-4896-b52c-c3123a5e0467\" (UID: \"dd533d6b-9338-4896-b52c-c3123a5e0467\") " Feb 02 10:52:19 crc kubenswrapper[4909]: I0202 10:52:19.477084 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jtrm\" (UniqueName: \"kubernetes.io/projected/dd533d6b-9338-4896-b52c-c3123a5e0467-kube-api-access-8jtrm\") pod \"dd533d6b-9338-4896-b52c-c3123a5e0467\" (UID: \"dd533d6b-9338-4896-b52c-c3123a5e0467\") " Feb 02 10:52:19 crc kubenswrapper[4909]: I0202 10:52:19.477306 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd533d6b-9338-4896-b52c-c3123a5e0467-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dd533d6b-9338-4896-b52c-c3123a5e0467" (UID: "dd533d6b-9338-4896-b52c-c3123a5e0467"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:52:19 crc kubenswrapper[4909]: I0202 10:52:19.477898 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd533d6b-9338-4896-b52c-c3123a5e0467-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:19 crc kubenswrapper[4909]: I0202 10:52:19.477920 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32e77d63-a0db-4845-80c5-9815b46a9e21-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:19 crc kubenswrapper[4909]: I0202 10:52:19.480627 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32e77d63-a0db-4845-80c5-9815b46a9e21-kube-api-access-wh6s9" (OuterVolumeSpecName: "kube-api-access-wh6s9") pod "32e77d63-a0db-4845-80c5-9815b46a9e21" (UID: "32e77d63-a0db-4845-80c5-9815b46a9e21"). InnerVolumeSpecName "kube-api-access-wh6s9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:52:19 crc kubenswrapper[4909]: I0202 10:52:19.481237 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd533d6b-9338-4896-b52c-c3123a5e0467-kube-api-access-8jtrm" (OuterVolumeSpecName: "kube-api-access-8jtrm") pod "dd533d6b-9338-4896-b52c-c3123a5e0467" (UID: "dd533d6b-9338-4896-b52c-c3123a5e0467"). InnerVolumeSpecName "kube-api-access-8jtrm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:52:19 crc kubenswrapper[4909]: I0202 10:52:19.580011 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh6s9\" (UniqueName: \"kubernetes.io/projected/32e77d63-a0db-4845-80c5-9815b46a9e21-kube-api-access-wh6s9\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:19 crc kubenswrapper[4909]: I0202 10:52:19.580050 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jtrm\" (UniqueName: \"kubernetes.io/projected/dd533d6b-9338-4896-b52c-c3123a5e0467-kube-api-access-8jtrm\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:19 crc kubenswrapper[4909]: I0202 10:52:19.919345 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-90c9-account-create-update-6nkzl" event={"ID":"32e77d63-a0db-4845-80c5-9815b46a9e21","Type":"ContainerDied","Data":"bc24b82c72520772f6ab645be1ccc4ea1f931b445bb201f30237f1393e3dcc25"} Feb 02 10:52:19 crc kubenswrapper[4909]: I0202 10:52:19.919390 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc24b82c72520772f6ab645be1ccc4ea1f931b445bb201f30237f1393e3dcc25" Feb 02 10:52:19 crc kubenswrapper[4909]: I0202 10:52:19.919361 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-90c9-account-create-update-6nkzl" Feb 02 10:52:19 crc kubenswrapper[4909]: I0202 10:52:19.921441 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-abe3-account-create-update-47nm8" event={"ID":"dd533d6b-9338-4896-b52c-c3123a5e0467","Type":"ContainerDied","Data":"a9e300ace75a7df031cb5f45a61cd0b6ba4974111a0520bd0c0bcd9525d1ad29"} Feb 02 10:52:19 crc kubenswrapper[4909]: I0202 10:52:19.921481 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9e300ace75a7df031cb5f45a61cd0b6ba4974111a0520bd0c0bcd9525d1ad29" Feb 02 10:52:19 crc kubenswrapper[4909]: I0202 10:52:19.921501 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-abe3-account-create-update-47nm8" Feb 02 10:52:21 crc kubenswrapper[4909]: I0202 10:52:21.510518 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5d95b455f4-xnd62" Feb 02 10:52:21 crc kubenswrapper[4909]: I0202 10:52:21.621513 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-ovndb-tls-certs\") pod \"ff044864-ba16-4d8f-86bc-7677e7d4f8ad\" (UID: \"ff044864-ba16-4d8f-86bc-7677e7d4f8ad\") " Feb 02 10:52:21 crc kubenswrapper[4909]: I0202 10:52:21.621567 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-combined-ca-bundle\") pod \"ff044864-ba16-4d8f-86bc-7677e7d4f8ad\" (UID: \"ff044864-ba16-4d8f-86bc-7677e7d4f8ad\") " Feb 02 10:52:21 crc kubenswrapper[4909]: I0202 10:52:21.621684 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-httpd-config\") pod 
\"ff044864-ba16-4d8f-86bc-7677e7d4f8ad\" (UID: \"ff044864-ba16-4d8f-86bc-7677e7d4f8ad\") " Feb 02 10:52:21 crc kubenswrapper[4909]: I0202 10:52:21.621726 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-config\") pod \"ff044864-ba16-4d8f-86bc-7677e7d4f8ad\" (UID: \"ff044864-ba16-4d8f-86bc-7677e7d4f8ad\") " Feb 02 10:52:21 crc kubenswrapper[4909]: I0202 10:52:21.621748 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mp84k\" (UniqueName: \"kubernetes.io/projected/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-kube-api-access-mp84k\") pod \"ff044864-ba16-4d8f-86bc-7677e7d4f8ad\" (UID: \"ff044864-ba16-4d8f-86bc-7677e7d4f8ad\") " Feb 02 10:52:21 crc kubenswrapper[4909]: I0202 10:52:21.630268 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-kube-api-access-mp84k" (OuterVolumeSpecName: "kube-api-access-mp84k") pod "ff044864-ba16-4d8f-86bc-7677e7d4f8ad" (UID: "ff044864-ba16-4d8f-86bc-7677e7d4f8ad"). InnerVolumeSpecName "kube-api-access-mp84k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:52:21 crc kubenswrapper[4909]: I0202 10:52:21.636973 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "ff044864-ba16-4d8f-86bc-7677e7d4f8ad" (UID: "ff044864-ba16-4d8f-86bc-7677e7d4f8ad"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:21 crc kubenswrapper[4909]: I0202 10:52:21.682747 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-config" (OuterVolumeSpecName: "config") pod "ff044864-ba16-4d8f-86bc-7677e7d4f8ad" (UID: "ff044864-ba16-4d8f-86bc-7677e7d4f8ad"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:21 crc kubenswrapper[4909]: I0202 10:52:21.701885 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff044864-ba16-4d8f-86bc-7677e7d4f8ad" (UID: "ff044864-ba16-4d8f-86bc-7677e7d4f8ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:21 crc kubenswrapper[4909]: I0202 10:52:21.721719 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "ff044864-ba16-4d8f-86bc-7677e7d4f8ad" (UID: "ff044864-ba16-4d8f-86bc-7677e7d4f8ad"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:21 crc kubenswrapper[4909]: I0202 10:52:21.723917 4909 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:21 crc kubenswrapper[4909]: I0202 10:52:21.724243 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:21 crc kubenswrapper[4909]: I0202 10:52:21.724404 4909 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:21 crc kubenswrapper[4909]: I0202 10:52:21.724474 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:21 crc kubenswrapper[4909]: I0202 10:52:21.724541 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mp84k\" (UniqueName: \"kubernetes.io/projected/ff044864-ba16-4d8f-86bc-7677e7d4f8ad-kube-api-access-mp84k\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:21 crc kubenswrapper[4909]: I0202 10:52:21.939363 4909 generic.go:334] "Generic (PLEG): container finished" podID="ff044864-ba16-4d8f-86bc-7677e7d4f8ad" containerID="6cbde56032c9246f4b60d498066a55f7ba8a171303e707d077f2323fcbf55008" exitCode=0 Feb 02 10:52:21 crc kubenswrapper[4909]: I0202 10:52:21.939426 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d95b455f4-xnd62" event={"ID":"ff044864-ba16-4d8f-86bc-7677e7d4f8ad","Type":"ContainerDied","Data":"6cbde56032c9246f4b60d498066a55f7ba8a171303e707d077f2323fcbf55008"} Feb 02 10:52:21 crc kubenswrapper[4909]: I0202 
10:52:21.939827 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d95b455f4-xnd62" event={"ID":"ff044864-ba16-4d8f-86bc-7677e7d4f8ad","Type":"ContainerDied","Data":"822edfe230c2fb83dc9f0ef06223974e9da2cd8e9c2730ddf58421ccee590ce3"} Feb 02 10:52:21 crc kubenswrapper[4909]: I0202 10:52:21.939851 4909 scope.go:117] "RemoveContainer" containerID="1720ade48c28093480d355eaa93a6320aa69180f5c15d2c0424eab0e255eefda" Feb 02 10:52:21 crc kubenswrapper[4909]: I0202 10:52:21.939444 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5d95b455f4-xnd62" Feb 02 10:52:21 crc kubenswrapper[4909]: I0202 10:52:21.961821 4909 scope.go:117] "RemoveContainer" containerID="6cbde56032c9246f4b60d498066a55f7ba8a171303e707d077f2323fcbf55008" Feb 02 10:52:21 crc kubenswrapper[4909]: I0202 10:52:21.982163 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5d95b455f4-xnd62"] Feb 02 10:52:21 crc kubenswrapper[4909]: I0202 10:52:21.992711 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5d95b455f4-xnd62"] Feb 02 10:52:21 crc kubenswrapper[4909]: I0202 10:52:21.995501 4909 scope.go:117] "RemoveContainer" containerID="1720ade48c28093480d355eaa93a6320aa69180f5c15d2c0424eab0e255eefda" Feb 02 10:52:21 crc kubenswrapper[4909]: E0202 10:52:21.997957 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1720ade48c28093480d355eaa93a6320aa69180f5c15d2c0424eab0e255eefda\": container with ID starting with 1720ade48c28093480d355eaa93a6320aa69180f5c15d2c0424eab0e255eefda not found: ID does not exist" containerID="1720ade48c28093480d355eaa93a6320aa69180f5c15d2c0424eab0e255eefda" Feb 02 10:52:21 crc kubenswrapper[4909]: I0202 10:52:21.998001 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1720ade48c28093480d355eaa93a6320aa69180f5c15d2c0424eab0e255eefda"} 
err="failed to get container status \"1720ade48c28093480d355eaa93a6320aa69180f5c15d2c0424eab0e255eefda\": rpc error: code = NotFound desc = could not find container \"1720ade48c28093480d355eaa93a6320aa69180f5c15d2c0424eab0e255eefda\": container with ID starting with 1720ade48c28093480d355eaa93a6320aa69180f5c15d2c0424eab0e255eefda not found: ID does not exist" Feb 02 10:52:21 crc kubenswrapper[4909]: I0202 10:52:21.998027 4909 scope.go:117] "RemoveContainer" containerID="6cbde56032c9246f4b60d498066a55f7ba8a171303e707d077f2323fcbf55008" Feb 02 10:52:21 crc kubenswrapper[4909]: E0202 10:52:21.998408 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cbde56032c9246f4b60d498066a55f7ba8a171303e707d077f2323fcbf55008\": container with ID starting with 6cbde56032c9246f4b60d498066a55f7ba8a171303e707d077f2323fcbf55008 not found: ID does not exist" containerID="6cbde56032c9246f4b60d498066a55f7ba8a171303e707d077f2323fcbf55008" Feb 02 10:52:21 crc kubenswrapper[4909]: I0202 10:52:21.998516 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cbde56032c9246f4b60d498066a55f7ba8a171303e707d077f2323fcbf55008"} err="failed to get container status \"6cbde56032c9246f4b60d498066a55f7ba8a171303e707d077f2323fcbf55008\": rpc error: code = NotFound desc = could not find container \"6cbde56032c9246f4b60d498066a55f7ba8a171303e707d077f2323fcbf55008\": container with ID starting with 6cbde56032c9246f4b60d498066a55f7ba8a171303e707d077f2323fcbf55008 not found: ID does not exist" Feb 02 10:52:23 crc kubenswrapper[4909]: I0202 10:52:23.027250 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff044864-ba16-4d8f-86bc-7677e7d4f8ad" path="/var/lib/kubelet/pods/ff044864-ba16-4d8f-86bc-7677e7d4f8ad/volumes" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.295178 4909 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell0-conductor-db-sync-5vq8t"] Feb 02 10:52:25 crc kubenswrapper[4909]: E0202 10:52:25.296523 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d38222ad-04a2-4a42-8f8f-0789dc7c4c49" containerName="mariadb-database-create" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.296604 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d38222ad-04a2-4a42-8f8f-0789dc7c4c49" containerName="mariadb-database-create" Feb 02 10:52:25 crc kubenswrapper[4909]: E0202 10:52:25.296663 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff044864-ba16-4d8f-86bc-7677e7d4f8ad" containerName="neutron-api" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.296726 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff044864-ba16-4d8f-86bc-7677e7d4f8ad" containerName="neutron-api" Feb 02 10:52:25 crc kubenswrapper[4909]: E0202 10:52:25.296786 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32e77d63-a0db-4845-80c5-9815b46a9e21" containerName="mariadb-account-create-update" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.296861 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="32e77d63-a0db-4845-80c5-9815b46a9e21" containerName="mariadb-account-create-update" Feb 02 10:52:25 crc kubenswrapper[4909]: E0202 10:52:25.296933 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb" containerName="mariadb-database-create" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.297011 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb" containerName="mariadb-database-create" Feb 02 10:52:25 crc kubenswrapper[4909]: E0202 10:52:25.297074 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd533d6b-9338-4896-b52c-c3123a5e0467" containerName="mariadb-account-create-update" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.297130 4909 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="dd533d6b-9338-4896-b52c-c3123a5e0467" containerName="mariadb-account-create-update" Feb 02 10:52:25 crc kubenswrapper[4909]: E0202 10:52:25.297183 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451" containerName="mariadb-account-create-update" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.297235 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451" containerName="mariadb-account-create-update" Feb 02 10:52:25 crc kubenswrapper[4909]: E0202 10:52:25.297289 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff044864-ba16-4d8f-86bc-7677e7d4f8ad" containerName="neutron-httpd" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.297344 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff044864-ba16-4d8f-86bc-7677e7d4f8ad" containerName="neutron-httpd" Feb 02 10:52:25 crc kubenswrapper[4909]: E0202 10:52:25.297404 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee6c054-0ce0-44c7-96aa-1722b6339cfe" containerName="mariadb-database-create" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.297454 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee6c054-0ce0-44c7-96aa-1722b6339cfe" containerName="mariadb-database-create" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.297700 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd533d6b-9338-4896-b52c-c3123a5e0467" containerName="mariadb-account-create-update" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.297761 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451" containerName="mariadb-account-create-update" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.297836 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff044864-ba16-4d8f-86bc-7677e7d4f8ad" containerName="neutron-api" Feb 02 10:52:25 
crc kubenswrapper[4909]: I0202 10:52:25.297895 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d38222ad-04a2-4a42-8f8f-0789dc7c4c49" containerName="mariadb-database-create" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.297960 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="32e77d63-a0db-4845-80c5-9815b46a9e21" containerName="mariadb-account-create-update" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.298037 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff044864-ba16-4d8f-86bc-7677e7d4f8ad" containerName="neutron-httpd" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.298092 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="aee6c054-0ce0-44c7-96aa-1722b6339cfe" containerName="mariadb-database-create" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.298155 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb" containerName="mariadb-database-create" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.298774 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5vq8t" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.304771 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5vq8t"] Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.304790 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.304859 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-dqv7l" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.305123 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.492153 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36fe6eb3-17e6-417d-86c8-5b776beb7ddd-config-data\") pod \"nova-cell0-conductor-db-sync-5vq8t\" (UID: \"36fe6eb3-17e6-417d-86c8-5b776beb7ddd\") " pod="openstack/nova-cell0-conductor-db-sync-5vq8t" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.492508 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fe6eb3-17e6-417d-86c8-5b776beb7ddd-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5vq8t\" (UID: \"36fe6eb3-17e6-417d-86c8-5b776beb7ddd\") " pod="openstack/nova-cell0-conductor-db-sync-5vq8t" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.492560 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36fe6eb3-17e6-417d-86c8-5b776beb7ddd-scripts\") pod \"nova-cell0-conductor-db-sync-5vq8t\" (UID: \"36fe6eb3-17e6-417d-86c8-5b776beb7ddd\") " 
pod="openstack/nova-cell0-conductor-db-sync-5vq8t" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.492599 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5txg6\" (UniqueName: \"kubernetes.io/projected/36fe6eb3-17e6-417d-86c8-5b776beb7ddd-kube-api-access-5txg6\") pod \"nova-cell0-conductor-db-sync-5vq8t\" (UID: \"36fe6eb3-17e6-417d-86c8-5b776beb7ddd\") " pod="openstack/nova-cell0-conductor-db-sync-5vq8t" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.593767 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36fe6eb3-17e6-417d-86c8-5b776beb7ddd-scripts\") pod \"nova-cell0-conductor-db-sync-5vq8t\" (UID: \"36fe6eb3-17e6-417d-86c8-5b776beb7ddd\") " pod="openstack/nova-cell0-conductor-db-sync-5vq8t" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.593868 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5txg6\" (UniqueName: \"kubernetes.io/projected/36fe6eb3-17e6-417d-86c8-5b776beb7ddd-kube-api-access-5txg6\") pod \"nova-cell0-conductor-db-sync-5vq8t\" (UID: \"36fe6eb3-17e6-417d-86c8-5b776beb7ddd\") " pod="openstack/nova-cell0-conductor-db-sync-5vq8t" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.593987 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36fe6eb3-17e6-417d-86c8-5b776beb7ddd-config-data\") pod \"nova-cell0-conductor-db-sync-5vq8t\" (UID: \"36fe6eb3-17e6-417d-86c8-5b776beb7ddd\") " pod="openstack/nova-cell0-conductor-db-sync-5vq8t" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.594023 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fe6eb3-17e6-417d-86c8-5b776beb7ddd-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5vq8t\" (UID: 
\"36fe6eb3-17e6-417d-86c8-5b776beb7ddd\") " pod="openstack/nova-cell0-conductor-db-sync-5vq8t" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.600303 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36fe6eb3-17e6-417d-86c8-5b776beb7ddd-scripts\") pod \"nova-cell0-conductor-db-sync-5vq8t\" (UID: \"36fe6eb3-17e6-417d-86c8-5b776beb7ddd\") " pod="openstack/nova-cell0-conductor-db-sync-5vq8t" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.600447 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fe6eb3-17e6-417d-86c8-5b776beb7ddd-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5vq8t\" (UID: \"36fe6eb3-17e6-417d-86c8-5b776beb7ddd\") " pod="openstack/nova-cell0-conductor-db-sync-5vq8t" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.607335 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36fe6eb3-17e6-417d-86c8-5b776beb7ddd-config-data\") pod \"nova-cell0-conductor-db-sync-5vq8t\" (UID: \"36fe6eb3-17e6-417d-86c8-5b776beb7ddd\") " pod="openstack/nova-cell0-conductor-db-sync-5vq8t" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.612005 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5txg6\" (UniqueName: \"kubernetes.io/projected/36fe6eb3-17e6-417d-86c8-5b776beb7ddd-kube-api-access-5txg6\") pod \"nova-cell0-conductor-db-sync-5vq8t\" (UID: \"36fe6eb3-17e6-417d-86c8-5b776beb7ddd\") " pod="openstack/nova-cell0-conductor-db-sync-5vq8t" Feb 02 10:52:25 crc kubenswrapper[4909]: I0202 10:52:25.615098 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5vq8t" Feb 02 10:52:26 crc kubenswrapper[4909]: I0202 10:52:26.071705 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5vq8t"] Feb 02 10:52:26 crc kubenswrapper[4909]: I0202 10:52:26.238548 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 02 10:52:26 crc kubenswrapper[4909]: I0202 10:52:26.987727 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5vq8t" event={"ID":"36fe6eb3-17e6-417d-86c8-5b776beb7ddd","Type":"ContainerStarted","Data":"dd49b8d1924f8c336454cb309090a4582466a54ca0c0c0c8388499c79c1845ee"} Feb 02 10:52:29 crc kubenswrapper[4909]: I0202 10:52:29.907439 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:52:29 crc kubenswrapper[4909]: I0202 10:52:29.908282 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fb15af0b-4954-4dba-b189-c193408924f3" containerName="glance-log" containerID="cri-o://288ab4ff8642e013b75d14b82ca3c9040c1f5bffb9921cc325eee31940359f37" gracePeriod=30 Feb 02 10:52:29 crc kubenswrapper[4909]: I0202 10:52:29.908688 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fb15af0b-4954-4dba-b189-c193408924f3" containerName="glance-httpd" containerID="cri-o://2d2a240aae0aedcdbd22559ad9807a9f22789e0f73a8c28a1081b7787a008e44" gracePeriod=30 Feb 02 10:52:31 crc kubenswrapper[4909]: I0202 10:52:31.039462 4909 generic.go:334] "Generic (PLEG): container finished" podID="fb15af0b-4954-4dba-b189-c193408924f3" containerID="288ab4ff8642e013b75d14b82ca3c9040c1f5bffb9921cc325eee31940359f37" exitCode=143 Feb 02 10:52:31 crc kubenswrapper[4909]: I0202 10:52:31.039557 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"fb15af0b-4954-4dba-b189-c193408924f3","Type":"ContainerDied","Data":"288ab4ff8642e013b75d14b82ca3c9040c1f5bffb9921cc325eee31940359f37"} Feb 02 10:52:32 crc kubenswrapper[4909]: I0202 10:52:32.117768 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="9e687182-7508-4ff5-8138-dd582e11cdc5" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 02 10:52:32 crc kubenswrapper[4909]: I0202 10:52:32.537410 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:52:32 crc kubenswrapper[4909]: I0202 10:52:32.537684 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3f4bc941-d43d-4e64-b0b6-a677ab0374f8" containerName="glance-log" containerID="cri-o://60e7a89b43e404c571abf85b133ea6e9e4167764ad23f1e40b9ac64785177e7f" gracePeriod=30 Feb 02 10:52:32 crc kubenswrapper[4909]: I0202 10:52:32.538113 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3f4bc941-d43d-4e64-b0b6-a677ab0374f8" containerName="glance-httpd" containerID="cri-o://8567952a983ebcc3a59a1c216958b61dc8e42144faabd6a76f371350ce4e5467" gracePeriod=30 Feb 02 10:52:33 crc kubenswrapper[4909]: I0202 10:52:33.067138 4909 generic.go:334] "Generic (PLEG): container finished" podID="3f4bc941-d43d-4e64-b0b6-a677ab0374f8" containerID="60e7a89b43e404c571abf85b133ea6e9e4167764ad23f1e40b9ac64785177e7f" exitCode=143 Feb 02 10:52:33 crc kubenswrapper[4909]: I0202 10:52:33.067210 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3f4bc941-d43d-4e64-b0b6-a677ab0374f8","Type":"ContainerDied","Data":"60e7a89b43e404c571abf85b133ea6e9e4167764ad23f1e40b9ac64785177e7f"} Feb 02 10:52:33 crc kubenswrapper[4909]: I0202 
10:52:33.963435 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.057730 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb15af0b-4954-4dba-b189-c193408924f3-logs\") pod \"fb15af0b-4954-4dba-b189-c193408924f3\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.058413 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb15af0b-4954-4dba-b189-c193408924f3-logs" (OuterVolumeSpecName: "logs") pod "fb15af0b-4954-4dba-b189-c193408924f3" (UID: "fb15af0b-4954-4dba-b189-c193408924f3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.058602 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb15af0b-4954-4dba-b189-c193408924f3-scripts\") pod \"fb15af0b-4954-4dba-b189-c193408924f3\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.058678 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb15af0b-4954-4dba-b189-c193408924f3-combined-ca-bundle\") pod \"fb15af0b-4954-4dba-b189-c193408924f3\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.059280 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpqmq\" (UniqueName: \"kubernetes.io/projected/fb15af0b-4954-4dba-b189-c193408924f3-kube-api-access-tpqmq\") pod \"fb15af0b-4954-4dba-b189-c193408924f3\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " Feb 02 10:52:34 crc kubenswrapper[4909]: 
I0202 10:52:34.059388 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"fb15af0b-4954-4dba-b189-c193408924f3\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.059647 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb15af0b-4954-4dba-b189-c193408924f3-httpd-run\") pod \"fb15af0b-4954-4dba-b189-c193408924f3\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.059721 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb15af0b-4954-4dba-b189-c193408924f3-public-tls-certs\") pod \"fb15af0b-4954-4dba-b189-c193408924f3\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.059754 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb15af0b-4954-4dba-b189-c193408924f3-config-data\") pod \"fb15af0b-4954-4dba-b189-c193408924f3\" (UID: \"fb15af0b-4954-4dba-b189-c193408924f3\") " Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.060517 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb15af0b-4954-4dba-b189-c193408924f3-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.061060 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb15af0b-4954-4dba-b189-c193408924f3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fb15af0b-4954-4dba-b189-c193408924f3" (UID: "fb15af0b-4954-4dba-b189-c193408924f3"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.088187 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb15af0b-4954-4dba-b189-c193408924f3-scripts" (OuterVolumeSpecName: "scripts") pod "fb15af0b-4954-4dba-b189-c193408924f3" (UID: "fb15af0b-4954-4dba-b189-c193408924f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.088299 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "fb15af0b-4954-4dba-b189-c193408924f3" (UID: "fb15af0b-4954-4dba-b189-c193408924f3"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.089037 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb15af0b-4954-4dba-b189-c193408924f3-kube-api-access-tpqmq" (OuterVolumeSpecName: "kube-api-access-tpqmq") pod "fb15af0b-4954-4dba-b189-c193408924f3" (UID: "fb15af0b-4954-4dba-b189-c193408924f3"). InnerVolumeSpecName "kube-api-access-tpqmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.106238 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb15af0b-4954-4dba-b189-c193408924f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb15af0b-4954-4dba-b189-c193408924f3" (UID: "fb15af0b-4954-4dba-b189-c193408924f3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.109643 4909 generic.go:334] "Generic (PLEG): container finished" podID="fb15af0b-4954-4dba-b189-c193408924f3" containerID="2d2a240aae0aedcdbd22559ad9807a9f22789e0f73a8c28a1081b7787a008e44" exitCode=0 Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.109720 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fb15af0b-4954-4dba-b189-c193408924f3","Type":"ContainerDied","Data":"2d2a240aae0aedcdbd22559ad9807a9f22789e0f73a8c28a1081b7787a008e44"} Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.109752 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fb15af0b-4954-4dba-b189-c193408924f3","Type":"ContainerDied","Data":"f7ef12baf139de721318198c27daa0a8a8c9f344dd87883d50457ca4353e8ce7"} Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.109772 4909 scope.go:117] "RemoveContainer" containerID="2d2a240aae0aedcdbd22559ad9807a9f22789e0f73a8c28a1081b7787a008e44" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.109908 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.111960 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5vq8t" event={"ID":"36fe6eb3-17e6-417d-86c8-5b776beb7ddd","Type":"ContainerStarted","Data":"e73ee1c3eb5c48046de751f50517094408c4995c4f0ca716981bc10d3c559f1f"} Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.166007 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb15af0b-4954-4dba-b189-c193408924f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.166043 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpqmq\" (UniqueName: \"kubernetes.io/projected/fb15af0b-4954-4dba-b189-c193408924f3-kube-api-access-tpqmq\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.166067 4909 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.166078 4909 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fb15af0b-4954-4dba-b189-c193408924f3-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.166090 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb15af0b-4954-4dba-b189-c193408924f3-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.190698 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-5vq8t" podStartSLOduration=1.58717401 podStartE2EDuration="9.190682423s" podCreationTimestamp="2026-02-02 
10:52:25 +0000 UTC" firstStartedPulling="2026-02-02 10:52:26.075053969 +0000 UTC m=+1271.821154704" lastFinishedPulling="2026-02-02 10:52:33.678562382 +0000 UTC m=+1279.424663117" observedRunningTime="2026-02-02 10:52:34.166095775 +0000 UTC m=+1279.912196510" watchObservedRunningTime="2026-02-02 10:52:34.190682423 +0000 UTC m=+1279.936783158" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.195698 4909 scope.go:117] "RemoveContainer" containerID="288ab4ff8642e013b75d14b82ca3c9040c1f5bffb9921cc325eee31940359f37" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.196634 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb15af0b-4954-4dba-b189-c193408924f3-config-data" (OuterVolumeSpecName: "config-data") pod "fb15af0b-4954-4dba-b189-c193408924f3" (UID: "fb15af0b-4954-4dba-b189-c193408924f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.210271 4909 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.260767 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb15af0b-4954-4dba-b189-c193408924f3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fb15af0b-4954-4dba-b189-c193408924f3" (UID: "fb15af0b-4954-4dba-b189-c193408924f3"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.261005 4909 scope.go:117] "RemoveContainer" containerID="2d2a240aae0aedcdbd22559ad9807a9f22789e0f73a8c28a1081b7787a008e44" Feb 02 10:52:34 crc kubenswrapper[4909]: E0202 10:52:34.261525 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d2a240aae0aedcdbd22559ad9807a9f22789e0f73a8c28a1081b7787a008e44\": container with ID starting with 2d2a240aae0aedcdbd22559ad9807a9f22789e0f73a8c28a1081b7787a008e44 not found: ID does not exist" containerID="2d2a240aae0aedcdbd22559ad9807a9f22789e0f73a8c28a1081b7787a008e44" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.261569 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d2a240aae0aedcdbd22559ad9807a9f22789e0f73a8c28a1081b7787a008e44"} err="failed to get container status \"2d2a240aae0aedcdbd22559ad9807a9f22789e0f73a8c28a1081b7787a008e44\": rpc error: code = NotFound desc = could not find container \"2d2a240aae0aedcdbd22559ad9807a9f22789e0f73a8c28a1081b7787a008e44\": container with ID starting with 2d2a240aae0aedcdbd22559ad9807a9f22789e0f73a8c28a1081b7787a008e44 not found: ID does not exist" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.261599 4909 scope.go:117] "RemoveContainer" containerID="288ab4ff8642e013b75d14b82ca3c9040c1f5bffb9921cc325eee31940359f37" Feb 02 10:52:34 crc kubenswrapper[4909]: E0202 10:52:34.261867 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"288ab4ff8642e013b75d14b82ca3c9040c1f5bffb9921cc325eee31940359f37\": container with ID starting with 288ab4ff8642e013b75d14b82ca3c9040c1f5bffb9921cc325eee31940359f37 not found: ID does not exist" containerID="288ab4ff8642e013b75d14b82ca3c9040c1f5bffb9921cc325eee31940359f37" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.261906 
4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"288ab4ff8642e013b75d14b82ca3c9040c1f5bffb9921cc325eee31940359f37"} err="failed to get container status \"288ab4ff8642e013b75d14b82ca3c9040c1f5bffb9921cc325eee31940359f37\": rpc error: code = NotFound desc = could not find container \"288ab4ff8642e013b75d14b82ca3c9040c1f5bffb9921cc325eee31940359f37\": container with ID starting with 288ab4ff8642e013b75d14b82ca3c9040c1f5bffb9921cc325eee31940359f37 not found: ID does not exist" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.270185 4909 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.270217 4909 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb15af0b-4954-4dba-b189-c193408924f3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.270228 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb15af0b-4954-4dba-b189-c193408924f3-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.437957 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.446925 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.516991 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:52:34 crc kubenswrapper[4909]: E0202 10:52:34.517487 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb15af0b-4954-4dba-b189-c193408924f3" containerName="glance-httpd" Feb 
02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.517516 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb15af0b-4954-4dba-b189-c193408924f3" containerName="glance-httpd" Feb 02 10:52:34 crc kubenswrapper[4909]: E0202 10:52:34.517538 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb15af0b-4954-4dba-b189-c193408924f3" containerName="glance-log" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.517548 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb15af0b-4954-4dba-b189-c193408924f3" containerName="glance-log" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.517773 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb15af0b-4954-4dba-b189-c193408924f3" containerName="glance-log" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.517862 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb15af0b-4954-4dba-b189-c193408924f3" containerName="glance-httpd" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.518778 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.521820 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.522042 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.552213 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.694584 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c52b752-391b-4770-9191-3494df4e3999-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.695024 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwc9w\" (UniqueName: \"kubernetes.io/projected/8c52b752-391b-4770-9191-3494df4e3999-kube-api-access-pwc9w\") pod \"glance-default-external-api-0\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.695092 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c52b752-391b-4770-9191-3494df4e3999-config-data\") pod \"glance-default-external-api-0\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.695239 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/8c52b752-391b-4770-9191-3494df4e3999-scripts\") pod \"glance-default-external-api-0\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.695293 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c52b752-391b-4770-9191-3494df4e3999-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.695360 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.695414 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c52b752-391b-4770-9191-3494df4e3999-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.695451 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c52b752-391b-4770-9191-3494df4e3999-logs\") pod \"glance-default-external-api-0\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.797267 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8c52b752-391b-4770-9191-3494df4e3999-scripts\") pod \"glance-default-external-api-0\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.797345 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c52b752-391b-4770-9191-3494df4e3999-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.797396 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.797432 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c52b752-391b-4770-9191-3494df4e3999-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.797468 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c52b752-391b-4770-9191-3494df4e3999-logs\") pod \"glance-default-external-api-0\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.797510 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c52b752-391b-4770-9191-3494df4e3999-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.797549 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwc9w\" (UniqueName: \"kubernetes.io/projected/8c52b752-391b-4770-9191-3494df4e3999-kube-api-access-pwc9w\") pod \"glance-default-external-api-0\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.797568 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c52b752-391b-4770-9191-3494df4e3999-config-data\") pod \"glance-default-external-api-0\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.797968 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.798500 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c52b752-391b-4770-9191-3494df4e3999-logs\") pod \"glance-default-external-api-0\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.801053 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c52b752-391b-4770-9191-3494df4e3999-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"8c52b752-391b-4770-9191-3494df4e3999\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.801324 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c52b752-391b-4770-9191-3494df4e3999-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.801470 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c52b752-391b-4770-9191-3494df4e3999-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.801981 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c52b752-391b-4770-9191-3494df4e3999-config-data\") pod \"glance-default-external-api-0\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.805374 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c52b752-391b-4770-9191-3494df4e3999-scripts\") pod \"glance-default-external-api-0\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.830247 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 
10:52:34.833554 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwc9w\" (UniqueName: \"kubernetes.io/projected/8c52b752-391b-4770-9191-3494df4e3999-kube-api-access-pwc9w\") pod \"glance-default-external-api-0\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " pod="openstack/glance-default-external-api-0" Feb 02 10:52:34 crc kubenswrapper[4909]: I0202 10:52:34.850293 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 10:52:35 crc kubenswrapper[4909]: I0202 10:52:35.031101 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb15af0b-4954-4dba-b189-c193408924f3" path="/var/lib/kubelet/pods/fb15af0b-4954-4dba-b189-c193408924f3/volumes" Feb 02 10:52:35 crc kubenswrapper[4909]: I0202 10:52:35.425080 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.157307 4909 generic.go:334] "Generic (PLEG): container finished" podID="3f4bc941-d43d-4e64-b0b6-a677ab0374f8" containerID="8567952a983ebcc3a59a1c216958b61dc8e42144faabd6a76f371350ce4e5467" exitCode=0 Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.157588 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3f4bc941-d43d-4e64-b0b6-a677ab0374f8","Type":"ContainerDied","Data":"8567952a983ebcc3a59a1c216958b61dc8e42144faabd6a76f371350ce4e5467"} Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.159930 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8c52b752-391b-4770-9191-3494df4e3999","Type":"ContainerStarted","Data":"3f7435b37c532d142cc3a2e44d9a5f007446b4a37f3b1f4607541c257406cba7"} Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.159963 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"8c52b752-391b-4770-9191-3494df4e3999","Type":"ContainerStarted","Data":"62b4ce862ca6d33d4c3131f7c2d0009cb4ae7a959c2b8124e274d729561a329f"} Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.187079 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.323441 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-internal-tls-certs\") pod \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.323487 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-combined-ca-bundle\") pod \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.323595 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jph9d\" (UniqueName: \"kubernetes.io/projected/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-kube-api-access-jph9d\") pod \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.323627 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-scripts\") pod \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.323693 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-httpd-run\") pod \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.323727 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-logs\") pod \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.323767 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-config-data\") pod \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.323789 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\" (UID: \"3f4bc941-d43d-4e64-b0b6-a677ab0374f8\") " Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.324324 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3f4bc941-d43d-4e64-b0b6-a677ab0374f8" (UID: "3f4bc941-d43d-4e64-b0b6-a677ab0374f8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.324391 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-logs" (OuterVolumeSpecName: "logs") pod "3f4bc941-d43d-4e64-b0b6-a677ab0374f8" (UID: "3f4bc941-d43d-4e64-b0b6-a677ab0374f8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.324792 4909 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.324940 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.330236 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "3f4bc941-d43d-4e64-b0b6-a677ab0374f8" (UID: "3f4bc941-d43d-4e64-b0b6-a677ab0374f8"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.331223 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-scripts" (OuterVolumeSpecName: "scripts") pod "3f4bc941-d43d-4e64-b0b6-a677ab0374f8" (UID: "3f4bc941-d43d-4e64-b0b6-a677ab0374f8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.364552 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-kube-api-access-jph9d" (OuterVolumeSpecName: "kube-api-access-jph9d") pod "3f4bc941-d43d-4e64-b0b6-a677ab0374f8" (UID: "3f4bc941-d43d-4e64-b0b6-a677ab0374f8"). InnerVolumeSpecName "kube-api-access-jph9d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.381902 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-config-data" (OuterVolumeSpecName: "config-data") pod "3f4bc941-d43d-4e64-b0b6-a677ab0374f8" (UID: "3f4bc941-d43d-4e64-b0b6-a677ab0374f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.391767 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f4bc941-d43d-4e64-b0b6-a677ab0374f8" (UID: "3f4bc941-d43d-4e64-b0b6-a677ab0374f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.410863 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3f4bc941-d43d-4e64-b0b6-a677ab0374f8" (UID: "3f4bc941-d43d-4e64-b0b6-a677ab0374f8"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.427105 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.427180 4909 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.427196 4909 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.427210 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.427222 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jph9d\" (UniqueName: \"kubernetes.io/projected/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-kube-api-access-jph9d\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.427236 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f4bc941-d43d-4e64-b0b6-a677ab0374f8-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.449723 4909 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Feb 02 10:52:36 crc kubenswrapper[4909]: I0202 10:52:36.528731 4909 reconciler_common.go:293] "Volume detached for volume 
\"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.172354 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3f4bc941-d43d-4e64-b0b6-a677ab0374f8","Type":"ContainerDied","Data":"b86f86d1a6d89fa117c2c872986406a245dda9f89fcc9b2e998480c4365277de"} Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.172427 4909 scope.go:117] "RemoveContainer" containerID="8567952a983ebcc3a59a1c216958b61dc8e42144faabd6a76f371350ce4e5467" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.172498 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.195400 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.195584 4909 scope.go:117] "RemoveContainer" containerID="60e7a89b43e404c571abf85b133ea6e9e4167764ad23f1e40b9ac64785177e7f" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.203535 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.228915 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:52:37 crc kubenswrapper[4909]: E0202 10:52:37.229285 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f4bc941-d43d-4e64-b0b6-a677ab0374f8" containerName="glance-log" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.229303 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f4bc941-d43d-4e64-b0b6-a677ab0374f8" containerName="glance-log" Feb 02 10:52:37 crc kubenswrapper[4909]: E0202 10:52:37.229334 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3f4bc941-d43d-4e64-b0b6-a677ab0374f8" containerName="glance-httpd" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.229340 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f4bc941-d43d-4e64-b0b6-a677ab0374f8" containerName="glance-httpd" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.229496 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f4bc941-d43d-4e64-b0b6-a677ab0374f8" containerName="glance-log" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.229523 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f4bc941-d43d-4e64-b0b6-a677ab0374f8" containerName="glance-httpd" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.230432 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.232930 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.233285 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.238616 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.344109 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.344404 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/97409ffd-f1ab-4a1a-9939-a041a4085b1a-logs\") pod \"glance-default-internal-api-0\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.344482 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97409ffd-f1ab-4a1a-9939-a041a4085b1a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.344523 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97409ffd-f1ab-4a1a-9939-a041a4085b1a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.344598 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/97409ffd-f1ab-4a1a-9939-a041a4085b1a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.344619 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhq9p\" (UniqueName: \"kubernetes.io/projected/97409ffd-f1ab-4a1a-9939-a041a4085b1a-kube-api-access-xhq9p\") pod \"glance-default-internal-api-0\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.344664 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97409ffd-f1ab-4a1a-9939-a041a4085b1a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.344682 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97409ffd-f1ab-4a1a-9939-a041a4085b1a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.446051 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97409ffd-f1ab-4a1a-9939-a041a4085b1a-logs\") pod \"glance-default-internal-api-0\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.446103 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97409ffd-f1ab-4a1a-9939-a041a4085b1a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.446124 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97409ffd-f1ab-4a1a-9939-a041a4085b1a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.446153 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/97409ffd-f1ab-4a1a-9939-a041a4085b1a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.446169 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhq9p\" (UniqueName: \"kubernetes.io/projected/97409ffd-f1ab-4a1a-9939-a041a4085b1a-kube-api-access-xhq9p\") pod \"glance-default-internal-api-0\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.446194 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97409ffd-f1ab-4a1a-9939-a041a4085b1a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.446209 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97409ffd-f1ab-4a1a-9939-a041a4085b1a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.446295 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.446695 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.446724 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97409ffd-f1ab-4a1a-9939-a041a4085b1a-logs\") pod \"glance-default-internal-api-0\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.447159 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/97409ffd-f1ab-4a1a-9939-a041a4085b1a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.451159 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97409ffd-f1ab-4a1a-9939-a041a4085b1a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.451979 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97409ffd-f1ab-4a1a-9939-a041a4085b1a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.454880 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97409ffd-f1ab-4a1a-9939-a041a4085b1a-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.460437 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97409ffd-f1ab-4a1a-9939-a041a4085b1a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.478303 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhq9p\" (UniqueName: \"kubernetes.io/projected/97409ffd-f1ab-4a1a-9939-a041a4085b1a-kube-api-access-xhq9p\") pod \"glance-default-internal-api-0\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.478643 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " pod="openstack/glance-default-internal-api-0" Feb 02 10:52:37 crc kubenswrapper[4909]: I0202 10:52:37.546309 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 10:52:38 crc kubenswrapper[4909]: I0202 10:52:38.090154 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:52:38 crc kubenswrapper[4909]: I0202 10:52:38.204090 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"97409ffd-f1ab-4a1a-9939-a041a4085b1a","Type":"ContainerStarted","Data":"7262c0c2c09b87e3d707614ee0dbfc03f24e14199b066cf7824b4f081f53c249"} Feb 02 10:52:39 crc kubenswrapper[4909]: I0202 10:52:39.033442 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f4bc941-d43d-4e64-b0b6-a677ab0374f8" path="/var/lib/kubelet/pods/3f4bc941-d43d-4e64-b0b6-a677ab0374f8/volumes" Feb 02 10:52:39 crc kubenswrapper[4909]: I0202 10:52:39.217117 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"97409ffd-f1ab-4a1a-9939-a041a4085b1a","Type":"ContainerStarted","Data":"ecc6af1fec99e4b5728487ec159109b228720e7e811d71b0bc2fd00d2e8a68cc"} Feb 02 10:52:39 crc kubenswrapper[4909]: I0202 10:52:39.217185 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"97409ffd-f1ab-4a1a-9939-a041a4085b1a","Type":"ContainerStarted","Data":"1ea9ba4798372399476cd86a9d7248e0dc1ac843d30751cd5067d09ad4963c62"} Feb 02 10:52:39 crc kubenswrapper[4909]: I0202 10:52:39.220478 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8c52b752-391b-4770-9191-3494df4e3999","Type":"ContainerStarted","Data":"79243ba3eb5d42f1bd0f138c616462a4f129bde3f205dc9026f0724a065bb0f7"} Feb 02 10:52:39 crc kubenswrapper[4909]: I0202 10:52:39.251611 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.251594871 podStartE2EDuration="2.251594871s" 
podCreationTimestamp="2026-02-02 10:52:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:52:39.243590454 +0000 UTC m=+1284.989691189" watchObservedRunningTime="2026-02-02 10:52:39.251594871 +0000 UTC m=+1284.997695606" Feb 02 10:52:39 crc kubenswrapper[4909]: I0202 10:52:39.267267 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.267250946 podStartE2EDuration="5.267250946s" podCreationTimestamp="2026-02-02 10:52:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:52:39.265154466 +0000 UTC m=+1285.011255201" watchObservedRunningTime="2026-02-02 10:52:39.267250946 +0000 UTC m=+1285.013351671" Feb 02 10:52:43 crc kubenswrapper[4909]: W0202 10:52:43.781743 4909 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32e77d63_a0db_4845_80c5_9815b46a9e21.slice/crio-bc24b82c72520772f6ab645be1ccc4ea1f931b445bb201f30237f1393e3dcc25": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32e77d63_a0db_4845_80c5_9815b46a9e21.slice/crio-bc24b82c72520772f6ab645be1ccc4ea1f931b445bb201f30237f1393e3dcc25: no such file or directory Feb 02 10:52:43 crc kubenswrapper[4909]: W0202 10:52:43.785204 4909 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd533d6b_9338_4896_b52c_c3123a5e0467.slice/crio-conmon-abe9b83cee5ffbc3e19cf902b840a1d2bf660fa5d7600b5b0657196354a95639.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch 
/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd533d6b_9338_4896_b52c_c3123a5e0467.slice/crio-conmon-abe9b83cee5ffbc3e19cf902b840a1d2bf660fa5d7600b5b0657196354a95639.scope: no such file or directory Feb 02 10:52:43 crc kubenswrapper[4909]: W0202 10:52:43.785274 4909 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f19c47b_c9fc_4762_9cb0_fcb8b9ad95bb.slice/crio-ce78989884566e501936aa0f9c402244d1dd8a9250194989d7173d57f7bcf1f8.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f19c47b_c9fc_4762_9cb0_fcb8b9ad95bb.slice/crio-ce78989884566e501936aa0f9c402244d1dd8a9250194989d7173d57f7bcf1f8.scope: no such file or directory Feb 02 10:52:43 crc kubenswrapper[4909]: W0202 10:52:43.785303 4909 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32e77d63_a0db_4845_80c5_9815b46a9e21.slice/crio-conmon-54beb5c8ca4e401e90a1f311f21072bcb862ee38775a25ba04703ad4a2386559.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32e77d63_a0db_4845_80c5_9815b46a9e21.slice/crio-conmon-54beb5c8ca4e401e90a1f311f21072bcb862ee38775a25ba04703ad4a2386559.scope: no such file or directory Feb 02 10:52:43 crc kubenswrapper[4909]: W0202 10:52:43.789492 4909 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd533d6b_9338_4896_b52c_c3123a5e0467.slice/crio-abe9b83cee5ffbc3e19cf902b840a1d2bf660fa5d7600b5b0657196354a95639.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd533d6b_9338_4896_b52c_c3123a5e0467.slice/crio-abe9b83cee5ffbc3e19cf902b840a1d2bf660fa5d7600b5b0657196354a95639.scope: no 
such file or directory Feb 02 10:52:43 crc kubenswrapper[4909]: W0202 10:52:43.789528 4909 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32e77d63_a0db_4845_80c5_9815b46a9e21.slice/crio-54beb5c8ca4e401e90a1f311f21072bcb862ee38775a25ba04703ad4a2386559.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32e77d63_a0db_4845_80c5_9815b46a9e21.slice/crio-54beb5c8ca4e401e90a1f311f21072bcb862ee38775a25ba04703ad4a2386559.scope: no such file or directory Feb 02 10:52:44 crc kubenswrapper[4909]: E0202 10:52:44.044072 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f4bc941_d43d_4e64_b0b6_a677ab0374f8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e687182_7508_4ff5_8138_dd582e11cdc5.slice/crio-conmon-5dcf5a9224650780abadd0fb2e13f43e6896ba6ebd00dd773cc07732b98a7435.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f4bc941_d43d_4e64_b0b6_a677ab0374f8.slice/crio-conmon-8567952a983ebcc3a59a1c216958b61dc8e42144faabd6a76f371350ce4e5467.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f4bc941_d43d_4e64_b0b6_a677ab0374f8.slice/crio-b86f86d1a6d89fa117c2c872986406a245dda9f89fcc9b2e998480c4365277de\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e687182_7508_4ff5_8138_dd582e11cdc5.slice/crio-5dcf5a9224650780abadd0fb2e13f43e6896ba6ebd00dd773cc07732b98a7435.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f4bc941_d43d_4e64_b0b6_a677ab0374f8.slice/crio-8567952a983ebcc3a59a1c216958b61dc8e42144faabd6a76f371350ce4e5467.scope\": RecentStats: unable to find data in memory cache]"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.157086 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.276679 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e687182-7508-4ff5-8138-dd582e11cdc5-combined-ca-bundle\") pod \"9e687182-7508-4ff5-8138-dd582e11cdc5\" (UID: \"9e687182-7508-4ff5-8138-dd582e11cdc5\") "
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.276735 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6bkg\" (UniqueName: \"kubernetes.io/projected/9e687182-7508-4ff5-8138-dd582e11cdc5-kube-api-access-j6bkg\") pod \"9e687182-7508-4ff5-8138-dd582e11cdc5\" (UID: \"9e687182-7508-4ff5-8138-dd582e11cdc5\") "
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.276838 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e687182-7508-4ff5-8138-dd582e11cdc5-log-httpd\") pod \"9e687182-7508-4ff5-8138-dd582e11cdc5\" (UID: \"9e687182-7508-4ff5-8138-dd582e11cdc5\") "
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.276869 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e687182-7508-4ff5-8138-dd582e11cdc5-scripts\") pod \"9e687182-7508-4ff5-8138-dd582e11cdc5\" (UID: \"9e687182-7508-4ff5-8138-dd582e11cdc5\") "
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.276973 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e687182-7508-4ff5-8138-dd582e11cdc5-config-data\") pod \"9e687182-7508-4ff5-8138-dd582e11cdc5\" (UID: \"9e687182-7508-4ff5-8138-dd582e11cdc5\") "
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.277026 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e687182-7508-4ff5-8138-dd582e11cdc5-run-httpd\") pod \"9e687182-7508-4ff5-8138-dd582e11cdc5\" (UID: \"9e687182-7508-4ff5-8138-dd582e11cdc5\") "
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.277633 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.277452 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e687182-7508-4ff5-8138-dd582e11cdc5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9e687182-7508-4ff5-8138-dd582e11cdc5" (UID: "9e687182-7508-4ff5-8138-dd582e11cdc5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.277545 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e687182-7508-4ff5-8138-dd582e11cdc5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9e687182-7508-4ff5-8138-dd582e11cdc5" (UID: "9e687182-7508-4ff5-8138-dd582e11cdc5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.277638 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9e687182-7508-4ff5-8138-dd582e11cdc5-sg-core-conf-yaml\") pod \"9e687182-7508-4ff5-8138-dd582e11cdc5\" (UID: \"9e687182-7508-4ff5-8138-dd582e11cdc5\") "
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.277573 4909 generic.go:334] "Generic (PLEG): container finished" podID="9e687182-7508-4ff5-8138-dd582e11cdc5" containerID="5dcf5a9224650780abadd0fb2e13f43e6896ba6ebd00dd773cc07732b98a7435" exitCode=137
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.277598 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e687182-7508-4ff5-8138-dd582e11cdc5","Type":"ContainerDied","Data":"5dcf5a9224650780abadd0fb2e13f43e6896ba6ebd00dd773cc07732b98a7435"}
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.277765 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e687182-7508-4ff5-8138-dd582e11cdc5","Type":"ContainerDied","Data":"15ece1fb1ebec9f996c35bda29a185687eeceabfec80722248f3d700f36fb913"}
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.277799 4909 scope.go:117] "RemoveContainer" containerID="5dcf5a9224650780abadd0fb2e13f43e6896ba6ebd00dd773cc07732b98a7435"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.278465 4909 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e687182-7508-4ff5-8138-dd582e11cdc5-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.278496 4909 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e687182-7508-4ff5-8138-dd582e11cdc5-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.284795 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e687182-7508-4ff5-8138-dd582e11cdc5-kube-api-access-j6bkg" (OuterVolumeSpecName: "kube-api-access-j6bkg") pod "9e687182-7508-4ff5-8138-dd582e11cdc5" (UID: "9e687182-7508-4ff5-8138-dd582e11cdc5"). InnerVolumeSpecName "kube-api-access-j6bkg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.288981 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e687182-7508-4ff5-8138-dd582e11cdc5-scripts" (OuterVolumeSpecName: "scripts") pod "9e687182-7508-4ff5-8138-dd582e11cdc5" (UID: "9e687182-7508-4ff5-8138-dd582e11cdc5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.308409 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e687182-7508-4ff5-8138-dd582e11cdc5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9e687182-7508-4ff5-8138-dd582e11cdc5" (UID: "9e687182-7508-4ff5-8138-dd582e11cdc5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.370011 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e687182-7508-4ff5-8138-dd582e11cdc5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e687182-7508-4ff5-8138-dd582e11cdc5" (UID: "9e687182-7508-4ff5-8138-dd582e11cdc5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.379782 4909 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9e687182-7508-4ff5-8138-dd582e11cdc5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.379827 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e687182-7508-4ff5-8138-dd582e11cdc5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.379837 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6bkg\" (UniqueName: \"kubernetes.io/projected/9e687182-7508-4ff5-8138-dd582e11cdc5-kube-api-access-j6bkg\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.379848 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e687182-7508-4ff5-8138-dd582e11cdc5-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.382245 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e687182-7508-4ff5-8138-dd582e11cdc5-config-data" (OuterVolumeSpecName: "config-data") pod "9e687182-7508-4ff5-8138-dd582e11cdc5" (UID: "9e687182-7508-4ff5-8138-dd582e11cdc5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.382313 4909 scope.go:117] "RemoveContainer" containerID="5e99a7e3dee6342ef19032476169ddbbf0e3de2356cda6bcda71a199577770fb"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.400582 4909 scope.go:117] "RemoveContainer" containerID="23ddc9046c50e2fdd5d5b05e3f9c550a643f31014c6f4b418a8b08f7bc758bc6"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.423493 4909 scope.go:117] "RemoveContainer" containerID="e0a153c99271be1d041656c8daa8fd2c6d759ed21c667da0084ba87594f2985f"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.446175 4909 scope.go:117] "RemoveContainer" containerID="5dcf5a9224650780abadd0fb2e13f43e6896ba6ebd00dd773cc07732b98a7435"
Feb 02 10:52:44 crc kubenswrapper[4909]: E0202 10:52:44.446615 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dcf5a9224650780abadd0fb2e13f43e6896ba6ebd00dd773cc07732b98a7435\": container with ID starting with 5dcf5a9224650780abadd0fb2e13f43e6896ba6ebd00dd773cc07732b98a7435 not found: ID does not exist" containerID="5dcf5a9224650780abadd0fb2e13f43e6896ba6ebd00dd773cc07732b98a7435"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.446644 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dcf5a9224650780abadd0fb2e13f43e6896ba6ebd00dd773cc07732b98a7435"} err="failed to get container status \"5dcf5a9224650780abadd0fb2e13f43e6896ba6ebd00dd773cc07732b98a7435\": rpc error: code = NotFound desc = could not find container \"5dcf5a9224650780abadd0fb2e13f43e6896ba6ebd00dd773cc07732b98a7435\": container with ID starting with 5dcf5a9224650780abadd0fb2e13f43e6896ba6ebd00dd773cc07732b98a7435 not found: ID does not exist"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.446670 4909 scope.go:117] "RemoveContainer" containerID="5e99a7e3dee6342ef19032476169ddbbf0e3de2356cda6bcda71a199577770fb"
Feb 02 10:52:44 crc kubenswrapper[4909]: E0202 10:52:44.447096 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e99a7e3dee6342ef19032476169ddbbf0e3de2356cda6bcda71a199577770fb\": container with ID starting with 5e99a7e3dee6342ef19032476169ddbbf0e3de2356cda6bcda71a199577770fb not found: ID does not exist" containerID="5e99a7e3dee6342ef19032476169ddbbf0e3de2356cda6bcda71a199577770fb"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.447121 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e99a7e3dee6342ef19032476169ddbbf0e3de2356cda6bcda71a199577770fb"} err="failed to get container status \"5e99a7e3dee6342ef19032476169ddbbf0e3de2356cda6bcda71a199577770fb\": rpc error: code = NotFound desc = could not find container \"5e99a7e3dee6342ef19032476169ddbbf0e3de2356cda6bcda71a199577770fb\": container with ID starting with 5e99a7e3dee6342ef19032476169ddbbf0e3de2356cda6bcda71a199577770fb not found: ID does not exist"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.447136 4909 scope.go:117] "RemoveContainer" containerID="23ddc9046c50e2fdd5d5b05e3f9c550a643f31014c6f4b418a8b08f7bc758bc6"
Feb 02 10:52:44 crc kubenswrapper[4909]: E0202 10:52:44.447452 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23ddc9046c50e2fdd5d5b05e3f9c550a643f31014c6f4b418a8b08f7bc758bc6\": container with ID starting with 23ddc9046c50e2fdd5d5b05e3f9c550a643f31014c6f4b418a8b08f7bc758bc6 not found: ID does not exist" containerID="23ddc9046c50e2fdd5d5b05e3f9c550a643f31014c6f4b418a8b08f7bc758bc6"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.447494 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23ddc9046c50e2fdd5d5b05e3f9c550a643f31014c6f4b418a8b08f7bc758bc6"} err="failed to get container status \"23ddc9046c50e2fdd5d5b05e3f9c550a643f31014c6f4b418a8b08f7bc758bc6\": rpc error: code = NotFound desc = could not find container \"23ddc9046c50e2fdd5d5b05e3f9c550a643f31014c6f4b418a8b08f7bc758bc6\": container with ID starting with 23ddc9046c50e2fdd5d5b05e3f9c550a643f31014c6f4b418a8b08f7bc758bc6 not found: ID does not exist"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.447522 4909 scope.go:117] "RemoveContainer" containerID="e0a153c99271be1d041656c8daa8fd2c6d759ed21c667da0084ba87594f2985f"
Feb 02 10:52:44 crc kubenswrapper[4909]: E0202 10:52:44.447772 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0a153c99271be1d041656c8daa8fd2c6d759ed21c667da0084ba87594f2985f\": container with ID starting with e0a153c99271be1d041656c8daa8fd2c6d759ed21c667da0084ba87594f2985f not found: ID does not exist" containerID="e0a153c99271be1d041656c8daa8fd2c6d759ed21c667da0084ba87594f2985f"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.447797 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0a153c99271be1d041656c8daa8fd2c6d759ed21c667da0084ba87594f2985f"} err="failed to get container status \"e0a153c99271be1d041656c8daa8fd2c6d759ed21c667da0084ba87594f2985f\": rpc error: code = NotFound desc = could not find container \"e0a153c99271be1d041656c8daa8fd2c6d759ed21c667da0084ba87594f2985f\": container with ID starting with e0a153c99271be1d041656c8daa8fd2c6d759ed21c667da0084ba87594f2985f not found: ID does not exist"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.481097 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e687182-7508-4ff5-8138-dd582e11cdc5-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.611913 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.622194 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.646236 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:52:44 crc kubenswrapper[4909]: E0202 10:52:44.646791 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e687182-7508-4ff5-8138-dd582e11cdc5" containerName="ceilometer-central-agent"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.646841 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e687182-7508-4ff5-8138-dd582e11cdc5" containerName="ceilometer-central-agent"
Feb 02 10:52:44 crc kubenswrapper[4909]: E0202 10:52:44.646862 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e687182-7508-4ff5-8138-dd582e11cdc5" containerName="ceilometer-notification-agent"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.646870 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e687182-7508-4ff5-8138-dd582e11cdc5" containerName="ceilometer-notification-agent"
Feb 02 10:52:44 crc kubenswrapper[4909]: E0202 10:52:44.646895 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e687182-7508-4ff5-8138-dd582e11cdc5" containerName="proxy-httpd"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.646903 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e687182-7508-4ff5-8138-dd582e11cdc5" containerName="proxy-httpd"
Feb 02 10:52:44 crc kubenswrapper[4909]: E0202 10:52:44.646922 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e687182-7508-4ff5-8138-dd582e11cdc5" containerName="sg-core"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.646929 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e687182-7508-4ff5-8138-dd582e11cdc5" containerName="sg-core"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.647117 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e687182-7508-4ff5-8138-dd582e11cdc5" containerName="ceilometer-notification-agent"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.647141 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e687182-7508-4ff5-8138-dd582e11cdc5" containerName="sg-core"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.647159 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e687182-7508-4ff5-8138-dd582e11cdc5" containerName="proxy-httpd"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.647175 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e687182-7508-4ff5-8138-dd582e11cdc5" containerName="ceilometer-central-agent"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.648966 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.651212 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.651514 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.658732 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.787691 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701fbae9-e013-4311-ab91-55c3fca98e66-config-data\") pod \"ceilometer-0\" (UID: \"701fbae9-e013-4311-ab91-55c3fca98e66\") " pod="openstack/ceilometer-0"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.788382 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/701fbae9-e013-4311-ab91-55c3fca98e66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"701fbae9-e013-4311-ab91-55c3fca98e66\") " pod="openstack/ceilometer-0"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.788425 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701fbae9-e013-4311-ab91-55c3fca98e66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"701fbae9-e013-4311-ab91-55c3fca98e66\") " pod="openstack/ceilometer-0"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.788656 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ls29\" (UniqueName: \"kubernetes.io/projected/701fbae9-e013-4311-ab91-55c3fca98e66-kube-api-access-7ls29\") pod \"ceilometer-0\" (UID: \"701fbae9-e013-4311-ab91-55c3fca98e66\") " pod="openstack/ceilometer-0"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.788761 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/701fbae9-e013-4311-ab91-55c3fca98e66-log-httpd\") pod \"ceilometer-0\" (UID: \"701fbae9-e013-4311-ab91-55c3fca98e66\") " pod="openstack/ceilometer-0"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.788935 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/701fbae9-e013-4311-ab91-55c3fca98e66-scripts\") pod \"ceilometer-0\" (UID: \"701fbae9-e013-4311-ab91-55c3fca98e66\") " pod="openstack/ceilometer-0"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.789052 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/701fbae9-e013-4311-ab91-55c3fca98e66-run-httpd\") pod \"ceilometer-0\" (UID: \"701fbae9-e013-4311-ab91-55c3fca98e66\") " pod="openstack/ceilometer-0"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.849873 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.849957 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.883077 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.891403 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701fbae9-e013-4311-ab91-55c3fca98e66-config-data\") pod \"ceilometer-0\" (UID: \"701fbae9-e013-4311-ab91-55c3fca98e66\") " pod="openstack/ceilometer-0"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.891474 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/701fbae9-e013-4311-ab91-55c3fca98e66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"701fbae9-e013-4311-ab91-55c3fca98e66\") " pod="openstack/ceilometer-0"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.891506 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701fbae9-e013-4311-ab91-55c3fca98e66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"701fbae9-e013-4311-ab91-55c3fca98e66\") " pod="openstack/ceilometer-0"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.891571 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ls29\" (UniqueName: \"kubernetes.io/projected/701fbae9-e013-4311-ab91-55c3fca98e66-kube-api-access-7ls29\") pod \"ceilometer-0\" (UID: \"701fbae9-e013-4311-ab91-55c3fca98e66\") " pod="openstack/ceilometer-0"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.891608 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/701fbae9-e013-4311-ab91-55c3fca98e66-log-httpd\") pod \"ceilometer-0\" (UID: \"701fbae9-e013-4311-ab91-55c3fca98e66\") " pod="openstack/ceilometer-0"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.891669 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/701fbae9-e013-4311-ab91-55c3fca98e66-scripts\") pod \"ceilometer-0\" (UID: \"701fbae9-e013-4311-ab91-55c3fca98e66\") " pod="openstack/ceilometer-0"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.891739 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/701fbae9-e013-4311-ab91-55c3fca98e66-run-httpd\") pod \"ceilometer-0\" (UID: \"701fbae9-e013-4311-ab91-55c3fca98e66\") " pod="openstack/ceilometer-0"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.892477 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/701fbae9-e013-4311-ab91-55c3fca98e66-run-httpd\") pod \"ceilometer-0\" (UID: \"701fbae9-e013-4311-ab91-55c3fca98e66\") " pod="openstack/ceilometer-0"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.892483 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/701fbae9-e013-4311-ab91-55c3fca98e66-log-httpd\") pod \"ceilometer-0\" (UID: \"701fbae9-e013-4311-ab91-55c3fca98e66\") " pod="openstack/ceilometer-0"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.896801 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701fbae9-e013-4311-ab91-55c3fca98e66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"701fbae9-e013-4311-ab91-55c3fca98e66\") " pod="openstack/ceilometer-0"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.897718 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701fbae9-e013-4311-ab91-55c3fca98e66-config-data\") pod \"ceilometer-0\" (UID: \"701fbae9-e013-4311-ab91-55c3fca98e66\") " pod="openstack/ceilometer-0"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.899260 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/701fbae9-e013-4311-ab91-55c3fca98e66-scripts\") pod \"ceilometer-0\" (UID: \"701fbae9-e013-4311-ab91-55c3fca98e66\") " pod="openstack/ceilometer-0"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.901982 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/701fbae9-e013-4311-ab91-55c3fca98e66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"701fbae9-e013-4311-ab91-55c3fca98e66\") " pod="openstack/ceilometer-0"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.913753 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ls29\" (UniqueName: \"kubernetes.io/projected/701fbae9-e013-4311-ab91-55c3fca98e66-kube-api-access-7ls29\") pod \"ceilometer-0\" (UID: \"701fbae9-e013-4311-ab91-55c3fca98e66\") " pod="openstack/ceilometer-0"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.922877 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 02 10:52:44 crc kubenswrapper[4909]: I0202 10:52:44.971657 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 02 10:52:45 crc kubenswrapper[4909]: I0202 10:52:45.027876 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e687182-7508-4ff5-8138-dd582e11cdc5" path="/var/lib/kubelet/pods/9e687182-7508-4ff5-8138-dd582e11cdc5/volumes"
Feb 02 10:52:45 crc kubenswrapper[4909]: I0202 10:52:45.308762 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 02 10:52:45 crc kubenswrapper[4909]: I0202 10:52:45.308994 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 02 10:52:45 crc kubenswrapper[4909]: I0202 10:52:45.585880 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 02 10:52:46 crc kubenswrapper[4909]: I0202 10:52:46.317018 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"701fbae9-e013-4311-ab91-55c3fca98e66","Type":"ContainerStarted","Data":"c7520860ed8086830f83ea8eeb75a91efc73f9f715cc7e8edcb4df6de3285141"}
Feb 02 10:52:46 crc kubenswrapper[4909]: I0202 10:52:46.317397 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"701fbae9-e013-4311-ab91-55c3fca98e66","Type":"ContainerStarted","Data":"347596334d7a1fcf2d9c8db015a83592d2503b3cc1a0135c843311bfc1ac86f9"}
Feb 02 10:52:47 crc kubenswrapper[4909]: I0202 10:52:47.229221 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 02 10:52:47 crc kubenswrapper[4909]: I0202 10:52:47.256093 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 02 10:52:47 crc kubenswrapper[4909]: I0202 10:52:47.330190 4909 generic.go:334] "Generic (PLEG): container finished" podID="36fe6eb3-17e6-417d-86c8-5b776beb7ddd" containerID="e73ee1c3eb5c48046de751f50517094408c4995c4f0ca716981bc10d3c559f1f" exitCode=0
Feb 02 10:52:47 crc kubenswrapper[4909]: I0202 10:52:47.330253 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5vq8t" event={"ID":"36fe6eb3-17e6-417d-86c8-5b776beb7ddd","Type":"ContainerDied","Data":"e73ee1c3eb5c48046de751f50517094408c4995c4f0ca716981bc10d3c559f1f"}
Feb 02 10:52:47 crc kubenswrapper[4909]: I0202 10:52:47.342121 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"701fbae9-e013-4311-ab91-55c3fca98e66","Type":"ContainerStarted","Data":"6756b8b50bce9e6ac3428c7e6ef1818cba927c1445cc09621f987b9deccbc636"}
Feb 02 10:52:47 crc kubenswrapper[4909]: I0202 10:52:47.547348 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:47 crc kubenswrapper[4909]: I0202 10:52:47.547395 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:47 crc kubenswrapper[4909]: I0202 10:52:47.592573 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:47 crc kubenswrapper[4909]: I0202 10:52:47.602326 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:48 crc kubenswrapper[4909]: I0202 10:52:48.356944 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"701fbae9-e013-4311-ab91-55c3fca98e66","Type":"ContainerStarted","Data":"85c50d0b0aab7c6b62add53fdb78c7c9c19f7033ce17680ae1b0fc4dca9f2c74"}
Feb 02 10:52:48 crc kubenswrapper[4909]: I0202 10:52:48.357452 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:48 crc kubenswrapper[4909]: I0202 10:52:48.357479 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Feb 02 10:52:48 crc kubenswrapper[4909]: I0202 10:52:48.729310 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5vq8t"
Feb 02 10:52:48 crc kubenswrapper[4909]: I0202 10:52:48.812689 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36fe6eb3-17e6-417d-86c8-5b776beb7ddd-config-data\") pod \"36fe6eb3-17e6-417d-86c8-5b776beb7ddd\" (UID: \"36fe6eb3-17e6-417d-86c8-5b776beb7ddd\") "
Feb 02 10:52:48 crc kubenswrapper[4909]: I0202 10:52:48.812851 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36fe6eb3-17e6-417d-86c8-5b776beb7ddd-scripts\") pod \"36fe6eb3-17e6-417d-86c8-5b776beb7ddd\" (UID: \"36fe6eb3-17e6-417d-86c8-5b776beb7ddd\") "
Feb 02 10:52:48 crc kubenswrapper[4909]: I0202 10:52:48.812891 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fe6eb3-17e6-417d-86c8-5b776beb7ddd-combined-ca-bundle\") pod \"36fe6eb3-17e6-417d-86c8-5b776beb7ddd\" (UID: \"36fe6eb3-17e6-417d-86c8-5b776beb7ddd\") "
Feb 02 10:52:48 crc kubenswrapper[4909]: I0202 10:52:48.812992 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5txg6\" (UniqueName: \"kubernetes.io/projected/36fe6eb3-17e6-417d-86c8-5b776beb7ddd-kube-api-access-5txg6\") pod \"36fe6eb3-17e6-417d-86c8-5b776beb7ddd\" (UID: \"36fe6eb3-17e6-417d-86c8-5b776beb7ddd\") "
Feb 02 10:52:48 crc kubenswrapper[4909]: I0202 10:52:48.822173 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36fe6eb3-17e6-417d-86c8-5b776beb7ddd-scripts" (OuterVolumeSpecName: "scripts") pod "36fe6eb3-17e6-417d-86c8-5b776beb7ddd" (UID: "36fe6eb3-17e6-417d-86c8-5b776beb7ddd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:52:48 crc kubenswrapper[4909]: I0202 10:52:48.825024 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36fe6eb3-17e6-417d-86c8-5b776beb7ddd-kube-api-access-5txg6" (OuterVolumeSpecName: "kube-api-access-5txg6") pod "36fe6eb3-17e6-417d-86c8-5b776beb7ddd" (UID: "36fe6eb3-17e6-417d-86c8-5b776beb7ddd"). InnerVolumeSpecName "kube-api-access-5txg6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:52:48 crc kubenswrapper[4909]: I0202 10:52:48.846221 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36fe6eb3-17e6-417d-86c8-5b776beb7ddd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36fe6eb3-17e6-417d-86c8-5b776beb7ddd" (UID: "36fe6eb3-17e6-417d-86c8-5b776beb7ddd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:52:48 crc kubenswrapper[4909]: I0202 10:52:48.851781 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36fe6eb3-17e6-417d-86c8-5b776beb7ddd-config-data" (OuterVolumeSpecName: "config-data") pod "36fe6eb3-17e6-417d-86c8-5b776beb7ddd" (UID: "36fe6eb3-17e6-417d-86c8-5b776beb7ddd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 10:52:48 crc kubenswrapper[4909]: I0202 10:52:48.915056 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36fe6eb3-17e6-417d-86c8-5b776beb7ddd-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:48 crc kubenswrapper[4909]: I0202 10:52:48.915087 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36fe6eb3-17e6-417d-86c8-5b776beb7ddd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:48 crc kubenswrapper[4909]: I0202 10:52:48.915100 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5txg6\" (UniqueName: \"kubernetes.io/projected/36fe6eb3-17e6-417d-86c8-5b776beb7ddd-kube-api-access-5txg6\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:48 crc kubenswrapper[4909]: I0202 10:52:48.915108 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36fe6eb3-17e6-417d-86c8-5b776beb7ddd-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 10:52:49 crc kubenswrapper[4909]: I0202 10:52:49.366848 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5vq8t"
Feb 02 10:52:49 crc kubenswrapper[4909]: I0202 10:52:49.367331 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5vq8t" event={"ID":"36fe6eb3-17e6-417d-86c8-5b776beb7ddd","Type":"ContainerDied","Data":"dd49b8d1924f8c336454cb309090a4582466a54ca0c0c0c8388499c79c1845ee"}
Feb 02 10:52:49 crc kubenswrapper[4909]: I0202 10:52:49.367357 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd49b8d1924f8c336454cb309090a4582466a54ca0c0c0c8388499c79c1845ee"
Feb 02 10:52:49 crc kubenswrapper[4909]: I0202 10:52:49.511160 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 10:52:49 crc kubenswrapper[4909]: I0202 10:52:49.511614 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 10:52:49 crc kubenswrapper[4909]: I0202 10:52:49.536389 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 02 10:52:49 crc kubenswrapper[4909]: E0202 10:52:49.536991 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36fe6eb3-17e6-417d-86c8-5b776beb7ddd" containerName="nova-cell0-conductor-db-sync"
Feb 02 10:52:49 crc kubenswrapper[4909]: I0202 10:52:49.537012 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="36fe6eb3-17e6-417d-86c8-5b776beb7ddd" containerName="nova-cell0-conductor-db-sync"
Feb 02 10:52:49 crc kubenswrapper[4909]: I0202 10:52:49.537275 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="36fe6eb3-17e6-417d-86c8-5b776beb7ddd" containerName="nova-cell0-conductor-db-sync"
Feb 02 10:52:49 crc kubenswrapper[4909]: I0202 10:52:49.550006 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 02 10:52:49 crc kubenswrapper[4909]: I0202 10:52:49.552572 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 02 10:52:49 crc kubenswrapper[4909]: I0202 10:52:49.556370 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-dqv7l"
Feb 02 10:52:49 crc kubenswrapper[4909]: I0202 10:52:49.556657 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Feb 02 10:52:49 crc kubenswrapper[4909]: I0202 10:52:49.628439 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2399f7a5-86e0-46bd-9f3d-624d1208b9cc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2399f7a5-86e0-46bd-9f3d-624d1208b9cc\") " pod="openstack/nova-cell0-conductor-0"
Feb 02 10:52:49 crc kubenswrapper[4909]: I0202 10:52:49.628587 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2399f7a5-86e0-46bd-9f3d-624d1208b9cc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2399f7a5-86e0-46bd-9f3d-624d1208b9cc\") " pod="openstack/nova-cell0-conductor-0"
Feb 02 10:52:49 crc kubenswrapper[4909]: I0202 10:52:49.628655 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhp6b\" (UniqueName: \"kubernetes.io/projected/2399f7a5-86e0-46bd-9f3d-624d1208b9cc-kube-api-access-jhp6b\") pod \"nova-cell0-conductor-0\" (UID: \"2399f7a5-86e0-46bd-9f3d-624d1208b9cc\") " pod="openstack/nova-cell0-conductor-0"
Feb 02 10:52:49 crc kubenswrapper[4909]: I0202 10:52:49.730011 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2399f7a5-86e0-46bd-9f3d-624d1208b9cc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2399f7a5-86e0-46bd-9f3d-624d1208b9cc\") " pod="openstack/nova-cell0-conductor-0"
Feb 02 10:52:49 crc kubenswrapper[4909]: I0202 10:52:49.730135 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2399f7a5-86e0-46bd-9f3d-624d1208b9cc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2399f7a5-86e0-46bd-9f3d-624d1208b9cc\") " pod="openstack/nova-cell0-conductor-0"
Feb 02 10:52:49 crc kubenswrapper[4909]: I0202 10:52:49.730210 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhp6b\" (UniqueName: \"kubernetes.io/projected/2399f7a5-86e0-46bd-9f3d-624d1208b9cc-kube-api-access-jhp6b\") pod \"nova-cell0-conductor-0\" (UID: \"2399f7a5-86e0-46bd-9f3d-624d1208b9cc\") " pod="openstack/nova-cell0-conductor-0"
Feb 02 10:52:49 crc kubenswrapper[4909]: I0202 10:52:49.737123 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2399f7a5-86e0-46bd-9f3d-624d1208b9cc-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2399f7a5-86e0-46bd-9f3d-624d1208b9cc\") " pod="openstack/nova-cell0-conductor-0"
Feb 02 10:52:49 crc kubenswrapper[4909]: I0202 10:52:49.742623 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2399f7a5-86e0-46bd-9f3d-624d1208b9cc-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2399f7a5-86e0-46bd-9f3d-624d1208b9cc\") " pod="openstack/nova-cell0-conductor-0"
Feb 02 10:52:49 crc kubenswrapper[4909]: I0202 10:52:49.754479 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhp6b\" (UniqueName: \"kubernetes.io/projected/2399f7a5-86e0-46bd-9f3d-624d1208b9cc-kube-api-access-jhp6b\") pod \"nova-cell0-conductor-0\" (UID: \"2399f7a5-86e0-46bd-9f3d-624d1208b9cc\") " pod="openstack/nova-cell0-conductor-0" Feb 02 10:52:49 crc kubenswrapper[4909]: I0202 10:52:49.875194 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 02 10:52:50 crc kubenswrapper[4909]: I0202 10:52:50.360747 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 02 10:52:50 crc kubenswrapper[4909]: I0202 10:52:50.378763 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2399f7a5-86e0-46bd-9f3d-624d1208b9cc","Type":"ContainerStarted","Data":"a0f1e33d6f4f839c914c2fcd7281839ff059cf40f2d13b954274a4f08f99dc88"} Feb 02 10:52:50 crc kubenswrapper[4909]: I0202 10:52:50.380951 4909 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 10:52:50 crc kubenswrapper[4909]: I0202 10:52:50.380968 4909 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 10:52:50 crc kubenswrapper[4909]: I0202 10:52:50.382428 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"701fbae9-e013-4311-ab91-55c3fca98e66","Type":"ContainerStarted","Data":"f7e1747b6d1f313242d82af25e7dd991bf6bf022e08eb2690531ae6d72eade2c"} Feb 02 10:52:50 crc kubenswrapper[4909]: I0202 10:52:50.382469 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 02 10:52:50 crc kubenswrapper[4909]: I0202 10:52:50.403320 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.374396134 podStartE2EDuration="6.4032803s" podCreationTimestamp="2026-02-02 10:52:44 +0000 UTC" 
firstStartedPulling="2026-02-02 10:52:45.595655034 +0000 UTC m=+1291.341755769" lastFinishedPulling="2026-02-02 10:52:49.6245392 +0000 UTC m=+1295.370639935" observedRunningTime="2026-02-02 10:52:50.399268487 +0000 UTC m=+1296.145369222" watchObservedRunningTime="2026-02-02 10:52:50.4032803 +0000 UTC m=+1296.149381036" Feb 02 10:52:50 crc kubenswrapper[4909]: I0202 10:52:50.650522 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 02 10:52:50 crc kubenswrapper[4909]: I0202 10:52:50.723523 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 02 10:52:51 crc kubenswrapper[4909]: I0202 10:52:51.391036 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2399f7a5-86e0-46bd-9f3d-624d1208b9cc","Type":"ContainerStarted","Data":"e61cb4193c75698f10d9c67582e2641c6ccc4d6732028cd8616852c78bc88b51"} Feb 02 10:52:51 crc kubenswrapper[4909]: I0202 10:52:51.391518 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 02 10:52:51 crc kubenswrapper[4909]: I0202 10:52:51.405869 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.405847198 podStartE2EDuration="2.405847198s" podCreationTimestamp="2026-02-02 10:52:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:52:51.403888812 +0000 UTC m=+1297.149989557" watchObservedRunningTime="2026-02-02 10:52:51.405847198 +0000 UTC m=+1297.151947933" Feb 02 10:52:59 crc kubenswrapper[4909]: I0202 10:52:59.902609 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.334387 4909 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-hgjsh"] Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.340475 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hgjsh" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.343825 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.344280 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-hgjsh"] Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.347303 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.476156 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.477565 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.479748 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.502722 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.542012 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf-scripts\") pod \"nova-cell0-cell-mapping-hgjsh\" (UID: \"0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf\") " pod="openstack/nova-cell0-cell-mapping-hgjsh" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.542083 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-hgjsh\" (UID: \"0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf\") " pod="openstack/nova-cell0-cell-mapping-hgjsh" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.542143 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkh6q\" (UniqueName: \"kubernetes.io/projected/0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf-kube-api-access-lkh6q\") pod \"nova-cell0-cell-mapping-hgjsh\" (UID: \"0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf\") " pod="openstack/nova-cell0-cell-mapping-hgjsh" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.542199 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf-config-data\") pod \"nova-cell0-cell-mapping-hgjsh\" (UID: \"0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf\") " pod="openstack/nova-cell0-cell-mapping-hgjsh" Feb 
02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.581236 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.583086 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.584913 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.608143 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.645968 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf-scripts\") pod \"nova-cell0-cell-mapping-hgjsh\" (UID: \"0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf\") " pod="openstack/nova-cell0-cell-mapping-hgjsh" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.646027 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-hgjsh\" (UID: \"0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf\") " pod="openstack/nova-cell0-cell-mapping-hgjsh" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.646075 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkh6q\" (UniqueName: \"kubernetes.io/projected/0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf-kube-api-access-lkh6q\") pod \"nova-cell0-cell-mapping-hgjsh\" (UID: \"0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf\") " pod="openstack/nova-cell0-cell-mapping-hgjsh" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.646108 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/ce01e534-820e-4cff-bcf6-f8d401e89e04-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ce01e534-820e-4cff-bcf6-f8d401e89e04\") " pod="openstack/nova-api-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.646137 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf-config-data\") pod \"nova-cell0-cell-mapping-hgjsh\" (UID: \"0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf\") " pod="openstack/nova-cell0-cell-mapping-hgjsh" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.646167 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2966e07-3069-4fd7-b707-c805e95edb41-config-data\") pod \"nova-metadata-0\" (UID: \"a2966e07-3069-4fd7-b707-c805e95edb41\") " pod="openstack/nova-metadata-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.646191 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce01e534-820e-4cff-bcf6-f8d401e89e04-logs\") pod \"nova-api-0\" (UID: \"ce01e534-820e-4cff-bcf6-f8d401e89e04\") " pod="openstack/nova-api-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.646211 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce01e534-820e-4cff-bcf6-f8d401e89e04-config-data\") pod \"nova-api-0\" (UID: \"ce01e534-820e-4cff-bcf6-f8d401e89e04\") " pod="openstack/nova-api-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.646264 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2966e07-3069-4fd7-b707-c805e95edb41-logs\") pod \"nova-metadata-0\" (UID: \"a2966e07-3069-4fd7-b707-c805e95edb41\") " 
pod="openstack/nova-metadata-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.646280 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgk7b\" (UniqueName: \"kubernetes.io/projected/a2966e07-3069-4fd7-b707-c805e95edb41-kube-api-access-kgk7b\") pod \"nova-metadata-0\" (UID: \"a2966e07-3069-4fd7-b707-c805e95edb41\") " pod="openstack/nova-metadata-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.646302 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkl5x\" (UniqueName: \"kubernetes.io/projected/ce01e534-820e-4cff-bcf6-f8d401e89e04-kube-api-access-wkl5x\") pod \"nova-api-0\" (UID: \"ce01e534-820e-4cff-bcf6-f8d401e89e04\") " pod="openstack/nova-api-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.646319 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2966e07-3069-4fd7-b707-c805e95edb41-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a2966e07-3069-4fd7-b707-c805e95edb41\") " pod="openstack/nova-metadata-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.655543 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf-scripts\") pod \"nova-cell0-cell-mapping-hgjsh\" (UID: \"0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf\") " pod="openstack/nova-cell0-cell-mapping-hgjsh" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.656678 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-hgjsh\" (UID: \"0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf\") " pod="openstack/nova-cell0-cell-mapping-hgjsh" Feb 02 10:53:00 crc kubenswrapper[4909]: 
I0202 10:53:00.661454 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf-config-data\") pod \"nova-cell0-cell-mapping-hgjsh\" (UID: \"0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf\") " pod="openstack/nova-cell0-cell-mapping-hgjsh" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.681178 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkh6q\" (UniqueName: \"kubernetes.io/projected/0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf-kube-api-access-lkh6q\") pod \"nova-cell0-cell-mapping-hgjsh\" (UID: \"0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf\") " pod="openstack/nova-cell0-cell-mapping-hgjsh" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.690941 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hgjsh" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.706474 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.708098 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.714358 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.723879 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.752907 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce01e534-820e-4cff-bcf6-f8d401e89e04-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ce01e534-820e-4cff-bcf6-f8d401e89e04\") " pod="openstack/nova-api-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.752975 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2966e07-3069-4fd7-b707-c805e95edb41-config-data\") pod \"nova-metadata-0\" (UID: \"a2966e07-3069-4fd7-b707-c805e95edb41\") " pod="openstack/nova-metadata-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.753002 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce01e534-820e-4cff-bcf6-f8d401e89e04-logs\") pod \"nova-api-0\" (UID: \"ce01e534-820e-4cff-bcf6-f8d401e89e04\") " pod="openstack/nova-api-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.753019 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce01e534-820e-4cff-bcf6-f8d401e89e04-config-data\") pod \"nova-api-0\" (UID: \"ce01e534-820e-4cff-bcf6-f8d401e89e04\") " pod="openstack/nova-api-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.753070 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a2966e07-3069-4fd7-b707-c805e95edb41-logs\") pod \"nova-metadata-0\" (UID: \"a2966e07-3069-4fd7-b707-c805e95edb41\") " pod="openstack/nova-metadata-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.753089 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgk7b\" (UniqueName: \"kubernetes.io/projected/a2966e07-3069-4fd7-b707-c805e95edb41-kube-api-access-kgk7b\") pod \"nova-metadata-0\" (UID: \"a2966e07-3069-4fd7-b707-c805e95edb41\") " pod="openstack/nova-metadata-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.753110 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkl5x\" (UniqueName: \"kubernetes.io/projected/ce01e534-820e-4cff-bcf6-f8d401e89e04-kube-api-access-wkl5x\") pod \"nova-api-0\" (UID: \"ce01e534-820e-4cff-bcf6-f8d401e89e04\") " pod="openstack/nova-api-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.753127 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2966e07-3069-4fd7-b707-c805e95edb41-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a2966e07-3069-4fd7-b707-c805e95edb41\") " pod="openstack/nova-metadata-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.754671 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce01e534-820e-4cff-bcf6-f8d401e89e04-logs\") pod \"nova-api-0\" (UID: \"ce01e534-820e-4cff-bcf6-f8d401e89e04\") " pod="openstack/nova-api-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.755182 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2966e07-3069-4fd7-b707-c805e95edb41-logs\") pod \"nova-metadata-0\" (UID: \"a2966e07-3069-4fd7-b707-c805e95edb41\") " pod="openstack/nova-metadata-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 
10:53:00.763698 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce01e534-820e-4cff-bcf6-f8d401e89e04-config-data\") pod \"nova-api-0\" (UID: \"ce01e534-820e-4cff-bcf6-f8d401e89e04\") " pod="openstack/nova-api-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.770397 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce01e534-820e-4cff-bcf6-f8d401e89e04-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ce01e534-820e-4cff-bcf6-f8d401e89e04\") " pod="openstack/nova-api-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.772295 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2966e07-3069-4fd7-b707-c805e95edb41-config-data\") pod \"nova-metadata-0\" (UID: \"a2966e07-3069-4fd7-b707-c805e95edb41\") " pod="openstack/nova-metadata-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.776887 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.778271 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.784383 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.797321 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgk7b\" (UniqueName: \"kubernetes.io/projected/a2966e07-3069-4fd7-b707-c805e95edb41-kube-api-access-kgk7b\") pod \"nova-metadata-0\" (UID: \"a2966e07-3069-4fd7-b707-c805e95edb41\") " pod="openstack/nova-metadata-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.803251 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2966e07-3069-4fd7-b707-c805e95edb41-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a2966e07-3069-4fd7-b707-c805e95edb41\") " pod="openstack/nova-metadata-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.807946 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkl5x\" (UniqueName: \"kubernetes.io/projected/ce01e534-820e-4cff-bcf6-f8d401e89e04-kube-api-access-wkl5x\") pod \"nova-api-0\" (UID: \"ce01e534-820e-4cff-bcf6-f8d401e89e04\") " pod="openstack/nova-api-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.813901 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.846475 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc699f5c5-znn65"] Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.848233 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.857944 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-ovsdbserver-nb\") pod \"dnsmasq-dns-6bc699f5c5-znn65\" (UID: \"9e425915-23f7-4cde-8c2d-3dcbca42e315\") " pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.858025 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9q22\" (UniqueName: \"kubernetes.io/projected/4ac70436-19ae-40b7-b329-33e08bef15bd-kube-api-access-p9q22\") pod \"nova-scheduler-0\" (UID: \"4ac70436-19ae-40b7-b329-33e08bef15bd\") " pod="openstack/nova-scheduler-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.858133 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwv7x\" (UniqueName: \"kubernetes.io/projected/0a48086e-b086-4f81-9cb3-16e40fc2c1d5-kube-api-access-dwv7x\") pod \"nova-cell1-novncproxy-0\" (UID: \"0a48086e-b086-4f81-9cb3-16e40fc2c1d5\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.858175 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ac70436-19ae-40b7-b329-33e08bef15bd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4ac70436-19ae-40b7-b329-33e08bef15bd\") " pod="openstack/nova-scheduler-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.858241 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc699f5c5-znn65\" (UID: 
\"9e425915-23f7-4cde-8c2d-3dcbca42e315\") " pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.858303 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a48086e-b086-4f81-9cb3-16e40fc2c1d5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0a48086e-b086-4f81-9cb3-16e40fc2c1d5\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.858345 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a48086e-b086-4f81-9cb3-16e40fc2c1d5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0a48086e-b086-4f81-9cb3-16e40fc2c1d5\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.858416 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-dns-swift-storage-0\") pod \"dnsmasq-dns-6bc699f5c5-znn65\" (UID: \"9e425915-23f7-4cde-8c2d-3dcbca42e315\") " pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.858440 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d42qp\" (UniqueName: \"kubernetes.io/projected/9e425915-23f7-4cde-8c2d-3dcbca42e315-kube-api-access-d42qp\") pod \"dnsmasq-dns-6bc699f5c5-znn65\" (UID: \"9e425915-23f7-4cde-8c2d-3dcbca42e315\") " pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.858522 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-dns-svc\") pod 
\"dnsmasq-dns-6bc699f5c5-znn65\" (UID: \"9e425915-23f7-4cde-8c2d-3dcbca42e315\") " pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.858581 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ac70436-19ae-40b7-b329-33e08bef15bd-config-data\") pod \"nova-scheduler-0\" (UID: \"4ac70436-19ae-40b7-b329-33e08bef15bd\") " pod="openstack/nova-scheduler-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.858673 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-config\") pod \"dnsmasq-dns-6bc699f5c5-znn65\" (UID: \"9e425915-23f7-4cde-8c2d-3dcbca42e315\") " pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.881579 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc699f5c5-znn65"] Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.914892 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.967396 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a48086e-b086-4f81-9cb3-16e40fc2c1d5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0a48086e-b086-4f81-9cb3-16e40fc2c1d5\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.967452 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-dns-swift-storage-0\") pod \"dnsmasq-dns-6bc699f5c5-znn65\" (UID: \"9e425915-23f7-4cde-8c2d-3dcbca42e315\") " pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.967474 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d42qp\" (UniqueName: \"kubernetes.io/projected/9e425915-23f7-4cde-8c2d-3dcbca42e315-kube-api-access-d42qp\") pod \"dnsmasq-dns-6bc699f5c5-znn65\" (UID: \"9e425915-23f7-4cde-8c2d-3dcbca42e315\") " pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.967514 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-dns-svc\") pod \"dnsmasq-dns-6bc699f5c5-znn65\" (UID: \"9e425915-23f7-4cde-8c2d-3dcbca42e315\") " pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.967537 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ac70436-19ae-40b7-b329-33e08bef15bd-config-data\") pod \"nova-scheduler-0\" (UID: \"4ac70436-19ae-40b7-b329-33e08bef15bd\") " pod="openstack/nova-scheduler-0" Feb 02 10:53:00 crc 
kubenswrapper[4909]: I0202 10:53:00.967583 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-config\") pod \"dnsmasq-dns-6bc699f5c5-znn65\" (UID: \"9e425915-23f7-4cde-8c2d-3dcbca42e315\") " pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.967640 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-ovsdbserver-nb\") pod \"dnsmasq-dns-6bc699f5c5-znn65\" (UID: \"9e425915-23f7-4cde-8c2d-3dcbca42e315\") " pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.967659 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9q22\" (UniqueName: \"kubernetes.io/projected/4ac70436-19ae-40b7-b329-33e08bef15bd-kube-api-access-p9q22\") pod \"nova-scheduler-0\" (UID: \"4ac70436-19ae-40b7-b329-33e08bef15bd\") " pod="openstack/nova-scheduler-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.967675 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwv7x\" (UniqueName: \"kubernetes.io/projected/0a48086e-b086-4f81-9cb3-16e40fc2c1d5-kube-api-access-dwv7x\") pod \"nova-cell1-novncproxy-0\" (UID: \"0a48086e-b086-4f81-9cb3-16e40fc2c1d5\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.967698 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ac70436-19ae-40b7-b329-33e08bef15bd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4ac70436-19ae-40b7-b329-33e08bef15bd\") " pod="openstack/nova-scheduler-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.967736 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc699f5c5-znn65\" (UID: \"9e425915-23f7-4cde-8c2d-3dcbca42e315\") " pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.967760 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a48086e-b086-4f81-9cb3-16e40fc2c1d5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0a48086e-b086-4f81-9cb3-16e40fc2c1d5\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.969166 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-ovsdbserver-nb\") pod \"dnsmasq-dns-6bc699f5c5-znn65\" (UID: \"9e425915-23f7-4cde-8c2d-3dcbca42e315\") " pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.969877 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-dns-svc\") pod \"dnsmasq-dns-6bc699f5c5-znn65\" (UID: \"9e425915-23f7-4cde-8c2d-3dcbca42e315\") " pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.972214 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-dns-swift-storage-0\") pod \"dnsmasq-dns-6bc699f5c5-znn65\" (UID: \"9e425915-23f7-4cde-8c2d-3dcbca42e315\") " pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.976891 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc699f5c5-znn65\" (UID: \"9e425915-23f7-4cde-8c2d-3dcbca42e315\") " pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.989029 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ac70436-19ae-40b7-b329-33e08bef15bd-config-data\") pod \"nova-scheduler-0\" (UID: \"4ac70436-19ae-40b7-b329-33e08bef15bd\") " pod="openstack/nova-scheduler-0" Feb 02 10:53:00 crc kubenswrapper[4909]: I0202 10:53:00.995374 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ac70436-19ae-40b7-b329-33e08bef15bd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4ac70436-19ae-40b7-b329-33e08bef15bd\") " pod="openstack/nova-scheduler-0" Feb 02 10:53:01 crc kubenswrapper[4909]: I0202 10:53:01.006374 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-config\") pod \"dnsmasq-dns-6bc699f5c5-znn65\" (UID: \"9e425915-23f7-4cde-8c2d-3dcbca42e315\") " pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" Feb 02 10:53:01 crc kubenswrapper[4909]: I0202 10:53:01.010561 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwv7x\" (UniqueName: \"kubernetes.io/projected/0a48086e-b086-4f81-9cb3-16e40fc2c1d5-kube-api-access-dwv7x\") pod \"nova-cell1-novncproxy-0\" (UID: \"0a48086e-b086-4f81-9cb3-16e40fc2c1d5\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:01 crc kubenswrapper[4909]: I0202 10:53:01.035151 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a48086e-b086-4f81-9cb3-16e40fc2c1d5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"0a48086e-b086-4f81-9cb3-16e40fc2c1d5\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:01 crc kubenswrapper[4909]: I0202 10:53:01.035463 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a48086e-b086-4f81-9cb3-16e40fc2c1d5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0a48086e-b086-4f81-9cb3-16e40fc2c1d5\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:01 crc kubenswrapper[4909]: I0202 10:53:01.036015 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d42qp\" (UniqueName: \"kubernetes.io/projected/9e425915-23f7-4cde-8c2d-3dcbca42e315-kube-api-access-d42qp\") pod \"dnsmasq-dns-6bc699f5c5-znn65\" (UID: \"9e425915-23f7-4cde-8c2d-3dcbca42e315\") " pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" Feb 02 10:53:01 crc kubenswrapper[4909]: I0202 10:53:01.055944 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9q22\" (UniqueName: \"kubernetes.io/projected/4ac70436-19ae-40b7-b329-33e08bef15bd-kube-api-access-p9q22\") pod \"nova-scheduler-0\" (UID: \"4ac70436-19ae-40b7-b329-33e08bef15bd\") " pod="openstack/nova-scheduler-0" Feb 02 10:53:01 crc kubenswrapper[4909]: I0202 10:53:01.107011 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:53:01 crc kubenswrapper[4909]: I0202 10:53:01.232753 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-hgjsh"] Feb 02 10:53:01 crc kubenswrapper[4909]: I0202 10:53:01.235911 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:53:01 crc kubenswrapper[4909]: I0202 10:53:01.256999 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:01 crc kubenswrapper[4909]: I0202 10:53:01.268370 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" Feb 02 10:53:01 crc kubenswrapper[4909]: I0202 10:53:01.515729 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hgjsh" event={"ID":"0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf","Type":"ContainerStarted","Data":"4066ceb2c47b2028a49eed7d0f991f10b024f4891a5d57b189339fd56a0f0e58"} Feb 02 10:53:01 crc kubenswrapper[4909]: I0202 10:53:01.556191 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:53:01 crc kubenswrapper[4909]: W0202 10:53:01.575964 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2966e07_3069_4fd7_b707_c805e95edb41.slice/crio-6c2ec160e1af542d73c04233f7947239cc44a99a7ca5b67b75d41e351afe953d WatchSource:0}: Error finding container 6c2ec160e1af542d73c04233f7947239cc44a99a7ca5b67b75d41e351afe953d: Status 404 returned error can't find the container with id 6c2ec160e1af542d73c04233f7947239cc44a99a7ca5b67b75d41e351afe953d Feb 02 10:53:01 crc kubenswrapper[4909]: I0202 10:53:01.771917 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:53:01 crc kubenswrapper[4909]: W0202 10:53:01.775411 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce01e534_820e_4cff_bcf6_f8d401e89e04.slice/crio-eb3365b227224f470909a76192653a6b8bed83c864e2e9a3ffa9102804a3456c WatchSource:0}: Error finding container eb3365b227224f470909a76192653a6b8bed83c864e2e9a3ffa9102804a3456c: Status 404 returned error can't find the container with id eb3365b227224f470909a76192653a6b8bed83c864e2e9a3ffa9102804a3456c Feb 02 10:53:01 crc kubenswrapper[4909]: I0202 
10:53:01.890979 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:53:01 crc kubenswrapper[4909]: I0202 10:53:01.909285 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:53:01 crc kubenswrapper[4909]: W0202 10:53:01.935796 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a48086e_b086_4f81_9cb3_16e40fc2c1d5.slice/crio-2f21e81db5b7d7769aa137dba2bbe6f234ea6b31ddd4aca3edbdcbd2a017b03a WatchSource:0}: Error finding container 2f21e81db5b7d7769aa137dba2bbe6f234ea6b31ddd4aca3edbdcbd2a017b03a: Status 404 returned error can't find the container with id 2f21e81db5b7d7769aa137dba2bbe6f234ea6b31ddd4aca3edbdcbd2a017b03a Feb 02 10:53:01 crc kubenswrapper[4909]: I0202 10:53:01.974870 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-chlnb"] Feb 02 10:53:01 crc kubenswrapper[4909]: I0202 10:53:01.976001 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-chlnb" Feb 02 10:53:01 crc kubenswrapper[4909]: I0202 10:53:01.979890 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 02 10:53:01 crc kubenswrapper[4909]: I0202 10:53:01.980092 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 02 10:53:01 crc kubenswrapper[4909]: I0202 10:53:01.990885 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-chlnb"] Feb 02 10:53:02 crc kubenswrapper[4909]: I0202 10:53:02.107852 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc699f5c5-znn65"] Feb 02 10:53:02 crc kubenswrapper[4909]: I0202 10:53:02.121687 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbqkr\" (UniqueName: \"kubernetes.io/projected/25588ec8-c1dd-42ca-983e-54a84e3f8a15-kube-api-access-qbqkr\") pod \"nova-cell1-conductor-db-sync-chlnb\" (UID: \"25588ec8-c1dd-42ca-983e-54a84e3f8a15\") " pod="openstack/nova-cell1-conductor-db-sync-chlnb" Feb 02 10:53:02 crc kubenswrapper[4909]: I0202 10:53:02.121797 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25588ec8-c1dd-42ca-983e-54a84e3f8a15-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-chlnb\" (UID: \"25588ec8-c1dd-42ca-983e-54a84e3f8a15\") " pod="openstack/nova-cell1-conductor-db-sync-chlnb" Feb 02 10:53:02 crc kubenswrapper[4909]: I0202 10:53:02.121948 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25588ec8-c1dd-42ca-983e-54a84e3f8a15-config-data\") pod \"nova-cell1-conductor-db-sync-chlnb\" (UID: \"25588ec8-c1dd-42ca-983e-54a84e3f8a15\") " 
pod="openstack/nova-cell1-conductor-db-sync-chlnb" Feb 02 10:53:02 crc kubenswrapper[4909]: I0202 10:53:02.121967 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25588ec8-c1dd-42ca-983e-54a84e3f8a15-scripts\") pod \"nova-cell1-conductor-db-sync-chlnb\" (UID: \"25588ec8-c1dd-42ca-983e-54a84e3f8a15\") " pod="openstack/nova-cell1-conductor-db-sync-chlnb" Feb 02 10:53:02 crc kubenswrapper[4909]: I0202 10:53:02.226446 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbqkr\" (UniqueName: \"kubernetes.io/projected/25588ec8-c1dd-42ca-983e-54a84e3f8a15-kube-api-access-qbqkr\") pod \"nova-cell1-conductor-db-sync-chlnb\" (UID: \"25588ec8-c1dd-42ca-983e-54a84e3f8a15\") " pod="openstack/nova-cell1-conductor-db-sync-chlnb" Feb 02 10:53:02 crc kubenswrapper[4909]: I0202 10:53:02.226557 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25588ec8-c1dd-42ca-983e-54a84e3f8a15-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-chlnb\" (UID: \"25588ec8-c1dd-42ca-983e-54a84e3f8a15\") " pod="openstack/nova-cell1-conductor-db-sync-chlnb" Feb 02 10:53:02 crc kubenswrapper[4909]: I0202 10:53:02.226599 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25588ec8-c1dd-42ca-983e-54a84e3f8a15-config-data\") pod \"nova-cell1-conductor-db-sync-chlnb\" (UID: \"25588ec8-c1dd-42ca-983e-54a84e3f8a15\") " pod="openstack/nova-cell1-conductor-db-sync-chlnb" Feb 02 10:53:02 crc kubenswrapper[4909]: I0202 10:53:02.226619 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25588ec8-c1dd-42ca-983e-54a84e3f8a15-scripts\") pod \"nova-cell1-conductor-db-sync-chlnb\" (UID: \"25588ec8-c1dd-42ca-983e-54a84e3f8a15\") " 
pod="openstack/nova-cell1-conductor-db-sync-chlnb" Feb 02 10:53:02 crc kubenswrapper[4909]: I0202 10:53:02.232751 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25588ec8-c1dd-42ca-983e-54a84e3f8a15-scripts\") pod \"nova-cell1-conductor-db-sync-chlnb\" (UID: \"25588ec8-c1dd-42ca-983e-54a84e3f8a15\") " pod="openstack/nova-cell1-conductor-db-sync-chlnb" Feb 02 10:53:02 crc kubenswrapper[4909]: I0202 10:53:02.233402 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25588ec8-c1dd-42ca-983e-54a84e3f8a15-config-data\") pod \"nova-cell1-conductor-db-sync-chlnb\" (UID: \"25588ec8-c1dd-42ca-983e-54a84e3f8a15\") " pod="openstack/nova-cell1-conductor-db-sync-chlnb" Feb 02 10:53:02 crc kubenswrapper[4909]: I0202 10:53:02.237249 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25588ec8-c1dd-42ca-983e-54a84e3f8a15-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-chlnb\" (UID: \"25588ec8-c1dd-42ca-983e-54a84e3f8a15\") " pod="openstack/nova-cell1-conductor-db-sync-chlnb" Feb 02 10:53:02 crc kubenswrapper[4909]: I0202 10:53:02.252757 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbqkr\" (UniqueName: \"kubernetes.io/projected/25588ec8-c1dd-42ca-983e-54a84e3f8a15-kube-api-access-qbqkr\") pod \"nova-cell1-conductor-db-sync-chlnb\" (UID: \"25588ec8-c1dd-42ca-983e-54a84e3f8a15\") " pod="openstack/nova-cell1-conductor-db-sync-chlnb" Feb 02 10:53:02 crc kubenswrapper[4909]: I0202 10:53:02.336889 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-chlnb" Feb 02 10:53:02 crc kubenswrapper[4909]: I0202 10:53:02.535939 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a2966e07-3069-4fd7-b707-c805e95edb41","Type":"ContainerStarted","Data":"6c2ec160e1af542d73c04233f7947239cc44a99a7ca5b67b75d41e351afe953d"} Feb 02 10:53:02 crc kubenswrapper[4909]: I0202 10:53:02.539582 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce01e534-820e-4cff-bcf6-f8d401e89e04","Type":"ContainerStarted","Data":"eb3365b227224f470909a76192653a6b8bed83c864e2e9a3ffa9102804a3456c"} Feb 02 10:53:02 crc kubenswrapper[4909]: I0202 10:53:02.558768 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hgjsh" event={"ID":"0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf","Type":"ContainerStarted","Data":"174be2f9179aec2d0d5dfb3b6c0efd4bcdf236b7cca281a9e2eacfe94339df84"} Feb 02 10:53:02 crc kubenswrapper[4909]: I0202 10:53:02.565116 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0a48086e-b086-4f81-9cb3-16e40fc2c1d5","Type":"ContainerStarted","Data":"2f21e81db5b7d7769aa137dba2bbe6f234ea6b31ddd4aca3edbdcbd2a017b03a"} Feb 02 10:53:02 crc kubenswrapper[4909]: I0202 10:53:02.584720 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-hgjsh" podStartSLOduration=2.584694888 podStartE2EDuration="2.584694888s" podCreationTimestamp="2026-02-02 10:53:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:53:02.574975542 +0000 UTC m=+1308.321076277" watchObservedRunningTime="2026-02-02 10:53:02.584694888 +0000 UTC m=+1308.330795623" Feb 02 10:53:02 crc kubenswrapper[4909]: I0202 10:53:02.589265 4909 generic.go:334] "Generic (PLEG): container finished" 
podID="9e425915-23f7-4cde-8c2d-3dcbca42e315" containerID="42883fc6e049c2f2ff02bdb7603424e5179c57da4580ed19146ee4baf473c5bc" exitCode=0 Feb 02 10:53:02 crc kubenswrapper[4909]: I0202 10:53:02.589334 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" event={"ID":"9e425915-23f7-4cde-8c2d-3dcbca42e315","Type":"ContainerDied","Data":"42883fc6e049c2f2ff02bdb7603424e5179c57da4580ed19146ee4baf473c5bc"} Feb 02 10:53:02 crc kubenswrapper[4909]: I0202 10:53:02.589362 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" event={"ID":"9e425915-23f7-4cde-8c2d-3dcbca42e315","Type":"ContainerStarted","Data":"b90a2c8bbe2b2d0938c0b5f242ed061e894319444fb0e8ff280e38b908439109"} Feb 02 10:53:02 crc kubenswrapper[4909]: I0202 10:53:02.599002 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4ac70436-19ae-40b7-b329-33e08bef15bd","Type":"ContainerStarted","Data":"3501ad7e7f5c3ba3544728fb18a1a93b5b3b47281f28d89c082a8235ebca8c05"} Feb 02 10:53:02 crc kubenswrapper[4909]: I0202 10:53:02.717859 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-chlnb"] Feb 02 10:53:02 crc kubenswrapper[4909]: W0202 10:53:02.734030 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25588ec8_c1dd_42ca_983e_54a84e3f8a15.slice/crio-326d28d170b74dc763a97d6e2d21e282864120b889515e6047e895953e1e6a55 WatchSource:0}: Error finding container 326d28d170b74dc763a97d6e2d21e282864120b889515e6047e895953e1e6a55: Status 404 returned error can't find the container with id 326d28d170b74dc763a97d6e2d21e282864120b889515e6047e895953e1e6a55 Feb 02 10:53:03 crc kubenswrapper[4909]: I0202 10:53:03.613944 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-chlnb" 
event={"ID":"25588ec8-c1dd-42ca-983e-54a84e3f8a15","Type":"ContainerStarted","Data":"e94e4dc6904fd62932a918a1984c4c0852316abca98ec3e71a4141a0396799d0"} Feb 02 10:53:03 crc kubenswrapper[4909]: I0202 10:53:03.614321 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-chlnb" event={"ID":"25588ec8-c1dd-42ca-983e-54a84e3f8a15","Type":"ContainerStarted","Data":"326d28d170b74dc763a97d6e2d21e282864120b889515e6047e895953e1e6a55"} Feb 02 10:53:03 crc kubenswrapper[4909]: I0202 10:53:03.616458 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" event={"ID":"9e425915-23f7-4cde-8c2d-3dcbca42e315","Type":"ContainerStarted","Data":"edba01ecd1bbb07cea4955a3e7312f9707855a41378c045c648c030dd7d7725a"} Feb 02 10:53:03 crc kubenswrapper[4909]: I0202 10:53:03.640531 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-chlnb" podStartSLOduration=2.640514097 podStartE2EDuration="2.640514097s" podCreationTimestamp="2026-02-02 10:53:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:53:03.632771167 +0000 UTC m=+1309.378871902" watchObservedRunningTime="2026-02-02 10:53:03.640514097 +0000 UTC m=+1309.386614832" Feb 02 10:53:03 crc kubenswrapper[4909]: I0202 10:53:03.659891 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" podStartSLOduration=3.659867137 podStartE2EDuration="3.659867137s" podCreationTimestamp="2026-02-02 10:53:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:53:03.655022329 +0000 UTC m=+1309.401123064" watchObservedRunningTime="2026-02-02 10:53:03.659867137 +0000 UTC m=+1309.405967872" Feb 02 10:53:04 crc kubenswrapper[4909]: I0202 10:53:04.626004 
4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" Feb 02 10:53:04 crc kubenswrapper[4909]: I0202 10:53:04.996510 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:53:05 crc kubenswrapper[4909]: I0202 10:53:05.013935 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:53:06 crc kubenswrapper[4909]: I0202 10:53:06.647101 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a2966e07-3069-4fd7-b707-c805e95edb41","Type":"ContainerStarted","Data":"d21252d79ad0d3d878f9f1e8c3d6b39324fd3e8dfd1666709c97ce0bd2ecdc06"} Feb 02 10:53:06 crc kubenswrapper[4909]: I0202 10:53:06.647732 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a2966e07-3069-4fd7-b707-c805e95edb41","Type":"ContainerStarted","Data":"9e3c2126168005fa0badbb290ad2ede47c93e11f6398b32899efcac20f32416a"} Feb 02 10:53:06 crc kubenswrapper[4909]: I0202 10:53:06.647262 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a2966e07-3069-4fd7-b707-c805e95edb41" containerName="nova-metadata-metadata" containerID="cri-o://d21252d79ad0d3d878f9f1e8c3d6b39324fd3e8dfd1666709c97ce0bd2ecdc06" gracePeriod=30 Feb 02 10:53:06 crc kubenswrapper[4909]: I0202 10:53:06.647207 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a2966e07-3069-4fd7-b707-c805e95edb41" containerName="nova-metadata-log" containerID="cri-o://9e3c2126168005fa0badbb290ad2ede47c93e11f6398b32899efcac20f32416a" gracePeriod=30 Feb 02 10:53:06 crc kubenswrapper[4909]: I0202 10:53:06.649706 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"ce01e534-820e-4cff-bcf6-f8d401e89e04","Type":"ContainerStarted","Data":"a4aa6b07c367fc7cfccceb7d438ff24f47d22ec8fb03f3acd437ee89b27c8ba1"} Feb 02 10:53:06 crc kubenswrapper[4909]: I0202 10:53:06.649738 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce01e534-820e-4cff-bcf6-f8d401e89e04","Type":"ContainerStarted","Data":"b05c5b98d511e2a3da5be2387945a639c6a9aff97a09b61b93ceef5e4e6bd022"} Feb 02 10:53:06 crc kubenswrapper[4909]: I0202 10:53:06.653038 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0a48086e-b086-4f81-9cb3-16e40fc2c1d5","Type":"ContainerStarted","Data":"806bbf2f7813821e387fd50d1030b913b4e7caeb65653bb90e0d90a42508f01e"} Feb 02 10:53:06 crc kubenswrapper[4909]: I0202 10:53:06.653173 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="0a48086e-b086-4f81-9cb3-16e40fc2c1d5" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://806bbf2f7813821e387fd50d1030b913b4e7caeb65653bb90e0d90a42508f01e" gracePeriod=30 Feb 02 10:53:06 crc kubenswrapper[4909]: I0202 10:53:06.659890 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4ac70436-19ae-40b7-b329-33e08bef15bd","Type":"ContainerStarted","Data":"a80850c211cdc87d3c1e82c0cff67aa1c2184690bbe3cd4753f80ee2c8a921b4"} Feb 02 10:53:06 crc kubenswrapper[4909]: I0202 10:53:06.688237 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.861370053 podStartE2EDuration="6.688219842s" podCreationTimestamp="2026-02-02 10:53:00 +0000 UTC" firstStartedPulling="2026-02-02 10:53:01.595591213 +0000 UTC m=+1307.341691958" lastFinishedPulling="2026-02-02 10:53:05.422441012 +0000 UTC m=+1311.168541747" observedRunningTime="2026-02-02 10:53:06.675378188 +0000 UTC m=+1312.421478923" 
watchObservedRunningTime="2026-02-02 10:53:06.688219842 +0000 UTC m=+1312.434320577" Feb 02 10:53:06 crc kubenswrapper[4909]: I0202 10:53:06.707266 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.223718052 podStartE2EDuration="6.707248133s" podCreationTimestamp="2026-02-02 10:53:00 +0000 UTC" firstStartedPulling="2026-02-02 10:53:01.940243229 +0000 UTC m=+1307.686343964" lastFinishedPulling="2026-02-02 10:53:05.42377331 +0000 UTC m=+1311.169874045" observedRunningTime="2026-02-02 10:53:06.700604364 +0000 UTC m=+1312.446705099" watchObservedRunningTime="2026-02-02 10:53:06.707248133 +0000 UTC m=+1312.453348868" Feb 02 10:53:06 crc kubenswrapper[4909]: I0202 10:53:06.731527 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.096047866 podStartE2EDuration="6.731503711s" podCreationTimestamp="2026-02-02 10:53:00 +0000 UTC" firstStartedPulling="2026-02-02 10:53:01.782603543 +0000 UTC m=+1307.528704278" lastFinishedPulling="2026-02-02 10:53:05.418059388 +0000 UTC m=+1311.164160123" observedRunningTime="2026-02-02 10:53:06.720447447 +0000 UTC m=+1312.466548182" watchObservedRunningTime="2026-02-02 10:53:06.731503711 +0000 UTC m=+1312.477604446" Feb 02 10:53:06 crc kubenswrapper[4909]: I0202 10:53:06.739570 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.2534663569999998 podStartE2EDuration="6.73954681s" podCreationTimestamp="2026-02-02 10:53:00 +0000 UTC" firstStartedPulling="2026-02-02 10:53:01.940158877 +0000 UTC m=+1307.686259612" lastFinishedPulling="2026-02-02 10:53:05.42623933 +0000 UTC m=+1311.172340065" observedRunningTime="2026-02-02 10:53:06.736461362 +0000 UTC m=+1312.482562097" watchObservedRunningTime="2026-02-02 10:53:06.73954681 +0000 UTC m=+1312.485647545" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.293622 4909 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.357690 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2966e07-3069-4fd7-b707-c805e95edb41-logs\") pod \"a2966e07-3069-4fd7-b707-c805e95edb41\" (UID: \"a2966e07-3069-4fd7-b707-c805e95edb41\") " Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.358160 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2966e07-3069-4fd7-b707-c805e95edb41-config-data\") pod \"a2966e07-3069-4fd7-b707-c805e95edb41\" (UID: \"a2966e07-3069-4fd7-b707-c805e95edb41\") " Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.358212 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2966e07-3069-4fd7-b707-c805e95edb41-logs" (OuterVolumeSpecName: "logs") pod "a2966e07-3069-4fd7-b707-c805e95edb41" (UID: "a2966e07-3069-4fd7-b707-c805e95edb41"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.358418 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgk7b\" (UniqueName: \"kubernetes.io/projected/a2966e07-3069-4fd7-b707-c805e95edb41-kube-api-access-kgk7b\") pod \"a2966e07-3069-4fd7-b707-c805e95edb41\" (UID: \"a2966e07-3069-4fd7-b707-c805e95edb41\") " Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.358575 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2966e07-3069-4fd7-b707-c805e95edb41-combined-ca-bundle\") pod \"a2966e07-3069-4fd7-b707-c805e95edb41\" (UID: \"a2966e07-3069-4fd7-b707-c805e95edb41\") " Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.359446 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2966e07-3069-4fd7-b707-c805e95edb41-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.372214 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2966e07-3069-4fd7-b707-c805e95edb41-kube-api-access-kgk7b" (OuterVolumeSpecName: "kube-api-access-kgk7b") pod "a2966e07-3069-4fd7-b707-c805e95edb41" (UID: "a2966e07-3069-4fd7-b707-c805e95edb41"). InnerVolumeSpecName "kube-api-access-kgk7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.389005 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2966e07-3069-4fd7-b707-c805e95edb41-config-data" (OuterVolumeSpecName: "config-data") pod "a2966e07-3069-4fd7-b707-c805e95edb41" (UID: "a2966e07-3069-4fd7-b707-c805e95edb41"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.411274 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2966e07-3069-4fd7-b707-c805e95edb41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2966e07-3069-4fd7-b707-c805e95edb41" (UID: "a2966e07-3069-4fd7-b707-c805e95edb41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.464079 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgk7b\" (UniqueName: \"kubernetes.io/projected/a2966e07-3069-4fd7-b707-c805e95edb41-kube-api-access-kgk7b\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.464107 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2966e07-3069-4fd7-b707-c805e95edb41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.464116 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2966e07-3069-4fd7-b707-c805e95edb41-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.671618 4909 generic.go:334] "Generic (PLEG): container finished" podID="a2966e07-3069-4fd7-b707-c805e95edb41" containerID="d21252d79ad0d3d878f9f1e8c3d6b39324fd3e8dfd1666709c97ce0bd2ecdc06" exitCode=0 Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.671645 4909 generic.go:334] "Generic (PLEG): container finished" podID="a2966e07-3069-4fd7-b707-c805e95edb41" containerID="9e3c2126168005fa0badbb290ad2ede47c93e11f6398b32899efcac20f32416a" exitCode=143 Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.671680 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.673946 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a2966e07-3069-4fd7-b707-c805e95edb41","Type":"ContainerDied","Data":"d21252d79ad0d3d878f9f1e8c3d6b39324fd3e8dfd1666709c97ce0bd2ecdc06"} Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.674086 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a2966e07-3069-4fd7-b707-c805e95edb41","Type":"ContainerDied","Data":"9e3c2126168005fa0badbb290ad2ede47c93e11f6398b32899efcac20f32416a"} Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.674203 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a2966e07-3069-4fd7-b707-c805e95edb41","Type":"ContainerDied","Data":"6c2ec160e1af542d73c04233f7947239cc44a99a7ca5b67b75d41e351afe953d"} Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.674179 4909 scope.go:117] "RemoveContainer" containerID="d21252d79ad0d3d878f9f1e8c3d6b39324fd3e8dfd1666709c97ce0bd2ecdc06" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.712907 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.713500 4909 scope.go:117] "RemoveContainer" containerID="9e3c2126168005fa0badbb290ad2ede47c93e11f6398b32899efcac20f32416a" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.732455 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.753367 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:53:07 crc kubenswrapper[4909]: E0202 10:53:07.753886 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2966e07-3069-4fd7-b707-c805e95edb41" containerName="nova-metadata-metadata" Feb 02 10:53:07 crc 
kubenswrapper[4909]: I0202 10:53:07.753904 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2966e07-3069-4fd7-b707-c805e95edb41" containerName="nova-metadata-metadata" Feb 02 10:53:07 crc kubenswrapper[4909]: E0202 10:53:07.753926 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2966e07-3069-4fd7-b707-c805e95edb41" containerName="nova-metadata-log" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.753940 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2966e07-3069-4fd7-b707-c805e95edb41" containerName="nova-metadata-log" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.754241 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2966e07-3069-4fd7-b707-c805e95edb41" containerName="nova-metadata-log" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.754939 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2966e07-3069-4fd7-b707-c805e95edb41" containerName="nova-metadata-metadata" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.791454 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.791571 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.794087 4909 scope.go:117] "RemoveContainer" containerID="d21252d79ad0d3d878f9f1e8c3d6b39324fd3e8dfd1666709c97ce0bd2ecdc06" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.794150 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.796025 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 02 10:53:07 crc kubenswrapper[4909]: E0202 10:53:07.799946 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d21252d79ad0d3d878f9f1e8c3d6b39324fd3e8dfd1666709c97ce0bd2ecdc06\": container with ID starting with d21252d79ad0d3d878f9f1e8c3d6b39324fd3e8dfd1666709c97ce0bd2ecdc06 not found: ID does not exist" containerID="d21252d79ad0d3d878f9f1e8c3d6b39324fd3e8dfd1666709c97ce0bd2ecdc06" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.800046 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d21252d79ad0d3d878f9f1e8c3d6b39324fd3e8dfd1666709c97ce0bd2ecdc06"} err="failed to get container status \"d21252d79ad0d3d878f9f1e8c3d6b39324fd3e8dfd1666709c97ce0bd2ecdc06\": rpc error: code = NotFound desc = could not find container \"d21252d79ad0d3d878f9f1e8c3d6b39324fd3e8dfd1666709c97ce0bd2ecdc06\": container with ID starting with d21252d79ad0d3d878f9f1e8c3d6b39324fd3e8dfd1666709c97ce0bd2ecdc06 not found: ID does not exist" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.800075 4909 scope.go:117] "RemoveContainer" containerID="9e3c2126168005fa0badbb290ad2ede47c93e11f6398b32899efcac20f32416a" Feb 02 10:53:07 crc kubenswrapper[4909]: E0202 10:53:07.800400 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"9e3c2126168005fa0badbb290ad2ede47c93e11f6398b32899efcac20f32416a\": container with ID starting with 9e3c2126168005fa0badbb290ad2ede47c93e11f6398b32899efcac20f32416a not found: ID does not exist" containerID="9e3c2126168005fa0badbb290ad2ede47c93e11f6398b32899efcac20f32416a" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.800418 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e3c2126168005fa0badbb290ad2ede47c93e11f6398b32899efcac20f32416a"} err="failed to get container status \"9e3c2126168005fa0badbb290ad2ede47c93e11f6398b32899efcac20f32416a\": rpc error: code = NotFound desc = could not find container \"9e3c2126168005fa0badbb290ad2ede47c93e11f6398b32899efcac20f32416a\": container with ID starting with 9e3c2126168005fa0badbb290ad2ede47c93e11f6398b32899efcac20f32416a not found: ID does not exist" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.800430 4909 scope.go:117] "RemoveContainer" containerID="d21252d79ad0d3d878f9f1e8c3d6b39324fd3e8dfd1666709c97ce0bd2ecdc06" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.800637 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d21252d79ad0d3d878f9f1e8c3d6b39324fd3e8dfd1666709c97ce0bd2ecdc06"} err="failed to get container status \"d21252d79ad0d3d878f9f1e8c3d6b39324fd3e8dfd1666709c97ce0bd2ecdc06\": rpc error: code = NotFound desc = could not find container \"d21252d79ad0d3d878f9f1e8c3d6b39324fd3e8dfd1666709c97ce0bd2ecdc06\": container with ID starting with d21252d79ad0d3d878f9f1e8c3d6b39324fd3e8dfd1666709c97ce0bd2ecdc06 not found: ID does not exist" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.800669 4909 scope.go:117] "RemoveContainer" containerID="9e3c2126168005fa0badbb290ad2ede47c93e11f6398b32899efcac20f32416a" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.800935 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9e3c2126168005fa0badbb290ad2ede47c93e11f6398b32899efcac20f32416a"} err="failed to get container status \"9e3c2126168005fa0badbb290ad2ede47c93e11f6398b32899efcac20f32416a\": rpc error: code = NotFound desc = could not find container \"9e3c2126168005fa0badbb290ad2ede47c93e11f6398b32899efcac20f32416a\": container with ID starting with 9e3c2126168005fa0badbb290ad2ede47c93e11f6398b32899efcac20f32416a not found: ID does not exist" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.875241 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ce1d9bb-ef17-41fb-9f68-d689beacce94-logs\") pod \"nova-metadata-0\" (UID: \"9ce1d9bb-ef17-41fb-9f68-d689beacce94\") " pod="openstack/nova-metadata-0" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.875372 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ce1d9bb-ef17-41fb-9f68-d689beacce94-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9ce1d9bb-ef17-41fb-9f68-d689beacce94\") " pod="openstack/nova-metadata-0" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.875407 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ce1d9bb-ef17-41fb-9f68-d689beacce94-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9ce1d9bb-ef17-41fb-9f68-d689beacce94\") " pod="openstack/nova-metadata-0" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.875496 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ce1d9bb-ef17-41fb-9f68-d689beacce94-config-data\") pod \"nova-metadata-0\" (UID: \"9ce1d9bb-ef17-41fb-9f68-d689beacce94\") " pod="openstack/nova-metadata-0" Feb 02 10:53:07 
crc kubenswrapper[4909]: I0202 10:53:07.875517 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d8hg\" (UniqueName: \"kubernetes.io/projected/9ce1d9bb-ef17-41fb-9f68-d689beacce94-kube-api-access-8d8hg\") pod \"nova-metadata-0\" (UID: \"9ce1d9bb-ef17-41fb-9f68-d689beacce94\") " pod="openstack/nova-metadata-0" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.977656 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ce1d9bb-ef17-41fb-9f68-d689beacce94-logs\") pod \"nova-metadata-0\" (UID: \"9ce1d9bb-ef17-41fb-9f68-d689beacce94\") " pod="openstack/nova-metadata-0" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.977970 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ce1d9bb-ef17-41fb-9f68-d689beacce94-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9ce1d9bb-ef17-41fb-9f68-d689beacce94\") " pod="openstack/nova-metadata-0" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.978074 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ce1d9bb-ef17-41fb-9f68-d689beacce94-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9ce1d9bb-ef17-41fb-9f68-d689beacce94\") " pod="openstack/nova-metadata-0" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.978151 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ce1d9bb-ef17-41fb-9f68-d689beacce94-logs\") pod \"nova-metadata-0\" (UID: \"9ce1d9bb-ef17-41fb-9f68-d689beacce94\") " pod="openstack/nova-metadata-0" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.978246 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9ce1d9bb-ef17-41fb-9f68-d689beacce94-config-data\") pod \"nova-metadata-0\" (UID: \"9ce1d9bb-ef17-41fb-9f68-d689beacce94\") " pod="openstack/nova-metadata-0" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.978405 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d8hg\" (UniqueName: \"kubernetes.io/projected/9ce1d9bb-ef17-41fb-9f68-d689beacce94-kube-api-access-8d8hg\") pod \"nova-metadata-0\" (UID: \"9ce1d9bb-ef17-41fb-9f68-d689beacce94\") " pod="openstack/nova-metadata-0" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.981809 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ce1d9bb-ef17-41fb-9f68-d689beacce94-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9ce1d9bb-ef17-41fb-9f68-d689beacce94\") " pod="openstack/nova-metadata-0" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.981959 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ce1d9bb-ef17-41fb-9f68-d689beacce94-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9ce1d9bb-ef17-41fb-9f68-d689beacce94\") " pod="openstack/nova-metadata-0" Feb 02 10:53:07 crc kubenswrapper[4909]: I0202 10:53:07.982169 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ce1d9bb-ef17-41fb-9f68-d689beacce94-config-data\") pod \"nova-metadata-0\" (UID: \"9ce1d9bb-ef17-41fb-9f68-d689beacce94\") " pod="openstack/nova-metadata-0" Feb 02 10:53:08 crc kubenswrapper[4909]: I0202 10:53:08.007206 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d8hg\" (UniqueName: \"kubernetes.io/projected/9ce1d9bb-ef17-41fb-9f68-d689beacce94-kube-api-access-8d8hg\") pod \"nova-metadata-0\" (UID: \"9ce1d9bb-ef17-41fb-9f68-d689beacce94\") " 
pod="openstack/nova-metadata-0" Feb 02 10:53:08 crc kubenswrapper[4909]: I0202 10:53:08.109878 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:53:08 crc kubenswrapper[4909]: I0202 10:53:08.584104 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:53:08 crc kubenswrapper[4909]: I0202 10:53:08.680691 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9ce1d9bb-ef17-41fb-9f68-d689beacce94","Type":"ContainerStarted","Data":"6cf24af71946dd087ac66fc4cf80e8660a4a7f4520333b09f98438fd665e89c2"} Feb 02 10:53:09 crc kubenswrapper[4909]: I0202 10:53:09.028774 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2966e07-3069-4fd7-b707-c805e95edb41" path="/var/lib/kubelet/pods/a2966e07-3069-4fd7-b707-c805e95edb41/volumes" Feb 02 10:53:09 crc kubenswrapper[4909]: I0202 10:53:09.692736 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9ce1d9bb-ef17-41fb-9f68-d689beacce94","Type":"ContainerStarted","Data":"48109e8adda0c10b831c666efd17814cd53648452d6a7bec8b3fba5065591ba5"} Feb 02 10:53:09 crc kubenswrapper[4909]: I0202 10:53:09.692777 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9ce1d9bb-ef17-41fb-9f68-d689beacce94","Type":"ContainerStarted","Data":"312db1cee4e77cfd89b6767d0f8515bb35c24f4f20097ae148c2602e45219786"} Feb 02 10:53:09 crc kubenswrapper[4909]: I0202 10:53:09.694196 4909 generic.go:334] "Generic (PLEG): container finished" podID="0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf" containerID="174be2f9179aec2d0d5dfb3b6c0efd4bcdf236b7cca281a9e2eacfe94339df84" exitCode=0 Feb 02 10:53:09 crc kubenswrapper[4909]: I0202 10:53:09.694223 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hgjsh" 
event={"ID":"0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf","Type":"ContainerDied","Data":"174be2f9179aec2d0d5dfb3b6c0efd4bcdf236b7cca281a9e2eacfe94339df84"} Feb 02 10:53:09 crc kubenswrapper[4909]: I0202 10:53:09.740449 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.740429146 podStartE2EDuration="2.740429146s" podCreationTimestamp="2026-02-02 10:53:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:53:09.719137102 +0000 UTC m=+1315.465237837" watchObservedRunningTime="2026-02-02 10:53:09.740429146 +0000 UTC m=+1315.486529881" Feb 02 10:53:10 crc kubenswrapper[4909]: I0202 10:53:10.705314 4909 generic.go:334] "Generic (PLEG): container finished" podID="25588ec8-c1dd-42ca-983e-54a84e3f8a15" containerID="e94e4dc6904fd62932a918a1984c4c0852316abca98ec3e71a4141a0396799d0" exitCode=0 Feb 02 10:53:10 crc kubenswrapper[4909]: I0202 10:53:10.705403 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-chlnb" event={"ID":"25588ec8-c1dd-42ca-983e-54a84e3f8a15","Type":"ContainerDied","Data":"e94e4dc6904fd62932a918a1984c4c0852316abca98ec3e71a4141a0396799d0"} Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.066485 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hgjsh" Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.108352 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.108450 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.236303 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.236357 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.248289 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf-combined-ca-bundle\") pod \"0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf\" (UID: \"0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf\") " Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.248376 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf-scripts\") pod \"0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf\" (UID: \"0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf\") " Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.248448 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf-config-data\") pod \"0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf\" (UID: \"0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf\") " Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.248634 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkh6q\" (UniqueName: 
\"kubernetes.io/projected/0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf-kube-api-access-lkh6q\") pod \"0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf\" (UID: \"0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf\") " Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.256593 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf-kube-api-access-lkh6q" (OuterVolumeSpecName: "kube-api-access-lkh6q") pod "0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf" (UID: "0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf"). InnerVolumeSpecName "kube-api-access-lkh6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.257592 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.260983 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf-scripts" (OuterVolumeSpecName: "scripts") pod "0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf" (UID: "0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.270983 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.279497 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf-config-data" (OuterVolumeSpecName: "config-data") pod "0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf" (UID: "0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.279734 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf" (UID: "0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.282993 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.348998 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dc67df487-rl62l"] Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.349496 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6dc67df487-rl62l" podUID="17452cea-74ae-4e13-8369-431a2062addc" containerName="dnsmasq-dns" containerID="cri-o://5fcb95bd792a909ea5b933458787933e9ca37ce2f1240323b32b4b2c86cc21ef" gracePeriod=10 Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.353162 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkh6q\" (UniqueName: \"kubernetes.io/projected/0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf-kube-api-access-lkh6q\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.353376 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.353468 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf-scripts\") on node \"crc\" DevicePath \"\"" 
Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.353541 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.725090 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hgjsh" Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.725100 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hgjsh" event={"ID":"0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf","Type":"ContainerDied","Data":"4066ceb2c47b2028a49eed7d0f991f10b024f4891a5d57b189339fd56a0f0e58"} Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.725136 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4066ceb2c47b2028a49eed7d0f991f10b024f4891a5d57b189339fd56a0f0e58" Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.733422 4909 generic.go:334] "Generic (PLEG): container finished" podID="17452cea-74ae-4e13-8369-431a2062addc" containerID="5fcb95bd792a909ea5b933458787933e9ca37ce2f1240323b32b4b2c86cc21ef" exitCode=0 Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.733496 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc67df487-rl62l" event={"ID":"17452cea-74ae-4e13-8369-431a2062addc","Type":"ContainerDied","Data":"5fcb95bd792a909ea5b933458787933e9ca37ce2f1240323b32b4b2c86cc21ef"} Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.794116 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.953611 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.954271 4909 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/nova-api-0" podUID="ce01e534-820e-4cff-bcf6-f8d401e89e04" containerName="nova-api-log" containerID="cri-o://b05c5b98d511e2a3da5be2387945a639c6a9aff97a09b61b93ceef5e4e6bd022" gracePeriod=30 Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.954547 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ce01e534-820e-4cff-bcf6-f8d401e89e04" containerName="nova-api-api" containerID="cri-o://a4aa6b07c367fc7cfccceb7d438ff24f47d22ec8fb03f3acd437ee89b27c8ba1" gracePeriod=30 Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.962940 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ce01e534-820e-4cff-bcf6-f8d401e89e04" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": EOF" Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.965961 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ce01e534-820e-4cff-bcf6-f8d401e89e04" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": EOF" Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.982013 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.982228 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9ce1d9bb-ef17-41fb-9f68-d689beacce94" containerName="nova-metadata-log" containerID="cri-o://312db1cee4e77cfd89b6767d0f8515bb35c24f4f20097ae148c2602e45219786" gracePeriod=30 Feb 02 10:53:11 crc kubenswrapper[4909]: I0202 10:53:11.982681 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9ce1d9bb-ef17-41fb-9f68-d689beacce94" containerName="nova-metadata-metadata" containerID="cri-o://48109e8adda0c10b831c666efd17814cd53648452d6a7bec8b3fba5065591ba5" gracePeriod=30 Feb 
02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.293825 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dc67df487-rl62l" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.310177 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-chlnb" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.379389 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-config\") pod \"17452cea-74ae-4e13-8369-431a2062addc\" (UID: \"17452cea-74ae-4e13-8369-431a2062addc\") " Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.379505 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-dns-swift-storage-0\") pod \"17452cea-74ae-4e13-8369-431a2062addc\" (UID: \"17452cea-74ae-4e13-8369-431a2062addc\") " Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.379552 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-ovsdbserver-nb\") pod \"17452cea-74ae-4e13-8369-431a2062addc\" (UID: \"17452cea-74ae-4e13-8369-431a2062addc\") " Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.379576 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-ovsdbserver-sb\") pod \"17452cea-74ae-4e13-8369-431a2062addc\" (UID: \"17452cea-74ae-4e13-8369-431a2062addc\") " Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.379644 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j477p\" (UniqueName: 
\"kubernetes.io/projected/17452cea-74ae-4e13-8369-431a2062addc-kube-api-access-j477p\") pod \"17452cea-74ae-4e13-8369-431a2062addc\" (UID: \"17452cea-74ae-4e13-8369-431a2062addc\") " Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.379754 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-dns-svc\") pod \"17452cea-74ae-4e13-8369-431a2062addc\" (UID: \"17452cea-74ae-4e13-8369-431a2062addc\") " Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.391861 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17452cea-74ae-4e13-8369-431a2062addc-kube-api-access-j477p" (OuterVolumeSpecName: "kube-api-access-j477p") pod "17452cea-74ae-4e13-8369-431a2062addc" (UID: "17452cea-74ae-4e13-8369-431a2062addc"). InnerVolumeSpecName "kube-api-access-j477p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.444404 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.473738 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "17452cea-74ae-4e13-8369-431a2062addc" (UID: "17452cea-74ae-4e13-8369-431a2062addc"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.482124 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbqkr\" (UniqueName: \"kubernetes.io/projected/25588ec8-c1dd-42ca-983e-54a84e3f8a15-kube-api-access-qbqkr\") pod \"25588ec8-c1dd-42ca-983e-54a84e3f8a15\" (UID: \"25588ec8-c1dd-42ca-983e-54a84e3f8a15\") " Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.482214 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25588ec8-c1dd-42ca-983e-54a84e3f8a15-scripts\") pod \"25588ec8-c1dd-42ca-983e-54a84e3f8a15\" (UID: \"25588ec8-c1dd-42ca-983e-54a84e3f8a15\") " Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.482230 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-config" (OuterVolumeSpecName: "config") pod "17452cea-74ae-4e13-8369-431a2062addc" (UID: "17452cea-74ae-4e13-8369-431a2062addc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.482286 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25588ec8-c1dd-42ca-983e-54a84e3f8a15-combined-ca-bundle\") pod \"25588ec8-c1dd-42ca-983e-54a84e3f8a15\" (UID: \"25588ec8-c1dd-42ca-983e-54a84e3f8a15\") " Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.482491 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25588ec8-c1dd-42ca-983e-54a84e3f8a15-config-data\") pod \"25588ec8-c1dd-42ca-983e-54a84e3f8a15\" (UID: \"25588ec8-c1dd-42ca-983e-54a84e3f8a15\") " Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.483035 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.483060 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.483076 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j477p\" (UniqueName: \"kubernetes.io/projected/17452cea-74ae-4e13-8369-431a2062addc-kube-api-access-j477p\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.485403 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "17452cea-74ae-4e13-8369-431a2062addc" (UID: "17452cea-74ae-4e13-8369-431a2062addc"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.486448 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25588ec8-c1dd-42ca-983e-54a84e3f8a15-kube-api-access-qbqkr" (OuterVolumeSpecName: "kube-api-access-qbqkr") pod "25588ec8-c1dd-42ca-983e-54a84e3f8a15" (UID: "25588ec8-c1dd-42ca-983e-54a84e3f8a15"). InnerVolumeSpecName "kube-api-access-qbqkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.496141 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25588ec8-c1dd-42ca-983e-54a84e3f8a15-scripts" (OuterVolumeSpecName: "scripts") pod "25588ec8-c1dd-42ca-983e-54a84e3f8a15" (UID: "25588ec8-c1dd-42ca-983e-54a84e3f8a15"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.496628 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "17452cea-74ae-4e13-8369-431a2062addc" (UID: "17452cea-74ae-4e13-8369-431a2062addc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.523266 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25588ec8-c1dd-42ca-983e-54a84e3f8a15-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25588ec8-c1dd-42ca-983e-54a84e3f8a15" (UID: "25588ec8-c1dd-42ca-983e-54a84e3f8a15"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.529604 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25588ec8-c1dd-42ca-983e-54a84e3f8a15-config-data" (OuterVolumeSpecName: "config-data") pod "25588ec8-c1dd-42ca-983e-54a84e3f8a15" (UID: "25588ec8-c1dd-42ca-983e-54a84e3f8a15"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.534946 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "17452cea-74ae-4e13-8369-431a2062addc" (UID: "17452cea-74ae-4e13-8369-431a2062addc"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.554677 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.584375 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25588ec8-c1dd-42ca-983e-54a84e3f8a15-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.584414 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.584431 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbqkr\" (UniqueName: \"kubernetes.io/projected/25588ec8-c1dd-42ca-983e-54a84e3f8a15-kube-api-access-qbqkr\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.584445 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25588ec8-c1dd-42ca-983e-54a84e3f8a15-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.584457 4909 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.584467 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25588ec8-c1dd-42ca-983e-54a84e3f8a15-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.584477 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/17452cea-74ae-4e13-8369-431a2062addc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.685728 4909 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ce1d9bb-ef17-41fb-9f68-d689beacce94-logs\") pod \"9ce1d9bb-ef17-41fb-9f68-d689beacce94\" (UID: \"9ce1d9bb-ef17-41fb-9f68-d689beacce94\") " Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.685784 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ce1d9bb-ef17-41fb-9f68-d689beacce94-nova-metadata-tls-certs\") pod \"9ce1d9bb-ef17-41fb-9f68-d689beacce94\" (UID: \"9ce1d9bb-ef17-41fb-9f68-d689beacce94\") " Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.685888 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ce1d9bb-ef17-41fb-9f68-d689beacce94-combined-ca-bundle\") pod \"9ce1d9bb-ef17-41fb-9f68-d689beacce94\" (UID: \"9ce1d9bb-ef17-41fb-9f68-d689beacce94\") " Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.686113 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ce1d9bb-ef17-41fb-9f68-d689beacce94-logs" (OuterVolumeSpecName: "logs") pod "9ce1d9bb-ef17-41fb-9f68-d689beacce94" (UID: "9ce1d9bb-ef17-41fb-9f68-d689beacce94"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.686345 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d8hg\" (UniqueName: \"kubernetes.io/projected/9ce1d9bb-ef17-41fb-9f68-d689beacce94-kube-api-access-8d8hg\") pod \"9ce1d9bb-ef17-41fb-9f68-d689beacce94\" (UID: \"9ce1d9bb-ef17-41fb-9f68-d689beacce94\") " Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.686462 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ce1d9bb-ef17-41fb-9f68-d689beacce94-config-data\") pod \"9ce1d9bb-ef17-41fb-9f68-d689beacce94\" (UID: \"9ce1d9bb-ef17-41fb-9f68-d689beacce94\") " Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.686990 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ce1d9bb-ef17-41fb-9f68-d689beacce94-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.690792 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ce1d9bb-ef17-41fb-9f68-d689beacce94-kube-api-access-8d8hg" (OuterVolumeSpecName: "kube-api-access-8d8hg") pod "9ce1d9bb-ef17-41fb-9f68-d689beacce94" (UID: "9ce1d9bb-ef17-41fb-9f68-d689beacce94"). InnerVolumeSpecName "kube-api-access-8d8hg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.711183 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ce1d9bb-ef17-41fb-9f68-d689beacce94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ce1d9bb-ef17-41fb-9f68-d689beacce94" (UID: "9ce1d9bb-ef17-41fb-9f68-d689beacce94"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.719532 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ce1d9bb-ef17-41fb-9f68-d689beacce94-config-data" (OuterVolumeSpecName: "config-data") pod "9ce1d9bb-ef17-41fb-9f68-d689beacce94" (UID: "9ce1d9bb-ef17-41fb-9f68-d689beacce94"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.754304 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ce1d9bb-ef17-41fb-9f68-d689beacce94-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "9ce1d9bb-ef17-41fb-9f68-d689beacce94" (UID: "9ce1d9bb-ef17-41fb-9f68-d689beacce94"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.766695 4909 generic.go:334] "Generic (PLEG): container finished" podID="ce01e534-820e-4cff-bcf6-f8d401e89e04" containerID="b05c5b98d511e2a3da5be2387945a639c6a9aff97a09b61b93ceef5e4e6bd022" exitCode=143 Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.766758 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce01e534-820e-4cff-bcf6-f8d401e89e04","Type":"ContainerDied","Data":"b05c5b98d511e2a3da5be2387945a639c6a9aff97a09b61b93ceef5e4e6bd022"} Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.768720 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-chlnb" event={"ID":"25588ec8-c1dd-42ca-983e-54a84e3f8a15","Type":"ContainerDied","Data":"326d28d170b74dc763a97d6e2d21e282864120b889515e6047e895953e1e6a55"} Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.768748 4909 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="326d28d170b74dc763a97d6e2d21e282864120b889515e6047e895953e1e6a55" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.768768 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-chlnb" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.771481 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc67df487-rl62l" event={"ID":"17452cea-74ae-4e13-8369-431a2062addc","Type":"ContainerDied","Data":"7cd572050039adbc36366d7ae04bb6867825f7aea830bab43fac184cd47d3c25"} Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.771533 4909 scope.go:117] "RemoveContainer" containerID="5fcb95bd792a909ea5b933458787933e9ca37ce2f1240323b32b4b2c86cc21ef" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.771714 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dc67df487-rl62l" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.787127 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.788214 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9ce1d9bb-ef17-41fb-9f68-d689beacce94","Type":"ContainerDied","Data":"48109e8adda0c10b831c666efd17814cd53648452d6a7bec8b3fba5065591ba5"} Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.788280 4909 generic.go:334] "Generic (PLEG): container finished" podID="9ce1d9bb-ef17-41fb-9f68-d689beacce94" containerID="48109e8adda0c10b831c666efd17814cd53648452d6a7bec8b3fba5065591ba5" exitCode=0 Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.788406 4909 generic.go:334] "Generic (PLEG): container finished" podID="9ce1d9bb-ef17-41fb-9f68-d689beacce94" containerID="312db1cee4e77cfd89b6767d0f8515bb35c24f4f20097ae148c2602e45219786" exitCode=143 Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.788519 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9ce1d9bb-ef17-41fb-9f68-d689beacce94","Type":"ContainerDied","Data":"312db1cee4e77cfd89b6767d0f8515bb35c24f4f20097ae148c2602e45219786"} Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.788545 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9ce1d9bb-ef17-41fb-9f68-d689beacce94","Type":"ContainerDied","Data":"6cf24af71946dd087ac66fc4cf80e8660a4a7f4520333b09f98438fd665e89c2"} Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.789107 4909 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ce1d9bb-ef17-41fb-9f68-d689beacce94-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.789987 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ce1d9bb-ef17-41fb-9f68-d689beacce94-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.790682 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d8hg\" (UniqueName: \"kubernetes.io/projected/9ce1d9bb-ef17-41fb-9f68-d689beacce94-kube-api-access-8d8hg\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.790848 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ce1d9bb-ef17-41fb-9f68-d689beacce94-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.822511 4909 scope.go:117] "RemoveContainer" containerID="02865c2c13ba1491c90611238a5f366850156749981c87da1bb4d977674ed09d" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.832952 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 02 10:53:12 crc kubenswrapper[4909]: E0202 10:53:12.833500 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17452cea-74ae-4e13-8369-431a2062addc" containerName="init" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.833517 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="17452cea-74ae-4e13-8369-431a2062addc" containerName="init" Feb 02 10:53:12 crc kubenswrapper[4909]: E0202 10:53:12.833536 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17452cea-74ae-4e13-8369-431a2062addc" containerName="dnsmasq-dns" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.833542 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="17452cea-74ae-4e13-8369-431a2062addc" containerName="dnsmasq-dns" Feb 02 10:53:12 crc kubenswrapper[4909]: E0202 10:53:12.833555 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ce1d9bb-ef17-41fb-9f68-d689beacce94" containerName="nova-metadata-metadata" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.833561 4909 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9ce1d9bb-ef17-41fb-9f68-d689beacce94" containerName="nova-metadata-metadata" Feb 02 10:53:12 crc kubenswrapper[4909]: E0202 10:53:12.833577 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf" containerName="nova-manage" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.833583 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf" containerName="nova-manage" Feb 02 10:53:12 crc kubenswrapper[4909]: E0202 10:53:12.833596 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25588ec8-c1dd-42ca-983e-54a84e3f8a15" containerName="nova-cell1-conductor-db-sync" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.833602 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="25588ec8-c1dd-42ca-983e-54a84e3f8a15" containerName="nova-cell1-conductor-db-sync" Feb 02 10:53:12 crc kubenswrapper[4909]: E0202 10:53:12.833613 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ce1d9bb-ef17-41fb-9f68-d689beacce94" containerName="nova-metadata-log" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.833619 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ce1d9bb-ef17-41fb-9f68-d689beacce94" containerName="nova-metadata-log" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.833771 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="25588ec8-c1dd-42ca-983e-54a84e3f8a15" containerName="nova-cell1-conductor-db-sync" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.833786 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="17452cea-74ae-4e13-8369-431a2062addc" containerName="dnsmasq-dns" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.833796 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ce1d9bb-ef17-41fb-9f68-d689beacce94" containerName="nova-metadata-log" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.833824 4909 
memory_manager.go:354] "RemoveStaleState removing state" podUID="9ce1d9bb-ef17-41fb-9f68-d689beacce94" containerName="nova-metadata-metadata" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.833839 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf" containerName="nova-manage" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.834481 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.840381 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.848461 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.866562 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dc67df487-rl62l"] Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.885056 4909 scope.go:117] "RemoveContainer" containerID="48109e8adda0c10b831c666efd17814cd53648452d6a7bec8b3fba5065591ba5" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.887121 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6dc67df487-rl62l"] Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.913585 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.917247 4909 scope.go:117] "RemoveContainer" containerID="312db1cee4e77cfd89b6767d0f8515bb35c24f4f20097ae148c2602e45219786" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.935763 4909 scope.go:117] "RemoveContainer" containerID="48109e8adda0c10b831c666efd17814cd53648452d6a7bec8b3fba5065591ba5" Feb 02 10:53:12 crc kubenswrapper[4909]: E0202 10:53:12.936203 4909 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"48109e8adda0c10b831c666efd17814cd53648452d6a7bec8b3fba5065591ba5\": container with ID starting with 48109e8adda0c10b831c666efd17814cd53648452d6a7bec8b3fba5065591ba5 not found: ID does not exist" containerID="48109e8adda0c10b831c666efd17814cd53648452d6a7bec8b3fba5065591ba5" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.936239 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48109e8adda0c10b831c666efd17814cd53648452d6a7bec8b3fba5065591ba5"} err="failed to get container status \"48109e8adda0c10b831c666efd17814cd53648452d6a7bec8b3fba5065591ba5\": rpc error: code = NotFound desc = could not find container \"48109e8adda0c10b831c666efd17814cd53648452d6a7bec8b3fba5065591ba5\": container with ID starting with 48109e8adda0c10b831c666efd17814cd53648452d6a7bec8b3fba5065591ba5 not found: ID does not exist" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.936272 4909 scope.go:117] "RemoveContainer" containerID="312db1cee4e77cfd89b6767d0f8515bb35c24f4f20097ae148c2602e45219786" Feb 02 10:53:12 crc kubenswrapper[4909]: E0202 10:53:12.937321 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"312db1cee4e77cfd89b6767d0f8515bb35c24f4f20097ae148c2602e45219786\": container with ID starting with 312db1cee4e77cfd89b6767d0f8515bb35c24f4f20097ae148c2602e45219786 not found: ID does not exist" containerID="312db1cee4e77cfd89b6767d0f8515bb35c24f4f20097ae148c2602e45219786" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.937352 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"312db1cee4e77cfd89b6767d0f8515bb35c24f4f20097ae148c2602e45219786"} err="failed to get container status \"312db1cee4e77cfd89b6767d0f8515bb35c24f4f20097ae148c2602e45219786\": rpc error: code = NotFound desc = could not find container 
\"312db1cee4e77cfd89b6767d0f8515bb35c24f4f20097ae148c2602e45219786\": container with ID starting with 312db1cee4e77cfd89b6767d0f8515bb35c24f4f20097ae148c2602e45219786 not found: ID does not exist" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.937378 4909 scope.go:117] "RemoveContainer" containerID="48109e8adda0c10b831c666efd17814cd53648452d6a7bec8b3fba5065591ba5" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.938910 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.939748 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48109e8adda0c10b831c666efd17814cd53648452d6a7bec8b3fba5065591ba5"} err="failed to get container status \"48109e8adda0c10b831c666efd17814cd53648452d6a7bec8b3fba5065591ba5\": rpc error: code = NotFound desc = could not find container \"48109e8adda0c10b831c666efd17814cd53648452d6a7bec8b3fba5065591ba5\": container with ID starting with 48109e8adda0c10b831c666efd17814cd53648452d6a7bec8b3fba5065591ba5 not found: ID does not exist" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.940063 4909 scope.go:117] "RemoveContainer" containerID="312db1cee4e77cfd89b6767d0f8515bb35c24f4f20097ae148c2602e45219786" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.940376 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"312db1cee4e77cfd89b6767d0f8515bb35c24f4f20097ae148c2602e45219786"} err="failed to get container status \"312db1cee4e77cfd89b6767d0f8515bb35c24f4f20097ae148c2602e45219786\": rpc error: code = NotFound desc = could not find container \"312db1cee4e77cfd89b6767d0f8515bb35c24f4f20097ae148c2602e45219786\": container with ID starting with 312db1cee4e77cfd89b6767d0f8515bb35c24f4f20097ae148c2602e45219786 not found: ID does not exist" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.967766 4909 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.969856 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.972009 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.972175 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.977518 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.993201 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05143579-706d-4107-9d7a-a63b4a13c187-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"05143579-706d-4107-9d7a-a63b4a13c187\") " pod="openstack/nova-cell1-conductor-0" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.993264 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05143579-706d-4107-9d7a-a63b4a13c187-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"05143579-706d-4107-9d7a-a63b4a13c187\") " pod="openstack/nova-cell1-conductor-0" Feb 02 10:53:12 crc kubenswrapper[4909]: I0202 10:53:12.993338 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2s7w\" (UniqueName: \"kubernetes.io/projected/05143579-706d-4107-9d7a-a63b4a13c187-kube-api-access-l2s7w\") pod \"nova-cell1-conductor-0\" (UID: \"05143579-706d-4107-9d7a-a63b4a13c187\") " pod="openstack/nova-cell1-conductor-0" Feb 02 10:53:13 crc kubenswrapper[4909]: I0202 10:53:13.028035 4909 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17452cea-74ae-4e13-8369-431a2062addc" path="/var/lib/kubelet/pods/17452cea-74ae-4e13-8369-431a2062addc/volumes" Feb 02 10:53:13 crc kubenswrapper[4909]: I0202 10:53:13.028949 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ce1d9bb-ef17-41fb-9f68-d689beacce94" path="/var/lib/kubelet/pods/9ce1d9bb-ef17-41fb-9f68-d689beacce94/volumes" Feb 02 10:53:13 crc kubenswrapper[4909]: I0202 10:53:13.094865 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/411a3543-746a-4905-bd62-c712fa09daef-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"411a3543-746a-4905-bd62-c712fa09daef\") " pod="openstack/nova-metadata-0" Feb 02 10:53:13 crc kubenswrapper[4909]: I0202 10:53:13.094926 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2s7w\" (UniqueName: \"kubernetes.io/projected/05143579-706d-4107-9d7a-a63b4a13c187-kube-api-access-l2s7w\") pod \"nova-cell1-conductor-0\" (UID: \"05143579-706d-4107-9d7a-a63b4a13c187\") " pod="openstack/nova-cell1-conductor-0" Feb 02 10:53:13 crc kubenswrapper[4909]: I0202 10:53:13.094963 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b44m\" (UniqueName: \"kubernetes.io/projected/411a3543-746a-4905-bd62-c712fa09daef-kube-api-access-9b44m\") pod \"nova-metadata-0\" (UID: \"411a3543-746a-4905-bd62-c712fa09daef\") " pod="openstack/nova-metadata-0" Feb 02 10:53:13 crc kubenswrapper[4909]: I0202 10:53:13.095002 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/411a3543-746a-4905-bd62-c712fa09daef-config-data\") pod \"nova-metadata-0\" (UID: \"411a3543-746a-4905-bd62-c712fa09daef\") " pod="openstack/nova-metadata-0" Feb 02 10:53:13 crc 
kubenswrapper[4909]: I0202 10:53:13.095141 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/411a3543-746a-4905-bd62-c712fa09daef-logs\") pod \"nova-metadata-0\" (UID: \"411a3543-746a-4905-bd62-c712fa09daef\") " pod="openstack/nova-metadata-0" Feb 02 10:53:13 crc kubenswrapper[4909]: I0202 10:53:13.095187 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/411a3543-746a-4905-bd62-c712fa09daef-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"411a3543-746a-4905-bd62-c712fa09daef\") " pod="openstack/nova-metadata-0" Feb 02 10:53:13 crc kubenswrapper[4909]: I0202 10:53:13.095382 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05143579-706d-4107-9d7a-a63b4a13c187-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"05143579-706d-4107-9d7a-a63b4a13c187\") " pod="openstack/nova-cell1-conductor-0" Feb 02 10:53:13 crc kubenswrapper[4909]: I0202 10:53:13.095460 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05143579-706d-4107-9d7a-a63b4a13c187-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"05143579-706d-4107-9d7a-a63b4a13c187\") " pod="openstack/nova-cell1-conductor-0" Feb 02 10:53:13 crc kubenswrapper[4909]: I0202 10:53:13.108643 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05143579-706d-4107-9d7a-a63b4a13c187-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"05143579-706d-4107-9d7a-a63b4a13c187\") " pod="openstack/nova-cell1-conductor-0" Feb 02 10:53:13 crc kubenswrapper[4909]: I0202 10:53:13.114488 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05143579-706d-4107-9d7a-a63b4a13c187-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"05143579-706d-4107-9d7a-a63b4a13c187\") " pod="openstack/nova-cell1-conductor-0" Feb 02 10:53:13 crc kubenswrapper[4909]: I0202 10:53:13.115998 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2s7w\" (UniqueName: \"kubernetes.io/projected/05143579-706d-4107-9d7a-a63b4a13c187-kube-api-access-l2s7w\") pod \"nova-cell1-conductor-0\" (UID: \"05143579-706d-4107-9d7a-a63b4a13c187\") " pod="openstack/nova-cell1-conductor-0" Feb 02 10:53:13 crc kubenswrapper[4909]: I0202 10:53:13.160294 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 02 10:53:13 crc kubenswrapper[4909]: I0202 10:53:13.197331 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b44m\" (UniqueName: \"kubernetes.io/projected/411a3543-746a-4905-bd62-c712fa09daef-kube-api-access-9b44m\") pod \"nova-metadata-0\" (UID: \"411a3543-746a-4905-bd62-c712fa09daef\") " pod="openstack/nova-metadata-0" Feb 02 10:53:13 crc kubenswrapper[4909]: I0202 10:53:13.197397 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/411a3543-746a-4905-bd62-c712fa09daef-config-data\") pod \"nova-metadata-0\" (UID: \"411a3543-746a-4905-bd62-c712fa09daef\") " pod="openstack/nova-metadata-0" Feb 02 10:53:13 crc kubenswrapper[4909]: I0202 10:53:13.197454 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/411a3543-746a-4905-bd62-c712fa09daef-logs\") pod \"nova-metadata-0\" (UID: \"411a3543-746a-4905-bd62-c712fa09daef\") " pod="openstack/nova-metadata-0" Feb 02 10:53:13 crc kubenswrapper[4909]: I0202 10:53:13.197476 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/411a3543-746a-4905-bd62-c712fa09daef-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"411a3543-746a-4905-bd62-c712fa09daef\") " pod="openstack/nova-metadata-0" Feb 02 10:53:13 crc kubenswrapper[4909]: I0202 10:53:13.197827 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/411a3543-746a-4905-bd62-c712fa09daef-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"411a3543-746a-4905-bd62-c712fa09daef\") " pod="openstack/nova-metadata-0" Feb 02 10:53:13 crc kubenswrapper[4909]: I0202 10:53:13.198004 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/411a3543-746a-4905-bd62-c712fa09daef-logs\") pod \"nova-metadata-0\" (UID: \"411a3543-746a-4905-bd62-c712fa09daef\") " pod="openstack/nova-metadata-0" Feb 02 10:53:13 crc kubenswrapper[4909]: I0202 10:53:13.206494 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/411a3543-746a-4905-bd62-c712fa09daef-config-data\") pod \"nova-metadata-0\" (UID: \"411a3543-746a-4905-bd62-c712fa09daef\") " pod="openstack/nova-metadata-0" Feb 02 10:53:13 crc kubenswrapper[4909]: I0202 10:53:13.206630 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/411a3543-746a-4905-bd62-c712fa09daef-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"411a3543-746a-4905-bd62-c712fa09daef\") " pod="openstack/nova-metadata-0" Feb 02 10:53:13 crc kubenswrapper[4909]: I0202 10:53:13.210825 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/411a3543-746a-4905-bd62-c712fa09daef-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"411a3543-746a-4905-bd62-c712fa09daef\") " 
pod="openstack/nova-metadata-0" Feb 02 10:53:13 crc kubenswrapper[4909]: I0202 10:53:13.217544 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b44m\" (UniqueName: \"kubernetes.io/projected/411a3543-746a-4905-bd62-c712fa09daef-kube-api-access-9b44m\") pod \"nova-metadata-0\" (UID: \"411a3543-746a-4905-bd62-c712fa09daef\") " pod="openstack/nova-metadata-0" Feb 02 10:53:13 crc kubenswrapper[4909]: I0202 10:53:13.298499 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:53:13 crc kubenswrapper[4909]: I0202 10:53:13.660191 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 02 10:53:13 crc kubenswrapper[4909]: W0202 10:53:13.665625 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05143579_706d_4107_9d7a_a63b4a13c187.slice/crio-b02bf9e93273b22183635526cc42cdd13e7fa25dcb85df2b9d0b26fc7b7fe2d1 WatchSource:0}: Error finding container b02bf9e93273b22183635526cc42cdd13e7fa25dcb85df2b9d0b26fc7b7fe2d1: Status 404 returned error can't find the container with id b02bf9e93273b22183635526cc42cdd13e7fa25dcb85df2b9d0b26fc7b7fe2d1 Feb 02 10:53:13 crc kubenswrapper[4909]: I0202 10:53:13.799405 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"05143579-706d-4107-9d7a-a63b4a13c187","Type":"ContainerStarted","Data":"b02bf9e93273b22183635526cc42cdd13e7fa25dcb85df2b9d0b26fc7b7fe2d1"} Feb 02 10:53:13 crc kubenswrapper[4909]: I0202 10:53:13.802527 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4ac70436-19ae-40b7-b329-33e08bef15bd" containerName="nova-scheduler-scheduler" containerID="cri-o://a80850c211cdc87d3c1e82c0cff67aa1c2184690bbe3cd4753f80ee2c8a921b4" gracePeriod=30 Feb 02 10:53:13 crc kubenswrapper[4909]: I0202 
10:53:13.827056 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:53:14 crc kubenswrapper[4909]: I0202 10:53:14.811560 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"05143579-706d-4107-9d7a-a63b4a13c187","Type":"ContainerStarted","Data":"7e1c8b8ad2c25126925d89042eabffbf703c5cb8a560053f5e2057f259e35466"} Feb 02 10:53:14 crc kubenswrapper[4909]: I0202 10:53:14.812062 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 02 10:53:14 crc kubenswrapper[4909]: I0202 10:53:14.813862 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"411a3543-746a-4905-bd62-c712fa09daef","Type":"ContainerStarted","Data":"39d6a6d692a45e86fe84cf72c598e783c0ac5b208077cc1b9bde3cbf876c90d6"} Feb 02 10:53:14 crc kubenswrapper[4909]: I0202 10:53:14.813911 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"411a3543-746a-4905-bd62-c712fa09daef","Type":"ContainerStarted","Data":"14bfa9457ecf535f5938e5e44eaccbb2e2a77aaae5daf4b985bf690007440756"} Feb 02 10:53:14 crc kubenswrapper[4909]: I0202 10:53:14.813924 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"411a3543-746a-4905-bd62-c712fa09daef","Type":"ContainerStarted","Data":"9aea5850e7f172549c73eaf0a4fe548e419f737812cb382c8f4b06645b7254a3"} Feb 02 10:53:14 crc kubenswrapper[4909]: I0202 10:53:14.846656 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.846635601 podStartE2EDuration="2.846635601s" podCreationTimestamp="2026-02-02 10:53:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:53:14.836298107 +0000 UTC m=+1320.582398842" 
watchObservedRunningTime="2026-02-02 10:53:14.846635601 +0000 UTC m=+1320.592736336" Feb 02 10:53:14 crc kubenswrapper[4909]: I0202 10:53:14.856274 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.856255904 podStartE2EDuration="2.856255904s" podCreationTimestamp="2026-02-02 10:53:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:53:14.853525296 +0000 UTC m=+1320.599626041" watchObservedRunningTime="2026-02-02 10:53:14.856255904 +0000 UTC m=+1320.602356649" Feb 02 10:53:14 crc kubenswrapper[4909]: I0202 10:53:14.976076 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 02 10:53:16 crc kubenswrapper[4909]: E0202 10:53:16.238794 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a80850c211cdc87d3c1e82c0cff67aa1c2184690bbe3cd4753f80ee2c8a921b4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 02 10:53:16 crc kubenswrapper[4909]: E0202 10:53:16.241674 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a80850c211cdc87d3c1e82c0cff67aa1c2184690bbe3cd4753f80ee2c8a921b4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 02 10:53:16 crc kubenswrapper[4909]: E0202 10:53:16.242858 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a80850c211cdc87d3c1e82c0cff67aa1c2184690bbe3cd4753f80ee2c8a921b4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 02 10:53:16 
crc kubenswrapper[4909]: E0202 10:53:16.242892 4909 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="4ac70436-19ae-40b7-b329-33e08bef15bd" containerName="nova-scheduler-scheduler" Feb 02 10:53:17 crc kubenswrapper[4909]: I0202 10:53:17.836669 4909 generic.go:334] "Generic (PLEG): container finished" podID="4ac70436-19ae-40b7-b329-33e08bef15bd" containerID="a80850c211cdc87d3c1e82c0cff67aa1c2184690bbe3cd4753f80ee2c8a921b4" exitCode=0 Feb 02 10:53:17 crc kubenswrapper[4909]: I0202 10:53:17.837014 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4ac70436-19ae-40b7-b329-33e08bef15bd","Type":"ContainerDied","Data":"a80850c211cdc87d3c1e82c0cff67aa1c2184690bbe3cd4753f80ee2c8a921b4"} Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.189721 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.243583 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.298643 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.299599 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.312045 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ac70436-19ae-40b7-b329-33e08bef15bd-combined-ca-bundle\") pod \"4ac70436-19ae-40b7-b329-33e08bef15bd\" (UID: \"4ac70436-19ae-40b7-b329-33e08bef15bd\") " Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.312102 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ac70436-19ae-40b7-b329-33e08bef15bd-config-data\") pod \"4ac70436-19ae-40b7-b329-33e08bef15bd\" (UID: \"4ac70436-19ae-40b7-b329-33e08bef15bd\") " Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.312234 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9q22\" (UniqueName: \"kubernetes.io/projected/4ac70436-19ae-40b7-b329-33e08bef15bd-kube-api-access-p9q22\") pod \"4ac70436-19ae-40b7-b329-33e08bef15bd\" (UID: \"4ac70436-19ae-40b7-b329-33e08bef15bd\") " Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.317999 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ac70436-19ae-40b7-b329-33e08bef15bd-kube-api-access-p9q22" (OuterVolumeSpecName: "kube-api-access-p9q22") pod "4ac70436-19ae-40b7-b329-33e08bef15bd" (UID: "4ac70436-19ae-40b7-b329-33e08bef15bd"). InnerVolumeSpecName "kube-api-access-p9q22". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.346181 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ac70436-19ae-40b7-b329-33e08bef15bd-config-data" (OuterVolumeSpecName: "config-data") pod "4ac70436-19ae-40b7-b329-33e08bef15bd" (UID: "4ac70436-19ae-40b7-b329-33e08bef15bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.346286 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ac70436-19ae-40b7-b329-33e08bef15bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ac70436-19ae-40b7-b329-33e08bef15bd" (UID: "4ac70436-19ae-40b7-b329-33e08bef15bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.414032 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ac70436-19ae-40b7-b329-33e08bef15bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.414066 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ac70436-19ae-40b7-b329-33e08bef15bd-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.414077 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9q22\" (UniqueName: \"kubernetes.io/projected/4ac70436-19ae-40b7-b329-33e08bef15bd-kube-api-access-p9q22\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.510409 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.510642 4909 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="6b8f1870-afe0-4ac5-a633-e87905ab1d5b" containerName="kube-state-metrics" containerID="cri-o://36feeb53bfb30eccb100ee6f2672b7eaa4b2f848c04254eb7c5e1d09d61b996a" gracePeriod=30 Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.848557 4909 generic.go:334] "Generic (PLEG): container finished" podID="6b8f1870-afe0-4ac5-a633-e87905ab1d5b" containerID="36feeb53bfb30eccb100ee6f2672b7eaa4b2f848c04254eb7c5e1d09d61b996a" exitCode=2 Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.848707 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6b8f1870-afe0-4ac5-a633-e87905ab1d5b","Type":"ContainerDied","Data":"36feeb53bfb30eccb100ee6f2672b7eaa4b2f848c04254eb7c5e1d09d61b996a"} Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.851296 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.851443 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4ac70436-19ae-40b7-b329-33e08bef15bd","Type":"ContainerDied","Data":"3501ad7e7f5c3ba3544728fb18a1a93b5b3b47281f28d89c082a8235ebca8c05"} Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.851516 4909 scope.go:117] "RemoveContainer" containerID="a80850c211cdc87d3c1e82c0cff67aa1c2184690bbe3cd4753f80ee2c8a921b4" Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.856401 4909 generic.go:334] "Generic (PLEG): container finished" podID="ce01e534-820e-4cff-bcf6-f8d401e89e04" containerID="a4aa6b07c367fc7cfccceb7d438ff24f47d22ec8fb03f3acd437ee89b27c8ba1" exitCode=0 Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.856470 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"ce01e534-820e-4cff-bcf6-f8d401e89e04","Type":"ContainerDied","Data":"a4aa6b07c367fc7cfccceb7d438ff24f47d22ec8fb03f3acd437ee89b27c8ba1"} Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.856512 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ce01e534-820e-4cff-bcf6-f8d401e89e04","Type":"ContainerDied","Data":"eb3365b227224f470909a76192653a6b8bed83c864e2e9a3ffa9102804a3456c"} Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.856525 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb3365b227224f470909a76192653a6b8bed83c864e2e9a3ffa9102804a3456c" Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.935380 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.944117 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.980840 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.995025 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:53:18 crc kubenswrapper[4909]: E0202 10:53:18.995662 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce01e534-820e-4cff-bcf6-f8d401e89e04" containerName="nova-api-api" Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.995789 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce01e534-820e-4cff-bcf6-f8d401e89e04" containerName="nova-api-api" Feb 02 10:53:18 crc kubenswrapper[4909]: E0202 10:53:18.995887 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce01e534-820e-4cff-bcf6-f8d401e89e04" containerName="nova-api-log" Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.995968 4909 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ce01e534-820e-4cff-bcf6-f8d401e89e04" containerName="nova-api-log" Feb 02 10:53:18 crc kubenswrapper[4909]: E0202 10:53:18.996047 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ac70436-19ae-40b7-b329-33e08bef15bd" containerName="nova-scheduler-scheduler" Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.996122 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ac70436-19ae-40b7-b329-33e08bef15bd" containerName="nova-scheduler-scheduler" Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.996626 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ac70436-19ae-40b7-b329-33e08bef15bd" containerName="nova-scheduler-scheduler" Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.996738 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce01e534-820e-4cff-bcf6-f8d401e89e04" containerName="nova-api-api" Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.996848 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce01e534-820e-4cff-bcf6-f8d401e89e04" containerName="nova-api-log" Feb 02 10:53:18 crc kubenswrapper[4909]: I0202 10:53:18.997951 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.000365 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.027020 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d858154b-d091-48f8-9473-69432c34a59e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d858154b-d091-48f8-9473-69432c34a59e\") " pod="openstack/nova-scheduler-0" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.027128 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d858154b-d091-48f8-9473-69432c34a59e-config-data\") pod \"nova-scheduler-0\" (UID: \"d858154b-d091-48f8-9473-69432c34a59e\") " pod="openstack/nova-scheduler-0" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.027342 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbnjf\" (UniqueName: \"kubernetes.io/projected/d858154b-d091-48f8-9473-69432c34a59e-kube-api-access-dbnjf\") pod \"nova-scheduler-0\" (UID: \"d858154b-d091-48f8-9473-69432c34a59e\") " pod="openstack/nova-scheduler-0" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.029987 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ac70436-19ae-40b7-b329-33e08bef15bd" path="/var/lib/kubelet/pods/4ac70436-19ae-40b7-b329-33e08bef15bd/volumes" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.070991 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.083166 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.128270 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce01e534-820e-4cff-bcf6-f8d401e89e04-config-data\") pod \"ce01e534-820e-4cff-bcf6-f8d401e89e04\" (UID: \"ce01e534-820e-4cff-bcf6-f8d401e89e04\") " Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.128315 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkl5x\" (UniqueName: \"kubernetes.io/projected/ce01e534-820e-4cff-bcf6-f8d401e89e04-kube-api-access-wkl5x\") pod \"ce01e534-820e-4cff-bcf6-f8d401e89e04\" (UID: \"ce01e534-820e-4cff-bcf6-f8d401e89e04\") " Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.128444 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce01e534-820e-4cff-bcf6-f8d401e89e04-combined-ca-bundle\") pod \"ce01e534-820e-4cff-bcf6-f8d401e89e04\" (UID: \"ce01e534-820e-4cff-bcf6-f8d401e89e04\") " Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.128491 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce01e534-820e-4cff-bcf6-f8d401e89e04-logs\") pod \"ce01e534-820e-4cff-bcf6-f8d401e89e04\" (UID: \"ce01e534-820e-4cff-bcf6-f8d401e89e04\") " Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.128740 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d858154b-d091-48f8-9473-69432c34a59e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d858154b-d091-48f8-9473-69432c34a59e\") " pod="openstack/nova-scheduler-0" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.128828 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/d858154b-d091-48f8-9473-69432c34a59e-config-data\") pod \"nova-scheduler-0\" (UID: \"d858154b-d091-48f8-9473-69432c34a59e\") " pod="openstack/nova-scheduler-0" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.129189 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbnjf\" (UniqueName: \"kubernetes.io/projected/d858154b-d091-48f8-9473-69432c34a59e-kube-api-access-dbnjf\") pod \"nova-scheduler-0\" (UID: \"d858154b-d091-48f8-9473-69432c34a59e\") " pod="openstack/nova-scheduler-0" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.132404 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce01e534-820e-4cff-bcf6-f8d401e89e04-logs" (OuterVolumeSpecName: "logs") pod "ce01e534-820e-4cff-bcf6-f8d401e89e04" (UID: "ce01e534-820e-4cff-bcf6-f8d401e89e04"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.134112 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce01e534-820e-4cff-bcf6-f8d401e89e04-kube-api-access-wkl5x" (OuterVolumeSpecName: "kube-api-access-wkl5x") pod "ce01e534-820e-4cff-bcf6-f8d401e89e04" (UID: "ce01e534-820e-4cff-bcf6-f8d401e89e04"). InnerVolumeSpecName "kube-api-access-wkl5x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.141450 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d858154b-d091-48f8-9473-69432c34a59e-config-data\") pod \"nova-scheduler-0\" (UID: \"d858154b-d091-48f8-9473-69432c34a59e\") " pod="openstack/nova-scheduler-0" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.149230 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d858154b-d091-48f8-9473-69432c34a59e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d858154b-d091-48f8-9473-69432c34a59e\") " pod="openstack/nova-scheduler-0" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.156460 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbnjf\" (UniqueName: \"kubernetes.io/projected/d858154b-d091-48f8-9473-69432c34a59e-kube-api-access-dbnjf\") pod \"nova-scheduler-0\" (UID: \"d858154b-d091-48f8-9473-69432c34a59e\") " pod="openstack/nova-scheduler-0" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.171701 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce01e534-820e-4cff-bcf6-f8d401e89e04-config-data" (OuterVolumeSpecName: "config-data") pod "ce01e534-820e-4cff-bcf6-f8d401e89e04" (UID: "ce01e534-820e-4cff-bcf6-f8d401e89e04"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.190504 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce01e534-820e-4cff-bcf6-f8d401e89e04-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce01e534-820e-4cff-bcf6-f8d401e89e04" (UID: "ce01e534-820e-4cff-bcf6-f8d401e89e04"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.230158 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbsfn\" (UniqueName: \"kubernetes.io/projected/6b8f1870-afe0-4ac5-a633-e87905ab1d5b-kube-api-access-zbsfn\") pod \"6b8f1870-afe0-4ac5-a633-e87905ab1d5b\" (UID: \"6b8f1870-afe0-4ac5-a633-e87905ab1d5b\") " Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.230655 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce01e534-820e-4cff-bcf6-f8d401e89e04-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.230687 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce01e534-820e-4cff-bcf6-f8d401e89e04-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.230697 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce01e534-820e-4cff-bcf6-f8d401e89e04-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.230705 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkl5x\" (UniqueName: \"kubernetes.io/projected/ce01e534-820e-4cff-bcf6-f8d401e89e04-kube-api-access-wkl5x\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.233360 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b8f1870-afe0-4ac5-a633-e87905ab1d5b-kube-api-access-zbsfn" (OuterVolumeSpecName: "kube-api-access-zbsfn") pod "6b8f1870-afe0-4ac5-a633-e87905ab1d5b" (UID: "6b8f1870-afe0-4ac5-a633-e87905ab1d5b"). InnerVolumeSpecName "kube-api-access-zbsfn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.332037 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbsfn\" (UniqueName: \"kubernetes.io/projected/6b8f1870-afe0-4ac5-a633-e87905ab1d5b-kube-api-access-zbsfn\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.403214 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.511262 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.511474 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.869832 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6b8f1870-afe0-4ac5-a633-e87905ab1d5b","Type":"ContainerDied","Data":"2debc0db52befda55cf65379c84a06e45bb718483eba379a79184a62c7c47797"} Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.869881 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.869845 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.869900 4909 scope.go:117] "RemoveContainer" containerID="36feeb53bfb30eccb100ee6f2672b7eaa4b2f848c04254eb7c5e1d09d61b996a" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.894647 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.919175 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.937859 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.963636 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.974455 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.986087 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:53:19 crc kubenswrapper[4909]: E0202 10:53:19.986571 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b8f1870-afe0-4ac5-a633-e87905ab1d5b" containerName="kube-state-metrics" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.986596 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b8f1870-afe0-4ac5-a633-e87905ab1d5b" containerName="kube-state-metrics" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.987002 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b8f1870-afe0-4ac5-a633-e87905ab1d5b" containerName="kube-state-metrics" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.987724 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.992622 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.993090 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.997944 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 02 10:53:19 crc kubenswrapper[4909]: I0202 10:53:19.999478 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.001774 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.015323 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.027557 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.050611 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10cb2fd1-d09d-451f-97db-76a45a247a5b-config-data\") pod \"nova-api-0\" (UID: \"10cb2fd1-d09d-451f-97db-76a45a247a5b\") " pod="openstack/nova-api-0" Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.050653 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/236e10c6-5b7d-4f9e-b82a-5c68edc93692-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"236e10c6-5b7d-4f9e-b82a-5c68edc93692\") " pod="openstack/kube-state-metrics-0" Feb 02 10:53:20 crc 
kubenswrapper[4909]: I0202 10:53:20.050701 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/236e10c6-5b7d-4f9e-b82a-5c68edc93692-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"236e10c6-5b7d-4f9e-b82a-5c68edc93692\") " pod="openstack/kube-state-metrics-0" Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.050915 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/236e10c6-5b7d-4f9e-b82a-5c68edc93692-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"236e10c6-5b7d-4f9e-b82a-5c68edc93692\") " pod="openstack/kube-state-metrics-0" Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.051038 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10cb2fd1-d09d-451f-97db-76a45a247a5b-logs\") pod \"nova-api-0\" (UID: \"10cb2fd1-d09d-451f-97db-76a45a247a5b\") " pod="openstack/nova-api-0" Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.051130 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96t4p\" (UniqueName: \"kubernetes.io/projected/10cb2fd1-d09d-451f-97db-76a45a247a5b-kube-api-access-96t4p\") pod \"nova-api-0\" (UID: \"10cb2fd1-d09d-451f-97db-76a45a247a5b\") " pod="openstack/nova-api-0" Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.051214 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10cb2fd1-d09d-451f-97db-76a45a247a5b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"10cb2fd1-d09d-451f-97db-76a45a247a5b\") " pod="openstack/nova-api-0" Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.051358 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skgpn\" (UniqueName: \"kubernetes.io/projected/236e10c6-5b7d-4f9e-b82a-5c68edc93692-kube-api-access-skgpn\") pod \"kube-state-metrics-0\" (UID: \"236e10c6-5b7d-4f9e-b82a-5c68edc93692\") " pod="openstack/kube-state-metrics-0" Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.153016 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10cb2fd1-d09d-451f-97db-76a45a247a5b-logs\") pod \"nova-api-0\" (UID: \"10cb2fd1-d09d-451f-97db-76a45a247a5b\") " pod="openstack/nova-api-0" Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.153077 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96t4p\" (UniqueName: \"kubernetes.io/projected/10cb2fd1-d09d-451f-97db-76a45a247a5b-kube-api-access-96t4p\") pod \"nova-api-0\" (UID: \"10cb2fd1-d09d-451f-97db-76a45a247a5b\") " pod="openstack/nova-api-0" Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.153178 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10cb2fd1-d09d-451f-97db-76a45a247a5b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"10cb2fd1-d09d-451f-97db-76a45a247a5b\") " pod="openstack/nova-api-0" Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.153222 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skgpn\" (UniqueName: \"kubernetes.io/projected/236e10c6-5b7d-4f9e-b82a-5c68edc93692-kube-api-access-skgpn\") pod \"kube-state-metrics-0\" (UID: \"236e10c6-5b7d-4f9e-b82a-5c68edc93692\") " pod="openstack/kube-state-metrics-0" Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.153249 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/10cb2fd1-d09d-451f-97db-76a45a247a5b-config-data\") pod \"nova-api-0\" (UID: \"10cb2fd1-d09d-451f-97db-76a45a247a5b\") " pod="openstack/nova-api-0" Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.153275 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/236e10c6-5b7d-4f9e-b82a-5c68edc93692-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"236e10c6-5b7d-4f9e-b82a-5c68edc93692\") " pod="openstack/kube-state-metrics-0" Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.153323 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/236e10c6-5b7d-4f9e-b82a-5c68edc93692-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"236e10c6-5b7d-4f9e-b82a-5c68edc93692\") " pod="openstack/kube-state-metrics-0" Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.153382 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/236e10c6-5b7d-4f9e-b82a-5c68edc93692-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"236e10c6-5b7d-4f9e-b82a-5c68edc93692\") " pod="openstack/kube-state-metrics-0" Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.153841 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10cb2fd1-d09d-451f-97db-76a45a247a5b-logs\") pod \"nova-api-0\" (UID: \"10cb2fd1-d09d-451f-97db-76a45a247a5b\") " pod="openstack/nova-api-0" Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.158353 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/236e10c6-5b7d-4f9e-b82a-5c68edc93692-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: 
\"236e10c6-5b7d-4f9e-b82a-5c68edc93692\") " pod="openstack/kube-state-metrics-0" Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.160357 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10cb2fd1-d09d-451f-97db-76a45a247a5b-config-data\") pod \"nova-api-0\" (UID: \"10cb2fd1-d09d-451f-97db-76a45a247a5b\") " pod="openstack/nova-api-0" Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.160367 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/236e10c6-5b7d-4f9e-b82a-5c68edc93692-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"236e10c6-5b7d-4f9e-b82a-5c68edc93692\") " pod="openstack/kube-state-metrics-0" Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.160369 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/236e10c6-5b7d-4f9e-b82a-5c68edc93692-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"236e10c6-5b7d-4f9e-b82a-5c68edc93692\") " pod="openstack/kube-state-metrics-0" Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.161147 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10cb2fd1-d09d-451f-97db-76a45a247a5b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"10cb2fd1-d09d-451f-97db-76a45a247a5b\") " pod="openstack/nova-api-0" Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.172943 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skgpn\" (UniqueName: \"kubernetes.io/projected/236e10c6-5b7d-4f9e-b82a-5c68edc93692-kube-api-access-skgpn\") pod \"kube-state-metrics-0\" (UID: \"236e10c6-5b7d-4f9e-b82a-5c68edc93692\") " pod="openstack/kube-state-metrics-0" Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.172996 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96t4p\" (UniqueName: \"kubernetes.io/projected/10cb2fd1-d09d-451f-97db-76a45a247a5b-kube-api-access-96t4p\") pod \"nova-api-0\" (UID: \"10cb2fd1-d09d-451f-97db-76a45a247a5b\") " pod="openstack/nova-api-0" Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.353909 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.376560 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.636256 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.636821 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="701fbae9-e013-4311-ab91-55c3fca98e66" containerName="ceilometer-central-agent" containerID="cri-o://c7520860ed8086830f83ea8eeb75a91efc73f9f715cc7e8edcb4df6de3285141" gracePeriod=30 Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.637058 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="701fbae9-e013-4311-ab91-55c3fca98e66" containerName="ceilometer-notification-agent" containerID="cri-o://6756b8b50bce9e6ac3428c7e6ef1818cba927c1445cc09621f987b9deccbc636" gracePeriod=30 Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.637055 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="701fbae9-e013-4311-ab91-55c3fca98e66" containerName="proxy-httpd" containerID="cri-o://f7e1747b6d1f313242d82af25e7dd991bf6bf022e08eb2690531ae6d72eade2c" gracePeriod=30 Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.637134 4909 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="701fbae9-e013-4311-ab91-55c3fca98e66" containerName="sg-core" containerID="cri-o://85c50d0b0aab7c6b62add53fdb78c7c9c19f7033ce17680ae1b0fc4dca9f2c74" gracePeriod=30 Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.815761 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.883685 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d858154b-d091-48f8-9473-69432c34a59e","Type":"ContainerStarted","Data":"2dcf2cc9e0c8372b0a019e8a342f3a071a1ec30e0e9f74be56e2ab8186436ca6"} Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.883735 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d858154b-d091-48f8-9473-69432c34a59e","Type":"ContainerStarted","Data":"5b433b7eb3a6546d1f40baa527418cb0f6b8492fe2fb2cbe12bc9dd86afafa65"} Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.884971 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"236e10c6-5b7d-4f9e-b82a-5c68edc93692","Type":"ContainerStarted","Data":"1269fcc242324b31f01bdc44584ebc1d0b910481dab215f3c382b9ce22b3d69a"} Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.889884 4909 generic.go:334] "Generic (PLEG): container finished" podID="701fbae9-e013-4311-ab91-55c3fca98e66" containerID="f7e1747b6d1f313242d82af25e7dd991bf6bf022e08eb2690531ae6d72eade2c" exitCode=0 Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.889922 4909 generic.go:334] "Generic (PLEG): container finished" podID="701fbae9-e013-4311-ab91-55c3fca98e66" containerID="85c50d0b0aab7c6b62add53fdb78c7c9c19f7033ce17680ae1b0fc4dca9f2c74" exitCode=2 Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.889939 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"701fbae9-e013-4311-ab91-55c3fca98e66","Type":"ContainerDied","Data":"f7e1747b6d1f313242d82af25e7dd991bf6bf022e08eb2690531ae6d72eade2c"} Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.889971 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"701fbae9-e013-4311-ab91-55c3fca98e66","Type":"ContainerDied","Data":"85c50d0b0aab7c6b62add53fdb78c7c9c19f7033ce17680ae1b0fc4dca9f2c74"} Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.910440 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.910420766 podStartE2EDuration="2.910420766s" podCreationTimestamp="2026-02-02 10:53:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:53:20.906137244 +0000 UTC m=+1326.652237989" watchObservedRunningTime="2026-02-02 10:53:20.910420766 +0000 UTC m=+1326.656521501" Feb 02 10:53:20 crc kubenswrapper[4909]: I0202 10:53:20.928268 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:53:21 crc kubenswrapper[4909]: I0202 10:53:21.050834 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b8f1870-afe0-4ac5-a633-e87905ab1d5b" path="/var/lib/kubelet/pods/6b8f1870-afe0-4ac5-a633-e87905ab1d5b/volumes" Feb 02 10:53:21 crc kubenswrapper[4909]: I0202 10:53:21.051387 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce01e534-820e-4cff-bcf6-f8d401e89e04" path="/var/lib/kubelet/pods/ce01e534-820e-4cff-bcf6-f8d401e89e04/volumes" Feb 02 10:53:21 crc kubenswrapper[4909]: I0202 10:53:21.906213 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"236e10c6-5b7d-4f9e-b82a-5c68edc93692","Type":"ContainerStarted","Data":"a5429c7323d593269c829013791c5ec02f6c2311d2ea72699e259a5d72454bac"} Feb 02 10:53:21 crc kubenswrapper[4909]: 
I0202 10:53:21.907000 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 02 10:53:21 crc kubenswrapper[4909]: I0202 10:53:21.909237 4909 generic.go:334] "Generic (PLEG): container finished" podID="701fbae9-e013-4311-ab91-55c3fca98e66" containerID="c7520860ed8086830f83ea8eeb75a91efc73f9f715cc7e8edcb4df6de3285141" exitCode=0 Feb 02 10:53:21 crc kubenswrapper[4909]: I0202 10:53:21.909278 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"701fbae9-e013-4311-ab91-55c3fca98e66","Type":"ContainerDied","Data":"c7520860ed8086830f83ea8eeb75a91efc73f9f715cc7e8edcb4df6de3285141"} Feb 02 10:53:21 crc kubenswrapper[4909]: I0202 10:53:21.911288 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"10cb2fd1-d09d-451f-97db-76a45a247a5b","Type":"ContainerStarted","Data":"63bbbd591f46d57cf98ff4e5a534345a7697c6d683e359b7c022224a87f87bca"} Feb 02 10:53:21 crc kubenswrapper[4909]: I0202 10:53:21.911317 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"10cb2fd1-d09d-451f-97db-76a45a247a5b","Type":"ContainerStarted","Data":"8e26cc3f6d191a7ffb3076fd61ad51a30e9e5502fdbc0cb31d44fa135e09ab4d"} Feb 02 10:53:21 crc kubenswrapper[4909]: I0202 10:53:21.911327 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"10cb2fd1-d09d-451f-97db-76a45a247a5b","Type":"ContainerStarted","Data":"d7d9e97bd3ab4f15b664eaa6320bdcd9c78bc74119b8ff9541f0c3bd1954dd47"} Feb 02 10:53:21 crc kubenswrapper[4909]: I0202 10:53:21.929111 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.569013266 podStartE2EDuration="2.92909344s" podCreationTimestamp="2026-02-02 10:53:19 +0000 UTC" firstStartedPulling="2026-02-02 10:53:20.820532193 +0000 UTC m=+1326.566632928" lastFinishedPulling="2026-02-02 10:53:21.180612367 +0000 
UTC m=+1326.926713102" observedRunningTime="2026-02-02 10:53:21.928434281 +0000 UTC m=+1327.674535026" watchObservedRunningTime="2026-02-02 10:53:21.92909344 +0000 UTC m=+1327.675194175" Feb 02 10:53:21 crc kubenswrapper[4909]: I0202 10:53:21.956479 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.956455147 podStartE2EDuration="2.956455147s" podCreationTimestamp="2026-02-02 10:53:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:53:21.944997172 +0000 UTC m=+1327.691097917" watchObservedRunningTime="2026-02-02 10:53:21.956455147 +0000 UTC m=+1327.702555892" Feb 02 10:53:23 crc kubenswrapper[4909]: I0202 10:53:23.299260 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 02 10:53:23 crc kubenswrapper[4909]: I0202 10:53:23.299301 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 02 10:53:24 crc kubenswrapper[4909]: I0202 10:53:24.314028 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="411a3543-746a-4905-bd62-c712fa09daef" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 02 10:53:24 crc kubenswrapper[4909]: I0202 10:53:24.314580 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="411a3543-746a-4905-bd62-c712fa09daef" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 02 10:53:24 crc kubenswrapper[4909]: I0202 10:53:24.404319 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" 
Feb 02 10:53:25 crc kubenswrapper[4909]: I0202 10:53:25.464867 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:53:25 crc kubenswrapper[4909]: I0202 10:53:25.554527 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/701fbae9-e013-4311-ab91-55c3fca98e66-run-httpd\") pod \"701fbae9-e013-4311-ab91-55c3fca98e66\" (UID: \"701fbae9-e013-4311-ab91-55c3fca98e66\") " Feb 02 10:53:25 crc kubenswrapper[4909]: I0202 10:53:25.554687 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/701fbae9-e013-4311-ab91-55c3fca98e66-log-httpd\") pod \"701fbae9-e013-4311-ab91-55c3fca98e66\" (UID: \"701fbae9-e013-4311-ab91-55c3fca98e66\") " Feb 02 10:53:25 crc kubenswrapper[4909]: I0202 10:53:25.554751 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/701fbae9-e013-4311-ab91-55c3fca98e66-scripts\") pod \"701fbae9-e013-4311-ab91-55c3fca98e66\" (UID: \"701fbae9-e013-4311-ab91-55c3fca98e66\") " Feb 02 10:53:25 crc kubenswrapper[4909]: I0202 10:53:25.554876 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/701fbae9-e013-4311-ab91-55c3fca98e66-sg-core-conf-yaml\") pod \"701fbae9-e013-4311-ab91-55c3fca98e66\" (UID: \"701fbae9-e013-4311-ab91-55c3fca98e66\") " Feb 02 10:53:25 crc kubenswrapper[4909]: I0202 10:53:25.554944 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701fbae9-e013-4311-ab91-55c3fca98e66-combined-ca-bundle\") pod \"701fbae9-e013-4311-ab91-55c3fca98e66\" (UID: \"701fbae9-e013-4311-ab91-55c3fca98e66\") " Feb 02 10:53:25 crc kubenswrapper[4909]: I0202 10:53:25.554992 4909 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ls29\" (UniqueName: \"kubernetes.io/projected/701fbae9-e013-4311-ab91-55c3fca98e66-kube-api-access-7ls29\") pod \"701fbae9-e013-4311-ab91-55c3fca98e66\" (UID: \"701fbae9-e013-4311-ab91-55c3fca98e66\") " Feb 02 10:53:25 crc kubenswrapper[4909]: I0202 10:53:25.555077 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701fbae9-e013-4311-ab91-55c3fca98e66-config-data\") pod \"701fbae9-e013-4311-ab91-55c3fca98e66\" (UID: \"701fbae9-e013-4311-ab91-55c3fca98e66\") " Feb 02 10:53:25 crc kubenswrapper[4909]: I0202 10:53:25.555338 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/701fbae9-e013-4311-ab91-55c3fca98e66-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "701fbae9-e013-4311-ab91-55c3fca98e66" (UID: "701fbae9-e013-4311-ab91-55c3fca98e66"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:53:25 crc kubenswrapper[4909]: I0202 10:53:25.556037 4909 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/701fbae9-e013-4311-ab91-55c3fca98e66-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:25 crc kubenswrapper[4909]: I0202 10:53:25.556183 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/701fbae9-e013-4311-ab91-55c3fca98e66-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "701fbae9-e013-4311-ab91-55c3fca98e66" (UID: "701fbae9-e013-4311-ab91-55c3fca98e66"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:53:25 crc kubenswrapper[4909]: I0202 10:53:25.561178 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701fbae9-e013-4311-ab91-55c3fca98e66-scripts" (OuterVolumeSpecName: "scripts") pod "701fbae9-e013-4311-ab91-55c3fca98e66" (UID: "701fbae9-e013-4311-ab91-55c3fca98e66"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:25 crc kubenswrapper[4909]: I0202 10:53:25.562427 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/701fbae9-e013-4311-ab91-55c3fca98e66-kube-api-access-7ls29" (OuterVolumeSpecName: "kube-api-access-7ls29") pod "701fbae9-e013-4311-ab91-55c3fca98e66" (UID: "701fbae9-e013-4311-ab91-55c3fca98e66"). InnerVolumeSpecName "kube-api-access-7ls29". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:53:25 crc kubenswrapper[4909]: I0202 10:53:25.583009 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701fbae9-e013-4311-ab91-55c3fca98e66-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "701fbae9-e013-4311-ab91-55c3fca98e66" (UID: "701fbae9-e013-4311-ab91-55c3fca98e66"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:25 crc kubenswrapper[4909]: I0202 10:53:25.628093 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701fbae9-e013-4311-ab91-55c3fca98e66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "701fbae9-e013-4311-ab91-55c3fca98e66" (UID: "701fbae9-e013-4311-ab91-55c3fca98e66"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:25 crc kubenswrapper[4909]: I0202 10:53:25.657349 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/701fbae9-e013-4311-ab91-55c3fca98e66-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:25 crc kubenswrapper[4909]: I0202 10:53:25.657382 4909 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/701fbae9-e013-4311-ab91-55c3fca98e66-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:25 crc kubenswrapper[4909]: I0202 10:53:25.657395 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/701fbae9-e013-4311-ab91-55c3fca98e66-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:25 crc kubenswrapper[4909]: I0202 10:53:25.657404 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ls29\" (UniqueName: \"kubernetes.io/projected/701fbae9-e013-4311-ab91-55c3fca98e66-kube-api-access-7ls29\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:25 crc kubenswrapper[4909]: I0202 10:53:25.657414 4909 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/701fbae9-e013-4311-ab91-55c3fca98e66-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:25 crc kubenswrapper[4909]: I0202 10:53:25.670616 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701fbae9-e013-4311-ab91-55c3fca98e66-config-data" (OuterVolumeSpecName: "config-data") pod "701fbae9-e013-4311-ab91-55c3fca98e66" (UID: "701fbae9-e013-4311-ab91-55c3fca98e66"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:25 crc kubenswrapper[4909]: I0202 10:53:25.759101 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/701fbae9-e013-4311-ab91-55c3fca98e66-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:25 crc kubenswrapper[4909]: I0202 10:53:25.949628 4909 generic.go:334] "Generic (PLEG): container finished" podID="701fbae9-e013-4311-ab91-55c3fca98e66" containerID="6756b8b50bce9e6ac3428c7e6ef1818cba927c1445cc09621f987b9deccbc636" exitCode=0 Feb 02 10:53:25 crc kubenswrapper[4909]: I0202 10:53:25.949675 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"701fbae9-e013-4311-ab91-55c3fca98e66","Type":"ContainerDied","Data":"6756b8b50bce9e6ac3428c7e6ef1818cba927c1445cc09621f987b9deccbc636"} Feb 02 10:53:25 crc kubenswrapper[4909]: I0202 10:53:25.949705 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"701fbae9-e013-4311-ab91-55c3fca98e66","Type":"ContainerDied","Data":"347596334d7a1fcf2d9c8db015a83592d2503b3cc1a0135c843311bfc1ac86f9"} Feb 02 10:53:25 crc kubenswrapper[4909]: I0202 10:53:25.949730 4909 scope.go:117] "RemoveContainer" containerID="f7e1747b6d1f313242d82af25e7dd991bf6bf022e08eb2690531ae6d72eade2c" Feb 02 10:53:25 crc kubenswrapper[4909]: I0202 10:53:25.949914 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:53:25 crc kubenswrapper[4909]: I0202 10:53:25.985114 4909 scope.go:117] "RemoveContainer" containerID="85c50d0b0aab7c6b62add53fdb78c7c9c19f7033ce17680ae1b0fc4dca9f2c74" Feb 02 10:53:25 crc kubenswrapper[4909]: I0202 10:53:25.993876 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.004338 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.012892 4909 scope.go:117] "RemoveContainer" containerID="6756b8b50bce9e6ac3428c7e6ef1818cba927c1445cc09621f987b9deccbc636" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.020172 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:53:26 crc kubenswrapper[4909]: E0202 10:53:26.020666 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="701fbae9-e013-4311-ab91-55c3fca98e66" containerName="sg-core" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.020691 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="701fbae9-e013-4311-ab91-55c3fca98e66" containerName="sg-core" Feb 02 10:53:26 crc kubenswrapper[4909]: E0202 10:53:26.020707 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="701fbae9-e013-4311-ab91-55c3fca98e66" containerName="ceilometer-notification-agent" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.020715 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="701fbae9-e013-4311-ab91-55c3fca98e66" containerName="ceilometer-notification-agent" Feb 02 10:53:26 crc kubenswrapper[4909]: E0202 10:53:26.020731 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="701fbae9-e013-4311-ab91-55c3fca98e66" containerName="proxy-httpd" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.020739 4909 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="701fbae9-e013-4311-ab91-55c3fca98e66" containerName="proxy-httpd" Feb 02 10:53:26 crc kubenswrapper[4909]: E0202 10:53:26.020748 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="701fbae9-e013-4311-ab91-55c3fca98e66" containerName="ceilometer-central-agent" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.020755 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="701fbae9-e013-4311-ab91-55c3fca98e66" containerName="ceilometer-central-agent" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.025275 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="701fbae9-e013-4311-ab91-55c3fca98e66" containerName="sg-core" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.025325 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="701fbae9-e013-4311-ab91-55c3fca98e66" containerName="ceilometer-central-agent" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.025344 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="701fbae9-e013-4311-ab91-55c3fca98e66" containerName="proxy-httpd" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.025381 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="701fbae9-e013-4311-ab91-55c3fca98e66" containerName="ceilometer-notification-agent" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.027595 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.030597 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.030911 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.033413 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.033563 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.041162 4909 scope.go:117] "RemoveContainer" containerID="c7520860ed8086830f83ea8eeb75a91efc73f9f715cc7e8edcb4df6de3285141" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.077112 4909 scope.go:117] "RemoveContainer" containerID="f7e1747b6d1f313242d82af25e7dd991bf6bf022e08eb2690531ae6d72eade2c" Feb 02 10:53:26 crc kubenswrapper[4909]: E0202 10:53:26.077517 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7e1747b6d1f313242d82af25e7dd991bf6bf022e08eb2690531ae6d72eade2c\": container with ID starting with f7e1747b6d1f313242d82af25e7dd991bf6bf022e08eb2690531ae6d72eade2c not found: ID does not exist" containerID="f7e1747b6d1f313242d82af25e7dd991bf6bf022e08eb2690531ae6d72eade2c" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.077550 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7e1747b6d1f313242d82af25e7dd991bf6bf022e08eb2690531ae6d72eade2c"} err="failed to get container status \"f7e1747b6d1f313242d82af25e7dd991bf6bf022e08eb2690531ae6d72eade2c\": rpc error: code = NotFound desc = could not find container \"f7e1747b6d1f313242d82af25e7dd991bf6bf022e08eb2690531ae6d72eade2c\": 
container with ID starting with f7e1747b6d1f313242d82af25e7dd991bf6bf022e08eb2690531ae6d72eade2c not found: ID does not exist" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.077576 4909 scope.go:117] "RemoveContainer" containerID="85c50d0b0aab7c6b62add53fdb78c7c9c19f7033ce17680ae1b0fc4dca9f2c74" Feb 02 10:53:26 crc kubenswrapper[4909]: E0202 10:53:26.078048 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85c50d0b0aab7c6b62add53fdb78c7c9c19f7033ce17680ae1b0fc4dca9f2c74\": container with ID starting with 85c50d0b0aab7c6b62add53fdb78c7c9c19f7033ce17680ae1b0fc4dca9f2c74 not found: ID does not exist" containerID="85c50d0b0aab7c6b62add53fdb78c7c9c19f7033ce17680ae1b0fc4dca9f2c74" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.078070 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85c50d0b0aab7c6b62add53fdb78c7c9c19f7033ce17680ae1b0fc4dca9f2c74"} err="failed to get container status \"85c50d0b0aab7c6b62add53fdb78c7c9c19f7033ce17680ae1b0fc4dca9f2c74\": rpc error: code = NotFound desc = could not find container \"85c50d0b0aab7c6b62add53fdb78c7c9c19f7033ce17680ae1b0fc4dca9f2c74\": container with ID starting with 85c50d0b0aab7c6b62add53fdb78c7c9c19f7033ce17680ae1b0fc4dca9f2c74 not found: ID does not exist" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.078083 4909 scope.go:117] "RemoveContainer" containerID="6756b8b50bce9e6ac3428c7e6ef1818cba927c1445cc09621f987b9deccbc636" Feb 02 10:53:26 crc kubenswrapper[4909]: E0202 10:53:26.078314 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6756b8b50bce9e6ac3428c7e6ef1818cba927c1445cc09621f987b9deccbc636\": container with ID starting with 6756b8b50bce9e6ac3428c7e6ef1818cba927c1445cc09621f987b9deccbc636 not found: ID does not exist" 
containerID="6756b8b50bce9e6ac3428c7e6ef1818cba927c1445cc09621f987b9deccbc636" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.078343 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6756b8b50bce9e6ac3428c7e6ef1818cba927c1445cc09621f987b9deccbc636"} err="failed to get container status \"6756b8b50bce9e6ac3428c7e6ef1818cba927c1445cc09621f987b9deccbc636\": rpc error: code = NotFound desc = could not find container \"6756b8b50bce9e6ac3428c7e6ef1818cba927c1445cc09621f987b9deccbc636\": container with ID starting with 6756b8b50bce9e6ac3428c7e6ef1818cba927c1445cc09621f987b9deccbc636 not found: ID does not exist" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.078358 4909 scope.go:117] "RemoveContainer" containerID="c7520860ed8086830f83ea8eeb75a91efc73f9f715cc7e8edcb4df6de3285141" Feb 02 10:53:26 crc kubenswrapper[4909]: E0202 10:53:26.078736 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7520860ed8086830f83ea8eeb75a91efc73f9f715cc7e8edcb4df6de3285141\": container with ID starting with c7520860ed8086830f83ea8eeb75a91efc73f9f715cc7e8edcb4df6de3285141 not found: ID does not exist" containerID="c7520860ed8086830f83ea8eeb75a91efc73f9f715cc7e8edcb4df6de3285141" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.078766 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7520860ed8086830f83ea8eeb75a91efc73f9f715cc7e8edcb4df6de3285141"} err="failed to get container status \"c7520860ed8086830f83ea8eeb75a91efc73f9f715cc7e8edcb4df6de3285141\": rpc error: code = NotFound desc = could not find container \"c7520860ed8086830f83ea8eeb75a91efc73f9f715cc7e8edcb4df6de3285141\": container with ID starting with c7520860ed8086830f83ea8eeb75a91efc73f9f715cc7e8edcb4df6de3285141 not found: ID does not exist" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.165546 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " pod="openstack/ceilometer-0" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.165603 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " pod="openstack/ceilometer-0" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.165654 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-config-data\") pod \"ceilometer-0\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " pod="openstack/ceilometer-0" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.165708 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e603361-b833-4b97-8aab-52601d806d34-log-httpd\") pod \"ceilometer-0\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " pod="openstack/ceilometer-0" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.165735 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " pod="openstack/ceilometer-0" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.165788 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7e603361-b833-4b97-8aab-52601d806d34-run-httpd\") pod \"ceilometer-0\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " pod="openstack/ceilometer-0" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.165872 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f954w\" (UniqueName: \"kubernetes.io/projected/7e603361-b833-4b97-8aab-52601d806d34-kube-api-access-f954w\") pod \"ceilometer-0\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " pod="openstack/ceilometer-0" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.165951 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-scripts\") pod \"ceilometer-0\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " pod="openstack/ceilometer-0" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.267450 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " pod="openstack/ceilometer-0" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.267495 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " pod="openstack/ceilometer-0" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.267526 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-config-data\") pod \"ceilometer-0\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " 
pod="openstack/ceilometer-0" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.267562 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e603361-b833-4b97-8aab-52601d806d34-log-httpd\") pod \"ceilometer-0\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " pod="openstack/ceilometer-0" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.267582 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " pod="openstack/ceilometer-0" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.267599 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e603361-b833-4b97-8aab-52601d806d34-run-httpd\") pod \"ceilometer-0\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " pod="openstack/ceilometer-0" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.267645 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f954w\" (UniqueName: \"kubernetes.io/projected/7e603361-b833-4b97-8aab-52601d806d34-kube-api-access-f954w\") pod \"ceilometer-0\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " pod="openstack/ceilometer-0" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.267695 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-scripts\") pod \"ceilometer-0\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " pod="openstack/ceilometer-0" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.268537 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7e603361-b833-4b97-8aab-52601d806d34-log-httpd\") pod \"ceilometer-0\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " pod="openstack/ceilometer-0" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.268674 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e603361-b833-4b97-8aab-52601d806d34-run-httpd\") pod \"ceilometer-0\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " pod="openstack/ceilometer-0" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.278937 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " pod="openstack/ceilometer-0" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.280389 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-config-data\") pod \"ceilometer-0\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " pod="openstack/ceilometer-0" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.288531 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-scripts\") pod \"ceilometer-0\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " pod="openstack/ceilometer-0" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.292436 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " pod="openstack/ceilometer-0" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.292655 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " pod="openstack/ceilometer-0" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.299649 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f954w\" (UniqueName: \"kubernetes.io/projected/7e603361-b833-4b97-8aab-52601d806d34-kube-api-access-f954w\") pod \"ceilometer-0\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " pod="openstack/ceilometer-0" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.369003 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.810936 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:53:26 crc kubenswrapper[4909]: I0202 10:53:26.960530 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e603361-b833-4b97-8aab-52601d806d34","Type":"ContainerStarted","Data":"6d10c2de560da6b31b27e4c0383d29861b7d7e47310a2c6e7a741dcaab40b762"} Feb 02 10:53:27 crc kubenswrapper[4909]: I0202 10:53:27.026581 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="701fbae9-e013-4311-ab91-55c3fca98e66" path="/var/lib/kubelet/pods/701fbae9-e013-4311-ab91-55c3fca98e66/volumes" Feb 02 10:53:27 crc kubenswrapper[4909]: I0202 10:53:27.974165 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e603361-b833-4b97-8aab-52601d806d34","Type":"ContainerStarted","Data":"0280a5add182e4078afeb1cd526268bc8e4188bded1f94c3f0e80962839ff7fe"} Feb 02 10:53:29 crc kubenswrapper[4909]: I0202 10:53:29.004295 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7e603361-b833-4b97-8aab-52601d806d34","Type":"ContainerStarted","Data":"389f361ed3a41b52b84eeb960c6e2887ec99414be50c1cc96950d62d851a77c6"} Feb 02 10:53:29 crc kubenswrapper[4909]: I0202 10:53:29.004610 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e603361-b833-4b97-8aab-52601d806d34","Type":"ContainerStarted","Data":"3a0058fbb371c94e57a1bcee532abfc98ed056e212042e7c7a3bb2424714df65"} Feb 02 10:53:29 crc kubenswrapper[4909]: I0202 10:53:29.404276 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 02 10:53:29 crc kubenswrapper[4909]: I0202 10:53:29.433152 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 02 10:53:30 crc kubenswrapper[4909]: I0202 10:53:30.045324 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 02 10:53:30 crc kubenswrapper[4909]: I0202 10:53:30.371514 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 02 10:53:30 crc kubenswrapper[4909]: I0202 10:53:30.377082 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 10:53:30 crc kubenswrapper[4909]: I0202 10:53:30.377197 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 10:53:31 crc kubenswrapper[4909]: I0202 10:53:31.030303 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e603361-b833-4b97-8aab-52601d806d34","Type":"ContainerStarted","Data":"4164a6ae7043d0b3249067b1e86a9e3b73334437395aa6aaed2df637990c96aa"} Feb 02 10:53:31 crc kubenswrapper[4909]: I0202 10:53:31.031185 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 02 10:53:31 crc kubenswrapper[4909]: I0202 10:53:31.059620 4909 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.476551073 podStartE2EDuration="6.05960172s" podCreationTimestamp="2026-02-02 10:53:25 +0000 UTC" firstStartedPulling="2026-02-02 10:53:26.811493339 +0000 UTC m=+1332.557594074" lastFinishedPulling="2026-02-02 10:53:30.394543966 +0000 UTC m=+1336.140644721" observedRunningTime="2026-02-02 10:53:31.054629288 +0000 UTC m=+1336.800730023" watchObservedRunningTime="2026-02-02 10:53:31.05960172 +0000 UTC m=+1336.805702455" Feb 02 10:53:31 crc kubenswrapper[4909]: I0202 10:53:31.460012 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="10cb2fd1-d09d-451f-97db-76a45a247a5b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.196:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 02 10:53:31 crc kubenswrapper[4909]: I0202 10:53:31.460202 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="10cb2fd1-d09d-451f-97db-76a45a247a5b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.196:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 02 10:53:33 crc kubenswrapper[4909]: I0202 10:53:33.304441 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 02 10:53:33 crc kubenswrapper[4909]: I0202 10:53:33.305227 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 02 10:53:33 crc kubenswrapper[4909]: I0202 10:53:33.315360 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 02 10:53:34 crc kubenswrapper[4909]: I0202 10:53:34.061240 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 02 10:53:36 crc kubenswrapper[4909]: E0202 
10:53:36.915890 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a48086e_b086_4f81_9cb3_16e40fc2c1d5.slice/crio-806bbf2f7813821e387fd50d1030b913b4e7caeb65653bb90e0d90a42508f01e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a48086e_b086_4f81_9cb3_16e40fc2c1d5.slice/crio-conmon-806bbf2f7813821e387fd50d1030b913b4e7caeb65653bb90e0d90a42508f01e.scope\": RecentStats: unable to find data in memory cache]" Feb 02 10:53:37 crc kubenswrapper[4909]: I0202 10:53:37.075433 4909 generic.go:334] "Generic (PLEG): container finished" podID="0a48086e-b086-4f81-9cb3-16e40fc2c1d5" containerID="806bbf2f7813821e387fd50d1030b913b4e7caeb65653bb90e0d90a42508f01e" exitCode=137 Feb 02 10:53:37 crc kubenswrapper[4909]: I0202 10:53:37.075532 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0a48086e-b086-4f81-9cb3-16e40fc2c1d5","Type":"ContainerDied","Data":"806bbf2f7813821e387fd50d1030b913b4e7caeb65653bb90e0d90a42508f01e"} Feb 02 10:53:37 crc kubenswrapper[4909]: I0202 10:53:37.075766 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0a48086e-b086-4f81-9cb3-16e40fc2c1d5","Type":"ContainerDied","Data":"2f21e81db5b7d7769aa137dba2bbe6f234ea6b31ddd4aca3edbdcbd2a017b03a"} Feb 02 10:53:37 crc kubenswrapper[4909]: I0202 10:53:37.075785 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f21e81db5b7d7769aa137dba2bbe6f234ea6b31ddd4aca3edbdcbd2a017b03a" Feb 02 10:53:37 crc kubenswrapper[4909]: I0202 10:53:37.084977 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:37 crc kubenswrapper[4909]: I0202 10:53:37.166517 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a48086e-b086-4f81-9cb3-16e40fc2c1d5-config-data\") pod \"0a48086e-b086-4f81-9cb3-16e40fc2c1d5\" (UID: \"0a48086e-b086-4f81-9cb3-16e40fc2c1d5\") " Feb 02 10:53:37 crc kubenswrapper[4909]: I0202 10:53:37.166908 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwv7x\" (UniqueName: \"kubernetes.io/projected/0a48086e-b086-4f81-9cb3-16e40fc2c1d5-kube-api-access-dwv7x\") pod \"0a48086e-b086-4f81-9cb3-16e40fc2c1d5\" (UID: \"0a48086e-b086-4f81-9cb3-16e40fc2c1d5\") " Feb 02 10:53:37 crc kubenswrapper[4909]: I0202 10:53:37.167565 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a48086e-b086-4f81-9cb3-16e40fc2c1d5-combined-ca-bundle\") pod \"0a48086e-b086-4f81-9cb3-16e40fc2c1d5\" (UID: \"0a48086e-b086-4f81-9cb3-16e40fc2c1d5\") " Feb 02 10:53:37 crc kubenswrapper[4909]: I0202 10:53:37.173695 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a48086e-b086-4f81-9cb3-16e40fc2c1d5-kube-api-access-dwv7x" (OuterVolumeSpecName: "kube-api-access-dwv7x") pod "0a48086e-b086-4f81-9cb3-16e40fc2c1d5" (UID: "0a48086e-b086-4f81-9cb3-16e40fc2c1d5"). InnerVolumeSpecName "kube-api-access-dwv7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:53:37 crc kubenswrapper[4909]: I0202 10:53:37.196096 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a48086e-b086-4f81-9cb3-16e40fc2c1d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a48086e-b086-4f81-9cb3-16e40fc2c1d5" (UID: "0a48086e-b086-4f81-9cb3-16e40fc2c1d5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:37 crc kubenswrapper[4909]: I0202 10:53:37.200266 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a48086e-b086-4f81-9cb3-16e40fc2c1d5-config-data" (OuterVolumeSpecName: "config-data") pod "0a48086e-b086-4f81-9cb3-16e40fc2c1d5" (UID: "0a48086e-b086-4f81-9cb3-16e40fc2c1d5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:37 crc kubenswrapper[4909]: I0202 10:53:37.270288 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a48086e-b086-4f81-9cb3-16e40fc2c1d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:37 crc kubenswrapper[4909]: I0202 10:53:37.270327 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a48086e-b086-4f81-9cb3-16e40fc2c1d5-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:37 crc kubenswrapper[4909]: I0202 10:53:37.270336 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwv7x\" (UniqueName: \"kubernetes.io/projected/0a48086e-b086-4f81-9cb3-16e40fc2c1d5-kube-api-access-dwv7x\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:38 crc kubenswrapper[4909]: I0202 10:53:38.083258 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:38 crc kubenswrapper[4909]: I0202 10:53:38.113986 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:53:38 crc kubenswrapper[4909]: I0202 10:53:38.124386 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:53:38 crc kubenswrapper[4909]: I0202 10:53:38.141495 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:53:38 crc kubenswrapper[4909]: E0202 10:53:38.141900 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a48086e-b086-4f81-9cb3-16e40fc2c1d5" containerName="nova-cell1-novncproxy-novncproxy" Feb 02 10:53:38 crc kubenswrapper[4909]: I0202 10:53:38.141919 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a48086e-b086-4f81-9cb3-16e40fc2c1d5" containerName="nova-cell1-novncproxy-novncproxy" Feb 02 10:53:38 crc kubenswrapper[4909]: I0202 10:53:38.142114 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a48086e-b086-4f81-9cb3-16e40fc2c1d5" containerName="nova-cell1-novncproxy-novncproxy" Feb 02 10:53:38 crc kubenswrapper[4909]: I0202 10:53:38.142712 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:38 crc kubenswrapper[4909]: I0202 10:53:38.144788 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 02 10:53:38 crc kubenswrapper[4909]: I0202 10:53:38.145127 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 02 10:53:38 crc kubenswrapper[4909]: I0202 10:53:38.148061 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 02 10:53:38 crc kubenswrapper[4909]: I0202 10:53:38.181444 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:53:38 crc kubenswrapper[4909]: I0202 10:53:38.289763 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"20cda1f6-04e5-4e71-9fa8-4cad8dbac52c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:38 crc kubenswrapper[4909]: I0202 10:53:38.289862 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgwmc\" (UniqueName: \"kubernetes.io/projected/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-kube-api-access-lgwmc\") pod \"nova-cell1-novncproxy-0\" (UID: \"20cda1f6-04e5-4e71-9fa8-4cad8dbac52c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:38 crc kubenswrapper[4909]: I0202 10:53:38.289892 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"20cda1f6-04e5-4e71-9fa8-4cad8dbac52c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 
02 10:53:38 crc kubenswrapper[4909]: I0202 10:53:38.290002 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"20cda1f6-04e5-4e71-9fa8-4cad8dbac52c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:38 crc kubenswrapper[4909]: I0202 10:53:38.290080 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"20cda1f6-04e5-4e71-9fa8-4cad8dbac52c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:38 crc kubenswrapper[4909]: I0202 10:53:38.392199 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"20cda1f6-04e5-4e71-9fa8-4cad8dbac52c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:38 crc kubenswrapper[4909]: I0202 10:53:38.392337 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"20cda1f6-04e5-4e71-9fa8-4cad8dbac52c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:38 crc kubenswrapper[4909]: I0202 10:53:38.392372 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"20cda1f6-04e5-4e71-9fa8-4cad8dbac52c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:38 crc kubenswrapper[4909]: I0202 
10:53:38.392431 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgwmc\" (UniqueName: \"kubernetes.io/projected/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-kube-api-access-lgwmc\") pod \"nova-cell1-novncproxy-0\" (UID: \"20cda1f6-04e5-4e71-9fa8-4cad8dbac52c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:38 crc kubenswrapper[4909]: I0202 10:53:38.392456 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"20cda1f6-04e5-4e71-9fa8-4cad8dbac52c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:38 crc kubenswrapper[4909]: I0202 10:53:38.402773 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"20cda1f6-04e5-4e71-9fa8-4cad8dbac52c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:38 crc kubenswrapper[4909]: I0202 10:53:38.402823 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"20cda1f6-04e5-4e71-9fa8-4cad8dbac52c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:38 crc kubenswrapper[4909]: I0202 10:53:38.406490 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"20cda1f6-04e5-4e71-9fa8-4cad8dbac52c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:38 crc kubenswrapper[4909]: I0202 10:53:38.412685 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"20cda1f6-04e5-4e71-9fa8-4cad8dbac52c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:38 crc kubenswrapper[4909]: I0202 10:53:38.423468 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgwmc\" (UniqueName: \"kubernetes.io/projected/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-kube-api-access-lgwmc\") pod \"nova-cell1-novncproxy-0\" (UID: \"20cda1f6-04e5-4e71-9fa8-4cad8dbac52c\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:38 crc kubenswrapper[4909]: I0202 10:53:38.464580 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:38 crc kubenswrapper[4909]: I0202 10:53:38.949631 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:53:38 crc kubenswrapper[4909]: W0202 10:53:38.952044 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20cda1f6_04e5_4e71_9fa8_4cad8dbac52c.slice/crio-c1c243cca4352a0b3706630b065e4cb97271d4046dbed87883879523bfd5cc00 WatchSource:0}: Error finding container c1c243cca4352a0b3706630b065e4cb97271d4046dbed87883879523bfd5cc00: Status 404 returned error can't find the container with id c1c243cca4352a0b3706630b065e4cb97271d4046dbed87883879523bfd5cc00 Feb 02 10:53:39 crc kubenswrapper[4909]: I0202 10:53:39.029949 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a48086e-b086-4f81-9cb3-16e40fc2c1d5" path="/var/lib/kubelet/pods/0a48086e-b086-4f81-9cb3-16e40fc2c1d5/volumes" Feb 02 10:53:39 crc kubenswrapper[4909]: I0202 10:53:39.096024 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"20cda1f6-04e5-4e71-9fa8-4cad8dbac52c","Type":"ContainerStarted","Data":"c1c243cca4352a0b3706630b065e4cb97271d4046dbed87883879523bfd5cc00"} Feb 02 10:53:40 crc kubenswrapper[4909]: I0202 10:53:40.105541 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"20cda1f6-04e5-4e71-9fa8-4cad8dbac52c","Type":"ContainerStarted","Data":"3f21c558f0896ab2fa711d0d7b27be96691f1afc7425bfa37b7f33bdec9c6f9d"} Feb 02 10:53:40 crc kubenswrapper[4909]: I0202 10:53:40.387281 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 02 10:53:40 crc kubenswrapper[4909]: I0202 10:53:40.388012 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 02 10:53:40 crc kubenswrapper[4909]: I0202 10:53:40.392085 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 02 10:53:40 crc kubenswrapper[4909]: I0202 10:53:40.393024 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 02 10:53:40 crc kubenswrapper[4909]: I0202 10:53:40.412261 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.412241147 podStartE2EDuration="2.412241147s" podCreationTimestamp="2026-02-02 10:53:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:53:40.12744374 +0000 UTC m=+1345.873544475" watchObservedRunningTime="2026-02-02 10:53:40.412241147 +0000 UTC m=+1346.158341882" Feb 02 10:53:41 crc kubenswrapper[4909]: I0202 10:53:41.113586 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 02 10:53:41 crc kubenswrapper[4909]: I0202 10:53:41.117296 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" 
Feb 02 10:53:41 crc kubenswrapper[4909]: I0202 10:53:41.327546 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7dcd758995-j8gxg"] Feb 02 10:53:41 crc kubenswrapper[4909]: I0202 10:53:41.329871 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" Feb 02 10:53:41 crc kubenswrapper[4909]: I0202 10:53:41.371084 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7dcd758995-j8gxg"] Feb 02 10:53:41 crc kubenswrapper[4909]: I0202 10:53:41.451069 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-dns-swift-storage-0\") pod \"dnsmasq-dns-7dcd758995-j8gxg\" (UID: \"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\") " pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" Feb 02 10:53:41 crc kubenswrapper[4909]: I0202 10:53:41.451152 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-dns-svc\") pod \"dnsmasq-dns-7dcd758995-j8gxg\" (UID: \"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\") " pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" Feb 02 10:53:41 crc kubenswrapper[4909]: I0202 10:53:41.451192 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-ovsdbserver-nb\") pod \"dnsmasq-dns-7dcd758995-j8gxg\" (UID: \"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\") " pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" Feb 02 10:53:41 crc kubenswrapper[4909]: I0202 10:53:41.451313 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-ovsdbserver-sb\") pod \"dnsmasq-dns-7dcd758995-j8gxg\" (UID: \"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\") " pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" Feb 02 10:53:41 crc kubenswrapper[4909]: I0202 10:53:41.451346 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-config\") pod \"dnsmasq-dns-7dcd758995-j8gxg\" (UID: \"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\") " pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" Feb 02 10:53:41 crc kubenswrapper[4909]: I0202 10:53:41.451472 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkm9j\" (UniqueName: \"kubernetes.io/projected/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-kube-api-access-jkm9j\") pod \"dnsmasq-dns-7dcd758995-j8gxg\" (UID: \"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\") " pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" Feb 02 10:53:41 crc kubenswrapper[4909]: I0202 10:53:41.552817 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkm9j\" (UniqueName: \"kubernetes.io/projected/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-kube-api-access-jkm9j\") pod \"dnsmasq-dns-7dcd758995-j8gxg\" (UID: \"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\") " pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" Feb 02 10:53:41 crc kubenswrapper[4909]: I0202 10:53:41.552943 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-dns-swift-storage-0\") pod \"dnsmasq-dns-7dcd758995-j8gxg\" (UID: \"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\") " pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" Feb 02 10:53:41 crc kubenswrapper[4909]: I0202 10:53:41.552996 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-dns-svc\") pod \"dnsmasq-dns-7dcd758995-j8gxg\" (UID: \"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\") " pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" Feb 02 10:53:41 crc kubenswrapper[4909]: I0202 10:53:41.553017 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-ovsdbserver-nb\") pod \"dnsmasq-dns-7dcd758995-j8gxg\" (UID: \"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\") " pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" Feb 02 10:53:41 crc kubenswrapper[4909]: I0202 10:53:41.553053 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-ovsdbserver-sb\") pod \"dnsmasq-dns-7dcd758995-j8gxg\" (UID: \"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\") " pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" Feb 02 10:53:41 crc kubenswrapper[4909]: I0202 10:53:41.553079 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-config\") pod \"dnsmasq-dns-7dcd758995-j8gxg\" (UID: \"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\") " pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" Feb 02 10:53:41 crc kubenswrapper[4909]: I0202 10:53:41.553933 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-dns-svc\") pod \"dnsmasq-dns-7dcd758995-j8gxg\" (UID: \"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\") " pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" Feb 02 10:53:41 crc kubenswrapper[4909]: I0202 10:53:41.553964 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-config\") pod \"dnsmasq-dns-7dcd758995-j8gxg\" (UID: 
\"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\") " pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" Feb 02 10:53:41 crc kubenswrapper[4909]: I0202 10:53:41.554094 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-ovsdbserver-nb\") pod \"dnsmasq-dns-7dcd758995-j8gxg\" (UID: \"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\") " pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" Feb 02 10:53:41 crc kubenswrapper[4909]: I0202 10:53:41.554278 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-ovsdbserver-sb\") pod \"dnsmasq-dns-7dcd758995-j8gxg\" (UID: \"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\") " pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" Feb 02 10:53:41 crc kubenswrapper[4909]: I0202 10:53:41.554521 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-dns-swift-storage-0\") pod \"dnsmasq-dns-7dcd758995-j8gxg\" (UID: \"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\") " pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" Feb 02 10:53:41 crc kubenswrapper[4909]: I0202 10:53:41.575999 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkm9j\" (UniqueName: \"kubernetes.io/projected/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-kube-api-access-jkm9j\") pod \"dnsmasq-dns-7dcd758995-j8gxg\" (UID: \"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\") " pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" Feb 02 10:53:41 crc kubenswrapper[4909]: I0202 10:53:41.680190 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" Feb 02 10:53:42 crc kubenswrapper[4909]: I0202 10:53:42.181749 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7dcd758995-j8gxg"] Feb 02 10:53:43 crc kubenswrapper[4909]: I0202 10:53:43.131716 4909 generic.go:334] "Generic (PLEG): container finished" podID="81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425" containerID="745f0130c8a843a1f7f679fb91fbed011c5cc6c79ffff4ba7e21abf7eb7801df" exitCode=0 Feb 02 10:53:43 crc kubenswrapper[4909]: I0202 10:53:43.131838 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" event={"ID":"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425","Type":"ContainerDied","Data":"745f0130c8a843a1f7f679fb91fbed011c5cc6c79ffff4ba7e21abf7eb7801df"} Feb 02 10:53:43 crc kubenswrapper[4909]: I0202 10:53:43.132165 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" event={"ID":"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425","Type":"ContainerStarted","Data":"37fdcaad723edae7948c00a5a5707c296106f75851e5fbb092feb86674389462"} Feb 02 10:53:43 crc kubenswrapper[4909]: I0202 10:53:43.404284 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:53:43 crc kubenswrapper[4909]: I0202 10:53:43.405011 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e603361-b833-4b97-8aab-52601d806d34" containerName="ceilometer-central-agent" containerID="cri-o://0280a5add182e4078afeb1cd526268bc8e4188bded1f94c3f0e80962839ff7fe" gracePeriod=30 Feb 02 10:53:43 crc kubenswrapper[4909]: I0202 10:53:43.405519 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e603361-b833-4b97-8aab-52601d806d34" containerName="proxy-httpd" containerID="cri-o://4164a6ae7043d0b3249067b1e86a9e3b73334437395aa6aaed2df637990c96aa" gracePeriod=30 Feb 02 10:53:43 crc 
kubenswrapper[4909]: I0202 10:53:43.405570 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e603361-b833-4b97-8aab-52601d806d34" containerName="sg-core" containerID="cri-o://389f361ed3a41b52b84eeb960c6e2887ec99414be50c1cc96950d62d851a77c6" gracePeriod=30 Feb 02 10:53:43 crc kubenswrapper[4909]: I0202 10:53:43.405687 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e603361-b833-4b97-8aab-52601d806d34" containerName="ceilometer-notification-agent" containerID="cri-o://3a0058fbb371c94e57a1bcee532abfc98ed056e212042e7c7a3bb2424714df65" gracePeriod=30 Feb 02 10:53:43 crc kubenswrapper[4909]: I0202 10:53:43.420991 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="7e603361-b833-4b97-8aab-52601d806d34" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.197:3000/\": EOF" Feb 02 10:53:43 crc kubenswrapper[4909]: I0202 10:53:43.465168 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:43 crc kubenswrapper[4909]: I0202 10:53:43.934120 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:53:44 crc kubenswrapper[4909]: I0202 10:53:44.143870 4909 generic.go:334] "Generic (PLEG): container finished" podID="7e603361-b833-4b97-8aab-52601d806d34" containerID="4164a6ae7043d0b3249067b1e86a9e3b73334437395aa6aaed2df637990c96aa" exitCode=0 Feb 02 10:53:44 crc kubenswrapper[4909]: I0202 10:53:44.143903 4909 generic.go:334] "Generic (PLEG): container finished" podID="7e603361-b833-4b97-8aab-52601d806d34" containerID="389f361ed3a41b52b84eeb960c6e2887ec99414be50c1cc96950d62d851a77c6" exitCode=2 Feb 02 10:53:44 crc kubenswrapper[4909]: I0202 10:53:44.143911 4909 generic.go:334] "Generic (PLEG): container finished" podID="7e603361-b833-4b97-8aab-52601d806d34" 
containerID="0280a5add182e4078afeb1cd526268bc8e4188bded1f94c3f0e80962839ff7fe" exitCode=0 Feb 02 10:53:44 crc kubenswrapper[4909]: I0202 10:53:44.143958 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e603361-b833-4b97-8aab-52601d806d34","Type":"ContainerDied","Data":"4164a6ae7043d0b3249067b1e86a9e3b73334437395aa6aaed2df637990c96aa"} Feb 02 10:53:44 crc kubenswrapper[4909]: I0202 10:53:44.144018 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e603361-b833-4b97-8aab-52601d806d34","Type":"ContainerDied","Data":"389f361ed3a41b52b84eeb960c6e2887ec99414be50c1cc96950d62d851a77c6"} Feb 02 10:53:44 crc kubenswrapper[4909]: I0202 10:53:44.144037 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e603361-b833-4b97-8aab-52601d806d34","Type":"ContainerDied","Data":"0280a5add182e4078afeb1cd526268bc8e4188bded1f94c3f0e80962839ff7fe"} Feb 02 10:53:44 crc kubenswrapper[4909]: I0202 10:53:44.146346 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="10cb2fd1-d09d-451f-97db-76a45a247a5b" containerName="nova-api-log" containerID="cri-o://8e26cc3f6d191a7ffb3076fd61ad51a30e9e5502fdbc0cb31d44fa135e09ab4d" gracePeriod=30 Feb 02 10:53:44 crc kubenswrapper[4909]: I0202 10:53:44.147755 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" event={"ID":"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425","Type":"ContainerStarted","Data":"3487017a73b83d9a77436b7d788ba96efdea8f82fb04b45ef153d5a993e724bc"} Feb 02 10:53:44 crc kubenswrapper[4909]: I0202 10:53:44.147794 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" Feb 02 10:53:44 crc kubenswrapper[4909]: I0202 10:53:44.148307 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="10cb2fd1-d09d-451f-97db-76a45a247a5b" containerName="nova-api-api" containerID="cri-o://63bbbd591f46d57cf98ff4e5a534345a7697c6d683e359b7c022224a87f87bca" gracePeriod=30 Feb 02 10:53:44 crc kubenswrapper[4909]: I0202 10:53:44.171531 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" podStartSLOduration=3.171508167 podStartE2EDuration="3.171508167s" podCreationTimestamp="2026-02-02 10:53:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:53:44.170148569 +0000 UTC m=+1349.916249294" watchObservedRunningTime="2026-02-02 10:53:44.171508167 +0000 UTC m=+1349.917608902" Feb 02 10:53:44 crc kubenswrapper[4909]: I0202 10:53:44.760607 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:53:44 crc kubenswrapper[4909]: I0202 10:53:44.913009 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-ceilometer-tls-certs\") pod \"7e603361-b833-4b97-8aab-52601d806d34\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " Feb 02 10:53:44 crc kubenswrapper[4909]: I0202 10:53:44.913113 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f954w\" (UniqueName: \"kubernetes.io/projected/7e603361-b833-4b97-8aab-52601d806d34-kube-api-access-f954w\") pod \"7e603361-b833-4b97-8aab-52601d806d34\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " Feb 02 10:53:44 crc kubenswrapper[4909]: I0202 10:53:44.913159 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e603361-b833-4b97-8aab-52601d806d34-run-httpd\") pod \"7e603361-b833-4b97-8aab-52601d806d34\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " Feb 
02 10:53:44 crc kubenswrapper[4909]: I0202 10:53:44.913251 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-scripts\") pod \"7e603361-b833-4b97-8aab-52601d806d34\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " Feb 02 10:53:44 crc kubenswrapper[4909]: I0202 10:53:44.913342 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-combined-ca-bundle\") pod \"7e603361-b833-4b97-8aab-52601d806d34\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " Feb 02 10:53:44 crc kubenswrapper[4909]: I0202 10:53:44.913408 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e603361-b833-4b97-8aab-52601d806d34-log-httpd\") pod \"7e603361-b833-4b97-8aab-52601d806d34\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " Feb 02 10:53:44 crc kubenswrapper[4909]: I0202 10:53:44.913446 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-config-data\") pod \"7e603361-b833-4b97-8aab-52601d806d34\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " Feb 02 10:53:44 crc kubenswrapper[4909]: I0202 10:53:44.913474 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-sg-core-conf-yaml\") pod \"7e603361-b833-4b97-8aab-52601d806d34\" (UID: \"7e603361-b833-4b97-8aab-52601d806d34\") " Feb 02 10:53:44 crc kubenswrapper[4909]: I0202 10:53:44.919432 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e603361-b833-4b97-8aab-52601d806d34-kube-api-access-f954w" (OuterVolumeSpecName: 
"kube-api-access-f954w") pod "7e603361-b833-4b97-8aab-52601d806d34" (UID: "7e603361-b833-4b97-8aab-52601d806d34"). InnerVolumeSpecName "kube-api-access-f954w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:53:44 crc kubenswrapper[4909]: I0202 10:53:44.919899 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e603361-b833-4b97-8aab-52601d806d34-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7e603361-b833-4b97-8aab-52601d806d34" (UID: "7e603361-b833-4b97-8aab-52601d806d34"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:53:44 crc kubenswrapper[4909]: I0202 10:53:44.921529 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e603361-b833-4b97-8aab-52601d806d34-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7e603361-b833-4b97-8aab-52601d806d34" (UID: "7e603361-b833-4b97-8aab-52601d806d34"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:53:44 crc kubenswrapper[4909]: I0202 10:53:44.933018 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-scripts" (OuterVolumeSpecName: "scripts") pod "7e603361-b833-4b97-8aab-52601d806d34" (UID: "7e603361-b833-4b97-8aab-52601d806d34"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:44 crc kubenswrapper[4909]: I0202 10:53:44.960034 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7e603361-b833-4b97-8aab-52601d806d34" (UID: "7e603361-b833-4b97-8aab-52601d806d34"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.005901 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7e603361-b833-4b97-8aab-52601d806d34" (UID: "7e603361-b833-4b97-8aab-52601d806d34"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.015305 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f954w\" (UniqueName: \"kubernetes.io/projected/7e603361-b833-4b97-8aab-52601d806d34-kube-api-access-f954w\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.015352 4909 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e603361-b833-4b97-8aab-52601d806d34-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.015365 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.015378 4909 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e603361-b833-4b97-8aab-52601d806d34-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.015388 4909 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.015401 4909 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.037037 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e603361-b833-4b97-8aab-52601d806d34" (UID: "7e603361-b833-4b97-8aab-52601d806d34"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.044909 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-config-data" (OuterVolumeSpecName: "config-data") pod "7e603361-b833-4b97-8aab-52601d806d34" (UID: "7e603361-b833-4b97-8aab-52601d806d34"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.117413 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.117448 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e603361-b833-4b97-8aab-52601d806d34-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.155253 4909 generic.go:334] "Generic (PLEG): container finished" podID="10cb2fd1-d09d-451f-97db-76a45a247a5b" containerID="8e26cc3f6d191a7ffb3076fd61ad51a30e9e5502fdbc0cb31d44fa135e09ab4d" exitCode=143 Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.155310 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"10cb2fd1-d09d-451f-97db-76a45a247a5b","Type":"ContainerDied","Data":"8e26cc3f6d191a7ffb3076fd61ad51a30e9e5502fdbc0cb31d44fa135e09ab4d"} Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.157909 4909 generic.go:334] "Generic (PLEG): container finished" podID="7e603361-b833-4b97-8aab-52601d806d34" containerID="3a0058fbb371c94e57a1bcee532abfc98ed056e212042e7c7a3bb2424714df65" exitCode=0 Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.158248 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.158414 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e603361-b833-4b97-8aab-52601d806d34","Type":"ContainerDied","Data":"3a0058fbb371c94e57a1bcee532abfc98ed056e212042e7c7a3bb2424714df65"} Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.158502 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e603361-b833-4b97-8aab-52601d806d34","Type":"ContainerDied","Data":"6d10c2de560da6b31b27e4c0383d29861b7d7e47310a2c6e7a741dcaab40b762"} Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.158520 4909 scope.go:117] "RemoveContainer" containerID="4164a6ae7043d0b3249067b1e86a9e3b73334437395aa6aaed2df637990c96aa" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.201143 4909 scope.go:117] "RemoveContainer" containerID="389f361ed3a41b52b84eeb960c6e2887ec99414be50c1cc96950d62d851a77c6" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.208231 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.222883 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.236895 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:53:45 crc 
kubenswrapper[4909]: E0202 10:53:45.237385 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e603361-b833-4b97-8aab-52601d806d34" containerName="sg-core" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.237399 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e603361-b833-4b97-8aab-52601d806d34" containerName="sg-core" Feb 02 10:53:45 crc kubenswrapper[4909]: E0202 10:53:45.237421 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e603361-b833-4b97-8aab-52601d806d34" containerName="ceilometer-notification-agent" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.237427 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e603361-b833-4b97-8aab-52601d806d34" containerName="ceilometer-notification-agent" Feb 02 10:53:45 crc kubenswrapper[4909]: E0202 10:53:45.237446 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e603361-b833-4b97-8aab-52601d806d34" containerName="proxy-httpd" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.237452 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e603361-b833-4b97-8aab-52601d806d34" containerName="proxy-httpd" Feb 02 10:53:45 crc kubenswrapper[4909]: E0202 10:53:45.237469 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e603361-b833-4b97-8aab-52601d806d34" containerName="ceilometer-central-agent" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.237475 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e603361-b833-4b97-8aab-52601d806d34" containerName="ceilometer-central-agent" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.237648 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e603361-b833-4b97-8aab-52601d806d34" containerName="sg-core" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.237670 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e603361-b833-4b97-8aab-52601d806d34" 
containerName="ceilometer-notification-agent" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.237681 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e603361-b833-4b97-8aab-52601d806d34" containerName="ceilometer-central-agent" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.237689 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e603361-b833-4b97-8aab-52601d806d34" containerName="proxy-httpd" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.239530 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.245269 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.245444 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.246187 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.249609 4909 scope.go:117] "RemoveContainer" containerID="3a0058fbb371c94e57a1bcee532abfc98ed056e212042e7c7a3bb2424714df65" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.264095 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.312955 4909 scope.go:117] "RemoveContainer" containerID="0280a5add182e4078afeb1cd526268bc8e4188bded1f94c3f0e80962839ff7fe" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.323791 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfnnz\" (UniqueName: \"kubernetes.io/projected/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-kube-api-access-pfnnz\") pod \"ceilometer-0\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " 
pod="openstack/ceilometer-0" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.323866 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " pod="openstack/ceilometer-0" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.323896 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " pod="openstack/ceilometer-0" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.323922 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-log-httpd\") pod \"ceilometer-0\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " pod="openstack/ceilometer-0" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.323986 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-run-httpd\") pod \"ceilometer-0\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " pod="openstack/ceilometer-0" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.324017 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-scripts\") pod \"ceilometer-0\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " pod="openstack/ceilometer-0" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.324040 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " pod="openstack/ceilometer-0" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.324064 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-config-data\") pod \"ceilometer-0\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " pod="openstack/ceilometer-0" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.352157 4909 scope.go:117] "RemoveContainer" containerID="4164a6ae7043d0b3249067b1e86a9e3b73334437395aa6aaed2df637990c96aa" Feb 02 10:53:45 crc kubenswrapper[4909]: E0202 10:53:45.354950 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4164a6ae7043d0b3249067b1e86a9e3b73334437395aa6aaed2df637990c96aa\": container with ID starting with 4164a6ae7043d0b3249067b1e86a9e3b73334437395aa6aaed2df637990c96aa not found: ID does not exist" containerID="4164a6ae7043d0b3249067b1e86a9e3b73334437395aa6aaed2df637990c96aa" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.354983 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4164a6ae7043d0b3249067b1e86a9e3b73334437395aa6aaed2df637990c96aa"} err="failed to get container status \"4164a6ae7043d0b3249067b1e86a9e3b73334437395aa6aaed2df637990c96aa\": rpc error: code = NotFound desc = could not find container \"4164a6ae7043d0b3249067b1e86a9e3b73334437395aa6aaed2df637990c96aa\": container with ID starting with 4164a6ae7043d0b3249067b1e86a9e3b73334437395aa6aaed2df637990c96aa not found: ID does not exist" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.355007 4909 scope.go:117] "RemoveContainer" 
containerID="389f361ed3a41b52b84eeb960c6e2887ec99414be50c1cc96950d62d851a77c6" Feb 02 10:53:45 crc kubenswrapper[4909]: E0202 10:53:45.355353 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"389f361ed3a41b52b84eeb960c6e2887ec99414be50c1cc96950d62d851a77c6\": container with ID starting with 389f361ed3a41b52b84eeb960c6e2887ec99414be50c1cc96950d62d851a77c6 not found: ID does not exist" containerID="389f361ed3a41b52b84eeb960c6e2887ec99414be50c1cc96950d62d851a77c6" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.355373 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"389f361ed3a41b52b84eeb960c6e2887ec99414be50c1cc96950d62d851a77c6"} err="failed to get container status \"389f361ed3a41b52b84eeb960c6e2887ec99414be50c1cc96950d62d851a77c6\": rpc error: code = NotFound desc = could not find container \"389f361ed3a41b52b84eeb960c6e2887ec99414be50c1cc96950d62d851a77c6\": container with ID starting with 389f361ed3a41b52b84eeb960c6e2887ec99414be50c1cc96950d62d851a77c6 not found: ID does not exist" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.355385 4909 scope.go:117] "RemoveContainer" containerID="3a0058fbb371c94e57a1bcee532abfc98ed056e212042e7c7a3bb2424714df65" Feb 02 10:53:45 crc kubenswrapper[4909]: E0202 10:53:45.355737 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a0058fbb371c94e57a1bcee532abfc98ed056e212042e7c7a3bb2424714df65\": container with ID starting with 3a0058fbb371c94e57a1bcee532abfc98ed056e212042e7c7a3bb2424714df65 not found: ID does not exist" containerID="3a0058fbb371c94e57a1bcee532abfc98ed056e212042e7c7a3bb2424714df65" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.355769 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3a0058fbb371c94e57a1bcee532abfc98ed056e212042e7c7a3bb2424714df65"} err="failed to get container status \"3a0058fbb371c94e57a1bcee532abfc98ed056e212042e7c7a3bb2424714df65\": rpc error: code = NotFound desc = could not find container \"3a0058fbb371c94e57a1bcee532abfc98ed056e212042e7c7a3bb2424714df65\": container with ID starting with 3a0058fbb371c94e57a1bcee532abfc98ed056e212042e7c7a3bb2424714df65 not found: ID does not exist" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.355782 4909 scope.go:117] "RemoveContainer" containerID="0280a5add182e4078afeb1cd526268bc8e4188bded1f94c3f0e80962839ff7fe" Feb 02 10:53:45 crc kubenswrapper[4909]: E0202 10:53:45.356114 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0280a5add182e4078afeb1cd526268bc8e4188bded1f94c3f0e80962839ff7fe\": container with ID starting with 0280a5add182e4078afeb1cd526268bc8e4188bded1f94c3f0e80962839ff7fe not found: ID does not exist" containerID="0280a5add182e4078afeb1cd526268bc8e4188bded1f94c3f0e80962839ff7fe" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.356153 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0280a5add182e4078afeb1cd526268bc8e4188bded1f94c3f0e80962839ff7fe"} err="failed to get container status \"0280a5add182e4078afeb1cd526268bc8e4188bded1f94c3f0e80962839ff7fe\": rpc error: code = NotFound desc = could not find container \"0280a5add182e4078afeb1cd526268bc8e4188bded1f94c3f0e80962839ff7fe\": container with ID starting with 0280a5add182e4078afeb1cd526268bc8e4188bded1f94c3f0e80962839ff7fe not found: ID does not exist" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.425303 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " pod="openstack/ceilometer-0" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.425366 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " pod="openstack/ceilometer-0" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.425399 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-log-httpd\") pod \"ceilometer-0\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " pod="openstack/ceilometer-0" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.425478 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-run-httpd\") pod \"ceilometer-0\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " pod="openstack/ceilometer-0" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.425518 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-scripts\") pod \"ceilometer-0\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " pod="openstack/ceilometer-0" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.425550 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " pod="openstack/ceilometer-0" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.425582 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-config-data\") pod \"ceilometer-0\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " pod="openstack/ceilometer-0" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.425620 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfnnz\" (UniqueName: \"kubernetes.io/projected/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-kube-api-access-pfnnz\") pod \"ceilometer-0\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " pod="openstack/ceilometer-0" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.426563 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-log-httpd\") pod \"ceilometer-0\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " pod="openstack/ceilometer-0" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.426643 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-run-httpd\") pod \"ceilometer-0\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " pod="openstack/ceilometer-0" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.430537 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " pod="openstack/ceilometer-0" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.431157 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-config-data\") pod \"ceilometer-0\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " pod="openstack/ceilometer-0" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.431261 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " pod="openstack/ceilometer-0" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.431431 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " pod="openstack/ceilometer-0" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.433202 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-scripts\") pod \"ceilometer-0\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " pod="openstack/ceilometer-0" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.446666 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfnnz\" (UniqueName: \"kubernetes.io/projected/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-kube-api-access-pfnnz\") pod \"ceilometer-0\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " pod="openstack/ceilometer-0" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.585729 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:53:45 crc kubenswrapper[4909]: I0202 10:53:45.794181 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:53:46 crc kubenswrapper[4909]: I0202 10:53:46.126661 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:53:46 crc kubenswrapper[4909]: I0202 10:53:46.169316 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2","Type":"ContainerStarted","Data":"c3b74d535e37e9262a8e6ab8eaecf0474e0d4bb2dbe7bc6ca2b69f98a17c4925"} Feb 02 10:53:47 crc kubenswrapper[4909]: I0202 10:53:47.026039 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e603361-b833-4b97-8aab-52601d806d34" path="/var/lib/kubelet/pods/7e603361-b833-4b97-8aab-52601d806d34/volumes" Feb 02 10:53:47 crc kubenswrapper[4909]: I0202 10:53:47.179537 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2","Type":"ContainerStarted","Data":"4989217d1bcb4b70c7ca32f9cc1ae7031db47f58f9cbae1453f9aa64a0238f74"} Feb 02 10:53:47 crc kubenswrapper[4909]: I0202 10:53:47.735113 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:53:47 crc kubenswrapper[4909]: I0202 10:53:47.870080 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10cb2fd1-d09d-451f-97db-76a45a247a5b-config-data\") pod \"10cb2fd1-d09d-451f-97db-76a45a247a5b\" (UID: \"10cb2fd1-d09d-451f-97db-76a45a247a5b\") " Feb 02 10:53:47 crc kubenswrapper[4909]: I0202 10:53:47.870201 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10cb2fd1-d09d-451f-97db-76a45a247a5b-logs\") pod \"10cb2fd1-d09d-451f-97db-76a45a247a5b\" (UID: \"10cb2fd1-d09d-451f-97db-76a45a247a5b\") " Feb 02 10:53:47 crc kubenswrapper[4909]: I0202 10:53:47.870265 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10cb2fd1-d09d-451f-97db-76a45a247a5b-combined-ca-bundle\") pod \"10cb2fd1-d09d-451f-97db-76a45a247a5b\" (UID: \"10cb2fd1-d09d-451f-97db-76a45a247a5b\") " Feb 02 10:53:47 crc kubenswrapper[4909]: I0202 10:53:47.870358 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96t4p\" (UniqueName: \"kubernetes.io/projected/10cb2fd1-d09d-451f-97db-76a45a247a5b-kube-api-access-96t4p\") pod \"10cb2fd1-d09d-451f-97db-76a45a247a5b\" (UID: \"10cb2fd1-d09d-451f-97db-76a45a247a5b\") " Feb 02 10:53:47 crc kubenswrapper[4909]: I0202 10:53:47.870751 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10cb2fd1-d09d-451f-97db-76a45a247a5b-logs" (OuterVolumeSpecName: "logs") pod "10cb2fd1-d09d-451f-97db-76a45a247a5b" (UID: "10cb2fd1-d09d-451f-97db-76a45a247a5b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:53:47 crc kubenswrapper[4909]: I0202 10:53:47.870925 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10cb2fd1-d09d-451f-97db-76a45a247a5b-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:47 crc kubenswrapper[4909]: I0202 10:53:47.876253 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10cb2fd1-d09d-451f-97db-76a45a247a5b-kube-api-access-96t4p" (OuterVolumeSpecName: "kube-api-access-96t4p") pod "10cb2fd1-d09d-451f-97db-76a45a247a5b" (UID: "10cb2fd1-d09d-451f-97db-76a45a247a5b"). InnerVolumeSpecName "kube-api-access-96t4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:53:47 crc kubenswrapper[4909]: I0202 10:53:47.897910 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10cb2fd1-d09d-451f-97db-76a45a247a5b-config-data" (OuterVolumeSpecName: "config-data") pod "10cb2fd1-d09d-451f-97db-76a45a247a5b" (UID: "10cb2fd1-d09d-451f-97db-76a45a247a5b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:47 crc kubenswrapper[4909]: I0202 10:53:47.902781 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10cb2fd1-d09d-451f-97db-76a45a247a5b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10cb2fd1-d09d-451f-97db-76a45a247a5b" (UID: "10cb2fd1-d09d-451f-97db-76a45a247a5b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:47 crc kubenswrapper[4909]: I0202 10:53:47.972157 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96t4p\" (UniqueName: \"kubernetes.io/projected/10cb2fd1-d09d-451f-97db-76a45a247a5b-kube-api-access-96t4p\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:47 crc kubenswrapper[4909]: I0202 10:53:47.972192 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10cb2fd1-d09d-451f-97db-76a45a247a5b-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:47 crc kubenswrapper[4909]: I0202 10:53:47.972205 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10cb2fd1-d09d-451f-97db-76a45a247a5b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.191390 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2","Type":"ContainerStarted","Data":"5fe8ec2adaf9c05f83453ef987a7a078f7d160652a1c0a299a8cf7821e93878a"} Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.191437 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2","Type":"ContainerStarted","Data":"e6a1046340c14c27579addcb11dd623845f9d759ee67ec221a212c776dce5703"} Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.193267 4909 generic.go:334] "Generic (PLEG): container finished" podID="10cb2fd1-d09d-451f-97db-76a45a247a5b" containerID="63bbbd591f46d57cf98ff4e5a534345a7697c6d683e359b7c022224a87f87bca" exitCode=0 Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.193305 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"10cb2fd1-d09d-451f-97db-76a45a247a5b","Type":"ContainerDied","Data":"63bbbd591f46d57cf98ff4e5a534345a7697c6d683e359b7c022224a87f87bca"} Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.193343 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.193353 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"10cb2fd1-d09d-451f-97db-76a45a247a5b","Type":"ContainerDied","Data":"d7d9e97bd3ab4f15b664eaa6320bdcd9c78bc74119b8ff9541f0c3bd1954dd47"} Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.193369 4909 scope.go:117] "RemoveContainer" containerID="63bbbd591f46d57cf98ff4e5a534345a7697c6d683e359b7c022224a87f87bca" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.211490 4909 scope.go:117] "RemoveContainer" containerID="8e26cc3f6d191a7ffb3076fd61ad51a30e9e5502fdbc0cb31d44fa135e09ab4d" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.229597 4909 scope.go:117] "RemoveContainer" containerID="63bbbd591f46d57cf98ff4e5a534345a7697c6d683e359b7c022224a87f87bca" Feb 02 10:53:48 crc kubenswrapper[4909]: E0202 10:53:48.230097 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63bbbd591f46d57cf98ff4e5a534345a7697c6d683e359b7c022224a87f87bca\": container with ID starting with 63bbbd591f46d57cf98ff4e5a534345a7697c6d683e359b7c022224a87f87bca not found: ID does not exist" containerID="63bbbd591f46d57cf98ff4e5a534345a7697c6d683e359b7c022224a87f87bca" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.230148 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63bbbd591f46d57cf98ff4e5a534345a7697c6d683e359b7c022224a87f87bca"} err="failed to get container status \"63bbbd591f46d57cf98ff4e5a534345a7697c6d683e359b7c022224a87f87bca\": rpc error: code = NotFound desc = could not 
find container \"63bbbd591f46d57cf98ff4e5a534345a7697c6d683e359b7c022224a87f87bca\": container with ID starting with 63bbbd591f46d57cf98ff4e5a534345a7697c6d683e359b7c022224a87f87bca not found: ID does not exist" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.230178 4909 scope.go:117] "RemoveContainer" containerID="8e26cc3f6d191a7ffb3076fd61ad51a30e9e5502fdbc0cb31d44fa135e09ab4d" Feb 02 10:53:48 crc kubenswrapper[4909]: E0202 10:53:48.230514 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e26cc3f6d191a7ffb3076fd61ad51a30e9e5502fdbc0cb31d44fa135e09ab4d\": container with ID starting with 8e26cc3f6d191a7ffb3076fd61ad51a30e9e5502fdbc0cb31d44fa135e09ab4d not found: ID does not exist" containerID="8e26cc3f6d191a7ffb3076fd61ad51a30e9e5502fdbc0cb31d44fa135e09ab4d" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.230555 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e26cc3f6d191a7ffb3076fd61ad51a30e9e5502fdbc0cb31d44fa135e09ab4d"} err="failed to get container status \"8e26cc3f6d191a7ffb3076fd61ad51a30e9e5502fdbc0cb31d44fa135e09ab4d\": rpc error: code = NotFound desc = could not find container \"8e26cc3f6d191a7ffb3076fd61ad51a30e9e5502fdbc0cb31d44fa135e09ab4d\": container with ID starting with 8e26cc3f6d191a7ffb3076fd61ad51a30e9e5502fdbc0cb31d44fa135e09ab4d not found: ID does not exist" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.230925 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.239340 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.263457 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 02 10:53:48 crc kubenswrapper[4909]: E0202 10:53:48.264046 4909 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="10cb2fd1-d09d-451f-97db-76a45a247a5b" containerName="nova-api-log" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.264065 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="10cb2fd1-d09d-451f-97db-76a45a247a5b" containerName="nova-api-log" Feb 02 10:53:48 crc kubenswrapper[4909]: E0202 10:53:48.264313 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10cb2fd1-d09d-451f-97db-76a45a247a5b" containerName="nova-api-api" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.264328 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="10cb2fd1-d09d-451f-97db-76a45a247a5b" containerName="nova-api-api" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.265467 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="10cb2fd1-d09d-451f-97db-76a45a247a5b" containerName="nova-api-log" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.265491 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="10cb2fd1-d09d-451f-97db-76a45a247a5b" containerName="nova-api-api" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.266444 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.270163 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.270164 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.270273 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.279545 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.380800 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d5726d8-d754-4fd9-8bbe-516b893e630f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7d5726d8-d754-4fd9-8bbe-516b893e630f\") " pod="openstack/nova-api-0" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.380894 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d5726d8-d754-4fd9-8bbe-516b893e630f-logs\") pod \"nova-api-0\" (UID: \"7d5726d8-d754-4fd9-8bbe-516b893e630f\") " pod="openstack/nova-api-0" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.380915 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsxgt\" (UniqueName: \"kubernetes.io/projected/7d5726d8-d754-4fd9-8bbe-516b893e630f-kube-api-access-qsxgt\") pod \"nova-api-0\" (UID: \"7d5726d8-d754-4fd9-8bbe-516b893e630f\") " pod="openstack/nova-api-0" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.380978 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/7d5726d8-d754-4fd9-8bbe-516b893e630f-public-tls-certs\") pod \"nova-api-0\" (UID: \"7d5726d8-d754-4fd9-8bbe-516b893e630f\") " pod="openstack/nova-api-0" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.381001 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d5726d8-d754-4fd9-8bbe-516b893e630f-config-data\") pod \"nova-api-0\" (UID: \"7d5726d8-d754-4fd9-8bbe-516b893e630f\") " pod="openstack/nova-api-0" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.381201 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d5726d8-d754-4fd9-8bbe-516b893e630f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7d5726d8-d754-4fd9-8bbe-516b893e630f\") " pod="openstack/nova-api-0" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.466230 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.483329 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d5726d8-d754-4fd9-8bbe-516b893e630f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7d5726d8-d754-4fd9-8bbe-516b893e630f\") " pod="openstack/nova-api-0" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.483422 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d5726d8-d754-4fd9-8bbe-516b893e630f-logs\") pod \"nova-api-0\" (UID: \"7d5726d8-d754-4fd9-8bbe-516b893e630f\") " pod="openstack/nova-api-0" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.483447 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsxgt\" (UniqueName: 
\"kubernetes.io/projected/7d5726d8-d754-4fd9-8bbe-516b893e630f-kube-api-access-qsxgt\") pod \"nova-api-0\" (UID: \"7d5726d8-d754-4fd9-8bbe-516b893e630f\") " pod="openstack/nova-api-0" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.483503 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d5726d8-d754-4fd9-8bbe-516b893e630f-public-tls-certs\") pod \"nova-api-0\" (UID: \"7d5726d8-d754-4fd9-8bbe-516b893e630f\") " pod="openstack/nova-api-0" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.483530 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d5726d8-d754-4fd9-8bbe-516b893e630f-config-data\") pod \"nova-api-0\" (UID: \"7d5726d8-d754-4fd9-8bbe-516b893e630f\") " pod="openstack/nova-api-0" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.483591 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d5726d8-d754-4fd9-8bbe-516b893e630f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7d5726d8-d754-4fd9-8bbe-516b893e630f\") " pod="openstack/nova-api-0" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.485447 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d5726d8-d754-4fd9-8bbe-516b893e630f-logs\") pod \"nova-api-0\" (UID: \"7d5726d8-d754-4fd9-8bbe-516b893e630f\") " pod="openstack/nova-api-0" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.501689 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.507244 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d5726d8-d754-4fd9-8bbe-516b893e630f-internal-tls-certs\") pod 
\"nova-api-0\" (UID: \"7d5726d8-d754-4fd9-8bbe-516b893e630f\") " pod="openstack/nova-api-0" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.508579 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d5726d8-d754-4fd9-8bbe-516b893e630f-public-tls-certs\") pod \"nova-api-0\" (UID: \"7d5726d8-d754-4fd9-8bbe-516b893e630f\") " pod="openstack/nova-api-0" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.509991 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d5726d8-d754-4fd9-8bbe-516b893e630f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7d5726d8-d754-4fd9-8bbe-516b893e630f\") " pod="openstack/nova-api-0" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.513640 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d5726d8-d754-4fd9-8bbe-516b893e630f-config-data\") pod \"nova-api-0\" (UID: \"7d5726d8-d754-4fd9-8bbe-516b893e630f\") " pod="openstack/nova-api-0" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.530052 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsxgt\" (UniqueName: \"kubernetes.io/projected/7d5726d8-d754-4fd9-8bbe-516b893e630f-kube-api-access-qsxgt\") pod \"nova-api-0\" (UID: \"7d5726d8-d754-4fd9-8bbe-516b893e630f\") " pod="openstack/nova-api-0" Feb 02 10:53:48 crc kubenswrapper[4909]: I0202 10:53:48.601758 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:53:49 crc kubenswrapper[4909]: I0202 10:53:49.027630 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10cb2fd1-d09d-451f-97db-76a45a247a5b" path="/var/lib/kubelet/pods/10cb2fd1-d09d-451f-97db-76a45a247a5b/volumes" Feb 02 10:53:49 crc kubenswrapper[4909]: I0202 10:53:49.055739 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:53:49 crc kubenswrapper[4909]: W0202 10:53:49.056755 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d5726d8_d754_4fd9_8bbe_516b893e630f.slice/crio-029f1f9654cf42ee92e2f477d0cff841cee59d1feb16e772074051f2a9f457cd WatchSource:0}: Error finding container 029f1f9654cf42ee92e2f477d0cff841cee59d1feb16e772074051f2a9f457cd: Status 404 returned error can't find the container with id 029f1f9654cf42ee92e2f477d0cff841cee59d1feb16e772074051f2a9f457cd Feb 02 10:53:49 crc kubenswrapper[4909]: I0202 10:53:49.203276 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7d5726d8-d754-4fd9-8bbe-516b893e630f","Type":"ContainerStarted","Data":"029f1f9654cf42ee92e2f477d0cff841cee59d1feb16e772074051f2a9f457cd"} Feb 02 10:53:49 crc kubenswrapper[4909]: I0202 10:53:49.231776 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:53:49 crc kubenswrapper[4909]: I0202 10:53:49.510469 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:53:49 crc kubenswrapper[4909]: I0202 10:53:49.510521 4909 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:53:49 crc kubenswrapper[4909]: I0202 10:53:49.510563 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 10:53:49 crc kubenswrapper[4909]: I0202 10:53:49.511307 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ad73216e79d924c2922053100514e06765aa5c63e49cfea0b056d73eebae4d59"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 10:53:49 crc kubenswrapper[4909]: I0202 10:53:49.511361 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://ad73216e79d924c2922053100514e06765aa5c63e49cfea0b056d73eebae4d59" gracePeriod=600 Feb 02 10:53:49 crc kubenswrapper[4909]: I0202 10:53:49.526968 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-vnrcn"] Feb 02 10:53:49 crc kubenswrapper[4909]: I0202 10:53:49.528179 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vnrcn" Feb 02 10:53:49 crc kubenswrapper[4909]: I0202 10:53:49.531123 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 02 10:53:49 crc kubenswrapper[4909]: I0202 10:53:49.531522 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 02 10:53:49 crc kubenswrapper[4909]: I0202 10:53:49.541437 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vnrcn"] Feb 02 10:53:49 crc kubenswrapper[4909]: I0202 10:53:49.606393 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf2jj\" (UniqueName: \"kubernetes.io/projected/1bcdd101-6966-4afe-86f0-47f3b7a524fe-kube-api-access-hf2jj\") pod \"nova-cell1-cell-mapping-vnrcn\" (UID: \"1bcdd101-6966-4afe-86f0-47f3b7a524fe\") " pod="openstack/nova-cell1-cell-mapping-vnrcn" Feb 02 10:53:49 crc kubenswrapper[4909]: I0202 10:53:49.606642 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bcdd101-6966-4afe-86f0-47f3b7a524fe-scripts\") pod \"nova-cell1-cell-mapping-vnrcn\" (UID: \"1bcdd101-6966-4afe-86f0-47f3b7a524fe\") " pod="openstack/nova-cell1-cell-mapping-vnrcn" Feb 02 10:53:49 crc kubenswrapper[4909]: I0202 10:53:49.606827 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bcdd101-6966-4afe-86f0-47f3b7a524fe-config-data\") pod \"nova-cell1-cell-mapping-vnrcn\" (UID: \"1bcdd101-6966-4afe-86f0-47f3b7a524fe\") " pod="openstack/nova-cell1-cell-mapping-vnrcn" Feb 02 10:53:49 crc kubenswrapper[4909]: I0202 10:53:49.607095 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1bcdd101-6966-4afe-86f0-47f3b7a524fe-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vnrcn\" (UID: \"1bcdd101-6966-4afe-86f0-47f3b7a524fe\") " pod="openstack/nova-cell1-cell-mapping-vnrcn" Feb 02 10:53:49 crc kubenswrapper[4909]: I0202 10:53:49.711745 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bcdd101-6966-4afe-86f0-47f3b7a524fe-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vnrcn\" (UID: \"1bcdd101-6966-4afe-86f0-47f3b7a524fe\") " pod="openstack/nova-cell1-cell-mapping-vnrcn" Feb 02 10:53:49 crc kubenswrapper[4909]: I0202 10:53:49.712240 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf2jj\" (UniqueName: \"kubernetes.io/projected/1bcdd101-6966-4afe-86f0-47f3b7a524fe-kube-api-access-hf2jj\") pod \"nova-cell1-cell-mapping-vnrcn\" (UID: \"1bcdd101-6966-4afe-86f0-47f3b7a524fe\") " pod="openstack/nova-cell1-cell-mapping-vnrcn" Feb 02 10:53:49 crc kubenswrapper[4909]: I0202 10:53:49.712281 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bcdd101-6966-4afe-86f0-47f3b7a524fe-scripts\") pod \"nova-cell1-cell-mapping-vnrcn\" (UID: \"1bcdd101-6966-4afe-86f0-47f3b7a524fe\") " pod="openstack/nova-cell1-cell-mapping-vnrcn" Feb 02 10:53:49 crc kubenswrapper[4909]: I0202 10:53:49.712318 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bcdd101-6966-4afe-86f0-47f3b7a524fe-config-data\") pod \"nova-cell1-cell-mapping-vnrcn\" (UID: \"1bcdd101-6966-4afe-86f0-47f3b7a524fe\") " pod="openstack/nova-cell1-cell-mapping-vnrcn" Feb 02 10:53:49 crc kubenswrapper[4909]: I0202 10:53:49.719733 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1bcdd101-6966-4afe-86f0-47f3b7a524fe-config-data\") pod \"nova-cell1-cell-mapping-vnrcn\" (UID: \"1bcdd101-6966-4afe-86f0-47f3b7a524fe\") " pod="openstack/nova-cell1-cell-mapping-vnrcn" Feb 02 10:53:49 crc kubenswrapper[4909]: I0202 10:53:49.722280 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bcdd101-6966-4afe-86f0-47f3b7a524fe-scripts\") pod \"nova-cell1-cell-mapping-vnrcn\" (UID: \"1bcdd101-6966-4afe-86f0-47f3b7a524fe\") " pod="openstack/nova-cell1-cell-mapping-vnrcn" Feb 02 10:53:49 crc kubenswrapper[4909]: I0202 10:53:49.730388 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf2jj\" (UniqueName: \"kubernetes.io/projected/1bcdd101-6966-4afe-86f0-47f3b7a524fe-kube-api-access-hf2jj\") pod \"nova-cell1-cell-mapping-vnrcn\" (UID: \"1bcdd101-6966-4afe-86f0-47f3b7a524fe\") " pod="openstack/nova-cell1-cell-mapping-vnrcn" Feb 02 10:53:49 crc kubenswrapper[4909]: I0202 10:53:49.731063 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bcdd101-6966-4afe-86f0-47f3b7a524fe-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-vnrcn\" (UID: \"1bcdd101-6966-4afe-86f0-47f3b7a524fe\") " pod="openstack/nova-cell1-cell-mapping-vnrcn" Feb 02 10:53:49 crc kubenswrapper[4909]: I0202 10:53:49.865171 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vnrcn" Feb 02 10:53:50 crc kubenswrapper[4909]: I0202 10:53:50.216414 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7d5726d8-d754-4fd9-8bbe-516b893e630f","Type":"ContainerStarted","Data":"d79baeb44cf570af1e013396f30b89c440846e39d6a0010b9fcd427912e7a29f"} Feb 02 10:53:50 crc kubenswrapper[4909]: I0202 10:53:50.216748 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7d5726d8-d754-4fd9-8bbe-516b893e630f","Type":"ContainerStarted","Data":"bca93f3a9be7b71b2e94226184105e594927ebd30c315cc2a506305656bdfddd"} Feb 02 10:53:50 crc kubenswrapper[4909]: I0202 10:53:50.236215 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="ad73216e79d924c2922053100514e06765aa5c63e49cfea0b056d73eebae4d59" exitCode=0 Feb 02 10:53:50 crc kubenswrapper[4909]: I0202 10:53:50.236625 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"ad73216e79d924c2922053100514e06765aa5c63e49cfea0b056d73eebae4d59"} Feb 02 10:53:50 crc kubenswrapper[4909]: I0202 10:53:50.236661 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3"} Feb 02 10:53:50 crc kubenswrapper[4909]: I0202 10:53:50.236679 4909 scope.go:117] "RemoveContainer" containerID="8890ae7c7b5156c4d584d7bd5581da3d2b944d91026e2fa8dff7d54c88b8b78c" Feb 02 10:53:50 crc kubenswrapper[4909]: I0202 10:53:50.239706 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.239687347 
podStartE2EDuration="2.239687347s" podCreationTimestamp="2026-02-02 10:53:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:53:50.234787788 +0000 UTC m=+1355.980888523" watchObservedRunningTime="2026-02-02 10:53:50.239687347 +0000 UTC m=+1355.985788082" Feb 02 10:53:50 crc kubenswrapper[4909]: W0202 10:53:50.359026 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bcdd101_6966_4afe_86f0_47f3b7a524fe.slice/crio-9a1c7f670d06e815b83ccc2f412fd65f6f1dd599c351830c3c30e31068f5cb1d WatchSource:0}: Error finding container 9a1c7f670d06e815b83ccc2f412fd65f6f1dd599c351830c3c30e31068f5cb1d: Status 404 returned error can't find the container with id 9a1c7f670d06e815b83ccc2f412fd65f6f1dd599c351830c3c30e31068f5cb1d Feb 02 10:53:50 crc kubenswrapper[4909]: I0202 10:53:50.364617 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-vnrcn"] Feb 02 10:53:51 crc kubenswrapper[4909]: I0202 10:53:51.247640 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2","Type":"ContainerStarted","Data":"94f5c576aeeff2fe453b367967176337e9141f52911a83d3ed6465adc28eb1a9"} Feb 02 10:53:51 crc kubenswrapper[4909]: I0202 10:53:51.248407 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cdfb2de4-1f6d-498e-a88b-00d7bd37acb2" containerName="ceilometer-central-agent" containerID="cri-o://4989217d1bcb4b70c7ca32f9cc1ae7031db47f58f9cbae1453f9aa64a0238f74" gracePeriod=30 Feb 02 10:53:51 crc kubenswrapper[4909]: I0202 10:53:51.248482 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cdfb2de4-1f6d-498e-a88b-00d7bd37acb2" containerName="sg-core" 
containerID="cri-o://5fe8ec2adaf9c05f83453ef987a7a078f7d160652a1c0a299a8cf7821e93878a" gracePeriod=30 Feb 02 10:53:51 crc kubenswrapper[4909]: I0202 10:53:51.248472 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cdfb2de4-1f6d-498e-a88b-00d7bd37acb2" containerName="proxy-httpd" containerID="cri-o://94f5c576aeeff2fe453b367967176337e9141f52911a83d3ed6465adc28eb1a9" gracePeriod=30 Feb 02 10:53:51 crc kubenswrapper[4909]: I0202 10:53:51.248499 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 02 10:53:51 crc kubenswrapper[4909]: I0202 10:53:51.248514 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cdfb2de4-1f6d-498e-a88b-00d7bd37acb2" containerName="ceilometer-notification-agent" containerID="cri-o://e6a1046340c14c27579addcb11dd623845f9d759ee67ec221a212c776dce5703" gracePeriod=30 Feb 02 10:53:51 crc kubenswrapper[4909]: I0202 10:53:51.252784 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vnrcn" event={"ID":"1bcdd101-6966-4afe-86f0-47f3b7a524fe","Type":"ContainerStarted","Data":"428fe0a93c4288c4b421df226520af6098fbd2af1d876fd098d9c807b6800246"} Feb 02 10:53:51 crc kubenswrapper[4909]: I0202 10:53:51.252845 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vnrcn" event={"ID":"1bcdd101-6966-4afe-86f0-47f3b7a524fe","Type":"ContainerStarted","Data":"9a1c7f670d06e815b83ccc2f412fd65f6f1dd599c351830c3c30e31068f5cb1d"} Feb 02 10:53:51 crc kubenswrapper[4909]: I0202 10:53:51.281898 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.299022399 podStartE2EDuration="6.281878998s" podCreationTimestamp="2026-02-02 10:53:45 +0000 UTC" firstStartedPulling="2026-02-02 10:53:46.125274483 +0000 UTC m=+1351.871375218" lastFinishedPulling="2026-02-02 
10:53:50.108131082 +0000 UTC m=+1355.854231817" observedRunningTime="2026-02-02 10:53:51.274039506 +0000 UTC m=+1357.020140341" watchObservedRunningTime="2026-02-02 10:53:51.281878998 +0000 UTC m=+1357.027979743" Feb 02 10:53:51 crc kubenswrapper[4909]: I0202 10:53:51.298378 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-vnrcn" podStartSLOduration=2.298355946 podStartE2EDuration="2.298355946s" podCreationTimestamp="2026-02-02 10:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:53:51.293117527 +0000 UTC m=+1357.039218262" watchObservedRunningTime="2026-02-02 10:53:51.298355946 +0000 UTC m=+1357.044456681" Feb 02 10:53:51 crc kubenswrapper[4909]: I0202 10:53:51.682306 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" Feb 02 10:53:51 crc kubenswrapper[4909]: I0202 10:53:51.755696 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc699f5c5-znn65"] Feb 02 10:53:51 crc kubenswrapper[4909]: I0202 10:53:51.756027 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" podUID="9e425915-23f7-4cde-8c2d-3dcbca42e315" containerName="dnsmasq-dns" containerID="cri-o://edba01ecd1bbb07cea4955a3e7312f9707855a41378c045c648c030dd7d7725a" gracePeriod=10 Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.248417 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.278592 4909 generic.go:334] "Generic (PLEG): container finished" podID="cdfb2de4-1f6d-498e-a88b-00d7bd37acb2" containerID="94f5c576aeeff2fe453b367967176337e9141f52911a83d3ed6465adc28eb1a9" exitCode=0 Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.278626 4909 generic.go:334] "Generic (PLEG): container finished" podID="cdfb2de4-1f6d-498e-a88b-00d7bd37acb2" containerID="5fe8ec2adaf9c05f83453ef987a7a078f7d160652a1c0a299a8cf7821e93878a" exitCode=2 Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.278637 4909 generic.go:334] "Generic (PLEG): container finished" podID="cdfb2de4-1f6d-498e-a88b-00d7bd37acb2" containerID="e6a1046340c14c27579addcb11dd623845f9d759ee67ec221a212c776dce5703" exitCode=0 Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.278684 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2","Type":"ContainerDied","Data":"94f5c576aeeff2fe453b367967176337e9141f52911a83d3ed6465adc28eb1a9"} Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.278716 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2","Type":"ContainerDied","Data":"5fe8ec2adaf9c05f83453ef987a7a078f7d160652a1c0a299a8cf7821e93878a"} Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.278727 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2","Type":"ContainerDied","Data":"e6a1046340c14c27579addcb11dd623845f9d759ee67ec221a212c776dce5703"} Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.287871 4909 generic.go:334] "Generic (PLEG): container finished" podID="9e425915-23f7-4cde-8c2d-3dcbca42e315" containerID="edba01ecd1bbb07cea4955a3e7312f9707855a41378c045c648c030dd7d7725a" exitCode=0 Feb 02 10:53:52 
crc kubenswrapper[4909]: I0202 10:53:52.288724 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.289216 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" event={"ID":"9e425915-23f7-4cde-8c2d-3dcbca42e315","Type":"ContainerDied","Data":"edba01ecd1bbb07cea4955a3e7312f9707855a41378c045c648c030dd7d7725a"} Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.289264 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc699f5c5-znn65" event={"ID":"9e425915-23f7-4cde-8c2d-3dcbca42e315","Type":"ContainerDied","Data":"b90a2c8bbe2b2d0938c0b5f242ed061e894319444fb0e8ff280e38b908439109"} Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.289288 4909 scope.go:117] "RemoveContainer" containerID="edba01ecd1bbb07cea4955a3e7312f9707855a41378c045c648c030dd7d7725a" Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.317763 4909 scope.go:117] "RemoveContainer" containerID="42883fc6e049c2f2ff02bdb7603424e5179c57da4580ed19146ee4baf473c5bc" Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.339736 4909 scope.go:117] "RemoveContainer" containerID="edba01ecd1bbb07cea4955a3e7312f9707855a41378c045c648c030dd7d7725a" Feb 02 10:53:52 crc kubenswrapper[4909]: E0202 10:53:52.340147 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edba01ecd1bbb07cea4955a3e7312f9707855a41378c045c648c030dd7d7725a\": container with ID starting with edba01ecd1bbb07cea4955a3e7312f9707855a41378c045c648c030dd7d7725a not found: ID does not exist" containerID="edba01ecd1bbb07cea4955a3e7312f9707855a41378c045c648c030dd7d7725a" Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.340174 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"edba01ecd1bbb07cea4955a3e7312f9707855a41378c045c648c030dd7d7725a"} err="failed to get container status \"edba01ecd1bbb07cea4955a3e7312f9707855a41378c045c648c030dd7d7725a\": rpc error: code = NotFound desc = could not find container \"edba01ecd1bbb07cea4955a3e7312f9707855a41378c045c648c030dd7d7725a\": container with ID starting with edba01ecd1bbb07cea4955a3e7312f9707855a41378c045c648c030dd7d7725a not found: ID does not exist" Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.340671 4909 scope.go:117] "RemoveContainer" containerID="42883fc6e049c2f2ff02bdb7603424e5179c57da4580ed19146ee4baf473c5bc" Feb 02 10:53:52 crc kubenswrapper[4909]: E0202 10:53:52.341792 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42883fc6e049c2f2ff02bdb7603424e5179c57da4580ed19146ee4baf473c5bc\": container with ID starting with 42883fc6e049c2f2ff02bdb7603424e5179c57da4580ed19146ee4baf473c5bc not found: ID does not exist" containerID="42883fc6e049c2f2ff02bdb7603424e5179c57da4580ed19146ee4baf473c5bc" Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.341840 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42883fc6e049c2f2ff02bdb7603424e5179c57da4580ed19146ee4baf473c5bc"} err="failed to get container status \"42883fc6e049c2f2ff02bdb7603424e5179c57da4580ed19146ee4baf473c5bc\": rpc error: code = NotFound desc = could not find container \"42883fc6e049c2f2ff02bdb7603424e5179c57da4580ed19146ee4baf473c5bc\": container with ID starting with 42883fc6e049c2f2ff02bdb7603424e5179c57da4580ed19146ee4baf473c5bc not found: ID does not exist" Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.378624 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d42qp\" (UniqueName: \"kubernetes.io/projected/9e425915-23f7-4cde-8c2d-3dcbca42e315-kube-api-access-d42qp\") pod 
\"9e425915-23f7-4cde-8c2d-3dcbca42e315\" (UID: \"9e425915-23f7-4cde-8c2d-3dcbca42e315\") " Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.378827 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-ovsdbserver-nb\") pod \"9e425915-23f7-4cde-8c2d-3dcbca42e315\" (UID: \"9e425915-23f7-4cde-8c2d-3dcbca42e315\") " Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.378879 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-dns-svc\") pod \"9e425915-23f7-4cde-8c2d-3dcbca42e315\" (UID: \"9e425915-23f7-4cde-8c2d-3dcbca42e315\") " Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.378921 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-dns-swift-storage-0\") pod \"9e425915-23f7-4cde-8c2d-3dcbca42e315\" (UID: \"9e425915-23f7-4cde-8c2d-3dcbca42e315\") " Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.379000 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-config\") pod \"9e425915-23f7-4cde-8c2d-3dcbca42e315\" (UID: \"9e425915-23f7-4cde-8c2d-3dcbca42e315\") " Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.379017 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-ovsdbserver-sb\") pod \"9e425915-23f7-4cde-8c2d-3dcbca42e315\" (UID: \"9e425915-23f7-4cde-8c2d-3dcbca42e315\") " Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.385225 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/9e425915-23f7-4cde-8c2d-3dcbca42e315-kube-api-access-d42qp" (OuterVolumeSpecName: "kube-api-access-d42qp") pod "9e425915-23f7-4cde-8c2d-3dcbca42e315" (UID: "9e425915-23f7-4cde-8c2d-3dcbca42e315"). InnerVolumeSpecName "kube-api-access-d42qp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.431876 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9e425915-23f7-4cde-8c2d-3dcbca42e315" (UID: "9e425915-23f7-4cde-8c2d-3dcbca42e315"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.438173 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9e425915-23f7-4cde-8c2d-3dcbca42e315" (UID: "9e425915-23f7-4cde-8c2d-3dcbca42e315"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.443290 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9e425915-23f7-4cde-8c2d-3dcbca42e315" (UID: "9e425915-23f7-4cde-8c2d-3dcbca42e315"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.447161 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9e425915-23f7-4cde-8c2d-3dcbca42e315" (UID: "9e425915-23f7-4cde-8c2d-3dcbca42e315"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.468340 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-config" (OuterVolumeSpecName: "config") pod "9e425915-23f7-4cde-8c2d-3dcbca42e315" (UID: "9e425915-23f7-4cde-8c2d-3dcbca42e315"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.481327 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.481373 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.481384 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d42qp\" (UniqueName: \"kubernetes.io/projected/9e425915-23f7-4cde-8c2d-3dcbca42e315-kube-api-access-d42qp\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.481392 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.481407 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.481415 4909 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/9e425915-23f7-4cde-8c2d-3dcbca42e315-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.648163 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc699f5c5-znn65"] Feb 02 10:53:52 crc kubenswrapper[4909]: I0202 10:53:52.656391 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc699f5c5-znn65"] Feb 02 10:53:53 crc kubenswrapper[4909]: I0202 10:53:53.028727 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e425915-23f7-4cde-8c2d-3dcbca42e315" path="/var/lib/kubelet/pods/9e425915-23f7-4cde-8c2d-3dcbca42e315/volumes" Feb 02 10:53:53 crc kubenswrapper[4909]: I0202 10:53:53.310968 4909 generic.go:334] "Generic (PLEG): container finished" podID="cdfb2de4-1f6d-498e-a88b-00d7bd37acb2" containerID="4989217d1bcb4b70c7ca32f9cc1ae7031db47f58f9cbae1453f9aa64a0238f74" exitCode=0 Feb 02 10:53:53 crc kubenswrapper[4909]: I0202 10:53:53.311061 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2","Type":"ContainerDied","Data":"4989217d1bcb4b70c7ca32f9cc1ae7031db47f58f9cbae1453f9aa64a0238f74"} Feb 02 10:53:53 crc kubenswrapper[4909]: I0202 10:53:53.682445 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:53:53 crc kubenswrapper[4909]: I0202 10:53:53.808113 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-run-httpd\") pod \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " Feb 02 10:53:53 crc kubenswrapper[4909]: I0202 10:53:53.808226 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-config-data\") pod \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " Feb 02 10:53:53 crc kubenswrapper[4909]: I0202 10:53:53.808278 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfnnz\" (UniqueName: \"kubernetes.io/projected/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-kube-api-access-pfnnz\") pod \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " Feb 02 10:53:53 crc kubenswrapper[4909]: I0202 10:53:53.808339 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-combined-ca-bundle\") pod \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " Feb 02 10:53:53 crc kubenswrapper[4909]: I0202 10:53:53.808354 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-ceilometer-tls-certs\") pod \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " Feb 02 10:53:53 crc kubenswrapper[4909]: I0202 10:53:53.808367 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-scripts\") pod \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " Feb 02 10:53:53 crc kubenswrapper[4909]: I0202 10:53:53.808416 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-log-httpd\") pod \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " Feb 02 10:53:53 crc kubenswrapper[4909]: I0202 10:53:53.808458 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-sg-core-conf-yaml\") pod \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\" (UID: \"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2\") " Feb 02 10:53:53 crc kubenswrapper[4909]: I0202 10:53:53.808510 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cdfb2de4-1f6d-498e-a88b-00d7bd37acb2" (UID: "cdfb2de4-1f6d-498e-a88b-00d7bd37acb2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:53:53 crc kubenswrapper[4909]: I0202 10:53:53.808855 4909 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:53 crc kubenswrapper[4909]: I0202 10:53:53.809832 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cdfb2de4-1f6d-498e-a88b-00d7bd37acb2" (UID: "cdfb2de4-1f6d-498e-a88b-00d7bd37acb2"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:53:53 crc kubenswrapper[4909]: I0202 10:53:53.816352 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-scripts" (OuterVolumeSpecName: "scripts") pod "cdfb2de4-1f6d-498e-a88b-00d7bd37acb2" (UID: "cdfb2de4-1f6d-498e-a88b-00d7bd37acb2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:53 crc kubenswrapper[4909]: I0202 10:53:53.816518 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-kube-api-access-pfnnz" (OuterVolumeSpecName: "kube-api-access-pfnnz") pod "cdfb2de4-1f6d-498e-a88b-00d7bd37acb2" (UID: "cdfb2de4-1f6d-498e-a88b-00d7bd37acb2"). InnerVolumeSpecName "kube-api-access-pfnnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:53:53 crc kubenswrapper[4909]: I0202 10:53:53.838120 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cdfb2de4-1f6d-498e-a88b-00d7bd37acb2" (UID: "cdfb2de4-1f6d-498e-a88b-00d7bd37acb2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:53 crc kubenswrapper[4909]: I0202 10:53:53.884641 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "cdfb2de4-1f6d-498e-a88b-00d7bd37acb2" (UID: "cdfb2de4-1f6d-498e-a88b-00d7bd37acb2"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:53 crc kubenswrapper[4909]: I0202 10:53:53.907188 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdfb2de4-1f6d-498e-a88b-00d7bd37acb2" (UID: "cdfb2de4-1f6d-498e-a88b-00d7bd37acb2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:53 crc kubenswrapper[4909]: I0202 10:53:53.910501 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfnnz\" (UniqueName: \"kubernetes.io/projected/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-kube-api-access-pfnnz\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:53 crc kubenswrapper[4909]: I0202 10:53:53.910541 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:53 crc kubenswrapper[4909]: I0202 10:53:53.910553 4909 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:53 crc kubenswrapper[4909]: I0202 10:53:53.910564 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:53 crc kubenswrapper[4909]: I0202 10:53:53.910575 4909 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:53 crc kubenswrapper[4909]: I0202 10:53:53.910587 4909 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:53 crc kubenswrapper[4909]: I0202 10:53:53.938395 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-config-data" (OuterVolumeSpecName: "config-data") pod "cdfb2de4-1f6d-498e-a88b-00d7bd37acb2" (UID: "cdfb2de4-1f6d-498e-a88b-00d7bd37acb2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.012672 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.325156 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cdfb2de4-1f6d-498e-a88b-00d7bd37acb2","Type":"ContainerDied","Data":"c3b74d535e37e9262a8e6ab8eaecf0474e0d4bb2dbe7bc6ca2b69f98a17c4925"} Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.325217 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.325451 4909 scope.go:117] "RemoveContainer" containerID="94f5c576aeeff2fe453b367967176337e9141f52911a83d3ed6465adc28eb1a9" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.348442 4909 scope.go:117] "RemoveContainer" containerID="5fe8ec2adaf9c05f83453ef987a7a078f7d160652a1c0a299a8cf7821e93878a" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.370080 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.376474 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.379775 4909 scope.go:117] "RemoveContainer" containerID="e6a1046340c14c27579addcb11dd623845f9d759ee67ec221a212c776dce5703" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.385506 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:53:54 crc kubenswrapper[4909]: E0202 10:53:54.385943 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdfb2de4-1f6d-498e-a88b-00d7bd37acb2" containerName="sg-core" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.385963 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdfb2de4-1f6d-498e-a88b-00d7bd37acb2" containerName="sg-core" Feb 02 10:53:54 crc kubenswrapper[4909]: E0202 10:53:54.385974 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdfb2de4-1f6d-498e-a88b-00d7bd37acb2" containerName="proxy-httpd" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.385979 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdfb2de4-1f6d-498e-a88b-00d7bd37acb2" containerName="proxy-httpd" Feb 02 10:53:54 crc kubenswrapper[4909]: E0202 10:53:54.385992 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e425915-23f7-4cde-8c2d-3dcbca42e315" 
containerName="dnsmasq-dns" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.385998 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e425915-23f7-4cde-8c2d-3dcbca42e315" containerName="dnsmasq-dns" Feb 02 10:53:54 crc kubenswrapper[4909]: E0202 10:53:54.386011 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdfb2de4-1f6d-498e-a88b-00d7bd37acb2" containerName="ceilometer-notification-agent" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.386017 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdfb2de4-1f6d-498e-a88b-00d7bd37acb2" containerName="ceilometer-notification-agent" Feb 02 10:53:54 crc kubenswrapper[4909]: E0202 10:53:54.386028 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdfb2de4-1f6d-498e-a88b-00d7bd37acb2" containerName="ceilometer-central-agent" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.386038 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdfb2de4-1f6d-498e-a88b-00d7bd37acb2" containerName="ceilometer-central-agent" Feb 02 10:53:54 crc kubenswrapper[4909]: E0202 10:53:54.386054 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e425915-23f7-4cde-8c2d-3dcbca42e315" containerName="init" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.386061 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e425915-23f7-4cde-8c2d-3dcbca42e315" containerName="init" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.386234 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdfb2de4-1f6d-498e-a88b-00d7bd37acb2" containerName="ceilometer-notification-agent" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.386248 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdfb2de4-1f6d-498e-a88b-00d7bd37acb2" containerName="ceilometer-central-agent" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.386260 4909 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9e425915-23f7-4cde-8c2d-3dcbca42e315" containerName="dnsmasq-dns" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.386272 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdfb2de4-1f6d-498e-a88b-00d7bd37acb2" containerName="proxy-httpd" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.386288 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdfb2de4-1f6d-498e-a88b-00d7bd37acb2" containerName="sg-core" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.401609 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.401765 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.408176 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.408353 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.408444 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.420879 4909 scope.go:117] "RemoveContainer" containerID="4989217d1bcb4b70c7ca32f9cc1ae7031db47f58f9cbae1453f9aa64a0238f74" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.523230 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a43fc6d-3442-4921-93bc-ef5ab2273a78-log-httpd\") pod \"ceilometer-0\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " pod="openstack/ceilometer-0" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.523307 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " pod="openstack/ceilometer-0" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.523384 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-scripts\") pod \"ceilometer-0\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " pod="openstack/ceilometer-0" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.523432 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " pod="openstack/ceilometer-0" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.523600 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m42tt\" (UniqueName: \"kubernetes.io/projected/9a43fc6d-3442-4921-93bc-ef5ab2273a78-kube-api-access-m42tt\") pod \"ceilometer-0\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " pod="openstack/ceilometer-0" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.523657 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " pod="openstack/ceilometer-0" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.523680 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a43fc6d-3442-4921-93bc-ef5ab2273a78-run-httpd\") 
pod \"ceilometer-0\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " pod="openstack/ceilometer-0" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.523719 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-config-data\") pod \"ceilometer-0\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " pod="openstack/ceilometer-0" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.624720 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a43fc6d-3442-4921-93bc-ef5ab2273a78-log-httpd\") pod \"ceilometer-0\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " pod="openstack/ceilometer-0" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.624775 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " pod="openstack/ceilometer-0" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.624832 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-scripts\") pod \"ceilometer-0\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " pod="openstack/ceilometer-0" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.624866 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " pod="openstack/ceilometer-0" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.624927 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-m42tt\" (UniqueName: \"kubernetes.io/projected/9a43fc6d-3442-4921-93bc-ef5ab2273a78-kube-api-access-m42tt\") pod \"ceilometer-0\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " pod="openstack/ceilometer-0" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.624947 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " pod="openstack/ceilometer-0" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.624963 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a43fc6d-3442-4921-93bc-ef5ab2273a78-run-httpd\") pod \"ceilometer-0\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " pod="openstack/ceilometer-0" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.624982 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-config-data\") pod \"ceilometer-0\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " pod="openstack/ceilometer-0" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.626166 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a43fc6d-3442-4921-93bc-ef5ab2273a78-run-httpd\") pod \"ceilometer-0\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " pod="openstack/ceilometer-0" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.626194 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a43fc6d-3442-4921-93bc-ef5ab2273a78-log-httpd\") pod \"ceilometer-0\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " 
pod="openstack/ceilometer-0" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.634255 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " pod="openstack/ceilometer-0" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.636044 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-scripts\") pod \"ceilometer-0\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " pod="openstack/ceilometer-0" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.636210 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " pod="openstack/ceilometer-0" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.636354 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-config-data\") pod \"ceilometer-0\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " pod="openstack/ceilometer-0" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.636471 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " pod="openstack/ceilometer-0" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.648751 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m42tt\" (UniqueName: 
\"kubernetes.io/projected/9a43fc6d-3442-4921-93bc-ef5ab2273a78-kube-api-access-m42tt\") pod \"ceilometer-0\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " pod="openstack/ceilometer-0" Feb 02 10:53:54 crc kubenswrapper[4909]: I0202 10:53:54.723884 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:53:55 crc kubenswrapper[4909]: I0202 10:53:55.031891 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdfb2de4-1f6d-498e-a88b-00d7bd37acb2" path="/var/lib/kubelet/pods/cdfb2de4-1f6d-498e-a88b-00d7bd37acb2/volumes" Feb 02 10:53:55 crc kubenswrapper[4909]: I0202 10:53:55.186688 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:53:55 crc kubenswrapper[4909]: I0202 10:53:55.334965 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9a43fc6d-3442-4921-93bc-ef5ab2273a78","Type":"ContainerStarted","Data":"fa3aea4766d2e55fa0b1da6a6ceecc305e907d363b002158b6ecbcc66daa9b97"} Feb 02 10:53:56 crc kubenswrapper[4909]: I0202 10:53:56.355153 4909 generic.go:334] "Generic (PLEG): container finished" podID="1bcdd101-6966-4afe-86f0-47f3b7a524fe" containerID="428fe0a93c4288c4b421df226520af6098fbd2af1d876fd098d9c807b6800246" exitCode=0 Feb 02 10:53:56 crc kubenswrapper[4909]: I0202 10:53:56.355370 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vnrcn" event={"ID":"1bcdd101-6966-4afe-86f0-47f3b7a524fe","Type":"ContainerDied","Data":"428fe0a93c4288c4b421df226520af6098fbd2af1d876fd098d9c807b6800246"} Feb 02 10:53:56 crc kubenswrapper[4909]: I0202 10:53:56.359056 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9a43fc6d-3442-4921-93bc-ef5ab2273a78","Type":"ContainerStarted","Data":"1c2720d46332594a79c263f4b5b11335295a93ffe32d4ee3e0fd204905c00fa2"} Feb 02 10:53:57 crc kubenswrapper[4909]: I0202 10:53:57.391933 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9a43fc6d-3442-4921-93bc-ef5ab2273a78","Type":"ContainerStarted","Data":"166e3b47bb57c5cbc573f49e8b964cfffa3a76870bbf15fbe19e47476b7d8a1f"} Feb 02 10:53:57 crc kubenswrapper[4909]: I0202 10:53:57.392548 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9a43fc6d-3442-4921-93bc-ef5ab2273a78","Type":"ContainerStarted","Data":"023a431e4b03424676a729d714404990c19e0adf3d117b84ebe38b391d0036b6"} Feb 02 10:53:57 crc kubenswrapper[4909]: I0202 10:53:57.880710 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vnrcn" Feb 02 10:53:57 crc kubenswrapper[4909]: I0202 10:53:57.998190 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bcdd101-6966-4afe-86f0-47f3b7a524fe-combined-ca-bundle\") pod \"1bcdd101-6966-4afe-86f0-47f3b7a524fe\" (UID: \"1bcdd101-6966-4afe-86f0-47f3b7a524fe\") " Feb 02 10:53:57 crc kubenswrapper[4909]: I0202 10:53:57.998287 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hf2jj\" (UniqueName: \"kubernetes.io/projected/1bcdd101-6966-4afe-86f0-47f3b7a524fe-kube-api-access-hf2jj\") pod \"1bcdd101-6966-4afe-86f0-47f3b7a524fe\" (UID: \"1bcdd101-6966-4afe-86f0-47f3b7a524fe\") " Feb 02 10:53:57 crc kubenswrapper[4909]: I0202 10:53:57.998460 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bcdd101-6966-4afe-86f0-47f3b7a524fe-config-data\") pod \"1bcdd101-6966-4afe-86f0-47f3b7a524fe\" (UID: \"1bcdd101-6966-4afe-86f0-47f3b7a524fe\") " Feb 02 10:53:57 crc kubenswrapper[4909]: I0202 10:53:57.998501 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1bcdd101-6966-4afe-86f0-47f3b7a524fe-scripts\") pod \"1bcdd101-6966-4afe-86f0-47f3b7a524fe\" (UID: \"1bcdd101-6966-4afe-86f0-47f3b7a524fe\") " Feb 02 10:53:58 crc kubenswrapper[4909]: I0202 10:53:58.005051 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bcdd101-6966-4afe-86f0-47f3b7a524fe-kube-api-access-hf2jj" (OuterVolumeSpecName: "kube-api-access-hf2jj") pod "1bcdd101-6966-4afe-86f0-47f3b7a524fe" (UID: "1bcdd101-6966-4afe-86f0-47f3b7a524fe"). InnerVolumeSpecName "kube-api-access-hf2jj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:53:58 crc kubenswrapper[4909]: I0202 10:53:58.005368 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bcdd101-6966-4afe-86f0-47f3b7a524fe-scripts" (OuterVolumeSpecName: "scripts") pod "1bcdd101-6966-4afe-86f0-47f3b7a524fe" (UID: "1bcdd101-6966-4afe-86f0-47f3b7a524fe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:58 crc kubenswrapper[4909]: I0202 10:53:58.028090 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bcdd101-6966-4afe-86f0-47f3b7a524fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1bcdd101-6966-4afe-86f0-47f3b7a524fe" (UID: "1bcdd101-6966-4afe-86f0-47f3b7a524fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:58 crc kubenswrapper[4909]: I0202 10:53:58.029011 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bcdd101-6966-4afe-86f0-47f3b7a524fe-config-data" (OuterVolumeSpecName: "config-data") pod "1bcdd101-6966-4afe-86f0-47f3b7a524fe" (UID: "1bcdd101-6966-4afe-86f0-47f3b7a524fe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:58 crc kubenswrapper[4909]: I0202 10:53:58.100622 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bcdd101-6966-4afe-86f0-47f3b7a524fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:58 crc kubenswrapper[4909]: I0202 10:53:58.100875 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hf2jj\" (UniqueName: \"kubernetes.io/projected/1bcdd101-6966-4afe-86f0-47f3b7a524fe-kube-api-access-hf2jj\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:58 crc kubenswrapper[4909]: I0202 10:53:58.100959 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bcdd101-6966-4afe-86f0-47f3b7a524fe-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:58 crc kubenswrapper[4909]: I0202 10:53:58.101078 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bcdd101-6966-4afe-86f0-47f3b7a524fe-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:58 crc kubenswrapper[4909]: I0202 10:53:58.400696 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-vnrcn" event={"ID":"1bcdd101-6966-4afe-86f0-47f3b7a524fe","Type":"ContainerDied","Data":"9a1c7f670d06e815b83ccc2f412fd65f6f1dd599c351830c3c30e31068f5cb1d"} Feb 02 10:53:58 crc kubenswrapper[4909]: I0202 10:53:58.401622 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a1c7f670d06e815b83ccc2f412fd65f6f1dd599c351830c3c30e31068f5cb1d" Feb 02 10:53:58 crc kubenswrapper[4909]: I0202 10:53:58.400791 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-vnrcn" Feb 02 10:53:58 crc kubenswrapper[4909]: I0202 10:53:58.558027 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:53:58 crc kubenswrapper[4909]: I0202 10:53:58.558332 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d858154b-d091-48f8-9473-69432c34a59e" containerName="nova-scheduler-scheduler" containerID="cri-o://2dcf2cc9e0c8372b0a019e8a342f3a071a1ec30e0e9f74be56e2ab8186436ca6" gracePeriod=30 Feb 02 10:53:58 crc kubenswrapper[4909]: I0202 10:53:58.566242 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:53:58 crc kubenswrapper[4909]: I0202 10:53:58.566522 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7d5726d8-d754-4fd9-8bbe-516b893e630f" containerName="nova-api-log" containerID="cri-o://bca93f3a9be7b71b2e94226184105e594927ebd30c315cc2a506305656bdfddd" gracePeriod=30 Feb 02 10:53:58 crc kubenswrapper[4909]: I0202 10:53:58.566596 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7d5726d8-d754-4fd9-8bbe-516b893e630f" containerName="nova-api-api" containerID="cri-o://d79baeb44cf570af1e013396f30b89c440846e39d6a0010b9fcd427912e7a29f" gracePeriod=30 Feb 02 10:53:58 crc kubenswrapper[4909]: I0202 10:53:58.596653 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:53:58 crc kubenswrapper[4909]: I0202 10:53:58.596931 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="411a3543-746a-4905-bd62-c712fa09daef" containerName="nova-metadata-log" containerID="cri-o://14bfa9457ecf535f5938e5e44eaccbb2e2a77aaae5daf4b985bf690007440756" gracePeriod=30 Feb 02 10:53:58 crc kubenswrapper[4909]: I0202 10:53:58.597062 4909 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="411a3543-746a-4905-bd62-c712fa09daef" containerName="nova-metadata-metadata" containerID="cri-o://39d6a6d692a45e86fe84cf72c598e783c0ac5b208077cc1b9bde3cbf876c90d6" gracePeriod=30 Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.176348 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.227087 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d5726d8-d754-4fd9-8bbe-516b893e630f-internal-tls-certs\") pod \"7d5726d8-d754-4fd9-8bbe-516b893e630f\" (UID: \"7d5726d8-d754-4fd9-8bbe-516b893e630f\") " Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.227186 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsxgt\" (UniqueName: \"kubernetes.io/projected/7d5726d8-d754-4fd9-8bbe-516b893e630f-kube-api-access-qsxgt\") pod \"7d5726d8-d754-4fd9-8bbe-516b893e630f\" (UID: \"7d5726d8-d754-4fd9-8bbe-516b893e630f\") " Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.227241 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d5726d8-d754-4fd9-8bbe-516b893e630f-config-data\") pod \"7d5726d8-d754-4fd9-8bbe-516b893e630f\" (UID: \"7d5726d8-d754-4fd9-8bbe-516b893e630f\") " Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.227295 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d5726d8-d754-4fd9-8bbe-516b893e630f-public-tls-certs\") pod \"7d5726d8-d754-4fd9-8bbe-516b893e630f\" (UID: \"7d5726d8-d754-4fd9-8bbe-516b893e630f\") " Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.227365 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d5726d8-d754-4fd9-8bbe-516b893e630f-combined-ca-bundle\") pod \"7d5726d8-d754-4fd9-8bbe-516b893e630f\" (UID: \"7d5726d8-d754-4fd9-8bbe-516b893e630f\") " Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.227389 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d5726d8-d754-4fd9-8bbe-516b893e630f-logs\") pod \"7d5726d8-d754-4fd9-8bbe-516b893e630f\" (UID: \"7d5726d8-d754-4fd9-8bbe-516b893e630f\") " Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.228275 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d5726d8-d754-4fd9-8bbe-516b893e630f-logs" (OuterVolumeSpecName: "logs") pod "7d5726d8-d754-4fd9-8bbe-516b893e630f" (UID: "7d5726d8-d754-4fd9-8bbe-516b893e630f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.245697 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d5726d8-d754-4fd9-8bbe-516b893e630f-kube-api-access-qsxgt" (OuterVolumeSpecName: "kube-api-access-qsxgt") pod "7d5726d8-d754-4fd9-8bbe-516b893e630f" (UID: "7d5726d8-d754-4fd9-8bbe-516b893e630f"). InnerVolumeSpecName "kube-api-access-qsxgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.288412 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d5726d8-d754-4fd9-8bbe-516b893e630f-config-data" (OuterVolumeSpecName: "config-data") pod "7d5726d8-d754-4fd9-8bbe-516b893e630f" (UID: "7d5726d8-d754-4fd9-8bbe-516b893e630f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.293126 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d5726d8-d754-4fd9-8bbe-516b893e630f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d5726d8-d754-4fd9-8bbe-516b893e630f" (UID: "7d5726d8-d754-4fd9-8bbe-516b893e630f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.318696 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d5726d8-d754-4fd9-8bbe-516b893e630f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7d5726d8-d754-4fd9-8bbe-516b893e630f" (UID: "7d5726d8-d754-4fd9-8bbe-516b893e630f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.331123 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d5726d8-d754-4fd9-8bbe-516b893e630f-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.331157 4909 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d5726d8-d754-4fd9-8bbe-516b893e630f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.331169 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d5726d8-d754-4fd9-8bbe-516b893e630f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.331177 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d5726d8-d754-4fd9-8bbe-516b893e630f-logs\") on node \"crc\" DevicePath \"\"" Feb 02 
10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.331185 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsxgt\" (UniqueName: \"kubernetes.io/projected/7d5726d8-d754-4fd9-8bbe-516b893e630f-kube-api-access-qsxgt\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.331703 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d5726d8-d754-4fd9-8bbe-516b893e630f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7d5726d8-d754-4fd9-8bbe-516b893e630f" (UID: "7d5726d8-d754-4fd9-8bbe-516b893e630f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:53:59 crc kubenswrapper[4909]: E0202 10:53:59.406185 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2dcf2cc9e0c8372b0a019e8a342f3a071a1ec30e0e9f74be56e2ab8186436ca6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 02 10:53:59 crc kubenswrapper[4909]: E0202 10:53:59.409168 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2dcf2cc9e0c8372b0a019e8a342f3a071a1ec30e0e9f74be56e2ab8186436ca6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 02 10:53:59 crc kubenswrapper[4909]: E0202 10:53:59.410688 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2dcf2cc9e0c8372b0a019e8a342f3a071a1ec30e0e9f74be56e2ab8186436ca6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 02 10:53:59 crc kubenswrapper[4909]: E0202 10:53:59.410735 4909 prober.go:104] "Probe errored" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="d858154b-d091-48f8-9473-69432c34a59e" containerName="nova-scheduler-scheduler" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.423520 4909 generic.go:334] "Generic (PLEG): container finished" podID="411a3543-746a-4905-bd62-c712fa09daef" containerID="14bfa9457ecf535f5938e5e44eaccbb2e2a77aaae5daf4b985bf690007440756" exitCode=143 Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.423593 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"411a3543-746a-4905-bd62-c712fa09daef","Type":"ContainerDied","Data":"14bfa9457ecf535f5938e5e44eaccbb2e2a77aaae5daf4b985bf690007440756"} Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.432900 4909 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d5726d8-d754-4fd9-8bbe-516b893e630f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.442387 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9a43fc6d-3442-4921-93bc-ef5ab2273a78","Type":"ContainerStarted","Data":"3bc7584cdea845ca454ea508cc0c6b2466da6ae49cb6266de80fbde89deebdb7"} Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.442674 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.446407 4909 generic.go:334] "Generic (PLEG): container finished" podID="7d5726d8-d754-4fd9-8bbe-516b893e630f" containerID="d79baeb44cf570af1e013396f30b89c440846e39d6a0010b9fcd427912e7a29f" exitCode=0 Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.446449 4909 generic.go:334] "Generic (PLEG): container finished" podID="7d5726d8-d754-4fd9-8bbe-516b893e630f" 
containerID="bca93f3a9be7b71b2e94226184105e594927ebd30c315cc2a506305656bdfddd" exitCode=143 Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.446482 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7d5726d8-d754-4fd9-8bbe-516b893e630f","Type":"ContainerDied","Data":"d79baeb44cf570af1e013396f30b89c440846e39d6a0010b9fcd427912e7a29f"} Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.446523 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7d5726d8-d754-4fd9-8bbe-516b893e630f","Type":"ContainerDied","Data":"bca93f3a9be7b71b2e94226184105e594927ebd30c315cc2a506305656bdfddd"} Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.446537 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7d5726d8-d754-4fd9-8bbe-516b893e630f","Type":"ContainerDied","Data":"029f1f9654cf42ee92e2f477d0cff841cee59d1feb16e772074051f2a9f457cd"} Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.446562 4909 scope.go:117] "RemoveContainer" containerID="d79baeb44cf570af1e013396f30b89c440846e39d6a0010b9fcd427912e7a29f" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.446747 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.473937 4909 scope.go:117] "RemoveContainer" containerID="bca93f3a9be7b71b2e94226184105e594927ebd30c315cc2a506305656bdfddd" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.474438 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.686952032 podStartE2EDuration="5.474398572s" podCreationTimestamp="2026-02-02 10:53:54 +0000 UTC" firstStartedPulling="2026-02-02 10:53:55.199076365 +0000 UTC m=+1360.945177100" lastFinishedPulling="2026-02-02 10:53:58.986522905 +0000 UTC m=+1364.732623640" observedRunningTime="2026-02-02 10:53:59.467668961 +0000 UTC m=+1365.213769716" watchObservedRunningTime="2026-02-02 10:53:59.474398572 +0000 UTC m=+1365.220499307" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.493903 4909 scope.go:117] "RemoveContainer" containerID="d79baeb44cf570af1e013396f30b89c440846e39d6a0010b9fcd427912e7a29f" Feb 02 10:53:59 crc kubenswrapper[4909]: E0202 10:53:59.494311 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d79baeb44cf570af1e013396f30b89c440846e39d6a0010b9fcd427912e7a29f\": container with ID starting with d79baeb44cf570af1e013396f30b89c440846e39d6a0010b9fcd427912e7a29f not found: ID does not exist" containerID="d79baeb44cf570af1e013396f30b89c440846e39d6a0010b9fcd427912e7a29f" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.494364 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d79baeb44cf570af1e013396f30b89c440846e39d6a0010b9fcd427912e7a29f"} err="failed to get container status \"d79baeb44cf570af1e013396f30b89c440846e39d6a0010b9fcd427912e7a29f\": rpc error: code = NotFound desc = could not find container \"d79baeb44cf570af1e013396f30b89c440846e39d6a0010b9fcd427912e7a29f\": container with ID starting with 
d79baeb44cf570af1e013396f30b89c440846e39d6a0010b9fcd427912e7a29f not found: ID does not exist" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.494395 4909 scope.go:117] "RemoveContainer" containerID="bca93f3a9be7b71b2e94226184105e594927ebd30c315cc2a506305656bdfddd" Feb 02 10:53:59 crc kubenswrapper[4909]: E0202 10:53:59.494573 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bca93f3a9be7b71b2e94226184105e594927ebd30c315cc2a506305656bdfddd\": container with ID starting with bca93f3a9be7b71b2e94226184105e594927ebd30c315cc2a506305656bdfddd not found: ID does not exist" containerID="bca93f3a9be7b71b2e94226184105e594927ebd30c315cc2a506305656bdfddd" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.494611 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bca93f3a9be7b71b2e94226184105e594927ebd30c315cc2a506305656bdfddd"} err="failed to get container status \"bca93f3a9be7b71b2e94226184105e594927ebd30c315cc2a506305656bdfddd\": rpc error: code = NotFound desc = could not find container \"bca93f3a9be7b71b2e94226184105e594927ebd30c315cc2a506305656bdfddd\": container with ID starting with bca93f3a9be7b71b2e94226184105e594927ebd30c315cc2a506305656bdfddd not found: ID does not exist" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.494626 4909 scope.go:117] "RemoveContainer" containerID="d79baeb44cf570af1e013396f30b89c440846e39d6a0010b9fcd427912e7a29f" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.494792 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d79baeb44cf570af1e013396f30b89c440846e39d6a0010b9fcd427912e7a29f"} err="failed to get container status \"d79baeb44cf570af1e013396f30b89c440846e39d6a0010b9fcd427912e7a29f\": rpc error: code = NotFound desc = could not find container \"d79baeb44cf570af1e013396f30b89c440846e39d6a0010b9fcd427912e7a29f\": container with ID 
starting with d79baeb44cf570af1e013396f30b89c440846e39d6a0010b9fcd427912e7a29f not found: ID does not exist" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.494868 4909 scope.go:117] "RemoveContainer" containerID="bca93f3a9be7b71b2e94226184105e594927ebd30c315cc2a506305656bdfddd" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.495174 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bca93f3a9be7b71b2e94226184105e594927ebd30c315cc2a506305656bdfddd"} err="failed to get container status \"bca93f3a9be7b71b2e94226184105e594927ebd30c315cc2a506305656bdfddd\": rpc error: code = NotFound desc = could not find container \"bca93f3a9be7b71b2e94226184105e594927ebd30c315cc2a506305656bdfddd\": container with ID starting with bca93f3a9be7b71b2e94226184105e594927ebd30c315cc2a506305656bdfddd not found: ID does not exist" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.512376 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.524277 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.533571 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 02 10:53:59 crc kubenswrapper[4909]: E0202 10:53:59.534030 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5726d8-d754-4fd9-8bbe-516b893e630f" containerName="nova-api-api" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.534051 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5726d8-d754-4fd9-8bbe-516b893e630f" containerName="nova-api-api" Feb 02 10:53:59 crc kubenswrapper[4909]: E0202 10:53:59.534077 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5726d8-d754-4fd9-8bbe-516b893e630f" containerName="nova-api-log" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.534084 4909 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="7d5726d8-d754-4fd9-8bbe-516b893e630f" containerName="nova-api-log" Feb 02 10:53:59 crc kubenswrapper[4909]: E0202 10:53:59.534103 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bcdd101-6966-4afe-86f0-47f3b7a524fe" containerName="nova-manage" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.534111 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bcdd101-6966-4afe-86f0-47f3b7a524fe" containerName="nova-manage" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.534284 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bcdd101-6966-4afe-86f0-47f3b7a524fe" containerName="nova-manage" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.534302 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d5726d8-d754-4fd9-8bbe-516b893e630f" containerName="nova-api-api" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.534320 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d5726d8-d754-4fd9-8bbe-516b893e630f" containerName="nova-api-log" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.535374 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.537572 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.537796 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.538287 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.545324 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.638551 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7202de6a-156c-4c06-9e08-3e62cfcf367e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7202de6a-156c-4c06-9e08-3e62cfcf367e\") " pod="openstack/nova-api-0" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.638842 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7202de6a-156c-4c06-9e08-3e62cfcf367e-public-tls-certs\") pod \"nova-api-0\" (UID: \"7202de6a-156c-4c06-9e08-3e62cfcf367e\") " pod="openstack/nova-api-0" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.638998 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7202de6a-156c-4c06-9e08-3e62cfcf367e-logs\") pod \"nova-api-0\" (UID: \"7202de6a-156c-4c06-9e08-3e62cfcf367e\") " pod="openstack/nova-api-0" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.639085 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7202de6a-156c-4c06-9e08-3e62cfcf367e-config-data\") pod \"nova-api-0\" (UID: \"7202de6a-156c-4c06-9e08-3e62cfcf367e\") " pod="openstack/nova-api-0" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.639164 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wckzf\" (UniqueName: \"kubernetes.io/projected/7202de6a-156c-4c06-9e08-3e62cfcf367e-kube-api-access-wckzf\") pod \"nova-api-0\" (UID: \"7202de6a-156c-4c06-9e08-3e62cfcf367e\") " pod="openstack/nova-api-0" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.639256 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7202de6a-156c-4c06-9e08-3e62cfcf367e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7202de6a-156c-4c06-9e08-3e62cfcf367e\") " pod="openstack/nova-api-0" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.740623 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7202de6a-156c-4c06-9e08-3e62cfcf367e-logs\") pod \"nova-api-0\" (UID: \"7202de6a-156c-4c06-9e08-3e62cfcf367e\") " pod="openstack/nova-api-0" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.740755 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7202de6a-156c-4c06-9e08-3e62cfcf367e-config-data\") pod \"nova-api-0\" (UID: \"7202de6a-156c-4c06-9e08-3e62cfcf367e\") " pod="openstack/nova-api-0" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.741154 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7202de6a-156c-4c06-9e08-3e62cfcf367e-logs\") pod \"nova-api-0\" (UID: \"7202de6a-156c-4c06-9e08-3e62cfcf367e\") " pod="openstack/nova-api-0" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.741521 
4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wckzf\" (UniqueName: \"kubernetes.io/projected/7202de6a-156c-4c06-9e08-3e62cfcf367e-kube-api-access-wckzf\") pod \"nova-api-0\" (UID: \"7202de6a-156c-4c06-9e08-3e62cfcf367e\") " pod="openstack/nova-api-0" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.741579 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7202de6a-156c-4c06-9e08-3e62cfcf367e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7202de6a-156c-4c06-9e08-3e62cfcf367e\") " pod="openstack/nova-api-0" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.741680 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7202de6a-156c-4c06-9e08-3e62cfcf367e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7202de6a-156c-4c06-9e08-3e62cfcf367e\") " pod="openstack/nova-api-0" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.741762 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7202de6a-156c-4c06-9e08-3e62cfcf367e-public-tls-certs\") pod \"nova-api-0\" (UID: \"7202de6a-156c-4c06-9e08-3e62cfcf367e\") " pod="openstack/nova-api-0" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.745563 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7202de6a-156c-4c06-9e08-3e62cfcf367e-config-data\") pod \"nova-api-0\" (UID: \"7202de6a-156c-4c06-9e08-3e62cfcf367e\") " pod="openstack/nova-api-0" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.745836 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7202de6a-156c-4c06-9e08-3e62cfcf367e-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"7202de6a-156c-4c06-9e08-3e62cfcf367e\") " pod="openstack/nova-api-0" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.746125 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7202de6a-156c-4c06-9e08-3e62cfcf367e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7202de6a-156c-4c06-9e08-3e62cfcf367e\") " pod="openstack/nova-api-0" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.747434 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7202de6a-156c-4c06-9e08-3e62cfcf367e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7202de6a-156c-4c06-9e08-3e62cfcf367e\") " pod="openstack/nova-api-0" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.759061 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wckzf\" (UniqueName: \"kubernetes.io/projected/7202de6a-156c-4c06-9e08-3e62cfcf367e-kube-api-access-wckzf\") pod \"nova-api-0\" (UID: \"7202de6a-156c-4c06-9e08-3e62cfcf367e\") " pod="openstack/nova-api-0" Feb 02 10:53:59 crc kubenswrapper[4909]: I0202 10:53:59.853395 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:54:00 crc kubenswrapper[4909]: I0202 10:54:00.291961 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:54:00 crc kubenswrapper[4909]: W0202 10:54:00.298914 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7202de6a_156c_4c06_9e08_3e62cfcf367e.slice/crio-6d0e811b8f50ead3e9e919e4fcc41893c00d5a1a8955f6a999108e5c8dc0292d WatchSource:0}: Error finding container 6d0e811b8f50ead3e9e919e4fcc41893c00d5a1a8955f6a999108e5c8dc0292d: Status 404 returned error can't find the container with id 6d0e811b8f50ead3e9e919e4fcc41893c00d5a1a8955f6a999108e5c8dc0292d Feb 02 10:54:00 crc kubenswrapper[4909]: I0202 10:54:00.455555 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7202de6a-156c-4c06-9e08-3e62cfcf367e","Type":"ContainerStarted","Data":"6d0e811b8f50ead3e9e919e4fcc41893c00d5a1a8955f6a999108e5c8dc0292d"} Feb 02 10:54:01 crc kubenswrapper[4909]: I0202 10:54:01.027096 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d5726d8-d754-4fd9-8bbe-516b893e630f" path="/var/lib/kubelet/pods/7d5726d8-d754-4fd9-8bbe-516b893e630f/volumes" Feb 02 10:54:01 crc kubenswrapper[4909]: I0202 10:54:01.467439 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7202de6a-156c-4c06-9e08-3e62cfcf367e","Type":"ContainerStarted","Data":"a20507244041d23b32f190d2a807a0eee356b71acfba50ced33b0ba77be75a3b"} Feb 02 10:54:01 crc kubenswrapper[4909]: I0202 10:54:01.468209 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7202de6a-156c-4c06-9e08-3e62cfcf367e","Type":"ContainerStarted","Data":"dfe3f218431d62cbf5802f23a9e8f8dfdc1b311556035b91ad8568910dc22027"} Feb 02 10:54:01 crc kubenswrapper[4909]: I0202 10:54:01.491316 4909 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.491300158 podStartE2EDuration="2.491300158s" podCreationTimestamp="2026-02-02 10:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:54:01.487914782 +0000 UTC m=+1367.234015517" watchObservedRunningTime="2026-02-02 10:54:01.491300158 +0000 UTC m=+1367.237400893" Feb 02 10:54:01 crc kubenswrapper[4909]: I0202 10:54:01.553244 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cvj82"] Feb 02 10:54:01 crc kubenswrapper[4909]: I0202 10:54:01.555412 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cvj82" Feb 02 10:54:01 crc kubenswrapper[4909]: I0202 10:54:01.566384 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cvj82"] Feb 02 10:54:01 crc kubenswrapper[4909]: I0202 10:54:01.687008 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swpzn\" (UniqueName: \"kubernetes.io/projected/ebeeac85-e454-4d63-87ff-8fa563bdf0ab-kube-api-access-swpzn\") pod \"redhat-marketplace-cvj82\" (UID: \"ebeeac85-e454-4d63-87ff-8fa563bdf0ab\") " pod="openshift-marketplace/redhat-marketplace-cvj82" Feb 02 10:54:01 crc kubenswrapper[4909]: I0202 10:54:01.687081 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebeeac85-e454-4d63-87ff-8fa563bdf0ab-utilities\") pod \"redhat-marketplace-cvj82\" (UID: \"ebeeac85-e454-4d63-87ff-8fa563bdf0ab\") " pod="openshift-marketplace/redhat-marketplace-cvj82" Feb 02 10:54:01 crc kubenswrapper[4909]: I0202 10:54:01.687227 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ebeeac85-e454-4d63-87ff-8fa563bdf0ab-catalog-content\") pod \"redhat-marketplace-cvj82\" (UID: \"ebeeac85-e454-4d63-87ff-8fa563bdf0ab\") " pod="openshift-marketplace/redhat-marketplace-cvj82" Feb 02 10:54:01 crc kubenswrapper[4909]: I0202 10:54:01.789387 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swpzn\" (UniqueName: \"kubernetes.io/projected/ebeeac85-e454-4d63-87ff-8fa563bdf0ab-kube-api-access-swpzn\") pod \"redhat-marketplace-cvj82\" (UID: \"ebeeac85-e454-4d63-87ff-8fa563bdf0ab\") " pod="openshift-marketplace/redhat-marketplace-cvj82" Feb 02 10:54:01 crc kubenswrapper[4909]: I0202 10:54:01.789458 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebeeac85-e454-4d63-87ff-8fa563bdf0ab-utilities\") pod \"redhat-marketplace-cvj82\" (UID: \"ebeeac85-e454-4d63-87ff-8fa563bdf0ab\") " pod="openshift-marketplace/redhat-marketplace-cvj82" Feb 02 10:54:01 crc kubenswrapper[4909]: I0202 10:54:01.789567 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebeeac85-e454-4d63-87ff-8fa563bdf0ab-catalog-content\") pod \"redhat-marketplace-cvj82\" (UID: \"ebeeac85-e454-4d63-87ff-8fa563bdf0ab\") " pod="openshift-marketplace/redhat-marketplace-cvj82" Feb 02 10:54:01 crc kubenswrapper[4909]: I0202 10:54:01.789914 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebeeac85-e454-4d63-87ff-8fa563bdf0ab-utilities\") pod \"redhat-marketplace-cvj82\" (UID: \"ebeeac85-e454-4d63-87ff-8fa563bdf0ab\") " pod="openshift-marketplace/redhat-marketplace-cvj82" Feb 02 10:54:01 crc kubenswrapper[4909]: I0202 10:54:01.790008 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ebeeac85-e454-4d63-87ff-8fa563bdf0ab-catalog-content\") pod \"redhat-marketplace-cvj82\" (UID: \"ebeeac85-e454-4d63-87ff-8fa563bdf0ab\") " pod="openshift-marketplace/redhat-marketplace-cvj82" Feb 02 10:54:01 crc kubenswrapper[4909]: I0202 10:54:01.817974 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swpzn\" (UniqueName: \"kubernetes.io/projected/ebeeac85-e454-4d63-87ff-8fa563bdf0ab-kube-api-access-swpzn\") pod \"redhat-marketplace-cvj82\" (UID: \"ebeeac85-e454-4d63-87ff-8fa563bdf0ab\") " pod="openshift-marketplace/redhat-marketplace-cvj82" Feb 02 10:54:01 crc kubenswrapper[4909]: I0202 10:54:01.872844 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cvj82" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.285220 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.399363 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9b44m\" (UniqueName: \"kubernetes.io/projected/411a3543-746a-4905-bd62-c712fa09daef-kube-api-access-9b44m\") pod \"411a3543-746a-4905-bd62-c712fa09daef\" (UID: \"411a3543-746a-4905-bd62-c712fa09daef\") " Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.399491 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/411a3543-746a-4905-bd62-c712fa09daef-config-data\") pod \"411a3543-746a-4905-bd62-c712fa09daef\" (UID: \"411a3543-746a-4905-bd62-c712fa09daef\") " Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.399632 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/411a3543-746a-4905-bd62-c712fa09daef-nova-metadata-tls-certs\") pod 
\"411a3543-746a-4905-bd62-c712fa09daef\" (UID: \"411a3543-746a-4905-bd62-c712fa09daef\") " Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.399713 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/411a3543-746a-4905-bd62-c712fa09daef-logs\") pod \"411a3543-746a-4905-bd62-c712fa09daef\" (UID: \"411a3543-746a-4905-bd62-c712fa09daef\") " Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.399763 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/411a3543-746a-4905-bd62-c712fa09daef-combined-ca-bundle\") pod \"411a3543-746a-4905-bd62-c712fa09daef\" (UID: \"411a3543-746a-4905-bd62-c712fa09daef\") " Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.400097 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cvj82"] Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.400599 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/411a3543-746a-4905-bd62-c712fa09daef-logs" (OuterVolumeSpecName: "logs") pod "411a3543-746a-4905-bd62-c712fa09daef" (UID: "411a3543-746a-4905-bd62-c712fa09daef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.408690 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/411a3543-746a-4905-bd62-c712fa09daef-kube-api-access-9b44m" (OuterVolumeSpecName: "kube-api-access-9b44m") pod "411a3543-746a-4905-bd62-c712fa09daef" (UID: "411a3543-746a-4905-bd62-c712fa09daef"). InnerVolumeSpecName "kube-api-access-9b44m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.433982 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/411a3543-746a-4905-bd62-c712fa09daef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "411a3543-746a-4905-bd62-c712fa09daef" (UID: "411a3543-746a-4905-bd62-c712fa09daef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.455315 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/411a3543-746a-4905-bd62-c712fa09daef-config-data" (OuterVolumeSpecName: "config-data") pod "411a3543-746a-4905-bd62-c712fa09daef" (UID: "411a3543-746a-4905-bd62-c712fa09daef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.457603 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/411a3543-746a-4905-bd62-c712fa09daef-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "411a3543-746a-4905-bd62-c712fa09daef" (UID: "411a3543-746a-4905-bd62-c712fa09daef"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.482185 4909 generic.go:334] "Generic (PLEG): container finished" podID="411a3543-746a-4905-bd62-c712fa09daef" containerID="39d6a6d692a45e86fe84cf72c598e783c0ac5b208077cc1b9bde3cbf876c90d6" exitCode=0 Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.482260 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"411a3543-746a-4905-bd62-c712fa09daef","Type":"ContainerDied","Data":"39d6a6d692a45e86fe84cf72c598e783c0ac5b208077cc1b9bde3cbf876c90d6"} Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.482287 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"411a3543-746a-4905-bd62-c712fa09daef","Type":"ContainerDied","Data":"9aea5850e7f172549c73eaf0a4fe548e419f737812cb382c8f4b06645b7254a3"} Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.482305 4909 scope.go:117] "RemoveContainer" containerID="39d6a6d692a45e86fe84cf72c598e783c0ac5b208077cc1b9bde3cbf876c90d6" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.483431 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cvj82" event={"ID":"ebeeac85-e454-4d63-87ff-8fa563bdf0ab","Type":"ContainerStarted","Data":"4e808ec92338871bd7c97ffa6f3d70839de29a68e3496453327b7c6f0e60da96"} Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.483687 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.501645 4909 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/411a3543-746a-4905-bd62-c712fa09daef-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.501686 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/411a3543-746a-4905-bd62-c712fa09daef-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.501701 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/411a3543-746a-4905-bd62-c712fa09daef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.501712 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9b44m\" (UniqueName: \"kubernetes.io/projected/411a3543-746a-4905-bd62-c712fa09daef-kube-api-access-9b44m\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.501722 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/411a3543-746a-4905-bd62-c712fa09daef-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.553400 4909 scope.go:117] "RemoveContainer" containerID="14bfa9457ecf535f5938e5e44eaccbb2e2a77aaae5daf4b985bf690007440756" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.579291 4909 scope.go:117] "RemoveContainer" containerID="39d6a6d692a45e86fe84cf72c598e783c0ac5b208077cc1b9bde3cbf876c90d6" Feb 02 10:54:02 crc kubenswrapper[4909]: E0202 10:54:02.579897 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"39d6a6d692a45e86fe84cf72c598e783c0ac5b208077cc1b9bde3cbf876c90d6\": container with ID starting with 39d6a6d692a45e86fe84cf72c598e783c0ac5b208077cc1b9bde3cbf876c90d6 not found: ID does not exist" containerID="39d6a6d692a45e86fe84cf72c598e783c0ac5b208077cc1b9bde3cbf876c90d6" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.579961 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39d6a6d692a45e86fe84cf72c598e783c0ac5b208077cc1b9bde3cbf876c90d6"} err="failed to get container status \"39d6a6d692a45e86fe84cf72c598e783c0ac5b208077cc1b9bde3cbf876c90d6\": rpc error: code = NotFound desc = could not find container \"39d6a6d692a45e86fe84cf72c598e783c0ac5b208077cc1b9bde3cbf876c90d6\": container with ID starting with 39d6a6d692a45e86fe84cf72c598e783c0ac5b208077cc1b9bde3cbf876c90d6 not found: ID does not exist" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.579994 4909 scope.go:117] "RemoveContainer" containerID="14bfa9457ecf535f5938e5e44eaccbb2e2a77aaae5daf4b985bf690007440756" Feb 02 10:54:02 crc kubenswrapper[4909]: E0202 10:54:02.580281 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14bfa9457ecf535f5938e5e44eaccbb2e2a77aaae5daf4b985bf690007440756\": container with ID starting with 14bfa9457ecf535f5938e5e44eaccbb2e2a77aaae5daf4b985bf690007440756 not found: ID does not exist" containerID="14bfa9457ecf535f5938e5e44eaccbb2e2a77aaae5daf4b985bf690007440756" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.580312 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14bfa9457ecf535f5938e5e44eaccbb2e2a77aaae5daf4b985bf690007440756"} err="failed to get container status \"14bfa9457ecf535f5938e5e44eaccbb2e2a77aaae5daf4b985bf690007440756\": rpc error: code = NotFound desc = could not find container \"14bfa9457ecf535f5938e5e44eaccbb2e2a77aaae5daf4b985bf690007440756\": container with ID 
starting with 14bfa9457ecf535f5938e5e44eaccbb2e2a77aaae5daf4b985bf690007440756 not found: ID does not exist" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.589664 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.605168 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.617955 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:54:02 crc kubenswrapper[4909]: E0202 10:54:02.618455 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="411a3543-746a-4905-bd62-c712fa09daef" containerName="nova-metadata-metadata" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.618477 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="411a3543-746a-4905-bd62-c712fa09daef" containerName="nova-metadata-metadata" Feb 02 10:54:02 crc kubenswrapper[4909]: E0202 10:54:02.618491 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="411a3543-746a-4905-bd62-c712fa09daef" containerName="nova-metadata-log" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.618499 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="411a3543-746a-4905-bd62-c712fa09daef" containerName="nova-metadata-log" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.618726 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="411a3543-746a-4905-bd62-c712fa09daef" containerName="nova-metadata-log" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.618752 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="411a3543-746a-4905-bd62-c712fa09daef" containerName="nova-metadata-metadata" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.620039 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.622904 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.623105 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.630609 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.704861 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f0cf9e9-b663-4be3-a435-c7dd6deea228-config-data\") pod \"nova-metadata-0\" (UID: \"7f0cf9e9-b663-4be3-a435-c7dd6deea228\") " pod="openstack/nova-metadata-0" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.705208 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f0cf9e9-b663-4be3-a435-c7dd6deea228-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7f0cf9e9-b663-4be3-a435-c7dd6deea228\") " pod="openstack/nova-metadata-0" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.705244 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f0cf9e9-b663-4be3-a435-c7dd6deea228-logs\") pod \"nova-metadata-0\" (UID: \"7f0cf9e9-b663-4be3-a435-c7dd6deea228\") " pod="openstack/nova-metadata-0" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.705283 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f0cf9e9-b663-4be3-a435-c7dd6deea228-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"7f0cf9e9-b663-4be3-a435-c7dd6deea228\") " pod="openstack/nova-metadata-0" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.705332 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5v9j\" (UniqueName: \"kubernetes.io/projected/7f0cf9e9-b663-4be3-a435-c7dd6deea228-kube-api-access-x5v9j\") pod \"nova-metadata-0\" (UID: \"7f0cf9e9-b663-4be3-a435-c7dd6deea228\") " pod="openstack/nova-metadata-0" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.806876 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f0cf9e9-b663-4be3-a435-c7dd6deea228-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7f0cf9e9-b663-4be3-a435-c7dd6deea228\") " pod="openstack/nova-metadata-0" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.806936 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f0cf9e9-b663-4be3-a435-c7dd6deea228-logs\") pod \"nova-metadata-0\" (UID: \"7f0cf9e9-b663-4be3-a435-c7dd6deea228\") " pod="openstack/nova-metadata-0" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.806974 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f0cf9e9-b663-4be3-a435-c7dd6deea228-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7f0cf9e9-b663-4be3-a435-c7dd6deea228\") " pod="openstack/nova-metadata-0" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.807024 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5v9j\" (UniqueName: \"kubernetes.io/projected/7f0cf9e9-b663-4be3-a435-c7dd6deea228-kube-api-access-x5v9j\") pod \"nova-metadata-0\" (UID: \"7f0cf9e9-b663-4be3-a435-c7dd6deea228\") " pod="openstack/nova-metadata-0" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 
10:54:02.807083 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f0cf9e9-b663-4be3-a435-c7dd6deea228-config-data\") pod \"nova-metadata-0\" (UID: \"7f0cf9e9-b663-4be3-a435-c7dd6deea228\") " pod="openstack/nova-metadata-0" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.808093 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f0cf9e9-b663-4be3-a435-c7dd6deea228-logs\") pod \"nova-metadata-0\" (UID: \"7f0cf9e9-b663-4be3-a435-c7dd6deea228\") " pod="openstack/nova-metadata-0" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.810964 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f0cf9e9-b663-4be3-a435-c7dd6deea228-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7f0cf9e9-b663-4be3-a435-c7dd6deea228\") " pod="openstack/nova-metadata-0" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.811264 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f0cf9e9-b663-4be3-a435-c7dd6deea228-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7f0cf9e9-b663-4be3-a435-c7dd6deea228\") " pod="openstack/nova-metadata-0" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.811757 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f0cf9e9-b663-4be3-a435-c7dd6deea228-config-data\") pod \"nova-metadata-0\" (UID: \"7f0cf9e9-b663-4be3-a435-c7dd6deea228\") " pod="openstack/nova-metadata-0" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.824969 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5v9j\" (UniqueName: \"kubernetes.io/projected/7f0cf9e9-b663-4be3-a435-c7dd6deea228-kube-api-access-x5v9j\") pod 
\"nova-metadata-0\" (UID: \"7f0cf9e9-b663-4be3-a435-c7dd6deea228\") " pod="openstack/nova-metadata-0" Feb 02 10:54:02 crc kubenswrapper[4909]: I0202 10:54:02.941936 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:54:03 crc kubenswrapper[4909]: I0202 10:54:03.034584 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="411a3543-746a-4905-bd62-c712fa09daef" path="/var/lib/kubelet/pods/411a3543-746a-4905-bd62-c712fa09daef/volumes" Feb 02 10:54:03 crc kubenswrapper[4909]: I0202 10:54:03.370035 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:54:03 crc kubenswrapper[4909]: I0202 10:54:03.494840 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f0cf9e9-b663-4be3-a435-c7dd6deea228","Type":"ContainerStarted","Data":"616e8f784384a3a669ea27e5b62f8fac3c4e970a6e41b551be916f38b10f6f48"} Feb 02 10:54:03 crc kubenswrapper[4909]: I0202 10:54:03.500307 4909 generic.go:334] "Generic (PLEG): container finished" podID="ebeeac85-e454-4d63-87ff-8fa563bdf0ab" containerID="c006872b5b120890b2fd310bf3278a6ba680f03e5f7c324a0fb82c0a69c828e7" exitCode=0 Feb 02 10:54:03 crc kubenswrapper[4909]: I0202 10:54:03.500350 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cvj82" event={"ID":"ebeeac85-e454-4d63-87ff-8fa563bdf0ab","Type":"ContainerDied","Data":"c006872b5b120890b2fd310bf3278a6ba680f03e5f7c324a0fb82c0a69c828e7"} Feb 02 10:54:03 crc kubenswrapper[4909]: I0202 10:54:03.506221 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 10:54:03 crc kubenswrapper[4909]: I0202 10:54:03.852512 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:54:03 crc kubenswrapper[4909]: I0202 10:54:03.932275 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d858154b-d091-48f8-9473-69432c34a59e-combined-ca-bundle\") pod \"d858154b-d091-48f8-9473-69432c34a59e\" (UID: \"d858154b-d091-48f8-9473-69432c34a59e\") " Feb 02 10:54:03 crc kubenswrapper[4909]: I0202 10:54:03.932368 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d858154b-d091-48f8-9473-69432c34a59e-config-data\") pod \"d858154b-d091-48f8-9473-69432c34a59e\" (UID: \"d858154b-d091-48f8-9473-69432c34a59e\") " Feb 02 10:54:03 crc kubenswrapper[4909]: I0202 10:54:03.932519 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbnjf\" (UniqueName: \"kubernetes.io/projected/d858154b-d091-48f8-9473-69432c34a59e-kube-api-access-dbnjf\") pod \"d858154b-d091-48f8-9473-69432c34a59e\" (UID: \"d858154b-d091-48f8-9473-69432c34a59e\") " Feb 02 10:54:03 crc kubenswrapper[4909]: I0202 10:54:03.939148 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d858154b-d091-48f8-9473-69432c34a59e-kube-api-access-dbnjf" (OuterVolumeSpecName: "kube-api-access-dbnjf") pod "d858154b-d091-48f8-9473-69432c34a59e" (UID: "d858154b-d091-48f8-9473-69432c34a59e"). InnerVolumeSpecName "kube-api-access-dbnjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:03 crc kubenswrapper[4909]: I0202 10:54:03.962635 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d858154b-d091-48f8-9473-69432c34a59e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d858154b-d091-48f8-9473-69432c34a59e" (UID: "d858154b-d091-48f8-9473-69432c34a59e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:03 crc kubenswrapper[4909]: I0202 10:54:03.965970 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d858154b-d091-48f8-9473-69432c34a59e-config-data" (OuterVolumeSpecName: "config-data") pod "d858154b-d091-48f8-9473-69432c34a59e" (UID: "d858154b-d091-48f8-9473-69432c34a59e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.034683 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbnjf\" (UniqueName: \"kubernetes.io/projected/d858154b-d091-48f8-9473-69432c34a59e-kube-api-access-dbnjf\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.034724 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d858154b-d091-48f8-9473-69432c34a59e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.034735 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d858154b-d091-48f8-9473-69432c34a59e-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.509456 4909 generic.go:334] "Generic (PLEG): container finished" podID="ebeeac85-e454-4d63-87ff-8fa563bdf0ab" containerID="bf1045192c3e997f0f9de8391cb5599145ec7dc36e9ce10e1fe9dea953fe3a6a" exitCode=0 Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.509509 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cvj82" event={"ID":"ebeeac85-e454-4d63-87ff-8fa563bdf0ab","Type":"ContainerDied","Data":"bf1045192c3e997f0f9de8391cb5599145ec7dc36e9ce10e1fe9dea953fe3a6a"} Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.513507 4909 generic.go:334] "Generic (PLEG): container finished" 
podID="d858154b-d091-48f8-9473-69432c34a59e" containerID="2dcf2cc9e0c8372b0a019e8a342f3a071a1ec30e0e9f74be56e2ab8186436ca6" exitCode=0 Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.513565 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.513555 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d858154b-d091-48f8-9473-69432c34a59e","Type":"ContainerDied","Data":"2dcf2cc9e0c8372b0a019e8a342f3a071a1ec30e0e9f74be56e2ab8186436ca6"} Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.513754 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d858154b-d091-48f8-9473-69432c34a59e","Type":"ContainerDied","Data":"5b433b7eb3a6546d1f40baa527418cb0f6b8492fe2fb2cbe12bc9dd86afafa65"} Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.513792 4909 scope.go:117] "RemoveContainer" containerID="2dcf2cc9e0c8372b0a019e8a342f3a071a1ec30e0e9f74be56e2ab8186436ca6" Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.517525 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f0cf9e9-b663-4be3-a435-c7dd6deea228","Type":"ContainerStarted","Data":"9146b7fe892e3b1b3b9341aaa935f843c8006e1d193949290905af61bbfd863a"} Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.517559 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f0cf9e9-b663-4be3-a435-c7dd6deea228","Type":"ContainerStarted","Data":"1a3c44ae75c0e82a41ac845e6bc75bd78044db6d50d81a3cf25783da569108b8"} Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.540149 4909 scope.go:117] "RemoveContainer" containerID="2dcf2cc9e0c8372b0a019e8a342f3a071a1ec30e0e9f74be56e2ab8186436ca6" Feb 02 10:54:04 crc kubenswrapper[4909]: E0202 10:54:04.540764 4909 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"2dcf2cc9e0c8372b0a019e8a342f3a071a1ec30e0e9f74be56e2ab8186436ca6\": container with ID starting with 2dcf2cc9e0c8372b0a019e8a342f3a071a1ec30e0e9f74be56e2ab8186436ca6 not found: ID does not exist" containerID="2dcf2cc9e0c8372b0a019e8a342f3a071a1ec30e0e9f74be56e2ab8186436ca6" Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.540878 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dcf2cc9e0c8372b0a019e8a342f3a071a1ec30e0e9f74be56e2ab8186436ca6"} err="failed to get container status \"2dcf2cc9e0c8372b0a019e8a342f3a071a1ec30e0e9f74be56e2ab8186436ca6\": rpc error: code = NotFound desc = could not find container \"2dcf2cc9e0c8372b0a019e8a342f3a071a1ec30e0e9f74be56e2ab8186436ca6\": container with ID starting with 2dcf2cc9e0c8372b0a019e8a342f3a071a1ec30e0e9f74be56e2ab8186436ca6 not found: ID does not exist" Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.556403 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.586762 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.595695 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:54:04 crc kubenswrapper[4909]: E0202 10:54:04.596132 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d858154b-d091-48f8-9473-69432c34a59e" containerName="nova-scheduler-scheduler" Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.596149 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d858154b-d091-48f8-9473-69432c34a59e" containerName="nova-scheduler-scheduler" Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.596352 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d858154b-d091-48f8-9473-69432c34a59e" 
containerName="nova-scheduler-scheduler" Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.596990 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.599416 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.607930 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.615154 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.615134767 podStartE2EDuration="2.615134767s" podCreationTimestamp="2026-02-02 10:54:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:54:04.577468387 +0000 UTC m=+1370.323569162" watchObservedRunningTime="2026-02-02 10:54:04.615134767 +0000 UTC m=+1370.361235502" Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.648633 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba559652-0584-49c5-91d6-7d7fdd596dc2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ba559652-0584-49c5-91d6-7d7fdd596dc2\") " pod="openstack/nova-scheduler-0" Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.648773 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58h4b\" (UniqueName: \"kubernetes.io/projected/ba559652-0584-49c5-91d6-7d7fdd596dc2-kube-api-access-58h4b\") pod \"nova-scheduler-0\" (UID: \"ba559652-0584-49c5-91d6-7d7fdd596dc2\") " pod="openstack/nova-scheduler-0" Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.648948 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba559652-0584-49c5-91d6-7d7fdd596dc2-config-data\") pod \"nova-scheduler-0\" (UID: \"ba559652-0584-49c5-91d6-7d7fdd596dc2\") " pod="openstack/nova-scheduler-0" Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.750262 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba559652-0584-49c5-91d6-7d7fdd596dc2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ba559652-0584-49c5-91d6-7d7fdd596dc2\") " pod="openstack/nova-scheduler-0" Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.750383 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58h4b\" (UniqueName: \"kubernetes.io/projected/ba559652-0584-49c5-91d6-7d7fdd596dc2-kube-api-access-58h4b\") pod \"nova-scheduler-0\" (UID: \"ba559652-0584-49c5-91d6-7d7fdd596dc2\") " pod="openstack/nova-scheduler-0" Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.750408 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba559652-0584-49c5-91d6-7d7fdd596dc2-config-data\") pod \"nova-scheduler-0\" (UID: \"ba559652-0584-49c5-91d6-7d7fdd596dc2\") " pod="openstack/nova-scheduler-0" Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.754051 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba559652-0584-49c5-91d6-7d7fdd596dc2-config-data\") pod \"nova-scheduler-0\" (UID: \"ba559652-0584-49c5-91d6-7d7fdd596dc2\") " pod="openstack/nova-scheduler-0" Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.754768 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba559652-0584-49c5-91d6-7d7fdd596dc2-combined-ca-bundle\") pod 
\"nova-scheduler-0\" (UID: \"ba559652-0584-49c5-91d6-7d7fdd596dc2\") " pod="openstack/nova-scheduler-0" Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.765397 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58h4b\" (UniqueName: \"kubernetes.io/projected/ba559652-0584-49c5-91d6-7d7fdd596dc2-kube-api-access-58h4b\") pod \"nova-scheduler-0\" (UID: \"ba559652-0584-49c5-91d6-7d7fdd596dc2\") " pod="openstack/nova-scheduler-0" Feb 02 10:54:04 crc kubenswrapper[4909]: I0202 10:54:04.922074 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:54:05 crc kubenswrapper[4909]: I0202 10:54:05.063063 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d858154b-d091-48f8-9473-69432c34a59e" path="/var/lib/kubelet/pods/d858154b-d091-48f8-9473-69432c34a59e/volumes" Feb 02 10:54:05 crc kubenswrapper[4909]: I0202 10:54:05.410842 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:54:05 crc kubenswrapper[4909]: W0202 10:54:05.414236 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba559652_0584_49c5_91d6_7d7fdd596dc2.slice/crio-f6edd9044a4aea805102e06530f56978ae8028bdfcf591965857f26a4ce79788 WatchSource:0}: Error finding container f6edd9044a4aea805102e06530f56978ae8028bdfcf591965857f26a4ce79788: Status 404 returned error can't find the container with id f6edd9044a4aea805102e06530f56978ae8028bdfcf591965857f26a4ce79788 Feb 02 10:54:05 crc kubenswrapper[4909]: I0202 10:54:05.528043 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cvj82" event={"ID":"ebeeac85-e454-4d63-87ff-8fa563bdf0ab","Type":"ContainerStarted","Data":"80b546f8a562b9bef0620a72cf0a964de843e3aeba6308efea6036190f488ae3"} Feb 02 10:54:05 crc kubenswrapper[4909]: I0202 10:54:05.530671 4909 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ba559652-0584-49c5-91d6-7d7fdd596dc2","Type":"ContainerStarted","Data":"f6edd9044a4aea805102e06530f56978ae8028bdfcf591965857f26a4ce79788"} Feb 02 10:54:05 crc kubenswrapper[4909]: I0202 10:54:05.551862 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cvj82" podStartSLOduration=3.161110461 podStartE2EDuration="4.55183695s" podCreationTimestamp="2026-02-02 10:54:01 +0000 UTC" firstStartedPulling="2026-02-02 10:54:03.505748855 +0000 UTC m=+1369.251849590" lastFinishedPulling="2026-02-02 10:54:04.896475344 +0000 UTC m=+1370.642576079" observedRunningTime="2026-02-02 10:54:05.54266111 +0000 UTC m=+1371.288761845" watchObservedRunningTime="2026-02-02 10:54:05.55183695 +0000 UTC m=+1371.297937685" Feb 02 10:54:06 crc kubenswrapper[4909]: I0202 10:54:06.543379 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ba559652-0584-49c5-91d6-7d7fdd596dc2","Type":"ContainerStarted","Data":"7f5547c30d19b6a3e624654a6b30546acaaa5bb4e35836a31ce664cbd4d3448e"} Feb 02 10:54:06 crc kubenswrapper[4909]: I0202 10:54:06.568855 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.568837967 podStartE2EDuration="2.568837967s" podCreationTimestamp="2026-02-02 10:54:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:54:06.561303983 +0000 UTC m=+1372.307404748" watchObservedRunningTime="2026-02-02 10:54:06.568837967 +0000 UTC m=+1372.314938702" Feb 02 10:54:07 crc kubenswrapper[4909]: I0202 10:54:07.942401 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 02 10:54:07 crc kubenswrapper[4909]: I0202 10:54:07.942738 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/nova-metadata-0" Feb 02 10:54:09 crc kubenswrapper[4909]: I0202 10:54:09.854571 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 10:54:09 crc kubenswrapper[4909]: I0202 10:54:09.855171 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 10:54:09 crc kubenswrapper[4909]: I0202 10:54:09.923204 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 02 10:54:10 crc kubenswrapper[4909]: I0202 10:54:10.867109 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7202de6a-156c-4c06-9e08-3e62cfcf367e" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 02 10:54:10 crc kubenswrapper[4909]: I0202 10:54:10.867134 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7202de6a-156c-4c06-9e08-3e62cfcf367e" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 02 10:54:11 crc kubenswrapper[4909]: I0202 10:54:11.873356 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cvj82" Feb 02 10:54:11 crc kubenswrapper[4909]: I0202 10:54:11.873721 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cvj82" Feb 02 10:54:11 crc kubenswrapper[4909]: I0202 10:54:11.917124 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cvj82" Feb 02 10:54:12 crc kubenswrapper[4909]: I0202 10:54:12.644988 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-cvj82" Feb 02 10:54:12 crc kubenswrapper[4909]: I0202 10:54:12.712493 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cvj82"] Feb 02 10:54:12 crc kubenswrapper[4909]: I0202 10:54:12.942118 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 02 10:54:12 crc kubenswrapper[4909]: I0202 10:54:12.942190 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 02 10:54:13 crc kubenswrapper[4909]: I0202 10:54:13.954970 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7f0cf9e9-b663-4be3-a435-c7dd6deea228" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 02 10:54:13 crc kubenswrapper[4909]: I0202 10:54:13.954975 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7f0cf9e9-b663-4be3-a435-c7dd6deea228" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 02 10:54:14 crc kubenswrapper[4909]: I0202 10:54:14.614625 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cvj82" podUID="ebeeac85-e454-4d63-87ff-8fa563bdf0ab" containerName="registry-server" containerID="cri-o://80b546f8a562b9bef0620a72cf0a964de843e3aeba6308efea6036190f488ae3" gracePeriod=2 Feb 02 10:54:15 crc kubenswrapper[4909]: I0202 10:54:14.924123 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 02 10:54:15 crc kubenswrapper[4909]: I0202 10:54:14.957194 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-scheduler-0"
Feb 02 10:54:15 crc kubenswrapper[4909]: I0202 10:54:15.078540 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cvj82"
Feb 02 10:54:15 crc kubenswrapper[4909]: I0202 10:54:15.146154 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebeeac85-e454-4d63-87ff-8fa563bdf0ab-catalog-content\") pod \"ebeeac85-e454-4d63-87ff-8fa563bdf0ab\" (UID: \"ebeeac85-e454-4d63-87ff-8fa563bdf0ab\") "
Feb 02 10:54:15 crc kubenswrapper[4909]: I0202 10:54:15.146226 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebeeac85-e454-4d63-87ff-8fa563bdf0ab-utilities\") pod \"ebeeac85-e454-4d63-87ff-8fa563bdf0ab\" (UID: \"ebeeac85-e454-4d63-87ff-8fa563bdf0ab\") "
Feb 02 10:54:15 crc kubenswrapper[4909]: I0202 10:54:15.146261 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swpzn\" (UniqueName: \"kubernetes.io/projected/ebeeac85-e454-4d63-87ff-8fa563bdf0ab-kube-api-access-swpzn\") pod \"ebeeac85-e454-4d63-87ff-8fa563bdf0ab\" (UID: \"ebeeac85-e454-4d63-87ff-8fa563bdf0ab\") "
Feb 02 10:54:15 crc kubenswrapper[4909]: I0202 10:54:15.146962 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebeeac85-e454-4d63-87ff-8fa563bdf0ab-utilities" (OuterVolumeSpecName: "utilities") pod "ebeeac85-e454-4d63-87ff-8fa563bdf0ab" (UID: "ebeeac85-e454-4d63-87ff-8fa563bdf0ab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:54:15 crc kubenswrapper[4909]: I0202 10:54:15.152451 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebeeac85-e454-4d63-87ff-8fa563bdf0ab-kube-api-access-swpzn" (OuterVolumeSpecName: "kube-api-access-swpzn") pod "ebeeac85-e454-4d63-87ff-8fa563bdf0ab" (UID: "ebeeac85-e454-4d63-87ff-8fa563bdf0ab"). InnerVolumeSpecName "kube-api-access-swpzn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 10:54:15 crc kubenswrapper[4909]: I0202 10:54:15.169529 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebeeac85-e454-4d63-87ff-8fa563bdf0ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebeeac85-e454-4d63-87ff-8fa563bdf0ab" (UID: "ebeeac85-e454-4d63-87ff-8fa563bdf0ab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 10:54:15 crc kubenswrapper[4909]: I0202 10:54:15.248817 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebeeac85-e454-4d63-87ff-8fa563bdf0ab-utilities\") on node \"crc\" DevicePath \"\""
Feb 02 10:54:15 crc kubenswrapper[4909]: I0202 10:54:15.248852 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swpzn\" (UniqueName: \"kubernetes.io/projected/ebeeac85-e454-4d63-87ff-8fa563bdf0ab-kube-api-access-swpzn\") on node \"crc\" DevicePath \"\""
Feb 02 10:54:15 crc kubenswrapper[4909]: I0202 10:54:15.248862 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebeeac85-e454-4d63-87ff-8fa563bdf0ab-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 02 10:54:15 crc kubenswrapper[4909]: I0202 10:54:15.629439 4909 generic.go:334] "Generic (PLEG): container finished" podID="ebeeac85-e454-4d63-87ff-8fa563bdf0ab" containerID="80b546f8a562b9bef0620a72cf0a964de843e3aeba6308efea6036190f488ae3" exitCode=0
Feb 02 10:54:15 crc kubenswrapper[4909]: I0202 10:54:15.629551 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cvj82"
Feb 02 10:54:15 crc kubenswrapper[4909]: I0202 10:54:15.629573 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cvj82" event={"ID":"ebeeac85-e454-4d63-87ff-8fa563bdf0ab","Type":"ContainerDied","Data":"80b546f8a562b9bef0620a72cf0a964de843e3aeba6308efea6036190f488ae3"}
Feb 02 10:54:15 crc kubenswrapper[4909]: I0202 10:54:15.629625 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cvj82" event={"ID":"ebeeac85-e454-4d63-87ff-8fa563bdf0ab","Type":"ContainerDied","Data":"4e808ec92338871bd7c97ffa6f3d70839de29a68e3496453327b7c6f0e60da96"}
Feb 02 10:54:15 crc kubenswrapper[4909]: I0202 10:54:15.629650 4909 scope.go:117] "RemoveContainer" containerID="80b546f8a562b9bef0620a72cf0a964de843e3aeba6308efea6036190f488ae3"
Feb 02 10:54:15 crc kubenswrapper[4909]: I0202 10:54:15.658101 4909 scope.go:117] "RemoveContainer" containerID="bf1045192c3e997f0f9de8391cb5599145ec7dc36e9ce10e1fe9dea953fe3a6a"
Feb 02 10:54:15 crc kubenswrapper[4909]: I0202 10:54:15.681448 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 02 10:54:15 crc kubenswrapper[4909]: I0202 10:54:15.687213 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cvj82"]
Feb 02 10:54:15 crc kubenswrapper[4909]: I0202 10:54:15.698672 4909 scope.go:117] "RemoveContainer" containerID="c006872b5b120890b2fd310bf3278a6ba680f03e5f7c324a0fb82c0a69c828e7"
Feb 02 10:54:15 crc kubenswrapper[4909]: I0202 10:54:15.699342 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cvj82"]
Feb 02 10:54:15 crc kubenswrapper[4909]: I0202 10:54:15.739370 4909 scope.go:117] "RemoveContainer" containerID="80b546f8a562b9bef0620a72cf0a964de843e3aeba6308efea6036190f488ae3"
Feb 02 10:54:15 crc kubenswrapper[4909]: E0202 10:54:15.739854 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80b546f8a562b9bef0620a72cf0a964de843e3aeba6308efea6036190f488ae3\": container with ID starting with 80b546f8a562b9bef0620a72cf0a964de843e3aeba6308efea6036190f488ae3 not found: ID does not exist" containerID="80b546f8a562b9bef0620a72cf0a964de843e3aeba6308efea6036190f488ae3"
Feb 02 10:54:15 crc kubenswrapper[4909]: I0202 10:54:15.739885 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80b546f8a562b9bef0620a72cf0a964de843e3aeba6308efea6036190f488ae3"} err="failed to get container status \"80b546f8a562b9bef0620a72cf0a964de843e3aeba6308efea6036190f488ae3\": rpc error: code = NotFound desc = could not find container \"80b546f8a562b9bef0620a72cf0a964de843e3aeba6308efea6036190f488ae3\": container with ID starting with 80b546f8a562b9bef0620a72cf0a964de843e3aeba6308efea6036190f488ae3 not found: ID does not exist"
Feb 02 10:54:15 crc kubenswrapper[4909]: I0202 10:54:15.739909 4909 scope.go:117] "RemoveContainer" containerID="bf1045192c3e997f0f9de8391cb5599145ec7dc36e9ce10e1fe9dea953fe3a6a"
Feb 02 10:54:15 crc kubenswrapper[4909]: E0202 10:54:15.740275 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf1045192c3e997f0f9de8391cb5599145ec7dc36e9ce10e1fe9dea953fe3a6a\": container with ID starting with bf1045192c3e997f0f9de8391cb5599145ec7dc36e9ce10e1fe9dea953fe3a6a not found: ID does not exist" containerID="bf1045192c3e997f0f9de8391cb5599145ec7dc36e9ce10e1fe9dea953fe3a6a"
Feb 02 10:54:15 crc kubenswrapper[4909]: I0202 10:54:15.740302 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf1045192c3e997f0f9de8391cb5599145ec7dc36e9ce10e1fe9dea953fe3a6a"} err="failed to get container status \"bf1045192c3e997f0f9de8391cb5599145ec7dc36e9ce10e1fe9dea953fe3a6a\": rpc error: code = NotFound desc = could not find container \"bf1045192c3e997f0f9de8391cb5599145ec7dc36e9ce10e1fe9dea953fe3a6a\": container with ID starting with bf1045192c3e997f0f9de8391cb5599145ec7dc36e9ce10e1fe9dea953fe3a6a not found: ID does not exist"
Feb 02 10:54:15 crc kubenswrapper[4909]: I0202 10:54:15.740317 4909 scope.go:117] "RemoveContainer" containerID="c006872b5b120890b2fd310bf3278a6ba680f03e5f7c324a0fb82c0a69c828e7"
Feb 02 10:54:15 crc kubenswrapper[4909]: E0202 10:54:15.740487 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c006872b5b120890b2fd310bf3278a6ba680f03e5f7c324a0fb82c0a69c828e7\": container with ID starting with c006872b5b120890b2fd310bf3278a6ba680f03e5f7c324a0fb82c0a69c828e7 not found: ID does not exist" containerID="c006872b5b120890b2fd310bf3278a6ba680f03e5f7c324a0fb82c0a69c828e7"
Feb 02 10:54:15 crc kubenswrapper[4909]: I0202 10:54:15.740506 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c006872b5b120890b2fd310bf3278a6ba680f03e5f7c324a0fb82c0a69c828e7"} err="failed to get container status \"c006872b5b120890b2fd310bf3278a6ba680f03e5f7c324a0fb82c0a69c828e7\": rpc error: code = NotFound desc = could not find container \"c006872b5b120890b2fd310bf3278a6ba680f03e5f7c324a0fb82c0a69c828e7\": container with ID starting with c006872b5b120890b2fd310bf3278a6ba680f03e5f7c324a0fb82c0a69c828e7 not found: ID does not exist"
Feb 02 10:54:17 crc kubenswrapper[4909]: I0202 10:54:17.027861 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebeeac85-e454-4d63-87ff-8fa563bdf0ab" path="/var/lib/kubelet/pods/ebeeac85-e454-4d63-87ff-8fa563bdf0ab/volumes"
Feb 02 10:54:19 crc kubenswrapper[4909]: I0202 10:54:19.863109 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 02 10:54:19 crc kubenswrapper[4909]: I0202 10:54:19.863734 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 02 10:54:19 crc kubenswrapper[4909]: I0202 10:54:19.863951 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 02 10:54:19 crc kubenswrapper[4909]: I0202 10:54:19.871341 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 02 10:54:20 crc kubenswrapper[4909]: I0202 10:54:20.673981 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 02 10:54:20 crc kubenswrapper[4909]: I0202 10:54:20.684098 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 02 10:54:22 crc kubenswrapper[4909]: I0202 10:54:22.947699 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 02 10:54:22 crc kubenswrapper[4909]: I0202 10:54:22.948078 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 02 10:54:22 crc kubenswrapper[4909]: I0202 10:54:22.953722 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 02 10:54:22 crc kubenswrapper[4909]: I0202 10:54:22.954601 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 02 10:54:24 crc kubenswrapper[4909]: I0202 10:54:24.730868 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.301677 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.302364 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="c8dba959-faf4-4f15-96d3-e8f67ae00d62" containerName="openstackclient" containerID="cri-o://908ec2618be6ce0ccb5553a70d2b7d51ef915f04d269be9c3d7f53480f657d99" gracePeriod=2
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.318386 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.338122 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6dd95bf6f-66vhd"]
Feb 02 10:54:43 crc kubenswrapper[4909]: E0202 10:54:43.338553 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebeeac85-e454-4d63-87ff-8fa563bdf0ab" containerName="extract-content"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.338571 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebeeac85-e454-4d63-87ff-8fa563bdf0ab" containerName="extract-content"
Feb 02 10:54:43 crc kubenswrapper[4909]: E0202 10:54:43.338587 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebeeac85-e454-4d63-87ff-8fa563bdf0ab" containerName="extract-utilities"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.338594 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebeeac85-e454-4d63-87ff-8fa563bdf0ab" containerName="extract-utilities"
Feb 02 10:54:43 crc kubenswrapper[4909]: E0202 10:54:43.338616 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8dba959-faf4-4f15-96d3-e8f67ae00d62" containerName="openstackclient"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.338624 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8dba959-faf4-4f15-96d3-e8f67ae00d62" containerName="openstackclient"
Feb 02 10:54:43 crc kubenswrapper[4909]: E0202 10:54:43.338636 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebeeac85-e454-4d63-87ff-8fa563bdf0ab" containerName="registry-server"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.338641 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebeeac85-e454-4d63-87ff-8fa563bdf0ab" containerName="registry-server"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.338837 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8dba959-faf4-4f15-96d3-e8f67ae00d62" containerName="openstackclient"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.338851 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebeeac85-e454-4d63-87ff-8fa563bdf0ab" containerName="registry-server"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.339962 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6dd95bf6f-66vhd"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.352576 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-795c6654c6-z72r6"]
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.355301 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-795c6654c6-z72r6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.368880 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-795c6654c6-z72r6"]
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.377957 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6dd95bf6f-66vhd"]
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.399187 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c97f6f0e-16ab-439c-a14c-3d908758b1db-config-data\") pod \"barbican-keystone-listener-795c6654c6-z72r6\" (UID: \"c97f6f0e-16ab-439c-a14c-3d908758b1db\") " pod="openstack/barbican-keystone-listener-795c6654c6-z72r6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.399243 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c97f6f0e-16ab-439c-a14c-3d908758b1db-combined-ca-bundle\") pod \"barbican-keystone-listener-795c6654c6-z72r6\" (UID: \"c97f6f0e-16ab-439c-a14c-3d908758b1db\") " pod="openstack/barbican-keystone-listener-795c6654c6-z72r6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.399276 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65028b8f-2d3c-40f3-8c17-239856623f4e-config-data-custom\") pod \"barbican-worker-6dd95bf6f-66vhd\" (UID: \"65028b8f-2d3c-40f3-8c17-239856623f4e\") " pod="openstack/barbican-worker-6dd95bf6f-66vhd"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.399312 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65028b8f-2d3c-40f3-8c17-239856623f4e-combined-ca-bundle\") pod \"barbican-worker-6dd95bf6f-66vhd\" (UID: \"65028b8f-2d3c-40f3-8c17-239856623f4e\") " pod="openstack/barbican-worker-6dd95bf6f-66vhd"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.399334 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gghks\" (UniqueName: \"kubernetes.io/projected/65028b8f-2d3c-40f3-8c17-239856623f4e-kube-api-access-gghks\") pod \"barbican-worker-6dd95bf6f-66vhd\" (UID: \"65028b8f-2d3c-40f3-8c17-239856623f4e\") " pod="openstack/barbican-worker-6dd95bf6f-66vhd"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.399383 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5tst\" (UniqueName: \"kubernetes.io/projected/c97f6f0e-16ab-439c-a14c-3d908758b1db-kube-api-access-r5tst\") pod \"barbican-keystone-listener-795c6654c6-z72r6\" (UID: \"c97f6f0e-16ab-439c-a14c-3d908758b1db\") " pod="openstack/barbican-keystone-listener-795c6654c6-z72r6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.401558 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c97f6f0e-16ab-439c-a14c-3d908758b1db-logs\") pod \"barbican-keystone-listener-795c6654c6-z72r6\" (UID: \"c97f6f0e-16ab-439c-a14c-3d908758b1db\") " pod="openstack/barbican-keystone-listener-795c6654c6-z72r6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.401754 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65028b8f-2d3c-40f3-8c17-239856623f4e-logs\") pod \"barbican-worker-6dd95bf6f-66vhd\" (UID: \"65028b8f-2d3c-40f3-8c17-239856623f4e\") " pod="openstack/barbican-worker-6dd95bf6f-66vhd"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.401826 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c97f6f0e-16ab-439c-a14c-3d908758b1db-config-data-custom\") pod \"barbican-keystone-listener-795c6654c6-z72r6\" (UID: \"c97f6f0e-16ab-439c-a14c-3d908758b1db\") " pod="openstack/barbican-keystone-listener-795c6654c6-z72r6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.401901 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65028b8f-2d3c-40f3-8c17-239856623f4e-config-data\") pod \"barbican-worker-6dd95bf6f-66vhd\" (UID: \"65028b8f-2d3c-40f3-8c17-239856623f4e\") " pod="openstack/barbican-worker-6dd95bf6f-66vhd"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.454445 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c8ec-account-create-update-cphbx"]
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.494876 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-c8ec-account-create-update-cphbx"]
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.503468 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65028b8f-2d3c-40f3-8c17-239856623f4e-logs\") pod \"barbican-worker-6dd95bf6f-66vhd\" (UID: \"65028b8f-2d3c-40f3-8c17-239856623f4e\") " pod="openstack/barbican-worker-6dd95bf6f-66vhd"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.503518 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c97f6f0e-16ab-439c-a14c-3d908758b1db-config-data-custom\") pod \"barbican-keystone-listener-795c6654c6-z72r6\" (UID: \"c97f6f0e-16ab-439c-a14c-3d908758b1db\") " pod="openstack/barbican-keystone-listener-795c6654c6-z72r6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.503558 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65028b8f-2d3c-40f3-8c17-239856623f4e-config-data\") pod \"barbican-worker-6dd95bf6f-66vhd\" (UID: \"65028b8f-2d3c-40f3-8c17-239856623f4e\") " pod="openstack/barbican-worker-6dd95bf6f-66vhd"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.503610 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c97f6f0e-16ab-439c-a14c-3d908758b1db-config-data\") pod \"barbican-keystone-listener-795c6654c6-z72r6\" (UID: \"c97f6f0e-16ab-439c-a14c-3d908758b1db\") " pod="openstack/barbican-keystone-listener-795c6654c6-z72r6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.503635 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c97f6f0e-16ab-439c-a14c-3d908758b1db-combined-ca-bundle\") pod \"barbican-keystone-listener-795c6654c6-z72r6\" (UID: \"c97f6f0e-16ab-439c-a14c-3d908758b1db\") " pod="openstack/barbican-keystone-listener-795c6654c6-z72r6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.503656 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65028b8f-2d3c-40f3-8c17-239856623f4e-config-data-custom\") pod \"barbican-worker-6dd95bf6f-66vhd\" (UID: \"65028b8f-2d3c-40f3-8c17-239856623f4e\") " pod="openstack/barbican-worker-6dd95bf6f-66vhd"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.503679 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65028b8f-2d3c-40f3-8c17-239856623f4e-combined-ca-bundle\") pod \"barbican-worker-6dd95bf6f-66vhd\" (UID: \"65028b8f-2d3c-40f3-8c17-239856623f4e\") " pod="openstack/barbican-worker-6dd95bf6f-66vhd"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.503695 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gghks\" (UniqueName: \"kubernetes.io/projected/65028b8f-2d3c-40f3-8c17-239856623f4e-kube-api-access-gghks\") pod \"barbican-worker-6dd95bf6f-66vhd\" (UID: \"65028b8f-2d3c-40f3-8c17-239856623f4e\") " pod="openstack/barbican-worker-6dd95bf6f-66vhd"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.503728 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5tst\" (UniqueName: \"kubernetes.io/projected/c97f6f0e-16ab-439c-a14c-3d908758b1db-kube-api-access-r5tst\") pod \"barbican-keystone-listener-795c6654c6-z72r6\" (UID: \"c97f6f0e-16ab-439c-a14c-3d908758b1db\") " pod="openstack/barbican-keystone-listener-795c6654c6-z72r6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.503753 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c97f6f0e-16ab-439c-a14c-3d908758b1db-logs\") pod \"barbican-keystone-listener-795c6654c6-z72r6\" (UID: \"c97f6f0e-16ab-439c-a14c-3d908758b1db\") " pod="openstack/barbican-keystone-listener-795c6654c6-z72r6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.504181 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c97f6f0e-16ab-439c-a14c-3d908758b1db-logs\") pod \"barbican-keystone-listener-795c6654c6-z72r6\" (UID: \"c97f6f0e-16ab-439c-a14c-3d908758b1db\") " pod="openstack/barbican-keystone-listener-795c6654c6-z72r6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.504441 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65028b8f-2d3c-40f3-8c17-239856623f4e-logs\") pod \"barbican-worker-6dd95bf6f-66vhd\" (UID: \"65028b8f-2d3c-40f3-8c17-239856623f4e\") " pod="openstack/barbican-worker-6dd95bf6f-66vhd"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.539891 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.561726 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c97f6f0e-16ab-439c-a14c-3d908758b1db-combined-ca-bundle\") pod \"barbican-keystone-listener-795c6654c6-z72r6\" (UID: \"c97f6f0e-16ab-439c-a14c-3d908758b1db\") " pod="openstack/barbican-keystone-listener-795c6654c6-z72r6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.563189 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c97f6f0e-16ab-439c-a14c-3d908758b1db-config-data\") pod \"barbican-keystone-listener-795c6654c6-z72r6\" (UID: \"c97f6f0e-16ab-439c-a14c-3d908758b1db\") " pod="openstack/barbican-keystone-listener-795c6654c6-z72r6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.572028 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-blz7s"]
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.573438 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-blz7s"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.573870 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65028b8f-2d3c-40f3-8c17-239856623f4e-config-data-custom\") pod \"barbican-worker-6dd95bf6f-66vhd\" (UID: \"65028b8f-2d3c-40f3-8c17-239856623f4e\") " pod="openstack/barbican-worker-6dd95bf6f-66vhd"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.574042 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65028b8f-2d3c-40f3-8c17-239856623f4e-combined-ca-bundle\") pod \"barbican-worker-6dd95bf6f-66vhd\" (UID: \"65028b8f-2d3c-40f3-8c17-239856623f4e\") " pod="openstack/barbican-worker-6dd95bf6f-66vhd"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.584731 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.586161 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65028b8f-2d3c-40f3-8c17-239856623f4e-config-data\") pod \"barbican-worker-6dd95bf6f-66vhd\" (UID: \"65028b8f-2d3c-40f3-8c17-239856623f4e\") " pod="openstack/barbican-worker-6dd95bf6f-66vhd"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.588404 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c97f6f0e-16ab-439c-a14c-3d908758b1db-config-data-custom\") pod \"barbican-keystone-listener-795c6654c6-z72r6\" (UID: \"c97f6f0e-16ab-439c-a14c-3d908758b1db\") " pod="openstack/barbican-keystone-listener-795c6654c6-z72r6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.601061 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gghks\" (UniqueName: \"kubernetes.io/projected/65028b8f-2d3c-40f3-8c17-239856623f4e-kube-api-access-gghks\") pod \"barbican-worker-6dd95bf6f-66vhd\" (UID: \"65028b8f-2d3c-40f3-8c17-239856623f4e\") " pod="openstack/barbican-worker-6dd95bf6f-66vhd"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.634398 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-blz7s"]
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.635196 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5tst\" (UniqueName: \"kubernetes.io/projected/c97f6f0e-16ab-439c-a14c-3d908758b1db-kube-api-access-r5tst\") pod \"barbican-keystone-listener-795c6654c6-z72r6\" (UID: \"c97f6f0e-16ab-439c-a14c-3d908758b1db\") " pod="openstack/barbican-keystone-listener-795c6654c6-z72r6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.652852 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6bbd5778f6-n6mw6"]
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.664154 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6bbd5778f6-n6mw6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.690776 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6dd95bf6f-66vhd"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.691403 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6bbd5778f6-n6mw6"]
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.703444 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-mhgf5"]
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.715092 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-795c6654c6-z72r6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.716429 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5e2e76d-5e02-4488-ae0d-5acbdb1aa060-operator-scripts\") pod \"root-account-create-update-blz7s\" (UID: \"d5e2e76d-5e02-4488-ae0d-5acbdb1aa060\") " pod="openstack/root-account-create-update-blz7s"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.716472 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w68r\" (UniqueName: \"kubernetes.io/projected/d5e2e76d-5e02-4488-ae0d-5acbdb1aa060-kube-api-access-7w68r\") pod \"root-account-create-update-blz7s\" (UID: \"d5e2e76d-5e02-4488-ae0d-5acbdb1aa060\") " pod="openstack/root-account-create-update-blz7s"
Feb 02 10:54:43 crc kubenswrapper[4909]: E0202 10:54:43.716998 4909 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Feb 02 10:54:43 crc kubenswrapper[4909]: E0202 10:54:43.717052 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b441d32f-f76f-4e7b-b3fe-40e93b126567-config-data podName:b441d32f-f76f-4e7b-b3fe-40e93b126567 nodeName:}" failed. No retries permitted until 2026-02-02 10:54:44.217035161 +0000 UTC m=+1409.963135896 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b441d32f-f76f-4e7b-b3fe-40e93b126567-config-data") pod "rabbitmq-server-0" (UID: "b441d32f-f76f-4e7b-b3fe-40e93b126567") : configmap "rabbitmq-config-data" not found
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.720321 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-mhgf5"]
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.826865 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-config-data\") pod \"barbican-api-6bbd5778f6-n6mw6\" (UID: \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\") " pod="openstack/barbican-api-6bbd5778f6-n6mw6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.827237 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-public-tls-certs\") pod \"barbican-api-6bbd5778f6-n6mw6\" (UID: \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\") " pod="openstack/barbican-api-6bbd5778f6-n6mw6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.827275 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-combined-ca-bundle\") pod \"barbican-api-6bbd5778f6-n6mw6\" (UID: \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\") " pod="openstack/barbican-api-6bbd5778f6-n6mw6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.827313 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5e2e76d-5e02-4488-ae0d-5acbdb1aa060-operator-scripts\") pod \"root-account-create-update-blz7s\" (UID: \"d5e2e76d-5e02-4488-ae0d-5acbdb1aa060\") " pod="openstack/root-account-create-update-blz7s"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.827335 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-internal-tls-certs\") pod \"barbican-api-6bbd5778f6-n6mw6\" (UID: \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\") " pod="openstack/barbican-api-6bbd5778f6-n6mw6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.827355 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w68r\" (UniqueName: \"kubernetes.io/projected/d5e2e76d-5e02-4488-ae0d-5acbdb1aa060-kube-api-access-7w68r\") pod \"root-account-create-update-blz7s\" (UID: \"d5e2e76d-5e02-4488-ae0d-5acbdb1aa060\") " pod="openstack/root-account-create-update-blz7s"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.827375 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddda8a20-60ba-4ae9-837a-44fa44518b8a-logs\") pod \"barbican-api-6bbd5778f6-n6mw6\" (UID: \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\") " pod="openstack/barbican-api-6bbd5778f6-n6mw6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.827412 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-config-data-custom\") pod \"barbican-api-6bbd5778f6-n6mw6\" (UID: \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\") " pod="openstack/barbican-api-6bbd5778f6-n6mw6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.827470 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8j8j\" (UniqueName: \"kubernetes.io/projected/ddda8a20-60ba-4ae9-837a-44fa44518b8a-kube-api-access-g8j8j\") pod \"barbican-api-6bbd5778f6-n6mw6\" (UID: \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\") " pod="openstack/barbican-api-6bbd5778f6-n6mw6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.828401 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5e2e76d-5e02-4488-ae0d-5acbdb1aa060-operator-scripts\") pod \"root-account-create-update-blz7s\" (UID: \"d5e2e76d-5e02-4488-ae0d-5acbdb1aa060\") " pod="openstack/root-account-create-update-blz7s"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.902764 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w68r\" (UniqueName: \"kubernetes.io/projected/d5e2e76d-5e02-4488-ae0d-5acbdb1aa060-kube-api-access-7w68r\") pod \"root-account-create-update-blz7s\" (UID: \"d5e2e76d-5e02-4488-ae0d-5acbdb1aa060\") " pod="openstack/root-account-create-update-blz7s"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.914871 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-abe3-account-create-update-47nm8"]
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.934150 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-public-tls-certs\") pod \"barbican-api-6bbd5778f6-n6mw6\" (UID: \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\") " pod="openstack/barbican-api-6bbd5778f6-n6mw6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.934211 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-combined-ca-bundle\") pod \"barbican-api-6bbd5778f6-n6mw6\" (UID: \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\") " pod="openstack/barbican-api-6bbd5778f6-n6mw6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.934249 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-internal-tls-certs\") pod \"barbican-api-6bbd5778f6-n6mw6\" (UID: \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\") " pod="openstack/barbican-api-6bbd5778f6-n6mw6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.934289 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddda8a20-60ba-4ae9-837a-44fa44518b8a-logs\") pod \"barbican-api-6bbd5778f6-n6mw6\" (UID: \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\") " pod="openstack/barbican-api-6bbd5778f6-n6mw6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.934331 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-config-data-custom\") pod \"barbican-api-6bbd5778f6-n6mw6\" (UID: \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\") " pod="openstack/barbican-api-6bbd5778f6-n6mw6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.934395 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8j8j\" (UniqueName: \"kubernetes.io/projected/ddda8a20-60ba-4ae9-837a-44fa44518b8a-kube-api-access-g8j8j\") pod \"barbican-api-6bbd5778f6-n6mw6\" (UID: \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\") " pod="openstack/barbican-api-6bbd5778f6-n6mw6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.934458 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-config-data\") pod \"barbican-api-6bbd5778f6-n6mw6\" (UID: \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\") " pod="openstack/barbican-api-6bbd5778f6-n6mw6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.942523 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-90c9-account-create-update-6nkzl"]
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.958246 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddda8a20-60ba-4ae9-837a-44fa44518b8a-logs\") pod \"barbican-api-6bbd5778f6-n6mw6\" (UID: \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\") " pod="openstack/barbican-api-6bbd5778f6-n6mw6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.963280 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-internal-tls-certs\") pod \"barbican-api-6bbd5778f6-n6mw6\" (UID: \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\") " pod="openstack/barbican-api-6bbd5778f6-n6mw6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.963336 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-config-data\") pod \"barbican-api-6bbd5778f6-n6mw6\" (UID: \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\") " pod="openstack/barbican-api-6bbd5778f6-n6mw6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.965392 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-public-tls-certs\") pod \"barbican-api-6bbd5778f6-n6mw6\" (UID: \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\") " pod="openstack/barbican-api-6bbd5778f6-n6mw6"
Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.969770 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-combined-ca-bundle\") pod \"barbican-api-6bbd5778f6-n6mw6\" (UID: \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\") "
pod="openstack/barbican-api-6bbd5778f6-n6mw6" Feb 02 10:54:43 crc kubenswrapper[4909]: I0202 10:54:43.970430 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-config-data-custom\") pod \"barbican-api-6bbd5778f6-n6mw6\" (UID: \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\") " pod="openstack/barbican-api-6bbd5778f6-n6mw6" Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.050030 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-xqjmc"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.095244 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8j8j\" (UniqueName: \"kubernetes.io/projected/ddda8a20-60ba-4ae9-837a-44fa44518b8a-kube-api-access-g8j8j\") pod \"barbican-api-6bbd5778f6-n6mw6\" (UID: \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\") " pod="openstack/barbican-api-6bbd5778f6-n6mw6" Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.150370 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-xqjmc"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.202266 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-blz7s" Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.215205 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-90c9-account-create-update-6nkzl"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.255244 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6bbd5778f6-n6mw6" Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.276955 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-abe3-account-create-update-47nm8"] Feb 02 10:54:44 crc kubenswrapper[4909]: E0202 10:54:44.282861 4909 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 02 10:54:44 crc kubenswrapper[4909]: E0202 10:54:44.282930 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b441d32f-f76f-4e7b-b3fe-40e93b126567-config-data podName:b441d32f-f76f-4e7b-b3fe-40e93b126567 nodeName:}" failed. No retries permitted until 2026-02-02 10:54:45.282914605 +0000 UTC m=+1411.029015340 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b441d32f-f76f-4e7b-b3fe-40e93b126567-config-data") pod "rabbitmq-server-0" (UID: "b441d32f-f76f-4e7b-b3fe-40e93b126567") : configmap "rabbitmq-config-data" not found Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.292927 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.323495 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d833-account-create-update-dbfhx"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.332846 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d833-account-create-update-dbfhx"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.360186 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-jpqtq"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.381846 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-jpqtq"] Feb 02 10:54:44 crc kubenswrapper[4909]: E0202 10:54:44.388622 4909 configmap.go:193] Couldn't get 
configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 02 10:54:44 crc kubenswrapper[4909]: E0202 10:54:44.388707 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1ab15f72-b249-42d5-8698-273c5afc7758-config-data podName:1ab15f72-b249-42d5-8698-273c5afc7758 nodeName:}" failed. No retries permitted until 2026-02-02 10:54:44.88866327 +0000 UTC m=+1410.634763995 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1ab15f72-b249-42d5-8698-273c5afc7758-config-data") pod "rabbitmq-cell1-server-0" (UID: "1ab15f72-b249-42d5-8698-273c5afc7758") : configmap "rabbitmq-cell1-config-data" not found Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.399628 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-a6cd-account-create-update-kvdwx"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.409505 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-a6cd-account-create-update-kvdwx"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.422895 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.423642 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="f0a30163-0b42-493b-b775-d88218bd1844" containerName="openstack-network-exporter" containerID="cri-o://960092178e862c3b77d8e3bf300b8c16cf49e7db4eb0520f595f9cef3f507a97" gracePeriod=300 Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.440104 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-vnrcn"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.455582 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-dd3e-account-create-update-phmjt"] Feb 02 10:54:44 crc kubenswrapper[4909]: 
I0202 10:54:44.486082 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-vnrcn"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.523060 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-hgjsh"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.542657 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-hgjsh"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.581466 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="f0a30163-0b42-493b-b775-d88218bd1844" containerName="ovsdbserver-nb" containerID="cri-o://97f377e1596e6eafef005fa1e1978dccc7634e807d2ca9c148149e41e82b87df" gracePeriod=300 Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.581604 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-dd3e-account-create-update-phmjt"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.605722 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.606407 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="7b658933-f66d-47df-8b75-a42cd55b9bf4" containerName="openstack-network-exporter" containerID="cri-o://d2191dd6198a0fc9ecc9c919260563196adfa2a35bc25ae0ca32279cce086a5f" gracePeriod=300 Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.616836 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-c27bb"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.628404 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-c27bb"] Feb 02 10:54:44 crc kubenswrapper[4909]: W0202 10:54:44.641978 4909 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc97f6f0e_16ab_439c_a14c_3d908758b1db.slice/crio-ef77b807b50570f53289b41354f9f01947db881fc7e634a6426df9eede9623af WatchSource:0}: Error finding container ef77b807b50570f53289b41354f9f01947db881fc7e634a6426df9eede9623af: Status 404 returned error can't find the container with id ef77b807b50570f53289b41354f9f01947db881fc7e634a6426df9eede9623af Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.659412 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.659630 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="87203850-864b-4fff-b340-25e4f5c6e7c9" containerName="ovn-northd" containerID="cri-o://52e5a0b83ebdf2c32308bcba8bf2bcd22b63c5893664ccffa254ee3d4272e8b7" gracePeriod=30 Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.659984 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="87203850-864b-4fff-b340-25e4f5c6e7c9" containerName="openstack-network-exporter" containerID="cri-o://c74c134d5ea53fc5d68abe3fe06e4cfbcfa8770404a1e48f3df9ace3dc76800d" gracePeriod=30 Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.686639 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-wn6km"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.698928 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-wn6km"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.716034 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-vxm4q"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.726753 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-qpqvt"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.737558 4909 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/swift-ring-rebalance-vxm4q"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.768143 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-795c6654c6-z72r6"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.793432 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-9hkv5"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.804567 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-pzcjh"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.804755 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-pzcjh" podUID="c631ddad-ab3a-488f-9947-2f3385fd912c" containerName="openstack-network-exporter" containerID="cri-o://e616e96154cc54a51e3703d10ef0dafd77cf698728a725fab58c048715a29bb9" gracePeriod=30 Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.813875 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.814150 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9b24f572-8a70-4a46-b3cf-e50ae4859892" containerName="cinder-scheduler" containerID="cri-o://e29f67614fb9cccc18985f89282ec44019b367ff92f09a6e3769b0344cbb1193" gracePeriod=30 Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.814298 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9b24f572-8a70-4a46-b3cf-e50ae4859892" containerName="probe" containerID="cri-o://5181b71ba760ed91aeb7b3915c3d7814e4e85e3d4983092599caaa351d56bd8b" gracePeriod=30 Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.827541 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6dd95bf6f-66vhd"] Feb 02 10:54:44 crc kubenswrapper[4909]: 
I0202 10:54:44.833107 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="7b658933-f66d-47df-8b75-a42cd55b9bf4" containerName="ovsdbserver-sb" containerID="cri-o://b05bbe00b26a1b9593023419b5d4a14f74126dc62da00a8e37a5ac60df828f91" gracePeriod=300 Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.839710 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.839953 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="14a8699a-66db-48c6-8834-bda4e21ef1d9" containerName="cinder-api-log" containerID="cri-o://e41df562f1a41e47706274614d03c13faaba64903b9ae50a415bad9f51e06946" gracePeriod=30 Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.840077 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="14a8699a-66db-48c6-8834-bda4e21ef1d9" containerName="cinder-api" containerID="cri-o://9b97792ee0643776ed45dd9bce3742b6344f3f3eb329b31a1d413958f371ef6e" gracePeriod=30 Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.861591 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-lzdlk"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.880546 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-lzdlk"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.892097 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Feb 02 10:54:44 crc kubenswrapper[4909]: E0202 10:54:44.898890 4909 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 02 10:54:44 crc kubenswrapper[4909]: E0202 10:54:44.898952 4909 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/1ab15f72-b249-42d5-8698-273c5afc7758-config-data podName:1ab15f72-b249-42d5-8698-273c5afc7758 nodeName:}" failed. No retries permitted until 2026-02-02 10:54:45.898935703 +0000 UTC m=+1411.645036438 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1ab15f72-b249-42d5-8698-273c5afc7758-config-data") pod "rabbitmq-cell1-server-0" (UID: "1ab15f72-b249-42d5-8698-273c5afc7758") : configmap "rabbitmq-cell1-config-data" not found Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.912831 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8588c46577-4cp8s"] Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.913076 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-8588c46577-4cp8s" podUID="d1145da4-90e5-422b-917a-33473a9c5d6a" containerName="neutron-api" containerID="cri-o://8a5e2b95b89f07eeb7333cda5e4cb6d87b241046d11a832dbb17fcb90f90c063" gracePeriod=30 Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.913490 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-8588c46577-4cp8s" podUID="d1145da4-90e5-422b-917a-33473a9c5d6a" containerName="neutron-httpd" containerID="cri-o://07fee0cf291ae485ea00f0668c32744aaec34c29563b1aae25a47955a349b94d" gracePeriod=30 Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.919930 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="account-server" containerID="cri-o://bdf18a1a02fe85f1bb6cf3ed370c1c5c18587a314b6cf14fea7469bf28031585" gracePeriod=30 Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.921088 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" 
containerName="object-server" containerID="cri-o://3c20875f1de9fb350baed752230d656b218f7820f84b09af0fae63228ac55300" gracePeriod=30 Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.921120 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="swift-recon-cron" containerID="cri-o://5e88d5dc47ad71a01580306942e0f3a9a30eb6a37d0332e786a59186dadbb937" gracePeriod=30 Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.921134 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="rsync" containerID="cri-o://403544c94cece49fecf3e837a0ceb79d6332a055fdfa13f162fa0295add4bbb7" gracePeriod=30 Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.921145 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="object-expirer" containerID="cri-o://e7bf2be62b7e97abc8524922264d489ae88403ad72a536c23b1a2c1f6278395b" gracePeriod=30 Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.921159 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="object-updater" containerID="cri-o://2e0f7d484d5647dce2a899dda4783d32b2783471d37c33f12abbe7c324c5a495" gracePeriod=30 Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.921170 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="object-auditor" containerID="cri-o://4769dc8794676e2cda42fe1e1591dca654572c12d01ef7df149e868e62a3a604" gracePeriod=30 Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.921180 4909 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/swift-storage-0" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="object-replicator" containerID="cri-o://858adda0671141db145aafebaea7fb2c8fe86cafddc9fe78b0569ebbb13f0012" gracePeriod=30 Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.921189 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="container-server" containerID="cri-o://75e96eb7331791bb4bb7609ae1be6e72027c31244a81358a0312c1fe5e88d79b" gracePeriod=30 Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.921210 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="container-updater" containerID="cri-o://cbfed4f1069ee483c0b5622d010bb4050eb703f82417b83b9be6a69f91f6416e" gracePeriod=30 Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.921219 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="container-auditor" containerID="cri-o://15ada96398d5e422dc198589740e54091d46278fea5a1976da718efa78d1aea0" gracePeriod=30 Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.921229 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="container-replicator" containerID="cri-o://1a36d372a28220791f7d96800ea8889bd6721cfb062a1df523373e1709e54828" gracePeriod=30 Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.921239 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="account-auditor" containerID="cri-o://bb0826ef7db615ea25639f354b4e3d66a6b4b7a4c615c65933bc47849ffdcf70" gracePeriod=30 Feb 02 10:54:44 
crc kubenswrapper[4909]: I0202 10:54:44.921249 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="account-reaper" containerID="cri-o://f4ba97736c676ea7105906bcc956b64814d9063116ac1ae4c7dd355c1861de1a" gracePeriod=30 Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.921259 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="account-replicator" containerID="cri-o://688f9c6773a03040cd5ec3fcaab21a892ba396e408fdcffda51c117011469111" gracePeriod=30 Feb 02 10:54:44 crc kubenswrapper[4909]: I0202 10:54:44.970237 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-blz7s"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.041734 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_f0a30163-0b42-493b-b775-d88218bd1844/ovsdbserver-nb/0.log" Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.041777 4909 generic.go:334] "Generic (PLEG): container finished" podID="f0a30163-0b42-493b-b775-d88218bd1844" containerID="960092178e862c3b77d8e3bf300b8c16cf49e7db4eb0520f595f9cef3f507a97" exitCode=2 Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.041795 4909 generic.go:334] "Generic (PLEG): container finished" podID="f0a30163-0b42-493b-b775-d88218bd1844" containerID="97f377e1596e6eafef005fa1e1978dccc7634e807d2ca9c148149e41e82b87df" exitCode=143 Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.061849 4909 generic.go:334] "Generic (PLEG): container finished" podID="87203850-864b-4fff-b340-25e4f5c6e7c9" containerID="c74c134d5ea53fc5d68abe3fe06e4cfbcfa8770404a1e48f3df9ace3dc76800d" exitCode=2 Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.120927 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_7b658933-f66d-47df-8b75-a42cd55b9bf4/ovsdbserver-sb/0.log" Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.121010 4909 generic.go:334] "Generic (PLEG): container finished" podID="7b658933-f66d-47df-8b75-a42cd55b9bf4" containerID="d2191dd6198a0fc9ecc9c919260563196adfa2a35bc25ae0ca32279cce086a5f" exitCode=2 Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.147827 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0834981f-5c9e-48dc-a18a-4108e2eb24f4" path="/var/lib/kubelet/pods/0834981f-5c9e-48dc-a18a-4108e2eb24f4/volumes" Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.150944 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf" path="/var/lib/kubelet/pods/0fc3d70c-e582-4fcb-87b7-d4cce70fa4cf/volumes" Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.161307 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bcdd101-6966-4afe-86f0-47f3b7a524fe" path="/var/lib/kubelet/pods/1bcdd101-6966-4afe-86f0-47f3b7a524fe/volumes" Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.176565 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2be7766e-5d21-45ce-8d1f-23b264467c79" path="/var/lib/kubelet/pods/2be7766e-5d21-45ce-8d1f-23b264467c79/volumes" Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.182137 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32e77d63-a0db-4845-80c5-9815b46a9e21" path="/var/lib/kubelet/pods/32e77d63-a0db-4845-80c5-9815b46a9e21/volumes" Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.183348 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ec9c7cf-acf0-454c-9267-a87b03460d6b" path="/var/lib/kubelet/pods/4ec9c7cf-acf0-454c-9267-a87b03460d6b/volumes" Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.184090 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c" path="/var/lib/kubelet/pods/5e4c35da-e4c9-462f-ab25-8dcf6ccfe79c/volumes" Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.184828 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81b16bcd-5191-4481-bb9a-a1762143bc89" path="/var/lib/kubelet/pods/81b16bcd-5191-4481-bb9a-a1762143bc89/volumes" Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.185600 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf7b0419-b32f-44ab-b35d-eb06765be89d" path="/var/lib/kubelet/pods/bf7b0419-b32f-44ab-b35d-eb06765be89d/volumes" Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.187082 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca377ade-e972-41f6-add9-a0b491d86bbf" path="/var/lib/kubelet/pods/ca377ade-e972-41f6-add9-a0b491d86bbf/volumes" Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.187831 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5d450e8-b58b-423c-afae-ed534a2d65ed" path="/var/lib/kubelet/pods/d5d450e8-b58b-423c-afae-ed534a2d65ed/volumes" Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.188493 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d68a2056-e886-4135-a63a-3755df0703af" path="/var/lib/kubelet/pods/d68a2056-e886-4135-a63a-3755df0703af/volumes" Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.193417 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d68a9e4e-b453-459f-b397-9c6d7c221dda" path="/var/lib/kubelet/pods/d68a9e4e-b453-459f-b397-9c6d7c221dda/volumes" Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.194282 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db6d95b6-71f4-47be-90e2-64ebcf72442c" path="/var/lib/kubelet/pods/db6d95b6-71f4-47be-90e2-64ebcf72442c/volumes" Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.195064 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="dd533d6b-9338-4896-b52c-c3123a5e0467" path="/var/lib/kubelet/pods/dd533d6b-9338-4896-b52c-c3123a5e0467/volumes" Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.195729 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f0a30163-0b42-493b-b775-d88218bd1844","Type":"ContainerDied","Data":"960092178e862c3b77d8e3bf300b8c16cf49e7db4eb0520f595f9cef3f507a97"} Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.195771 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f0a30163-0b42-493b-b775-d88218bd1844","Type":"ContainerDied","Data":"97f377e1596e6eafef005fa1e1978dccc7634e807d2ca9c148149e41e82b87df"} Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.195787 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5d56d8dff8-fh9sw"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.195824 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-nlvbk"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.195838 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"87203850-864b-4fff-b340-25e4f5c6e7c9","Type":"ContainerDied","Data":"c74c134d5ea53fc5d68abe3fe06e4cfbcfa8770404a1e48f3df9ace3dc76800d"} Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.195857 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-nlvbk"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.195875 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6dd95bf6f-66vhd" event={"ID":"65028b8f-2d3c-40f3-8c17-239856623f4e","Type":"ContainerStarted","Data":"3efade9a5319967de578520d20eaa5fe362b0bb9eea6200c5e03127aa5ee76b8"} Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.195888 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"7b658933-f66d-47df-8b75-a42cd55b9bf4","Type":"ContainerDied","Data":"d2191dd6198a0fc9ecc9c919260563196adfa2a35bc25ae0ca32279cce086a5f"} Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.195904 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7dcd758995-j8gxg"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.195918 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.195931 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.195944 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-795c6654c6-z72r6" event={"ID":"c97f6f0e-16ab-439c-a14c-3d908758b1db","Type":"ContainerStarted","Data":"ef77b807b50570f53289b41354f9f01947db881fc7e634a6426df9eede9623af"} Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.196424 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5d56d8dff8-fh9sw" podUID="b6be13cc-01ed-441f-b2c9-dc024fcb4b18" containerName="placement-log" containerID="cri-o://39650f1fdde2fdbf3bd917a0f731e9a813c0c766ece71f265bc33c3787265ed1" gracePeriod=30 Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.196590 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" podUID="81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425" containerName="dnsmasq-dns" containerID="cri-o://3487017a73b83d9a77436b7d788ba96efdea8f82fb04b45ef153d5a993e724bc" gracePeriod=10 Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.196732 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7f0cf9e9-b663-4be3-a435-c7dd6deea228" containerName="nova-metadata-log" 
containerID="cri-o://1a3c44ae75c0e82a41ac845e6bc75bd78044db6d50d81a3cf25783da569108b8" gracePeriod=30 Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.197587 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5d56d8dff8-fh9sw" podUID="b6be13cc-01ed-441f-b2c9-dc024fcb4b18" containerName="placement-api" containerID="cri-o://a0b641eb35c253bddcb998df3a1e8bb281e92fcbf9ed7e43633609aff1d97578" gracePeriod=30 Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.197682 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7f0cf9e9-b663-4be3-a435-c7dd6deea228" containerName="nova-metadata-metadata" containerID="cri-o://9146b7fe892e3b1b3b9341aaa935f843c8006e1d193949290905af61bbfd863a" gracePeriod=30 Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.207198 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-dgnzc"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.234695 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-dgnzc"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.257647 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6bbd5778f6-n6mw6"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.265962 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 02 10:54:45 crc kubenswrapper[4909]: W0202 10:54:45.272924 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5e2e76d_5e02_4488_ae0d_5acbdb1aa060.slice/crio-50c92004cccd536ad46a63bf928a30fe9638398efd3f8ced1d508d7cc8a24b19 WatchSource:0}: Error finding container 50c92004cccd536ad46a63bf928a30fe9638398efd3f8ced1d508d7cc8a24b19: Status 404 returned error can't find the container with id 50c92004cccd536ad46a63bf928a30fe9638398efd3f8ced1d508d7cc8a24b19 
Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.293016 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.293294 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7202de6a-156c-4c06-9e08-3e62cfcf367e" containerName="nova-api-log" containerID="cri-o://dfe3f218431d62cbf5802f23a9e8f8dfdc1b311556035b91ad8568910dc22027" gracePeriod=30 Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.293834 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7202de6a-156c-4c06-9e08-3e62cfcf367e" containerName="nova-api-api" containerID="cri-o://a20507244041d23b32f190d2a807a0eee356b71acfba50ced33b0ba77be75a3b" gracePeriod=30 Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.329348 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-d64c85fd5-nns29"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.329614 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-d64c85fd5-nns29" podUID="40590ae8-432b-4d01-a586-f61f07a206b0" containerName="proxy-httpd" containerID="cri-o://811b1c8d9db9cb5f0ee50813644e170eb643be43ad3fd1cf8b8def4d133ce9a4" gracePeriod=30 Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.330014 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-d64c85fd5-nns29" podUID="40590ae8-432b-4d01-a586-f61f07a206b0" containerName="proxy-server" containerID="cri-o://35a67f40f9a2ea8483465449c6307919ab0017c539599beb47d55125999c2929" gracePeriod=30 Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.348870 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-082b-account-create-update-wx89t"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.359514 4909 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/cinder-082b-account-create-update-wx89t"] Feb 02 10:54:45 crc kubenswrapper[4909]: E0202 10:54:45.361432 4909 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 02 10:54:45 crc kubenswrapper[4909]: E0202 10:54:45.361493 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b441d32f-f76f-4e7b-b3fe-40e93b126567-config-data podName:b441d32f-f76f-4e7b-b3fe-40e93b126567 nodeName:}" failed. No retries permitted until 2026-02-02 10:54:47.36147836 +0000 UTC m=+1413.107579095 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b441d32f-f76f-4e7b-b3fe-40e93b126567-config-data") pod "rabbitmq-server-0" (UID: "b441d32f-f76f-4e7b-b3fe-40e93b126567") : configmap "rabbitmq-config-data" not found Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.397795 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-5fks9"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.417157 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.417468 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="97409ffd-f1ab-4a1a-9939-a041a4085b1a" containerName="glance-log" containerID="cri-o://1ea9ba4798372399476cd86a9d7248e0dc1ac843d30751cd5067d09ad4963c62" gracePeriod=30 Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.418677 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="97409ffd-f1ab-4a1a-9939-a041a4085b1a" containerName="glance-httpd" containerID="cri-o://ecc6af1fec99e4b5728487ec159109b228720e7e811d71b0bc2fd00d2e8a68cc" gracePeriod=30 Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 
10:54:45.440868 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-5fks9"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.465888 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-5m44k"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.484320 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-5m44k"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.500241 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.500444 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8c52b752-391b-4770-9191-3494df4e3999" containerName="glance-log" containerID="cri-o://3f7435b37c532d142cc3a2e44d9a5f007446b4a37f3b1f4607541c257406cba7" gracePeriod=30 Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.500866 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8c52b752-391b-4770-9191-3494df4e3999" containerName="glance-httpd" containerID="cri-o://79243ba3eb5d42f1bd0f138c616462a4f129bde3f205dc9026f0724a065bb0f7" gracePeriod=30 Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.568691 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-fz2h7"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.631871 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-fz2h7"] Feb 02 10:54:45 crc kubenswrapper[4909]: W0202 10:54:45.682636 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddda8a20_60ba_4ae9_837a_44fa44518b8a.slice/crio-1394fd3d1fe33a21fdb054a3861a35490ddf37761eae2045385f28453f46ff3a WatchSource:0}: Error finding container 
1394fd3d1fe33a21fdb054a3861a35490ddf37761eae2045385f28453f46ff3a: Status 404 returned error can't find the container with id 1394fd3d1fe33a21fdb054a3861a35490ddf37761eae2045385f28453f46ff3a Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.727019 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.727487 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="20cda1f6-04e5-4e71-9fa8-4cad8dbac52c" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://3f21c558f0896ab2fa711d0d7b27be96691f1afc7425bfa37b7f33bdec9c6f9d" gracePeriod=30 Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.748397 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="b441d32f-f76f-4e7b-b3fe-40e93b126567" containerName="rabbitmq" containerID="cri-o://7fce69eec287d7c5a7114d96e223cf91ca30b188e5b132915844731ca8a68ec2" gracePeriod=604800 Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.748920 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-ac14-account-create-update-6nxwp"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.771879 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-p67xw"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.781704 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-ac14-account-create-update-6nxwp"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.790989 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-p67xw"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.804874 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-rx2rn"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.813874 4909 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/placement-db-create-fgzzd"] Feb 02 10:54:45 crc kubenswrapper[4909]: E0202 10:54:45.815052 4909 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Feb 02 10:54:45 crc kubenswrapper[4909]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 02 10:54:45 crc kubenswrapper[4909]: + source /usr/local/bin/container-scripts/functions Feb 02 10:54:45 crc kubenswrapper[4909]: ++ OVNBridge=br-int Feb 02 10:54:45 crc kubenswrapper[4909]: ++ OVNRemote=tcp:localhost:6642 Feb 02 10:54:45 crc kubenswrapper[4909]: ++ OVNEncapType=geneve Feb 02 10:54:45 crc kubenswrapper[4909]: ++ OVNAvailabilityZones= Feb 02 10:54:45 crc kubenswrapper[4909]: ++ EnableChassisAsGateway=true Feb 02 10:54:45 crc kubenswrapper[4909]: ++ PhysicalNetworks= Feb 02 10:54:45 crc kubenswrapper[4909]: ++ OVNHostName= Feb 02 10:54:45 crc kubenswrapper[4909]: ++ DB_FILE=/etc/openvswitch/conf.db Feb 02 10:54:45 crc kubenswrapper[4909]: ++ ovs_dir=/var/lib/openvswitch Feb 02 10:54:45 crc kubenswrapper[4909]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 02 10:54:45 crc kubenswrapper[4909]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 02 10:54:45 crc kubenswrapper[4909]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 02 10:54:45 crc kubenswrapper[4909]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 02 10:54:45 crc kubenswrapper[4909]: + sleep 0.5 Feb 02 10:54:45 crc kubenswrapper[4909]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 02 10:54:45 crc kubenswrapper[4909]: + cleanup_ovsdb_server_semaphore Feb 02 10:54:45 crc kubenswrapper[4909]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 02 10:54:45 crc kubenswrapper[4909]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 02 10:54:45 crc kubenswrapper[4909]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-9hkv5" message=< Feb 02 10:54:45 crc kubenswrapper[4909]: Exiting ovsdb-server (5) [ OK ] Feb 02 10:54:45 crc kubenswrapper[4909]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 02 10:54:45 crc kubenswrapper[4909]: + source /usr/local/bin/container-scripts/functions Feb 02 10:54:45 crc kubenswrapper[4909]: ++ OVNBridge=br-int Feb 02 10:54:45 crc kubenswrapper[4909]: ++ OVNRemote=tcp:localhost:6642 Feb 02 10:54:45 crc kubenswrapper[4909]: ++ OVNEncapType=geneve Feb 02 10:54:45 crc kubenswrapper[4909]: ++ OVNAvailabilityZones= Feb 02 10:54:45 crc kubenswrapper[4909]: ++ EnableChassisAsGateway=true Feb 02 10:54:45 crc kubenswrapper[4909]: ++ PhysicalNetworks= Feb 02 10:54:45 crc kubenswrapper[4909]: ++ OVNHostName= Feb 02 10:54:45 crc kubenswrapper[4909]: ++ DB_FILE=/etc/openvswitch/conf.db Feb 02 10:54:45 crc kubenswrapper[4909]: ++ ovs_dir=/var/lib/openvswitch Feb 02 10:54:45 crc kubenswrapper[4909]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 02 10:54:45 crc kubenswrapper[4909]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 02 10:54:45 crc kubenswrapper[4909]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 02 10:54:45 crc kubenswrapper[4909]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 02 10:54:45 crc kubenswrapper[4909]: + sleep 0.5 Feb 02 10:54:45 crc kubenswrapper[4909]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 02 10:54:45 crc kubenswrapper[4909]: + cleanup_ovsdb_server_semaphore Feb 02 10:54:45 crc kubenswrapper[4909]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 02 10:54:45 crc kubenswrapper[4909]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 02 10:54:45 crc kubenswrapper[4909]: > Feb 02 10:54:45 crc kubenswrapper[4909]: E0202 10:54:45.815091 4909 kuberuntime_container.go:691] "PreStop hook failed" err=< Feb 02 10:54:45 crc kubenswrapper[4909]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 02 10:54:45 crc kubenswrapper[4909]: + source /usr/local/bin/container-scripts/functions Feb 02 10:54:45 crc kubenswrapper[4909]: ++ OVNBridge=br-int Feb 02 10:54:45 crc kubenswrapper[4909]: ++ OVNRemote=tcp:localhost:6642 Feb 02 10:54:45 crc kubenswrapper[4909]: ++ OVNEncapType=geneve Feb 02 10:54:45 crc kubenswrapper[4909]: ++ OVNAvailabilityZones= Feb 02 10:54:45 crc kubenswrapper[4909]: ++ EnableChassisAsGateway=true Feb 02 10:54:45 crc kubenswrapper[4909]: ++ PhysicalNetworks= Feb 02 10:54:45 crc kubenswrapper[4909]: ++ OVNHostName= Feb 02 10:54:45 crc kubenswrapper[4909]: ++ DB_FILE=/etc/openvswitch/conf.db Feb 02 10:54:45 crc kubenswrapper[4909]: ++ ovs_dir=/var/lib/openvswitch Feb 02 10:54:45 crc kubenswrapper[4909]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 02 10:54:45 crc kubenswrapper[4909]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 02 10:54:45 crc kubenswrapper[4909]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 02 10:54:45 crc kubenswrapper[4909]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 02 10:54:45 crc kubenswrapper[4909]: + sleep 0.5 Feb 02 10:54:45 crc kubenswrapper[4909]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 02 10:54:45 crc kubenswrapper[4909]: + cleanup_ovsdb_server_semaphore Feb 02 10:54:45 crc kubenswrapper[4909]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 02 10:54:45 crc kubenswrapper[4909]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 02 10:54:45 crc kubenswrapper[4909]: > pod="openstack/ovn-controller-ovs-9hkv5" podUID="ef5a90ca-c133-400b-b869-becc0b1f60a0" containerName="ovsdb-server" containerID="cri-o://4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18" Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.815128 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-9hkv5" podUID="ef5a90ca-c133-400b-b869-becc0b1f60a0" containerName="ovsdb-server" containerID="cri-o://4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18" gracePeriod=29 Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.829105 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-fgzzd"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.829643 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-rx2rn"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.838575 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6d4cb98fc-6tg42"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.839118 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6d4cb98fc-6tg42" podUID="d4589304-68d2-48c9-a691-e34a9cb4c75b" containerName="barbican-worker-log" containerID="cri-o://766cdd0ecebd2c830eb9eb4207f7f04fbf9e06724a13a143f2ffa40073116503" gracePeriod=30 Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.839611 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6d4cb98fc-6tg42" 
podUID="d4589304-68d2-48c9-a691-e34a9cb4c75b" containerName="barbican-worker" containerID="cri-o://397ade4dfec5a408d839ad6fb1e26085b9b09d9f93c56399c0d7b99f36c1aded" gracePeriod=30 Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.853871 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.863339 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6bbd5778f6-n6mw6"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.890947 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6c7fbb5f9b-4fbwp"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.891086 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6dd95bf6f-66vhd"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.891416 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" podUID="04232dcc-dda5-4774-b999-5104335f2da0" containerName="barbican-api-log" containerID="cri-o://7573e7b7356e01328ac6964e4639c84a155f497ab30f7801c09c8069b1ba1175" gracePeriod=30 Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.891864 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" podUID="04232dcc-dda5-4774-b999-5104335f2da0" containerName="barbican-api" containerID="cri-o://806aaa0ff1451eccc6cc80605c3b759d86a7c3041d5b3685c9a6764ed0b9dda2" gracePeriod=30 Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.906678 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-9hkv5" podUID="ef5a90ca-c133-400b-b869-becc0b1f60a0" containerName="ovs-vswitchd" containerID="cri-o://8f5bb18b7321547b528f5af5b21f281e2f479493379bee20dc5095a16bc7f839" gracePeriod=29 Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.908537 4909 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-75589bd9c8-npg4p"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.908900 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-75589bd9c8-npg4p" podUID="bc6ef35e-2d40-46c4-9cf7-e2e8adae067a" containerName="barbican-keystone-listener-log" containerID="cri-o://cdf003d472b80a2499008207df29bc8a7fe1a4f1fbca8fa6aa522ad599f3f1ea" gracePeriod=30 Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.909168 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-75589bd9c8-npg4p" podUID="bc6ef35e-2d40-46c4-9cf7-e2e8adae067a" containerName="barbican-keystone-listener" containerID="cri-o://b85d85d06b59a99a123c0147182bc23c0f8782f8baf6bdc50f6e3c5737f2292a" gracePeriod=30 Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.921879 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-795c6654c6-z72r6"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.929857 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-chlnb"] Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.998068 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="b441d32f-f76f-4e7b-b3fe-40e93b126567" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Feb 02 10:54:45 crc kubenswrapper[4909]: I0202 10:54:45.998328 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="1ab15f72-b249-42d5-8698-273c5afc7758" containerName="rabbitmq" containerID="cri-o://13eac5d6e05d47fc8abeb3f859fa00ba42d29ef4a5dae2782f3325bd81acbda9" gracePeriod=604800 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.000891 4909 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-chlnb"] Feb 02 10:54:46 crc kubenswrapper[4909]: E0202 10:54:46.008043 4909 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.012241 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="e0af95ff-c608-4b73-92fa-d4a443a9eaaf" containerName="galera" containerID="cri-o://f24f9c93513df807a7e645eb277c34de34ac929d0c937520c5a166ce83f93d7c" gracePeriod=30 Feb 02 10:54:46 crc kubenswrapper[4909]: E0202 10:54:46.012655 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1ab15f72-b249-42d5-8698-273c5afc7758-config-data podName:1ab15f72-b249-42d5-8698-273c5afc7758 nodeName:}" failed. No retries permitted until 2026-02-02 10:54:48.012622018 +0000 UTC m=+1413.758722753 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1ab15f72-b249-42d5-8698-273c5afc7758-config-data") pod "rabbitmq-cell1-server-0" (UID: "1ab15f72-b249-42d5-8698-273c5afc7758") : configmap "rabbitmq-cell1-config-data" not found Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.024608 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.024840 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="05143579-706d-4107-9d7a-a63b4a13c187" containerName="nova-cell1-conductor-conductor" containerID="cri-o://7e1c8b8ad2c25126925d89042eabffbf703c5cb8a560053f5e2057f259e35466" gracePeriod=30 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.036709 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5vq8t"] Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.055860 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.056243 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="2399f7a5-86e0-46bd-9f3d-624d1208b9cc" containerName="nova-cell0-conductor-conductor" containerID="cri-o://e61cb4193c75698f10d9c67582e2641c6ccc4d6732028cd8616852c78bc88b51" gracePeriod=30 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.071358 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5vq8t"] Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.084495 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.084772 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" 
podUID="ba559652-0584-49c5-91d6-7d7fdd596dc2" containerName="nova-scheduler-scheduler" containerID="cri-o://7f5547c30d19b6a3e624654a6b30546acaaa5bb4e35836a31ce664cbd4d3448e" gracePeriod=30 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.191637 4909 generic.go:334] "Generic (PLEG): container finished" podID="d4589304-68d2-48c9-a691-e34a9cb4c75b" containerID="766cdd0ecebd2c830eb9eb4207f7f04fbf9e06724a13a143f2ffa40073116503" exitCode=143 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.192003 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d4cb98fc-6tg42" event={"ID":"d4589304-68d2-48c9-a691-e34a9cb4c75b","Type":"ContainerDied","Data":"766cdd0ecebd2c830eb9eb4207f7f04fbf9e06724a13a143f2ffa40073116503"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.217496 4909 generic.go:334] "Generic (PLEG): container finished" podID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerID="403544c94cece49fecf3e837a0ceb79d6332a055fdfa13f162fa0295add4bbb7" exitCode=0 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.217524 4909 generic.go:334] "Generic (PLEG): container finished" podID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerID="e7bf2be62b7e97abc8524922264d489ae88403ad72a536c23b1a2c1f6278395b" exitCode=0 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.217530 4909 generic.go:334] "Generic (PLEG): container finished" podID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerID="2e0f7d484d5647dce2a899dda4783d32b2783471d37c33f12abbe7c324c5a495" exitCode=0 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.217537 4909 generic.go:334] "Generic (PLEG): container finished" podID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerID="4769dc8794676e2cda42fe1e1591dca654572c12d01ef7df149e868e62a3a604" exitCode=0 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.217543 4909 generic.go:334] "Generic (PLEG): container finished" podID="7c400ec0-7faf-4151-b34e-ee28044b89e7" 
containerID="858adda0671141db145aafebaea7fb2c8fe86cafddc9fe78b0569ebbb13f0012" exitCode=0 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.217550 4909 generic.go:334] "Generic (PLEG): container finished" podID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerID="3c20875f1de9fb350baed752230d656b218f7820f84b09af0fae63228ac55300" exitCode=0 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.217556 4909 generic.go:334] "Generic (PLEG): container finished" podID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerID="cbfed4f1069ee483c0b5622d010bb4050eb703f82417b83b9be6a69f91f6416e" exitCode=0 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.217562 4909 generic.go:334] "Generic (PLEG): container finished" podID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerID="15ada96398d5e422dc198589740e54091d46278fea5a1976da718efa78d1aea0" exitCode=0 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.217568 4909 generic.go:334] "Generic (PLEG): container finished" podID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerID="1a36d372a28220791f7d96800ea8889bd6721cfb062a1df523373e1709e54828" exitCode=0 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.217574 4909 generic.go:334] "Generic (PLEG): container finished" podID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerID="75e96eb7331791bb4bb7609ae1be6e72027c31244a81358a0312c1fe5e88d79b" exitCode=0 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.217581 4909 generic.go:334] "Generic (PLEG): container finished" podID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerID="f4ba97736c676ea7105906bcc956b64814d9063116ac1ae4c7dd355c1861de1a" exitCode=0 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.217588 4909 generic.go:334] "Generic (PLEG): container finished" podID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerID="bb0826ef7db615ea25639f354b4e3d66a6b4b7a4c615c65933bc47849ffdcf70" exitCode=0 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.217595 4909 generic.go:334] "Generic (PLEG): container 
finished" podID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerID="688f9c6773a03040cd5ec3fcaab21a892ba396e408fdcffda51c117011469111" exitCode=0 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.217602 4909 generic.go:334] "Generic (PLEG): container finished" podID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerID="bdf18a1a02fe85f1bb6cf3ed370c1c5c18587a314b6cf14fea7469bf28031585" exitCode=0 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.217640 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerDied","Data":"403544c94cece49fecf3e837a0ceb79d6332a055fdfa13f162fa0295add4bbb7"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.217668 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerDied","Data":"e7bf2be62b7e97abc8524922264d489ae88403ad72a536c23b1a2c1f6278395b"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.217681 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerDied","Data":"2e0f7d484d5647dce2a899dda4783d32b2783471d37c33f12abbe7c324c5a495"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.217692 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerDied","Data":"4769dc8794676e2cda42fe1e1591dca654572c12d01ef7df149e868e62a3a604"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.217701 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerDied","Data":"858adda0671141db145aafebaea7fb2c8fe86cafddc9fe78b0569ebbb13f0012"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.217708 4909 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerDied","Data":"3c20875f1de9fb350baed752230d656b218f7820f84b09af0fae63228ac55300"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.217717 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerDied","Data":"cbfed4f1069ee483c0b5622d010bb4050eb703f82417b83b9be6a69f91f6416e"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.217726 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerDied","Data":"15ada96398d5e422dc198589740e54091d46278fea5a1976da718efa78d1aea0"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.217734 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerDied","Data":"1a36d372a28220791f7d96800ea8889bd6721cfb062a1df523373e1709e54828"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.217742 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerDied","Data":"75e96eb7331791bb4bb7609ae1be6e72027c31244a81358a0312c1fe5e88d79b"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.217750 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerDied","Data":"f4ba97736c676ea7105906bcc956b64814d9063116ac1ae4c7dd355c1861de1a"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.217758 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerDied","Data":"bb0826ef7db615ea25639f354b4e3d66a6b4b7a4c615c65933bc47849ffdcf70"} Feb 02 
10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.217766 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerDied","Data":"688f9c6773a03040cd5ec3fcaab21a892ba396e408fdcffda51c117011469111"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.217774 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerDied","Data":"bdf18a1a02fe85f1bb6cf3ed370c1c5c18587a314b6cf14fea7469bf28031585"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.219116 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6dd95bf6f-66vhd" event={"ID":"65028b8f-2d3c-40f3-8c17-239856623f4e","Type":"ContainerStarted","Data":"0cf32bc67d8b4fdffda88ed8894f8d0cadb2655deceb75283d5998db010679a3"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.224706 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-pzcjh_c631ddad-ab3a-488f-9947-2f3385fd912c/openstack-network-exporter/0.log" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.224771 4909 generic.go:334] "Generic (PLEG): container finished" podID="c631ddad-ab3a-488f-9947-2f3385fd912c" containerID="e616e96154cc54a51e3703d10ef0dafd77cf698728a725fab58c048715a29bb9" exitCode=2 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.224862 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-pzcjh" event={"ID":"c631ddad-ab3a-488f-9947-2f3385fd912c","Type":"ContainerDied","Data":"e616e96154cc54a51e3703d10ef0dafd77cf698728a725fab58c048715a29bb9"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.224916 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-pzcjh" 
event={"ID":"c631ddad-ab3a-488f-9947-2f3385fd912c","Type":"ContainerDied","Data":"54baae7c7b75f13bae33d983f2d0ddee0aa4501c4a00d2a1c1cf3c2c25652c45"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.224931 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54baae7c7b75f13bae33d983f2d0ddee0aa4501c4a00d2a1c1cf3c2c25652c45" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.230226 4909 generic.go:334] "Generic (PLEG): container finished" podID="ef5a90ca-c133-400b-b869-becc0b1f60a0" containerID="4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18" exitCode=0 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.230298 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9hkv5" event={"ID":"ef5a90ca-c133-400b-b869-becc0b1f60a0","Type":"ContainerDied","Data":"4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.268084 4909 generic.go:334] "Generic (PLEG): container finished" podID="7202de6a-156c-4c06-9e08-3e62cfcf367e" containerID="dfe3f218431d62cbf5802f23a9e8f8dfdc1b311556035b91ad8568910dc22027" exitCode=143 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.268174 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7202de6a-156c-4c06-9e08-3e62cfcf367e","Type":"ContainerDied","Data":"dfe3f218431d62cbf5802f23a9e8f8dfdc1b311556035b91ad8568910dc22027"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.272285 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_f0a30163-0b42-493b-b775-d88218bd1844/ovsdbserver-nb/0.log" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.272380 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.273322 4909 generic.go:334] "Generic (PLEG): container finished" podID="04232dcc-dda5-4774-b999-5104335f2da0" containerID="7573e7b7356e01328ac6964e4639c84a155f497ab30f7801c09c8069b1ba1175" exitCode=143 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.273410 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" event={"ID":"04232dcc-dda5-4774-b999-5104335f2da0","Type":"ContainerDied","Data":"7573e7b7356e01328ac6964e4639c84a155f497ab30f7801c09c8069b1ba1175"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.279391 4909 generic.go:334] "Generic (PLEG): container finished" podID="b6be13cc-01ed-441f-b2c9-dc024fcb4b18" containerID="39650f1fdde2fdbf3bd917a0f731e9a813c0c766ece71f265bc33c3787265ed1" exitCode=143 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.279559 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5d56d8dff8-fh9sw" event={"ID":"b6be13cc-01ed-441f-b2c9-dc024fcb4b18","Type":"ContainerDied","Data":"39650f1fdde2fdbf3bd917a0f731e9a813c0c766ece71f265bc33c3787265ed1"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.298182 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="1ab15f72-b249-42d5-8698-273c5afc7758" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.318751 4909 generic.go:334] "Generic (PLEG): container finished" podID="8c52b752-391b-4770-9191-3494df4e3999" containerID="3f7435b37c532d142cc3a2e44d9a5f007446b4a37f3b1f4607541c257406cba7" exitCode=143 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.318844 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"8c52b752-391b-4770-9191-3494df4e3999","Type":"ContainerDied","Data":"3f7435b37c532d142cc3a2e44d9a5f007446b4a37f3b1f4607541c257406cba7"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.318886 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-pzcjh_c631ddad-ab3a-488f-9947-2f3385fd912c/openstack-network-exporter/0.log" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.318954 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-pzcjh" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.325373 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7b658933-f66d-47df-8b75-a42cd55b9bf4/ovsdbserver-sb/0.log" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.325408 4909 generic.go:334] "Generic (PLEG): container finished" podID="7b658933-f66d-47df-8b75-a42cd55b9bf4" containerID="b05bbe00b26a1b9593023419b5d4a14f74126dc62da00a8e37a5ac60df828f91" exitCode=143 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.325453 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7b658933-f66d-47df-8b75-a42cd55b9bf4","Type":"ContainerDied","Data":"b05bbe00b26a1b9593023419b5d4a14f74126dc62da00a8e37a5ac60df828f91"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.338077 4909 generic.go:334] "Generic (PLEG): container finished" podID="97409ffd-f1ab-4a1a-9939-a041a4085b1a" containerID="1ea9ba4798372399476cd86a9d7248e0dc1ac843d30751cd5067d09ad4963c62" exitCode=143 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.338240 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"97409ffd-f1ab-4a1a-9939-a041a4085b1a","Type":"ContainerDied","Data":"1ea9ba4798372399476cd86a9d7248e0dc1ac843d30751cd5067d09ad4963c62"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.350581 4909 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.351747 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_f0a30163-0b42-493b-b775-d88218bd1844/ovsdbserver-nb/0.log" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.351902 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.352713 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f0a30163-0b42-493b-b775-d88218bd1844","Type":"ContainerDied","Data":"eda4514efdafb9d3edc2db4164d749803408bc43963f3b15c82cc113eca3e28a"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.352754 4909 scope.go:117] "RemoveContainer" containerID="960092178e862c3b77d8e3bf300b8c16cf49e7db4eb0520f595f9cef3f507a97" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.362364 4909 generic.go:334] "Generic (PLEG): container finished" podID="bc6ef35e-2d40-46c4-9cf7-e2e8adae067a" containerID="cdf003d472b80a2499008207df29bc8a7fe1a4f1fbca8fa6aa522ad599f3f1ea" exitCode=143 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.362438 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75589bd9c8-npg4p" event={"ID":"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a","Type":"ContainerDied","Data":"cdf003d472b80a2499008207df29bc8a7fe1a4f1fbca8fa6aa522ad599f3f1ea"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.362834 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7b658933-f66d-47df-8b75-a42cd55b9bf4/ovsdbserver-sb/0.log" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.362909 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.370074 4909 generic.go:334] "Generic (PLEG): container finished" podID="81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425" containerID="3487017a73b83d9a77436b7d788ba96efdea8f82fb04b45ef153d5a993e724bc" exitCode=0 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.370164 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" event={"ID":"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425","Type":"ContainerDied","Data":"3487017a73b83d9a77436b7d788ba96efdea8f82fb04b45ef153d5a993e724bc"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.372844 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-blz7s" event={"ID":"d5e2e76d-5e02-4488-ae0d-5acbdb1aa060","Type":"ContainerStarted","Data":"50c92004cccd536ad46a63bf928a30fe9638398efd3f8ced1d508d7cc8a24b19"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.375048 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.376123 4909 generic.go:334] "Generic (PLEG): container finished" podID="40590ae8-432b-4d01-a586-f61f07a206b0" containerID="35a67f40f9a2ea8483465449c6307919ab0017c539599beb47d55125999c2929" exitCode=0 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.376144 4909 generic.go:334] "Generic (PLEG): container finished" podID="40590ae8-432b-4d01-a586-f61f07a206b0" containerID="811b1c8d9db9cb5f0ee50813644e170eb643be43ad3fd1cf8b8def4d133ce9a4" exitCode=0 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.376174 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d64c85fd5-nns29" event={"ID":"40590ae8-432b-4d01-a586-f61f07a206b0","Type":"ContainerDied","Data":"35a67f40f9a2ea8483465449c6307919ab0017c539599beb47d55125999c2929"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.376191 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d64c85fd5-nns29" event={"ID":"40590ae8-432b-4d01-a586-f61f07a206b0","Type":"ContainerDied","Data":"811b1c8d9db9cb5f0ee50813644e170eb643be43ad3fd1cf8b8def4d133ce9a4"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.377209 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6bbd5778f6-n6mw6" event={"ID":"ddda8a20-60ba-4ae9-837a-44fa44518b8a","Type":"ContainerStarted","Data":"1394fd3d1fe33a21fdb054a3861a35490ddf37761eae2045385f28453f46ff3a"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.378241 4909 generic.go:334] "Generic (PLEG): container finished" podID="c8dba959-faf4-4f15-96d3-e8f67ae00d62" containerID="908ec2618be6ce0ccb5553a70d2b7d51ef915f04d269be9c3d7f53480f657d99" exitCode=137 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.378307 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.390849 4909 generic.go:334] "Generic (PLEG): container finished" podID="7f0cf9e9-b663-4be3-a435-c7dd6deea228" containerID="1a3c44ae75c0e82a41ac845e6bc75bd78044db6d50d81a3cf25783da569108b8" exitCode=143 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.390985 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f0cf9e9-b663-4be3-a435-c7dd6deea228","Type":"ContainerDied","Data":"1a3c44ae75c0e82a41ac845e6bc75bd78044db6d50d81a3cf25783da569108b8"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.394176 4909 generic.go:334] "Generic (PLEG): container finished" podID="d1145da4-90e5-422b-917a-33473a9c5d6a" containerID="07fee0cf291ae485ea00f0668c32744aaec34c29563b1aae25a47955a349b94d" exitCode=0 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.394275 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8588c46577-4cp8s" event={"ID":"d1145da4-90e5-422b-917a-33473a9c5d6a","Type":"ContainerDied","Data":"07fee0cf291ae485ea00f0668c32744aaec34c29563b1aae25a47955a349b94d"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.410594 4909 generic.go:334] "Generic (PLEG): container finished" podID="14a8699a-66db-48c6-8834-bda4e21ef1d9" containerID="e41df562f1a41e47706274614d03c13faaba64903b9ae50a415bad9f51e06946" exitCode=143 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.410682 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"14a8699a-66db-48c6-8834-bda4e21ef1d9","Type":"ContainerDied","Data":"e41df562f1a41e47706274614d03c13faaba64903b9ae50a415bad9f51e06946"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.411444 4909 scope.go:117] "RemoveContainer" containerID="97f377e1596e6eafef005fa1e1978dccc7634e807d2ca9c148149e41e82b87df" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.415832 4909 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0a30163-0b42-493b-b775-d88218bd1844-ovsdbserver-nb-tls-certs\") pod \"f0a30163-0b42-493b-b775-d88218bd1844\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.415890 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0a30163-0b42-493b-b775-d88218bd1844-config\") pod \"f0a30163-0b42-493b-b775-d88218bd1844\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.415918 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f0a30163-0b42-493b-b775-d88218bd1844-ovsdb-rundir\") pod \"f0a30163-0b42-493b-b775-d88218bd1844\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.415952 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0a30163-0b42-493b-b775-d88218bd1844-scripts\") pod \"f0a30163-0b42-493b-b775-d88218bd1844\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.415988 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c631ddad-ab3a-488f-9947-2f3385fd912c-combined-ca-bundle\") pod \"c631ddad-ab3a-488f-9947-2f3385fd912c\" (UID: \"c631ddad-ab3a-488f-9947-2f3385fd912c\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.416029 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0a30163-0b42-493b-b775-d88218bd1844-combined-ca-bundle\") pod \"f0a30163-0b42-493b-b775-d88218bd1844\" 
(UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.416103 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c631ddad-ab3a-488f-9947-2f3385fd912c-config\") pod \"c631ddad-ab3a-488f-9947-2f3385fd912c\" (UID: \"c631ddad-ab3a-488f-9947-2f3385fd912c\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.416132 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"f0a30163-0b42-493b-b775-d88218bd1844\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.416153 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0a30163-0b42-493b-b775-d88218bd1844-metrics-certs-tls-certs\") pod \"f0a30163-0b42-493b-b775-d88218bd1844\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.416236 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c631ddad-ab3a-488f-9947-2f3385fd912c-metrics-certs-tls-certs\") pod \"c631ddad-ab3a-488f-9947-2f3385fd912c\" (UID: \"c631ddad-ab3a-488f-9947-2f3385fd912c\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.416276 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c631ddad-ab3a-488f-9947-2f3385fd912c-ovn-rundir\") pod \"c631ddad-ab3a-488f-9947-2f3385fd912c\" (UID: \"c631ddad-ab3a-488f-9947-2f3385fd912c\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.416328 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qm4ws\" (UniqueName: 
\"kubernetes.io/projected/c631ddad-ab3a-488f-9947-2f3385fd912c-kube-api-access-qm4ws\") pod \"c631ddad-ab3a-488f-9947-2f3385fd912c\" (UID: \"c631ddad-ab3a-488f-9947-2f3385fd912c\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.416362 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghzql\" (UniqueName: \"kubernetes.io/projected/f0a30163-0b42-493b-b775-d88218bd1844-kube-api-access-ghzql\") pod \"f0a30163-0b42-493b-b775-d88218bd1844\" (UID: \"f0a30163-0b42-493b-b775-d88218bd1844\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.416403 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c631ddad-ab3a-488f-9947-2f3385fd912c-ovs-rundir\") pod \"c631ddad-ab3a-488f-9947-2f3385fd912c\" (UID: \"c631ddad-ab3a-488f-9947-2f3385fd912c\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.416953 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c631ddad-ab3a-488f-9947-2f3385fd912c-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "c631ddad-ab3a-488f-9947-2f3385fd912c" (UID: "c631ddad-ab3a-488f-9947-2f3385fd912c"). InnerVolumeSpecName "ovs-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.417116 4909 generic.go:334] "Generic (PLEG): container finished" podID="9b24f572-8a70-4a46-b3cf-e50ae4859892" containerID="5181b71ba760ed91aeb7b3915c3d7814e4e85e3d4983092599caaa351d56bd8b" exitCode=0 Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.417159 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9b24f572-8a70-4a46-b3cf-e50ae4859892","Type":"ContainerDied","Data":"5181b71ba760ed91aeb7b3915c3d7814e4e85e3d4983092599caaa351d56bd8b"} Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.417734 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0a30163-0b42-493b-b775-d88218bd1844-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "f0a30163-0b42-493b-b775-d88218bd1844" (UID: "f0a30163-0b42-493b-b775-d88218bd1844"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.420910 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c631ddad-ab3a-488f-9947-2f3385fd912c-config" (OuterVolumeSpecName: "config") pod "c631ddad-ab3a-488f-9947-2f3385fd912c" (UID: "c631ddad-ab3a-488f-9947-2f3385fd912c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.424253 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0a30163-0b42-493b-b775-d88218bd1844-scripts" (OuterVolumeSpecName: "scripts") pod "f0a30163-0b42-493b-b775-d88218bd1844" (UID: "f0a30163-0b42-493b-b775-d88218bd1844"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.426163 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c631ddad-ab3a-488f-9947-2f3385fd912c-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "c631ddad-ab3a-488f-9947-2f3385fd912c" (UID: "c631ddad-ab3a-488f-9947-2f3385fd912c"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.430486 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0a30163-0b42-493b-b775-d88218bd1844-config" (OuterVolumeSpecName: "config") pod "f0a30163-0b42-493b-b775-d88218bd1844" (UID: "f0a30163-0b42-493b-b775-d88218bd1844"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.430587 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "f0a30163-0b42-493b-b775-d88218bd1844" (UID: "f0a30163-0b42-493b-b775-d88218bd1844"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.434329 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0a30163-0b42-493b-b775-d88218bd1844-kube-api-access-ghzql" (OuterVolumeSpecName: "kube-api-access-ghzql") pod "f0a30163-0b42-493b-b775-d88218bd1844" (UID: "f0a30163-0b42-493b-b775-d88218bd1844"). InnerVolumeSpecName "kube-api-access-ghzql". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.442392 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c631ddad-ab3a-488f-9947-2f3385fd912c-kube-api-access-qm4ws" (OuterVolumeSpecName: "kube-api-access-qm4ws") pod "c631ddad-ab3a-488f-9947-2f3385fd912c" (UID: "c631ddad-ab3a-488f-9947-2f3385fd912c"). InnerVolumeSpecName "kube-api-access-qm4ws". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.476651 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c631ddad-ab3a-488f-9947-2f3385fd912c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c631ddad-ab3a-488f-9947-2f3385fd912c" (UID: "c631ddad-ab3a-488f-9947-2f3385fd912c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.506376 4909 scope.go:117] "RemoveContainer" containerID="908ec2618be6ce0ccb5553a70d2b7d51ef915f04d269be9c3d7f53480f657d99" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.517072 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-8588c46577-4cp8s" podUID="d1145da4-90e5-422b-917a-33473a9c5d6a" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.167:9696/\": dial tcp 10.217.0.167:9696: connect: connection refused" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.518255 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"7b658933-f66d-47df-8b75-a42cd55b9bf4\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.518309 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-jkm9j\" (UniqueName: \"kubernetes.io/projected/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-kube-api-access-jkm9j\") pod \"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\" (UID: \"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.518335 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b658933-f66d-47df-8b75-a42cd55b9bf4-combined-ca-bundle\") pod \"7b658933-f66d-47df-8b75-a42cd55b9bf4\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.518380 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-dns-swift-storage-0\") pod \"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\" (UID: \"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.518422 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c8dba959-faf4-4f15-96d3-e8f67ae00d62-openstack-config-secret\") pod \"c8dba959-faf4-4f15-96d3-e8f67ae00d62\" (UID: \"c8dba959-faf4-4f15-96d3-e8f67ae00d62\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.518470 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-ovsdbserver-nb\") pod \"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\" (UID: \"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.518562 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c8dba959-faf4-4f15-96d3-e8f67ae00d62-openstack-config\") pod \"c8dba959-faf4-4f15-96d3-e8f67ae00d62\" (UID: 
\"c8dba959-faf4-4f15-96d3-e8f67ae00d62\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.518584 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6vpm\" (UniqueName: \"kubernetes.io/projected/c8dba959-faf4-4f15-96d3-e8f67ae00d62-kube-api-access-m6vpm\") pod \"c8dba959-faf4-4f15-96d3-e8f67ae00d62\" (UID: \"c8dba959-faf4-4f15-96d3-e8f67ae00d62\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.518612 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7b658933-f66d-47df-8b75-a42cd55b9bf4-ovsdb-rundir\") pod \"7b658933-f66d-47df-8b75-a42cd55b9bf4\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.518668 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-config\") pod \"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\" (UID: \"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.518703 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-dns-svc\") pod \"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\" (UID: \"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.518738 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b658933-f66d-47df-8b75-a42cd55b9bf4-metrics-certs-tls-certs\") pod \"7b658933-f66d-47df-8b75-a42cd55b9bf4\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.518769 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7b658933-f66d-47df-8b75-a42cd55b9bf4-config\") pod \"7b658933-f66d-47df-8b75-a42cd55b9bf4\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.518822 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv8g4\" (UniqueName: \"kubernetes.io/projected/7b658933-f66d-47df-8b75-a42cd55b9bf4-kube-api-access-dv8g4\") pod \"7b658933-f66d-47df-8b75-a42cd55b9bf4\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.518849 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b658933-f66d-47df-8b75-a42cd55b9bf4-scripts\") pod \"7b658933-f66d-47df-8b75-a42cd55b9bf4\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.518891 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-ovsdbserver-sb\") pod \"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\" (UID: \"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.518916 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8dba959-faf4-4f15-96d3-e8f67ae00d62-combined-ca-bundle\") pod \"c8dba959-faf4-4f15-96d3-e8f67ae00d62\" (UID: \"c8dba959-faf4-4f15-96d3-e8f67ae00d62\") " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.518955 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b658933-f66d-47df-8b75-a42cd55b9bf4-ovsdbserver-sb-tls-certs\") pod \"7b658933-f66d-47df-8b75-a42cd55b9bf4\" (UID: \"7b658933-f66d-47df-8b75-a42cd55b9bf4\") " Feb 02 10:54:46 crc 
kubenswrapper[4909]: I0202 10:54:46.519710 4909 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c631ddad-ab3a-488f-9947-2f3385fd912c-ovn-rundir\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.519730 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm4ws\" (UniqueName: \"kubernetes.io/projected/c631ddad-ab3a-488f-9947-2f3385fd912c-kube-api-access-qm4ws\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.519744 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghzql\" (UniqueName: \"kubernetes.io/projected/f0a30163-0b42-493b-b775-d88218bd1844-kube-api-access-ghzql\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.519761 4909 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c631ddad-ab3a-488f-9947-2f3385fd912c-ovs-rundir\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.519773 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0a30163-0b42-493b-b775-d88218bd1844-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.519784 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f0a30163-0b42-493b-b775-d88218bd1844-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.519797 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0a30163-0b42-493b-b775-d88218bd1844-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.519826 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c631ddad-ab3a-488f-9947-2f3385fd912c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.519838 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c631ddad-ab3a-488f-9947-2f3385fd912c-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.519861 4909 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.523654 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b658933-f66d-47df-8b75-a42cd55b9bf4-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "7b658933-f66d-47df-8b75-a42cd55b9bf4" (UID: "7b658933-f66d-47df-8b75-a42cd55b9bf4"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.528152 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b658933-f66d-47df-8b75-a42cd55b9bf4-config" (OuterVolumeSpecName: "config") pod "7b658933-f66d-47df-8b75-a42cd55b9bf4" (UID: "7b658933-f66d-47df-8b75-a42cd55b9bf4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.531029 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b658933-f66d-47df-8b75-a42cd55b9bf4-scripts" (OuterVolumeSpecName: "scripts") pod "7b658933-f66d-47df-8b75-a42cd55b9bf4" (UID: "7b658933-f66d-47df-8b75-a42cd55b9bf4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.531253 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "7b658933-f66d-47df-8b75-a42cd55b9bf4" (UID: "7b658933-f66d-47df-8b75-a42cd55b9bf4"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.563058 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b658933-f66d-47df-8b75-a42cd55b9bf4-kube-api-access-dv8g4" (OuterVolumeSpecName: "kube-api-access-dv8g4") pod "7b658933-f66d-47df-8b75-a42cd55b9bf4" (UID: "7b658933-f66d-47df-8b75-a42cd55b9bf4"). InnerVolumeSpecName "kube-api-access-dv8g4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.569395 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-kube-api-access-jkm9j" (OuterVolumeSpecName: "kube-api-access-jkm9j") pod "81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425" (UID: "81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425"). InnerVolumeSpecName "kube-api-access-jkm9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.604381 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8dba959-faf4-4f15-96d3-e8f67ae00d62-kube-api-access-m6vpm" (OuterVolumeSpecName: "kube-api-access-m6vpm") pod "c8dba959-faf4-4f15-96d3-e8f67ae00d62" (UID: "c8dba959-faf4-4f15-96d3-e8f67ae00d62"). InnerVolumeSpecName "kube-api-access-m6vpm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.621444 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6vpm\" (UniqueName: \"kubernetes.io/projected/c8dba959-faf4-4f15-96d3-e8f67ae00d62-kube-api-access-m6vpm\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.621476 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7b658933-f66d-47df-8b75-a42cd55b9bf4-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.621487 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b658933-f66d-47df-8b75-a42cd55b9bf4-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.621497 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dv8g4\" (UniqueName: \"kubernetes.io/projected/7b658933-f66d-47df-8b75-a42cd55b9bf4-kube-api-access-dv8g4\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.621504 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b658933-f66d-47df-8b75-a42cd55b9bf4-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.621523 4909 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.621532 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkm9j\" (UniqueName: \"kubernetes.io/projected/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-kube-api-access-jkm9j\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.638797 4909 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0a30163-0b42-493b-b775-d88218bd1844-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0a30163-0b42-493b-b775-d88218bd1844" (UID: "f0a30163-0b42-493b-b775-d88218bd1844"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.660116 4909 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.712067 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0a30163-0b42-493b-b775-d88218bd1844-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "f0a30163-0b42-493b-b775-d88218bd1844" (UID: "f0a30163-0b42-493b-b775-d88218bd1844"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.724158 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0a30163-0b42-493b-b775-d88218bd1844-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.724183 4909 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.724195 4909 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0a30163-0b42-493b-b775-d88218bd1844-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.824043 4909 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.826461 4909 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.827058 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425" (UID: "81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.894088 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b658933-f66d-47df-8b75-a42cd55b9bf4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b658933-f66d-47df-8b75-a42cd55b9bf4" (UID: "7b658933-f66d-47df-8b75-a42cd55b9bf4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.929360 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.929402 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b658933-f66d-47df-8b75-a42cd55b9bf4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.933984 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c631ddad-ab3a-488f-9947-2f3385fd912c-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "c631ddad-ab3a-488f-9947-2f3385fd912c" (UID: "c631ddad-ab3a-488f-9947-2f3385fd912c"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.977066 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8dba959-faf4-4f15-96d3-e8f67ae00d62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8dba959-faf4-4f15-96d3-e8f67ae00d62" (UID: "c8dba959-faf4-4f15-96d3-e8f67ae00d62"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.981467 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425" (UID: "81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:46 crc kubenswrapper[4909]: I0202 10:54:46.994430 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8dba959-faf4-4f15-96d3-e8f67ae00d62-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c8dba959-faf4-4f15-96d3-e8f67ae00d62" (UID: "c8dba959-faf4-4f15-96d3-e8f67ae00d62"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.011341 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8dba959-faf4-4f15-96d3-e8f67ae00d62-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c8dba959-faf4-4f15-96d3-e8f67ae00d62" (UID: "c8dba959-faf4-4f15-96d3-e8f67ae00d62"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.022389 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-config" (OuterVolumeSpecName: "config") pod "81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425" (UID: "81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.043391 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c8dba959-faf4-4f15-96d3-e8f67ae00d62-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.043421 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.043430 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c8dba959-faf4-4f15-96d3-e8f67ae00d62-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.043440 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.043450 4909 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c631ddad-ab3a-488f-9947-2f3385fd912c-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.043458 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8dba959-faf4-4f15-96d3-e8f67ae00d62-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.047886 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425" (UID: 
"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.054046 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425" (UID: "81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.054409 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05519046-6a41-47f7-9247-03cff29382a5" path="/var/lib/kubelet/pods/05519046-6a41-47f7-9247-03cff29382a5/volumes" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.055231 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb" path="/var/lib/kubelet/pods/1f19c47b-c9fc-4762-9cb0-fcb8b9ad95bb/volumes" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.055974 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25588ec8-c1dd-42ca-983e-54a84e3f8a15" path="/var/lib/kubelet/pods/25588ec8-c1dd-42ca-983e-54a84e3f8a15/volumes" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.056613 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="359391a8-8d1e-47a2-b849-e7d574bd0613" path="/var/lib/kubelet/pods/359391a8-8d1e-47a2-b849-e7d574bd0613/volumes" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.057988 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36fe6eb3-17e6-417d-86c8-5b776beb7ddd" path="/var/lib/kubelet/pods/36fe6eb3-17e6-417d-86c8-5b776beb7ddd/volumes" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.058651 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6bad7649-e463-477c-b9d9-b317be65e8d1" path="/var/lib/kubelet/pods/6bad7649-e463-477c-b9d9-b317be65e8d1/volumes" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.060217 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72af2fcd-83e7-4adf-bbf6-399f49b07e5b" path="/var/lib/kubelet/pods/72af2fcd-83e7-4adf-bbf6-399f49b07e5b/volumes" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.061543 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="784c4213-ccd0-4c0a-9205-9f251e470297" path="/var/lib/kubelet/pods/784c4213-ccd0-4c0a-9205-9f251e470297/volumes" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.062238 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aee6c054-0ce0-44c7-96aa-1722b6339cfe" path="/var/lib/kubelet/pods/aee6c054-0ce0-44c7-96aa-1722b6339cfe/volumes" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.063096 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8dba959-faf4-4f15-96d3-e8f67ae00d62" path="/var/lib/kubelet/pods/c8dba959-faf4-4f15-96d3-e8f67ae00d62/volumes" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.065032 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d38222ad-04a2-4a42-8f8f-0789dc7c4c49" path="/var/lib/kubelet/pods/d38222ad-04a2-4a42-8f8f-0789dc7c4c49/volumes" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.065791 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="daee344f-d74a-43d0-9fc4-f651011ef32f" path="/var/lib/kubelet/pods/daee344f-d74a-43d0-9fc4-f651011ef32f/volumes" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.066441 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451" path="/var/lib/kubelet/pods/f8bc1e02-e0c8-4f2f-8bb5-b8a86dc5d451/volumes" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.070788 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/7b658933-f66d-47df-8b75-a42cd55b9bf4-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "7b658933-f66d-47df-8b75-a42cd55b9bf4" (UID: "7b658933-f66d-47df-8b75-a42cd55b9bf4"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.084492 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b658933-f66d-47df-8b75-a42cd55b9bf4-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "7b658933-f66d-47df-8b75-a42cd55b9bf4" (UID: "7b658933-f66d-47df-8b75-a42cd55b9bf4"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.122710 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0a30163-0b42-493b-b775-d88218bd1844-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "f0a30163-0b42-493b-b775-d88218bd1844" (UID: "f0a30163-0b42-493b-b775-d88218bd1844"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.146349 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0a30163-0b42-493b-b775-d88218bd1844-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.146393 4909 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.146403 4909 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b658933-f66d-47df-8b75-a42cd55b9bf4-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.146411 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.146419 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b658933-f66d-47df-8b75-a42cd55b9bf4-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.212225 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.254232 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.312644 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.329289 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.339202 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.349732 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgwmc\" (UniqueName: \"kubernetes.io/projected/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-kube-api-access-lgwmc\") pod \"20cda1f6-04e5-4e71-9fa8-4cad8dbac52c\" (UID: \"20cda1f6-04e5-4e71-9fa8-4cad8dbac52c\") " Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.349786 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-combined-ca-bundle\") pod \"20cda1f6-04e5-4e71-9fa8-4cad8dbac52c\" (UID: \"20cda1f6-04e5-4e71-9fa8-4cad8dbac52c\") " Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.349906 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/40590ae8-432b-4d01-a586-f61f07a206b0-etc-swift\") pod \"40590ae8-432b-4d01-a586-f61f07a206b0\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.349978 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40590ae8-432b-4d01-a586-f61f07a206b0-config-data\") pod \"40590ae8-432b-4d01-a586-f61f07a206b0\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " Feb 02 10:54:47 crc 
kubenswrapper[4909]: I0202 10:54:47.350001 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40590ae8-432b-4d01-a586-f61f07a206b0-public-tls-certs\") pod \"40590ae8-432b-4d01-a586-f61f07a206b0\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.350047 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40590ae8-432b-4d01-a586-f61f07a206b0-internal-tls-certs\") pod \"40590ae8-432b-4d01-a586-f61f07a206b0\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.350080 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40590ae8-432b-4d01-a586-f61f07a206b0-run-httpd\") pod \"40590ae8-432b-4d01-a586-f61f07a206b0\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.350119 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-config-data\") pod \"20cda1f6-04e5-4e71-9fa8-4cad8dbac52c\" (UID: \"20cda1f6-04e5-4e71-9fa8-4cad8dbac52c\") " Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.350147 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40590ae8-432b-4d01-a586-f61f07a206b0-log-httpd\") pod \"40590ae8-432b-4d01-a586-f61f07a206b0\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.350180 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40590ae8-432b-4d01-a586-f61f07a206b0-combined-ca-bundle\") pod 
\"40590ae8-432b-4d01-a586-f61f07a206b0\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.350218 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-vencrypt-tls-certs\") pod \"20cda1f6-04e5-4e71-9fa8-4cad8dbac52c\" (UID: \"20cda1f6-04e5-4e71-9fa8-4cad8dbac52c\") " Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.350242 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-nova-novncproxy-tls-certs\") pod \"20cda1f6-04e5-4e71-9fa8-4cad8dbac52c\" (UID: \"20cda1f6-04e5-4e71-9fa8-4cad8dbac52c\") " Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.350288 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fw6x\" (UniqueName: \"kubernetes.io/projected/40590ae8-432b-4d01-a586-f61f07a206b0-kube-api-access-8fw6x\") pod \"40590ae8-432b-4d01-a586-f61f07a206b0\" (UID: \"40590ae8-432b-4d01-a586-f61f07a206b0\") " Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.350949 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40590ae8-432b-4d01-a586-f61f07a206b0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "40590ae8-432b-4d01-a586-f61f07a206b0" (UID: "40590ae8-432b-4d01-a586-f61f07a206b0"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.351476 4909 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40590ae8-432b-4d01-a586-f61f07a206b0-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.356239 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40590ae8-432b-4d01-a586-f61f07a206b0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "40590ae8-432b-4d01-a586-f61f07a206b0" (UID: "40590ae8-432b-4d01-a586-f61f07a206b0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.369745 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40590ae8-432b-4d01-a586-f61f07a206b0-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "40590ae8-432b-4d01-a586-f61f07a206b0" (UID: "40590ae8-432b-4d01-a586-f61f07a206b0"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.374626 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-kube-api-access-lgwmc" (OuterVolumeSpecName: "kube-api-access-lgwmc") pod "20cda1f6-04e5-4e71-9fa8-4cad8dbac52c" (UID: "20cda1f6-04e5-4e71-9fa8-4cad8dbac52c"). InnerVolumeSpecName "kube-api-access-lgwmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.382469 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40590ae8-432b-4d01-a586-f61f07a206b0-kube-api-access-8fw6x" (OuterVolumeSpecName: "kube-api-access-8fw6x") pod "40590ae8-432b-4d01-a586-f61f07a206b0" (UID: "40590ae8-432b-4d01-a586-f61f07a206b0"). 
InnerVolumeSpecName "kube-api-access-8fw6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.453314 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2pfm\" (UniqueName: \"kubernetes.io/projected/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-kube-api-access-w2pfm\") pod \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.453622 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-operator-scripts\") pod \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.453645 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-kolla-config\") pod \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.453750 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-config-data-default\") pod \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.453795 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-combined-ca-bundle\") pod \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.453847 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.453886 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-config-data-generated\") pod \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.454004 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-galera-tls-certs\") pod \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\" (UID: \"e0af95ff-c608-4b73-92fa-d4a443a9eaaf\") " Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.454435 4909 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/40590ae8-432b-4d01-a586-f61f07a206b0-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.454453 4909 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/40590ae8-432b-4d01-a586-f61f07a206b0-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.454465 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fw6x\" (UniqueName: \"kubernetes.io/projected/40590ae8-432b-4d01-a586-f61f07a206b0-kube-api-access-8fw6x\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.454477 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgwmc\" (UniqueName: \"kubernetes.io/projected/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-kube-api-access-lgwmc\") on 
node \"crc\" DevicePath \"\"" Feb 02 10:54:47 crc kubenswrapper[4909]: E0202 10:54:47.454534 4909 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 02 10:54:47 crc kubenswrapper[4909]: E0202 10:54:47.454583 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b441d32f-f76f-4e7b-b3fe-40e93b126567-config-data podName:b441d32f-f76f-4e7b-b3fe-40e93b126567 nodeName:}" failed. No retries permitted until 2026-02-02 10:54:51.454567322 +0000 UTC m=+1417.200668057 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b441d32f-f76f-4e7b-b3fe-40e93b126567-config-data") pod "rabbitmq-server-0" (UID: "b441d32f-f76f-4e7b-b3fe-40e93b126567") : configmap "rabbitmq-config-data" not found Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.455891 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "e0af95ff-c608-4b73-92fa-d4a443a9eaaf" (UID: "e0af95ff-c608-4b73-92fa-d4a443a9eaaf"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.460256 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d64c85fd5-nns29" event={"ID":"40590ae8-432b-4d01-a586-f61f07a206b0","Type":"ContainerDied","Data":"b424138c0c1add112c48a249aab00d403a2069f30efdfff11dfbcc1d85d30b5d"} Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.460337 4909 scope.go:117] "RemoveContainer" containerID="35a67f40f9a2ea8483465449c6307919ab0017c539599beb47d55125999c2929" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.460523 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-d64c85fd5-nns29" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.477320 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "e0af95ff-c608-4b73-92fa-d4a443a9eaaf" (UID: "e0af95ff-c608-4b73-92fa-d4a443a9eaaf"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.478019 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e0af95ff-c608-4b73-92fa-d4a443a9eaaf" (UID: "e0af95ff-c608-4b73-92fa-d4a443a9eaaf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.479406 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "e0af95ff-c608-4b73-92fa-d4a443a9eaaf" (UID: "e0af95ff-c608-4b73-92fa-d4a443a9eaaf"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.482083 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-kube-api-access-w2pfm" (OuterVolumeSpecName: "kube-api-access-w2pfm") pod "e0af95ff-c608-4b73-92fa-d4a443a9eaaf" (UID: "e0af95ff-c608-4b73-92fa-d4a443a9eaaf"). InnerVolumeSpecName "kube-api-access-w2pfm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.485029 4909 generic.go:334] "Generic (PLEG): container finished" podID="e0af95ff-c608-4b73-92fa-d4a443a9eaaf" containerID="f24f9c93513df807a7e645eb277c34de34ac929d0c937520c5a166ce83f93d7c" exitCode=0 Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.485199 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e0af95ff-c608-4b73-92fa-d4a443a9eaaf","Type":"ContainerDied","Data":"f24f9c93513df807a7e645eb277c34de34ac929d0c937520c5a166ce83f93d7c"} Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.485244 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e0af95ff-c608-4b73-92fa-d4a443a9eaaf","Type":"ContainerDied","Data":"2ed57c154181aa9e3a9a049b34ab49ad76353cbbaf30f1aa14875cc0bdc23724"} Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.485361 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.491939 4909 scope.go:117] "RemoveContainer" containerID="811b1c8d9db9cb5f0ee50813644e170eb643be43ad3fd1cf8b8def4d133ce9a4" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.500113 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" event={"ID":"81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425","Type":"ContainerDied","Data":"37fdcaad723edae7948c00a5a5707c296106f75851e5fbb092feb86674389462"} Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.500288 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7dcd758995-j8gxg" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.520205 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "mysql-db") pod "e0af95ff-c608-4b73-92fa-d4a443a9eaaf" (UID: "e0af95ff-c608-4b73-92fa-d4a443a9eaaf"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.520584 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7b658933-f66d-47df-8b75-a42cd55b9bf4/ovsdbserver-sb/0.log" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.520798 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.520940 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7b658933-f66d-47df-8b75-a42cd55b9bf4","Type":"ContainerDied","Data":"798aff6bed83c271b8c0d250d07fb63f55fe57fd4e6aecfa75d1eb15594d1b45"} Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.539260 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7dcd758995-j8gxg"] Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.541310 4909 scope.go:117] "RemoveContainer" containerID="f24f9c93513df807a7e645eb277c34de34ac929d0c937520c5a166ce83f93d7c" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.541414 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-795c6654c6-z72r6" podUID="c97f6f0e-16ab-439c-a14c-3d908758b1db" containerName="barbican-keystone-listener-log" containerID="cri-o://20a8c8c1f2f7bfde54104c71ce26b83d01071f2b9fd01e11e52ed684135e3ed1" gracePeriod=30 Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.541455 4909 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-795c6654c6-z72r6" event={"ID":"c97f6f0e-16ab-439c-a14c-3d908758b1db","Type":"ContainerStarted","Data":"20a8c8c1f2f7bfde54104c71ce26b83d01071f2b9fd01e11e52ed684135e3ed1"} Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.541510 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-795c6654c6-z72r6" podUID="c97f6f0e-16ab-439c-a14c-3d908758b1db" containerName="barbican-keystone-listener" containerID="cri-o://749e1330fe36f4d0f7bc0c2d5e47eda30327a3bcf7ea6f2a6f0282c8a6fdb83d" gracePeriod=30 Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.548880 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7dcd758995-j8gxg"] Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.552416 4909 generic.go:334] "Generic (PLEG): container finished" podID="d5e2e76d-5e02-4488-ae0d-5acbdb1aa060" containerID="f486bfdd773f8cf45643a4ed2ecd1ba0189d94637e89e939369da9808daa4c06" exitCode=1 Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.552468 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-blz7s" event={"ID":"d5e2e76d-5e02-4488-ae0d-5acbdb1aa060","Type":"ContainerDied","Data":"f486bfdd773f8cf45643a4ed2ecd1ba0189d94637e89e939369da9808daa4c06"} Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.553036 4909 scope.go:117] "RemoveContainer" containerID="f486bfdd773f8cf45643a4ed2ecd1ba0189d94637e89e939369da9808daa4c06" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.556243 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.557372 4909 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:47 crc 
kubenswrapper[4909]: I0202 10:54:47.557400 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-config-data-default\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.557425 4909 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.557437 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-config-data-generated\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.557448 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2pfm\" (UniqueName: \"kubernetes.io/projected/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-kube-api-access-w2pfm\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.557459 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.565937 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.570840 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-795c6654c6-z72r6" podStartSLOduration=4.570800426 podStartE2EDuration="4.570800426s" podCreationTimestamp="2026-02-02 10:54:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:54:47.569285883 +0000 UTC m=+1413.315386618" 
watchObservedRunningTime="2026-02-02 10:54:47.570800426 +0000 UTC m=+1413.316901161" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.578291 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6dd95bf6f-66vhd" event={"ID":"65028b8f-2d3c-40f3-8c17-239856623f4e","Type":"ContainerStarted","Data":"4c35b87cac47569ce3e5e02c42cec56c3c67aa0f6401a56069d3f09d521254e7"} Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.578453 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6dd95bf6f-66vhd" podUID="65028b8f-2d3c-40f3-8c17-239856623f4e" containerName="barbican-worker-log" containerID="cri-o://0cf32bc67d8b4fdffda88ed8894f8d0cadb2655deceb75283d5998db010679a3" gracePeriod=30 Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.578720 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6dd95bf6f-66vhd" podUID="65028b8f-2d3c-40f3-8c17-239856623f4e" containerName="barbican-worker" containerID="cri-o://4c35b87cac47569ce3e5e02c42cec56c3c67aa0f6401a56069d3f09d521254e7" gracePeriod=30 Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.590586 4909 generic.go:334] "Generic (PLEG): container finished" podID="20cda1f6-04e5-4e71-9fa8-4cad8dbac52c" containerID="3f21c558f0896ab2fa711d0d7b27be96691f1afc7425bfa37b7f33bdec9c6f9d" exitCode=0 Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.590662 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"20cda1f6-04e5-4e71-9fa8-4cad8dbac52c","Type":"ContainerDied","Data":"3f21c558f0896ab2fa711d0d7b27be96691f1afc7425bfa37b7f33bdec9c6f9d"} Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.590689 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"20cda1f6-04e5-4e71-9fa8-4cad8dbac52c","Type":"ContainerDied","Data":"c1c243cca4352a0b3706630b065e4cb97271d4046dbed87883879523bfd5cc00"} Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.590774 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.597678 4909 scope.go:117] "RemoveContainer" containerID="4e8dc723a6916637428b6b4fcd605580da2f3e93fc64c90c5b30d9c96dc9f64d" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.616460 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-pzcjh" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.616596 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6bbd5778f6-n6mw6" event={"ID":"ddda8a20-60ba-4ae9-837a-44fa44518b8a","Type":"ContainerStarted","Data":"d022e9f76cf1abc36a5058aa5c728c3e6e96256eb301dd364d41444bb1c46cc1"} Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.622409 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6dd95bf6f-66vhd" podStartSLOduration=4.622382492 podStartE2EDuration="4.622382492s" podCreationTimestamp="2026-02-02 10:54:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:54:47.61631114 +0000 UTC m=+1413.362411875" watchObservedRunningTime="2026-02-02 10:54:47.622382492 +0000 UTC m=+1413.368483227" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.804188 4909 scope.go:117] "RemoveContainer" containerID="f24f9c93513df807a7e645eb277c34de34ac929d0c937520c5a166ce83f93d7c" Feb 02 10:54:47 crc kubenswrapper[4909]: E0202 10:54:47.806313 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f24f9c93513df807a7e645eb277c34de34ac929d0c937520c5a166ce83f93d7c\": container with ID starting with f24f9c93513df807a7e645eb277c34de34ac929d0c937520c5a166ce83f93d7c not found: ID does not exist" containerID="f24f9c93513df807a7e645eb277c34de34ac929d0c937520c5a166ce83f93d7c" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.806350 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f24f9c93513df807a7e645eb277c34de34ac929d0c937520c5a166ce83f93d7c"} err="failed to get container status \"f24f9c93513df807a7e645eb277c34de34ac929d0c937520c5a166ce83f93d7c\": rpc error: code = NotFound desc = could not find container \"f24f9c93513df807a7e645eb277c34de34ac929d0c937520c5a166ce83f93d7c\": container with ID starting with f24f9c93513df807a7e645eb277c34de34ac929d0c937520c5a166ce83f93d7c not found: ID does not exist" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.806373 4909 scope.go:117] "RemoveContainer" containerID="4e8dc723a6916637428b6b4fcd605580da2f3e93fc64c90c5b30d9c96dc9f64d" Feb 02 10:54:47 crc kubenswrapper[4909]: E0202 10:54:47.808076 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e8dc723a6916637428b6b4fcd605580da2f3e93fc64c90c5b30d9c96dc9f64d\": container with ID starting with 4e8dc723a6916637428b6b4fcd605580da2f3e93fc64c90c5b30d9c96dc9f64d not found: ID does not exist" containerID="4e8dc723a6916637428b6b4fcd605580da2f3e93fc64c90c5b30d9c96dc9f64d" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.808113 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e8dc723a6916637428b6b4fcd605580da2f3e93fc64c90c5b30d9c96dc9f64d"} err="failed to get container status \"4e8dc723a6916637428b6b4fcd605580da2f3e93fc64c90c5b30d9c96dc9f64d\": rpc error: code = NotFound desc = could not find container \"4e8dc723a6916637428b6b4fcd605580da2f3e93fc64c90c5b30d9c96dc9f64d\": container with ID 
starting with 4e8dc723a6916637428b6b4fcd605580da2f3e93fc64c90c5b30d9c96dc9f64d not found: ID does not exist" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.808136 4909 scope.go:117] "RemoveContainer" containerID="3487017a73b83d9a77436b7d788ba96efdea8f82fb04b45ef153d5a993e724bc" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.881259 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-config-data" (OuterVolumeSpecName: "config-data") pod "20cda1f6-04e5-4e71-9fa8-4cad8dbac52c" (UID: "20cda1f6-04e5-4e71-9fa8-4cad8dbac52c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.894978 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20cda1f6-04e5-4e71-9fa8-4cad8dbac52c" (UID: "20cda1f6-04e5-4e71-9fa8-4cad8dbac52c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.902101 4909 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.944983 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0af95ff-c608-4b73-92fa-d4a443a9eaaf" (UID: "e0af95ff-c608-4b73-92fa-d4a443a9eaaf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.968177 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40590ae8-432b-4d01-a586-f61f07a206b0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "40590ae8-432b-4d01-a586-f61f07a206b0" (UID: "40590ae8-432b-4d01-a586-f61f07a206b0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.976644 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.976666 4909 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.976677 4909 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/40590ae8-432b-4d01-a586-f61f07a206b0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.976925 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.976935 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:47 crc kubenswrapper[4909]: I0202 10:54:47.994050 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/40590ae8-432b-4d01-a586-f61f07a206b0-config-data" (OuterVolumeSpecName: "config-data") pod "40590ae8-432b-4d01-a586-f61f07a206b0" (UID: "40590ae8-432b-4d01-a586-f61f07a206b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.003186 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40590ae8-432b-4d01-a586-f61f07a206b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40590ae8-432b-4d01-a586-f61f07a206b0" (UID: "40590ae8-432b-4d01-a586-f61f07a206b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.003949 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "e0af95ff-c608-4b73-92fa-d4a443a9eaaf" (UID: "e0af95ff-c608-4b73-92fa-d4a443a9eaaf"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.009932 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "20cda1f6-04e5-4e71-9fa8-4cad8dbac52c" (UID: "20cda1f6-04e5-4e71-9fa8-4cad8dbac52c"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.024688 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40590ae8-432b-4d01-a586-f61f07a206b0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "40590ae8-432b-4d01-a586-f61f07a206b0" (UID: "40590ae8-432b-4d01-a586-f61f07a206b0"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.025911 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "20cda1f6-04e5-4e71-9fa8-4cad8dbac52c" (UID: "20cda1f6-04e5-4e71-9fa8-4cad8dbac52c"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.078668 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40590ae8-432b-4d01-a586-f61f07a206b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.078957 4909 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0af95ff-c608-4b73-92fa-d4a443a9eaaf-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.079035 4909 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:48 crc kubenswrapper[4909]: E0202 10:54:48.080071 4909 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 02 10:54:48 crc kubenswrapper[4909]: E0202 10:54:48.080182 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1ab15f72-b249-42d5-8698-273c5afc7758-config-data podName:1ab15f72-b249-42d5-8698-273c5afc7758 nodeName:}" failed. No retries permitted until 2026-02-02 10:54:52.080167543 +0000 UTC m=+1417.826268278 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1ab15f72-b249-42d5-8698-273c5afc7758-config-data") pod "rabbitmq-cell1-server-0" (UID: "1ab15f72-b249-42d5-8698-273c5afc7758") : configmap "rabbitmq-cell1-config-data" not found Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.079094 4909 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.082165 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40590ae8-432b-4d01-a586-f61f07a206b0-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.082257 4909 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/40590ae8-432b-4d01-a586-f61f07a206b0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.113145 4909 scope.go:117] "RemoveContainer" containerID="745f0130c8a843a1f7f679fb91fbed011c5cc6c79ffff4ba7e21abf7eb7801df" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.134320 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-pzcjh"] Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.162048 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-pzcjh"] Feb 02 10:54:48 crc kubenswrapper[4909]: E0202 10:54:48.165243 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7e1c8b8ad2c25126925d89042eabffbf703c5cb8a560053f5e2057f259e35466 is running failed: container process not found" 
containerID="7e1c8b8ad2c25126925d89042eabffbf703c5cb8a560053f5e2057f259e35466" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 02 10:54:48 crc kubenswrapper[4909]: E0202 10:54:48.166085 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7e1c8b8ad2c25126925d89042eabffbf703c5cb8a560053f5e2057f259e35466 is running failed: container process not found" containerID="7e1c8b8ad2c25126925d89042eabffbf703c5cb8a560053f5e2057f259e35466" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 02 10:54:48 crc kubenswrapper[4909]: E0202 10:54:48.166330 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7e1c8b8ad2c25126925d89042eabffbf703c5cb8a560053f5e2057f259e35466 is running failed: container process not found" containerID="7e1c8b8ad2c25126925d89042eabffbf703c5cb8a560053f5e2057f259e35466" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 02 10:54:48 crc kubenswrapper[4909]: E0202 10:54:48.166353 4909 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7e1c8b8ad2c25126925d89042eabffbf703c5cb8a560053f5e2057f259e35466 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="05143579-706d-4107-9d7a-a63b4a13c187" containerName="nova-cell1-conductor-conductor" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.188940 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-d64c85fd5-nns29"] Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.193027 4909 scope.go:117] "RemoveContainer" containerID="d2191dd6198a0fc9ecc9c919260563196adfa2a35bc25ae0ca32279cce086a5f" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.232705 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.233782 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-d64c85fd5-nns29"] Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.262878 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.283543 4909 scope.go:117] "RemoveContainer" containerID="b05bbe00b26a1b9593023419b5d4a14f74126dc62da00a8e37a5ac60df828f91" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.303936 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.313921 4909 scope.go:117] "RemoveContainer" containerID="3f21c558f0896ab2fa711d0d7b27be96691f1afc7425bfa37b7f33bdec9c6f9d" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.324880 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.353295 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.374172 4909 scope.go:117] "RemoveContainer" containerID="3f21c558f0896ab2fa711d0d7b27be96691f1afc7425bfa37b7f33bdec9c6f9d" Feb 02 10:54:48 crc kubenswrapper[4909]: E0202 10:54:48.379551 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f21c558f0896ab2fa711d0d7b27be96691f1afc7425bfa37b7f33bdec9c6f9d\": container with ID starting with 3f21c558f0896ab2fa711d0d7b27be96691f1afc7425bfa37b7f33bdec9c6f9d not found: ID does not exist" containerID="3f21c558f0896ab2fa711d0d7b27be96691f1afc7425bfa37b7f33bdec9c6f9d" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.379583 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3f21c558f0896ab2fa711d0d7b27be96691f1afc7425bfa37b7f33bdec9c6f9d"} err="failed to get container status \"3f21c558f0896ab2fa711d0d7b27be96691f1afc7425bfa37b7f33bdec9c6f9d\": rpc error: code = NotFound desc = could not find container \"3f21c558f0896ab2fa711d0d7b27be96691f1afc7425bfa37b7f33bdec9c6f9d\": container with ID starting with 3f21c558f0896ab2fa711d0d7b27be96691f1afc7425bfa37b7f33bdec9c6f9d not found: ID does not exist" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.387159 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05143579-706d-4107-9d7a-a63b4a13c187-config-data\") pod \"05143579-706d-4107-9d7a-a63b4a13c187\" (UID: \"05143579-706d-4107-9d7a-a63b4a13c187\") " Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.387311 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2s7w\" (UniqueName: \"kubernetes.io/projected/05143579-706d-4107-9d7a-a63b4a13c187-kube-api-access-l2s7w\") pod \"05143579-706d-4107-9d7a-a63b4a13c187\" (UID: \"05143579-706d-4107-9d7a-a63b4a13c187\") " Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.387390 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05143579-706d-4107-9d7a-a63b4a13c187-combined-ca-bundle\") pod \"05143579-706d-4107-9d7a-a63b4a13c187\" (UID: \"05143579-706d-4107-9d7a-a63b4a13c187\") " Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.401275 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05143579-706d-4107-9d7a-a63b4a13c187-kube-api-access-l2s7w" (OuterVolumeSpecName: "kube-api-access-l2s7w") pod "05143579-706d-4107-9d7a-a63b4a13c187" (UID: "05143579-706d-4107-9d7a-a63b4a13c187"). InnerVolumeSpecName "kube-api-access-l2s7w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.436225 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05143579-706d-4107-9d7a-a63b4a13c187-config-data" (OuterVolumeSpecName: "config-data") pod "05143579-706d-4107-9d7a-a63b4a13c187" (UID: "05143579-706d-4107-9d7a-a63b4a13c187"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.441853 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05143579-706d-4107-9d7a-a63b4a13c187-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05143579-706d-4107-9d7a-a63b4a13c187" (UID: "05143579-706d-4107-9d7a-a63b4a13c187"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:48 crc kubenswrapper[4909]: E0202 10:54:48.483458 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18 is running failed: container process not found" containerID="4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 02 10:54:48 crc kubenswrapper[4909]: E0202 10:54:48.483906 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18 is running failed: container process not found" containerID="4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 02 10:54:48 crc kubenswrapper[4909]: E0202 10:54:48.484084 4909 log.go:32] "ExecSync cmd from runtime service 
failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18 is running failed: container process not found" containerID="4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 02 10:54:48 crc kubenswrapper[4909]: E0202 10:54:48.484108 4909 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-9hkv5" podUID="ef5a90ca-c133-400b-b869-becc0b1f60a0" containerName="ovsdb-server" Feb 02 10:54:48 crc kubenswrapper[4909]: E0202 10:54:48.487505 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8f5bb18b7321547b528f5af5b21f281e2f479493379bee20dc5095a16bc7f839" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.489078 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05143579-706d-4107-9d7a-a63b4a13c187-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.489099 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05143579-706d-4107-9d7a-a63b4a13c187-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.489108 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2s7w\" (UniqueName: \"kubernetes.io/projected/05143579-706d-4107-9d7a-a63b4a13c187-kube-api-access-l2s7w\") on node 
\"crc\" DevicePath \"\"" Feb 02 10:54:48 crc kubenswrapper[4909]: E0202 10:54:48.489620 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8f5bb18b7321547b528f5af5b21f281e2f479493379bee20dc5095a16bc7f839" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 02 10:54:48 crc kubenswrapper[4909]: E0202 10:54:48.496131 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8f5bb18b7321547b528f5af5b21f281e2f479493379bee20dc5095a16bc7f839" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 02 10:54:48 crc kubenswrapper[4909]: E0202 10:54:48.496323 4909 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-9hkv5" podUID="ef5a90ca-c133-400b-b869-becc0b1f60a0" containerName="ovs-vswitchd" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.566715 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-qpqvt" podUID="6b8dd51c-207c-4fae-8d5a-7271a159f0ff" containerName="ovn-controller" probeResult="failure" output=< Feb 02 10:54:48 crc kubenswrapper[4909]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0 Feb 02 10:54:48 crc kubenswrapper[4909]: > Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.633312 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.634743 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="9a43fc6d-3442-4921-93bc-ef5ab2273a78" containerName="ceilometer-central-agent" containerID="cri-o://1c2720d46332594a79c263f4b5b11335295a93ffe32d4ee3e0fd204905c00fa2" gracePeriod=30 Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.636036 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9a43fc6d-3442-4921-93bc-ef5ab2273a78" containerName="proxy-httpd" containerID="cri-o://3bc7584cdea845ca454ea508cc0c6b2466da6ae49cb6266de80fbde89deebdb7" gracePeriod=30 Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.636218 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9a43fc6d-3442-4921-93bc-ef5ab2273a78" containerName="ceilometer-notification-agent" containerID="cri-o://023a431e4b03424676a729d714404990c19e0adf3d117b84ebe38b391d0036b6" gracePeriod=30 Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.636268 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9a43fc6d-3442-4921-93bc-ef5ab2273a78" containerName="sg-core" containerID="cri-o://166e3b47bb57c5cbc573f49e8b964cfffa3a76870bbf15fbe19e47476b7d8a1f" gracePeriod=30 Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.655197 4909 generic.go:334] "Generic (PLEG): container finished" podID="b6be13cc-01ed-441f-b2c9-dc024fcb4b18" containerID="a0b641eb35c253bddcb998df3a1e8bb281e92fcbf9ed7e43633609aff1d97578" exitCode=0 Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.655316 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5d56d8dff8-fh9sw" event={"ID":"b6be13cc-01ed-441f-b2c9-dc024fcb4b18","Type":"ContainerDied","Data":"a0b641eb35c253bddcb998df3a1e8bb281e92fcbf9ed7e43633609aff1d97578"} Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.692959 4909 generic.go:334] "Generic (PLEG): container finished" podID="14a8699a-66db-48c6-8834-bda4e21ef1d9" 
containerID="9b97792ee0643776ed45dd9bce3742b6344f3f3eb329b31a1d413958f371ef6e" exitCode=0 Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.693061 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"14a8699a-66db-48c6-8834-bda4e21ef1d9","Type":"ContainerDied","Data":"9b97792ee0643776ed45dd9bce3742b6344f3f3eb329b31a1d413958f371ef6e"} Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.716384 4909 generic.go:334] "Generic (PLEG): container finished" podID="65028b8f-2d3c-40f3-8c17-239856623f4e" containerID="0cf32bc67d8b4fdffda88ed8894f8d0cadb2655deceb75283d5998db010679a3" exitCode=143 Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.716492 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6dd95bf6f-66vhd" event={"ID":"65028b8f-2d3c-40f3-8c17-239856623f4e","Type":"ContainerDied","Data":"0cf32bc67d8b4fdffda88ed8894f8d0cadb2655deceb75283d5998db010679a3"} Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.726765 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.727022 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="236e10c6-5b7d-4f9e-b82a-5c68edc93692" containerName="kube-state-metrics" containerID="cri-o://a5429c7323d593269c829013791c5ec02f6c2311d2ea72699e259a5d72454bac" gracePeriod=30 Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.727752 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="7f0cf9e9-b663-4be3-a435-c7dd6deea228" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": read tcp 10.217.0.2:58394->10.217.0.206:8775: read: connection reset by peer" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.727765 4909 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/nova-metadata-0" podUID="7f0cf9e9-b663-4be3-a435-c7dd6deea228" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": read tcp 10.217.0.2:58392->10.217.0.206:8775: read: connection reset by peer" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.753871 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6bbd5778f6-n6mw6" event={"ID":"ddda8a20-60ba-4ae9-837a-44fa44518b8a","Type":"ContainerStarted","Data":"e4c6472255c48be75153726b320528677bd55174125c6e07a90c92c9ea5dd4e3"} Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.754107 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6bbd5778f6-n6mw6" podUID="ddda8a20-60ba-4ae9-837a-44fa44518b8a" containerName="barbican-api-log" containerID="cri-o://d022e9f76cf1abc36a5058aa5c728c3e6e96256eb301dd364d41444bb1c46cc1" gracePeriod=30 Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.754334 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6bbd5778f6-n6mw6" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.754373 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6bbd5778f6-n6mw6" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.755888 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6bbd5778f6-n6mw6" podUID="ddda8a20-60ba-4ae9-837a-44fa44518b8a" containerName="barbican-api" containerID="cri-o://e4c6472255c48be75153726b320528677bd55174125c6e07a90c92c9ea5dd4e3" gracePeriod=30 Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.828395 4909 generic.go:334] "Generic (PLEG): container finished" podID="c97f6f0e-16ab-439c-a14c-3d908758b1db" containerID="20a8c8c1f2f7bfde54104c71ce26b83d01071f2b9fd01e11e52ed684135e3ed1" exitCode=143 Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.828488 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-795c6654c6-z72r6" event={"ID":"c97f6f0e-16ab-439c-a14c-3d908758b1db","Type":"ContainerDied","Data":"20a8c8c1f2f7bfde54104c71ce26b83d01071f2b9fd01e11e52ed684135e3ed1"} Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.828515 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-795c6654c6-z72r6" event={"ID":"c97f6f0e-16ab-439c-a14c-3d908758b1db","Type":"ContainerStarted","Data":"749e1330fe36f4d0f7bc0c2d5e47eda30327a3bcf7ea6f2a6f0282c8a6fdb83d"} Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.828786 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.829271 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="9dab4432-0762-45a8-88ab-3a99217a790f" containerName="memcached" containerID="cri-o://a5be7b718c825caf194dfbbef2df9f7def820d23f84184c159660c56fbb1b591" gracePeriod=30 Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.885069 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-910a-account-create-update-lvntk"] Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.907842 4909 generic.go:334] "Generic (PLEG): container finished" podID="05143579-706d-4107-9d7a-a63b4a13c187" containerID="7e1c8b8ad2c25126925d89042eabffbf703c5cb8a560053f5e2057f259e35466" exitCode=0 Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.907937 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"05143579-706d-4107-9d7a-a63b4a13c187","Type":"ContainerDied","Data":"7e1c8b8ad2c25126925d89042eabffbf703c5cb8a560053f5e2057f259e35466"} Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.907965 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"05143579-706d-4107-9d7a-a63b4a13c187","Type":"ContainerDied","Data":"b02bf9e93273b22183635526cc42cdd13e7fa25dcb85df2b9d0b26fc7b7fe2d1"} Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.907982 4909 scope.go:117] "RemoveContainer" containerID="7e1c8b8ad2c25126925d89042eabffbf703c5cb8a560053f5e2057f259e35466" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.908104 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.961284 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-910a-account-create-update-lvntk"] Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.970261 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-910a-account-create-update-kc8n7"] Feb 02 10:54:48 crc kubenswrapper[4909]: E0202 10:54:48.970661 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0af95ff-c608-4b73-92fa-d4a443a9eaaf" containerName="galera" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.970672 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0af95ff-c608-4b73-92fa-d4a443a9eaaf" containerName="galera" Feb 02 10:54:48 crc kubenswrapper[4909]: E0202 10:54:48.970680 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b658933-f66d-47df-8b75-a42cd55b9bf4" containerName="openstack-network-exporter" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.970686 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b658933-f66d-47df-8b75-a42cd55b9bf4" containerName="openstack-network-exporter" Feb 02 10:54:48 crc kubenswrapper[4909]: E0202 10:54:48.970701 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425" containerName="dnsmasq-dns" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.970707 4909 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425" containerName="dnsmasq-dns" Feb 02 10:54:48 crc kubenswrapper[4909]: E0202 10:54:48.970722 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40590ae8-432b-4d01-a586-f61f07a206b0" containerName="proxy-httpd" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.970728 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="40590ae8-432b-4d01-a586-f61f07a206b0" containerName="proxy-httpd" Feb 02 10:54:48 crc kubenswrapper[4909]: E0202 10:54:48.970740 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05143579-706d-4107-9d7a-a63b4a13c187" containerName="nova-cell1-conductor-conductor" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.970745 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="05143579-706d-4107-9d7a-a63b4a13c187" containerName="nova-cell1-conductor-conductor" Feb 02 10:54:48 crc kubenswrapper[4909]: E0202 10:54:48.970760 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c631ddad-ab3a-488f-9947-2f3385fd912c" containerName="openstack-network-exporter" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.970768 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c631ddad-ab3a-488f-9947-2f3385fd912c" containerName="openstack-network-exporter" Feb 02 10:54:48 crc kubenswrapper[4909]: E0202 10:54:48.970779 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0a30163-0b42-493b-b775-d88218bd1844" containerName="openstack-network-exporter" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.970784 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0a30163-0b42-493b-b775-d88218bd1844" containerName="openstack-network-exporter" Feb 02 10:54:48 crc kubenswrapper[4909]: E0202 10:54:48.970794 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20cda1f6-04e5-4e71-9fa8-4cad8dbac52c" containerName="nova-cell1-novncproxy-novncproxy" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 
10:54:48.970799 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="20cda1f6-04e5-4e71-9fa8-4cad8dbac52c" containerName="nova-cell1-novncproxy-novncproxy" Feb 02 10:54:48 crc kubenswrapper[4909]: E0202 10:54:48.970873 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0a30163-0b42-493b-b775-d88218bd1844" containerName="ovsdbserver-nb" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.970881 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0a30163-0b42-493b-b775-d88218bd1844" containerName="ovsdbserver-nb" Feb 02 10:54:48 crc kubenswrapper[4909]: E0202 10:54:48.970890 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0af95ff-c608-4b73-92fa-d4a443a9eaaf" containerName="mysql-bootstrap" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.970895 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0af95ff-c608-4b73-92fa-d4a443a9eaaf" containerName="mysql-bootstrap" Feb 02 10:54:48 crc kubenswrapper[4909]: E0202 10:54:48.970906 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40590ae8-432b-4d01-a586-f61f07a206b0" containerName="proxy-server" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.970911 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="40590ae8-432b-4d01-a586-f61f07a206b0" containerName="proxy-server" Feb 02 10:54:48 crc kubenswrapper[4909]: E0202 10:54:48.970920 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b658933-f66d-47df-8b75-a42cd55b9bf4" containerName="ovsdbserver-sb" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.970926 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b658933-f66d-47df-8b75-a42cd55b9bf4" containerName="ovsdbserver-sb" Feb 02 10:54:48 crc kubenswrapper[4909]: E0202 10:54:48.970938 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425" containerName="init" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.970943 
4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425" containerName="init" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.975191 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="40590ae8-432b-4d01-a586-f61f07a206b0" containerName="proxy-httpd" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.975223 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="05143579-706d-4107-9d7a-a63b4a13c187" containerName="nova-cell1-conductor-conductor" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.975232 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0a30163-0b42-493b-b775-d88218bd1844" containerName="openstack-network-exporter" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.975246 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c631ddad-ab3a-488f-9947-2f3385fd912c" containerName="openstack-network-exporter" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.975258 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425" containerName="dnsmasq-dns" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.975266 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b658933-f66d-47df-8b75-a42cd55b9bf4" containerName="ovsdbserver-sb" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.975279 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="40590ae8-432b-4d01-a586-f61f07a206b0" containerName="proxy-server" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.975291 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0af95ff-c608-4b73-92fa-d4a443a9eaaf" containerName="galera" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.975300 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b658933-f66d-47df-8b75-a42cd55b9bf4" containerName="openstack-network-exporter" Feb 02 10:54:48 crc 
kubenswrapper[4909]: I0202 10:54:48.975308 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0a30163-0b42-493b-b775-d88218bd1844" containerName="ovsdbserver-nb" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.975321 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="20cda1f6-04e5-4e71-9fa8-4cad8dbac52c" containerName="nova-cell1-novncproxy-novncproxy" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.975899 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-910a-account-create-update-kc8n7" Feb 02 10:54:48 crc kubenswrapper[4909]: I0202 10:54:48.980094 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.032946 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrsh9\" (UniqueName: \"kubernetes.io/projected/7913b849-56c2-493f-822d-f8f15dfc4fe1-kube-api-access-rrsh9\") pod \"keystone-910a-account-create-update-kc8n7\" (UID: \"7913b849-56c2-493f-822d-f8f15dfc4fe1\") " pod="openstack/keystone-910a-account-create-update-kc8n7" Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.033138 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7913b849-56c2-493f-822d-f8f15dfc4fe1-operator-scripts\") pod \"keystone-910a-account-create-update-kc8n7\" (UID: \"7913b849-56c2-493f-822d-f8f15dfc4fe1\") " pod="openstack/keystone-910a-account-create-update-kc8n7" Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.049695 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6bbd5778f6-n6mw6" podStartSLOduration=6.049668069 podStartE2EDuration="6.049668069s" podCreationTimestamp="2026-02-02 10:54:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:54:48.847787661 +0000 UTC m=+1414.593888416" watchObservedRunningTime="2026-02-02 10:54:49.049668069 +0000 UTC m=+1414.795768804" Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.109464 4909 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/root-account-create-update-blz7s" secret="" err="secret \"galera-openstack-dockercfg-ld7ld\" not found" Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.141184 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrsh9\" (UniqueName: \"kubernetes.io/projected/7913b849-56c2-493f-822d-f8f15dfc4fe1-kube-api-access-rrsh9\") pod \"keystone-910a-account-create-update-kc8n7\" (UID: \"7913b849-56c2-493f-822d-f8f15dfc4fe1\") " pod="openstack/keystone-910a-account-create-update-kc8n7" Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.141398 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7913b849-56c2-493f-822d-f8f15dfc4fe1-operator-scripts\") pod \"keystone-910a-account-create-update-kc8n7\" (UID: \"7913b849-56c2-493f-822d-f8f15dfc4fe1\") " pod="openstack/keystone-910a-account-create-update-kc8n7" Feb 02 10:54:49 crc kubenswrapper[4909]: E0202 10:54:49.141573 4909 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 02 10:54:49 crc kubenswrapper[4909]: E0202 10:54:49.141659 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7913b849-56c2-493f-822d-f8f15dfc4fe1-operator-scripts podName:7913b849-56c2-493f-822d-f8f15dfc4fe1 nodeName:}" failed. No retries permitted until 2026-02-02 10:54:49.641639753 +0000 UTC m=+1415.387740488 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7913b849-56c2-493f-822d-f8f15dfc4fe1-operator-scripts") pod "keystone-910a-account-create-update-kc8n7" (UID: "7913b849-56c2-493f-822d-f8f15dfc4fe1") : configmap "openstack-scripts" not found Feb 02 10:54:49 crc kubenswrapper[4909]: E0202 10:54:49.142020 4909 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 02 10:54:49 crc kubenswrapper[4909]: E0202 10:54:49.143974 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d5e2e76d-5e02-4488-ae0d-5acbdb1aa060-operator-scripts podName:d5e2e76d-5e02-4488-ae0d-5acbdb1aa060 nodeName:}" failed. No retries permitted until 2026-02-02 10:54:49.643951039 +0000 UTC m=+1415.390051774 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d5e2e76d-5e02-4488-ae0d-5acbdb1aa060-operator-scripts") pod "root-account-create-update-blz7s" (UID: "d5e2e76d-5e02-4488-ae0d-5acbdb1aa060") : configmap "openstack-scripts" not found Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.158311 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" podUID="04232dcc-dda5-4774-b999-5104335f2da0" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.162:9311/healthcheck\": read tcp 10.217.0.2:34502->10.217.0.162:9311: read: connection reset by peer" Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.158425 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" podUID="04232dcc-dda5-4774-b999-5104335f2da0" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.162:9311/healthcheck\": read tcp 10.217.0.2:34508->10.217.0.162:9311: read: connection reset by peer" Feb 02 10:54:49 crc kubenswrapper[4909]: E0202 10:54:49.162981 
4909 projected.go:194] Error preparing data for projected volume kube-api-access-rrsh9 for pod openstack/keystone-910a-account-create-update-kc8n7: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 02 10:54:49 crc kubenswrapper[4909]: E0202 10:54:49.163086 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7913b849-56c2-493f-822d-f8f15dfc4fe1-kube-api-access-rrsh9 podName:7913b849-56c2-493f-822d-f8f15dfc4fe1 nodeName:}" failed. No retries permitted until 2026-02-02 10:54:49.663062772 +0000 UTC m=+1415.409163507 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-rrsh9" (UniqueName: "kubernetes.io/projected/7913b849-56c2-493f-822d-f8f15dfc4fe1-kube-api-access-rrsh9") pod "keystone-910a-account-create-update-kc8n7" (UID: "7913b849-56c2-493f-822d-f8f15dfc4fe1") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.211955 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-blz7s" podStartSLOduration=6.211930871 podStartE2EDuration="6.211930871s" podCreationTimestamp="2026-02-02 10:54:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 10:54:49.096316065 +0000 UTC m=+1414.842416800" watchObservedRunningTime="2026-02-02 10:54:49.211930871 +0000 UTC m=+1414.958031596" Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.463214 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20cda1f6-04e5-4e71-9fa8-4cad8dbac52c" path="/var/lib/kubelet/pods/20cda1f6-04e5-4e71-9fa8-4cad8dbac52c/volumes" Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.467526 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aa83e5f-8a9a-4275-989a-105cf6370d74" 
path="/var/lib/kubelet/pods/3aa83e5f-8a9a-4275-989a-105cf6370d74/volumes" Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.468195 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40590ae8-432b-4d01-a586-f61f07a206b0" path="/var/lib/kubelet/pods/40590ae8-432b-4d01-a586-f61f07a206b0/volumes" Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.472088 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b658933-f66d-47df-8b75-a42cd55b9bf4" path="/var/lib/kubelet/pods/7b658933-f66d-47df-8b75-a42cd55b9bf4/volumes" Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.472969 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425" path="/var/lib/kubelet/pods/81b1f2c4-a7f5-4700-b0bf-ea0f1e8fa425/volumes" Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.484921 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c631ddad-ab3a-488f-9947-2f3385fd912c" path="/var/lib/kubelet/pods/c631ddad-ab3a-488f-9947-2f3385fd912c/volumes" Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.485878 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0af95ff-c608-4b73-92fa-d4a443a9eaaf" path="/var/lib/kubelet/pods/e0af95ff-c608-4b73-92fa-d4a443a9eaaf/volumes" Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.487307 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0a30163-0b42-493b-b775-d88218bd1844" path="/var/lib/kubelet/pods/f0a30163-0b42-493b-b775-d88218bd1844/volumes" Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.488049 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-910a-account-create-update-kc8n7"] Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.488081 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-smqnr"] Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.488091 4909 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-8sv84"] Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.488102 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-smqnr"] Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.488117 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-blz7s" event={"ID":"d5e2e76d-5e02-4488-ae0d-5acbdb1aa060","Type":"ContainerStarted","Data":"00d102ada4d1a621cc75e76f5d11972c92c765cc655c0852af742709db540a93"} Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.488136 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-8sv84"] Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.488147 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7bb594654d-prg2q"] Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.488158 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.488172 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-blz7s"] Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.488181 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-h4czn"] Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.488193 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-h4czn"] Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.488203 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-910a-account-create-update-kc8n7"] Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.491395 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-7bb594654d-prg2q" podUID="86bc749f-73e5-4bcc-8079-7c9b053e0318" containerName="keystone-api" 
containerID="cri-o://0a7a142102f29fedad6139ea592267e9ca537d4410527980039f2daeffcde737" gracePeriod=30 Feb 02 10:54:49 crc kubenswrapper[4909]: E0202 10:54:49.491619 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6be13cc_01ed_441f_b2c9_dc024fcb4b18.slice/crio-conmon-a0b641eb35c253bddcb998df3a1e8bb281e92fcbf9ed7e43633609aff1d97578.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6be13cc_01ed_441f_b2c9_dc024fcb4b18.slice/crio-a0b641eb35c253bddcb998df3a1e8bb281e92fcbf9ed7e43633609aff1d97578.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97409ffd_f1ab_4a1a_9939_a041a4085b1a.slice/crio-conmon-ecc6af1fec99e4b5728487ec159109b228720e7e811d71b0bc2fd00d2e8a68cc.scope\": RecentStats: unable to find data in memory cache]" Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.673160 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7913b849-56c2-493f-822d-f8f15dfc4fe1-operator-scripts\") pod \"keystone-910a-account-create-update-kc8n7\" (UID: \"7913b849-56c2-493f-822d-f8f15dfc4fe1\") " pod="openstack/keystone-910a-account-create-update-kc8n7" Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.673253 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrsh9\" (UniqueName: \"kubernetes.io/projected/7913b849-56c2-493f-822d-f8f15dfc4fe1-kube-api-access-rrsh9\") pod \"keystone-910a-account-create-update-kc8n7\" (UID: \"7913b849-56c2-493f-822d-f8f15dfc4fe1\") " pod="openstack/keystone-910a-account-create-update-kc8n7" Feb 02 10:54:49 crc kubenswrapper[4909]: E0202 10:54:49.673593 4909 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: 
configmap "openstack-scripts" not found Feb 02 10:54:49 crc kubenswrapper[4909]: E0202 10:54:49.673638 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d5e2e76d-5e02-4488-ae0d-5acbdb1aa060-operator-scripts podName:d5e2e76d-5e02-4488-ae0d-5acbdb1aa060 nodeName:}" failed. No retries permitted until 2026-02-02 10:54:50.673624634 +0000 UTC m=+1416.419725369 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d5e2e76d-5e02-4488-ae0d-5acbdb1aa060-operator-scripts") pod "root-account-create-update-blz7s" (UID: "d5e2e76d-5e02-4488-ae0d-5acbdb1aa060") : configmap "openstack-scripts" not found Feb 02 10:54:49 crc kubenswrapper[4909]: E0202 10:54:49.673666 4909 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 02 10:54:49 crc kubenswrapper[4909]: E0202 10:54:49.673683 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7913b849-56c2-493f-822d-f8f15dfc4fe1-operator-scripts podName:7913b849-56c2-493f-822d-f8f15dfc4fe1 nodeName:}" failed. No retries permitted until 2026-02-02 10:54:50.673678245 +0000 UTC m=+1416.419778980 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7913b849-56c2-493f-822d-f8f15dfc4fe1-operator-scripts") pod "keystone-910a-account-create-update-kc8n7" (UID: "7913b849-56c2-493f-822d-f8f15dfc4fe1") : configmap "openstack-scripts" not found Feb 02 10:54:49 crc kubenswrapper[4909]: E0202 10:54:49.676755 4909 projected.go:194] Error preparing data for projected volume kube-api-access-rrsh9 for pod openstack/keystone-910a-account-create-update-kc8n7: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 02 10:54:49 crc kubenswrapper[4909]: E0202 10:54:49.676798 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7913b849-56c2-493f-822d-f8f15dfc4fe1-kube-api-access-rrsh9 podName:7913b849-56c2-493f-822d-f8f15dfc4fe1 nodeName:}" failed. No retries permitted until 2026-02-02 10:54:50.676786354 +0000 UTC m=+1416.422887089 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-rrsh9" (UniqueName: "kubernetes.io/projected/7913b849-56c2-493f-822d-f8f15dfc4fe1-kube-api-access-rrsh9") pod "keystone-910a-account-create-update-kc8n7" (UID: "7913b849-56c2-493f-822d-f8f15dfc4fe1") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 02 10:54:49 crc kubenswrapper[4909]: I0202 10:54:49.828540 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="68e55a25-f51a-49a9-af91-ffbab9ad611e" containerName="galera" containerID="cri-o://a40ccf8ab1f9299d55595a979c250007b22d501fa1ea6728aab9956000c6cce9" gracePeriod=30 Feb 02 10:54:49 crc kubenswrapper[4909]: E0202 10:54:49.881068 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e61cb4193c75698f10d9c67582e2641c6ccc4d6732028cd8616852c78bc88b51 is running failed: container process not found" 
containerID="e61cb4193c75698f10d9c67582e2641c6ccc4d6732028cd8616852c78bc88b51" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 02 10:54:49 crc kubenswrapper[4909]: E0202 10:54:49.882784 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e61cb4193c75698f10d9c67582e2641c6ccc4d6732028cd8616852c78bc88b51 is running failed: container process not found" containerID="e61cb4193c75698f10d9c67582e2641c6ccc4d6732028cd8616852c78bc88b51" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 02 10:54:49 crc kubenswrapper[4909]: E0202 10:54:49.884005 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e61cb4193c75698f10d9c67582e2641c6ccc4d6732028cd8616852c78bc88b51 is running failed: container process not found" containerID="e61cb4193c75698f10d9c67582e2641c6ccc4d6732028cd8616852c78bc88b51" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 02 10:54:49 crc kubenswrapper[4909]: E0202 10:54:49.884038 4909 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of e61cb4193c75698f10d9c67582e2641c6ccc4d6732028cd8616852c78bc88b51 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="2399f7a5-86e0-46bd-9f3d-624d1208b9cc" containerName="nova-cell0-conductor-conductor" Feb 02 10:54:49 crc kubenswrapper[4909]: E0202 10:54:49.925292 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7f5547c30d19b6a3e624654a6b30546acaaa5bb4e35836a31ce664cbd4d3448e is running failed: container process not found" containerID="7f5547c30d19b6a3e624654a6b30546acaaa5bb4e35836a31ce664cbd4d3448e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 02 10:54:49 crc 
kubenswrapper[4909]: E0202 10:54:49.926459 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7f5547c30d19b6a3e624654a6b30546acaaa5bb4e35836a31ce664cbd4d3448e is running failed: container process not found" containerID="7f5547c30d19b6a3e624654a6b30546acaaa5bb4e35836a31ce664cbd4d3448e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 02 10:54:49 crc kubenswrapper[4909]: E0202 10:54:49.937119 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7f5547c30d19b6a3e624654a6b30546acaaa5bb4e35836a31ce664cbd4d3448e is running failed: container process not found" containerID="7f5547c30d19b6a3e624654a6b30546acaaa5bb4e35836a31ce664cbd4d3448e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 02 10:54:49 crc kubenswrapper[4909]: E0202 10:54:49.937190 4909 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7f5547c30d19b6a3e624654a6b30546acaaa5bb4e35836a31ce664cbd4d3448e is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="ba559652-0584-49c5-91d6-7d7fdd596dc2" containerName="nova-scheduler-scheduler" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.075092 4909 generic.go:334] "Generic (PLEG): container finished" podID="7f0cf9e9-b663-4be3-a435-c7dd6deea228" containerID="9146b7fe892e3b1b3b9341aaa935f843c8006e1d193949290905af61bbfd863a" exitCode=0 Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.075165 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f0cf9e9-b663-4be3-a435-c7dd6deea228","Type":"ContainerDied","Data":"9146b7fe892e3b1b3b9341aaa935f843c8006e1d193949290905af61bbfd863a"} Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.075193 4909 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7f0cf9e9-b663-4be3-a435-c7dd6deea228","Type":"ContainerDied","Data":"616e8f784384a3a669ea27e5b62f8fac3c4e970a6e41b551be916f38b10f6f48"} Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.075214 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="616e8f784384a3a669ea27e5b62f8fac3c4e970a6e41b551be916f38b10f6f48" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.076615 4909 generic.go:334] "Generic (PLEG): container finished" podID="9dab4432-0762-45a8-88ab-3a99217a790f" containerID="a5be7b718c825caf194dfbbef2df9f7def820d23f84184c159660c56fbb1b591" exitCode=0 Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.076664 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"9dab4432-0762-45a8-88ab-3a99217a790f","Type":"ContainerDied","Data":"a5be7b718c825caf194dfbbef2df9f7def820d23f84184c159660c56fbb1b591"} Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.078256 4909 generic.go:334] "Generic (PLEG): container finished" podID="97409ffd-f1ab-4a1a-9939-a041a4085b1a" containerID="ecc6af1fec99e4b5728487ec159109b228720e7e811d71b0bc2fd00d2e8a68cc" exitCode=0 Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.078293 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"97409ffd-f1ab-4a1a-9939-a041a4085b1a","Type":"ContainerDied","Data":"ecc6af1fec99e4b5728487ec159109b228720e7e811d71b0bc2fd00d2e8a68cc"} Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.078308 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"97409ffd-f1ab-4a1a-9939-a041a4085b1a","Type":"ContainerDied","Data":"7262c0c2c09b87e3d707614ee0dbfc03f24e14199b066cf7824b4f081f53c249"} Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.078321 4909 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="7262c0c2c09b87e3d707614ee0dbfc03f24e14199b066cf7824b4f081f53c249" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.079573 4909 generic.go:334] "Generic (PLEG): container finished" podID="04232dcc-dda5-4774-b999-5104335f2da0" containerID="806aaa0ff1451eccc6cc80605c3b759d86a7c3041d5b3685c9a6764ed0b9dda2" exitCode=0 Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.079610 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" event={"ID":"04232dcc-dda5-4774-b999-5104335f2da0","Type":"ContainerDied","Data":"806aaa0ff1451eccc6cc80605c3b759d86a7c3041d5b3685c9a6764ed0b9dda2"} Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.079627 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" event={"ID":"04232dcc-dda5-4774-b999-5104335f2da0","Type":"ContainerDied","Data":"943333632efbfbf6251d2da0a6f3592d760b3a1be6b03002e16b1751ad7c126d"} Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.079637 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="943333632efbfbf6251d2da0a6f3592d760b3a1be6b03002e16b1751ad7c126d" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.080901 4909 generic.go:334] "Generic (PLEG): container finished" podID="d4589304-68d2-48c9-a691-e34a9cb4c75b" containerID="397ade4dfec5a408d839ad6fb1e26085b9b09d9f93c56399c0d7b99f36c1aded" exitCode=0 Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.080938 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d4cb98fc-6tg42" event={"ID":"d4589304-68d2-48c9-a691-e34a9cb4c75b","Type":"ContainerDied","Data":"397ade4dfec5a408d839ad6fb1e26085b9b09d9f93c56399c0d7b99f36c1aded"} Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.083995 4909 generic.go:334] "Generic (PLEG): container finished" podID="9a43fc6d-3442-4921-93bc-ef5ab2273a78" 
containerID="3bc7584cdea845ca454ea508cc0c6b2466da6ae49cb6266de80fbde89deebdb7" exitCode=0 Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.084018 4909 generic.go:334] "Generic (PLEG): container finished" podID="9a43fc6d-3442-4921-93bc-ef5ab2273a78" containerID="166e3b47bb57c5cbc573f49e8b964cfffa3a76870bbf15fbe19e47476b7d8a1f" exitCode=2 Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.084027 4909 generic.go:334] "Generic (PLEG): container finished" podID="9a43fc6d-3442-4921-93bc-ef5ab2273a78" containerID="1c2720d46332594a79c263f4b5b11335295a93ffe32d4ee3e0fd204905c00fa2" exitCode=0 Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.084063 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9a43fc6d-3442-4921-93bc-ef5ab2273a78","Type":"ContainerDied","Data":"3bc7584cdea845ca454ea508cc0c6b2466da6ae49cb6266de80fbde89deebdb7"} Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.084085 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9a43fc6d-3442-4921-93bc-ef5ab2273a78","Type":"ContainerDied","Data":"166e3b47bb57c5cbc573f49e8b964cfffa3a76870bbf15fbe19e47476b7d8a1f"} Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.084095 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9a43fc6d-3442-4921-93bc-ef5ab2273a78","Type":"ContainerDied","Data":"1c2720d46332594a79c263f4b5b11335295a93ffe32d4ee3e0fd204905c00fa2"} Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.087671 4909 generic.go:334] "Generic (PLEG): container finished" podID="bc6ef35e-2d40-46c4-9cf7-e2e8adae067a" containerID="b85d85d06b59a99a123c0147182bc23c0f8782f8baf6bdc50f6e3c5737f2292a" exitCode=0 Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.087722 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75589bd9c8-npg4p" 
event={"ID":"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a","Type":"ContainerDied","Data":"b85d85d06b59a99a123c0147182bc23c0f8782f8baf6bdc50f6e3c5737f2292a"} Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.091008 4909 generic.go:334] "Generic (PLEG): container finished" podID="9b24f572-8a70-4a46-b3cf-e50ae4859892" containerID="e29f67614fb9cccc18985f89282ec44019b367ff92f09a6e3769b0344cbb1193" exitCode=0 Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.091069 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9b24f572-8a70-4a46-b3cf-e50ae4859892","Type":"ContainerDied","Data":"e29f67614fb9cccc18985f89282ec44019b367ff92f09a6e3769b0344cbb1193"} Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.105774 4909 generic.go:334] "Generic (PLEG): container finished" podID="ba559652-0584-49c5-91d6-7d7fdd596dc2" containerID="7f5547c30d19b6a3e624654a6b30546acaaa5bb4e35836a31ce664cbd4d3448e" exitCode=0 Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.106939 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ba559652-0584-49c5-91d6-7d7fdd596dc2","Type":"ContainerDied","Data":"7f5547c30d19b6a3e624654a6b30546acaaa5bb4e35836a31ce664cbd4d3448e"} Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.108959 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5d56d8dff8-fh9sw" event={"ID":"b6be13cc-01ed-441f-b2c9-dc024fcb4b18","Type":"ContainerDied","Data":"ca078b1d864dcc02751079f14dd3708fe78bce7b544c8de78b4bb2d92c23e89e"} Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.108989 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca078b1d864dcc02751079f14dd3708fe78bce7b544c8de78b4bb2d92c23e89e" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.110388 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"14a8699a-66db-48c6-8834-bda4e21ef1d9","Type":"ContainerDied","Data":"c689550316c47545c7dfb54660b66e0d29cc015454099c5186b37d663c0c183c"} Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.110415 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c689550316c47545c7dfb54660b66e0d29cc015454099c5186b37d663c0c183c" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.114713 4909 generic.go:334] "Generic (PLEG): container finished" podID="d5e2e76d-5e02-4488-ae0d-5acbdb1aa060" containerID="00d102ada4d1a621cc75e76f5d11972c92c765cc655c0852af742709db540a93" exitCode=1 Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.114774 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-blz7s" event={"ID":"d5e2e76d-5e02-4488-ae0d-5acbdb1aa060","Type":"ContainerDied","Data":"00d102ada4d1a621cc75e76f5d11972c92c765cc655c0852af742709db540a93"} Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.117227 4909 generic.go:334] "Generic (PLEG): container finished" podID="7202de6a-156c-4c06-9e08-3e62cfcf367e" containerID="a20507244041d23b32f190d2a807a0eee356b71acfba50ced33b0ba77be75a3b" exitCode=0 Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.117283 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7202de6a-156c-4c06-9e08-3e62cfcf367e","Type":"ContainerDied","Data":"a20507244041d23b32f190d2a807a0eee356b71acfba50ced33b0ba77be75a3b"} Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.117304 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7202de6a-156c-4c06-9e08-3e62cfcf367e","Type":"ContainerDied","Data":"6d0e811b8f50ead3e9e919e4fcc41893c00d5a1a8955f6a999108e5c8dc0292d"} Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.117317 4909 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="6d0e811b8f50ead3e9e919e4fcc41893c00d5a1a8955f6a999108e5c8dc0292d" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.119057 4909 generic.go:334] "Generic (PLEG): container finished" podID="236e10c6-5b7d-4f9e-b82a-5c68edc93692" containerID="a5429c7323d593269c829013791c5ec02f6c2311d2ea72699e259a5d72454bac" exitCode=2 Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.119095 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"236e10c6-5b7d-4f9e-b82a-5c68edc93692","Type":"ContainerDied","Data":"a5429c7323d593269c829013791c5ec02f6c2311d2ea72699e259a5d72454bac"} Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.119111 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"236e10c6-5b7d-4f9e-b82a-5c68edc93692","Type":"ContainerDied","Data":"1269fcc242324b31f01bdc44584ebc1d0b910481dab215f3c382b9ce22b3d69a"} Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.119120 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1269fcc242324b31f01bdc44584ebc1d0b910481dab215f3c382b9ce22b3d69a" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.120597 4909 generic.go:334] "Generic (PLEG): container finished" podID="8c52b752-391b-4770-9191-3494df4e3999" containerID="79243ba3eb5d42f1bd0f138c616462a4f129bde3f205dc9026f0724a065bb0f7" exitCode=0 Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.120633 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8c52b752-391b-4770-9191-3494df4e3999","Type":"ContainerDied","Data":"79243ba3eb5d42f1bd0f138c616462a4f129bde3f205dc9026f0724a065bb0f7"} Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.120649 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"8c52b752-391b-4770-9191-3494df4e3999","Type":"ContainerDied","Data":"62b4ce862ca6d33d4c3131f7c2d0009cb4ae7a959c2b8124e274d729561a329f"} Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.120658 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62b4ce862ca6d33d4c3131f7c2d0009cb4ae7a959c2b8124e274d729561a329f" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.121975 4909 generic.go:334] "Generic (PLEG): container finished" podID="2399f7a5-86e0-46bd-9f3d-624d1208b9cc" containerID="e61cb4193c75698f10d9c67582e2641c6ccc4d6732028cd8616852c78bc88b51" exitCode=0 Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.122014 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2399f7a5-86e0-46bd-9f3d-624d1208b9cc","Type":"ContainerDied","Data":"e61cb4193c75698f10d9c67582e2641c6ccc4d6732028cd8616852c78bc88b51"} Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.124261 4909 generic.go:334] "Generic (PLEG): container finished" podID="ddda8a20-60ba-4ae9-837a-44fa44518b8a" containerID="e4c6472255c48be75153726b320528677bd55174125c6e07a90c92c9ea5dd4e3" exitCode=0 Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.124282 4909 generic.go:334] "Generic (PLEG): container finished" podID="ddda8a20-60ba-4ae9-837a-44fa44518b8a" containerID="d022e9f76cf1abc36a5058aa5c728c3e6e96256eb301dd364d41444bb1c46cc1" exitCode=143 Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.124301 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6bbd5778f6-n6mw6" event={"ID":"ddda8a20-60ba-4ae9-837a-44fa44518b8a","Type":"ContainerDied","Data":"e4c6472255c48be75153726b320528677bd55174125c6e07a90c92c9ea5dd4e3"} Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.124324 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6bbd5778f6-n6mw6" 
event={"ID":"ddda8a20-60ba-4ae9-837a-44fa44518b8a","Type":"ContainerDied","Data":"d022e9f76cf1abc36a5058aa5c728c3e6e96256eb301dd364d41444bb1c46cc1"} Feb 02 10:54:50 crc kubenswrapper[4909]: E0202 10:54:50.424533 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-rrsh9 operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-910a-account-create-update-kc8n7" podUID="7913b849-56c2-493f-822d-f8f15dfc4fe1" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.427573 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5d56d8dff8-fh9sw" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.430133 4909 scope.go:117] "RemoveContainer" containerID="7e1c8b8ad2c25126925d89042eabffbf703c5cb8a560053f5e2057f259e35466" Feb 02 10:54:50 crc kubenswrapper[4909]: E0202 10:54:50.430722 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e1c8b8ad2c25126925d89042eabffbf703c5cb8a560053f5e2057f259e35466\": container with ID starting with 7e1c8b8ad2c25126925d89042eabffbf703c5cb8a560053f5e2057f259e35466 not found: ID does not exist" containerID="7e1c8b8ad2c25126925d89042eabffbf703c5cb8a560053f5e2057f259e35466" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.430756 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e1c8b8ad2c25126925d89042eabffbf703c5cb8a560053f5e2057f259e35466"} err="failed to get container status \"7e1c8b8ad2c25126925d89042eabffbf703c5cb8a560053f5e2057f259e35466\": rpc error: code = NotFound desc = could not find container \"7e1c8b8ad2c25126925d89042eabffbf703c5cb8a560053f5e2057f259e35466\": container with ID starting with 7e1c8b8ad2c25126925d89042eabffbf703c5cb8a560053f5e2057f259e35466 not found: ID does not exist" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.430783 
4909 scope.go:117] "RemoveContainer" containerID="f486bfdd773f8cf45643a4ed2ecd1ba0189d94637e89e939369da9808daa4c06" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.449589 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.487772 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.497543 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.514185 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-public-tls-certs\") pod \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\" (UID: \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.514236 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-combined-ca-bundle\") pod \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\" (UID: \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.514397 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krr4b\" (UniqueName: \"kubernetes.io/projected/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-kube-api-access-krr4b\") pod \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\" (UID: \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.514447 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-logs\") pod 
\"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\" (UID: \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.514879 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-config-data\") pod \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\" (UID: \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.515477 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-scripts\") pod \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\" (UID: \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.515565 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-internal-tls-certs\") pod \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\" (UID: \"b6be13cc-01ed-441f-b2c9-dc024fcb4b18\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.537455 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-logs" (OuterVolumeSpecName: "logs") pod "b6be13cc-01ed-441f-b2c9-dc024fcb4b18" (UID: "b6be13cc-01ed-441f-b2c9-dc024fcb4b18"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.540579 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.542221 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-scripts" (OuterVolumeSpecName: "scripts") pod "b6be13cc-01ed-441f-b2c9-dc024fcb4b18" (UID: "b6be13cc-01ed-441f-b2c9-dc024fcb4b18"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.542375 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.566896 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-kube-api-access-krr4b" (OuterVolumeSpecName: "kube-api-access-krr4b") pod "b6be13cc-01ed-441f-b2c9-dc024fcb4b18" (UID: "b6be13cc-01ed-441f-b2c9-dc024fcb4b18"). InnerVolumeSpecName "kube-api-access-krr4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.584852 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.624899 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/97409ffd-f1ab-4a1a-9939-a041a4085b1a-httpd-run\") pod \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.624962 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a8699a-66db-48c6-8834-bda4e21ef1d9-logs\") pod \"14a8699a-66db-48c6-8834-bda4e21ef1d9\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.625524 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-scripts\") pod \"14a8699a-66db-48c6-8834-bda4e21ef1d9\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.625564 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97409ffd-f1ab-4a1a-9939-a041a4085b1a-logs\") pod \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.625581 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.625598 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/236e10c6-5b7d-4f9e-b82a-5c68edc93692-kube-state-metrics-tls-config\") pod \"236e10c6-5b7d-4f9e-b82a-5c68edc93692\" (UID: \"236e10c6-5b7d-4f9e-b82a-5c68edc93692\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.625644 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5v9j\" (UniqueName: \"kubernetes.io/projected/7f0cf9e9-b663-4be3-a435-c7dd6deea228-kube-api-access-x5v9j\") pod \"7f0cf9e9-b663-4be3-a435-c7dd6deea228\" (UID: \"7f0cf9e9-b663-4be3-a435-c7dd6deea228\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.625696 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f0cf9e9-b663-4be3-a435-c7dd6deea228-combined-ca-bundle\") pod \"7f0cf9e9-b663-4be3-a435-c7dd6deea228\" (UID: \"7f0cf9e9-b663-4be3-a435-c7dd6deea228\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.625727 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-config-data-custom\") pod \"14a8699a-66db-48c6-8834-bda4e21ef1d9\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.625744 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97409ffd-f1ab-4a1a-9939-a041a4085b1a-combined-ca-bundle\") pod \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.625763 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7202de6a-156c-4c06-9e08-3e62cfcf367e-combined-ca-bundle\") pod \"7202de6a-156c-4c06-9e08-3e62cfcf367e\" (UID: \"7202de6a-156c-4c06-9e08-3e62cfcf367e\") " Feb 02 
10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.625783 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7202de6a-156c-4c06-9e08-3e62cfcf367e-public-tls-certs\") pod \"7202de6a-156c-4c06-9e08-3e62cfcf367e\" (UID: \"7202de6a-156c-4c06-9e08-3e62cfcf367e\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.625798 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/236e10c6-5b7d-4f9e-b82a-5c68edc93692-combined-ca-bundle\") pod \"236e10c6-5b7d-4f9e-b82a-5c68edc93692\" (UID: \"236e10c6-5b7d-4f9e-b82a-5c68edc93692\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.625847 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f0cf9e9-b663-4be3-a435-c7dd6deea228-logs\") pod \"7f0cf9e9-b663-4be3-a435-c7dd6deea228\" (UID: \"7f0cf9e9-b663-4be3-a435-c7dd6deea228\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.625873 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhq9p\" (UniqueName: \"kubernetes.io/projected/97409ffd-f1ab-4a1a-9939-a041a4085b1a-kube-api-access-xhq9p\") pod \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.626528 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97409ffd-f1ab-4a1a-9939-a041a4085b1a-scripts\") pod \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.626564 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wckzf\" (UniqueName: 
\"kubernetes.io/projected/7202de6a-156c-4c06-9e08-3e62cfcf367e-kube-api-access-wckzf\") pod \"7202de6a-156c-4c06-9e08-3e62cfcf367e\" (UID: \"7202de6a-156c-4c06-9e08-3e62cfcf367e\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.626609 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-public-tls-certs\") pod \"14a8699a-66db-48c6-8834-bda4e21ef1d9\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.626626 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-combined-ca-bundle\") pod \"14a8699a-66db-48c6-8834-bda4e21ef1d9\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.626640 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14a8699a-66db-48c6-8834-bda4e21ef1d9-etc-machine-id\") pod \"14a8699a-66db-48c6-8834-bda4e21ef1d9\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.626657 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-internal-tls-certs\") pod \"14a8699a-66db-48c6-8834-bda4e21ef1d9\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.626675 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7202de6a-156c-4c06-9e08-3e62cfcf367e-internal-tls-certs\") pod \"7202de6a-156c-4c06-9e08-3e62cfcf367e\" (UID: \"7202de6a-156c-4c06-9e08-3e62cfcf367e\") " Feb 02 10:54:50 crc 
kubenswrapper[4909]: I0202 10:54:50.626707 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-config-data\") pod \"14a8699a-66db-48c6-8834-bda4e21ef1d9\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.626727 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvnvx\" (UniqueName: \"kubernetes.io/projected/14a8699a-66db-48c6-8834-bda4e21ef1d9-kube-api-access-vvnvx\") pod \"14a8699a-66db-48c6-8834-bda4e21ef1d9\" (UID: \"14a8699a-66db-48c6-8834-bda4e21ef1d9\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.626748 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f0cf9e9-b663-4be3-a435-c7dd6deea228-config-data\") pod \"7f0cf9e9-b663-4be3-a435-c7dd6deea228\" (UID: \"7f0cf9e9-b663-4be3-a435-c7dd6deea228\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.626767 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7202de6a-156c-4c06-9e08-3e62cfcf367e-logs\") pod \"7202de6a-156c-4c06-9e08-3e62cfcf367e\" (UID: \"7202de6a-156c-4c06-9e08-3e62cfcf367e\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.626797 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7202de6a-156c-4c06-9e08-3e62cfcf367e-config-data\") pod \"7202de6a-156c-4c06-9e08-3e62cfcf367e\" (UID: \"7202de6a-156c-4c06-9e08-3e62cfcf367e\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.626844 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/236e10c6-5b7d-4f9e-b82a-5c68edc93692-kube-state-metrics-tls-certs\") pod \"236e10c6-5b7d-4f9e-b82a-5c68edc93692\" (UID: \"236e10c6-5b7d-4f9e-b82a-5c68edc93692\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.626863 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97409ffd-f1ab-4a1a-9939-a041a4085b1a-config-data\") pod \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.626884 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97409ffd-f1ab-4a1a-9939-a041a4085b1a-internal-tls-certs\") pod \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\" (UID: \"97409ffd-f1ab-4a1a-9939-a041a4085b1a\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.626906 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skgpn\" (UniqueName: \"kubernetes.io/projected/236e10c6-5b7d-4f9e-b82a-5c68edc93692-kube-api-access-skgpn\") pod \"236e10c6-5b7d-4f9e-b82a-5c68edc93692\" (UID: \"236e10c6-5b7d-4f9e-b82a-5c68edc93692\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.626930 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f0cf9e9-b663-4be3-a435-c7dd6deea228-nova-metadata-tls-certs\") pod \"7f0cf9e9-b663-4be3-a435-c7dd6deea228\" (UID: \"7f0cf9e9-b663-4be3-a435-c7dd6deea228\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.627598 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.632330 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14a8699a-66db-48c6-8834-bda4e21ef1d9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "14a8699a-66db-48c6-8834-bda4e21ef1d9" (UID: "14a8699a-66db-48c6-8834-bda4e21ef1d9"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.633450 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.633484 4909 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/14a8699a-66db-48c6-8834-bda4e21ef1d9-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.633498 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krr4b\" (UniqueName: \"kubernetes.io/projected/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-kube-api-access-krr4b\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.633509 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.642225 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97409ffd-f1ab-4a1a-9939-a041a4085b1a-logs" (OuterVolumeSpecName: "logs") pod "97409ffd-f1ab-4a1a-9939-a041a4085b1a" (UID: "97409ffd-f1ab-4a1a-9939-a041a4085b1a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.642550 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97409ffd-f1ab-4a1a-9939-a041a4085b1a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "97409ffd-f1ab-4a1a-9939-a041a4085b1a" (UID: "97409ffd-f1ab-4a1a-9939-a041a4085b1a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.642889 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14a8699a-66db-48c6-8834-bda4e21ef1d9-logs" (OuterVolumeSpecName: "logs") pod "14a8699a-66db-48c6-8834-bda4e21ef1d9" (UID: "14a8699a-66db-48c6-8834-bda4e21ef1d9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.662320 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-75589bd9c8-npg4p" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.667700 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-config-data" (OuterVolumeSpecName: "config-data") pod "b6be13cc-01ed-441f-b2c9-dc024fcb4b18" (UID: "b6be13cc-01ed-441f-b2c9-dc024fcb4b18"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.674305 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7202de6a-156c-4c06-9e08-3e62cfcf367e-logs" (OuterVolumeSpecName: "logs") pod "7202de6a-156c-4c06-9e08-3e62cfcf367e" (UID: "7202de6a-156c-4c06-9e08-3e62cfcf367e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.690905 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "97409ffd-f1ab-4a1a-9939-a041a4085b1a" (UID: "97409ffd-f1ab-4a1a-9939-a041a4085b1a"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.699300 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f0cf9e9-b663-4be3-a435-c7dd6deea228-logs" (OuterVolumeSpecName: "logs") pod "7f0cf9e9-b663-4be3-a435-c7dd6deea228" (UID: "7f0cf9e9-b663-4be3-a435-c7dd6deea228"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.699830 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14a8699a-66db-48c6-8834-bda4e21ef1d9-kube-api-access-vvnvx" (OuterVolumeSpecName: "kube-api-access-vvnvx") pod "14a8699a-66db-48c6-8834-bda4e21ef1d9" (UID: "14a8699a-66db-48c6-8834-bda4e21ef1d9"). InnerVolumeSpecName "kube-api-access-vvnvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.701119 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-scripts" (OuterVolumeSpecName: "scripts") pod "14a8699a-66db-48c6-8834-bda4e21ef1d9" (UID: "14a8699a-66db-48c6-8834-bda4e21ef1d9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.701383 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97409ffd-f1ab-4a1a-9939-a041a4085b1a-scripts" (OuterVolumeSpecName: "scripts") pod "97409ffd-f1ab-4a1a-9939-a041a4085b1a" (UID: "97409ffd-f1ab-4a1a-9939-a041a4085b1a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.701695 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7202de6a-156c-4c06-9e08-3e62cfcf367e-kube-api-access-wckzf" (OuterVolumeSpecName: "kube-api-access-wckzf") pod "7202de6a-156c-4c06-9e08-3e62cfcf367e" (UID: "7202de6a-156c-4c06-9e08-3e62cfcf367e"). InnerVolumeSpecName "kube-api-access-wckzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.702092 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.738001 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f0cf9e9-b663-4be3-a435-c7dd6deea228-kube-api-access-x5v9j" (OuterVolumeSpecName: "kube-api-access-x5v9j") pod "7f0cf9e9-b663-4be3-a435-c7dd6deea228" (UID: "7f0cf9e9-b663-4be3-a435-c7dd6deea228"). InnerVolumeSpecName "kube-api-access-x5v9j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.738558 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04232dcc-dda5-4774-b999-5104335f2da0-logs\") pod \"04232dcc-dda5-4774-b999-5104335f2da0\" (UID: \"04232dcc-dda5-4774-b999-5104335f2da0\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.738585 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c52b752-391b-4770-9191-3494df4e3999-public-tls-certs\") pod \"8c52b752-391b-4770-9191-3494df4e3999\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.738629 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-config-data\") pod \"04232dcc-dda5-4774-b999-5104335f2da0\" (UID: \"04232dcc-dda5-4774-b999-5104335f2da0\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.738677 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"8c52b752-391b-4770-9191-3494df4e3999\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.738714 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-internal-tls-certs\") pod \"04232dcc-dda5-4774-b999-5104335f2da0\" (UID: \"04232dcc-dda5-4774-b999-5104335f2da0\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.738744 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/8c52b752-391b-4770-9191-3494df4e3999-httpd-run\") pod \"8c52b752-391b-4770-9191-3494df4e3999\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.738764 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c52b752-391b-4770-9191-3494df4e3999-scripts\") pod \"8c52b752-391b-4770-9191-3494df4e3999\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.738793 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-combined-ca-bundle\") pod \"04232dcc-dda5-4774-b999-5104335f2da0\" (UID: \"04232dcc-dda5-4774-b999-5104335f2da0\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.738837 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-public-tls-certs\") pod \"04232dcc-dda5-4774-b999-5104335f2da0\" (UID: \"04232dcc-dda5-4774-b999-5104335f2da0\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.738882 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c52b752-391b-4770-9191-3494df4e3999-combined-ca-bundle\") pod \"8c52b752-391b-4770-9191-3494df4e3999\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.738900 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nw9q\" (UniqueName: \"kubernetes.io/projected/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-kube-api-access-6nw9q\") pod \"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a\" (UID: \"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 
10:54:50.738926 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-config-data-custom\") pod \"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a\" (UID: \"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.738951 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwc9w\" (UniqueName: \"kubernetes.io/projected/8c52b752-391b-4770-9191-3494df4e3999-kube-api-access-pwc9w\") pod \"8c52b752-391b-4770-9191-3494df4e3999\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.739017 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-config-data\") pod \"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a\" (UID: \"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.739034 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-combined-ca-bundle\") pod \"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a\" (UID: \"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.739065 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-config-data-custom\") pod \"04232dcc-dda5-4774-b999-5104335f2da0\" (UID: \"04232dcc-dda5-4774-b999-5104335f2da0\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.739140 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c52b752-391b-4770-9191-3494df4e3999-config-data\") pod 
\"8c52b752-391b-4770-9191-3494df4e3999\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.739180 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtjfn\" (UniqueName: \"kubernetes.io/projected/04232dcc-dda5-4774-b999-5104335f2da0-kube-api-access-rtjfn\") pod \"04232dcc-dda5-4774-b999-5104335f2da0\" (UID: \"04232dcc-dda5-4774-b999-5104335f2da0\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.739209 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c52b752-391b-4770-9191-3494df4e3999-logs\") pod \"8c52b752-391b-4770-9191-3494df4e3999\" (UID: \"8c52b752-391b-4770-9191-3494df4e3999\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.739223 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-logs\") pod \"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a\" (UID: \"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.739415 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrsh9\" (UniqueName: \"kubernetes.io/projected/7913b849-56c2-493f-822d-f8f15dfc4fe1-kube-api-access-rrsh9\") pod \"keystone-910a-account-create-update-kc8n7\" (UID: \"7913b849-56c2-493f-822d-f8f15dfc4fe1\") " pod="openstack/keystone-910a-account-create-update-kc8n7" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.739538 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7913b849-56c2-493f-822d-f8f15dfc4fe1-operator-scripts\") pod \"keystone-910a-account-create-update-kc8n7\" (UID: \"7913b849-56c2-493f-822d-f8f15dfc4fe1\") " pod="openstack/keystone-910a-account-create-update-kc8n7" Feb 02 10:54:50 crc 
kubenswrapper[4909]: I0202 10:54:50.739615 4909 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.739626 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5v9j\" (UniqueName: \"kubernetes.io/projected/7f0cf9e9-b663-4be3-a435-c7dd6deea228-kube-api-access-x5v9j\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.739636 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f0cf9e9-b663-4be3-a435-c7dd6deea228-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.739644 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97409ffd-f1ab-4a1a-9939-a041a4085b1a-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.739653 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wckzf\" (UniqueName: \"kubernetes.io/projected/7202de6a-156c-4c06-9e08-3e62cfcf367e-kube-api-access-wckzf\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.739662 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvnvx\" (UniqueName: \"kubernetes.io/projected/14a8699a-66db-48c6-8834-bda4e21ef1d9-kube-api-access-vvnvx\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.739670 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7202de6a-156c-4c06-9e08-3e62cfcf367e-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.739678 4909 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/97409ffd-f1ab-4a1a-9939-a041a4085b1a-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.739686 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.739694 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/14a8699a-66db-48c6-8834-bda4e21ef1d9-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.739702 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.739710 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97409ffd-f1ab-4a1a-9939-a041a4085b1a-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.755998 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04232dcc-dda5-4774-b999-5104335f2da0-logs" (OuterVolumeSpecName: "logs") pod "04232dcc-dda5-4774-b999-5104335f2da0" (UID: "04232dcc-dda5-4774-b999-5104335f2da0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.759699 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-logs" (OuterVolumeSpecName: "logs") pod "bc6ef35e-2d40-46c4-9cf7-e2e8adae067a" (UID: "bc6ef35e-2d40-46c4-9cf7-e2e8adae067a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.759781 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6be13cc-01ed-441f-b2c9-dc024fcb4b18" (UID: "b6be13cc-01ed-441f-b2c9-dc024fcb4b18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.760003 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c52b752-391b-4770-9191-3494df4e3999-logs" (OuterVolumeSpecName: "logs") pod "8c52b752-391b-4770-9191-3494df4e3999" (UID: "8c52b752-391b-4770-9191-3494df4e3999"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.762003 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97409ffd-f1ab-4a1a-9939-a041a4085b1a-kube-api-access-xhq9p" (OuterVolumeSpecName: "kube-api-access-xhq9p") pod "97409ffd-f1ab-4a1a-9939-a041a4085b1a" (UID: "97409ffd-f1ab-4a1a-9939-a041a4085b1a"). InnerVolumeSpecName "kube-api-access-xhq9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: E0202 10:54:50.762108 4909 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.762473 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/236e10c6-5b7d-4f9e-b82a-5c68edc93692-kube-api-access-skgpn" (OuterVolumeSpecName: "kube-api-access-skgpn") pod "236e10c6-5b7d-4f9e-b82a-5c68edc93692" (UID: "236e10c6-5b7d-4f9e-b82a-5c68edc93692"). InnerVolumeSpecName "kube-api-access-skgpn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.762562 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c52b752-391b-4770-9191-3494df4e3999-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8c52b752-391b-4770-9191-3494df4e3999" (UID: "8c52b752-391b-4770-9191-3494df4e3999"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: E0202 10:54:50.762889 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7913b849-56c2-493f-822d-f8f15dfc4fe1-operator-scripts podName:7913b849-56c2-493f-822d-f8f15dfc4fe1 nodeName:}" failed. No retries permitted until 2026-02-02 10:54:52.762864914 +0000 UTC m=+1418.508965679 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7913b849-56c2-493f-822d-f8f15dfc4fe1-operator-scripts") pod "keystone-910a-account-create-update-kc8n7" (UID: "7913b849-56c2-493f-822d-f8f15dfc4fe1") : configmap "openstack-scripts" not found Feb 02 10:54:50 crc kubenswrapper[4909]: E0202 10:54:50.763094 4909 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 02 10:54:50 crc kubenswrapper[4909]: E0202 10:54:50.763116 4909 projected.go:194] Error preparing data for projected volume kube-api-access-rrsh9 for pod openstack/keystone-910a-account-create-update-kc8n7: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 02 10:54:50 crc kubenswrapper[4909]: E0202 10:54:50.763189 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d5e2e76d-5e02-4488-ae0d-5acbdb1aa060-operator-scripts podName:d5e2e76d-5e02-4488-ae0d-5acbdb1aa060 nodeName:}" failed. No retries permitted until 2026-02-02 10:54:52.763166292 +0000 UTC m=+1418.509267097 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d5e2e76d-5e02-4488-ae0d-5acbdb1aa060-operator-scripts") pod "root-account-create-update-blz7s" (UID: "d5e2e76d-5e02-4488-ae0d-5acbdb1aa060") : configmap "openstack-scripts" not found Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.762060 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "14a8699a-66db-48c6-8834-bda4e21ef1d9" (UID: "14a8699a-66db-48c6-8834-bda4e21ef1d9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: E0202 10:54:50.763393 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7913b849-56c2-493f-822d-f8f15dfc4fe1-kube-api-access-rrsh9 podName:7913b849-56c2-493f-822d-f8f15dfc4fe1 nodeName:}" failed. No retries permitted until 2026-02-02 10:54:52.763380498 +0000 UTC m=+1418.509481313 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-rrsh9" (UniqueName: "kubernetes.io/projected/7913b849-56c2-493f-822d-f8f15dfc4fe1-kube-api-access-rrsh9") pod "keystone-910a-account-create-update-kc8n7" (UID: "7913b849-56c2-493f-822d-f8f15dfc4fe1") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.790993 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f0cf9e9-b663-4be3-a435-c7dd6deea228-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f0cf9e9-b663-4be3-a435-c7dd6deea228" (UID: "7f0cf9e9-b663-4be3-a435-c7dd6deea228"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.793071 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bc6ef35e-2d40-46c4-9cf7-e2e8adae067a" (UID: "bc6ef35e-2d40-46c4-9cf7-e2e8adae067a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.797696 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04232dcc-dda5-4774-b999-5104335f2da0-kube-api-access-rtjfn" (OuterVolumeSpecName: "kube-api-access-rtjfn") pod "04232dcc-dda5-4774-b999-5104335f2da0" (UID: "04232dcc-dda5-4774-b999-5104335f2da0"). InnerVolumeSpecName "kube-api-access-rtjfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.799073 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c52b752-391b-4770-9191-3494df4e3999-scripts" (OuterVolumeSpecName: "scripts") pod "8c52b752-391b-4770-9191-3494df4e3999" (UID: "8c52b752-391b-4770-9191-3494df4e3999"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.799173 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "04232dcc-dda5-4774-b999-5104335f2da0" (UID: "04232dcc-dda5-4774-b999-5104335f2da0"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.842550 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.842611 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.842621 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skgpn\" (UniqueName: \"kubernetes.io/projected/236e10c6-5b7d-4f9e-b82a-5c68edc93692-kube-api-access-skgpn\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.842634 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtjfn\" (UniqueName: \"kubernetes.io/projected/04232dcc-dda5-4774-b999-5104335f2da0-kube-api-access-rtjfn\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.842644 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.842702 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c52b752-391b-4770-9191-3494df4e3999-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.842712 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f0cf9e9-b663-4be3-a435-c7dd6deea228-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.842720 4909 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04232dcc-dda5-4774-b999-5104335f2da0-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.842959 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.842976 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhq9p\" (UniqueName: \"kubernetes.io/projected/97409ffd-f1ab-4a1a-9939-a041a4085b1a-kube-api-access-xhq9p\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.842985 4909 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c52b752-391b-4770-9191-3494df4e3999-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.842995 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c52b752-391b-4770-9191-3494df4e3999-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.843004 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.850385 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-kube-api-access-6nw9q" (OuterVolumeSpecName: "kube-api-access-6nw9q") pod "bc6ef35e-2d40-46c4-9cf7-e2e8adae067a" (UID: "bc6ef35e-2d40-46c4-9cf7-e2e8adae067a"). InnerVolumeSpecName "kube-api-access-6nw9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.852006 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c52b752-391b-4770-9191-3494df4e3999-kube-api-access-pwc9w" (OuterVolumeSpecName: "kube-api-access-pwc9w") pod "8c52b752-391b-4770-9191-3494df4e3999" (UID: "8c52b752-391b-4770-9191-3494df4e3999"). InnerVolumeSpecName "kube-api-access-pwc9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.852476 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "8c52b752-391b-4770-9191-3494df4e3999" (UID: "8c52b752-391b-4770-9191-3494df4e3999"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.873141 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97409ffd-f1ab-4a1a-9939-a041a4085b1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97409ffd-f1ab-4a1a-9939-a041a4085b1a" (UID: "97409ffd-f1ab-4a1a-9939-a041a4085b1a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.940927 4909 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.944107 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2399f7a5-86e0-46bd-9f3d-624d1208b9cc-combined-ca-bundle\") pod \"2399f7a5-86e0-46bd-9f3d-624d1208b9cc\" (UID: \"2399f7a5-86e0-46bd-9f3d-624d1208b9cc\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.944150 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2399f7a5-86e0-46bd-9f3d-624d1208b9cc-config-data\") pod \"2399f7a5-86e0-46bd-9f3d-624d1208b9cc\" (UID: \"2399f7a5-86e0-46bd-9f3d-624d1208b9cc\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.944517 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhp6b\" (UniqueName: \"kubernetes.io/projected/2399f7a5-86e0-46bd-9f3d-624d1208b9cc-kube-api-access-jhp6b\") pod \"2399f7a5-86e0-46bd-9f3d-624d1208b9cc\" (UID: \"2399f7a5-86e0-46bd-9f3d-624d1208b9cc\") " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.944984 4909 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.945004 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97409ffd-f1ab-4a1a-9939-a041a4085b1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.945027 4909 reconciler_common.go:286] "operationExecutor.UnmountDevice 
started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.945040 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nw9q\" (UniqueName: \"kubernetes.io/projected/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-kube-api-access-6nw9q\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.945053 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwc9w\" (UniqueName: \"kubernetes.io/projected/8c52b752-391b-4770-9191-3494df4e3999-kube-api-access-pwc9w\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.981793 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2399f7a5-86e0-46bd-9f3d-624d1208b9cc-config-data" (OuterVolumeSpecName: "config-data") pod "2399f7a5-86e0-46bd-9f3d-624d1208b9cc" (UID: "2399f7a5-86e0-46bd-9f3d-624d1208b9cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:50 crc kubenswrapper[4909]: I0202 10:54:50.993046 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2399f7a5-86e0-46bd-9f3d-624d1208b9cc-kube-api-access-jhp6b" (OuterVolumeSpecName: "kube-api-access-jhp6b") pod "2399f7a5-86e0-46bd-9f3d-624d1208b9cc" (UID: "2399f7a5-86e0-46bd-9f3d-624d1208b9cc"). InnerVolumeSpecName "kube-api-access-jhp6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.012459 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14a8699a-66db-48c6-8834-bda4e21ef1d9" (UID: "14a8699a-66db-48c6-8834-bda4e21ef1d9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.024850 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "14a8699a-66db-48c6-8834-bda4e21ef1d9" (UID: "14a8699a-66db-48c6-8834-bda4e21ef1d9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.029530 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="588438b1-3078-46cb-a08a-8f1f215ee5f4" path="/var/lib/kubelet/pods/588438b1-3078-46cb-a08a-8f1f215ee5f4/volumes" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.030285 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b651773-9cf9-4458-9c39-37b9104ff41e" path="/var/lib/kubelet/pods/9b651773-9cf9-4458-9c39-37b9104ff41e/volumes" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.031148 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2391e0d-9f13-489b-8e71-15a8da6cfbfe" path="/var/lib/kubelet/pods/d2391e0d-9f13-489b-8e71-15a8da6cfbfe/volumes" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.037605 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc6ef35e-2d40-46c4-9cf7-e2e8adae067a" (UID: "bc6ef35e-2d40-46c4-9cf7-e2e8adae067a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.048464 4909 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.048531 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.048549 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2399f7a5-86e0-46bd-9f3d-624d1208b9cc-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.048561 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.048574 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhp6b\" (UniqueName: \"kubernetes.io/projected/2399f7a5-86e0-46bd-9f3d-624d1208b9cc-kube-api-access-jhp6b\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.059399 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-config-data" (OuterVolumeSpecName: "config-data") pod "14a8699a-66db-48c6-8834-bda4e21ef1d9" (UID: "14a8699a-66db-48c6-8834-bda4e21ef1d9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.087640 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2399f7a5-86e0-46bd-9f3d-624d1208b9cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2399f7a5-86e0-46bd-9f3d-624d1208b9cc" (UID: "2399f7a5-86e0-46bd-9f3d-624d1208b9cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.131879 4909 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.135928 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7202de6a-156c-4c06-9e08-3e62cfcf367e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7202de6a-156c-4c06-9e08-3e62cfcf367e" (UID: "7202de6a-156c-4c06-9e08-3e62cfcf367e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.136899 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04232dcc-dda5-4774-b999-5104335f2da0" (UID: "04232dcc-dda5-4774-b999-5104335f2da0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.144218 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97409ffd-f1ab-4a1a-9939-a041a4085b1a-config-data" (OuterVolumeSpecName: "config-data") pod "97409ffd-f1ab-4a1a-9939-a041a4085b1a" (UID: "97409ffd-f1ab-4a1a-9939-a041a4085b1a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.148179 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.153558 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97409ffd-f1ab-4a1a-9939-a041a4085b1a-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.153888 4909 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7202de6a-156c-4c06-9e08-3e62cfcf367e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.153990 4909 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.154004 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.154013 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2399f7a5-86e0-46bd-9f3d-624d1208b9cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.154022 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.159043 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-75589bd9c8-npg4p" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.166635 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.177071 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6bbd5778f6-n6mw6" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.191436 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6d4cb98fc-6tg42" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.191795 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d4cb98fc-6tg42" event={"ID":"d4589304-68d2-48c9-a691-e34a9cb4c75b","Type":"ContainerDied","Data":"d45f2f677836f43b339590ff0f90600a91d9a2459322b376e7c3797133b37fbf"} Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.191853 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ba559652-0584-49c5-91d6-7d7fdd596dc2","Type":"ContainerDied","Data":"f6edd9044a4aea805102e06530f56978ae8028bdfcf591965857f26a4ce79788"} Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.191871 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6edd9044a4aea805102e06530f56978ae8028bdfcf591965857f26a4ce79788" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.191882 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2399f7a5-86e0-46bd-9f3d-624d1208b9cc","Type":"ContainerDied","Data":"a0f1e33d6f4f839c914c2fcd7281839ff059cf40f2d13b954274a4f08f99dc88"} Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.191895 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75589bd9c8-npg4p" 
event={"ID":"bc6ef35e-2d40-46c4-9cf7-e2e8adae067a","Type":"ContainerDied","Data":"3666c9dd742bb2e660a2e5f09ee5ad6371c8f5a2cea7da3cc530291b2492447c"} Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.191912 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9b24f572-8a70-4a46-b3cf-e50ae4859892","Type":"ContainerDied","Data":"a5a62e112f12cc73a9b1a64250d6c437a2ab6a6453b3ed40c4fdc0c5c3f85c8e"} Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.191925 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5a62e112f12cc73a9b1a64250d6c437a2ab6a6453b3ed40c4fdc0c5c3f85c8e" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.191935 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6bbd5778f6-n6mw6" event={"ID":"ddda8a20-60ba-4ae9-837a-44fa44518b8a","Type":"ContainerDied","Data":"1394fd3d1fe33a21fdb054a3861a35490ddf37761eae2045385f28453f46ff3a"} Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.191951 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-blz7s" event={"ID":"d5e2e76d-5e02-4488-ae0d-5acbdb1aa060","Type":"ContainerDied","Data":"50c92004cccd536ad46a63bf928a30fe9638398efd3f8ced1d508d7cc8a24b19"} Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.191964 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50c92004cccd536ad46a63bf928a30fe9638398efd3f8ced1d508d7cc8a24b19" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.191982 4909 scope.go:117] "RemoveContainer" containerID="397ade4dfec5a408d839ad6fb1e26085b9b09d9f93c56399c0d7b99f36c1aded" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.193710 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.193923 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.195757 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.196700 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.196711 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"9dab4432-0762-45a8-88ab-3a99217a790f","Type":"ContainerDied","Data":"9b7ddf81296dc46ea659b3831d6bab2f73b124d8cb5ba35a9e22af4b77f8a85b"} Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.197434 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5d56d8dff8-fh9sw" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.197537 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6c7fbb5f9b-4fbwp" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.197607 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.198239 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.199044 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-910a-account-create-update-kc8n7" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.201172 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.213010 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.220548 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.227773 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/236e10c6-5b7d-4f9e-b82a-5c68edc93692-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "236e10c6-5b7d-4f9e-b82a-5c68edc93692" (UID: "236e10c6-5b7d-4f9e-b82a-5c68edc93692"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.227886 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c52b752-391b-4770-9191-3494df4e3999-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8c52b752-391b-4770-9191-3494df4e3999" (UID: "8c52b752-391b-4770-9191-3494df4e3999"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.233538 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-config-data" (OuterVolumeSpecName: "config-data") pod "bc6ef35e-2d40-46c4-9cf7-e2e8adae067a" (UID: "bc6ef35e-2d40-46c4-9cf7-e2e8adae067a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.234474 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.237955 4909 scope.go:117] "RemoveContainer" containerID="766cdd0ecebd2c830eb9eb4207f7f04fbf9e06724a13a143f2ffa40073116503" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.255479 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f0cf9e9-b663-4be3-a435-c7dd6deea228-config-data" (OuterVolumeSpecName: "config-data") pod "7f0cf9e9-b663-4be3-a435-c7dd6deea228" (UID: "7f0cf9e9-b663-4be3-a435-c7dd6deea228"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.259646 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpgnr\" (UniqueName: \"kubernetes.io/projected/9dab4432-0762-45a8-88ab-3a99217a790f-kube-api-access-wpgnr\") pod \"9dab4432-0762-45a8-88ab-3a99217a790f\" (UID: \"9dab4432-0762-45a8-88ab-3a99217a790f\") " Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.259721 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jd56h\" (UniqueName: \"kubernetes.io/projected/d4589304-68d2-48c9-a691-e34a9cb4c75b-kube-api-access-jd56h\") pod \"d4589304-68d2-48c9-a691-e34a9cb4c75b\" (UID: \"d4589304-68d2-48c9-a691-e34a9cb4c75b\") " Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.259750 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-config-data-custom\") pod \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\" (UID: \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\") " Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.259776 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-public-tls-certs\") pod \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\" (UID: \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\") " Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.259800 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba559652-0584-49c5-91d6-7d7fdd596dc2-config-data\") pod \"ba559652-0584-49c5-91d6-7d7fdd596dc2\" (UID: \"ba559652-0584-49c5-91d6-7d7fdd596dc2\") " Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.259851 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9dab4432-0762-45a8-88ab-3a99217a790f-kolla-config\") pod \"9dab4432-0762-45a8-88ab-3a99217a790f\" (UID: \"9dab4432-0762-45a8-88ab-3a99217a790f\") " Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.259884 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58h4b\" (UniqueName: \"kubernetes.io/projected/ba559652-0584-49c5-91d6-7d7fdd596dc2-kube-api-access-58h4b\") pod \"ba559652-0584-49c5-91d6-7d7fdd596dc2\" (UID: \"ba559652-0584-49c5-91d6-7d7fdd596dc2\") " Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.260006 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-combined-ca-bundle\") pod \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\" (UID: \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\") " Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.260037 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddda8a20-60ba-4ae9-837a-44fa44518b8a-logs\") pod \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\" (UID: \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\") " Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.260059 4909 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-internal-tls-certs\") pod \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\" (UID: \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\") " Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.261154 4909 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c52b752-391b-4770-9191-3494df4e3999-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.261181 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f0cf9e9-b663-4be3-a435-c7dd6deea228-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.261195 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.261208 4909 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/236e10c6-5b7d-4f9e-b82a-5c68edc93692-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.267024 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ddda8a20-60ba-4ae9-837a-44fa44518b8a-logs" (OuterVolumeSpecName: "logs") pod "ddda8a20-60ba-4ae9-837a-44fa44518b8a" (UID: "ddda8a20-60ba-4ae9-837a-44fa44518b8a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.267198 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-blz7s" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.267665 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dab4432-0762-45a8-88ab-3a99217a790f-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "9dab4432-0762-45a8-88ab-3a99217a790f" (UID: "9dab4432-0762-45a8-88ab-3a99217a790f"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.273140 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.274086 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7202de6a-156c-4c06-9e08-3e62cfcf367e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7202de6a-156c-4c06-9e08-3e62cfcf367e" (UID: "7202de6a-156c-4c06-9e08-3e62cfcf367e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.275412 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4589304-68d2-48c9-a691-e34a9cb4c75b-kube-api-access-jd56h" (OuterVolumeSpecName: "kube-api-access-jd56h") pod "d4589304-68d2-48c9-a691-e34a9cb4c75b" (UID: "d4589304-68d2-48c9-a691-e34a9cb4c75b"). InnerVolumeSpecName "kube-api-access-jd56h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.278767 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ddda8a20-60ba-4ae9-837a-44fa44518b8a" (UID: "ddda8a20-60ba-4ae9-837a-44fa44518b8a"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.281642 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b6be13cc-01ed-441f-b2c9-dc024fcb4b18" (UID: "b6be13cc-01ed-441f-b2c9-dc024fcb4b18"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.279002 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-910a-account-create-update-kc8n7" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.281552 4909 scope.go:117] "RemoveContainer" containerID="e61cb4193c75698f10d9c67582e2641c6ccc4d6732028cd8616852c78bc88b51" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.283768 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dab4432-0762-45a8-88ab-3a99217a790f-kube-api-access-wpgnr" (OuterVolumeSpecName: "kube-api-access-wpgnr") pod "9dab4432-0762-45a8-88ab-3a99217a790f" (UID: "9dab4432-0762-45a8-88ab-3a99217a790f"). InnerVolumeSpecName "kube-api-access-wpgnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.284468 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/236e10c6-5b7d-4f9e-b82a-5c68edc93692-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "236e10c6-5b7d-4f9e-b82a-5c68edc93692" (UID: "236e10c6-5b7d-4f9e-b82a-5c68edc93692"). InnerVolumeSpecName "kube-state-metrics-tls-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.291285 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba559652-0584-49c5-91d6-7d7fdd596dc2-kube-api-access-58h4b" (OuterVolumeSpecName: "kube-api-access-58h4b") pod "ba559652-0584-49c5-91d6-7d7fdd596dc2" (UID: "ba559652-0584-49c5-91d6-7d7fdd596dc2"). InnerVolumeSpecName "kube-api-access-58h4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.327099 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7202de6a-156c-4c06-9e08-3e62cfcf367e-config-data" (OuterVolumeSpecName: "config-data") pod "7202de6a-156c-4c06-9e08-3e62cfcf367e" (UID: "7202de6a-156c-4c06-9e08-3e62cfcf367e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.332463 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-config-data" (OuterVolumeSpecName: "config-data") pod "04232dcc-dda5-4774-b999-5104335f2da0" (UID: "04232dcc-dda5-4774-b999-5104335f2da0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.337462 4909 scope.go:117] "RemoveContainer" containerID="b85d85d06b59a99a123c0147182bc23c0f8782f8baf6bdc50f6e3c5737f2292a" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.348233 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba559652-0584-49c5-91d6-7d7fdd596dc2-config-data" (OuterVolumeSpecName: "config-data") pod "ba559652-0584-49c5-91d6-7d7fdd596dc2" (UID: "ba559652-0584-49c5-91d6-7d7fdd596dc2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.351717 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ddda8a20-60ba-4ae9-837a-44fa44518b8a" (UID: "ddda8a20-60ba-4ae9-837a-44fa44518b8a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.353515 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddda8a20-60ba-4ae9-837a-44fa44518b8a" (UID: "ddda8a20-60ba-4ae9-837a-44fa44518b8a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.361370 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5e2e76d-5e02-4488-ae0d-5acbdb1aa060-operator-scripts\") pod \"d5e2e76d-5e02-4488-ae0d-5acbdb1aa060\" (UID: \"d5e2e76d-5e02-4488-ae0d-5acbdb1aa060\") " Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.361429 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4589304-68d2-48c9-a691-e34a9cb4c75b-config-data\") pod \"d4589304-68d2-48c9-a691-e34a9cb4c75b\" (UID: \"d4589304-68d2-48c9-a691-e34a9cb4c75b\") " Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.361449 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4589304-68d2-48c9-a691-e34a9cb4c75b-combined-ca-bundle\") pod \"d4589304-68d2-48c9-a691-e34a9cb4c75b\" (UID: 
\"d4589304-68d2-48c9-a691-e34a9cb4c75b\") " Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.361480 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8j8j\" (UniqueName: \"kubernetes.io/projected/ddda8a20-60ba-4ae9-837a-44fa44518b8a-kube-api-access-g8j8j\") pod \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\" (UID: \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\") " Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.361499 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dab4432-0762-45a8-88ab-3a99217a790f-combined-ca-bundle\") pod \"9dab4432-0762-45a8-88ab-3a99217a790f\" (UID: \"9dab4432-0762-45a8-88ab-3a99217a790f\") " Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.361515 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsh84\" (UniqueName: \"kubernetes.io/projected/9b24f572-8a70-4a46-b3cf-e50ae4859892-kube-api-access-lsh84\") pod \"9b24f572-8a70-4a46-b3cf-e50ae4859892\" (UID: \"9b24f572-8a70-4a46-b3cf-e50ae4859892\") " Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.361532 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4589304-68d2-48c9-a691-e34a9cb4c75b-logs\") pod \"d4589304-68d2-48c9-a691-e34a9cb4c75b\" (UID: \"d4589304-68d2-48c9-a691-e34a9cb4c75b\") " Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.361560 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dab4432-0762-45a8-88ab-3a99217a790f-memcached-tls-certs\") pod \"9dab4432-0762-45a8-88ab-3a99217a790f\" (UID: \"9dab4432-0762-45a8-88ab-3a99217a790f\") " Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.361599 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba559652-0584-49c5-91d6-7d7fdd596dc2-combined-ca-bundle\") pod \"ba559652-0584-49c5-91d6-7d7fdd596dc2\" (UID: \"ba559652-0584-49c5-91d6-7d7fdd596dc2\") " Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.361619 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b24f572-8a70-4a46-b3cf-e50ae4859892-scripts\") pod \"9b24f572-8a70-4a46-b3cf-e50ae4859892\" (UID: \"9b24f572-8a70-4a46-b3cf-e50ae4859892\") " Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.361642 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4589304-68d2-48c9-a691-e34a9cb4c75b-config-data-custom\") pod \"d4589304-68d2-48c9-a691-e34a9cb4c75b\" (UID: \"d4589304-68d2-48c9-a691-e34a9cb4c75b\") " Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.361748 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-config-data\") pod \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\" (UID: \"ddda8a20-60ba-4ae9-837a-44fa44518b8a\") " Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.361769 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w68r\" (UniqueName: \"kubernetes.io/projected/d5e2e76d-5e02-4488-ae0d-5acbdb1aa060-kube-api-access-7w68r\") pod \"d5e2e76d-5e02-4488-ae0d-5acbdb1aa060\" (UID: \"d5e2e76d-5e02-4488-ae0d-5acbdb1aa060\") " Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.361817 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b24f572-8a70-4a46-b3cf-e50ae4859892-config-data-custom\") pod \"9b24f572-8a70-4a46-b3cf-e50ae4859892\" (UID: \"9b24f572-8a70-4a46-b3cf-e50ae4859892\") " Feb 02 10:54:51 
crc kubenswrapper[4909]: I0202 10:54:51.361844 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b24f572-8a70-4a46-b3cf-e50ae4859892-config-data\") pod \"9b24f572-8a70-4a46-b3cf-e50ae4859892\" (UID: \"9b24f572-8a70-4a46-b3cf-e50ae4859892\") " Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.361883 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9b24f572-8a70-4a46-b3cf-e50ae4859892-etc-machine-id\") pod \"9b24f572-8a70-4a46-b3cf-e50ae4859892\" (UID: \"9b24f572-8a70-4a46-b3cf-e50ae4859892\") " Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.361920 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9dab4432-0762-45a8-88ab-3a99217a790f-config-data\") pod \"9dab4432-0762-45a8-88ab-3a99217a790f\" (UID: \"9dab4432-0762-45a8-88ab-3a99217a790f\") " Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.361934 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b24f572-8a70-4a46-b3cf-e50ae4859892-combined-ca-bundle\") pod \"9b24f572-8a70-4a46-b3cf-e50ae4859892\" (UID: \"9b24f572-8a70-4a46-b3cf-e50ae4859892\") " Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.362543 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.362562 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7202de6a-156c-4c06-9e08-3e62cfcf367e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.362572 4909 
reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.362581 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7202de6a-156c-4c06-9e08-3e62cfcf367e-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.362591 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpgnr\" (UniqueName: \"kubernetes.io/projected/9dab4432-0762-45a8-88ab-3a99217a790f-kube-api-access-wpgnr\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.362600 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jd56h\" (UniqueName: \"kubernetes.io/projected/d4589304-68d2-48c9-a691-e34a9cb4c75b-kube-api-access-jd56h\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.362609 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.362618 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba559652-0584-49c5-91d6-7d7fdd596dc2-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.362628 4909 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9dab4432-0762-45a8-88ab-3a99217a790f-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.362636 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58h4b\" (UniqueName: 
\"kubernetes.io/projected/ba559652-0584-49c5-91d6-7d7fdd596dc2-kube-api-access-58h4b\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.362646 4909 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/236e10c6-5b7d-4f9e-b82a-5c68edc93692-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.362655 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.362664 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ddda8a20-60ba-4ae9-837a-44fa44518b8a-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.362672 4909 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.363223 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4589304-68d2-48c9-a691-e34a9cb4c75b-logs" (OuterVolumeSpecName: "logs") pod "d4589304-68d2-48c9-a691-e34a9cb4c75b" (UID: "d4589304-68d2-48c9-a691-e34a9cb4c75b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.363252 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5e2e76d-5e02-4488-ae0d-5acbdb1aa060-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d5e2e76d-5e02-4488-ae0d-5acbdb1aa060" (UID: "d5e2e76d-5e02-4488-ae0d-5acbdb1aa060"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.367337 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b24f572-8a70-4a46-b3cf-e50ae4859892-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9b24f572-8a70-4a46-b3cf-e50ae4859892" (UID: "9b24f572-8a70-4a46-b3cf-e50ae4859892"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.368635 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddda8a20-60ba-4ae9-837a-44fa44518b8a-kube-api-access-g8j8j" (OuterVolumeSpecName: "kube-api-access-g8j8j") pod "ddda8a20-60ba-4ae9-837a-44fa44518b8a" (UID: "ddda8a20-60ba-4ae9-837a-44fa44518b8a"). InnerVolumeSpecName "kube-api-access-g8j8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.369175 4909 scope.go:117] "RemoveContainer" containerID="cdf003d472b80a2499008207df29bc8a7fe1a4f1fbca8fa6aa522ad599f3f1ea" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.369943 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4589304-68d2-48c9-a691-e34a9cb4c75b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d4589304-68d2-48c9-a691-e34a9cb4c75b" (UID: "d4589304-68d2-48c9-a691-e34a9cb4c75b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.370169 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dab4432-0762-45a8-88ab-3a99217a790f-config-data" (OuterVolumeSpecName: "config-data") pod "9dab4432-0762-45a8-88ab-3a99217a790f" (UID: "9dab4432-0762-45a8-88ab-3a99217a790f"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.371550 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b24f572-8a70-4a46-b3cf-e50ae4859892-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9b24f572-8a70-4a46-b3cf-e50ae4859892" (UID: "9b24f572-8a70-4a46-b3cf-e50ae4859892"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.375646 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b24f572-8a70-4a46-b3cf-e50ae4859892-kube-api-access-lsh84" (OuterVolumeSpecName: "kube-api-access-lsh84") pod "9b24f572-8a70-4a46-b3cf-e50ae4859892" (UID: "9b24f572-8a70-4a46-b3cf-e50ae4859892"). InnerVolumeSpecName "kube-api-access-lsh84". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.377220 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b24f572-8a70-4a46-b3cf-e50ae4859892-scripts" (OuterVolumeSpecName: "scripts") pod "9b24f572-8a70-4a46-b3cf-e50ae4859892" (UID: "9b24f572-8a70-4a46-b3cf-e50ae4859892"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.400376 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5e2e76d-5e02-4488-ae0d-5acbdb1aa060-kube-api-access-7w68r" (OuterVolumeSpecName: "kube-api-access-7w68r") pod "d5e2e76d-5e02-4488-ae0d-5acbdb1aa060" (UID: "d5e2e76d-5e02-4488-ae0d-5acbdb1aa060"). InnerVolumeSpecName "kube-api-access-7w68r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.403458 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c52b752-391b-4770-9191-3494df4e3999-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c52b752-391b-4770-9191-3494df4e3999" (UID: "8c52b752-391b-4770-9191-3494df4e3999"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.410259 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "14a8699a-66db-48c6-8834-bda4e21ef1d9" (UID: "14a8699a-66db-48c6-8834-bda4e21ef1d9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.414831 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/236e10c6-5b7d-4f9e-b82a-5c68edc93692-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "236e10c6-5b7d-4f9e-b82a-5c68edc93692" (UID: "236e10c6-5b7d-4f9e-b82a-5c68edc93692"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.422715 4909 scope.go:117] "RemoveContainer" containerID="e4c6472255c48be75153726b320528677bd55174125c6e07a90c92c9ea5dd4e3" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.452426 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba559652-0584-49c5-91d6-7d7fdd596dc2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba559652-0584-49c5-91d6-7d7fdd596dc2" (UID: "ba559652-0584-49c5-91d6-7d7fdd596dc2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.464146 4909 scope.go:117] "RemoveContainer" containerID="d022e9f76cf1abc36a5058aa5c728c3e6e96256eb301dd364d41444bb1c46cc1" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.491500 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9dab4432-0762-45a8-88ab-3a99217a790f-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.491539 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/236e10c6-5b7d-4f9e-b82a-5c68edc93692-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.491557 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5e2e76d-5e02-4488-ae0d-5acbdb1aa060-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.492033 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8j8j\" (UniqueName: \"kubernetes.io/projected/ddda8a20-60ba-4ae9-837a-44fa44518b8a-kube-api-access-g8j8j\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.492058 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4589304-68d2-48c9-a691-e34a9cb4c75b-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.492071 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsh84\" (UniqueName: \"kubernetes.io/projected/9b24f572-8a70-4a46-b3cf-e50ae4859892-kube-api-access-lsh84\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.492083 4909 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/14a8699a-66db-48c6-8834-bda4e21ef1d9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.492099 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c52b752-391b-4770-9191-3494df4e3999-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.492110 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba559652-0584-49c5-91d6-7d7fdd596dc2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.492120 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b24f572-8a70-4a46-b3cf-e50ae4859892-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.492131 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4589304-68d2-48c9-a691-e34a9cb4c75b-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.492148 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w68r\" (UniqueName: \"kubernetes.io/projected/d5e2e76d-5e02-4488-ae0d-5acbdb1aa060-kube-api-access-7w68r\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.492160 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9b24f572-8a70-4a46-b3cf-e50ae4859892-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.492171 4909 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9b24f572-8a70-4a46-b3cf-e50ae4859892-etc-machine-id\") on node 
\"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: E0202 10:54:51.492699 4909 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 02 10:54:51 crc kubenswrapper[4909]: E0202 10:54:51.492771 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b441d32f-f76f-4e7b-b3fe-40e93b126567-config-data podName:b441d32f-f76f-4e7b-b3fe-40e93b126567 nodeName:}" failed. No retries permitted until 2026-02-02 10:54:59.492744438 +0000 UTC m=+1425.238845173 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/b441d32f-f76f-4e7b-b3fe-40e93b126567-config-data") pod "rabbitmq-server-0" (UID: "b441d32f-f76f-4e7b-b3fe-40e93b126567") : configmap "rabbitmq-config-data" not found Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.512731 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-config-data" (OuterVolumeSpecName: "config-data") pod "ddda8a20-60ba-4ae9-837a-44fa44518b8a" (UID: "ddda8a20-60ba-4ae9-837a-44fa44518b8a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.531066 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4589304-68d2-48c9-a691-e34a9cb4c75b-config-data" (OuterVolumeSpecName: "config-data") pod "d4589304-68d2-48c9-a691-e34a9cb4c75b" (UID: "d4589304-68d2-48c9-a691-e34a9cb4c75b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.535336 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dab4432-0762-45a8-88ab-3a99217a790f-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "9dab4432-0762-45a8-88ab-3a99217a790f" (UID: "9dab4432-0762-45a8-88ab-3a99217a790f"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.537013 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f0cf9e9-b663-4be3-a435-c7dd6deea228-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "7f0cf9e9-b663-4be3-a435-c7dd6deea228" (UID: "7f0cf9e9-b663-4be3-a435-c7dd6deea228"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.545839 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c52b752-391b-4770-9191-3494df4e3999-config-data" (OuterVolumeSpecName: "config-data") pod "8c52b752-391b-4770-9191-3494df4e3999" (UID: "8c52b752-391b-4770-9191-3494df4e3999"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.550231 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ddda8a20-60ba-4ae9-837a-44fa44518b8a" (UID: "ddda8a20-60ba-4ae9-837a-44fa44518b8a"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.564138 4909 scope.go:117] "RemoveContainer" containerID="a5be7b718c825caf194dfbbef2df9f7def820d23f84184c159660c56fbb1b591" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.575534 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-75589bd9c8-npg4p"] Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.582991 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7202de6a-156c-4c06-9e08-3e62cfcf367e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7202de6a-156c-4c06-9e08-3e62cfcf367e" (UID: "7202de6a-156c-4c06-9e08-3e62cfcf367e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.585059 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97409ffd-f1ab-4a1a-9939-a041a4085b1a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "97409ffd-f1ab-4a1a-9939-a041a4085b1a" (UID: "97409ffd-f1ab-4a1a-9939-a041a4085b1a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.587405 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-75589bd9c8-npg4p"] Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.588845 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b6be13cc-01ed-441f-b2c9-dc024fcb4b18" (UID: "b6be13cc-01ed-441f-b2c9-dc024fcb4b18"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.599199 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.599236 4909 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/97409ffd-f1ab-4a1a-9939-a041a4085b1a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.599521 4909 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f0cf9e9-b663-4be3-a435-c7dd6deea228-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.599532 4909 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ddda8a20-60ba-4ae9-837a-44fa44518b8a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.599543 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c52b752-391b-4770-9191-3494df4e3999-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.599552 4909 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6be13cc-01ed-441f-b2c9-dc024fcb4b18-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.599560 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4589304-68d2-48c9-a691-e34a9cb4c75b-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.608900 4909 
reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dab4432-0762-45a8-88ab-3a99217a790f-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.608944 4909 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7202de6a-156c-4c06-9e08-3e62cfcf367e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.614078 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.619240 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "04232dcc-dda5-4774-b999-5104335f2da0" (UID: "04232dcc-dda5-4774-b999-5104335f2da0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.620863 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.624832 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4589304-68d2-48c9-a691-e34a9cb4c75b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4589304-68d2-48c9-a691-e34a9cb4c75b" (UID: "d4589304-68d2-48c9-a691-e34a9cb4c75b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.639084 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.644840 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.649397 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dab4432-0762-45a8-88ab-3a99217a790f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9dab4432-0762-45a8-88ab-3a99217a790f" (UID: "9dab4432-0762-45a8-88ab-3a99217a790f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.710949 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4589304-68d2-48c9-a691-e34a9cb4c75b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.710983 4909 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.710992 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dab4432-0762-45a8-88ab-3a99217a790f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.728914 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "04232dcc-dda5-4774-b999-5104335f2da0" (UID: "04232dcc-dda5-4774-b999-5104335f2da0"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.733954 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b24f572-8a70-4a46-b3cf-e50ae4859892-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b24f572-8a70-4a46-b3cf-e50ae4859892" (UID: "9b24f572-8a70-4a46-b3cf-e50ae4859892"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.734624 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b24f572-8a70-4a46-b3cf-e50ae4859892-config-data" (OuterVolumeSpecName: "config-data") pod "9b24f572-8a70-4a46-b3cf-e50ae4859892" (UID: "9b24f572-8a70-4a46-b3cf-e50ae4859892"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.813767 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b24f572-8a70-4a46-b3cf-e50ae4859892-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.813797 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b24f572-8a70-4a46-b3cf-e50ae4859892-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.813820 4909 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/04232dcc-dda5-4774-b999-5104335f2da0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.938997 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6c7fbb5f9b-4fbwp"] Feb 02 10:54:51 crc kubenswrapper[4909]: I0202 10:54:51.951765 
4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6c7fbb5f9b-4fbwp"] Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.002205 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5d56d8dff8-fh9sw"] Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.005750 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5d56d8dff8-fh9sw"] Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.023335 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.041673 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.056218 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.065782 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.073970 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.081608 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.094934 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.107287 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.111064 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.118441 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Feb 02 10:54:52 crc kubenswrapper[4909]: E0202 10:54:52.120776 4909 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 02 10:54:52 crc kubenswrapper[4909]: E0202 10:54:52.120846 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1ab15f72-b249-42d5-8698-273c5afc7758-config-data podName:1ab15f72-b249-42d5-8698-273c5afc7758 nodeName:}" failed. No retries permitted until 2026-02-02 10:55:00.12082878 +0000 UTC m=+1425.866929515 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1ab15f72-b249-42d5-8698-273c5afc7758-config-data") pod "rabbitmq-cell1-server-0" (UID: "1ab15f72-b249-42d5-8698-273c5afc7758") : configmap "rabbitmq-cell1-config-data" not found Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.124756 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.203549 4909 generic.go:334] "Generic (PLEG): container finished" podID="68e55a25-f51a-49a9-af91-ffbab9ad611e" containerID="a40ccf8ab1f9299d55595a979c250007b22d501fa1ea6728aab9956000c6cce9" exitCode=0 Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.203595 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"68e55a25-f51a-49a9-af91-ffbab9ad611e","Type":"ContainerDied","Data":"a40ccf8ab1f9299d55595a979c250007b22d501fa1ea6728aab9956000c6cce9"} Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.203616 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"68e55a25-f51a-49a9-af91-ffbab9ad611e","Type":"ContainerDied","Data":"a7c5a4187e08328045571b7c3962f6647c59fbeb74ad545b12297f90a448ec37"} Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.203631 4909 scope.go:117] "RemoveContainer" containerID="a40ccf8ab1f9299d55595a979c250007b22d501fa1ea6728aab9956000c6cce9" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.203726 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.206131 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6bbd5778f6-n6mw6" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.217611 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6d4cb98fc-6tg42" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.221385 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68e55a25-f51a-49a9-af91-ffbab9ad611e-operator-scripts\") pod \"68e55a25-f51a-49a9-af91-ffbab9ad611e\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.221424 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"68e55a25-f51a-49a9-af91-ffbab9ad611e\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.221500 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/68e55a25-f51a-49a9-af91-ffbab9ad611e-config-data-default\") pod \"68e55a25-f51a-49a9-af91-ffbab9ad611e\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.221559 
4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4r9g\" (UniqueName: \"kubernetes.io/projected/68e55a25-f51a-49a9-af91-ffbab9ad611e-kube-api-access-p4r9g\") pod \"68e55a25-f51a-49a9-af91-ffbab9ad611e\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.221590 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/68e55a25-f51a-49a9-af91-ffbab9ad611e-config-data-generated\") pod \"68e55a25-f51a-49a9-af91-ffbab9ad611e\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.221615 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68e55a25-f51a-49a9-af91-ffbab9ad611e-combined-ca-bundle\") pod \"68e55a25-f51a-49a9-af91-ffbab9ad611e\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.221647 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/68e55a25-f51a-49a9-af91-ffbab9ad611e-kolla-config\") pod \"68e55a25-f51a-49a9-af91-ffbab9ad611e\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.221682 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/68e55a25-f51a-49a9-af91-ffbab9ad611e-galera-tls-certs\") pod \"68e55a25-f51a-49a9-af91-ffbab9ad611e\" (UID: \"68e55a25-f51a-49a9-af91-ffbab9ad611e\") " Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.223399 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68e55a25-f51a-49a9-af91-ffbab9ad611e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") 
pod "68e55a25-f51a-49a9-af91-ffbab9ad611e" (UID: "68e55a25-f51a-49a9-af91-ffbab9ad611e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.224495 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68e55a25-f51a-49a9-af91-ffbab9ad611e-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "68e55a25-f51a-49a9-af91-ffbab9ad611e" (UID: "68e55a25-f51a-49a9-af91-ffbab9ad611e"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.225130 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68e55a25-f51a-49a9-af91-ffbab9ad611e-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "68e55a25-f51a-49a9-af91-ffbab9ad611e" (UID: "68e55a25-f51a-49a9-af91-ffbab9ad611e"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.225216 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68e55a25-f51a-49a9-af91-ffbab9ad611e-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "68e55a25-f51a-49a9-af91-ffbab9ad611e" (UID: "68e55a25-f51a-49a9-af91-ffbab9ad611e"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.231015 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68e55a25-f51a-49a9-af91-ffbab9ad611e-kube-api-access-p4r9g" (OuterVolumeSpecName: "kube-api-access-p4r9g") pod "68e55a25-f51a-49a9-af91-ffbab9ad611e" (UID: "68e55a25-f51a-49a9-af91-ffbab9ad611e"). InnerVolumeSpecName "kube-api-access-p4r9g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.235441 4909 generic.go:334] "Generic (PLEG): container finished" podID="b441d32f-f76f-4e7b-b3fe-40e93b126567" containerID="7fce69eec287d7c5a7114d96e223cf91ca30b188e5b132915844731ca8a68ec2" exitCode=0 Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.235521 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b441d32f-f76f-4e7b-b3fe-40e93b126567","Type":"ContainerDied","Data":"7fce69eec287d7c5a7114d96e223cf91ca30b188e5b132915844731ca8a68ec2"} Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.246666 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-910a-account-create-update-kc8n7" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.246714 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.246864 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-blz7s" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.248101 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.274185 4909 scope.go:117] "RemoveContainer" containerID="41d92ab63056ddb6217280c35f43928d9c56048d9b17b5873ed61e7a5221dda9" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.281159 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "mysql-db") pod "68e55a25-f51a-49a9-af91-ffbab9ad611e" (UID: "68e55a25-f51a-49a9-af91-ffbab9ad611e"). InnerVolumeSpecName "local-storage03-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.285423 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68e55a25-f51a-49a9-af91-ffbab9ad611e-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "68e55a25-f51a-49a9-af91-ffbab9ad611e" (UID: "68e55a25-f51a-49a9-af91-ffbab9ad611e"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.293739 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6bbd5778f6-n6mw6"] Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.307872 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6bbd5778f6-n6mw6"] Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.319654 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6d4cb98fc-6tg42"] Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.319867 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68e55a25-f51a-49a9-af91-ffbab9ad611e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68e55a25-f51a-49a9-af91-ffbab9ad611e" (UID: "68e55a25-f51a-49a9-af91-ffbab9ad611e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.323323 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68e55a25-f51a-49a9-af91-ffbab9ad611e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.323354 4909 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.323367 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/68e55a25-f51a-49a9-af91-ffbab9ad611e-config-data-default\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.323378 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4r9g\" (UniqueName: \"kubernetes.io/projected/68e55a25-f51a-49a9-af91-ffbab9ad611e-kube-api-access-p4r9g\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.323389 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/68e55a25-f51a-49a9-af91-ffbab9ad611e-config-data-generated\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.323401 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68e55a25-f51a-49a9-af91-ffbab9ad611e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.323411 4909 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/68e55a25-f51a-49a9-af91-ffbab9ad611e-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:52 crc kubenswrapper[4909]: 
I0202 10:54:52.323421 4909 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/68e55a25-f51a-49a9-af91-ffbab9ad611e-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.327672 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-6d4cb98fc-6tg42"] Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.352552 4909 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.425076 4909 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.477074 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-910a-account-create-update-kc8n7"] Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.555348 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-910a-account-create-update-kc8n7"] Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.596254 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-blz7s"] Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.598235 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrsh9\" (UniqueName: \"kubernetes.io/projected/7913b849-56c2-493f-822d-f8f15dfc4fe1-kube-api-access-rrsh9\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.598265 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7913b849-56c2-493f-822d-f8f15dfc4fe1-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 
10:54:52.601379 4909 scope.go:117] "RemoveContainer" containerID="a40ccf8ab1f9299d55595a979c250007b22d501fa1ea6728aab9956000c6cce9" Feb 02 10:54:52 crc kubenswrapper[4909]: E0202 10:54:52.602086 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a40ccf8ab1f9299d55595a979c250007b22d501fa1ea6728aab9956000c6cce9\": container with ID starting with a40ccf8ab1f9299d55595a979c250007b22d501fa1ea6728aab9956000c6cce9 not found: ID does not exist" containerID="a40ccf8ab1f9299d55595a979c250007b22d501fa1ea6728aab9956000c6cce9" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.602131 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a40ccf8ab1f9299d55595a979c250007b22d501fa1ea6728aab9956000c6cce9"} err="failed to get container status \"a40ccf8ab1f9299d55595a979c250007b22d501fa1ea6728aab9956000c6cce9\": rpc error: code = NotFound desc = could not find container \"a40ccf8ab1f9299d55595a979c250007b22d501fa1ea6728aab9956000c6cce9\": container with ID starting with a40ccf8ab1f9299d55595a979c250007b22d501fa1ea6728aab9956000c6cce9 not found: ID does not exist" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.602158 4909 scope.go:117] "RemoveContainer" containerID="41d92ab63056ddb6217280c35f43928d9c56048d9b17b5873ed61e7a5221dda9" Feb 02 10:54:52 crc kubenswrapper[4909]: E0202 10:54:52.602485 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41d92ab63056ddb6217280c35f43928d9c56048d9b17b5873ed61e7a5221dda9\": container with ID starting with 41d92ab63056ddb6217280c35f43928d9c56048d9b17b5873ed61e7a5221dda9 not found: ID does not exist" containerID="41d92ab63056ddb6217280c35f43928d9c56048d9b17b5873ed61e7a5221dda9" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.602510 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"41d92ab63056ddb6217280c35f43928d9c56048d9b17b5873ed61e7a5221dda9"} err="failed to get container status \"41d92ab63056ddb6217280c35f43928d9c56048d9b17b5873ed61e7a5221dda9\": rpc error: code = NotFound desc = could not find container \"41d92ab63056ddb6217280c35f43928d9c56048d9b17b5873ed61e7a5221dda9\": container with ID starting with 41d92ab63056ddb6217280c35f43928d9c56048d9b17b5873ed61e7a5221dda9 not found: ID does not exist" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.610701 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-blz7s"] Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.629102 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.637029 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.649845 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.653852 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.715857 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.720541 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Feb 02 10:54:52 crc kubenswrapper[4909]: E0202 10:54:52.728328 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52e5a0b83ebdf2c32308bcba8bf2bcd22b63c5893664ccffa254ee3d4272e8b7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 02 10:54:52 crc kubenswrapper[4909]: E0202 
10:54:52.729679 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52e5a0b83ebdf2c32308bcba8bf2bcd22b63c5893664ccffa254ee3d4272e8b7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 02 10:54:52 crc kubenswrapper[4909]: E0202 10:54:52.730905 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52e5a0b83ebdf2c32308bcba8bf2bcd22b63c5893664ccffa254ee3d4272e8b7" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Feb 02 10:54:52 crc kubenswrapper[4909]: E0202 10:54:52.730938 4909 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="87203850-864b-4fff-b340-25e4f5c6e7c9" containerName="ovn-northd" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.733081 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.805012 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b441d32f-f76f-4e7b-b3fe-40e93b126567-erlang-cookie-secret\") pod \"b441d32f-f76f-4e7b-b3fe-40e93b126567\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.805072 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b441d32f-f76f-4e7b-b3fe-40e93b126567-plugins-conf\") pod \"b441d32f-f76f-4e7b-b3fe-40e93b126567\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.805106 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlm2v\" (UniqueName: \"kubernetes.io/projected/b441d32f-f76f-4e7b-b3fe-40e93b126567-kube-api-access-rlm2v\") pod \"b441d32f-f76f-4e7b-b3fe-40e93b126567\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.805137 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"b441d32f-f76f-4e7b-b3fe-40e93b126567\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.805161 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b441d32f-f76f-4e7b-b3fe-40e93b126567-server-conf\") pod \"b441d32f-f76f-4e7b-b3fe-40e93b126567\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.805222 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/b441d32f-f76f-4e7b-b3fe-40e93b126567-rabbitmq-plugins\") pod \"b441d32f-f76f-4e7b-b3fe-40e93b126567\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.805311 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b441d32f-f76f-4e7b-b3fe-40e93b126567-rabbitmq-confd\") pod \"b441d32f-f76f-4e7b-b3fe-40e93b126567\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.805356 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b441d32f-f76f-4e7b-b3fe-40e93b126567-rabbitmq-tls\") pod \"b441d32f-f76f-4e7b-b3fe-40e93b126567\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.805381 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b441d32f-f76f-4e7b-b3fe-40e93b126567-config-data\") pod \"b441d32f-f76f-4e7b-b3fe-40e93b126567\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.805420 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b441d32f-f76f-4e7b-b3fe-40e93b126567-rabbitmq-erlang-cookie\") pod \"b441d32f-f76f-4e7b-b3fe-40e93b126567\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.805447 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b441d32f-f76f-4e7b-b3fe-40e93b126567-pod-info\") pod \"b441d32f-f76f-4e7b-b3fe-40e93b126567\" (UID: \"b441d32f-f76f-4e7b-b3fe-40e93b126567\") " Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 
10:54:52.807764 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b441d32f-f76f-4e7b-b3fe-40e93b126567-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b441d32f-f76f-4e7b-b3fe-40e93b126567" (UID: "b441d32f-f76f-4e7b-b3fe-40e93b126567"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.814276 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b441d32f-f76f-4e7b-b3fe-40e93b126567-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b441d32f-f76f-4e7b-b3fe-40e93b126567" (UID: "b441d32f-f76f-4e7b-b3fe-40e93b126567"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.818011 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b441d32f-f76f-4e7b-b3fe-40e93b126567-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b441d32f-f76f-4e7b-b3fe-40e93b126567" (UID: "b441d32f-f76f-4e7b-b3fe-40e93b126567"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.818398 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b441d32f-f76f-4e7b-b3fe-40e93b126567-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "b441d32f-f76f-4e7b-b3fe-40e93b126567" (UID: "b441d32f-f76f-4e7b-b3fe-40e93b126567"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.829145 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "b441d32f-f76f-4e7b-b3fe-40e93b126567" (UID: "b441d32f-f76f-4e7b-b3fe-40e93b126567"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.829150 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b441d32f-f76f-4e7b-b3fe-40e93b126567-pod-info" (OuterVolumeSpecName: "pod-info") pod "b441d32f-f76f-4e7b-b3fe-40e93b126567" (UID: "b441d32f-f76f-4e7b-b3fe-40e93b126567"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.829318 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b441d32f-f76f-4e7b-b3fe-40e93b126567-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "b441d32f-f76f-4e7b-b3fe-40e93b126567" (UID: "b441d32f-f76f-4e7b-b3fe-40e93b126567"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.829396 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b441d32f-f76f-4e7b-b3fe-40e93b126567-kube-api-access-rlm2v" (OuterVolumeSpecName: "kube-api-access-rlm2v") pod "b441d32f-f76f-4e7b-b3fe-40e93b126567" (UID: "b441d32f-f76f-4e7b-b3fe-40e93b126567"). InnerVolumeSpecName "kube-api-access-rlm2v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.837896 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b441d32f-f76f-4e7b-b3fe-40e93b126567-config-data" (OuterVolumeSpecName: "config-data") pod "b441d32f-f76f-4e7b-b3fe-40e93b126567" (UID: "b441d32f-f76f-4e7b-b3fe-40e93b126567"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.850025 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b441d32f-f76f-4e7b-b3fe-40e93b126567-server-conf" (OuterVolumeSpecName: "server-conf") pod "b441d32f-f76f-4e7b-b3fe-40e93b126567" (UID: "b441d32f-f76f-4e7b-b3fe-40e93b126567"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.907710 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b441d32f-f76f-4e7b-b3fe-40e93b126567-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.907765 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b441d32f-f76f-4e7b-b3fe-40e93b126567-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.907778 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b441d32f-f76f-4e7b-b3fe-40e93b126567-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.907789 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b441d32f-f76f-4e7b-b3fe-40e93b126567-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath 
\"\"" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.907801 4909 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b441d32f-f76f-4e7b-b3fe-40e93b126567-pod-info\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.907833 4909 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b441d32f-f76f-4e7b-b3fe-40e93b126567-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.907844 4909 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b441d32f-f76f-4e7b-b3fe-40e93b126567-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.907855 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlm2v\" (UniqueName: \"kubernetes.io/projected/b441d32f-f76f-4e7b-b3fe-40e93b126567-kube-api-access-rlm2v\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.907877 4909 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.907887 4909 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b441d32f-f76f-4e7b-b3fe-40e93b126567-server-conf\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.936082 4909 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 02 10:54:52 crc kubenswrapper[4909]: I0202 10:54:52.949578 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b441d32f-f76f-4e7b-b3fe-40e93b126567-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b441d32f-f76f-4e7b-b3fe-40e93b126567" (UID: "b441d32f-f76f-4e7b-b3fe-40e93b126567"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.010116 4909 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.010150 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b441d32f-f76f-4e7b-b3fe-40e93b126567-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.025962 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04232dcc-dda5-4774-b999-5104335f2da0" path="/var/lib/kubelet/pods/04232dcc-dda5-4774-b999-5104335f2da0/volumes" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.026697 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14a8699a-66db-48c6-8834-bda4e21ef1d9" path="/var/lib/kubelet/pods/14a8699a-66db-48c6-8834-bda4e21ef1d9/volumes" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.027539 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="236e10c6-5b7d-4f9e-b82a-5c68edc93692" path="/var/lib/kubelet/pods/236e10c6-5b7d-4f9e-b82a-5c68edc93692/volumes" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.029646 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2399f7a5-86e0-46bd-9f3d-624d1208b9cc" path="/var/lib/kubelet/pods/2399f7a5-86e0-46bd-9f3d-624d1208b9cc/volumes" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.030423 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68e55a25-f51a-49a9-af91-ffbab9ad611e" 
path="/var/lib/kubelet/pods/68e55a25-f51a-49a9-af91-ffbab9ad611e/volumes" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.031582 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7202de6a-156c-4c06-9e08-3e62cfcf367e" path="/var/lib/kubelet/pods/7202de6a-156c-4c06-9e08-3e62cfcf367e/volumes" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.032677 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7913b849-56c2-493f-822d-f8f15dfc4fe1" path="/var/lib/kubelet/pods/7913b849-56c2-493f-822d-f8f15dfc4fe1/volumes" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.033100 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f0cf9e9-b663-4be3-a435-c7dd6deea228" path="/var/lib/kubelet/pods/7f0cf9e9-b663-4be3-a435-c7dd6deea228/volumes" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.033773 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c52b752-391b-4770-9191-3494df4e3999" path="/var/lib/kubelet/pods/8c52b752-391b-4770-9191-3494df4e3999/volumes" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.040037 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97409ffd-f1ab-4a1a-9939-a041a4085b1a" path="/var/lib/kubelet/pods/97409ffd-f1ab-4a1a-9939-a041a4085b1a/volumes" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.041122 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b24f572-8a70-4a46-b3cf-e50ae4859892" path="/var/lib/kubelet/pods/9b24f572-8a70-4a46-b3cf-e50ae4859892/volumes" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.041709 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dab4432-0762-45a8-88ab-3a99217a790f" path="/var/lib/kubelet/pods/9dab4432-0762-45a8-88ab-3a99217a790f/volumes" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.047974 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6be13cc-01ed-441f-b2c9-dc024fcb4b18" 
path="/var/lib/kubelet/pods/b6be13cc-01ed-441f-b2c9-dc024fcb4b18/volumes" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.048505 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba559652-0584-49c5-91d6-7d7fdd596dc2" path="/var/lib/kubelet/pods/ba559652-0584-49c5-91d6-7d7fdd596dc2/volumes" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.049057 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc6ef35e-2d40-46c4-9cf7-e2e8adae067a" path="/var/lib/kubelet/pods/bc6ef35e-2d40-46c4-9cf7-e2e8adae067a/volumes" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.050459 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4589304-68d2-48c9-a691-e34a9cb4c75b" path="/var/lib/kubelet/pods/d4589304-68d2-48c9-a691-e34a9cb4c75b/volumes" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.050971 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5e2e76d-5e02-4488-ae0d-5acbdb1aa060" path="/var/lib/kubelet/pods/d5e2e76d-5e02-4488-ae0d-5acbdb1aa060/volumes" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.051453 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddda8a20-60ba-4ae9-837a-44fa44518b8a" path="/var/lib/kubelet/pods/ddda8a20-60ba-4ae9-837a-44fa44518b8a/volumes" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.058230 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.111158 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2vfv\" (UniqueName: \"kubernetes.io/projected/1ab15f72-b249-42d5-8698-273c5afc7758-kube-api-access-d2vfv\") pod \"1ab15f72-b249-42d5-8698-273c5afc7758\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.111608 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1ab15f72-b249-42d5-8698-273c5afc7758-rabbitmq-erlang-cookie\") pod \"1ab15f72-b249-42d5-8698-273c5afc7758\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.111673 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1ab15f72-b249-42d5-8698-273c5afc7758-rabbitmq-plugins\") pod \"1ab15f72-b249-42d5-8698-273c5afc7758\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.111735 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1ab15f72-b249-42d5-8698-273c5afc7758-pod-info\") pod \"1ab15f72-b249-42d5-8698-273c5afc7758\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.111755 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1ab15f72-b249-42d5-8698-273c5afc7758-erlang-cookie-secret\") pod \"1ab15f72-b249-42d5-8698-273c5afc7758\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.111833 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ab15f72-b249-42d5-8698-273c5afc7758-config-data\") pod \"1ab15f72-b249-42d5-8698-273c5afc7758\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.111858 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1ab15f72-b249-42d5-8698-273c5afc7758-rabbitmq-confd\") pod \"1ab15f72-b249-42d5-8698-273c5afc7758\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.111904 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1ab15f72-b249-42d5-8698-273c5afc7758-rabbitmq-tls\") pod \"1ab15f72-b249-42d5-8698-273c5afc7758\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.111924 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"1ab15f72-b249-42d5-8698-273c5afc7758\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.111975 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1ab15f72-b249-42d5-8698-273c5afc7758-server-conf\") pod \"1ab15f72-b249-42d5-8698-273c5afc7758\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.111992 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1ab15f72-b249-42d5-8698-273c5afc7758-plugins-conf\") pod \"1ab15f72-b249-42d5-8698-273c5afc7758\" (UID: \"1ab15f72-b249-42d5-8698-273c5afc7758\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 
10:54:53.112300 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ab15f72-b249-42d5-8698-273c5afc7758-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "1ab15f72-b249-42d5-8698-273c5afc7758" (UID: "1ab15f72-b249-42d5-8698-273c5afc7758"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.112547 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1ab15f72-b249-42d5-8698-273c5afc7758-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.112767 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ab15f72-b249-42d5-8698-273c5afc7758-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1ab15f72-b249-42d5-8698-273c5afc7758" (UID: "1ab15f72-b249-42d5-8698-273c5afc7758"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.115964 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "1ab15f72-b249-42d5-8698-273c5afc7758" (UID: "1ab15f72-b249-42d5-8698-273c5afc7758"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.115997 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ab15f72-b249-42d5-8698-273c5afc7758-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1ab15f72-b249-42d5-8698-273c5afc7758" (UID: "1ab15f72-b249-42d5-8698-273c5afc7758"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.116933 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ab15f72-b249-42d5-8698-273c5afc7758-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "1ab15f72-b249-42d5-8698-273c5afc7758" (UID: "1ab15f72-b249-42d5-8698-273c5afc7758"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.118961 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ab15f72-b249-42d5-8698-273c5afc7758-kube-api-access-d2vfv" (OuterVolumeSpecName: "kube-api-access-d2vfv") pod "1ab15f72-b249-42d5-8698-273c5afc7758" (UID: "1ab15f72-b249-42d5-8698-273c5afc7758"). InnerVolumeSpecName "kube-api-access-d2vfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.120304 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ab15f72-b249-42d5-8698-273c5afc7758-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1ab15f72-b249-42d5-8698-273c5afc7758" (UID: "1ab15f72-b249-42d5-8698-273c5afc7758"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.126629 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1ab15f72-b249-42d5-8698-273c5afc7758-pod-info" (OuterVolumeSpecName: "pod-info") pod "1ab15f72-b249-42d5-8698-273c5afc7758" (UID: "1ab15f72-b249-42d5-8698-273c5afc7758"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.134586 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ab15f72-b249-42d5-8698-273c5afc7758-config-data" (OuterVolumeSpecName: "config-data") pod "1ab15f72-b249-42d5-8698-273c5afc7758" (UID: "1ab15f72-b249-42d5-8698-273c5afc7758"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.155966 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ab15f72-b249-42d5-8698-273c5afc7758-server-conf" (OuterVolumeSpecName: "server-conf") pod "1ab15f72-b249-42d5-8698-273c5afc7758" (UID: "1ab15f72-b249-42d5-8698-273c5afc7758"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.214084 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ab15f72-b249-42d5-8698-273c5afc7758-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1ab15f72-b249-42d5-8698-273c5afc7758" (UID: "1ab15f72-b249-42d5-8698-273c5afc7758"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.214463 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ab15f72-b249-42d5-8698-273c5afc7758-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.214488 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1ab15f72-b249-42d5-8698-273c5afc7758-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.214500 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1ab15f72-b249-42d5-8698-273c5afc7758-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.214524 4909 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.214534 4909 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1ab15f72-b249-42d5-8698-273c5afc7758-server-conf\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.214546 4909 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1ab15f72-b249-42d5-8698-273c5afc7758-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.214557 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2vfv\" (UniqueName: \"kubernetes.io/projected/1ab15f72-b249-42d5-8698-273c5afc7758-kube-api-access-d2vfv\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.214569 4909 
reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1ab15f72-b249-42d5-8698-273c5afc7758-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.214578 4909 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1ab15f72-b249-42d5-8698-273c5afc7758-pod-info\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.214588 4909 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1ab15f72-b249-42d5-8698-273c5afc7758-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.224583 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.263779 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b441d32f-f76f-4e7b-b3fe-40e93b126567","Type":"ContainerDied","Data":"9c883b677a4169ab8884d97967256980cb494afea7976d549974ccaea0a4f2cf"} Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.264034 4909 scope.go:117] "RemoveContainer" containerID="7fce69eec287d7c5a7114d96e223cf91ca30b188e5b132915844731ca8a68ec2" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.263819 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.270233 4909 generic.go:334] "Generic (PLEG): container finished" podID="86bc749f-73e5-4bcc-8079-7c9b053e0318" containerID="0a7a142102f29fedad6139ea592267e9ca537d4410527980039f2daeffcde737" exitCode=0 Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.270363 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7bb594654d-prg2q" event={"ID":"86bc749f-73e5-4bcc-8079-7c9b053e0318","Type":"ContainerDied","Data":"0a7a142102f29fedad6139ea592267e9ca537d4410527980039f2daeffcde737"} Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.270484 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7bb594654d-prg2q" event={"ID":"86bc749f-73e5-4bcc-8079-7c9b053e0318","Type":"ContainerDied","Data":"970657d27ac1802c3277eb374c96858a64ea9afab783dcccf3e14553e78df636"} Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.270658 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7bb594654d-prg2q" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.272267 4909 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.278009 4909 generic.go:334] "Generic (PLEG): container finished" podID="1ab15f72-b249-42d5-8698-273c5afc7758" containerID="13eac5d6e05d47fc8abeb3f859fa00ba42d29ef4a5dae2782f3325bd81acbda9" exitCode=0 Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.285428 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.286289 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1ab15f72-b249-42d5-8698-273c5afc7758","Type":"ContainerDied","Data":"13eac5d6e05d47fc8abeb3f859fa00ba42d29ef4a5dae2782f3325bd81acbda9"} Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.286347 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1ab15f72-b249-42d5-8698-273c5afc7758","Type":"ContainerDied","Data":"b10b7913375dd80b6c0d930e6fa0ad13deb369bf183283f054af8abe9b8de69c"} Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.291353 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_87203850-864b-4fff-b340-25e4f5c6e7c9/ovn-northd/0.log" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.291403 4909 generic.go:334] "Generic (PLEG): container finished" podID="87203850-864b-4fff-b340-25e4f5c6e7c9" containerID="52e5a0b83ebdf2c32308bcba8bf2bcd22b63c5893664ccffa254ee3d4272e8b7" exitCode=139 Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.291432 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"87203850-864b-4fff-b340-25e4f5c6e7c9","Type":"ContainerDied","Data":"52e5a0b83ebdf2c32308bcba8bf2bcd22b63c5893664ccffa254ee3d4272e8b7"} Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.297882 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.309862 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.315059 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-internal-tls-certs\") pod \"86bc749f-73e5-4bcc-8079-7c9b053e0318\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.315219 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glpp4\" (UniqueName: \"kubernetes.io/projected/86bc749f-73e5-4bcc-8079-7c9b053e0318-kube-api-access-glpp4\") pod \"86bc749f-73e5-4bcc-8079-7c9b053e0318\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.315260 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-config-data\") pod \"86bc749f-73e5-4bcc-8079-7c9b053e0318\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.315301 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-fernet-keys\") pod \"86bc749f-73e5-4bcc-8079-7c9b053e0318\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.315348 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-combined-ca-bundle\") pod \"86bc749f-73e5-4bcc-8079-7c9b053e0318\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.315377 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-scripts\") pod \"86bc749f-73e5-4bcc-8079-7c9b053e0318\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.315393 4909 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-credential-keys\") pod \"86bc749f-73e5-4bcc-8079-7c9b053e0318\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.315410 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-public-tls-certs\") pod \"86bc749f-73e5-4bcc-8079-7c9b053e0318\" (UID: \"86bc749f-73e5-4bcc-8079-7c9b053e0318\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.315750 4909 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.319859 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86bc749f-73e5-4bcc-8079-7c9b053e0318-kube-api-access-glpp4" (OuterVolumeSpecName: "kube-api-access-glpp4") pod "86bc749f-73e5-4bcc-8079-7c9b053e0318" (UID: "86bc749f-73e5-4bcc-8079-7c9b053e0318"). InnerVolumeSpecName "kube-api-access-glpp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.320069 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "86bc749f-73e5-4bcc-8079-7c9b053e0318" (UID: "86bc749f-73e5-4bcc-8079-7c9b053e0318"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.321269 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-scripts" (OuterVolumeSpecName: "scripts") pod "86bc749f-73e5-4bcc-8079-7c9b053e0318" (UID: "86bc749f-73e5-4bcc-8079-7c9b053e0318"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.338147 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "86bc749f-73e5-4bcc-8079-7c9b053e0318" (UID: "86bc749f-73e5-4bcc-8079-7c9b053e0318"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.362653 4909 scope.go:117] "RemoveContainer" containerID="c73875b9010e1e509b2e9adfe296f5305133f400c8e52a90164f9f8d577e55df" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.364282 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_87203850-864b-4fff-b340-25e4f5c6e7c9/ovn-northd/0.log" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.364344 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.365837 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.371231 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86bc749f-73e5-4bcc-8079-7c9b053e0318" (UID: "86bc749f-73e5-4bcc-8079-7c9b053e0318"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.374075 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.374260 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "86bc749f-73e5-4bcc-8079-7c9b053e0318" (UID: "86bc749f-73e5-4bcc-8079-7c9b053e0318"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.390491 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-config-data" (OuterVolumeSpecName: "config-data") pod "86bc749f-73e5-4bcc-8079-7c9b053e0318" (UID: "86bc749f-73e5-4bcc-8079-7c9b053e0318"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.415432 4909 scope.go:117] "RemoveContainer" containerID="0a7a142102f29fedad6139ea592267e9ca537d4410527980039f2daeffcde737" Feb 02 10:54:53 crc kubenswrapper[4909]: E0202 10:54:53.416329 4909 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Feb 02 10:54:53 crc kubenswrapper[4909]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-02-02T10:54:46Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Feb 02 10:54:53 crc kubenswrapper[4909]: /etc/init.d/functions: line 589: 386 Alarm clock "$@" Feb 02 10:54:53 crc kubenswrapper[4909]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-qpqvt" message=< Feb 02 10:54:53 crc kubenswrapper[4909]: Exiting ovn-controller (1) [FAILED] Feb 02 10:54:53 crc kubenswrapper[4909]: Killing ovn-controller (1) [ OK ] Feb 02 10:54:53 crc kubenswrapper[4909]: Killing ovn-controller (1) with SIGKILL [ OK ] Feb 02 10:54:53 crc kubenswrapper[4909]: 2026-02-02T10:54:46Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Feb 02 10:54:53 crc kubenswrapper[4909]: /etc/init.d/functions: line 589: 386 Alarm clock "$@" Feb 02 10:54:53 crc kubenswrapper[4909]: > Feb 02 10:54:53 crc kubenswrapper[4909]: E0202 10:54:53.416356 4909 kuberuntime_container.go:691] "PreStop hook failed" err=< Feb 02 10:54:53 crc kubenswrapper[4909]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-02-02T10:54:46Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Feb 02 10:54:53 crc kubenswrapper[4909]: /etc/init.d/functions: line 589: 386 Alarm clock "$@" Feb 02 10:54:53 crc kubenswrapper[4909]: > pod="openstack/ovn-controller-qpqvt" podUID="6b8dd51c-207c-4fae-8d5a-7271a159f0ff" containerName="ovn-controller" 
containerID="cri-o://f70f93df07da4945068625ee668078a7ef350e8122cc19259daecb117441bbe3" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.416387 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-qpqvt" podUID="6b8dd51c-207c-4fae-8d5a-7271a159f0ff" containerName="ovn-controller" containerID="cri-o://f70f93df07da4945068625ee668078a7ef350e8122cc19259daecb117441bbe3" gracePeriod=22 Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.417114 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87203850-864b-4fff-b340-25e4f5c6e7c9-combined-ca-bundle\") pod \"87203850-864b-4fff-b340-25e4f5c6e7c9\" (UID: \"87203850-864b-4fff-b340-25e4f5c6e7c9\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.417152 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87203850-864b-4fff-b340-25e4f5c6e7c9-config\") pod \"87203850-864b-4fff-b340-25e4f5c6e7c9\" (UID: \"87203850-864b-4fff-b340-25e4f5c6e7c9\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.417224 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/87203850-864b-4fff-b340-25e4f5c6e7c9-metrics-certs-tls-certs\") pod \"87203850-864b-4fff-b340-25e4f5c6e7c9\" (UID: \"87203850-864b-4fff-b340-25e4f5c6e7c9\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.417249 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87203850-864b-4fff-b340-25e4f5c6e7c9-scripts\") pod \"87203850-864b-4fff-b340-25e4f5c6e7c9\" (UID: \"87203850-864b-4fff-b340-25e4f5c6e7c9\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.417327 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/87203850-864b-4fff-b340-25e4f5c6e7c9-ovn-northd-tls-certs\") pod \"87203850-864b-4fff-b340-25e4f5c6e7c9\" (UID: \"87203850-864b-4fff-b340-25e4f5c6e7c9\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.417356 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m82mq\" (UniqueName: \"kubernetes.io/projected/87203850-864b-4fff-b340-25e4f5c6e7c9-kube-api-access-m82mq\") pod \"87203850-864b-4fff-b340-25e4f5c6e7c9\" (UID: \"87203850-864b-4fff-b340-25e4f5c6e7c9\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.417393 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/87203850-864b-4fff-b340-25e4f5c6e7c9-ovn-rundir\") pod \"87203850-864b-4fff-b340-25e4f5c6e7c9\" (UID: \"87203850-864b-4fff-b340-25e4f5c6e7c9\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.417747 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.417763 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.417776 4909 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.417789 4909 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 
02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.417800 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glpp4\" (UniqueName: \"kubernetes.io/projected/86bc749f-73e5-4bcc-8079-7c9b053e0318-kube-api-access-glpp4\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.417897 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.417910 4909 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.418329 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "86bc749f-73e5-4bcc-8079-7c9b053e0318" (UID: "86bc749f-73e5-4bcc-8079-7c9b053e0318"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.419116 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87203850-864b-4fff-b340-25e4f5c6e7c9-config" (OuterVolumeSpecName: "config") pod "87203850-864b-4fff-b340-25e4f5c6e7c9" (UID: "87203850-864b-4fff-b340-25e4f5c6e7c9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.419180 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87203850-864b-4fff-b340-25e4f5c6e7c9-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "87203850-864b-4fff-b340-25e4f5c6e7c9" (UID: "87203850-864b-4fff-b340-25e4f5c6e7c9"). 
InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.419881 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87203850-864b-4fff-b340-25e4f5c6e7c9-scripts" (OuterVolumeSpecName: "scripts") pod "87203850-864b-4fff-b340-25e4f5c6e7c9" (UID: "87203850-864b-4fff-b340-25e4f5c6e7c9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.423943 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87203850-864b-4fff-b340-25e4f5c6e7c9-kube-api-access-m82mq" (OuterVolumeSpecName: "kube-api-access-m82mq") pod "87203850-864b-4fff-b340-25e4f5c6e7c9" (UID: "87203850-864b-4fff-b340-25e4f5c6e7c9"). InnerVolumeSpecName "kube-api-access-m82mq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.457561 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87203850-864b-4fff-b340-25e4f5c6e7c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87203850-864b-4fff-b340-25e4f5c6e7c9" (UID: "87203850-864b-4fff-b340-25e4f5c6e7c9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: E0202 10:54:53.463262 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f70f93df07da4945068625ee668078a7ef350e8122cc19259daecb117441bbe3 is running failed: container process not found" containerID="f70f93df07da4945068625ee668078a7ef350e8122cc19259daecb117441bbe3" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Feb 02 10:54:53 crc kubenswrapper[4909]: E0202 10:54:53.463609 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f70f93df07da4945068625ee668078a7ef350e8122cc19259daecb117441bbe3 is running failed: container process not found" containerID="f70f93df07da4945068625ee668078a7ef350e8122cc19259daecb117441bbe3" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Feb 02 10:54:53 crc kubenswrapper[4909]: E0202 10:54:53.463856 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f70f93df07da4945068625ee668078a7ef350e8122cc19259daecb117441bbe3 is running failed: container process not found" containerID="f70f93df07da4945068625ee668078a7ef350e8122cc19259daecb117441bbe3" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"] Feb 02 10:54:53 crc kubenswrapper[4909]: E0202 10:54:53.463899 4909 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f70f93df07da4945068625ee668078a7ef350e8122cc19259daecb117441bbe3 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-qpqvt" podUID="6b8dd51c-207c-4fae-8d5a-7271a159f0ff" containerName="ovn-controller" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.475396 4909 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87203850-864b-4fff-b340-25e4f5c6e7c9-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "87203850-864b-4fff-b340-25e4f5c6e7c9" (UID: "87203850-864b-4fff-b340-25e4f5c6e7c9"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: E0202 10:54:53.477439 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8f5bb18b7321547b528f5af5b21f281e2f479493379bee20dc5095a16bc7f839" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 02 10:54:53 crc kubenswrapper[4909]: E0202 10:54:53.477501 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18 is running failed: container process not found" containerID="4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 02 10:54:53 crc kubenswrapper[4909]: E0202 10:54:53.478433 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18 is running failed: container process not found" containerID="4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 02 10:54:53 crc kubenswrapper[4909]: E0202 10:54:53.478609 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code 
-1" containerID="8f5bb18b7321547b528f5af5b21f281e2f479493379bee20dc5095a16bc7f839" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 02 10:54:53 crc kubenswrapper[4909]: E0202 10:54:53.481462 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8f5bb18b7321547b528f5af5b21f281e2f479493379bee20dc5095a16bc7f839" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 02 10:54:53 crc kubenswrapper[4909]: E0202 10:54:53.481514 4909 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-9hkv5" podUID="ef5a90ca-c133-400b-b869-becc0b1f60a0" containerName="ovs-vswitchd" Feb 02 10:54:53 crc kubenswrapper[4909]: E0202 10:54:53.481896 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18 is running failed: container process not found" containerID="4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 02 10:54:53 crc kubenswrapper[4909]: E0202 10:54:53.481918 4909 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-9hkv5" podUID="ef5a90ca-c133-400b-b869-becc0b1f60a0" containerName="ovsdb-server" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.507174 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/87203850-864b-4fff-b340-25e4f5c6e7c9-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "87203850-864b-4fff-b340-25e4f5c6e7c9" (UID: "87203850-864b-4fff-b340-25e4f5c6e7c9"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.511088 4909 scope.go:117] "RemoveContainer" containerID="0a7a142102f29fedad6139ea592267e9ca537d4410527980039f2daeffcde737" Feb 02 10:54:53 crc kubenswrapper[4909]: E0202 10:54:53.517325 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a7a142102f29fedad6139ea592267e9ca537d4410527980039f2daeffcde737\": container with ID starting with 0a7a142102f29fedad6139ea592267e9ca537d4410527980039f2daeffcde737 not found: ID does not exist" containerID="0a7a142102f29fedad6139ea592267e9ca537d4410527980039f2daeffcde737" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.517372 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a7a142102f29fedad6139ea592267e9ca537d4410527980039f2daeffcde737"} err="failed to get container status \"0a7a142102f29fedad6139ea592267e9ca537d4410527980039f2daeffcde737\": rpc error: code = NotFound desc = could not find container \"0a7a142102f29fedad6139ea592267e9ca537d4410527980039f2daeffcde737\": container with ID starting with 0a7a142102f29fedad6139ea592267e9ca537d4410527980039f2daeffcde737 not found: ID does not exist" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.517403 4909 scope.go:117] "RemoveContainer" containerID="13eac5d6e05d47fc8abeb3f859fa00ba42d29ef4a5dae2782f3325bd81acbda9" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.522727 4909 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/87203850-864b-4fff-b340-25e4f5c6e7c9-metrics-certs-tls-certs\") on node 
\"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.522768 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87203850-864b-4fff-b340-25e4f5c6e7c9-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.522780 4909 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86bc749f-73e5-4bcc-8079-7c9b053e0318-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.522791 4909 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/87203850-864b-4fff-b340-25e4f5c6e7c9-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.522824 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m82mq\" (UniqueName: \"kubernetes.io/projected/87203850-864b-4fff-b340-25e4f5c6e7c9-kube-api-access-m82mq\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.522835 4909 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/87203850-864b-4fff-b340-25e4f5c6e7c9-ovn-rundir\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.522847 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87203850-864b-4fff-b340-25e4f5c6e7c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.522857 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87203850-864b-4fff-b340-25e4f5c6e7c9-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.553276 4909 scope.go:117] "RemoveContainer" 
containerID="529dbc5ded5b168f357c98964ff8ca25a705fa2dcb89e66bab1ae2d9a409b360" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.578210 4909 scope.go:117] "RemoveContainer" containerID="13eac5d6e05d47fc8abeb3f859fa00ba42d29ef4a5dae2782f3325bd81acbda9" Feb 02 10:54:53 crc kubenswrapper[4909]: E0202 10:54:53.581287 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13eac5d6e05d47fc8abeb3f859fa00ba42d29ef4a5dae2782f3325bd81acbda9\": container with ID starting with 13eac5d6e05d47fc8abeb3f859fa00ba42d29ef4a5dae2782f3325bd81acbda9 not found: ID does not exist" containerID="13eac5d6e05d47fc8abeb3f859fa00ba42d29ef4a5dae2782f3325bd81acbda9" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.581319 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13eac5d6e05d47fc8abeb3f859fa00ba42d29ef4a5dae2782f3325bd81acbda9"} err="failed to get container status \"13eac5d6e05d47fc8abeb3f859fa00ba42d29ef4a5dae2782f3325bd81acbda9\": rpc error: code = NotFound desc = could not find container \"13eac5d6e05d47fc8abeb3f859fa00ba42d29ef4a5dae2782f3325bd81acbda9\": container with ID starting with 13eac5d6e05d47fc8abeb3f859fa00ba42d29ef4a5dae2782f3325bd81acbda9 not found: ID does not exist" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.581340 4909 scope.go:117] "RemoveContainer" containerID="529dbc5ded5b168f357c98964ff8ca25a705fa2dcb89e66bab1ae2d9a409b360" Feb 02 10:54:53 crc kubenswrapper[4909]: E0202 10:54:53.581670 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"529dbc5ded5b168f357c98964ff8ca25a705fa2dcb89e66bab1ae2d9a409b360\": container with ID starting with 529dbc5ded5b168f357c98964ff8ca25a705fa2dcb89e66bab1ae2d9a409b360 not found: ID does not exist" containerID="529dbc5ded5b168f357c98964ff8ca25a705fa2dcb89e66bab1ae2d9a409b360" Feb 02 10:54:53 crc 
kubenswrapper[4909]: I0202 10:54:53.581692 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"529dbc5ded5b168f357c98964ff8ca25a705fa2dcb89e66bab1ae2d9a409b360"} err="failed to get container status \"529dbc5ded5b168f357c98964ff8ca25a705fa2dcb89e66bab1ae2d9a409b360\": rpc error: code = NotFound desc = could not find container \"529dbc5ded5b168f357c98964ff8ca25a705fa2dcb89e66bab1ae2d9a409b360\": container with ID starting with 529dbc5ded5b168f357c98964ff8ca25a705fa2dcb89e66bab1ae2d9a409b360 not found: ID does not exist" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.611895 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7bb594654d-prg2q"] Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.617847 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-7bb594654d-prg2q"] Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.868102 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.874778 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-qpqvt_6b8dd51c-207c-4fae-8d5a-7271a159f0ff/ovn-controller/0.log" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.874866 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-qpqvt" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.927181 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-config-data\") pod \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.927227 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-ovn-controller-tls-certs\") pod \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\" (UID: \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.927252 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-ceilometer-tls-certs\") pod \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.927268 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-var-run-ovn\") pod \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\" (UID: \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.927281 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-var-run\") pod \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\" (UID: \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.927316 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-7ts2l\" (UniqueName: \"kubernetes.io/projected/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-kube-api-access-7ts2l\") pod \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\" (UID: \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.927339 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a43fc6d-3442-4921-93bc-ef5ab2273a78-run-httpd\") pod \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.927361 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a43fc6d-3442-4921-93bc-ef5ab2273a78-log-httpd\") pod \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.927379 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-combined-ca-bundle\") pod \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.927423 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-var-log-ovn\") pod \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\" (UID: \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.927443 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-scripts\") pod \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\" (UID: \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\") " Feb 02 10:54:53 crc kubenswrapper[4909]: 
I0202 10:54:53.927477 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-sg-core-conf-yaml\") pod \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.927491 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m42tt\" (UniqueName: \"kubernetes.io/projected/9a43fc6d-3442-4921-93bc-ef5ab2273a78-kube-api-access-m42tt\") pod \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.927514 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-scripts\") pod \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\" (UID: \"9a43fc6d-3442-4921-93bc-ef5ab2273a78\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.927540 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-combined-ca-bundle\") pod \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\" (UID: \"6b8dd51c-207c-4fae-8d5a-7271a159f0ff\") " Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.928142 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a43fc6d-3442-4921-93bc-ef5ab2273a78-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9a43fc6d-3442-4921-93bc-ef5ab2273a78" (UID: "9a43fc6d-3442-4921-93bc-ef5ab2273a78"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.929291 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-scripts" (OuterVolumeSpecName: "scripts") pod "6b8dd51c-207c-4fae-8d5a-7271a159f0ff" (UID: "6b8dd51c-207c-4fae-8d5a-7271a159f0ff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.930016 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a43fc6d-3442-4921-93bc-ef5ab2273a78-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9a43fc6d-3442-4921-93bc-ef5ab2273a78" (UID: "9a43fc6d-3442-4921-93bc-ef5ab2273a78"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.938925 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "6b8dd51c-207c-4fae-8d5a-7271a159f0ff" (UID: "6b8dd51c-207c-4fae-8d5a-7271a159f0ff"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.940938 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "6b8dd51c-207c-4fae-8d5a-7271a159f0ff" (UID: "6b8dd51c-207c-4fae-8d5a-7271a159f0ff"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.942193 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-var-run" (OuterVolumeSpecName: "var-run") pod "6b8dd51c-207c-4fae-8d5a-7271a159f0ff" (UID: "6b8dd51c-207c-4fae-8d5a-7271a159f0ff"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.942434 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a43fc6d-3442-4921-93bc-ef5ab2273a78-kube-api-access-m42tt" (OuterVolumeSpecName: "kube-api-access-m42tt") pod "9a43fc6d-3442-4921-93bc-ef5ab2273a78" (UID: "9a43fc6d-3442-4921-93bc-ef5ab2273a78"). InnerVolumeSpecName "kube-api-access-m42tt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.944242 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-scripts" (OuterVolumeSpecName: "scripts") pod "9a43fc6d-3442-4921-93bc-ef5ab2273a78" (UID: "9a43fc6d-3442-4921-93bc-ef5ab2273a78"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.946716 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-kube-api-access-7ts2l" (OuterVolumeSpecName: "kube-api-access-7ts2l") pod "6b8dd51c-207c-4fae-8d5a-7271a159f0ff" (UID: "6b8dd51c-207c-4fae-8d5a-7271a159f0ff"). InnerVolumeSpecName "kube-api-access-7ts2l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.968080 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b8dd51c-207c-4fae-8d5a-7271a159f0ff" (UID: "6b8dd51c-207c-4fae-8d5a-7271a159f0ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.969715 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9a43fc6d-3442-4921-93bc-ef5ab2273a78" (UID: "9a43fc6d-3442-4921-93bc-ef5ab2273a78"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.985696 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "9a43fc6d-3442-4921-93bc-ef5ab2273a78" (UID: "9a43fc6d-3442-4921-93bc-ef5ab2273a78"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:53 crc kubenswrapper[4909]: I0202 10:54:53.992896 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a43fc6d-3442-4921-93bc-ef5ab2273a78" (UID: "9a43fc6d-3442-4921-93bc-ef5ab2273a78"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.011273 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "6b8dd51c-207c-4fae-8d5a-7271a159f0ff" (UID: "6b8dd51c-207c-4fae-8d5a-7271a159f0ff"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.011369 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-config-data" (OuterVolumeSpecName: "config-data") pod "9a43fc6d-3442-4921-93bc-ef5ab2273a78" (UID: "9a43fc6d-3442-4921-93bc-ef5ab2273a78"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.028734 4909 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.028769 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m42tt\" (UniqueName: \"kubernetes.io/projected/9a43fc6d-3442-4921-93bc-ef5ab2273a78-kube-api-access-m42tt\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.028825 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.028835 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.028844 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.028852 4909 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.028861 4909 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.028869 4909 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.028878 4909 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-var-run\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.028887 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ts2l\" (UniqueName: \"kubernetes.io/projected/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-kube-api-access-7ts2l\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.028894 4909 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a43fc6d-3442-4921-93bc-ef5ab2273a78-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.028902 4909 reconciler_common.go:293] 
"Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9a43fc6d-3442-4921-93bc-ef5ab2273a78-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.028910 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a43fc6d-3442-4921-93bc-ef5ab2273a78-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.028918 4909 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.028925 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b8dd51c-207c-4fae-8d5a-7271a159f0ff-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.140514 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="14a8699a-66db-48c6-8834-bda4e21ef1d9" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.172:8776/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.302921 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-qpqvt_6b8dd51c-207c-4fae-8d5a-7271a159f0ff/ovn-controller/0.log" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.302981 4909 generic.go:334] "Generic (PLEG): container finished" podID="6b8dd51c-207c-4fae-8d5a-7271a159f0ff" containerID="f70f93df07da4945068625ee668078a7ef350e8122cc19259daecb117441bbe3" exitCode=137 Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.303050 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qpqvt" 
event={"ID":"6b8dd51c-207c-4fae-8d5a-7271a159f0ff","Type":"ContainerDied","Data":"f70f93df07da4945068625ee668078a7ef350e8122cc19259daecb117441bbe3"} Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.303052 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-qpqvt" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.303475 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-qpqvt" event={"ID":"6b8dd51c-207c-4fae-8d5a-7271a159f0ff","Type":"ContainerDied","Data":"715516fcb25964318a24c5b3c3d1e5cae4bec401cbb1c61092a96208c284ff4f"} Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.303520 4909 scope.go:117] "RemoveContainer" containerID="f70f93df07da4945068625ee668078a7ef350e8122cc19259daecb117441bbe3" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.306580 4909 generic.go:334] "Generic (PLEG): container finished" podID="9a43fc6d-3442-4921-93bc-ef5ab2273a78" containerID="023a431e4b03424676a729d714404990c19e0adf3d117b84ebe38b391d0036b6" exitCode=0 Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.306634 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.306630 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9a43fc6d-3442-4921-93bc-ef5ab2273a78","Type":"ContainerDied","Data":"023a431e4b03424676a729d714404990c19e0adf3d117b84ebe38b391d0036b6"} Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.306701 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9a43fc6d-3442-4921-93bc-ef5ab2273a78","Type":"ContainerDied","Data":"fa3aea4766d2e55fa0b1da6a6ceecc305e907d363b002158b6ecbcc66daa9b97"} Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.308693 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_87203850-864b-4fff-b340-25e4f5c6e7c9/ovn-northd/0.log" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.309113 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"87203850-864b-4fff-b340-25e4f5c6e7c9","Type":"ContainerDied","Data":"e2fc144e387bccba8e52d0a45c65dc1984de3c4a640c768871b3c7e5c07b8b0f"} Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.309142 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.327262 4909 scope.go:117] "RemoveContainer" containerID="f70f93df07da4945068625ee668078a7ef350e8122cc19259daecb117441bbe3" Feb 02 10:54:54 crc kubenswrapper[4909]: E0202 10:54:54.328980 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f70f93df07da4945068625ee668078a7ef350e8122cc19259daecb117441bbe3\": container with ID starting with f70f93df07da4945068625ee668078a7ef350e8122cc19259daecb117441bbe3 not found: ID does not exist" containerID="f70f93df07da4945068625ee668078a7ef350e8122cc19259daecb117441bbe3" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.329041 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f70f93df07da4945068625ee668078a7ef350e8122cc19259daecb117441bbe3"} err="failed to get container status \"f70f93df07da4945068625ee668078a7ef350e8122cc19259daecb117441bbe3\": rpc error: code = NotFound desc = could not find container \"f70f93df07da4945068625ee668078a7ef350e8122cc19259daecb117441bbe3\": container with ID starting with f70f93df07da4945068625ee668078a7ef350e8122cc19259daecb117441bbe3 not found: ID does not exist" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.329068 4909 scope.go:117] "RemoveContainer" containerID="3bc7584cdea845ca454ea508cc0c6b2466da6ae49cb6266de80fbde89deebdb7" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.350122 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.358297 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.365324 4909 scope.go:117] "RemoveContainer" containerID="166e3b47bb57c5cbc573f49e8b964cfffa3a76870bbf15fbe19e47476b7d8a1f" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 
10:54:54.367481 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-qpqvt"] Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.378730 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-qpqvt"] Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.400719 4909 scope.go:117] "RemoveContainer" containerID="023a431e4b03424676a729d714404990c19e0adf3d117b84ebe38b391d0036b6" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.414384 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.426219 4909 scope.go:117] "RemoveContainer" containerID="1c2720d46332594a79c263f4b5b11335295a93ffe32d4ee3e0fd204905c00fa2" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.427956 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.448046 4909 scope.go:117] "RemoveContainer" containerID="3bc7584cdea845ca454ea508cc0c6b2466da6ae49cb6266de80fbde89deebdb7" Feb 02 10:54:54 crc kubenswrapper[4909]: E0202 10:54:54.448498 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bc7584cdea845ca454ea508cc0c6b2466da6ae49cb6266de80fbde89deebdb7\": container with ID starting with 3bc7584cdea845ca454ea508cc0c6b2466da6ae49cb6266de80fbde89deebdb7 not found: ID does not exist" containerID="3bc7584cdea845ca454ea508cc0c6b2466da6ae49cb6266de80fbde89deebdb7" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.448529 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bc7584cdea845ca454ea508cc0c6b2466da6ae49cb6266de80fbde89deebdb7"} err="failed to get container status \"3bc7584cdea845ca454ea508cc0c6b2466da6ae49cb6266de80fbde89deebdb7\": rpc error: code = NotFound desc = could not find container 
\"3bc7584cdea845ca454ea508cc0c6b2466da6ae49cb6266de80fbde89deebdb7\": container with ID starting with 3bc7584cdea845ca454ea508cc0c6b2466da6ae49cb6266de80fbde89deebdb7 not found: ID does not exist" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.448551 4909 scope.go:117] "RemoveContainer" containerID="166e3b47bb57c5cbc573f49e8b964cfffa3a76870bbf15fbe19e47476b7d8a1f" Feb 02 10:54:54 crc kubenswrapper[4909]: E0202 10:54:54.448861 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"166e3b47bb57c5cbc573f49e8b964cfffa3a76870bbf15fbe19e47476b7d8a1f\": container with ID starting with 166e3b47bb57c5cbc573f49e8b964cfffa3a76870bbf15fbe19e47476b7d8a1f not found: ID does not exist" containerID="166e3b47bb57c5cbc573f49e8b964cfffa3a76870bbf15fbe19e47476b7d8a1f" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.448884 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"166e3b47bb57c5cbc573f49e8b964cfffa3a76870bbf15fbe19e47476b7d8a1f"} err="failed to get container status \"166e3b47bb57c5cbc573f49e8b964cfffa3a76870bbf15fbe19e47476b7d8a1f\": rpc error: code = NotFound desc = could not find container \"166e3b47bb57c5cbc573f49e8b964cfffa3a76870bbf15fbe19e47476b7d8a1f\": container with ID starting with 166e3b47bb57c5cbc573f49e8b964cfffa3a76870bbf15fbe19e47476b7d8a1f not found: ID does not exist" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.448897 4909 scope.go:117] "RemoveContainer" containerID="023a431e4b03424676a729d714404990c19e0adf3d117b84ebe38b391d0036b6" Feb 02 10:54:54 crc kubenswrapper[4909]: E0202 10:54:54.449153 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"023a431e4b03424676a729d714404990c19e0adf3d117b84ebe38b391d0036b6\": container with ID starting with 023a431e4b03424676a729d714404990c19e0adf3d117b84ebe38b391d0036b6 not found: ID does not exist" 
containerID="023a431e4b03424676a729d714404990c19e0adf3d117b84ebe38b391d0036b6" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.449179 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"023a431e4b03424676a729d714404990c19e0adf3d117b84ebe38b391d0036b6"} err="failed to get container status \"023a431e4b03424676a729d714404990c19e0adf3d117b84ebe38b391d0036b6\": rpc error: code = NotFound desc = could not find container \"023a431e4b03424676a729d714404990c19e0adf3d117b84ebe38b391d0036b6\": container with ID starting with 023a431e4b03424676a729d714404990c19e0adf3d117b84ebe38b391d0036b6 not found: ID does not exist" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.449193 4909 scope.go:117] "RemoveContainer" containerID="1c2720d46332594a79c263f4b5b11335295a93ffe32d4ee3e0fd204905c00fa2" Feb 02 10:54:54 crc kubenswrapper[4909]: E0202 10:54:54.449453 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c2720d46332594a79c263f4b5b11335295a93ffe32d4ee3e0fd204905c00fa2\": container with ID starting with 1c2720d46332594a79c263f4b5b11335295a93ffe32d4ee3e0fd204905c00fa2 not found: ID does not exist" containerID="1c2720d46332594a79c263f4b5b11335295a93ffe32d4ee3e0fd204905c00fa2" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.449474 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c2720d46332594a79c263f4b5b11335295a93ffe32d4ee3e0fd204905c00fa2"} err="failed to get container status \"1c2720d46332594a79c263f4b5b11335295a93ffe32d4ee3e0fd204905c00fa2\": rpc error: code = NotFound desc = could not find container \"1c2720d46332594a79c263f4b5b11335295a93ffe32d4ee3e0fd204905c00fa2\": container with ID starting with 1c2720d46332594a79c263f4b5b11335295a93ffe32d4ee3e0fd204905c00fa2 not found: ID does not exist" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.449486 4909 scope.go:117] 
"RemoveContainer" containerID="c74c134d5ea53fc5d68abe3fe06e4cfbcfa8770404a1e48f3df9ace3dc76800d" Feb 02 10:54:54 crc kubenswrapper[4909]: I0202 10:54:54.466781 4909 scope.go:117] "RemoveContainer" containerID="52e5a0b83ebdf2c32308bcba8bf2bcd22b63c5893664ccffa254ee3d4272e8b7" Feb 02 10:54:55 crc kubenswrapper[4909]: I0202 10:54:55.025165 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ab15f72-b249-42d5-8698-273c5afc7758" path="/var/lib/kubelet/pods/1ab15f72-b249-42d5-8698-273c5afc7758/volumes" Feb 02 10:54:55 crc kubenswrapper[4909]: I0202 10:54:55.025874 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b8dd51c-207c-4fae-8d5a-7271a159f0ff" path="/var/lib/kubelet/pods/6b8dd51c-207c-4fae-8d5a-7271a159f0ff/volumes" Feb 02 10:54:55 crc kubenswrapper[4909]: I0202 10:54:55.026344 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86bc749f-73e5-4bcc-8079-7c9b053e0318" path="/var/lib/kubelet/pods/86bc749f-73e5-4bcc-8079-7c9b053e0318/volumes" Feb 02 10:54:55 crc kubenswrapper[4909]: I0202 10:54:55.027319 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87203850-864b-4fff-b340-25e4f5c6e7c9" path="/var/lib/kubelet/pods/87203850-864b-4fff-b340-25e4f5c6e7c9/volumes" Feb 02 10:54:55 crc kubenswrapper[4909]: I0202 10:54:55.028044 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a43fc6d-3442-4921-93bc-ef5ab2273a78" path="/var/lib/kubelet/pods/9a43fc6d-3442-4921-93bc-ef5ab2273a78/volumes" Feb 02 10:54:55 crc kubenswrapper[4909]: I0202 10:54:55.029433 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b441d32f-f76f-4e7b-b3fe-40e93b126567" path="/var/lib/kubelet/pods/b441d32f-f76f-4e7b-b3fe-40e93b126567/volumes" Feb 02 10:54:55 crc kubenswrapper[4909]: I0202 10:54:55.356600 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="236e10c6-5b7d-4f9e-b82a-5c68edc93692" 
containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.0.195:8081/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 02 10:54:56 crc kubenswrapper[4909]: I0202 10:54:56.339186 4909 generic.go:334] "Generic (PLEG): container finished" podID="d1145da4-90e5-422b-917a-33473a9c5d6a" containerID="8a5e2b95b89f07eeb7333cda5e4cb6d87b241046d11a832dbb17fcb90f90c063" exitCode=0 Feb 02 10:54:56 crc kubenswrapper[4909]: I0202 10:54:56.339254 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8588c46577-4cp8s" event={"ID":"d1145da4-90e5-422b-917a-33473a9c5d6a","Type":"ContainerDied","Data":"8a5e2b95b89f07eeb7333cda5e4cb6d87b241046d11a832dbb17fcb90f90c063"} Feb 02 10:54:56 crc kubenswrapper[4909]: I0202 10:54:56.470547 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8588c46577-4cp8s" Feb 02 10:54:56 crc kubenswrapper[4909]: I0202 10:54:56.562362 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5t64\" (UniqueName: \"kubernetes.io/projected/d1145da4-90e5-422b-917a-33473a9c5d6a-kube-api-access-v5t64\") pod \"d1145da4-90e5-422b-917a-33473a9c5d6a\" (UID: \"d1145da4-90e5-422b-917a-33473a9c5d6a\") " Feb 02 10:54:56 crc kubenswrapper[4909]: I0202 10:54:56.562446 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-public-tls-certs\") pod \"d1145da4-90e5-422b-917a-33473a9c5d6a\" (UID: \"d1145da4-90e5-422b-917a-33473a9c5d6a\") " Feb 02 10:54:56 crc kubenswrapper[4909]: I0202 10:54:56.562490 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-internal-tls-certs\") pod \"d1145da4-90e5-422b-917a-33473a9c5d6a\" 
(UID: \"d1145da4-90e5-422b-917a-33473a9c5d6a\") " Feb 02 10:54:56 crc kubenswrapper[4909]: I0202 10:54:56.562537 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-httpd-config\") pod \"d1145da4-90e5-422b-917a-33473a9c5d6a\" (UID: \"d1145da4-90e5-422b-917a-33473a9c5d6a\") " Feb 02 10:54:56 crc kubenswrapper[4909]: I0202 10:54:56.562604 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-ovndb-tls-certs\") pod \"d1145da4-90e5-422b-917a-33473a9c5d6a\" (UID: \"d1145da4-90e5-422b-917a-33473a9c5d6a\") " Feb 02 10:54:56 crc kubenswrapper[4909]: I0202 10:54:56.562705 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-config\") pod \"d1145da4-90e5-422b-917a-33473a9c5d6a\" (UID: \"d1145da4-90e5-422b-917a-33473a9c5d6a\") " Feb 02 10:54:56 crc kubenswrapper[4909]: I0202 10:54:56.562748 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-combined-ca-bundle\") pod \"d1145da4-90e5-422b-917a-33473a9c5d6a\" (UID: \"d1145da4-90e5-422b-917a-33473a9c5d6a\") " Feb 02 10:54:56 crc kubenswrapper[4909]: I0202 10:54:56.570208 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1145da4-90e5-422b-917a-33473a9c5d6a-kube-api-access-v5t64" (OuterVolumeSpecName: "kube-api-access-v5t64") pod "d1145da4-90e5-422b-917a-33473a9c5d6a" (UID: "d1145da4-90e5-422b-917a-33473a9c5d6a"). InnerVolumeSpecName "kube-api-access-v5t64". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:54:56 crc kubenswrapper[4909]: I0202 10:54:56.570972 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "d1145da4-90e5-422b-917a-33473a9c5d6a" (UID: "d1145da4-90e5-422b-917a-33473a9c5d6a"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:56 crc kubenswrapper[4909]: I0202 10:54:56.605849 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d1145da4-90e5-422b-917a-33473a9c5d6a" (UID: "d1145da4-90e5-422b-917a-33473a9c5d6a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:56 crc kubenswrapper[4909]: I0202 10:54:56.606164 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1145da4-90e5-422b-917a-33473a9c5d6a" (UID: "d1145da4-90e5-422b-917a-33473a9c5d6a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:56 crc kubenswrapper[4909]: I0202 10:54:56.610501 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-config" (OuterVolumeSpecName: "config") pod "d1145da4-90e5-422b-917a-33473a9c5d6a" (UID: "d1145da4-90e5-422b-917a-33473a9c5d6a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:56 crc kubenswrapper[4909]: I0202 10:54:56.618451 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d1145da4-90e5-422b-917a-33473a9c5d6a" (UID: "d1145da4-90e5-422b-917a-33473a9c5d6a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:56 crc kubenswrapper[4909]: I0202 10:54:56.630686 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "d1145da4-90e5-422b-917a-33473a9c5d6a" (UID: "d1145da4-90e5-422b-917a-33473a9c5d6a"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:54:56 crc kubenswrapper[4909]: I0202 10:54:56.664933 4909 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:56 crc kubenswrapper[4909]: I0202 10:54:56.664968 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:56 crc kubenswrapper[4909]: I0202 10:54:56.664979 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:56 crc kubenswrapper[4909]: I0202 10:54:56.664987 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5t64\" (UniqueName: \"kubernetes.io/projected/d1145da4-90e5-422b-917a-33473a9c5d6a-kube-api-access-v5t64\") on node \"crc\" 
DevicePath \"\"" Feb 02 10:54:56 crc kubenswrapper[4909]: I0202 10:54:56.664999 4909 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:56 crc kubenswrapper[4909]: I0202 10:54:56.665009 4909 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:56 crc kubenswrapper[4909]: I0202 10:54:56.665016 4909 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d1145da4-90e5-422b-917a-33473a9c5d6a-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 02 10:54:57 crc kubenswrapper[4909]: I0202 10:54:57.348052 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8588c46577-4cp8s" event={"ID":"d1145da4-90e5-422b-917a-33473a9c5d6a","Type":"ContainerDied","Data":"59b1d6f9825a899073b39bab4b3554e3b90a77cb33ecbb1602394e0b14fc8b49"} Feb 02 10:54:57 crc kubenswrapper[4909]: I0202 10:54:57.348109 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8588c46577-4cp8s" Feb 02 10:54:57 crc kubenswrapper[4909]: I0202 10:54:57.348127 4909 scope.go:117] "RemoveContainer" containerID="07fee0cf291ae485ea00f0668c32744aaec34c29563b1aae25a47955a349b94d" Feb 02 10:54:57 crc kubenswrapper[4909]: I0202 10:54:57.365166 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8588c46577-4cp8s"] Feb 02 10:54:57 crc kubenswrapper[4909]: I0202 10:54:57.370003 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-8588c46577-4cp8s"] Feb 02 10:54:57 crc kubenswrapper[4909]: I0202 10:54:57.374787 4909 scope.go:117] "RemoveContainer" containerID="8a5e2b95b89f07eeb7333cda5e4cb6d87b241046d11a832dbb17fcb90f90c063" Feb 02 10:54:58 crc kubenswrapper[4909]: E0202 10:54:58.476393 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18 is running failed: container process not found" containerID="4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 02 10:54:58 crc kubenswrapper[4909]: E0202 10:54:58.477231 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18 is running failed: container process not found" containerID="4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 02 10:54:58 crc kubenswrapper[4909]: E0202 10:54:58.477648 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18 is running failed: 
container process not found" containerID="4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 02 10:54:58 crc kubenswrapper[4909]: E0202 10:54:58.477690 4909 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-9hkv5" podUID="ef5a90ca-c133-400b-b869-becc0b1f60a0" containerName="ovsdb-server" Feb 02 10:54:58 crc kubenswrapper[4909]: E0202 10:54:58.477653 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8f5bb18b7321547b528f5af5b21f281e2f479493379bee20dc5095a16bc7f839" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 02 10:54:58 crc kubenswrapper[4909]: E0202 10:54:58.479109 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8f5bb18b7321547b528f5af5b21f281e2f479493379bee20dc5095a16bc7f839" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 02 10:54:58 crc kubenswrapper[4909]: E0202 10:54:58.480214 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8f5bb18b7321547b528f5af5b21f281e2f479493379bee20dc5095a16bc7f839" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 02 10:54:58 crc kubenswrapper[4909]: E0202 10:54:58.480255 4909 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an 
exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-9hkv5" podUID="ef5a90ca-c133-400b-b869-becc0b1f60a0" containerName="ovs-vswitchd" Feb 02 10:54:59 crc kubenswrapper[4909]: I0202 10:54:59.027830 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1145da4-90e5-422b-917a-33473a9c5d6a" path="/var/lib/kubelet/pods/d1145da4-90e5-422b-917a-33473a9c5d6a/volumes" Feb 02 10:55:03 crc kubenswrapper[4909]: E0202 10:55:03.476053 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18 is running failed: container process not found" containerID="4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 02 10:55:03 crc kubenswrapper[4909]: E0202 10:55:03.476786 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18 is running failed: container process not found" containerID="4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 02 10:55:03 crc kubenswrapper[4909]: E0202 10:55:03.477153 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18 is running failed: container process not found" containerID="4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 02 10:55:03 crc kubenswrapper[4909]: E0202 10:55:03.477203 4909 prober.go:104] "Probe errored" err="rpc error: 
code = NotFound desc = container is not created or running: checking if PID of 4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-9hkv5" podUID="ef5a90ca-c133-400b-b869-becc0b1f60a0" containerName="ovsdb-server" Feb 02 10:55:03 crc kubenswrapper[4909]: E0202 10:55:03.477305 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8f5bb18b7321547b528f5af5b21f281e2f479493379bee20dc5095a16bc7f839" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 02 10:55:03 crc kubenswrapper[4909]: E0202 10:55:03.478468 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8f5bb18b7321547b528f5af5b21f281e2f479493379bee20dc5095a16bc7f839" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 02 10:55:03 crc kubenswrapper[4909]: E0202 10:55:03.480315 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8f5bb18b7321547b528f5af5b21f281e2f479493379bee20dc5095a16bc7f839" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 02 10:55:03 crc kubenswrapper[4909]: E0202 10:55:03.480359 4909 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-9hkv5" podUID="ef5a90ca-c133-400b-b869-becc0b1f60a0" containerName="ovs-vswitchd" Feb 02 10:55:08 crc kubenswrapper[4909]: E0202 10:55:08.476227 4909 log.go:32] 
"ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18 is running failed: container process not found" containerID="4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 02 10:55:08 crc kubenswrapper[4909]: E0202 10:55:08.477766 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18 is running failed: container process not found" containerID="4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 02 10:55:08 crc kubenswrapper[4909]: E0202 10:55:08.477901 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8f5bb18b7321547b528f5af5b21f281e2f479493379bee20dc5095a16bc7f839" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 02 10:55:08 crc kubenswrapper[4909]: E0202 10:55:08.478650 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18 is running failed: container process not found" containerID="4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 02 10:55:08 crc kubenswrapper[4909]: E0202 10:55:08.478701 4909 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-9hkv5" podUID="ef5a90ca-c133-400b-b869-becc0b1f60a0" containerName="ovsdb-server" Feb 02 10:55:08 crc kubenswrapper[4909]: E0202 10:55:08.479618 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8f5bb18b7321547b528f5af5b21f281e2f479493379bee20dc5095a16bc7f839" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 02 10:55:08 crc kubenswrapper[4909]: E0202 10:55:08.481397 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8f5bb18b7321547b528f5af5b21f281e2f479493379bee20dc5095a16bc7f839" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 02 10:55:08 crc kubenswrapper[4909]: E0202 10:55:08.481487 4909 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-9hkv5" podUID="ef5a90ca-c133-400b-b869-becc0b1f60a0" containerName="ovs-vswitchd" Feb 02 10:55:13 crc kubenswrapper[4909]: E0202 10:55:13.475933 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18 is running failed: container process not found" containerID="4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 02 10:55:13 crc kubenswrapper[4909]: E0202 10:55:13.477094 4909 log.go:32] "ExecSync 
cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8f5bb18b7321547b528f5af5b21f281e2f479493379bee20dc5095a16bc7f839" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 02 10:55:13 crc kubenswrapper[4909]: E0202 10:55:13.477269 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18 is running failed: container process not found" containerID="4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 02 10:55:13 crc kubenswrapper[4909]: E0202 10:55:13.477496 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18 is running failed: container process not found" containerID="4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 02 10:55:13 crc kubenswrapper[4909]: E0202 10:55:13.477520 4909 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-9hkv5" podUID="ef5a90ca-c133-400b-b869-becc0b1f60a0" containerName="ovsdb-server" Feb 02 10:55:13 crc kubenswrapper[4909]: E0202 10:55:13.478869 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="8f5bb18b7321547b528f5af5b21f281e2f479493379bee20dc5095a16bc7f839" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 02 10:55:13 crc kubenswrapper[4909]: E0202 10:55:13.480686 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8f5bb18b7321547b528f5af5b21f281e2f479493379bee20dc5095a16bc7f839" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 02 10:55:13 crc kubenswrapper[4909]: E0202 10:55:13.480719 4909 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-9hkv5" podUID="ef5a90ca-c133-400b-b869-becc0b1f60a0" containerName="ovs-vswitchd" Feb 02 10:55:15 crc kubenswrapper[4909]: I0202 10:55:15.505376 4909 generic.go:334] "Generic (PLEG): container finished" podID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerID="5e88d5dc47ad71a01580306942e0f3a9a30eb6a37d0332e786a59186dadbb937" exitCode=137 Feb 02 10:55:15 crc kubenswrapper[4909]: I0202 10:55:15.505566 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerDied","Data":"5e88d5dc47ad71a01580306942e0f3a9a30eb6a37d0332e786a59186dadbb937"} Feb 02 10:55:15 crc kubenswrapper[4909]: I0202 10:55:15.510627 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9hkv5_ef5a90ca-c133-400b-b869-becc0b1f60a0/ovs-vswitchd/0.log" Feb 02 10:55:15 crc kubenswrapper[4909]: I0202 10:55:15.511526 4909 generic.go:334] "Generic (PLEG): container finished" podID="ef5a90ca-c133-400b-b869-becc0b1f60a0" containerID="8f5bb18b7321547b528f5af5b21f281e2f479493379bee20dc5095a16bc7f839" exitCode=137 Feb 02 10:55:15 crc kubenswrapper[4909]: 
I0202 10:55:15.511570 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9hkv5" event={"ID":"ef5a90ca-c133-400b-b869-becc0b1f60a0","Type":"ContainerDied","Data":"8f5bb18b7321547b528f5af5b21f281e2f479493379bee20dc5095a16bc7f839"} Feb 02 10:55:15 crc kubenswrapper[4909]: I0202 10:55:15.511597 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9hkv5" event={"ID":"ef5a90ca-c133-400b-b869-becc0b1f60a0","Type":"ContainerDied","Data":"4b7b9aa2d302731ab882bb6396e5cedb806ae9cfe1d2e855629879fcef6b91e8"} Feb 02 10:55:15 crc kubenswrapper[4909]: I0202 10:55:15.511607 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b7b9aa2d302731ab882bb6396e5cedb806ae9cfe1d2e855629879fcef6b91e8" Feb 02 10:55:15 crc kubenswrapper[4909]: I0202 10:55:15.512237 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9hkv5_ef5a90ca-c133-400b-b869-becc0b1f60a0/ovs-vswitchd/0.log" Feb 02 10:55:15 crc kubenswrapper[4909]: I0202 10:55:15.512886 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-9hkv5" Feb 02 10:55:15 crc kubenswrapper[4909]: I0202 10:55:15.539063 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef5a90ca-c133-400b-b869-becc0b1f60a0-scripts\") pod \"ef5a90ca-c133-400b-b869-becc0b1f60a0\" (UID: \"ef5a90ca-c133-400b-b869-becc0b1f60a0\") " Feb 02 10:55:15 crc kubenswrapper[4909]: I0202 10:55:15.539110 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ef5a90ca-c133-400b-b869-becc0b1f60a0-var-log\") pod \"ef5a90ca-c133-400b-b869-becc0b1f60a0\" (UID: \"ef5a90ca-c133-400b-b869-becc0b1f60a0\") " Feb 02 10:55:15 crc kubenswrapper[4909]: I0202 10:55:15.539282 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9npj5\" (UniqueName: \"kubernetes.io/projected/ef5a90ca-c133-400b-b869-becc0b1f60a0-kube-api-access-9npj5\") pod \"ef5a90ca-c133-400b-b869-becc0b1f60a0\" (UID: \"ef5a90ca-c133-400b-b869-becc0b1f60a0\") " Feb 02 10:55:15 crc kubenswrapper[4909]: I0202 10:55:15.539357 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ef5a90ca-c133-400b-b869-becc0b1f60a0-var-lib\") pod \"ef5a90ca-c133-400b-b869-becc0b1f60a0\" (UID: \"ef5a90ca-c133-400b-b869-becc0b1f60a0\") " Feb 02 10:55:15 crc kubenswrapper[4909]: I0202 10:55:15.539420 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ef5a90ca-c133-400b-b869-becc0b1f60a0-etc-ovs\") pod \"ef5a90ca-c133-400b-b869-becc0b1f60a0\" (UID: \"ef5a90ca-c133-400b-b869-becc0b1f60a0\") " Feb 02 10:55:15 crc kubenswrapper[4909]: I0202 10:55:15.539437 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/ef5a90ca-c133-400b-b869-becc0b1f60a0-var-run\") pod \"ef5a90ca-c133-400b-b869-becc0b1f60a0\" (UID: \"ef5a90ca-c133-400b-b869-becc0b1f60a0\") " Feb 02 10:55:15 crc kubenswrapper[4909]: I0202 10:55:15.539641 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef5a90ca-c133-400b-b869-becc0b1f60a0-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "ef5a90ca-c133-400b-b869-becc0b1f60a0" (UID: "ef5a90ca-c133-400b-b869-becc0b1f60a0"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:55:15 crc kubenswrapper[4909]: I0202 10:55:15.539668 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef5a90ca-c133-400b-b869-becc0b1f60a0-var-run" (OuterVolumeSpecName: "var-run") pod "ef5a90ca-c133-400b-b869-becc0b1f60a0" (UID: "ef5a90ca-c133-400b-b869-becc0b1f60a0"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:55:15 crc kubenswrapper[4909]: I0202 10:55:15.539668 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef5a90ca-c133-400b-b869-becc0b1f60a0-var-lib" (OuterVolumeSpecName: "var-lib") pod "ef5a90ca-c133-400b-b869-becc0b1f60a0" (UID: "ef5a90ca-c133-400b-b869-becc0b1f60a0"). InnerVolumeSpecName "var-lib". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:55:15 crc kubenswrapper[4909]: I0202 10:55:15.540212 4909 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ef5a90ca-c133-400b-b869-becc0b1f60a0-etc-ovs\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:15 crc kubenswrapper[4909]: I0202 10:55:15.540231 4909 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ef5a90ca-c133-400b-b869-becc0b1f60a0-var-run\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:15 crc kubenswrapper[4909]: I0202 10:55:15.540313 4909 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ef5a90ca-c133-400b-b869-becc0b1f60a0-var-lib\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:15 crc kubenswrapper[4909]: I0202 10:55:15.540594 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ef5a90ca-c133-400b-b869-becc0b1f60a0-var-log" (OuterVolumeSpecName: "var-log") pod "ef5a90ca-c133-400b-b869-becc0b1f60a0" (UID: "ef5a90ca-c133-400b-b869-becc0b1f60a0"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 10:55:15 crc kubenswrapper[4909]: I0202 10:55:15.540907 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef5a90ca-c133-400b-b869-becc0b1f60a0-scripts" (OuterVolumeSpecName: "scripts") pod "ef5a90ca-c133-400b-b869-becc0b1f60a0" (UID: "ef5a90ca-c133-400b-b869-becc0b1f60a0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 10:55:15 crc kubenswrapper[4909]: I0202 10:55:15.545933 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef5a90ca-c133-400b-b869-becc0b1f60a0-kube-api-access-9npj5" (OuterVolumeSpecName: "kube-api-access-9npj5") pod "ef5a90ca-c133-400b-b869-becc0b1f60a0" (UID: "ef5a90ca-c133-400b-b869-becc0b1f60a0"). InnerVolumeSpecName "kube-api-access-9npj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:15 crc kubenswrapper[4909]: I0202 10:55:15.641590 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ef5a90ca-c133-400b-b869-becc0b1f60a0-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:15 crc kubenswrapper[4909]: I0202 10:55:15.641626 4909 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ef5a90ca-c133-400b-b869-becc0b1f60a0-var-log\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:15 crc kubenswrapper[4909]: I0202 10:55:15.641637 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9npj5\" (UniqueName: \"kubernetes.io/projected/ef5a90ca-c133-400b-b869-becc0b1f60a0-kube-api-access-9npj5\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.269419 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.351511 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7c400ec0-7faf-4151-b34e-ee28044b89e7-cache\") pod \"7c400ec0-7faf-4151-b34e-ee28044b89e7\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.351566 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-etc-swift\") pod \"7c400ec0-7faf-4151-b34e-ee28044b89e7\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.351647 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c400ec0-7faf-4151-b34e-ee28044b89e7-combined-ca-bundle\") pod \"7c400ec0-7faf-4151-b34e-ee28044b89e7\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.351700 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"7c400ec0-7faf-4151-b34e-ee28044b89e7\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.351751 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7c400ec0-7faf-4151-b34e-ee28044b89e7-lock\") pod \"7c400ec0-7faf-4151-b34e-ee28044b89e7\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.351786 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7mxn\" (UniqueName: 
\"kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-kube-api-access-v7mxn\") pod \"7c400ec0-7faf-4151-b34e-ee28044b89e7\" (UID: \"7c400ec0-7faf-4151-b34e-ee28044b89e7\") " Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.352199 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c400ec0-7faf-4151-b34e-ee28044b89e7-cache" (OuterVolumeSpecName: "cache") pod "7c400ec0-7faf-4151-b34e-ee28044b89e7" (UID: "7c400ec0-7faf-4151-b34e-ee28044b89e7"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.352292 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c400ec0-7faf-4151-b34e-ee28044b89e7-lock" (OuterVolumeSpecName: "lock") pod "7c400ec0-7faf-4151-b34e-ee28044b89e7" (UID: "7c400ec0-7faf-4151-b34e-ee28044b89e7"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.354767 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-kube-api-access-v7mxn" (OuterVolumeSpecName: "kube-api-access-v7mxn") pod "7c400ec0-7faf-4151-b34e-ee28044b89e7" (UID: "7c400ec0-7faf-4151-b34e-ee28044b89e7"). InnerVolumeSpecName "kube-api-access-v7mxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.354887 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7c400ec0-7faf-4151-b34e-ee28044b89e7" (UID: "7c400ec0-7faf-4151-b34e-ee28044b89e7"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.355182 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "swift") pod "7c400ec0-7faf-4151-b34e-ee28044b89e7" (UID: "7c400ec0-7faf-4151-b34e-ee28044b89e7"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.452984 4909 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7c400ec0-7faf-4151-b34e-ee28044b89e7-cache\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.453011 4909 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.453032 4909 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.453041 4909 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7c400ec0-7faf-4151-b34e-ee28044b89e7-lock\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.453050 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7mxn\" (UniqueName: \"kubernetes.io/projected/7c400ec0-7faf-4151-b34e-ee28044b89e7-kube-api-access-v7mxn\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.467387 4909 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 02 
10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.529819 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-9hkv5" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.529862 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.529925 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7c400ec0-7faf-4151-b34e-ee28044b89e7","Type":"ContainerDied","Data":"360dc222948004e550a1dca18a33e231919a25bfefa73d0be631cf2a93456e82"} Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.529959 4909 scope.go:117] "RemoveContainer" containerID="5e88d5dc47ad71a01580306942e0f3a9a30eb6a37d0332e786a59186dadbb937" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.554082 4909 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.564695 4909 scope.go:117] "RemoveContainer" containerID="403544c94cece49fecf3e837a0ceb79d6332a055fdfa13f162fa0295add4bbb7" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.565926 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-9hkv5"] Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.572045 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-9hkv5"] Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.576535 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c400ec0-7faf-4151-b34e-ee28044b89e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c400ec0-7faf-4151-b34e-ee28044b89e7" (UID: "7c400ec0-7faf-4151-b34e-ee28044b89e7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.582551 4909 scope.go:117] "RemoveContainer" containerID="e7bf2be62b7e97abc8524922264d489ae88403ad72a536c23b1a2c1f6278395b" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.599001 4909 scope.go:117] "RemoveContainer" containerID="2e0f7d484d5647dce2a899dda4783d32b2783471d37c33f12abbe7c324c5a495" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.615345 4909 scope.go:117] "RemoveContainer" containerID="4769dc8794676e2cda42fe1e1591dca654572c12d01ef7df149e868e62a3a604" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.633880 4909 scope.go:117] "RemoveContainer" containerID="858adda0671141db145aafebaea7fb2c8fe86cafddc9fe78b0569ebbb13f0012" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.654789 4909 scope.go:117] "RemoveContainer" containerID="3c20875f1de9fb350baed752230d656b218f7820f84b09af0fae63228ac55300" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.656566 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c400ec0-7faf-4151-b34e-ee28044b89e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.675231 4909 scope.go:117] "RemoveContainer" containerID="cbfed4f1069ee483c0b5622d010bb4050eb703f82417b83b9be6a69f91f6416e" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.694957 4909 scope.go:117] "RemoveContainer" containerID="15ada96398d5e422dc198589740e54091d46278fea5a1976da718efa78d1aea0" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.713246 4909 scope.go:117] "RemoveContainer" containerID="1a36d372a28220791f7d96800ea8889bd6721cfb062a1df523373e1709e54828" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.728718 4909 scope.go:117] "RemoveContainer" containerID="75e96eb7331791bb4bb7609ae1be6e72027c31244a81358a0312c1fe5e88d79b" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.743915 
4909 scope.go:117] "RemoveContainer" containerID="f4ba97736c676ea7105906bcc956b64814d9063116ac1ae4c7dd355c1861de1a" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.762229 4909 scope.go:117] "RemoveContainer" containerID="bb0826ef7db615ea25639f354b4e3d66a6b4b7a4c615c65933bc47849ffdcf70" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.785185 4909 scope.go:117] "RemoveContainer" containerID="688f9c6773a03040cd5ec3fcaab21a892ba396e408fdcffda51c117011469111" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.806151 4909 scope.go:117] "RemoveContainer" containerID="bdf18a1a02fe85f1bb6cf3ed370c1c5c18587a314b6cf14fea7469bf28031585" Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.877869 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Feb 02 10:55:16 crc kubenswrapper[4909]: I0202 10:55:16.883593 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Feb 02 10:55:17 crc kubenswrapper[4909]: I0202 10:55:17.024434 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" path="/var/lib/kubelet/pods/7c400ec0-7faf-4151-b34e-ee28044b89e7/volumes" Feb 02 10:55:17 crc kubenswrapper[4909]: I0202 10:55:17.026546 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef5a90ca-c133-400b-b869-becc0b1f60a0" path="/var/lib/kubelet/pods/ef5a90ca-c133-400b-b869-becc0b1f60a0/volumes" Feb 02 10:55:17 crc kubenswrapper[4909]: I0202 10:55:17.947881 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-795c6654c6-z72r6" Feb 02 10:55:17 crc kubenswrapper[4909]: I0202 10:55:17.977613 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5tst\" (UniqueName: \"kubernetes.io/projected/c97f6f0e-16ab-439c-a14c-3d908758b1db-kube-api-access-r5tst\") pod \"c97f6f0e-16ab-439c-a14c-3d908758b1db\" (UID: \"c97f6f0e-16ab-439c-a14c-3d908758b1db\") " Feb 02 10:55:17 crc kubenswrapper[4909]: I0202 10:55:17.977714 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c97f6f0e-16ab-439c-a14c-3d908758b1db-config-data-custom\") pod \"c97f6f0e-16ab-439c-a14c-3d908758b1db\" (UID: \"c97f6f0e-16ab-439c-a14c-3d908758b1db\") " Feb 02 10:55:17 crc kubenswrapper[4909]: I0202 10:55:17.977784 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c97f6f0e-16ab-439c-a14c-3d908758b1db-logs\") pod \"c97f6f0e-16ab-439c-a14c-3d908758b1db\" (UID: \"c97f6f0e-16ab-439c-a14c-3d908758b1db\") " Feb 02 10:55:17 crc kubenswrapper[4909]: I0202 10:55:17.977850 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c97f6f0e-16ab-439c-a14c-3d908758b1db-combined-ca-bundle\") pod \"c97f6f0e-16ab-439c-a14c-3d908758b1db\" (UID: \"c97f6f0e-16ab-439c-a14c-3d908758b1db\") " Feb 02 10:55:17 crc kubenswrapper[4909]: I0202 10:55:17.977933 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c97f6f0e-16ab-439c-a14c-3d908758b1db-config-data\") pod \"c97f6f0e-16ab-439c-a14c-3d908758b1db\" (UID: \"c97f6f0e-16ab-439c-a14c-3d908758b1db\") " Feb 02 10:55:17 crc kubenswrapper[4909]: I0202 10:55:17.984473 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/c97f6f0e-16ab-439c-a14c-3d908758b1db-kube-api-access-r5tst" (OuterVolumeSpecName: "kube-api-access-r5tst") pod "c97f6f0e-16ab-439c-a14c-3d908758b1db" (UID: "c97f6f0e-16ab-439c-a14c-3d908758b1db"). InnerVolumeSpecName "kube-api-access-r5tst". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:17 crc kubenswrapper[4909]: I0202 10:55:17.986115 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c97f6f0e-16ab-439c-a14c-3d908758b1db-logs" (OuterVolumeSpecName: "logs") pod "c97f6f0e-16ab-439c-a14c-3d908758b1db" (UID: "c97f6f0e-16ab-439c-a14c-3d908758b1db"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:55:17 crc kubenswrapper[4909]: I0202 10:55:17.989276 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c97f6f0e-16ab-439c-a14c-3d908758b1db-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c97f6f0e-16ab-439c-a14c-3d908758b1db" (UID: "c97f6f0e-16ab-439c-a14c-3d908758b1db"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.017519 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6dd95bf6f-66vhd" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.027031 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c97f6f0e-16ab-439c-a14c-3d908758b1db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c97f6f0e-16ab-439c-a14c-3d908758b1db" (UID: "c97f6f0e-16ab-439c-a14c-3d908758b1db"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.034895 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c97f6f0e-16ab-439c-a14c-3d908758b1db-config-data" (OuterVolumeSpecName: "config-data") pod "c97f6f0e-16ab-439c-a14c-3d908758b1db" (UID: "c97f6f0e-16ab-439c-a14c-3d908758b1db"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.079087 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65028b8f-2d3c-40f3-8c17-239856623f4e-logs\") pod \"65028b8f-2d3c-40f3-8c17-239856623f4e\" (UID: \"65028b8f-2d3c-40f3-8c17-239856623f4e\") " Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.079175 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65028b8f-2d3c-40f3-8c17-239856623f4e-config-data\") pod \"65028b8f-2d3c-40f3-8c17-239856623f4e\" (UID: \"65028b8f-2d3c-40f3-8c17-239856623f4e\") " Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.079221 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65028b8f-2d3c-40f3-8c17-239856623f4e-combined-ca-bundle\") pod \"65028b8f-2d3c-40f3-8c17-239856623f4e\" (UID: \"65028b8f-2d3c-40f3-8c17-239856623f4e\") " Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.079258 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gghks\" (UniqueName: \"kubernetes.io/projected/65028b8f-2d3c-40f3-8c17-239856623f4e-kube-api-access-gghks\") pod \"65028b8f-2d3c-40f3-8c17-239856623f4e\" (UID: \"65028b8f-2d3c-40f3-8c17-239856623f4e\") " Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.079306 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65028b8f-2d3c-40f3-8c17-239856623f4e-config-data-custom\") pod \"65028b8f-2d3c-40f3-8c17-239856623f4e\" (UID: \"65028b8f-2d3c-40f3-8c17-239856623f4e\") " Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.079523 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65028b8f-2d3c-40f3-8c17-239856623f4e-logs" (OuterVolumeSpecName: "logs") pod "65028b8f-2d3c-40f3-8c17-239856623f4e" (UID: "65028b8f-2d3c-40f3-8c17-239856623f4e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.079796 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5tst\" (UniqueName: \"kubernetes.io/projected/c97f6f0e-16ab-439c-a14c-3d908758b1db-kube-api-access-r5tst\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.079851 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c97f6f0e-16ab-439c-a14c-3d908758b1db-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.079865 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c97f6f0e-16ab-439c-a14c-3d908758b1db-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.079877 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65028b8f-2d3c-40f3-8c17-239856623f4e-logs\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.079888 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c97f6f0e-16ab-439c-a14c-3d908758b1db-combined-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.079899 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c97f6f0e-16ab-439c-a14c-3d908758b1db-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.082601 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65028b8f-2d3c-40f3-8c17-239856623f4e-kube-api-access-gghks" (OuterVolumeSpecName: "kube-api-access-gghks") pod "65028b8f-2d3c-40f3-8c17-239856623f4e" (UID: "65028b8f-2d3c-40f3-8c17-239856623f4e"). InnerVolumeSpecName "kube-api-access-gghks". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.082617 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65028b8f-2d3c-40f3-8c17-239856623f4e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "65028b8f-2d3c-40f3-8c17-239856623f4e" (UID: "65028b8f-2d3c-40f3-8c17-239856623f4e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.096730 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65028b8f-2d3c-40f3-8c17-239856623f4e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65028b8f-2d3c-40f3-8c17-239856623f4e" (UID: "65028b8f-2d3c-40f3-8c17-239856623f4e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.121428 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65028b8f-2d3c-40f3-8c17-239856623f4e-config-data" (OuterVolumeSpecName: "config-data") pod "65028b8f-2d3c-40f3-8c17-239856623f4e" (UID: "65028b8f-2d3c-40f3-8c17-239856623f4e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.180753 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65028b8f-2d3c-40f3-8c17-239856623f4e-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.180792 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65028b8f-2d3c-40f3-8c17-239856623f4e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.180888 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gghks\" (UniqueName: \"kubernetes.io/projected/65028b8f-2d3c-40f3-8c17-239856623f4e-kube-api-access-gghks\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.180906 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65028b8f-2d3c-40f3-8c17-239856623f4e-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.550086 4909 generic.go:334] "Generic (PLEG): container finished" podID="65028b8f-2d3c-40f3-8c17-239856623f4e" containerID="4c35b87cac47569ce3e5e02c42cec56c3c67aa0f6401a56069d3f09d521254e7" exitCode=137 Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.550138 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6dd95bf6f-66vhd" event={"ID":"65028b8f-2d3c-40f3-8c17-239856623f4e","Type":"ContainerDied","Data":"4c35b87cac47569ce3e5e02c42cec56c3c67aa0f6401a56069d3f09d521254e7"} Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.550182 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6dd95bf6f-66vhd" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.550198 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6dd95bf6f-66vhd" event={"ID":"65028b8f-2d3c-40f3-8c17-239856623f4e","Type":"ContainerDied","Data":"3efade9a5319967de578520d20eaa5fe362b0bb9eea6200c5e03127aa5ee76b8"} Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.550228 4909 scope.go:117] "RemoveContainer" containerID="4c35b87cac47569ce3e5e02c42cec56c3c67aa0f6401a56069d3f09d521254e7" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.554053 4909 generic.go:334] "Generic (PLEG): container finished" podID="c97f6f0e-16ab-439c-a14c-3d908758b1db" containerID="749e1330fe36f4d0f7bc0c2d5e47eda30327a3bcf7ea6f2a6f0282c8a6fdb83d" exitCode=137 Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.554130 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-795c6654c6-z72r6" event={"ID":"c97f6f0e-16ab-439c-a14c-3d908758b1db","Type":"ContainerDied","Data":"749e1330fe36f4d0f7bc0c2d5e47eda30327a3bcf7ea6f2a6f0282c8a6fdb83d"} Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.554182 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-795c6654c6-z72r6" event={"ID":"c97f6f0e-16ab-439c-a14c-3d908758b1db","Type":"ContainerDied","Data":"ef77b807b50570f53289b41354f9f01947db881fc7e634a6426df9eede9623af"} Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.554259 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-795c6654c6-z72r6" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.577576 4909 scope.go:117] "RemoveContainer" containerID="0cf32bc67d8b4fdffda88ed8894f8d0cadb2655deceb75283d5998db010679a3" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.594431 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6dd95bf6f-66vhd"] Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.603972 4909 scope.go:117] "RemoveContainer" containerID="4c35b87cac47569ce3e5e02c42cec56c3c67aa0f6401a56069d3f09d521254e7" Feb 02 10:55:18 crc kubenswrapper[4909]: E0202 10:55:18.604498 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c35b87cac47569ce3e5e02c42cec56c3c67aa0f6401a56069d3f09d521254e7\": container with ID starting with 4c35b87cac47569ce3e5e02c42cec56c3c67aa0f6401a56069d3f09d521254e7 not found: ID does not exist" containerID="4c35b87cac47569ce3e5e02c42cec56c3c67aa0f6401a56069d3f09d521254e7" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.604554 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c35b87cac47569ce3e5e02c42cec56c3c67aa0f6401a56069d3f09d521254e7"} err="failed to get container status \"4c35b87cac47569ce3e5e02c42cec56c3c67aa0f6401a56069d3f09d521254e7\": rpc error: code = NotFound desc = could not find container \"4c35b87cac47569ce3e5e02c42cec56c3c67aa0f6401a56069d3f09d521254e7\": container with ID starting with 4c35b87cac47569ce3e5e02c42cec56c3c67aa0f6401a56069d3f09d521254e7 not found: ID does not exist" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.604587 4909 scope.go:117] "RemoveContainer" containerID="0cf32bc67d8b4fdffda88ed8894f8d0cadb2655deceb75283d5998db010679a3" Feb 02 10:55:18 crc kubenswrapper[4909]: E0202 10:55:18.605094 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"0cf32bc67d8b4fdffda88ed8894f8d0cadb2655deceb75283d5998db010679a3\": container with ID starting with 0cf32bc67d8b4fdffda88ed8894f8d0cadb2655deceb75283d5998db010679a3 not found: ID does not exist" containerID="0cf32bc67d8b4fdffda88ed8894f8d0cadb2655deceb75283d5998db010679a3" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.605128 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cf32bc67d8b4fdffda88ed8894f8d0cadb2655deceb75283d5998db010679a3"} err="failed to get container status \"0cf32bc67d8b4fdffda88ed8894f8d0cadb2655deceb75283d5998db010679a3\": rpc error: code = NotFound desc = could not find container \"0cf32bc67d8b4fdffda88ed8894f8d0cadb2655deceb75283d5998db010679a3\": container with ID starting with 0cf32bc67d8b4fdffda88ed8894f8d0cadb2655deceb75283d5998db010679a3 not found: ID does not exist" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.605156 4909 scope.go:117] "RemoveContainer" containerID="749e1330fe36f4d0f7bc0c2d5e47eda30327a3bcf7ea6f2a6f0282c8a6fdb83d" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.605661 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-6dd95bf6f-66vhd"] Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.611743 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-795c6654c6-z72r6"] Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.617561 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-795c6654c6-z72r6"] Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.624234 4909 scope.go:117] "RemoveContainer" containerID="20a8c8c1f2f7bfde54104c71ce26b83d01071f2b9fd01e11e52ed684135e3ed1" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.642885 4909 scope.go:117] "RemoveContainer" containerID="749e1330fe36f4d0f7bc0c2d5e47eda30327a3bcf7ea6f2a6f0282c8a6fdb83d" Feb 02 10:55:18 crc 
kubenswrapper[4909]: E0202 10:55:18.643362 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"749e1330fe36f4d0f7bc0c2d5e47eda30327a3bcf7ea6f2a6f0282c8a6fdb83d\": container with ID starting with 749e1330fe36f4d0f7bc0c2d5e47eda30327a3bcf7ea6f2a6f0282c8a6fdb83d not found: ID does not exist" containerID="749e1330fe36f4d0f7bc0c2d5e47eda30327a3bcf7ea6f2a6f0282c8a6fdb83d" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.643396 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"749e1330fe36f4d0f7bc0c2d5e47eda30327a3bcf7ea6f2a6f0282c8a6fdb83d"} err="failed to get container status \"749e1330fe36f4d0f7bc0c2d5e47eda30327a3bcf7ea6f2a6f0282c8a6fdb83d\": rpc error: code = NotFound desc = could not find container \"749e1330fe36f4d0f7bc0c2d5e47eda30327a3bcf7ea6f2a6f0282c8a6fdb83d\": container with ID starting with 749e1330fe36f4d0f7bc0c2d5e47eda30327a3bcf7ea6f2a6f0282c8a6fdb83d not found: ID does not exist" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.643417 4909 scope.go:117] "RemoveContainer" containerID="20a8c8c1f2f7bfde54104c71ce26b83d01071f2b9fd01e11e52ed684135e3ed1" Feb 02 10:55:18 crc kubenswrapper[4909]: E0202 10:55:18.643683 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20a8c8c1f2f7bfde54104c71ce26b83d01071f2b9fd01e11e52ed684135e3ed1\": container with ID starting with 20a8c8c1f2f7bfde54104c71ce26b83d01071f2b9fd01e11e52ed684135e3ed1 not found: ID does not exist" containerID="20a8c8c1f2f7bfde54104c71ce26b83d01071f2b9fd01e11e52ed684135e3ed1" Feb 02 10:55:18 crc kubenswrapper[4909]: I0202 10:55:18.643706 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20a8c8c1f2f7bfde54104c71ce26b83d01071f2b9fd01e11e52ed684135e3ed1"} err="failed to get container status 
\"20a8c8c1f2f7bfde54104c71ce26b83d01071f2b9fd01e11e52ed684135e3ed1\": rpc error: code = NotFound desc = could not find container \"20a8c8c1f2f7bfde54104c71ce26b83d01071f2b9fd01e11e52ed684135e3ed1\": container with ID starting with 20a8c8c1f2f7bfde54104c71ce26b83d01071f2b9fd01e11e52ed684135e3ed1 not found: ID does not exist" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.025986 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65028b8f-2d3c-40f3-8c17-239856623f4e" path="/var/lib/kubelet/pods/65028b8f-2d3c-40f3-8c17-239856623f4e/volumes" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.026569 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c97f6f0e-16ab-439c-a14c-3d908758b1db" path="/var/lib/kubelet/pods/c97f6f0e-16ab-439c-a14c-3d908758b1db/volumes" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.926852 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-58m5n"] Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927162 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c97f6f0e-16ab-439c-a14c-3d908758b1db" containerName="barbican-keystone-listener" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927179 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c97f6f0e-16ab-439c-a14c-3d908758b1db" containerName="barbican-keystone-listener" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927203 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="container-server" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927210 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="container-server" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927223 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" 
containerName="object-updater" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927231 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="object-updater" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927241 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="account-replicator" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927248 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="account-replicator" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927257 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87203850-864b-4fff-b340-25e4f5c6e7c9" containerName="ovn-northd" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927265 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="87203850-864b-4fff-b340-25e4f5c6e7c9" containerName="ovn-northd" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927276 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87203850-864b-4fff-b340-25e4f5c6e7c9" containerName="openstack-network-exporter" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927282 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="87203850-864b-4fff-b340-25e4f5c6e7c9" containerName="openstack-network-exporter" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927294 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc6ef35e-2d40-46c4-9cf7-e2e8adae067a" containerName="barbican-keystone-listener-log" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927300 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc6ef35e-2d40-46c4-9cf7-e2e8adae067a" containerName="barbican-keystone-listener-log" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927312 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="68e55a25-f51a-49a9-af91-ffbab9ad611e" containerName="galera" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927318 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="68e55a25-f51a-49a9-af91-ffbab9ad611e" containerName="galera" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927332 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef5a90ca-c133-400b-b869-becc0b1f60a0" containerName="ovs-vswitchd" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927338 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef5a90ca-c133-400b-b869-becc0b1f60a0" containerName="ovs-vswitchd" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927347 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="object-expirer" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927353 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="object-expirer" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927365 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1145da4-90e5-422b-917a-33473a9c5d6a" containerName="neutron-api" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927373 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1145da4-90e5-422b-917a-33473a9c5d6a" containerName="neutron-api" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927382 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef5a90ca-c133-400b-b869-becc0b1f60a0" containerName="ovsdb-server" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927389 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef5a90ca-c133-400b-b869-becc0b1f60a0" containerName="ovsdb-server" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927399 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4589304-68d2-48c9-a691-e34a9cb4c75b" 
containerName="barbican-worker" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927406 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4589304-68d2-48c9-a691-e34a9cb4c75b" containerName="barbican-worker" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927419 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a8699a-66db-48c6-8834-bda4e21ef1d9" containerName="cinder-api-log" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927426 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a8699a-66db-48c6-8834-bda4e21ef1d9" containerName="cinder-api-log" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927439 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="object-auditor" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927456 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="object-auditor" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927466 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e2e76d-5e02-4488-ae0d-5acbdb1aa060" containerName="mariadb-account-create-update" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927473 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e2e76d-5e02-4488-ae0d-5acbdb1aa060" containerName="mariadb-account-create-update" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927483 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b24f572-8a70-4a46-b3cf-e50ae4859892" containerName="probe" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927490 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b24f572-8a70-4a46-b3cf-e50ae4859892" containerName="probe" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927501 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a43fc6d-3442-4921-93bc-ef5ab2273a78" 
containerName="ceilometer-central-agent" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927507 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a43fc6d-3442-4921-93bc-ef5ab2273a78" containerName="ceilometer-central-agent" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927515 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a43fc6d-3442-4921-93bc-ef5ab2273a78" containerName="sg-core" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927522 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a43fc6d-3442-4921-93bc-ef5ab2273a78" containerName="sg-core" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927530 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b441d32f-f76f-4e7b-b3fe-40e93b126567" containerName="setup-container" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927536 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="b441d32f-f76f-4e7b-b3fe-40e93b126567" containerName="setup-container" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927548 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4589304-68d2-48c9-a691-e34a9cb4c75b" containerName="barbican-worker-log" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927555 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4589304-68d2-48c9-a691-e34a9cb4c75b" containerName="barbican-worker-log" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927561 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1145da4-90e5-422b-917a-33473a9c5d6a" containerName="neutron-httpd" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927568 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1145da4-90e5-422b-917a-33473a9c5d6a" containerName="neutron-httpd" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927580 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97409ffd-f1ab-4a1a-9939-a041a4085b1a" 
containerName="glance-log" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927586 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="97409ffd-f1ab-4a1a-9939-a041a4085b1a" containerName="glance-log" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927593 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86bc749f-73e5-4bcc-8079-7c9b053e0318" containerName="keystone-api" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927600 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="86bc749f-73e5-4bcc-8079-7c9b053e0318" containerName="keystone-api" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927607 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a43fc6d-3442-4921-93bc-ef5ab2273a78" containerName="proxy-httpd" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927614 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a43fc6d-3442-4921-93bc-ef5ab2273a78" containerName="proxy-httpd" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927625 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="account-server" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927631 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="account-server" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927644 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7202de6a-156c-4c06-9e08-3e62cfcf367e" containerName="nova-api-log" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927651 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7202de6a-156c-4c06-9e08-3e62cfcf367e" containerName="nova-api-log" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927659 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef5a90ca-c133-400b-b869-becc0b1f60a0" containerName="ovsdb-server-init" Feb 02 
10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927668 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef5a90ca-c133-400b-b869-becc0b1f60a0" containerName="ovsdb-server-init" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927680 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c52b752-391b-4770-9191-3494df4e3999" containerName="glance-log" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927687 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c52b752-391b-4770-9191-3494df4e3999" containerName="glance-log" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927698 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f0cf9e9-b663-4be3-a435-c7dd6deea228" containerName="nova-metadata-log" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927707 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f0cf9e9-b663-4be3-a435-c7dd6deea228" containerName="nova-metadata-log" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927716 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b8dd51c-207c-4fae-8d5a-7271a159f0ff" containerName="ovn-controller" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927723 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b8dd51c-207c-4fae-8d5a-7271a159f0ff" containerName="ovn-controller" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927734 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddda8a20-60ba-4ae9-837a-44fa44518b8a" containerName="barbican-api" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927740 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddda8a20-60ba-4ae9-837a-44fa44518b8a" containerName="barbican-api" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927749 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97409ffd-f1ab-4a1a-9939-a041a4085b1a" containerName="glance-httpd" Feb 02 10:55:19 crc 
kubenswrapper[4909]: I0202 10:55:19.927755 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="97409ffd-f1ab-4a1a-9939-a041a4085b1a" containerName="glance-httpd" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927765 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c52b752-391b-4770-9191-3494df4e3999" containerName="glance-httpd" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927771 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c52b752-391b-4770-9191-3494df4e3999" containerName="glance-httpd" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927779 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2399f7a5-86e0-46bd-9f3d-624d1208b9cc" containerName="nova-cell0-conductor-conductor" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927785 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2399f7a5-86e0-46bd-9f3d-624d1208b9cc" containerName="nova-cell0-conductor-conductor" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927794 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="object-server" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927801 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="object-server" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927824 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6be13cc-01ed-441f-b2c9-dc024fcb4b18" containerName="placement-api" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927830 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6be13cc-01ed-441f-b2c9-dc024fcb4b18" containerName="placement-api" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927840 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7202de6a-156c-4c06-9e08-3e62cfcf367e" containerName="nova-api-api" Feb 02 10:55:19 crc 
kubenswrapper[4909]: I0202 10:55:19.927846 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7202de6a-156c-4c06-9e08-3e62cfcf367e" containerName="nova-api-api" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927854 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65028b8f-2d3c-40f3-8c17-239856623f4e" containerName="barbican-worker" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927861 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="65028b8f-2d3c-40f3-8c17-239856623f4e" containerName="barbican-worker" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927868 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddda8a20-60ba-4ae9-837a-44fa44518b8a" containerName="barbican-api-log" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927875 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddda8a20-60ba-4ae9-837a-44fa44518b8a" containerName="barbican-api-log" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927886 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="swift-recon-cron" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927892 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="swift-recon-cron" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927902 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="container-replicator" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927908 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="container-replicator" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927916 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="object-replicator" Feb 02 10:55:19 crc 
kubenswrapper[4909]: I0202 10:55:19.927922 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="object-replicator" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927930 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ab15f72-b249-42d5-8698-273c5afc7758" containerName="rabbitmq" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927937 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ab15f72-b249-42d5-8698-273c5afc7758" containerName="rabbitmq" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927943 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="container-updater" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927950 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="container-updater" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927962 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a43fc6d-3442-4921-93bc-ef5ab2273a78" containerName="ceilometer-notification-agent" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927969 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a43fc6d-3442-4921-93bc-ef5ab2273a78" containerName="ceilometer-notification-agent" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927977 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc6ef35e-2d40-46c4-9cf7-e2e8adae067a" containerName="barbican-keystone-listener" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.927983 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc6ef35e-2d40-46c4-9cf7-e2e8adae067a" containerName="barbican-keystone-listener" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.927993 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68e55a25-f51a-49a9-af91-ffbab9ad611e" 
containerName="mysql-bootstrap" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928000 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="68e55a25-f51a-49a9-af91-ffbab9ad611e" containerName="mysql-bootstrap" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.928011 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b441d32f-f76f-4e7b-b3fe-40e93b126567" containerName="rabbitmq" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928017 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="b441d32f-f76f-4e7b-b3fe-40e93b126567" containerName="rabbitmq" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.928025 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b24f572-8a70-4a46-b3cf-e50ae4859892" containerName="cinder-scheduler" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928031 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b24f572-8a70-4a46-b3cf-e50ae4859892" containerName="cinder-scheduler" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.928041 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04232dcc-dda5-4774-b999-5104335f2da0" containerName="barbican-api" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928047 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="04232dcc-dda5-4774-b999-5104335f2da0" containerName="barbican-api" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.928056 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65028b8f-2d3c-40f3-8c17-239856623f4e" containerName="barbican-worker-log" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928063 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="65028b8f-2d3c-40f3-8c17-239856623f4e" containerName="barbican-worker-log" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.928073 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ab15f72-b249-42d5-8698-273c5afc7758" 
containerName="setup-container" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928079 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ab15f72-b249-42d5-8698-273c5afc7758" containerName="setup-container" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.928090 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="236e10c6-5b7d-4f9e-b82a-5c68edc93692" containerName="kube-state-metrics" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928097 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="236e10c6-5b7d-4f9e-b82a-5c68edc93692" containerName="kube-state-metrics" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.928105 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f0cf9e9-b663-4be3-a435-c7dd6deea228" containerName="nova-metadata-metadata" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928111 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f0cf9e9-b663-4be3-a435-c7dd6deea228" containerName="nova-metadata-metadata" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.928119 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6be13cc-01ed-441f-b2c9-dc024fcb4b18" containerName="placement-log" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928126 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6be13cc-01ed-441f-b2c9-dc024fcb4b18" containerName="placement-log" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.928135 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a8699a-66db-48c6-8834-bda4e21ef1d9" containerName="cinder-api" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928141 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a8699a-66db-48c6-8834-bda4e21ef1d9" containerName="cinder-api" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.928149 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba559652-0584-49c5-91d6-7d7fdd596dc2" 
containerName="nova-scheduler-scheduler" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928156 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba559652-0584-49c5-91d6-7d7fdd596dc2" containerName="nova-scheduler-scheduler" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.928164 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dab4432-0762-45a8-88ab-3a99217a790f" containerName="memcached" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928170 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dab4432-0762-45a8-88ab-3a99217a790f" containerName="memcached" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.928179 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04232dcc-dda5-4774-b999-5104335f2da0" containerName="barbican-api-log" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928185 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="04232dcc-dda5-4774-b999-5104335f2da0" containerName="barbican-api-log" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.928194 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="container-auditor" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928200 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="container-auditor" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.928209 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c97f6f0e-16ab-439c-a14c-3d908758b1db" containerName="barbican-keystone-listener-log" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928216 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c97f6f0e-16ab-439c-a14c-3d908758b1db" containerName="barbican-keystone-listener-log" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.928227 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="account-auditor" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928235 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="account-auditor" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.928242 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="rsync" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928250 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="rsync" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.928262 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="account-reaper" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928270 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="account-reaper" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928407 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef5a90ca-c133-400b-b869-becc0b1f60a0" containerName="ovs-vswitchd" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928418 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2399f7a5-86e0-46bd-9f3d-624d1208b9cc" containerName="nova-cell0-conductor-conductor" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928427 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4589304-68d2-48c9-a691-e34a9cb4c75b" containerName="barbican-worker" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928437 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="container-updater" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928448 4909 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="87203850-864b-4fff-b340-25e4f5c6e7c9" containerName="openstack-network-exporter" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928461 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="object-replicator" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928471 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dab4432-0762-45a8-88ab-3a99217a790f" containerName="memcached" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928476 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="86bc749f-73e5-4bcc-8079-7c9b053e0318" containerName="keystone-api" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928486 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a43fc6d-3442-4921-93bc-ef5ab2273a78" containerName="proxy-httpd" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928494 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="14a8699a-66db-48c6-8834-bda4e21ef1d9" containerName="cinder-api-log" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928501 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="object-updater" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928508 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="object-expirer" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928516 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c52b752-391b-4770-9191-3494df4e3999" containerName="glance-log" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928524 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef5a90ca-c133-400b-b869-becc0b1f60a0" containerName="ovsdb-server" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928533 4909 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="8c52b752-391b-4770-9191-3494df4e3999" containerName="glance-httpd" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928542 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="04232dcc-dda5-4774-b999-5104335f2da0" containerName="barbican-api-log" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928554 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a43fc6d-3442-4921-93bc-ef5ab2273a78" containerName="ceilometer-notification-agent" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928561 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b24f572-8a70-4a46-b3cf-e50ae4859892" containerName="probe" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928571 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a43fc6d-3442-4921-93bc-ef5ab2273a78" containerName="ceilometer-central-agent" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928580 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="68e55a25-f51a-49a9-af91-ffbab9ad611e" containerName="galera" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928588 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="87203850-864b-4fff-b340-25e4f5c6e7c9" containerName="ovn-northd" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928598 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="97409ffd-f1ab-4a1a-9939-a041a4085b1a" containerName="glance-httpd" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928604 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="container-replicator" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928614 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="account-auditor" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 
10:55:19.928625 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc6ef35e-2d40-46c4-9cf7-e2e8adae067a" containerName="barbican-keystone-listener-log" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928632 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="236e10c6-5b7d-4f9e-b82a-5c68edc93692" containerName="kube-state-metrics" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928650 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="65028b8f-2d3c-40f3-8c17-239856623f4e" containerName="barbican-worker" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928660 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="object-server" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928668 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="14a8699a-66db-48c6-8834-bda4e21ef1d9" containerName="cinder-api" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928678 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="b441d32f-f76f-4e7b-b3fe-40e93b126567" containerName="rabbitmq" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928685 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f0cf9e9-b663-4be3-a435-c7dd6deea228" containerName="nova-metadata-metadata" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928695 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f0cf9e9-b663-4be3-a435-c7dd6deea228" containerName="nova-metadata-log" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928703 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="65028b8f-2d3c-40f3-8c17-239856623f4e" containerName="barbican-worker-log" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928709 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddda8a20-60ba-4ae9-837a-44fa44518b8a" 
containerName="barbican-api-log" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928717 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6be13cc-01ed-441f-b2c9-dc024fcb4b18" containerName="placement-log" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928725 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddda8a20-60ba-4ae9-837a-44fa44518b8a" containerName="barbican-api" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928734 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1145da4-90e5-422b-917a-33473a9c5d6a" containerName="neutron-api" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928743 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="account-server" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928751 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="container-server" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928763 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7202de6a-156c-4c06-9e08-3e62cfcf367e" containerName="nova-api-api" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928773 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c97f6f0e-16ab-439c-a14c-3d908758b1db" containerName="barbican-keystone-listener" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928783 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6be13cc-01ed-441f-b2c9-dc024fcb4b18" containerName="placement-api" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928790 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="swift-recon-cron" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928798 4909 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="04232dcc-dda5-4774-b999-5104335f2da0" containerName="barbican-api" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928821 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5e2e76d-5e02-4488-ae0d-5acbdb1aa060" containerName="mariadb-account-create-update" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928830 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7202de6a-156c-4c06-9e08-3e62cfcf367e" containerName="nova-api-log" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928841 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b24f572-8a70-4a46-b3cf-e50ae4859892" containerName="cinder-scheduler" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928847 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a43fc6d-3442-4921-93bc-ef5ab2273a78" containerName="sg-core" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928855 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c97f6f0e-16ab-439c-a14c-3d908758b1db" containerName="barbican-keystone-listener-log" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928864 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="account-reaper" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928872 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5e2e76d-5e02-4488-ae0d-5acbdb1aa060" containerName="mariadb-account-create-update" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928882 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba559652-0584-49c5-91d6-7d7fdd596dc2" containerName="nova-scheduler-scheduler" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928894 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ab15f72-b249-42d5-8698-273c5afc7758" containerName="rabbitmq" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928904 
4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="account-replicator" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928917 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="object-auditor" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928926 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4589304-68d2-48c9-a691-e34a9cb4c75b" containerName="barbican-worker-log" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928933 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="container-auditor" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928940 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc6ef35e-2d40-46c4-9cf7-e2e8adae067a" containerName="barbican-keystone-listener" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928947 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b8dd51c-207c-4fae-8d5a-7271a159f0ff" containerName="ovn-controller" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928956 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="97409ffd-f1ab-4a1a-9939-a041a4085b1a" containerName="glance-log" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928966 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c400ec0-7faf-4151-b34e-ee28044b89e7" containerName="rsync" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.928972 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1145da4-90e5-422b-917a-33473a9c5d6a" containerName="neutron-httpd" Feb 02 10:55:19 crc kubenswrapper[4909]: E0202 10:55:19.929083 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e2e76d-5e02-4488-ae0d-5acbdb1aa060" containerName="mariadb-account-create-update" Feb 02 10:55:19 
crc kubenswrapper[4909]: I0202 10:55:19.929092 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e2e76d-5e02-4488-ae0d-5acbdb1aa060" containerName="mariadb-account-create-update" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.930037 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-58m5n" Feb 02 10:55:19 crc kubenswrapper[4909]: I0202 10:55:19.937791 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-58m5n"] Feb 02 10:55:20 crc kubenswrapper[4909]: I0202 10:55:20.006663 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73ea6517-71cb-47cb-9189-4831c9074637-catalog-content\") pod \"redhat-operators-58m5n\" (UID: \"73ea6517-71cb-47cb-9189-4831c9074637\") " pod="openshift-marketplace/redhat-operators-58m5n" Feb 02 10:55:20 crc kubenswrapper[4909]: I0202 10:55:20.006752 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx47p\" (UniqueName: \"kubernetes.io/projected/73ea6517-71cb-47cb-9189-4831c9074637-kube-api-access-lx47p\") pod \"redhat-operators-58m5n\" (UID: \"73ea6517-71cb-47cb-9189-4831c9074637\") " pod="openshift-marketplace/redhat-operators-58m5n" Feb 02 10:55:20 crc kubenswrapper[4909]: I0202 10:55:20.006835 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73ea6517-71cb-47cb-9189-4831c9074637-utilities\") pod \"redhat-operators-58m5n\" (UID: \"73ea6517-71cb-47cb-9189-4831c9074637\") " pod="openshift-marketplace/redhat-operators-58m5n" Feb 02 10:55:20 crc kubenswrapper[4909]: I0202 10:55:20.108860 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/73ea6517-71cb-47cb-9189-4831c9074637-utilities\") pod \"redhat-operators-58m5n\" (UID: \"73ea6517-71cb-47cb-9189-4831c9074637\") " pod="openshift-marketplace/redhat-operators-58m5n" Feb 02 10:55:20 crc kubenswrapper[4909]: I0202 10:55:20.108977 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73ea6517-71cb-47cb-9189-4831c9074637-catalog-content\") pod \"redhat-operators-58m5n\" (UID: \"73ea6517-71cb-47cb-9189-4831c9074637\") " pod="openshift-marketplace/redhat-operators-58m5n" Feb 02 10:55:20 crc kubenswrapper[4909]: I0202 10:55:20.109014 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx47p\" (UniqueName: \"kubernetes.io/projected/73ea6517-71cb-47cb-9189-4831c9074637-kube-api-access-lx47p\") pod \"redhat-operators-58m5n\" (UID: \"73ea6517-71cb-47cb-9189-4831c9074637\") " pod="openshift-marketplace/redhat-operators-58m5n" Feb 02 10:55:20 crc kubenswrapper[4909]: I0202 10:55:20.109678 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73ea6517-71cb-47cb-9189-4831c9074637-utilities\") pod \"redhat-operators-58m5n\" (UID: \"73ea6517-71cb-47cb-9189-4831c9074637\") " pod="openshift-marketplace/redhat-operators-58m5n" Feb 02 10:55:20 crc kubenswrapper[4909]: I0202 10:55:20.109970 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73ea6517-71cb-47cb-9189-4831c9074637-catalog-content\") pod \"redhat-operators-58m5n\" (UID: \"73ea6517-71cb-47cb-9189-4831c9074637\") " pod="openshift-marketplace/redhat-operators-58m5n" Feb 02 10:55:20 crc kubenswrapper[4909]: I0202 10:55:20.133039 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx47p\" (UniqueName: 
\"kubernetes.io/projected/73ea6517-71cb-47cb-9189-4831c9074637-kube-api-access-lx47p\") pod \"redhat-operators-58m5n\" (UID: \"73ea6517-71cb-47cb-9189-4831c9074637\") " pod="openshift-marketplace/redhat-operators-58m5n" Feb 02 10:55:20 crc kubenswrapper[4909]: I0202 10:55:20.249095 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-58m5n" Feb 02 10:55:20 crc kubenswrapper[4909]: I0202 10:55:20.398715 4909 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod05143579-706d-4107-9d7a-a63b4a13c187"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod05143579-706d-4107-9d7a-a63b4a13c187] : Timed out while waiting for systemd to remove kubepods-besteffort-pod05143579_706d_4107_9d7a_a63b4a13c187.slice" Feb 02 10:55:20 crc kubenswrapper[4909]: E0202 10:55:20.399124 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod05143579-706d-4107-9d7a-a63b4a13c187] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod05143579-706d-4107-9d7a-a63b4a13c187] : Timed out while waiting for systemd to remove kubepods-besteffort-pod05143579_706d_4107_9d7a_a63b4a13c187.slice" pod="openstack/nova-cell1-conductor-0" podUID="05143579-706d-4107-9d7a-a63b4a13c187" Feb 02 10:55:20 crc kubenswrapper[4909]: I0202 10:55:20.569639 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 02 10:55:20 crc kubenswrapper[4909]: I0202 10:55:20.597996 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 02 10:55:20 crc kubenswrapper[4909]: I0202 10:55:20.606027 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 02 10:55:20 crc kubenswrapper[4909]: I0202 10:55:20.666779 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-58m5n"] Feb 02 10:55:20 crc kubenswrapper[4909]: W0202 10:55:20.668920 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73ea6517_71cb_47cb_9189_4831c9074637.slice/crio-e080e7927a3a7c95771514c4ecf6c336b7cd4fd6638d67b3baefb55d72186b32 WatchSource:0}: Error finding container e080e7927a3a7c95771514c4ecf6c336b7cd4fd6638d67b3baefb55d72186b32: Status 404 returned error can't find the container with id e080e7927a3a7c95771514c4ecf6c336b7cd4fd6638d67b3baefb55d72186b32 Feb 02 10:55:21 crc kubenswrapper[4909]: I0202 10:55:21.024222 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05143579-706d-4107-9d7a-a63b4a13c187" path="/var/lib/kubelet/pods/05143579-706d-4107-9d7a-a63b4a13c187/volumes" Feb 02 10:55:21 crc kubenswrapper[4909]: I0202 10:55:21.579473 4909 generic.go:334] "Generic (PLEG): container finished" podID="73ea6517-71cb-47cb-9189-4831c9074637" containerID="d2e8355d49ee3b57df6f3e13363040933e3ad0c05a5ff5a47aa2a2a85bfd0187" exitCode=0 Feb 02 10:55:21 crc kubenswrapper[4909]: I0202 10:55:21.579509 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-58m5n" event={"ID":"73ea6517-71cb-47cb-9189-4831c9074637","Type":"ContainerDied","Data":"d2e8355d49ee3b57df6f3e13363040933e3ad0c05a5ff5a47aa2a2a85bfd0187"} Feb 02 10:55:21 crc kubenswrapper[4909]: I0202 10:55:21.579559 4909 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-58m5n" event={"ID":"73ea6517-71cb-47cb-9189-4831c9074637","Type":"ContainerStarted","Data":"e080e7927a3a7c95771514c4ecf6c336b7cd4fd6638d67b3baefb55d72186b32"} Feb 02 10:55:22 crc kubenswrapper[4909]: I0202 10:55:22.595830 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-58m5n" event={"ID":"73ea6517-71cb-47cb-9189-4831c9074637","Type":"ContainerStarted","Data":"a5816146584e270e783e621a715f643c486c4185640cb67cebfbeb38560f1def"} Feb 02 10:55:23 crc kubenswrapper[4909]: I0202 10:55:23.606358 4909 generic.go:334] "Generic (PLEG): container finished" podID="73ea6517-71cb-47cb-9189-4831c9074637" containerID="a5816146584e270e783e621a715f643c486c4185640cb67cebfbeb38560f1def" exitCode=0 Feb 02 10:55:23 crc kubenswrapper[4909]: I0202 10:55:23.606431 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-58m5n" event={"ID":"73ea6517-71cb-47cb-9189-4831c9074637","Type":"ContainerDied","Data":"a5816146584e270e783e621a715f643c486c4185640cb67cebfbeb38560f1def"} Feb 02 10:55:24 crc kubenswrapper[4909]: I0202 10:55:24.619146 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-58m5n" event={"ID":"73ea6517-71cb-47cb-9189-4831c9074637","Type":"ContainerStarted","Data":"bd8023e5f145b1d68fdf8978b0e5b29f457d868f90a5900743e67eb6541be3c3"} Feb 02 10:55:24 crc kubenswrapper[4909]: I0202 10:55:24.647513 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-58m5n" podStartSLOduration=3.006828519 podStartE2EDuration="5.647497565s" podCreationTimestamp="2026-02-02 10:55:19 +0000 UTC" firstStartedPulling="2026-02-02 10:55:21.581712156 +0000 UTC m=+1447.327812881" lastFinishedPulling="2026-02-02 10:55:24.222381192 +0000 UTC m=+1449.968481927" observedRunningTime="2026-02-02 10:55:24.644339435 +0000 UTC m=+1450.390440170" 
watchObservedRunningTime="2026-02-02 10:55:24.647497565 +0000 UTC m=+1450.393598300" Feb 02 10:55:30 crc kubenswrapper[4909]: I0202 10:55:30.249441 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-58m5n" Feb 02 10:55:30 crc kubenswrapper[4909]: I0202 10:55:30.249797 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-58m5n" Feb 02 10:55:30 crc kubenswrapper[4909]: I0202 10:55:30.295636 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-58m5n" Feb 02 10:55:30 crc kubenswrapper[4909]: I0202 10:55:30.721141 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-58m5n" Feb 02 10:55:30 crc kubenswrapper[4909]: I0202 10:55:30.767319 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-58m5n"] Feb 02 10:55:32 crc kubenswrapper[4909]: I0202 10:55:32.694071 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-58m5n" podUID="73ea6517-71cb-47cb-9189-4831c9074637" containerName="registry-server" containerID="cri-o://bd8023e5f145b1d68fdf8978b0e5b29f457d868f90a5900743e67eb6541be3c3" gracePeriod=2 Feb 02 10:55:33 crc kubenswrapper[4909]: I0202 10:55:33.067426 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-58m5n" Feb 02 10:55:33 crc kubenswrapper[4909]: I0202 10:55:33.197797 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lx47p\" (UniqueName: \"kubernetes.io/projected/73ea6517-71cb-47cb-9189-4831c9074637-kube-api-access-lx47p\") pod \"73ea6517-71cb-47cb-9189-4831c9074637\" (UID: \"73ea6517-71cb-47cb-9189-4831c9074637\") " Feb 02 10:55:33 crc kubenswrapper[4909]: I0202 10:55:33.197920 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73ea6517-71cb-47cb-9189-4831c9074637-catalog-content\") pod \"73ea6517-71cb-47cb-9189-4831c9074637\" (UID: \"73ea6517-71cb-47cb-9189-4831c9074637\") " Feb 02 10:55:33 crc kubenswrapper[4909]: I0202 10:55:33.197959 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73ea6517-71cb-47cb-9189-4831c9074637-utilities\") pod \"73ea6517-71cb-47cb-9189-4831c9074637\" (UID: \"73ea6517-71cb-47cb-9189-4831c9074637\") " Feb 02 10:55:33 crc kubenswrapper[4909]: I0202 10:55:33.199037 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73ea6517-71cb-47cb-9189-4831c9074637-utilities" (OuterVolumeSpecName: "utilities") pod "73ea6517-71cb-47cb-9189-4831c9074637" (UID: "73ea6517-71cb-47cb-9189-4831c9074637"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:55:33 crc kubenswrapper[4909]: I0202 10:55:33.203079 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73ea6517-71cb-47cb-9189-4831c9074637-kube-api-access-lx47p" (OuterVolumeSpecName: "kube-api-access-lx47p") pod "73ea6517-71cb-47cb-9189-4831c9074637" (UID: "73ea6517-71cb-47cb-9189-4831c9074637"). InnerVolumeSpecName "kube-api-access-lx47p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:55:33 crc kubenswrapper[4909]: I0202 10:55:33.300066 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73ea6517-71cb-47cb-9189-4831c9074637-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:33 crc kubenswrapper[4909]: I0202 10:55:33.300112 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lx47p\" (UniqueName: \"kubernetes.io/projected/73ea6517-71cb-47cb-9189-4831c9074637-kube-api-access-lx47p\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:33 crc kubenswrapper[4909]: I0202 10:55:33.703186 4909 generic.go:334] "Generic (PLEG): container finished" podID="73ea6517-71cb-47cb-9189-4831c9074637" containerID="bd8023e5f145b1d68fdf8978b0e5b29f457d868f90a5900743e67eb6541be3c3" exitCode=0 Feb 02 10:55:33 crc kubenswrapper[4909]: I0202 10:55:33.703235 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-58m5n" event={"ID":"73ea6517-71cb-47cb-9189-4831c9074637","Type":"ContainerDied","Data":"bd8023e5f145b1d68fdf8978b0e5b29f457d868f90a5900743e67eb6541be3c3"} Feb 02 10:55:33 crc kubenswrapper[4909]: I0202 10:55:33.703262 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-58m5n" Feb 02 10:55:33 crc kubenswrapper[4909]: I0202 10:55:33.703274 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-58m5n" event={"ID":"73ea6517-71cb-47cb-9189-4831c9074637","Type":"ContainerDied","Data":"e080e7927a3a7c95771514c4ecf6c336b7cd4fd6638d67b3baefb55d72186b32"} Feb 02 10:55:33 crc kubenswrapper[4909]: I0202 10:55:33.703301 4909 scope.go:117] "RemoveContainer" containerID="bd8023e5f145b1d68fdf8978b0e5b29f457d868f90a5900743e67eb6541be3c3" Feb 02 10:55:33 crc kubenswrapper[4909]: I0202 10:55:33.741011 4909 scope.go:117] "RemoveContainer" containerID="a5816146584e270e783e621a715f643c486c4185640cb67cebfbeb38560f1def" Feb 02 10:55:33 crc kubenswrapper[4909]: I0202 10:55:33.766947 4909 scope.go:117] "RemoveContainer" containerID="d2e8355d49ee3b57df6f3e13363040933e3ad0c05a5ff5a47aa2a2a85bfd0187" Feb 02 10:55:33 crc kubenswrapper[4909]: I0202 10:55:33.786232 4909 scope.go:117] "RemoveContainer" containerID="bd8023e5f145b1d68fdf8978b0e5b29f457d868f90a5900743e67eb6541be3c3" Feb 02 10:55:33 crc kubenswrapper[4909]: E0202 10:55:33.786706 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd8023e5f145b1d68fdf8978b0e5b29f457d868f90a5900743e67eb6541be3c3\": container with ID starting with bd8023e5f145b1d68fdf8978b0e5b29f457d868f90a5900743e67eb6541be3c3 not found: ID does not exist" containerID="bd8023e5f145b1d68fdf8978b0e5b29f457d868f90a5900743e67eb6541be3c3" Feb 02 10:55:33 crc kubenswrapper[4909]: I0202 10:55:33.786827 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd8023e5f145b1d68fdf8978b0e5b29f457d868f90a5900743e67eb6541be3c3"} err="failed to get container status \"bd8023e5f145b1d68fdf8978b0e5b29f457d868f90a5900743e67eb6541be3c3\": rpc error: code = NotFound desc = could not find container 
\"bd8023e5f145b1d68fdf8978b0e5b29f457d868f90a5900743e67eb6541be3c3\": container with ID starting with bd8023e5f145b1d68fdf8978b0e5b29f457d868f90a5900743e67eb6541be3c3 not found: ID does not exist" Feb 02 10:55:33 crc kubenswrapper[4909]: I0202 10:55:33.786908 4909 scope.go:117] "RemoveContainer" containerID="a5816146584e270e783e621a715f643c486c4185640cb67cebfbeb38560f1def" Feb 02 10:55:33 crc kubenswrapper[4909]: E0202 10:55:33.787373 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5816146584e270e783e621a715f643c486c4185640cb67cebfbeb38560f1def\": container with ID starting with a5816146584e270e783e621a715f643c486c4185640cb67cebfbeb38560f1def not found: ID does not exist" containerID="a5816146584e270e783e621a715f643c486c4185640cb67cebfbeb38560f1def" Feb 02 10:55:33 crc kubenswrapper[4909]: I0202 10:55:33.787425 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5816146584e270e783e621a715f643c486c4185640cb67cebfbeb38560f1def"} err="failed to get container status \"a5816146584e270e783e621a715f643c486c4185640cb67cebfbeb38560f1def\": rpc error: code = NotFound desc = could not find container \"a5816146584e270e783e621a715f643c486c4185640cb67cebfbeb38560f1def\": container with ID starting with a5816146584e270e783e621a715f643c486c4185640cb67cebfbeb38560f1def not found: ID does not exist" Feb 02 10:55:33 crc kubenswrapper[4909]: I0202 10:55:33.787456 4909 scope.go:117] "RemoveContainer" containerID="d2e8355d49ee3b57df6f3e13363040933e3ad0c05a5ff5a47aa2a2a85bfd0187" Feb 02 10:55:33 crc kubenswrapper[4909]: E0202 10:55:33.787791 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2e8355d49ee3b57df6f3e13363040933e3ad0c05a5ff5a47aa2a2a85bfd0187\": container with ID starting with d2e8355d49ee3b57df6f3e13363040933e3ad0c05a5ff5a47aa2a2a85bfd0187 not found: ID does not exist" 
containerID="d2e8355d49ee3b57df6f3e13363040933e3ad0c05a5ff5a47aa2a2a85bfd0187" Feb 02 10:55:33 crc kubenswrapper[4909]: I0202 10:55:33.787840 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2e8355d49ee3b57df6f3e13363040933e3ad0c05a5ff5a47aa2a2a85bfd0187"} err="failed to get container status \"d2e8355d49ee3b57df6f3e13363040933e3ad0c05a5ff5a47aa2a2a85bfd0187\": rpc error: code = NotFound desc = could not find container \"d2e8355d49ee3b57df6f3e13363040933e3ad0c05a5ff5a47aa2a2a85bfd0187\": container with ID starting with d2e8355d49ee3b57df6f3e13363040933e3ad0c05a5ff5a47aa2a2a85bfd0187 not found: ID does not exist" Feb 02 10:55:34 crc kubenswrapper[4909]: I0202 10:55:34.291882 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73ea6517-71cb-47cb-9189-4831c9074637-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73ea6517-71cb-47cb-9189-4831c9074637" (UID: "73ea6517-71cb-47cb-9189-4831c9074637"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:55:34 crc kubenswrapper[4909]: I0202 10:55:34.316706 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73ea6517-71cb-47cb-9189-4831c9074637-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:55:34 crc kubenswrapper[4909]: I0202 10:55:34.343928 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-58m5n"] Feb 02 10:55:34 crc kubenswrapper[4909]: I0202 10:55:34.349749 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-58m5n"] Feb 02 10:55:35 crc kubenswrapper[4909]: I0202 10:55:35.029713 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73ea6517-71cb-47cb-9189-4831c9074637" path="/var/lib/kubelet/pods/73ea6517-71cb-47cb-9189-4831c9074637/volumes" Feb 02 10:55:49 crc kubenswrapper[4909]: I0202 10:55:49.510632 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:55:49 crc kubenswrapper[4909]: I0202 10:55:49.511202 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:56:17 crc kubenswrapper[4909]: I0202 10:56:17.071718 4909 scope.go:117] "RemoveContainer" containerID="4f99a6d61521e8ea4cc86c6b2d1f469b35483740c9efb1d2feef012e700ba6d8" Feb 02 10:56:17 crc kubenswrapper[4909]: I0202 10:56:17.104449 4909 scope.go:117] "RemoveContainer" 
containerID="cb02a83bbb4061df53b24177176c6e857d4ccc4c9e403fa8c36ecca2742b0d60" Feb 02 10:56:17 crc kubenswrapper[4909]: I0202 10:56:17.129780 4909 scope.go:117] "RemoveContainer" containerID="4c278d31a61f6e425c77fae2f90acc6e7da317a6a430fd7fad14cf538e9f3d18" Feb 02 10:56:17 crc kubenswrapper[4909]: I0202 10:56:17.156832 4909 scope.go:117] "RemoveContainer" containerID="c7b0f3850dc7a45c5f81260746518a4b0fb79c1686ae003921f7039e6da8ea56" Feb 02 10:56:17 crc kubenswrapper[4909]: I0202 10:56:17.179941 4909 scope.go:117] "RemoveContainer" containerID="8f5bb18b7321547b528f5af5b21f281e2f479493379bee20dc5095a16bc7f839" Feb 02 10:56:17 crc kubenswrapper[4909]: I0202 10:56:17.207470 4909 scope.go:117] "RemoveContainer" containerID="9e1decd1c118f6f232ecbb6ee4515785b72d98434f58ddfbfecc9b53e994c7ac" Feb 02 10:56:17 crc kubenswrapper[4909]: I0202 10:56:17.234842 4909 scope.go:117] "RemoveContainer" containerID="32bfe0b197161f5feef754bfc301d5f4c34647785352315e9754fe2f1b3e279c" Feb 02 10:56:17 crc kubenswrapper[4909]: I0202 10:56:17.252191 4909 scope.go:117] "RemoveContainer" containerID="ad4f42df5133abae699ddc4b8cf0d5ba9d9b30b6dca926531a75eee4db6df541" Feb 02 10:56:17 crc kubenswrapper[4909]: I0202 10:56:17.274762 4909 scope.go:117] "RemoveContainer" containerID="5892737acb052e6a3f93ad5723bf350253f3421da0cd429a24a03c2315a6b594" Feb 02 10:56:17 crc kubenswrapper[4909]: I0202 10:56:17.291890 4909 scope.go:117] "RemoveContainer" containerID="f85aea9a9630fd571f870636f8d0a16bc08fbc31cf3802093c4a240a71c26962" Feb 02 10:56:17 crc kubenswrapper[4909]: I0202 10:56:17.315856 4909 scope.go:117] "RemoveContainer" containerID="f7f1646054ee4622c5aa78a07f3d9c809ee0e9476cf483d1d536142af4148580" Feb 02 10:56:17 crc kubenswrapper[4909]: I0202 10:56:17.331956 4909 scope.go:117] "RemoveContainer" containerID="e616e96154cc54a51e3703d10ef0dafd77cf698728a725fab58c048715a29bb9" Feb 02 10:56:17 crc kubenswrapper[4909]: I0202 10:56:17.352942 4909 scope.go:117] "RemoveContainer" 
containerID="1852ba1ba50c691b49413bcf9313ce9948ed4b0019ad1761ebd8f1c4dc0d19b4" Feb 02 10:56:19 crc kubenswrapper[4909]: I0202 10:56:19.510979 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:56:19 crc kubenswrapper[4909]: I0202 10:56:19.511038 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:56:49 crc kubenswrapper[4909]: I0202 10:56:49.510687 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 10:56:49 crc kubenswrapper[4909]: I0202 10:56:49.511312 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 10:56:49 crc kubenswrapper[4909]: I0202 10:56:49.511353 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 10:56:49 crc kubenswrapper[4909]: I0202 10:56:49.511938 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 10:56:49 crc kubenswrapper[4909]: I0202 10:56:49.511992 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3" gracePeriod=600 Feb 02 10:56:49 crc kubenswrapper[4909]: E0202 10:56:49.627772 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 10:56:50 crc kubenswrapper[4909]: I0202 10:56:50.291209 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3" exitCode=0 Feb 02 10:56:50 crc kubenswrapper[4909]: I0202 10:56:50.291290 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3"} Feb 02 10:56:50 crc kubenswrapper[4909]: I0202 10:56:50.291350 4909 scope.go:117] "RemoveContainer" containerID="ad73216e79d924c2922053100514e06765aa5c63e49cfea0b056d73eebae4d59" Feb 02 10:56:50 crc kubenswrapper[4909]: I0202 10:56:50.291832 4909 
scope.go:117] "RemoveContainer" containerID="a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3" Feb 02 10:56:50 crc kubenswrapper[4909]: E0202 10:56:50.292069 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 10:57:02 crc kubenswrapper[4909]: I0202 10:57:02.017153 4909 scope.go:117] "RemoveContainer" containerID="a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3" Feb 02 10:57:02 crc kubenswrapper[4909]: E0202 10:57:02.018007 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 10:57:17 crc kubenswrapper[4909]: I0202 10:57:17.016443 4909 scope.go:117] "RemoveContainer" containerID="a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3" Feb 02 10:57:17 crc kubenswrapper[4909]: E0202 10:57:17.017664 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 10:57:17 crc kubenswrapper[4909]: I0202 
10:57:17.599547 4909 scope.go:117] "RemoveContainer" containerID="0395a622a8f92c7a1198e417ae8f5716c8ea8837376e2548a497195b64fc2694" Feb 02 10:57:17 crc kubenswrapper[4909]: I0202 10:57:17.621230 4909 scope.go:117] "RemoveContainer" containerID="ab7cb2bbb65c4d7634f9a5e8e479f9fc0694a5bfd80908df979d2152969dd51d" Feb 02 10:57:17 crc kubenswrapper[4909]: I0202 10:57:17.662217 4909 scope.go:117] "RemoveContainer" containerID="311a7b10742f33d16e743d03813d2f4a91673f1da23226395322a7b04d1bbfb8" Feb 02 10:57:17 crc kubenswrapper[4909]: I0202 10:57:17.714267 4909 scope.go:117] "RemoveContainer" containerID="53c671e3885cedcf174cef191a0d515bd5fef3560bab4d01f99eaa50ccab077c" Feb 02 10:57:17 crc kubenswrapper[4909]: I0202 10:57:17.746144 4909 scope.go:117] "RemoveContainer" containerID="65512ab62f349dbb8f62b135a336a5935d43e474385b936dc740477c5a0ddcb9" Feb 02 10:57:17 crc kubenswrapper[4909]: I0202 10:57:17.772210 4909 scope.go:117] "RemoveContainer" containerID="45eafabca7b2a34226244d7ec3e8248a5f7fb392fbe4f2ca423e285105b0e989" Feb 02 10:57:17 crc kubenswrapper[4909]: I0202 10:57:17.805578 4909 scope.go:117] "RemoveContainer" containerID="6548681ec6a0ff65f9e2a7c5a769dc2781eeae61756fd4aedf64e11796338ff8" Feb 02 10:57:17 crc kubenswrapper[4909]: I0202 10:57:17.830079 4909 scope.go:117] "RemoveContainer" containerID="b8c6530aa64d5e6ef2bc02008781ac63633b0840657292c0f1c40fded0ce080d" Feb 02 10:57:17 crc kubenswrapper[4909]: I0202 10:57:17.848625 4909 scope.go:117] "RemoveContainer" containerID="7ec049f0d04a0e397b9675c802ee25f2308b6c17de78026626e4edae11e67077" Feb 02 10:57:17 crc kubenswrapper[4909]: I0202 10:57:17.872334 4909 scope.go:117] "RemoveContainer" containerID="9cf5e3332653530749e0dd6aada31d63b66c5777b51173a45a0c7aa3c4572f39" Feb 02 10:57:17 crc kubenswrapper[4909]: I0202 10:57:17.888771 4909 scope.go:117] "RemoveContainer" containerID="a11c0ce704f2fc246b98371f13e40088f3fdc710ddd0a844e54c8d408f25ead1" Feb 02 10:57:17 crc kubenswrapper[4909]: I0202 10:57:17.909318 4909 
scope.go:117] "RemoveContainer" containerID="9605881602e8e9802371144790cb626f296ca0ea0fffdab156c40c6988d7021c" Feb 02 10:57:17 crc kubenswrapper[4909]: I0202 10:57:17.943868 4909 scope.go:117] "RemoveContainer" containerID="aa704490c8fecb94ae45cd2a957887fc69b4f0520c85a9a5d292b5f18c60a975" Feb 02 10:57:17 crc kubenswrapper[4909]: I0202 10:57:17.988761 4909 scope.go:117] "RemoveContainer" containerID="a76a4b10b76a12071750070b74105df86f63621b5651becc758ed36c8b2db6f5" Feb 02 10:57:32 crc kubenswrapper[4909]: I0202 10:57:32.017021 4909 scope.go:117] "RemoveContainer" containerID="a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3" Feb 02 10:57:32 crc kubenswrapper[4909]: E0202 10:57:32.017729 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 10:57:43 crc kubenswrapper[4909]: I0202 10:57:43.017091 4909 scope.go:117] "RemoveContainer" containerID="a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3" Feb 02 10:57:43 crc kubenswrapper[4909]: E0202 10:57:43.017938 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 10:57:56 crc kubenswrapper[4909]: I0202 10:57:56.016239 4909 scope.go:117] "RemoveContainer" containerID="a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3" Feb 02 
10:57:56 crc kubenswrapper[4909]: E0202 10:57:56.016700 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 10:58:08 crc kubenswrapper[4909]: I0202 10:58:08.016583 4909 scope.go:117] "RemoveContainer" containerID="a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3" Feb 02 10:58:08 crc kubenswrapper[4909]: E0202 10:58:08.017356 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 10:58:18 crc kubenswrapper[4909]: I0202 10:58:18.181293 4909 scope.go:117] "RemoveContainer" containerID="806aaa0ff1451eccc6cc80605c3b759d86a7c3041d5b3685c9a6764ed0b9dda2" Feb 02 10:58:18 crc kubenswrapper[4909]: I0202 10:58:18.223054 4909 scope.go:117] "RemoveContainer" containerID="54beb5c8ca4e401e90a1f311f21072bcb862ee38775a25ba04703ad4a2386559" Feb 02 10:58:18 crc kubenswrapper[4909]: I0202 10:58:18.262591 4909 scope.go:117] "RemoveContainer" containerID="ce78989884566e501936aa0f9c402244d1dd8a9250194989d7173d57f7bcf1f8" Feb 02 10:58:18 crc kubenswrapper[4909]: I0202 10:58:18.286217 4909 scope.go:117] "RemoveContainer" containerID="2aac88e4f9abc70098ab49a36bed3602ad9256a56ae95b31d40125b608196fa8" Feb 02 10:58:18 crc kubenswrapper[4909]: I0202 10:58:18.303194 4909 scope.go:117] "RemoveContainer" 
containerID="0de2c93e8b03385a8e5852fd9c213a372e98af76b96beea748839bfdc6ed33cd" Feb 02 10:58:18 crc kubenswrapper[4909]: I0202 10:58:18.319995 4909 scope.go:117] "RemoveContainer" containerID="d673f64542822bb2e56ba036a91f3532349c8a50f5cec241f137474be541104b" Feb 02 10:58:18 crc kubenswrapper[4909]: I0202 10:58:18.343236 4909 scope.go:117] "RemoveContainer" containerID="a0b641eb35c253bddcb998df3a1e8bb281e92fcbf9ed7e43633609aff1d97578" Feb 02 10:58:18 crc kubenswrapper[4909]: I0202 10:58:18.372679 4909 scope.go:117] "RemoveContainer" containerID="7573e7b7356e01328ac6964e4639c84a155f497ab30f7801c09c8069b1ba1175" Feb 02 10:58:18 crc kubenswrapper[4909]: I0202 10:58:18.389187 4909 scope.go:117] "RemoveContainer" containerID="5181b71ba760ed91aeb7b3915c3d7814e4e85e3d4983092599caaa351d56bd8b" Feb 02 10:58:18 crc kubenswrapper[4909]: I0202 10:58:18.406725 4909 scope.go:117] "RemoveContainer" containerID="058be1382e5717fb1e57cdf585e0153bea98324cf6703943f27f9e9a5d4a650c" Feb 02 10:58:18 crc kubenswrapper[4909]: I0202 10:58:18.422636 4909 scope.go:117] "RemoveContainer" containerID="9b97792ee0643776ed45dd9bce3742b6344f3f3eb329b31a1d413958f371ef6e" Feb 02 10:58:18 crc kubenswrapper[4909]: I0202 10:58:18.442352 4909 scope.go:117] "RemoveContainer" containerID="abe9b83cee5ffbc3e19cf902b840a1d2bf660fa5d7600b5b0657196354a95639" Feb 02 10:58:18 crc kubenswrapper[4909]: I0202 10:58:18.458325 4909 scope.go:117] "RemoveContainer" containerID="e41df562f1a41e47706274614d03c13faaba64903b9ae50a415bad9f51e06946" Feb 02 10:58:18 crc kubenswrapper[4909]: I0202 10:58:18.488498 4909 scope.go:117] "RemoveContainer" containerID="453b20a20c9169f3f80a085b62a851af4591bf9be5cc667f73032d0f885d7114" Feb 02 10:58:18 crc kubenswrapper[4909]: I0202 10:58:18.525202 4909 scope.go:117] "RemoveContainer" containerID="e29f67614fb9cccc18985f89282ec44019b367ff92f09a6e3769b0344cbb1193" Feb 02 10:58:18 crc kubenswrapper[4909]: I0202 10:58:18.544018 4909 scope.go:117] "RemoveContainer" 
containerID="aa861435346ae6420a59ce2b05c25881fc8f592ed7abec3f230f2dfe7c08cf09" Feb 02 10:58:18 crc kubenswrapper[4909]: I0202 10:58:18.558029 4909 scope.go:117] "RemoveContainer" containerID="39650f1fdde2fdbf3bd917a0f731e9a813c0c766ece71f265bc33c3787265ed1" Feb 02 10:58:19 crc kubenswrapper[4909]: I0202 10:58:19.017062 4909 scope.go:117] "RemoveContainer" containerID="a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3" Feb 02 10:58:19 crc kubenswrapper[4909]: E0202 10:58:19.017625 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 10:58:24 crc kubenswrapper[4909]: I0202 10:58:24.331061 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vmn2m"] Feb 02 10:58:24 crc kubenswrapper[4909]: E0202 10:58:24.331693 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73ea6517-71cb-47cb-9189-4831c9074637" containerName="registry-server" Feb 02 10:58:24 crc kubenswrapper[4909]: I0202 10:58:24.331708 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="73ea6517-71cb-47cb-9189-4831c9074637" containerName="registry-server" Feb 02 10:58:24 crc kubenswrapper[4909]: E0202 10:58:24.331739 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73ea6517-71cb-47cb-9189-4831c9074637" containerName="extract-utilities" Feb 02 10:58:24 crc kubenswrapper[4909]: I0202 10:58:24.331747 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="73ea6517-71cb-47cb-9189-4831c9074637" containerName="extract-utilities" Feb 02 10:58:24 crc kubenswrapper[4909]: E0202 10:58:24.331759 4909 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="73ea6517-71cb-47cb-9189-4831c9074637" containerName="extract-content" Feb 02 10:58:24 crc kubenswrapper[4909]: I0202 10:58:24.331769 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="73ea6517-71cb-47cb-9189-4831c9074637" containerName="extract-content" Feb 02 10:58:24 crc kubenswrapper[4909]: I0202 10:58:24.331932 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="73ea6517-71cb-47cb-9189-4831c9074637" containerName="registry-server" Feb 02 10:58:24 crc kubenswrapper[4909]: I0202 10:58:24.333128 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vmn2m" Feb 02 10:58:24 crc kubenswrapper[4909]: I0202 10:58:24.342839 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vmn2m"] Feb 02 10:58:24 crc kubenswrapper[4909]: I0202 10:58:24.443968 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b3eb4b1-7b61-4b9d-9787-aae2f11eda57-catalog-content\") pod \"certified-operators-vmn2m\" (UID: \"5b3eb4b1-7b61-4b9d-9787-aae2f11eda57\") " pod="openshift-marketplace/certified-operators-vmn2m" Feb 02 10:58:24 crc kubenswrapper[4909]: I0202 10:58:24.444051 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf6ft\" (UniqueName: \"kubernetes.io/projected/5b3eb4b1-7b61-4b9d-9787-aae2f11eda57-kube-api-access-wf6ft\") pod \"certified-operators-vmn2m\" (UID: \"5b3eb4b1-7b61-4b9d-9787-aae2f11eda57\") " pod="openshift-marketplace/certified-operators-vmn2m" Feb 02 10:58:24 crc kubenswrapper[4909]: I0202 10:58:24.444263 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b3eb4b1-7b61-4b9d-9787-aae2f11eda57-utilities\") pod 
\"certified-operators-vmn2m\" (UID: \"5b3eb4b1-7b61-4b9d-9787-aae2f11eda57\") " pod="openshift-marketplace/certified-operators-vmn2m" Feb 02 10:58:24 crc kubenswrapper[4909]: I0202 10:58:24.545401 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b3eb4b1-7b61-4b9d-9787-aae2f11eda57-catalog-content\") pod \"certified-operators-vmn2m\" (UID: \"5b3eb4b1-7b61-4b9d-9787-aae2f11eda57\") " pod="openshift-marketplace/certified-operators-vmn2m" Feb 02 10:58:24 crc kubenswrapper[4909]: I0202 10:58:24.545497 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf6ft\" (UniqueName: \"kubernetes.io/projected/5b3eb4b1-7b61-4b9d-9787-aae2f11eda57-kube-api-access-wf6ft\") pod \"certified-operators-vmn2m\" (UID: \"5b3eb4b1-7b61-4b9d-9787-aae2f11eda57\") " pod="openshift-marketplace/certified-operators-vmn2m" Feb 02 10:58:24 crc kubenswrapper[4909]: I0202 10:58:24.545565 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b3eb4b1-7b61-4b9d-9787-aae2f11eda57-utilities\") pod \"certified-operators-vmn2m\" (UID: \"5b3eb4b1-7b61-4b9d-9787-aae2f11eda57\") " pod="openshift-marketplace/certified-operators-vmn2m" Feb 02 10:58:24 crc kubenswrapper[4909]: I0202 10:58:24.545950 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b3eb4b1-7b61-4b9d-9787-aae2f11eda57-catalog-content\") pod \"certified-operators-vmn2m\" (UID: \"5b3eb4b1-7b61-4b9d-9787-aae2f11eda57\") " pod="openshift-marketplace/certified-operators-vmn2m" Feb 02 10:58:24 crc kubenswrapper[4909]: I0202 10:58:24.546007 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b3eb4b1-7b61-4b9d-9787-aae2f11eda57-utilities\") pod \"certified-operators-vmn2m\" (UID: 
\"5b3eb4b1-7b61-4b9d-9787-aae2f11eda57\") " pod="openshift-marketplace/certified-operators-vmn2m" Feb 02 10:58:24 crc kubenswrapper[4909]: I0202 10:58:24.565153 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf6ft\" (UniqueName: \"kubernetes.io/projected/5b3eb4b1-7b61-4b9d-9787-aae2f11eda57-kube-api-access-wf6ft\") pod \"certified-operators-vmn2m\" (UID: \"5b3eb4b1-7b61-4b9d-9787-aae2f11eda57\") " pod="openshift-marketplace/certified-operators-vmn2m" Feb 02 10:58:24 crc kubenswrapper[4909]: I0202 10:58:24.651958 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vmn2m" Feb 02 10:58:24 crc kubenswrapper[4909]: I0202 10:58:24.914063 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vmn2m"] Feb 02 10:58:24 crc kubenswrapper[4909]: I0202 10:58:24.967627 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmn2m" event={"ID":"5b3eb4b1-7b61-4b9d-9787-aae2f11eda57","Type":"ContainerStarted","Data":"8ba8db10ae0874cbb2a8685260c06a02c65b92ecc1d5cf137f63e4962f7047cb"} Feb 02 10:58:25 crc kubenswrapper[4909]: I0202 10:58:25.993000 4909 generic.go:334] "Generic (PLEG): container finished" podID="5b3eb4b1-7b61-4b9d-9787-aae2f11eda57" containerID="668c88a8eb1b4bba53f4e26e0e45df52bbd813f09cf93677378dc793ea657a57" exitCode=0 Feb 02 10:58:25 crc kubenswrapper[4909]: I0202 10:58:25.993047 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmn2m" event={"ID":"5b3eb4b1-7b61-4b9d-9787-aae2f11eda57","Type":"ContainerDied","Data":"668c88a8eb1b4bba53f4e26e0e45df52bbd813f09cf93677378dc793ea657a57"} Feb 02 10:58:27 crc kubenswrapper[4909]: I0202 10:58:27.002470 4909 generic.go:334] "Generic (PLEG): container finished" podID="5b3eb4b1-7b61-4b9d-9787-aae2f11eda57" 
containerID="a2967b8dd13ee1ad0dbff88f8b3ba8c1fca6ecbda1bd2b3538f6b74092670077" exitCode=0 Feb 02 10:58:27 crc kubenswrapper[4909]: I0202 10:58:27.002513 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmn2m" event={"ID":"5b3eb4b1-7b61-4b9d-9787-aae2f11eda57","Type":"ContainerDied","Data":"a2967b8dd13ee1ad0dbff88f8b3ba8c1fca6ecbda1bd2b3538f6b74092670077"} Feb 02 10:58:28 crc kubenswrapper[4909]: I0202 10:58:28.011836 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmn2m" event={"ID":"5b3eb4b1-7b61-4b9d-9787-aae2f11eda57","Type":"ContainerStarted","Data":"8f3aa85eb0564da5126c8ffdd69216e06218c4ebac79d2f98094dfdb8d0c0b4f"} Feb 02 10:58:28 crc kubenswrapper[4909]: I0202 10:58:28.030568 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vmn2m" podStartSLOduration=2.582643073 podStartE2EDuration="4.030549453s" podCreationTimestamp="2026-02-02 10:58:24 +0000 UTC" firstStartedPulling="2026-02-02 10:58:25.995443177 +0000 UTC m=+1631.741543912" lastFinishedPulling="2026-02-02 10:58:27.443349557 +0000 UTC m=+1633.189450292" observedRunningTime="2026-02-02 10:58:28.027901278 +0000 UTC m=+1633.774002013" watchObservedRunningTime="2026-02-02 10:58:28.030549453 +0000 UTC m=+1633.776650198" Feb 02 10:58:31 crc kubenswrapper[4909]: I0202 10:58:31.016716 4909 scope.go:117] "RemoveContainer" containerID="a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3" Feb 02 10:58:31 crc kubenswrapper[4909]: E0202 10:58:31.017197 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" 
podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 10:58:34 crc kubenswrapper[4909]: I0202 10:58:34.652840 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vmn2m" Feb 02 10:58:34 crc kubenswrapper[4909]: I0202 10:58:34.653155 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vmn2m" Feb 02 10:58:34 crc kubenswrapper[4909]: I0202 10:58:34.699049 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vmn2m" Feb 02 10:58:35 crc kubenswrapper[4909]: I0202 10:58:35.100535 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vmn2m" Feb 02 10:58:35 crc kubenswrapper[4909]: I0202 10:58:35.145581 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vmn2m"] Feb 02 10:58:37 crc kubenswrapper[4909]: I0202 10:58:37.075329 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vmn2m" podUID="5b3eb4b1-7b61-4b9d-9787-aae2f11eda57" containerName="registry-server" containerID="cri-o://8f3aa85eb0564da5126c8ffdd69216e06218c4ebac79d2f98094dfdb8d0c0b4f" gracePeriod=2 Feb 02 10:58:37 crc kubenswrapper[4909]: I0202 10:58:37.461102 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vmn2m" Feb 02 10:58:37 crc kubenswrapper[4909]: I0202 10:58:37.529061 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wf6ft\" (UniqueName: \"kubernetes.io/projected/5b3eb4b1-7b61-4b9d-9787-aae2f11eda57-kube-api-access-wf6ft\") pod \"5b3eb4b1-7b61-4b9d-9787-aae2f11eda57\" (UID: \"5b3eb4b1-7b61-4b9d-9787-aae2f11eda57\") " Feb 02 10:58:37 crc kubenswrapper[4909]: I0202 10:58:37.529139 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b3eb4b1-7b61-4b9d-9787-aae2f11eda57-utilities\") pod \"5b3eb4b1-7b61-4b9d-9787-aae2f11eda57\" (UID: \"5b3eb4b1-7b61-4b9d-9787-aae2f11eda57\") " Feb 02 10:58:37 crc kubenswrapper[4909]: I0202 10:58:37.529207 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b3eb4b1-7b61-4b9d-9787-aae2f11eda57-catalog-content\") pod \"5b3eb4b1-7b61-4b9d-9787-aae2f11eda57\" (UID: \"5b3eb4b1-7b61-4b9d-9787-aae2f11eda57\") " Feb 02 10:58:37 crc kubenswrapper[4909]: I0202 10:58:37.530064 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b3eb4b1-7b61-4b9d-9787-aae2f11eda57-utilities" (OuterVolumeSpecName: "utilities") pod "5b3eb4b1-7b61-4b9d-9787-aae2f11eda57" (UID: "5b3eb4b1-7b61-4b9d-9787-aae2f11eda57"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:58:37 crc kubenswrapper[4909]: I0202 10:58:37.534945 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b3eb4b1-7b61-4b9d-9787-aae2f11eda57-kube-api-access-wf6ft" (OuterVolumeSpecName: "kube-api-access-wf6ft") pod "5b3eb4b1-7b61-4b9d-9787-aae2f11eda57" (UID: "5b3eb4b1-7b61-4b9d-9787-aae2f11eda57"). InnerVolumeSpecName "kube-api-access-wf6ft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 10:58:37 crc kubenswrapper[4909]: I0202 10:58:37.581345 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b3eb4b1-7b61-4b9d-9787-aae2f11eda57-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5b3eb4b1-7b61-4b9d-9787-aae2f11eda57" (UID: "5b3eb4b1-7b61-4b9d-9787-aae2f11eda57"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 10:58:37 crc kubenswrapper[4909]: I0202 10:58:37.631080 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wf6ft\" (UniqueName: \"kubernetes.io/projected/5b3eb4b1-7b61-4b9d-9787-aae2f11eda57-kube-api-access-wf6ft\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:37 crc kubenswrapper[4909]: I0202 10:58:37.631125 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b3eb4b1-7b61-4b9d-9787-aae2f11eda57-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:37 crc kubenswrapper[4909]: I0202 10:58:37.631139 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b3eb4b1-7b61-4b9d-9787-aae2f11eda57-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 10:58:38 crc kubenswrapper[4909]: I0202 10:58:38.084901 4909 generic.go:334] "Generic (PLEG): container finished" podID="5b3eb4b1-7b61-4b9d-9787-aae2f11eda57" containerID="8f3aa85eb0564da5126c8ffdd69216e06218c4ebac79d2f98094dfdb8d0c0b4f" exitCode=0 Feb 02 10:58:38 crc kubenswrapper[4909]: I0202 10:58:38.084956 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vmn2m" Feb 02 10:58:38 crc kubenswrapper[4909]: I0202 10:58:38.084980 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmn2m" event={"ID":"5b3eb4b1-7b61-4b9d-9787-aae2f11eda57","Type":"ContainerDied","Data":"8f3aa85eb0564da5126c8ffdd69216e06218c4ebac79d2f98094dfdb8d0c0b4f"} Feb 02 10:58:38 crc kubenswrapper[4909]: I0202 10:58:38.085232 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vmn2m" event={"ID":"5b3eb4b1-7b61-4b9d-9787-aae2f11eda57","Type":"ContainerDied","Data":"8ba8db10ae0874cbb2a8685260c06a02c65b92ecc1d5cf137f63e4962f7047cb"} Feb 02 10:58:38 crc kubenswrapper[4909]: I0202 10:58:38.085250 4909 scope.go:117] "RemoveContainer" containerID="8f3aa85eb0564da5126c8ffdd69216e06218c4ebac79d2f98094dfdb8d0c0b4f" Feb 02 10:58:38 crc kubenswrapper[4909]: I0202 10:58:38.109398 4909 scope.go:117] "RemoveContainer" containerID="a2967b8dd13ee1ad0dbff88f8b3ba8c1fca6ecbda1bd2b3538f6b74092670077" Feb 02 10:58:38 crc kubenswrapper[4909]: I0202 10:58:38.121934 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vmn2m"] Feb 02 10:58:38 crc kubenswrapper[4909]: I0202 10:58:38.125574 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vmn2m"] Feb 02 10:58:38 crc kubenswrapper[4909]: I0202 10:58:38.130208 4909 scope.go:117] "RemoveContainer" containerID="668c88a8eb1b4bba53f4e26e0e45df52bbd813f09cf93677378dc793ea657a57" Feb 02 10:58:38 crc kubenswrapper[4909]: I0202 10:58:38.151712 4909 scope.go:117] "RemoveContainer" containerID="8f3aa85eb0564da5126c8ffdd69216e06218c4ebac79d2f98094dfdb8d0c0b4f" Feb 02 10:58:38 crc kubenswrapper[4909]: E0202 10:58:38.152140 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8f3aa85eb0564da5126c8ffdd69216e06218c4ebac79d2f98094dfdb8d0c0b4f\": container with ID starting with 8f3aa85eb0564da5126c8ffdd69216e06218c4ebac79d2f98094dfdb8d0c0b4f not found: ID does not exist" containerID="8f3aa85eb0564da5126c8ffdd69216e06218c4ebac79d2f98094dfdb8d0c0b4f" Feb 02 10:58:38 crc kubenswrapper[4909]: I0202 10:58:38.152178 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f3aa85eb0564da5126c8ffdd69216e06218c4ebac79d2f98094dfdb8d0c0b4f"} err="failed to get container status \"8f3aa85eb0564da5126c8ffdd69216e06218c4ebac79d2f98094dfdb8d0c0b4f\": rpc error: code = NotFound desc = could not find container \"8f3aa85eb0564da5126c8ffdd69216e06218c4ebac79d2f98094dfdb8d0c0b4f\": container with ID starting with 8f3aa85eb0564da5126c8ffdd69216e06218c4ebac79d2f98094dfdb8d0c0b4f not found: ID does not exist" Feb 02 10:58:38 crc kubenswrapper[4909]: I0202 10:58:38.152229 4909 scope.go:117] "RemoveContainer" containerID="a2967b8dd13ee1ad0dbff88f8b3ba8c1fca6ecbda1bd2b3538f6b74092670077" Feb 02 10:58:38 crc kubenswrapper[4909]: E0202 10:58:38.152598 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2967b8dd13ee1ad0dbff88f8b3ba8c1fca6ecbda1bd2b3538f6b74092670077\": container with ID starting with a2967b8dd13ee1ad0dbff88f8b3ba8c1fca6ecbda1bd2b3538f6b74092670077 not found: ID does not exist" containerID="a2967b8dd13ee1ad0dbff88f8b3ba8c1fca6ecbda1bd2b3538f6b74092670077" Feb 02 10:58:38 crc kubenswrapper[4909]: I0202 10:58:38.152631 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2967b8dd13ee1ad0dbff88f8b3ba8c1fca6ecbda1bd2b3538f6b74092670077"} err="failed to get container status \"a2967b8dd13ee1ad0dbff88f8b3ba8c1fca6ecbda1bd2b3538f6b74092670077\": rpc error: code = NotFound desc = could not find container \"a2967b8dd13ee1ad0dbff88f8b3ba8c1fca6ecbda1bd2b3538f6b74092670077\": container with ID 
starting with a2967b8dd13ee1ad0dbff88f8b3ba8c1fca6ecbda1bd2b3538f6b74092670077 not found: ID does not exist" Feb 02 10:58:38 crc kubenswrapper[4909]: I0202 10:58:38.152654 4909 scope.go:117] "RemoveContainer" containerID="668c88a8eb1b4bba53f4e26e0e45df52bbd813f09cf93677378dc793ea657a57" Feb 02 10:58:38 crc kubenswrapper[4909]: E0202 10:58:38.153560 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"668c88a8eb1b4bba53f4e26e0e45df52bbd813f09cf93677378dc793ea657a57\": container with ID starting with 668c88a8eb1b4bba53f4e26e0e45df52bbd813f09cf93677378dc793ea657a57 not found: ID does not exist" containerID="668c88a8eb1b4bba53f4e26e0e45df52bbd813f09cf93677378dc793ea657a57" Feb 02 10:58:38 crc kubenswrapper[4909]: I0202 10:58:38.153584 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"668c88a8eb1b4bba53f4e26e0e45df52bbd813f09cf93677378dc793ea657a57"} err="failed to get container status \"668c88a8eb1b4bba53f4e26e0e45df52bbd813f09cf93677378dc793ea657a57\": rpc error: code = NotFound desc = could not find container \"668c88a8eb1b4bba53f4e26e0e45df52bbd813f09cf93677378dc793ea657a57\": container with ID starting with 668c88a8eb1b4bba53f4e26e0e45df52bbd813f09cf93677378dc793ea657a57 not found: ID does not exist" Feb 02 10:58:39 crc kubenswrapper[4909]: I0202 10:58:39.023701 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b3eb4b1-7b61-4b9d-9787-aae2f11eda57" path="/var/lib/kubelet/pods/5b3eb4b1-7b61-4b9d-9787-aae2f11eda57/volumes" Feb 02 10:58:43 crc kubenswrapper[4909]: I0202 10:58:43.016340 4909 scope.go:117] "RemoveContainer" containerID="a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3" Feb 02 10:58:43 crc kubenswrapper[4909]: E0202 10:58:43.016874 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 10:58:57 crc kubenswrapper[4909]: I0202 10:58:57.016991 4909 scope.go:117] "RemoveContainer" containerID="a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3" Feb 02 10:58:57 crc kubenswrapper[4909]: E0202 10:58:57.018392 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 10:59:12 crc kubenswrapper[4909]: I0202 10:59:12.016787 4909 scope.go:117] "RemoveContainer" containerID="a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3" Feb 02 10:59:12 crc kubenswrapper[4909]: E0202 10:59:12.019632 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 10:59:18 crc kubenswrapper[4909]: I0202 10:59:18.706913 4909 scope.go:117] "RemoveContainer" containerID="e94e4dc6904fd62932a918a1984c4c0852316abca98ec3e71a4141a0396799d0" Feb 02 10:59:18 crc kubenswrapper[4909]: I0202 10:59:18.749111 4909 scope.go:117] "RemoveContainer" containerID="ecc6af1fec99e4b5728487ec159109b228720e7e811d71b0bc2fd00d2e8a68cc" Feb 02 10:59:18 crc 
kubenswrapper[4909]: I0202 10:59:18.780915 4909 scope.go:117] "RemoveContainer" containerID="a4aa6b07c367fc7cfccceb7d438ff24f47d22ec8fb03f3acd437ee89b27c8ba1" Feb 02 10:59:18 crc kubenswrapper[4909]: I0202 10:59:18.797851 4909 scope.go:117] "RemoveContainer" containerID="1ea9ba4798372399476cd86a9d7248e0dc1ac843d30751cd5067d09ad4963c62" Feb 02 10:59:18 crc kubenswrapper[4909]: I0202 10:59:18.814574 4909 scope.go:117] "RemoveContainer" containerID="3f7435b37c532d142cc3a2e44d9a5f007446b4a37f3b1f4607541c257406cba7" Feb 02 10:59:18 crc kubenswrapper[4909]: I0202 10:59:18.837833 4909 scope.go:117] "RemoveContainer" containerID="174be2f9179aec2d0d5dfb3b6c0efd4bcdf236b7cca281a9e2eacfe94339df84" Feb 02 10:59:18 crc kubenswrapper[4909]: I0202 10:59:18.883788 4909 scope.go:117] "RemoveContainer" containerID="806bbf2f7813821e387fd50d1030b913b4e7caeb65653bb90e0d90a42508f01e" Feb 02 10:59:18 crc kubenswrapper[4909]: I0202 10:59:18.900842 4909 scope.go:117] "RemoveContainer" containerID="e73ee1c3eb5c48046de751f50517094408c4995c4f0ca716981bc10d3c559f1f" Feb 02 10:59:18 crc kubenswrapper[4909]: I0202 10:59:18.929975 4909 scope.go:117] "RemoveContainer" containerID="b05c5b98d511e2a3da5be2387945a639c6a9aff97a09b61b93ceef5e4e6bd022" Feb 02 10:59:18 crc kubenswrapper[4909]: I0202 10:59:18.944973 4909 scope.go:117] "RemoveContainer" containerID="79243ba3eb5d42f1bd0f138c616462a4f129bde3f205dc9026f0724a065bb0f7" Feb 02 10:59:25 crc kubenswrapper[4909]: I0202 10:59:25.025564 4909 scope.go:117] "RemoveContainer" containerID="a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3" Feb 02 10:59:25 crc kubenswrapper[4909]: E0202 10:59:25.026424 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 10:59:36 crc kubenswrapper[4909]: I0202 10:59:36.016669 4909 scope.go:117] "RemoveContainer" containerID="a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3" Feb 02 10:59:36 crc kubenswrapper[4909]: E0202 10:59:36.017465 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 10:59:49 crc kubenswrapper[4909]: I0202 10:59:49.016743 4909 scope.go:117] "RemoveContainer" containerID="a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3" Feb 02 10:59:49 crc kubenswrapper[4909]: E0202 10:59:49.018048 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:00:00 crc kubenswrapper[4909]: I0202 11:00:00.140627 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500500-dr6ld"] Feb 02 11:00:00 crc kubenswrapper[4909]: E0202 11:00:00.141656 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b3eb4b1-7b61-4b9d-9787-aae2f11eda57" containerName="registry-server" Feb 02 11:00:00 crc kubenswrapper[4909]: I0202 11:00:00.141677 4909 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5b3eb4b1-7b61-4b9d-9787-aae2f11eda57" containerName="registry-server" Feb 02 11:00:00 crc kubenswrapper[4909]: E0202 11:00:00.141705 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b3eb4b1-7b61-4b9d-9787-aae2f11eda57" containerName="extract-content" Feb 02 11:00:00 crc kubenswrapper[4909]: I0202 11:00:00.141715 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b3eb4b1-7b61-4b9d-9787-aae2f11eda57" containerName="extract-content" Feb 02 11:00:00 crc kubenswrapper[4909]: E0202 11:00:00.141733 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b3eb4b1-7b61-4b9d-9787-aae2f11eda57" containerName="extract-utilities" Feb 02 11:00:00 crc kubenswrapper[4909]: I0202 11:00:00.141744 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b3eb4b1-7b61-4b9d-9787-aae2f11eda57" containerName="extract-utilities" Feb 02 11:00:00 crc kubenswrapper[4909]: I0202 11:00:00.141994 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b3eb4b1-7b61-4b9d-9787-aae2f11eda57" containerName="registry-server" Feb 02 11:00:00 crc kubenswrapper[4909]: I0202 11:00:00.142693 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-dr6ld" Feb 02 11:00:00 crc kubenswrapper[4909]: I0202 11:00:00.145389 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 02 11:00:00 crc kubenswrapper[4909]: I0202 11:00:00.145796 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 02 11:00:00 crc kubenswrapper[4909]: I0202 11:00:00.148363 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500500-dr6ld"] Feb 02 11:00:00 crc kubenswrapper[4909]: I0202 11:00:00.222993 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5908b427-7fb4-4548-bb65-68c1f0f2ec7e-secret-volume\") pod \"collect-profiles-29500500-dr6ld\" (UID: \"5908b427-7fb4-4548-bb65-68c1f0f2ec7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-dr6ld" Feb 02 11:00:00 crc kubenswrapper[4909]: I0202 11:00:00.223314 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwwcb\" (UniqueName: \"kubernetes.io/projected/5908b427-7fb4-4548-bb65-68c1f0f2ec7e-kube-api-access-mwwcb\") pod \"collect-profiles-29500500-dr6ld\" (UID: \"5908b427-7fb4-4548-bb65-68c1f0f2ec7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-dr6ld" Feb 02 11:00:00 crc kubenswrapper[4909]: I0202 11:00:00.223347 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5908b427-7fb4-4548-bb65-68c1f0f2ec7e-config-volume\") pod \"collect-profiles-29500500-dr6ld\" (UID: \"5908b427-7fb4-4548-bb65-68c1f0f2ec7e\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-dr6ld" Feb 02 11:00:00 crc kubenswrapper[4909]: I0202 11:00:00.324368 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5908b427-7fb4-4548-bb65-68c1f0f2ec7e-secret-volume\") pod \"collect-profiles-29500500-dr6ld\" (UID: \"5908b427-7fb4-4548-bb65-68c1f0f2ec7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-dr6ld" Feb 02 11:00:00 crc kubenswrapper[4909]: I0202 11:00:00.324432 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwwcb\" (UniqueName: \"kubernetes.io/projected/5908b427-7fb4-4548-bb65-68c1f0f2ec7e-kube-api-access-mwwcb\") pod \"collect-profiles-29500500-dr6ld\" (UID: \"5908b427-7fb4-4548-bb65-68c1f0f2ec7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-dr6ld" Feb 02 11:00:00 crc kubenswrapper[4909]: I0202 11:00:00.324471 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5908b427-7fb4-4548-bb65-68c1f0f2ec7e-config-volume\") pod \"collect-profiles-29500500-dr6ld\" (UID: \"5908b427-7fb4-4548-bb65-68c1f0f2ec7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-dr6ld" Feb 02 11:00:00 crc kubenswrapper[4909]: I0202 11:00:00.325426 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5908b427-7fb4-4548-bb65-68c1f0f2ec7e-config-volume\") pod \"collect-profiles-29500500-dr6ld\" (UID: \"5908b427-7fb4-4548-bb65-68c1f0f2ec7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-dr6ld" Feb 02 11:00:00 crc kubenswrapper[4909]: I0202 11:00:00.338880 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/5908b427-7fb4-4548-bb65-68c1f0f2ec7e-secret-volume\") pod \"collect-profiles-29500500-dr6ld\" (UID: \"5908b427-7fb4-4548-bb65-68c1f0f2ec7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-dr6ld" Feb 02 11:00:00 crc kubenswrapper[4909]: I0202 11:00:00.341401 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwwcb\" (UniqueName: \"kubernetes.io/projected/5908b427-7fb4-4548-bb65-68c1f0f2ec7e-kube-api-access-mwwcb\") pod \"collect-profiles-29500500-dr6ld\" (UID: \"5908b427-7fb4-4548-bb65-68c1f0f2ec7e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-dr6ld" Feb 02 11:00:00 crc kubenswrapper[4909]: I0202 11:00:00.463046 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-dr6ld" Feb 02 11:00:00 crc kubenswrapper[4909]: I0202 11:00:00.856933 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500500-dr6ld"] Feb 02 11:00:00 crc kubenswrapper[4909]: W0202 11:00:00.868991 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5908b427_7fb4_4548_bb65_68c1f0f2ec7e.slice/crio-ae0a012e69c7df19b871a70de4491bbff02db0ef2baec7a8eb6247f0659f8b19 WatchSource:0}: Error finding container ae0a012e69c7df19b871a70de4491bbff02db0ef2baec7a8eb6247f0659f8b19: Status 404 returned error can't find the container with id ae0a012e69c7df19b871a70de4491bbff02db0ef2baec7a8eb6247f0659f8b19 Feb 02 11:00:01 crc kubenswrapper[4909]: I0202 11:00:01.661560 4909 generic.go:334] "Generic (PLEG): container finished" podID="5908b427-7fb4-4548-bb65-68c1f0f2ec7e" containerID="de856560113eeb3fb23d80f25fdd30eb5948e9ee2d5c213f010b5d04ef2e8877" exitCode=0 Feb 02 11:00:01 crc kubenswrapper[4909]: I0202 11:00:01.661621 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-dr6ld" event={"ID":"5908b427-7fb4-4548-bb65-68c1f0f2ec7e","Type":"ContainerDied","Data":"de856560113eeb3fb23d80f25fdd30eb5948e9ee2d5c213f010b5d04ef2e8877"} Feb 02 11:00:01 crc kubenswrapper[4909]: I0202 11:00:01.661873 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-dr6ld" event={"ID":"5908b427-7fb4-4548-bb65-68c1f0f2ec7e","Type":"ContainerStarted","Data":"ae0a012e69c7df19b871a70de4491bbff02db0ef2baec7a8eb6247f0659f8b19"} Feb 02 11:00:02 crc kubenswrapper[4909]: I0202 11:00:02.016783 4909 scope.go:117] "RemoveContainer" containerID="a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3" Feb 02 11:00:02 crc kubenswrapper[4909]: E0202 11:00:02.017129 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:00:02 crc kubenswrapper[4909]: I0202 11:00:02.903286 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-dr6ld" Feb 02 11:00:02 crc kubenswrapper[4909]: I0202 11:00:02.957773 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwwcb\" (UniqueName: \"kubernetes.io/projected/5908b427-7fb4-4548-bb65-68c1f0f2ec7e-kube-api-access-mwwcb\") pod \"5908b427-7fb4-4548-bb65-68c1f0f2ec7e\" (UID: \"5908b427-7fb4-4548-bb65-68c1f0f2ec7e\") " Feb 02 11:00:02 crc kubenswrapper[4909]: I0202 11:00:02.957890 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5908b427-7fb4-4548-bb65-68c1f0f2ec7e-config-volume\") pod \"5908b427-7fb4-4548-bb65-68c1f0f2ec7e\" (UID: \"5908b427-7fb4-4548-bb65-68c1f0f2ec7e\") " Feb 02 11:00:02 crc kubenswrapper[4909]: I0202 11:00:02.957934 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5908b427-7fb4-4548-bb65-68c1f0f2ec7e-secret-volume\") pod \"5908b427-7fb4-4548-bb65-68c1f0f2ec7e\" (UID: \"5908b427-7fb4-4548-bb65-68c1f0f2ec7e\") " Feb 02 11:00:02 crc kubenswrapper[4909]: I0202 11:00:02.959014 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5908b427-7fb4-4548-bb65-68c1f0f2ec7e-config-volume" (OuterVolumeSpecName: "config-volume") pod "5908b427-7fb4-4548-bb65-68c1f0f2ec7e" (UID: "5908b427-7fb4-4548-bb65-68c1f0f2ec7e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:00:02 crc kubenswrapper[4909]: I0202 11:00:02.963036 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5908b427-7fb4-4548-bb65-68c1f0f2ec7e-kube-api-access-mwwcb" (OuterVolumeSpecName: "kube-api-access-mwwcb") pod "5908b427-7fb4-4548-bb65-68c1f0f2ec7e" (UID: "5908b427-7fb4-4548-bb65-68c1f0f2ec7e"). 
InnerVolumeSpecName "kube-api-access-mwwcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:00:02 crc kubenswrapper[4909]: I0202 11:00:02.963109 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5908b427-7fb4-4548-bb65-68c1f0f2ec7e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5908b427-7fb4-4548-bb65-68c1f0f2ec7e" (UID: "5908b427-7fb4-4548-bb65-68c1f0f2ec7e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:00:03 crc kubenswrapper[4909]: I0202 11:00:03.059826 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5908b427-7fb4-4548-bb65-68c1f0f2ec7e-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:03 crc kubenswrapper[4909]: I0202 11:00:03.059856 4909 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5908b427-7fb4-4548-bb65-68c1f0f2ec7e-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:03 crc kubenswrapper[4909]: I0202 11:00:03.059867 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwwcb\" (UniqueName: \"kubernetes.io/projected/5908b427-7fb4-4548-bb65-68c1f0f2ec7e-kube-api-access-mwwcb\") on node \"crc\" DevicePath \"\"" Feb 02 11:00:03 crc kubenswrapper[4909]: I0202 11:00:03.676615 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-dr6ld" event={"ID":"5908b427-7fb4-4548-bb65-68c1f0f2ec7e","Type":"ContainerDied","Data":"ae0a012e69c7df19b871a70de4491bbff02db0ef2baec7a8eb6247f0659f8b19"} Feb 02 11:00:03 crc kubenswrapper[4909]: I0202 11:00:03.676658 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae0a012e69c7df19b871a70de4491bbff02db0ef2baec7a8eb6247f0659f8b19" Feb 02 11:00:03 crc kubenswrapper[4909]: I0202 11:00:03.676680 4909 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500500-dr6ld" Feb 02 11:00:16 crc kubenswrapper[4909]: I0202 11:00:16.016063 4909 scope.go:117] "RemoveContainer" containerID="a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3" Feb 02 11:00:16 crc kubenswrapper[4909]: E0202 11:00:16.016712 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:00:19 crc kubenswrapper[4909]: I0202 11:00:19.070801 4909 scope.go:117] "RemoveContainer" containerID="a20507244041d23b32f190d2a807a0eee356b71acfba50ced33b0ba77be75a3b" Feb 02 11:00:19 crc kubenswrapper[4909]: I0202 11:00:19.094772 4909 scope.go:117] "RemoveContainer" containerID="428fe0a93c4288c4b421df226520af6098fbd2af1d876fd098d9c807b6800246" Feb 02 11:00:19 crc kubenswrapper[4909]: I0202 11:00:19.138039 4909 scope.go:117] "RemoveContainer" containerID="1a3c44ae75c0e82a41ac845e6bc75bd78044db6d50d81a3cf25783da569108b8" Feb 02 11:00:19 crc kubenswrapper[4909]: I0202 11:00:19.154885 4909 scope.go:117] "RemoveContainer" containerID="7f5547c30d19b6a3e624654a6b30546acaaa5bb4e35836a31ce664cbd4d3448e" Feb 02 11:00:19 crc kubenswrapper[4909]: I0202 11:00:19.173649 4909 scope.go:117] "RemoveContainer" containerID="dfe3f218431d62cbf5802f23a9e8f8dfdc1b311556035b91ad8568910dc22027" Feb 02 11:00:19 crc kubenswrapper[4909]: I0202 11:00:19.190984 4909 scope.go:117] "RemoveContainer" containerID="a5429c7323d593269c829013791c5ec02f6c2311d2ea72699e259a5d72454bac" Feb 02 11:00:19 crc kubenswrapper[4909]: I0202 11:00:19.207991 4909 scope.go:117] "RemoveContainer" 
containerID="9146b7fe892e3b1b3b9341aaa935f843c8006e1d193949290905af61bbfd863a" Feb 02 11:00:29 crc kubenswrapper[4909]: I0202 11:00:29.016728 4909 scope.go:117] "RemoveContainer" containerID="a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3" Feb 02 11:00:29 crc kubenswrapper[4909]: E0202 11:00:29.017459 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:00:42 crc kubenswrapper[4909]: I0202 11:00:42.017152 4909 scope.go:117] "RemoveContainer" containerID="a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3" Feb 02 11:00:42 crc kubenswrapper[4909]: E0202 11:00:42.017906 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:00:57 crc kubenswrapper[4909]: I0202 11:00:57.016299 4909 scope.go:117] "RemoveContainer" containerID="a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3" Feb 02 11:00:57 crc kubenswrapper[4909]: E0202 11:00:57.017089 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:01:08 crc kubenswrapper[4909]: I0202 11:01:08.016685 4909 scope.go:117] "RemoveContainer" containerID="a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3" Feb 02 11:01:08 crc kubenswrapper[4909]: E0202 11:01:08.017431 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:01:19 crc kubenswrapper[4909]: I0202 11:01:19.288630 4909 scope.go:117] "RemoveContainer" containerID="00d102ada4d1a621cc75e76f5d11972c92c765cc655c0852af742709db540a93" Feb 02 11:01:21 crc kubenswrapper[4909]: I0202 11:01:21.017355 4909 scope.go:117] "RemoveContainer" containerID="a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3" Feb 02 11:01:21 crc kubenswrapper[4909]: E0202 11:01:21.018181 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:01:36 crc kubenswrapper[4909]: I0202 11:01:36.016067 4909 scope.go:117] "RemoveContainer" containerID="a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3" Feb 02 11:01:36 crc kubenswrapper[4909]: E0202 11:01:36.016821 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:01:48 crc kubenswrapper[4909]: I0202 11:01:48.016307 4909 scope.go:117] "RemoveContainer" containerID="a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3" Feb 02 11:01:48 crc kubenswrapper[4909]: E0202 11:01:48.017073 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:02:00 crc kubenswrapper[4909]: I0202 11:02:00.017305 4909 scope.go:117] "RemoveContainer" containerID="a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3" Feb 02 11:02:00 crc kubenswrapper[4909]: I0202 11:02:00.472771 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"9767fc57bc0ed3d797c691f994688f78f00d227a02855739406583aff93fd8c2"} Feb 02 11:04:16 crc kubenswrapper[4909]: I0202 11:04:16.370215 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-92w2b"] Feb 02 11:04:16 crc kubenswrapper[4909]: E0202 11:04:16.371067 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5908b427-7fb4-4548-bb65-68c1f0f2ec7e" containerName="collect-profiles" Feb 02 11:04:16 crc kubenswrapper[4909]: I0202 11:04:16.371149 4909 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5908b427-7fb4-4548-bb65-68c1f0f2ec7e" containerName="collect-profiles" Feb 02 11:04:16 crc kubenswrapper[4909]: I0202 11:04:16.371292 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5908b427-7fb4-4548-bb65-68c1f0f2ec7e" containerName="collect-profiles" Feb 02 11:04:16 crc kubenswrapper[4909]: I0202 11:04:16.372226 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-92w2b" Feb 02 11:04:16 crc kubenswrapper[4909]: I0202 11:04:16.387734 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-92w2b"] Feb 02 11:04:16 crc kubenswrapper[4909]: I0202 11:04:16.434567 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpw9m\" (UniqueName: \"kubernetes.io/projected/fe3d03cd-3fce-4986-8c4c-2956373475ea-kube-api-access-wpw9m\") pod \"redhat-marketplace-92w2b\" (UID: \"fe3d03cd-3fce-4986-8c4c-2956373475ea\") " pod="openshift-marketplace/redhat-marketplace-92w2b" Feb 02 11:04:16 crc kubenswrapper[4909]: I0202 11:04:16.434642 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe3d03cd-3fce-4986-8c4c-2956373475ea-utilities\") pod \"redhat-marketplace-92w2b\" (UID: \"fe3d03cd-3fce-4986-8c4c-2956373475ea\") " pod="openshift-marketplace/redhat-marketplace-92w2b" Feb 02 11:04:16 crc kubenswrapper[4909]: I0202 11:04:16.434709 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe3d03cd-3fce-4986-8c4c-2956373475ea-catalog-content\") pod \"redhat-marketplace-92w2b\" (UID: \"fe3d03cd-3fce-4986-8c4c-2956373475ea\") " pod="openshift-marketplace/redhat-marketplace-92w2b" Feb 02 11:04:16 crc kubenswrapper[4909]: I0202 11:04:16.536220 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wpw9m\" (UniqueName: \"kubernetes.io/projected/fe3d03cd-3fce-4986-8c4c-2956373475ea-kube-api-access-wpw9m\") pod \"redhat-marketplace-92w2b\" (UID: \"fe3d03cd-3fce-4986-8c4c-2956373475ea\") " pod="openshift-marketplace/redhat-marketplace-92w2b" Feb 02 11:04:16 crc kubenswrapper[4909]: I0202 11:04:16.536275 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe3d03cd-3fce-4986-8c4c-2956373475ea-utilities\") pod \"redhat-marketplace-92w2b\" (UID: \"fe3d03cd-3fce-4986-8c4c-2956373475ea\") " pod="openshift-marketplace/redhat-marketplace-92w2b" Feb 02 11:04:16 crc kubenswrapper[4909]: I0202 11:04:16.536307 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe3d03cd-3fce-4986-8c4c-2956373475ea-catalog-content\") pod \"redhat-marketplace-92w2b\" (UID: \"fe3d03cd-3fce-4986-8c4c-2956373475ea\") " pod="openshift-marketplace/redhat-marketplace-92w2b" Feb 02 11:04:16 crc kubenswrapper[4909]: I0202 11:04:16.536945 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe3d03cd-3fce-4986-8c4c-2956373475ea-utilities\") pod \"redhat-marketplace-92w2b\" (UID: \"fe3d03cd-3fce-4986-8c4c-2956373475ea\") " pod="openshift-marketplace/redhat-marketplace-92w2b" Feb 02 11:04:16 crc kubenswrapper[4909]: I0202 11:04:16.536985 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe3d03cd-3fce-4986-8c4c-2956373475ea-catalog-content\") pod \"redhat-marketplace-92w2b\" (UID: \"fe3d03cd-3fce-4986-8c4c-2956373475ea\") " pod="openshift-marketplace/redhat-marketplace-92w2b" Feb 02 11:04:16 crc kubenswrapper[4909]: I0202 11:04:16.555846 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wpw9m\" (UniqueName: \"kubernetes.io/projected/fe3d03cd-3fce-4986-8c4c-2956373475ea-kube-api-access-wpw9m\") pod \"redhat-marketplace-92w2b\" (UID: \"fe3d03cd-3fce-4986-8c4c-2956373475ea\") " pod="openshift-marketplace/redhat-marketplace-92w2b" Feb 02 11:04:16 crc kubenswrapper[4909]: I0202 11:04:16.695129 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-92w2b" Feb 02 11:04:17 crc kubenswrapper[4909]: I0202 11:04:17.131413 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-92w2b"] Feb 02 11:04:17 crc kubenswrapper[4909]: I0202 11:04:17.392074 4909 generic.go:334] "Generic (PLEG): container finished" podID="fe3d03cd-3fce-4986-8c4c-2956373475ea" containerID="844219de9a743aeaa26aadcf9817d71d5b1f2ef25486c8f9a44f159a019ef308" exitCode=0 Feb 02 11:04:17 crc kubenswrapper[4909]: I0202 11:04:17.392124 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-92w2b" event={"ID":"fe3d03cd-3fce-4986-8c4c-2956373475ea","Type":"ContainerDied","Data":"844219de9a743aeaa26aadcf9817d71d5b1f2ef25486c8f9a44f159a019ef308"} Feb 02 11:04:17 crc kubenswrapper[4909]: I0202 11:04:17.392189 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-92w2b" event={"ID":"fe3d03cd-3fce-4986-8c4c-2956373475ea","Type":"ContainerStarted","Data":"45fc1b0e9c6ddaa5601ec5b4dcd6e4bd253f6439b685df42a2108c6485e6ba97"} Feb 02 11:04:17 crc kubenswrapper[4909]: I0202 11:04:17.400329 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 11:04:18 crc kubenswrapper[4909]: I0202 11:04:18.402165 4909 generic.go:334] "Generic (PLEG): container finished" podID="fe3d03cd-3fce-4986-8c4c-2956373475ea" containerID="a7c10c03f8443ec578b07fb3b57dfb88a4e088d0cd10bf6153042c1b11f2c034" exitCode=0 Feb 02 11:04:18 crc kubenswrapper[4909]: I0202 
11:04:18.402251 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-92w2b" event={"ID":"fe3d03cd-3fce-4986-8c4c-2956373475ea","Type":"ContainerDied","Data":"a7c10c03f8443ec578b07fb3b57dfb88a4e088d0cd10bf6153042c1b11f2c034"} Feb 02 11:04:19 crc kubenswrapper[4909]: I0202 11:04:19.411176 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-92w2b" event={"ID":"fe3d03cd-3fce-4986-8c4c-2956373475ea","Type":"ContainerStarted","Data":"a82edd166fb1d836e6963934855124a43ac4c1644396bc7eb4d2aabba3f3727b"} Feb 02 11:04:19 crc kubenswrapper[4909]: I0202 11:04:19.436076 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-92w2b" podStartSLOduration=2.005282216 podStartE2EDuration="3.436056537s" podCreationTimestamp="2026-02-02 11:04:16 +0000 UTC" firstStartedPulling="2026-02-02 11:04:17.399952101 +0000 UTC m=+1983.146052836" lastFinishedPulling="2026-02-02 11:04:18.830726422 +0000 UTC m=+1984.576827157" observedRunningTime="2026-02-02 11:04:19.430139648 +0000 UTC m=+1985.176240383" watchObservedRunningTime="2026-02-02 11:04:19.436056537 +0000 UTC m=+1985.182157272" Feb 02 11:04:19 crc kubenswrapper[4909]: I0202 11:04:19.511140 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:04:19 crc kubenswrapper[4909]: I0202 11:04:19.511195 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:04:21 crc 
kubenswrapper[4909]: I0202 11:04:21.787984 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mz9tz"] Feb 02 11:04:21 crc kubenswrapper[4909]: I0202 11:04:21.793221 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mz9tz"] Feb 02 11:04:21 crc kubenswrapper[4909]: I0202 11:04:21.793355 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mz9tz" Feb 02 11:04:21 crc kubenswrapper[4909]: I0202 11:04:21.917569 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dd444e4-debc-49d2-a673-2739d78fbbf3-catalog-content\") pod \"community-operators-mz9tz\" (UID: \"7dd444e4-debc-49d2-a673-2739d78fbbf3\") " pod="openshift-marketplace/community-operators-mz9tz" Feb 02 11:04:21 crc kubenswrapper[4909]: I0202 11:04:21.917632 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7ln2\" (UniqueName: \"kubernetes.io/projected/7dd444e4-debc-49d2-a673-2739d78fbbf3-kube-api-access-v7ln2\") pod \"community-operators-mz9tz\" (UID: \"7dd444e4-debc-49d2-a673-2739d78fbbf3\") " pod="openshift-marketplace/community-operators-mz9tz" Feb 02 11:04:21 crc kubenswrapper[4909]: I0202 11:04:21.917652 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dd444e4-debc-49d2-a673-2739d78fbbf3-utilities\") pod \"community-operators-mz9tz\" (UID: \"7dd444e4-debc-49d2-a673-2739d78fbbf3\") " pod="openshift-marketplace/community-operators-mz9tz" Feb 02 11:04:22 crc kubenswrapper[4909]: I0202 11:04:22.019247 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/7dd444e4-debc-49d2-a673-2739d78fbbf3-utilities\") pod \"community-operators-mz9tz\" (UID: \"7dd444e4-debc-49d2-a673-2739d78fbbf3\") " pod="openshift-marketplace/community-operators-mz9tz" Feb 02 11:04:22 crc kubenswrapper[4909]: I0202 11:04:22.019402 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dd444e4-debc-49d2-a673-2739d78fbbf3-catalog-content\") pod \"community-operators-mz9tz\" (UID: \"7dd444e4-debc-49d2-a673-2739d78fbbf3\") " pod="openshift-marketplace/community-operators-mz9tz" Feb 02 11:04:22 crc kubenswrapper[4909]: I0202 11:04:22.019452 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7ln2\" (UniqueName: \"kubernetes.io/projected/7dd444e4-debc-49d2-a673-2739d78fbbf3-kube-api-access-v7ln2\") pod \"community-operators-mz9tz\" (UID: \"7dd444e4-debc-49d2-a673-2739d78fbbf3\") " pod="openshift-marketplace/community-operators-mz9tz" Feb 02 11:04:22 crc kubenswrapper[4909]: I0202 11:04:22.019849 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dd444e4-debc-49d2-a673-2739d78fbbf3-utilities\") pod \"community-operators-mz9tz\" (UID: \"7dd444e4-debc-49d2-a673-2739d78fbbf3\") " pod="openshift-marketplace/community-operators-mz9tz" Feb 02 11:04:22 crc kubenswrapper[4909]: I0202 11:04:22.019868 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dd444e4-debc-49d2-a673-2739d78fbbf3-catalog-content\") pod \"community-operators-mz9tz\" (UID: \"7dd444e4-debc-49d2-a673-2739d78fbbf3\") " pod="openshift-marketplace/community-operators-mz9tz" Feb 02 11:04:22 crc kubenswrapper[4909]: I0202 11:04:22.044968 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7ln2\" (UniqueName: 
\"kubernetes.io/projected/7dd444e4-debc-49d2-a673-2739d78fbbf3-kube-api-access-v7ln2\") pod \"community-operators-mz9tz\" (UID: \"7dd444e4-debc-49d2-a673-2739d78fbbf3\") " pod="openshift-marketplace/community-operators-mz9tz" Feb 02 11:04:22 crc kubenswrapper[4909]: I0202 11:04:22.143327 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mz9tz" Feb 02 11:04:22 crc kubenswrapper[4909]: I0202 11:04:22.612707 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mz9tz"] Feb 02 11:04:23 crc kubenswrapper[4909]: I0202 11:04:23.443402 4909 generic.go:334] "Generic (PLEG): container finished" podID="7dd444e4-debc-49d2-a673-2739d78fbbf3" containerID="a5517b11a2a28fdde5258f83f7d11fd8587affa1817d864bfcd0641aec21d509" exitCode=0 Feb 02 11:04:23 crc kubenswrapper[4909]: I0202 11:04:23.443456 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mz9tz" event={"ID":"7dd444e4-debc-49d2-a673-2739d78fbbf3","Type":"ContainerDied","Data":"a5517b11a2a28fdde5258f83f7d11fd8587affa1817d864bfcd0641aec21d509"} Feb 02 11:04:23 crc kubenswrapper[4909]: I0202 11:04:23.443501 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mz9tz" event={"ID":"7dd444e4-debc-49d2-a673-2739d78fbbf3","Type":"ContainerStarted","Data":"9b2584f99db443c01d66b8d746990821ab2ef35de4e35e73226568922d624ec9"} Feb 02 11:04:24 crc kubenswrapper[4909]: I0202 11:04:24.453133 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mz9tz" event={"ID":"7dd444e4-debc-49d2-a673-2739d78fbbf3","Type":"ContainerStarted","Data":"41eb4333644f12c3ace9028967535a42c51e2fc07127605864b450317b7eacb9"} Feb 02 11:04:25 crc kubenswrapper[4909]: I0202 11:04:25.462900 4909 generic.go:334] "Generic (PLEG): container finished" podID="7dd444e4-debc-49d2-a673-2739d78fbbf3" 
containerID="41eb4333644f12c3ace9028967535a42c51e2fc07127605864b450317b7eacb9" exitCode=0 Feb 02 11:04:25 crc kubenswrapper[4909]: I0202 11:04:25.462971 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mz9tz" event={"ID":"7dd444e4-debc-49d2-a673-2739d78fbbf3","Type":"ContainerDied","Data":"41eb4333644f12c3ace9028967535a42c51e2fc07127605864b450317b7eacb9"} Feb 02 11:04:26 crc kubenswrapper[4909]: I0202 11:04:26.473660 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mz9tz" event={"ID":"7dd444e4-debc-49d2-a673-2739d78fbbf3","Type":"ContainerStarted","Data":"10d4e4b81baaa985d43d48d7b725fec17e01d36752443f2a7f86ba44a72a0a1b"} Feb 02 11:04:26 crc kubenswrapper[4909]: I0202 11:04:26.504347 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mz9tz" podStartSLOduration=3.040965587 podStartE2EDuration="5.504310966s" podCreationTimestamp="2026-02-02 11:04:21 +0000 UTC" firstStartedPulling="2026-02-02 11:04:23.446572515 +0000 UTC m=+1989.192673250" lastFinishedPulling="2026-02-02 11:04:25.909917894 +0000 UTC m=+1991.656018629" observedRunningTime="2026-02-02 11:04:26.503558805 +0000 UTC m=+1992.249659540" watchObservedRunningTime="2026-02-02 11:04:26.504310966 +0000 UTC m=+1992.250411701" Feb 02 11:04:26 crc kubenswrapper[4909]: I0202 11:04:26.696219 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-92w2b" Feb 02 11:04:26 crc kubenswrapper[4909]: I0202 11:04:26.696275 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-92w2b" Feb 02 11:04:26 crc kubenswrapper[4909]: I0202 11:04:26.739199 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-92w2b" Feb 02 11:04:27 crc kubenswrapper[4909]: I0202 11:04:27.518584 
4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-92w2b" Feb 02 11:04:28 crc kubenswrapper[4909]: I0202 11:04:28.947145 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-92w2b"] Feb 02 11:04:29 crc kubenswrapper[4909]: I0202 11:04:29.494384 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-92w2b" podUID="fe3d03cd-3fce-4986-8c4c-2956373475ea" containerName="registry-server" containerID="cri-o://a82edd166fb1d836e6963934855124a43ac4c1644396bc7eb4d2aabba3f3727b" gracePeriod=2 Feb 02 11:04:29 crc kubenswrapper[4909]: I0202 11:04:29.920501 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-92w2b" Feb 02 11:04:30 crc kubenswrapper[4909]: I0202 11:04:30.034919 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpw9m\" (UniqueName: \"kubernetes.io/projected/fe3d03cd-3fce-4986-8c4c-2956373475ea-kube-api-access-wpw9m\") pod \"fe3d03cd-3fce-4986-8c4c-2956373475ea\" (UID: \"fe3d03cd-3fce-4986-8c4c-2956373475ea\") " Feb 02 11:04:30 crc kubenswrapper[4909]: I0202 11:04:30.034989 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe3d03cd-3fce-4986-8c4c-2956373475ea-catalog-content\") pod \"fe3d03cd-3fce-4986-8c4c-2956373475ea\" (UID: \"fe3d03cd-3fce-4986-8c4c-2956373475ea\") " Feb 02 11:04:30 crc kubenswrapper[4909]: I0202 11:04:30.035108 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe3d03cd-3fce-4986-8c4c-2956373475ea-utilities\") pod \"fe3d03cd-3fce-4986-8c4c-2956373475ea\" (UID: \"fe3d03cd-3fce-4986-8c4c-2956373475ea\") " Feb 02 11:04:30 crc kubenswrapper[4909]: I0202 11:04:30.036399 
4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe3d03cd-3fce-4986-8c4c-2956373475ea-utilities" (OuterVolumeSpecName: "utilities") pod "fe3d03cd-3fce-4986-8c4c-2956373475ea" (UID: "fe3d03cd-3fce-4986-8c4c-2956373475ea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:04:30 crc kubenswrapper[4909]: I0202 11:04:30.040417 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe3d03cd-3fce-4986-8c4c-2956373475ea-kube-api-access-wpw9m" (OuterVolumeSpecName: "kube-api-access-wpw9m") pod "fe3d03cd-3fce-4986-8c4c-2956373475ea" (UID: "fe3d03cd-3fce-4986-8c4c-2956373475ea"). InnerVolumeSpecName "kube-api-access-wpw9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:04:30 crc kubenswrapper[4909]: I0202 11:04:30.061097 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe3d03cd-3fce-4986-8c4c-2956373475ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fe3d03cd-3fce-4986-8c4c-2956373475ea" (UID: "fe3d03cd-3fce-4986-8c4c-2956373475ea"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:04:30 crc kubenswrapper[4909]: I0202 11:04:30.137315 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe3d03cd-3fce-4986-8c4c-2956373475ea-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:04:30 crc kubenswrapper[4909]: I0202 11:04:30.137346 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpw9m\" (UniqueName: \"kubernetes.io/projected/fe3d03cd-3fce-4986-8c4c-2956373475ea-kube-api-access-wpw9m\") on node \"crc\" DevicePath \"\"" Feb 02 11:04:30 crc kubenswrapper[4909]: I0202 11:04:30.137372 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe3d03cd-3fce-4986-8c4c-2956373475ea-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:04:30 crc kubenswrapper[4909]: I0202 11:04:30.502092 4909 generic.go:334] "Generic (PLEG): container finished" podID="fe3d03cd-3fce-4986-8c4c-2956373475ea" containerID="a82edd166fb1d836e6963934855124a43ac4c1644396bc7eb4d2aabba3f3727b" exitCode=0 Feb 02 11:04:30 crc kubenswrapper[4909]: I0202 11:04:30.502133 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-92w2b" event={"ID":"fe3d03cd-3fce-4986-8c4c-2956373475ea","Type":"ContainerDied","Data":"a82edd166fb1d836e6963934855124a43ac4c1644396bc7eb4d2aabba3f3727b"} Feb 02 11:04:30 crc kubenswrapper[4909]: I0202 11:04:30.502161 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-92w2b" event={"ID":"fe3d03cd-3fce-4986-8c4c-2956373475ea","Type":"ContainerDied","Data":"45fc1b0e9c6ddaa5601ec5b4dcd6e4bd253f6439b685df42a2108c6485e6ba97"} Feb 02 11:04:30 crc kubenswrapper[4909]: I0202 11:04:30.502179 4909 scope.go:117] "RemoveContainer" containerID="a82edd166fb1d836e6963934855124a43ac4c1644396bc7eb4d2aabba3f3727b" Feb 02 11:04:30 crc kubenswrapper[4909]: I0202 
11:04:30.502280 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-92w2b" Feb 02 11:04:30 crc kubenswrapper[4909]: I0202 11:04:30.539937 4909 scope.go:117] "RemoveContainer" containerID="a7c10c03f8443ec578b07fb3b57dfb88a4e088d0cd10bf6153042c1b11f2c034" Feb 02 11:04:30 crc kubenswrapper[4909]: I0202 11:04:30.545160 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-92w2b"] Feb 02 11:04:30 crc kubenswrapper[4909]: I0202 11:04:30.552354 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-92w2b"] Feb 02 11:04:30 crc kubenswrapper[4909]: I0202 11:04:30.561409 4909 scope.go:117] "RemoveContainer" containerID="844219de9a743aeaa26aadcf9817d71d5b1f2ef25486c8f9a44f159a019ef308" Feb 02 11:04:30 crc kubenswrapper[4909]: I0202 11:04:30.586909 4909 scope.go:117] "RemoveContainer" containerID="a82edd166fb1d836e6963934855124a43ac4c1644396bc7eb4d2aabba3f3727b" Feb 02 11:04:30 crc kubenswrapper[4909]: E0202 11:04:30.587576 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a82edd166fb1d836e6963934855124a43ac4c1644396bc7eb4d2aabba3f3727b\": container with ID starting with a82edd166fb1d836e6963934855124a43ac4c1644396bc7eb4d2aabba3f3727b not found: ID does not exist" containerID="a82edd166fb1d836e6963934855124a43ac4c1644396bc7eb4d2aabba3f3727b" Feb 02 11:04:30 crc kubenswrapper[4909]: I0202 11:04:30.587616 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a82edd166fb1d836e6963934855124a43ac4c1644396bc7eb4d2aabba3f3727b"} err="failed to get container status \"a82edd166fb1d836e6963934855124a43ac4c1644396bc7eb4d2aabba3f3727b\": rpc error: code = NotFound desc = could not find container \"a82edd166fb1d836e6963934855124a43ac4c1644396bc7eb4d2aabba3f3727b\": container with ID starting with 
a82edd166fb1d836e6963934855124a43ac4c1644396bc7eb4d2aabba3f3727b not found: ID does not exist" Feb 02 11:04:30 crc kubenswrapper[4909]: I0202 11:04:30.587662 4909 scope.go:117] "RemoveContainer" containerID="a7c10c03f8443ec578b07fb3b57dfb88a4e088d0cd10bf6153042c1b11f2c034" Feb 02 11:04:30 crc kubenswrapper[4909]: E0202 11:04:30.588146 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7c10c03f8443ec578b07fb3b57dfb88a4e088d0cd10bf6153042c1b11f2c034\": container with ID starting with a7c10c03f8443ec578b07fb3b57dfb88a4e088d0cd10bf6153042c1b11f2c034 not found: ID does not exist" containerID="a7c10c03f8443ec578b07fb3b57dfb88a4e088d0cd10bf6153042c1b11f2c034" Feb 02 11:04:30 crc kubenswrapper[4909]: I0202 11:04:30.588244 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7c10c03f8443ec578b07fb3b57dfb88a4e088d0cd10bf6153042c1b11f2c034"} err="failed to get container status \"a7c10c03f8443ec578b07fb3b57dfb88a4e088d0cd10bf6153042c1b11f2c034\": rpc error: code = NotFound desc = could not find container \"a7c10c03f8443ec578b07fb3b57dfb88a4e088d0cd10bf6153042c1b11f2c034\": container with ID starting with a7c10c03f8443ec578b07fb3b57dfb88a4e088d0cd10bf6153042c1b11f2c034 not found: ID does not exist" Feb 02 11:04:30 crc kubenswrapper[4909]: I0202 11:04:30.588344 4909 scope.go:117] "RemoveContainer" containerID="844219de9a743aeaa26aadcf9817d71d5b1f2ef25486c8f9a44f159a019ef308" Feb 02 11:04:30 crc kubenswrapper[4909]: E0202 11:04:30.588678 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"844219de9a743aeaa26aadcf9817d71d5b1f2ef25486c8f9a44f159a019ef308\": container with ID starting with 844219de9a743aeaa26aadcf9817d71d5b1f2ef25486c8f9a44f159a019ef308 not found: ID does not exist" containerID="844219de9a743aeaa26aadcf9817d71d5b1f2ef25486c8f9a44f159a019ef308" Feb 02 11:04:30 crc 
kubenswrapper[4909]: I0202 11:04:30.588717 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"844219de9a743aeaa26aadcf9817d71d5b1f2ef25486c8f9a44f159a019ef308"} err="failed to get container status \"844219de9a743aeaa26aadcf9817d71d5b1f2ef25486c8f9a44f159a019ef308\": rpc error: code = NotFound desc = could not find container \"844219de9a743aeaa26aadcf9817d71d5b1f2ef25486c8f9a44f159a019ef308\": container with ID starting with 844219de9a743aeaa26aadcf9817d71d5b1f2ef25486c8f9a44f159a019ef308 not found: ID does not exist" Feb 02 11:04:31 crc kubenswrapper[4909]: I0202 11:04:31.027324 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe3d03cd-3fce-4986-8c4c-2956373475ea" path="/var/lib/kubelet/pods/fe3d03cd-3fce-4986-8c4c-2956373475ea/volumes" Feb 02 11:04:32 crc kubenswrapper[4909]: I0202 11:04:32.143744 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mz9tz" Feb 02 11:04:32 crc kubenswrapper[4909]: I0202 11:04:32.144216 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mz9tz" Feb 02 11:04:32 crc kubenswrapper[4909]: I0202 11:04:32.209421 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mz9tz" Feb 02 11:04:32 crc kubenswrapper[4909]: I0202 11:04:32.560347 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mz9tz" Feb 02 11:04:33 crc kubenswrapper[4909]: I0202 11:04:33.347392 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mz9tz"] Feb 02 11:04:34 crc kubenswrapper[4909]: I0202 11:04:34.529415 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mz9tz" podUID="7dd444e4-debc-49d2-a673-2739d78fbbf3" 
containerName="registry-server" containerID="cri-o://10d4e4b81baaa985d43d48d7b725fec17e01d36752443f2a7f86ba44a72a0a1b" gracePeriod=2 Feb 02 11:04:34 crc kubenswrapper[4909]: I0202 11:04:34.890983 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mz9tz" Feb 02 11:04:35 crc kubenswrapper[4909]: I0202 11:04:35.001205 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dd444e4-debc-49d2-a673-2739d78fbbf3-utilities\") pod \"7dd444e4-debc-49d2-a673-2739d78fbbf3\" (UID: \"7dd444e4-debc-49d2-a673-2739d78fbbf3\") " Feb 02 11:04:35 crc kubenswrapper[4909]: I0202 11:04:35.001417 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7ln2\" (UniqueName: \"kubernetes.io/projected/7dd444e4-debc-49d2-a673-2739d78fbbf3-kube-api-access-v7ln2\") pod \"7dd444e4-debc-49d2-a673-2739d78fbbf3\" (UID: \"7dd444e4-debc-49d2-a673-2739d78fbbf3\") " Feb 02 11:04:35 crc kubenswrapper[4909]: I0202 11:04:35.001466 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dd444e4-debc-49d2-a673-2739d78fbbf3-catalog-content\") pod \"7dd444e4-debc-49d2-a673-2739d78fbbf3\" (UID: \"7dd444e4-debc-49d2-a673-2739d78fbbf3\") " Feb 02 11:04:35 crc kubenswrapper[4909]: I0202 11:04:35.002517 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dd444e4-debc-49d2-a673-2739d78fbbf3-utilities" (OuterVolumeSpecName: "utilities") pod "7dd444e4-debc-49d2-a673-2739d78fbbf3" (UID: "7dd444e4-debc-49d2-a673-2739d78fbbf3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:04:35 crc kubenswrapper[4909]: I0202 11:04:35.053975 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dd444e4-debc-49d2-a673-2739d78fbbf3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7dd444e4-debc-49d2-a673-2739d78fbbf3" (UID: "7dd444e4-debc-49d2-a673-2739d78fbbf3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:04:35 crc kubenswrapper[4909]: I0202 11:04:35.103663 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dd444e4-debc-49d2-a673-2739d78fbbf3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:04:35 crc kubenswrapper[4909]: I0202 11:04:35.104026 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dd444e4-debc-49d2-a673-2739d78fbbf3-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:04:35 crc kubenswrapper[4909]: I0202 11:04:35.429705 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dd444e4-debc-49d2-a673-2739d78fbbf3-kube-api-access-v7ln2" (OuterVolumeSpecName: "kube-api-access-v7ln2") pod "7dd444e4-debc-49d2-a673-2739d78fbbf3" (UID: "7dd444e4-debc-49d2-a673-2739d78fbbf3"). InnerVolumeSpecName "kube-api-access-v7ln2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:04:35 crc kubenswrapper[4909]: I0202 11:04:35.510395 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7ln2\" (UniqueName: \"kubernetes.io/projected/7dd444e4-debc-49d2-a673-2739d78fbbf3-kube-api-access-v7ln2\") on node \"crc\" DevicePath \"\"" Feb 02 11:04:35 crc kubenswrapper[4909]: I0202 11:04:35.539049 4909 generic.go:334] "Generic (PLEG): container finished" podID="7dd444e4-debc-49d2-a673-2739d78fbbf3" containerID="10d4e4b81baaa985d43d48d7b725fec17e01d36752443f2a7f86ba44a72a0a1b" exitCode=0 Feb 02 11:04:35 crc kubenswrapper[4909]: I0202 11:04:35.539369 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mz9tz" event={"ID":"7dd444e4-debc-49d2-a673-2739d78fbbf3","Type":"ContainerDied","Data":"10d4e4b81baaa985d43d48d7b725fec17e01d36752443f2a7f86ba44a72a0a1b"} Feb 02 11:04:35 crc kubenswrapper[4909]: I0202 11:04:35.539395 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mz9tz" event={"ID":"7dd444e4-debc-49d2-a673-2739d78fbbf3","Type":"ContainerDied","Data":"9b2584f99db443c01d66b8d746990821ab2ef35de4e35e73226568922d624ec9"} Feb 02 11:04:35 crc kubenswrapper[4909]: I0202 11:04:35.539410 4909 scope.go:117] "RemoveContainer" containerID="10d4e4b81baaa985d43d48d7b725fec17e01d36752443f2a7f86ba44a72a0a1b" Feb 02 11:04:35 crc kubenswrapper[4909]: I0202 11:04:35.539546 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mz9tz" Feb 02 11:04:35 crc kubenswrapper[4909]: I0202 11:04:35.556525 4909 scope.go:117] "RemoveContainer" containerID="41eb4333644f12c3ace9028967535a42c51e2fc07127605864b450317b7eacb9" Feb 02 11:04:35 crc kubenswrapper[4909]: I0202 11:04:35.572896 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mz9tz"] Feb 02 11:04:35 crc kubenswrapper[4909]: I0202 11:04:35.578111 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mz9tz"] Feb 02 11:04:35 crc kubenswrapper[4909]: I0202 11:04:35.594202 4909 scope.go:117] "RemoveContainer" containerID="a5517b11a2a28fdde5258f83f7d11fd8587affa1817d864bfcd0641aec21d509" Feb 02 11:04:35 crc kubenswrapper[4909]: I0202 11:04:35.614289 4909 scope.go:117] "RemoveContainer" containerID="10d4e4b81baaa985d43d48d7b725fec17e01d36752443f2a7f86ba44a72a0a1b" Feb 02 11:04:35 crc kubenswrapper[4909]: E0202 11:04:35.614752 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10d4e4b81baaa985d43d48d7b725fec17e01d36752443f2a7f86ba44a72a0a1b\": container with ID starting with 10d4e4b81baaa985d43d48d7b725fec17e01d36752443f2a7f86ba44a72a0a1b not found: ID does not exist" containerID="10d4e4b81baaa985d43d48d7b725fec17e01d36752443f2a7f86ba44a72a0a1b" Feb 02 11:04:35 crc kubenswrapper[4909]: I0202 11:04:35.614785 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10d4e4b81baaa985d43d48d7b725fec17e01d36752443f2a7f86ba44a72a0a1b"} err="failed to get container status \"10d4e4b81baaa985d43d48d7b725fec17e01d36752443f2a7f86ba44a72a0a1b\": rpc error: code = NotFound desc = could not find container \"10d4e4b81baaa985d43d48d7b725fec17e01d36752443f2a7f86ba44a72a0a1b\": container with ID starting with 10d4e4b81baaa985d43d48d7b725fec17e01d36752443f2a7f86ba44a72a0a1b not 
found: ID does not exist" Feb 02 11:04:35 crc kubenswrapper[4909]: I0202 11:04:35.614836 4909 scope.go:117] "RemoveContainer" containerID="41eb4333644f12c3ace9028967535a42c51e2fc07127605864b450317b7eacb9" Feb 02 11:04:35 crc kubenswrapper[4909]: E0202 11:04:35.615154 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41eb4333644f12c3ace9028967535a42c51e2fc07127605864b450317b7eacb9\": container with ID starting with 41eb4333644f12c3ace9028967535a42c51e2fc07127605864b450317b7eacb9 not found: ID does not exist" containerID="41eb4333644f12c3ace9028967535a42c51e2fc07127605864b450317b7eacb9" Feb 02 11:04:35 crc kubenswrapper[4909]: I0202 11:04:35.615188 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41eb4333644f12c3ace9028967535a42c51e2fc07127605864b450317b7eacb9"} err="failed to get container status \"41eb4333644f12c3ace9028967535a42c51e2fc07127605864b450317b7eacb9\": rpc error: code = NotFound desc = could not find container \"41eb4333644f12c3ace9028967535a42c51e2fc07127605864b450317b7eacb9\": container with ID starting with 41eb4333644f12c3ace9028967535a42c51e2fc07127605864b450317b7eacb9 not found: ID does not exist" Feb 02 11:04:35 crc kubenswrapper[4909]: I0202 11:04:35.615207 4909 scope.go:117] "RemoveContainer" containerID="a5517b11a2a28fdde5258f83f7d11fd8587affa1817d864bfcd0641aec21d509" Feb 02 11:04:35 crc kubenswrapper[4909]: E0202 11:04:35.615441 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5517b11a2a28fdde5258f83f7d11fd8587affa1817d864bfcd0641aec21d509\": container with ID starting with a5517b11a2a28fdde5258f83f7d11fd8587affa1817d864bfcd0641aec21d509 not found: ID does not exist" containerID="a5517b11a2a28fdde5258f83f7d11fd8587affa1817d864bfcd0641aec21d509" Feb 02 11:04:35 crc kubenswrapper[4909]: I0202 11:04:35.615456 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5517b11a2a28fdde5258f83f7d11fd8587affa1817d864bfcd0641aec21d509"} err="failed to get container status \"a5517b11a2a28fdde5258f83f7d11fd8587affa1817d864bfcd0641aec21d509\": rpc error: code = NotFound desc = could not find container \"a5517b11a2a28fdde5258f83f7d11fd8587affa1817d864bfcd0641aec21d509\": container with ID starting with a5517b11a2a28fdde5258f83f7d11fd8587affa1817d864bfcd0641aec21d509 not found: ID does not exist" Feb 02 11:04:37 crc kubenswrapper[4909]: I0202 11:04:37.025310 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dd444e4-debc-49d2-a673-2739d78fbbf3" path="/var/lib/kubelet/pods/7dd444e4-debc-49d2-a673-2739d78fbbf3/volumes" Feb 02 11:04:49 crc kubenswrapper[4909]: I0202 11:04:49.510523 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:04:49 crc kubenswrapper[4909]: I0202 11:04:49.511178 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:05:19 crc kubenswrapper[4909]: I0202 11:05:19.511367 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:05:19 crc kubenswrapper[4909]: I0202 11:05:19.512022 4909 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:05:19 crc kubenswrapper[4909]: I0202 11:05:19.512070 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 11:05:19 crc kubenswrapper[4909]: I0202 11:05:19.512793 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9767fc57bc0ed3d797c691f994688f78f00d227a02855739406583aff93fd8c2"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 11:05:19 crc kubenswrapper[4909]: I0202 11:05:19.512911 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://9767fc57bc0ed3d797c691f994688f78f00d227a02855739406583aff93fd8c2" gracePeriod=600 Feb 02 11:05:19 crc kubenswrapper[4909]: I0202 11:05:19.845798 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="9767fc57bc0ed3d797c691f994688f78f00d227a02855739406583aff93fd8c2" exitCode=0 Feb 02 11:05:19 crc kubenswrapper[4909]: I0202 11:05:19.845898 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"9767fc57bc0ed3d797c691f994688f78f00d227a02855739406583aff93fd8c2"} Feb 02 11:05:19 crc kubenswrapper[4909]: I0202 11:05:19.846164 4909 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca"} Feb 02 11:05:19 crc kubenswrapper[4909]: I0202 11:05:19.846194 4909 scope.go:117] "RemoveContainer" containerID="a863e8bee14b92256a3cfec97679d64c43ecf5ec36a1382f7f42c1f0553024f3" Feb 02 11:05:22 crc kubenswrapper[4909]: I0202 11:05:22.448663 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xh2gn"] Feb 02 11:05:22 crc kubenswrapper[4909]: E0202 11:05:22.449602 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd444e4-debc-49d2-a673-2739d78fbbf3" containerName="extract-content" Feb 02 11:05:22 crc kubenswrapper[4909]: I0202 11:05:22.449618 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd444e4-debc-49d2-a673-2739d78fbbf3" containerName="extract-content" Feb 02 11:05:22 crc kubenswrapper[4909]: E0202 11:05:22.449637 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd444e4-debc-49d2-a673-2739d78fbbf3" containerName="registry-server" Feb 02 11:05:22 crc kubenswrapper[4909]: I0202 11:05:22.449644 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd444e4-debc-49d2-a673-2739d78fbbf3" containerName="registry-server" Feb 02 11:05:22 crc kubenswrapper[4909]: E0202 11:05:22.449654 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe3d03cd-3fce-4986-8c4c-2956373475ea" containerName="registry-server" Feb 02 11:05:22 crc kubenswrapper[4909]: I0202 11:05:22.449663 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe3d03cd-3fce-4986-8c4c-2956373475ea" containerName="registry-server" Feb 02 11:05:22 crc kubenswrapper[4909]: E0202 11:05:22.449680 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe3d03cd-3fce-4986-8c4c-2956373475ea" containerName="extract-content" Feb 02 
11:05:22 crc kubenswrapper[4909]: I0202 11:05:22.449689 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe3d03cd-3fce-4986-8c4c-2956373475ea" containerName="extract-content" Feb 02 11:05:22 crc kubenswrapper[4909]: E0202 11:05:22.449698 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd444e4-debc-49d2-a673-2739d78fbbf3" containerName="extract-utilities" Feb 02 11:05:22 crc kubenswrapper[4909]: I0202 11:05:22.449705 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd444e4-debc-49d2-a673-2739d78fbbf3" containerName="extract-utilities" Feb 02 11:05:22 crc kubenswrapper[4909]: E0202 11:05:22.449728 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe3d03cd-3fce-4986-8c4c-2956373475ea" containerName="extract-utilities" Feb 02 11:05:22 crc kubenswrapper[4909]: I0202 11:05:22.449735 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe3d03cd-3fce-4986-8c4c-2956373475ea" containerName="extract-utilities" Feb 02 11:05:22 crc kubenswrapper[4909]: I0202 11:05:22.449965 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe3d03cd-3fce-4986-8c4c-2956373475ea" containerName="registry-server" Feb 02 11:05:22 crc kubenswrapper[4909]: I0202 11:05:22.449983 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dd444e4-debc-49d2-a673-2739d78fbbf3" containerName="registry-server" Feb 02 11:05:22 crc kubenswrapper[4909]: I0202 11:05:22.450992 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xh2gn" Feb 02 11:05:22 crc kubenswrapper[4909]: I0202 11:05:22.472832 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xh2gn"] Feb 02 11:05:22 crc kubenswrapper[4909]: I0202 11:05:22.480370 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2vlj\" (UniqueName: \"kubernetes.io/projected/d4853fa1-fb88-44d2-a56c-04b0510734d0-kube-api-access-j2vlj\") pod \"redhat-operators-xh2gn\" (UID: \"d4853fa1-fb88-44d2-a56c-04b0510734d0\") " pod="openshift-marketplace/redhat-operators-xh2gn" Feb 02 11:05:22 crc kubenswrapper[4909]: I0202 11:05:22.480440 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4853fa1-fb88-44d2-a56c-04b0510734d0-catalog-content\") pod \"redhat-operators-xh2gn\" (UID: \"d4853fa1-fb88-44d2-a56c-04b0510734d0\") " pod="openshift-marketplace/redhat-operators-xh2gn" Feb 02 11:05:22 crc kubenswrapper[4909]: I0202 11:05:22.480460 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4853fa1-fb88-44d2-a56c-04b0510734d0-utilities\") pod \"redhat-operators-xh2gn\" (UID: \"d4853fa1-fb88-44d2-a56c-04b0510734d0\") " pod="openshift-marketplace/redhat-operators-xh2gn" Feb 02 11:05:22 crc kubenswrapper[4909]: I0202 11:05:22.582037 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2vlj\" (UniqueName: \"kubernetes.io/projected/d4853fa1-fb88-44d2-a56c-04b0510734d0-kube-api-access-j2vlj\") pod \"redhat-operators-xh2gn\" (UID: \"d4853fa1-fb88-44d2-a56c-04b0510734d0\") " pod="openshift-marketplace/redhat-operators-xh2gn" Feb 02 11:05:22 crc kubenswrapper[4909]: I0202 11:05:22.582128 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4853fa1-fb88-44d2-a56c-04b0510734d0-catalog-content\") pod \"redhat-operators-xh2gn\" (UID: \"d4853fa1-fb88-44d2-a56c-04b0510734d0\") " pod="openshift-marketplace/redhat-operators-xh2gn" Feb 02 11:05:22 crc kubenswrapper[4909]: I0202 11:05:22.582153 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4853fa1-fb88-44d2-a56c-04b0510734d0-utilities\") pod \"redhat-operators-xh2gn\" (UID: \"d4853fa1-fb88-44d2-a56c-04b0510734d0\") " pod="openshift-marketplace/redhat-operators-xh2gn" Feb 02 11:05:22 crc kubenswrapper[4909]: I0202 11:05:22.582575 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4853fa1-fb88-44d2-a56c-04b0510734d0-catalog-content\") pod \"redhat-operators-xh2gn\" (UID: \"d4853fa1-fb88-44d2-a56c-04b0510734d0\") " pod="openshift-marketplace/redhat-operators-xh2gn" Feb 02 11:05:22 crc kubenswrapper[4909]: I0202 11:05:22.582692 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4853fa1-fb88-44d2-a56c-04b0510734d0-utilities\") pod \"redhat-operators-xh2gn\" (UID: \"d4853fa1-fb88-44d2-a56c-04b0510734d0\") " pod="openshift-marketplace/redhat-operators-xh2gn" Feb 02 11:05:22 crc kubenswrapper[4909]: I0202 11:05:22.615045 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2vlj\" (UniqueName: \"kubernetes.io/projected/d4853fa1-fb88-44d2-a56c-04b0510734d0-kube-api-access-j2vlj\") pod \"redhat-operators-xh2gn\" (UID: \"d4853fa1-fb88-44d2-a56c-04b0510734d0\") " pod="openshift-marketplace/redhat-operators-xh2gn" Feb 02 11:05:22 crc kubenswrapper[4909]: I0202 11:05:22.783065 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xh2gn" Feb 02 11:05:23 crc kubenswrapper[4909]: I0202 11:05:23.241475 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xh2gn"] Feb 02 11:05:23 crc kubenswrapper[4909]: W0202 11:05:23.243224 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4853fa1_fb88_44d2_a56c_04b0510734d0.slice/crio-d49c975773402534d7d623e7abf729a15f010e3c3550b4acab42f2a4548b9670 WatchSource:0}: Error finding container d49c975773402534d7d623e7abf729a15f010e3c3550b4acab42f2a4548b9670: Status 404 returned error can't find the container with id d49c975773402534d7d623e7abf729a15f010e3c3550b4acab42f2a4548b9670 Feb 02 11:05:23 crc kubenswrapper[4909]: I0202 11:05:23.875172 4909 generic.go:334] "Generic (PLEG): container finished" podID="d4853fa1-fb88-44d2-a56c-04b0510734d0" containerID="95ae11736efd89a045f6d1207f8b4fd2b3bc92ae16c15ab37abae58b7f48d2ca" exitCode=0 Feb 02 11:05:23 crc kubenswrapper[4909]: I0202 11:05:23.875273 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xh2gn" event={"ID":"d4853fa1-fb88-44d2-a56c-04b0510734d0","Type":"ContainerDied","Data":"95ae11736efd89a045f6d1207f8b4fd2b3bc92ae16c15ab37abae58b7f48d2ca"} Feb 02 11:05:23 crc kubenswrapper[4909]: I0202 11:05:23.875483 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xh2gn" event={"ID":"d4853fa1-fb88-44d2-a56c-04b0510734d0","Type":"ContainerStarted","Data":"d49c975773402534d7d623e7abf729a15f010e3c3550b4acab42f2a4548b9670"} Feb 02 11:05:24 crc kubenswrapper[4909]: I0202 11:05:24.882425 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xh2gn" 
event={"ID":"d4853fa1-fb88-44d2-a56c-04b0510734d0","Type":"ContainerStarted","Data":"87e3fdb0f8721003f3e91d8f1d6fc77c232ea56137481a5d3e54c6402cca3e87"} Feb 02 11:05:25 crc kubenswrapper[4909]: I0202 11:05:25.889665 4909 generic.go:334] "Generic (PLEG): container finished" podID="d4853fa1-fb88-44d2-a56c-04b0510734d0" containerID="87e3fdb0f8721003f3e91d8f1d6fc77c232ea56137481a5d3e54c6402cca3e87" exitCode=0 Feb 02 11:05:25 crc kubenswrapper[4909]: I0202 11:05:25.889703 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xh2gn" event={"ID":"d4853fa1-fb88-44d2-a56c-04b0510734d0","Type":"ContainerDied","Data":"87e3fdb0f8721003f3e91d8f1d6fc77c232ea56137481a5d3e54c6402cca3e87"} Feb 02 11:05:26 crc kubenswrapper[4909]: I0202 11:05:26.900551 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xh2gn" event={"ID":"d4853fa1-fb88-44d2-a56c-04b0510734d0","Type":"ContainerStarted","Data":"54c10dd959fa3837729735c778f514c6dd53be2254c3ce6be9716e21bb79c4e0"} Feb 02 11:05:32 crc kubenswrapper[4909]: I0202 11:05:32.784070 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xh2gn" Feb 02 11:05:32 crc kubenswrapper[4909]: I0202 11:05:32.784630 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xh2gn" Feb 02 11:05:32 crc kubenswrapper[4909]: I0202 11:05:32.829463 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xh2gn" Feb 02 11:05:32 crc kubenswrapper[4909]: I0202 11:05:32.851774 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xh2gn" podStartSLOduration=8.373776648 podStartE2EDuration="10.851753459s" podCreationTimestamp="2026-02-02 11:05:22 +0000 UTC" firstStartedPulling="2026-02-02 11:05:23.877081836 +0000 UTC m=+2049.623182571" 
lastFinishedPulling="2026-02-02 11:05:26.355058637 +0000 UTC m=+2052.101159382" observedRunningTime="2026-02-02 11:05:26.918345844 +0000 UTC m=+2052.664446589" watchObservedRunningTime="2026-02-02 11:05:32.851753459 +0000 UTC m=+2058.597854194" Feb 02 11:05:32 crc kubenswrapper[4909]: I0202 11:05:32.974431 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xh2gn" Feb 02 11:05:33 crc kubenswrapper[4909]: I0202 11:05:33.060458 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xh2gn"] Feb 02 11:05:34 crc kubenswrapper[4909]: I0202 11:05:34.948281 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xh2gn" podUID="d4853fa1-fb88-44d2-a56c-04b0510734d0" containerName="registry-server" containerID="cri-o://54c10dd959fa3837729735c778f514c6dd53be2254c3ce6be9716e21bb79c4e0" gracePeriod=2 Feb 02 11:05:36 crc kubenswrapper[4909]: I0202 11:05:36.050625 4909 generic.go:334] "Generic (PLEG): container finished" podID="d4853fa1-fb88-44d2-a56c-04b0510734d0" containerID="54c10dd959fa3837729735c778f514c6dd53be2254c3ce6be9716e21bb79c4e0" exitCode=0 Feb 02 11:05:36 crc kubenswrapper[4909]: I0202 11:05:36.050936 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xh2gn" event={"ID":"d4853fa1-fb88-44d2-a56c-04b0510734d0","Type":"ContainerDied","Data":"54c10dd959fa3837729735c778f514c6dd53be2254c3ce6be9716e21bb79c4e0"} Feb 02 11:05:36 crc kubenswrapper[4909]: I0202 11:05:36.050960 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xh2gn" event={"ID":"d4853fa1-fb88-44d2-a56c-04b0510734d0","Type":"ContainerDied","Data":"d49c975773402534d7d623e7abf729a15f010e3c3550b4acab42f2a4548b9670"} Feb 02 11:05:36 crc kubenswrapper[4909]: I0202 11:05:36.050973 4909 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="d49c975773402534d7d623e7abf729a15f010e3c3550b4acab42f2a4548b9670" Feb 02 11:05:36 crc kubenswrapper[4909]: I0202 11:05:36.095502 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xh2gn" Feb 02 11:05:36 crc kubenswrapper[4909]: I0202 11:05:36.157272 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4853fa1-fb88-44d2-a56c-04b0510734d0-catalog-content\") pod \"d4853fa1-fb88-44d2-a56c-04b0510734d0\" (UID: \"d4853fa1-fb88-44d2-a56c-04b0510734d0\") " Feb 02 11:05:36 crc kubenswrapper[4909]: I0202 11:05:36.157394 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2vlj\" (UniqueName: \"kubernetes.io/projected/d4853fa1-fb88-44d2-a56c-04b0510734d0-kube-api-access-j2vlj\") pod \"d4853fa1-fb88-44d2-a56c-04b0510734d0\" (UID: \"d4853fa1-fb88-44d2-a56c-04b0510734d0\") " Feb 02 11:05:36 crc kubenswrapper[4909]: I0202 11:05:36.157452 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4853fa1-fb88-44d2-a56c-04b0510734d0-utilities\") pod \"d4853fa1-fb88-44d2-a56c-04b0510734d0\" (UID: \"d4853fa1-fb88-44d2-a56c-04b0510734d0\") " Feb 02 11:05:36 crc kubenswrapper[4909]: I0202 11:05:36.158608 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4853fa1-fb88-44d2-a56c-04b0510734d0-utilities" (OuterVolumeSpecName: "utilities") pod "d4853fa1-fb88-44d2-a56c-04b0510734d0" (UID: "d4853fa1-fb88-44d2-a56c-04b0510734d0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:05:36 crc kubenswrapper[4909]: I0202 11:05:36.163353 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4853fa1-fb88-44d2-a56c-04b0510734d0-kube-api-access-j2vlj" (OuterVolumeSpecName: "kube-api-access-j2vlj") pod "d4853fa1-fb88-44d2-a56c-04b0510734d0" (UID: "d4853fa1-fb88-44d2-a56c-04b0510734d0"). InnerVolumeSpecName "kube-api-access-j2vlj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:05:36 crc kubenswrapper[4909]: I0202 11:05:36.259757 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2vlj\" (UniqueName: \"kubernetes.io/projected/d4853fa1-fb88-44d2-a56c-04b0510734d0-kube-api-access-j2vlj\") on node \"crc\" DevicePath \"\""
Feb 02 11:05:36 crc kubenswrapper[4909]: I0202 11:05:36.260206 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4853fa1-fb88-44d2-a56c-04b0510734d0-utilities\") on node \"crc\" DevicePath \"\""
Feb 02 11:05:37 crc kubenswrapper[4909]: I0202 11:05:37.057894 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xh2gn"
Feb 02 11:05:37 crc kubenswrapper[4909]: I0202 11:05:37.112298 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4853fa1-fb88-44d2-a56c-04b0510734d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4853fa1-fb88-44d2-a56c-04b0510734d0" (UID: "d4853fa1-fb88-44d2-a56c-04b0510734d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:05:37 crc kubenswrapper[4909]: I0202 11:05:37.171918 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4853fa1-fb88-44d2-a56c-04b0510734d0-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 02 11:05:37 crc kubenswrapper[4909]: I0202 11:05:37.391718 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xh2gn"]
Feb 02 11:05:37 crc kubenswrapper[4909]: I0202 11:05:37.401096 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xh2gn"]
Feb 02 11:05:39 crc kubenswrapper[4909]: I0202 11:05:39.025667 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4853fa1-fb88-44d2-a56c-04b0510734d0" path="/var/lib/kubelet/pods/d4853fa1-fb88-44d2-a56c-04b0510734d0/volumes"
Feb 02 11:07:19 crc kubenswrapper[4909]: I0202 11:07:19.510986 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 11:07:19 crc kubenswrapper[4909]: I0202 11:07:19.512554 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 11:07:49 crc kubenswrapper[4909]: I0202 11:07:49.511078 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 11:07:49 crc kubenswrapper[4909]: I0202 11:07:49.511635 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 11:08:19 crc kubenswrapper[4909]: I0202 11:08:19.511006 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 11:08:19 crc kubenswrapper[4909]: I0202 11:08:19.511848 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 11:08:19 crc kubenswrapper[4909]: I0202 11:08:19.511940 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z"
Feb 02 11:08:19 crc kubenswrapper[4909]: I0202 11:08:19.512868 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 02 11:08:19 crc kubenswrapper[4909]: I0202 11:08:19.513062 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca" gracePeriod=600
Feb 02 11:08:19 crc kubenswrapper[4909]: E0202 11:08:19.640082 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 11:08:20 crc kubenswrapper[4909]: I0202 11:08:20.093824 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca" exitCode=0
Feb 02 11:08:20 crc kubenswrapper[4909]: I0202 11:08:20.093916 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca"}
Feb 02 11:08:20 crc kubenswrapper[4909]: I0202 11:08:20.094266 4909 scope.go:117] "RemoveContainer" containerID="9767fc57bc0ed3d797c691f994688f78f00d227a02855739406583aff93fd8c2"
Feb 02 11:08:20 crc kubenswrapper[4909]: I0202 11:08:20.094723 4909 scope.go:117] "RemoveContainer" containerID="042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca"
Feb 02 11:08:20 crc kubenswrapper[4909]: E0202 11:08:20.094988 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 11:08:32 crc kubenswrapper[4909]: I0202 11:08:32.430369 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kfxj9"]
Feb 02 11:08:32 crc kubenswrapper[4909]: E0202 11:08:32.431056 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4853fa1-fb88-44d2-a56c-04b0510734d0" containerName="extract-content"
Feb 02 11:08:32 crc kubenswrapper[4909]: I0202 11:08:32.431455 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4853fa1-fb88-44d2-a56c-04b0510734d0" containerName="extract-content"
Feb 02 11:08:32 crc kubenswrapper[4909]: E0202 11:08:32.431476 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4853fa1-fb88-44d2-a56c-04b0510734d0" containerName="extract-utilities"
Feb 02 11:08:32 crc kubenswrapper[4909]: I0202 11:08:32.431482 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4853fa1-fb88-44d2-a56c-04b0510734d0" containerName="extract-utilities"
Feb 02 11:08:32 crc kubenswrapper[4909]: E0202 11:08:32.431499 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4853fa1-fb88-44d2-a56c-04b0510734d0" containerName="registry-server"
Feb 02 11:08:32 crc kubenswrapper[4909]: I0202 11:08:32.431506 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4853fa1-fb88-44d2-a56c-04b0510734d0" containerName="registry-server"
Feb 02 11:08:32 crc kubenswrapper[4909]: I0202 11:08:32.431683 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4853fa1-fb88-44d2-a56c-04b0510734d0" containerName="registry-server"
Feb 02 11:08:32 crc kubenswrapper[4909]: I0202 11:08:32.432644 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kfxj9"
Feb 02 11:08:32 crc kubenswrapper[4909]: I0202 11:08:32.451845 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kfxj9"]
Feb 02 11:08:32 crc kubenswrapper[4909]: I0202 11:08:32.462256 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s6vl\" (UniqueName: \"kubernetes.io/projected/63af9f1a-85d5-48f0-823c-f6a8e9979005-kube-api-access-5s6vl\") pod \"certified-operators-kfxj9\" (UID: \"63af9f1a-85d5-48f0-823c-f6a8e9979005\") " pod="openshift-marketplace/certified-operators-kfxj9"
Feb 02 11:08:32 crc kubenswrapper[4909]: I0202 11:08:32.462537 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63af9f1a-85d5-48f0-823c-f6a8e9979005-catalog-content\") pod \"certified-operators-kfxj9\" (UID: \"63af9f1a-85d5-48f0-823c-f6a8e9979005\") " pod="openshift-marketplace/certified-operators-kfxj9"
Feb 02 11:08:32 crc kubenswrapper[4909]: I0202 11:08:32.462637 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63af9f1a-85d5-48f0-823c-f6a8e9979005-utilities\") pod \"certified-operators-kfxj9\" (UID: \"63af9f1a-85d5-48f0-823c-f6a8e9979005\") " pod="openshift-marketplace/certified-operators-kfxj9"
Feb 02 11:08:32 crc kubenswrapper[4909]: I0202 11:08:32.563218 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63af9f1a-85d5-48f0-823c-f6a8e9979005-catalog-content\") pod \"certified-operators-kfxj9\" (UID: \"63af9f1a-85d5-48f0-823c-f6a8e9979005\") " pod="openshift-marketplace/certified-operators-kfxj9"
Feb 02 11:08:32 crc kubenswrapper[4909]: I0202 11:08:32.563260 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63af9f1a-85d5-48f0-823c-f6a8e9979005-utilities\") pod \"certified-operators-kfxj9\" (UID: \"63af9f1a-85d5-48f0-823c-f6a8e9979005\") " pod="openshift-marketplace/certified-operators-kfxj9"
Feb 02 11:08:32 crc kubenswrapper[4909]: I0202 11:08:32.563327 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s6vl\" (UniqueName: \"kubernetes.io/projected/63af9f1a-85d5-48f0-823c-f6a8e9979005-kube-api-access-5s6vl\") pod \"certified-operators-kfxj9\" (UID: \"63af9f1a-85d5-48f0-823c-f6a8e9979005\") " pod="openshift-marketplace/certified-operators-kfxj9"
Feb 02 11:08:32 crc kubenswrapper[4909]: I0202 11:08:32.563934 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63af9f1a-85d5-48f0-823c-f6a8e9979005-catalog-content\") pod \"certified-operators-kfxj9\" (UID: \"63af9f1a-85d5-48f0-823c-f6a8e9979005\") " pod="openshift-marketplace/certified-operators-kfxj9"
Feb 02 11:08:32 crc kubenswrapper[4909]: I0202 11:08:32.564092 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63af9f1a-85d5-48f0-823c-f6a8e9979005-utilities\") pod \"certified-operators-kfxj9\" (UID: \"63af9f1a-85d5-48f0-823c-f6a8e9979005\") " pod="openshift-marketplace/certified-operators-kfxj9"
Feb 02 11:08:32 crc kubenswrapper[4909]: I0202 11:08:32.582779 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s6vl\" (UniqueName: \"kubernetes.io/projected/63af9f1a-85d5-48f0-823c-f6a8e9979005-kube-api-access-5s6vl\") pod \"certified-operators-kfxj9\" (UID: \"63af9f1a-85d5-48f0-823c-f6a8e9979005\") " pod="openshift-marketplace/certified-operators-kfxj9"
Feb 02 11:08:32 crc kubenswrapper[4909]: I0202 11:08:32.749760 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kfxj9"
Feb 02 11:08:33 crc kubenswrapper[4909]: I0202 11:08:33.026057 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kfxj9"]
Feb 02 11:08:33 crc kubenswrapper[4909]: I0202 11:08:33.176821 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfxj9" event={"ID":"63af9f1a-85d5-48f0-823c-f6a8e9979005","Type":"ContainerStarted","Data":"232fd6b839795562bca2ed7899b77a9c43cffdbafe7380d9493645285f87f209"}
Feb 02 11:08:33 crc kubenswrapper[4909]: E0202 11:08:33.268053 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63af9f1a_85d5_48f0_823c_f6a8e9979005.slice/crio-f6955ef8e4fcef661c38dad00800dc0cb6bf28dfee2ac4754c705a3c08247d97.scope\": RecentStats: unable to find data in memory cache]"
Feb 02 11:08:34 crc kubenswrapper[4909]: I0202 11:08:34.186778 4909 generic.go:334] "Generic (PLEG): container finished" podID="63af9f1a-85d5-48f0-823c-f6a8e9979005" containerID="f6955ef8e4fcef661c38dad00800dc0cb6bf28dfee2ac4754c705a3c08247d97" exitCode=0
Feb 02 11:08:34 crc kubenswrapper[4909]: I0202 11:08:34.186933 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfxj9" event={"ID":"63af9f1a-85d5-48f0-823c-f6a8e9979005","Type":"ContainerDied","Data":"f6955ef8e4fcef661c38dad00800dc0cb6bf28dfee2ac4754c705a3c08247d97"}
Feb 02 11:08:35 crc kubenswrapper[4909]: I0202 11:08:35.020914 4909 scope.go:117] "RemoveContainer" containerID="042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca"
Feb 02 11:08:35 crc kubenswrapper[4909]: E0202 11:08:35.021245 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 11:08:35 crc kubenswrapper[4909]: I0202 11:08:35.200577 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfxj9" event={"ID":"63af9f1a-85d5-48f0-823c-f6a8e9979005","Type":"ContainerStarted","Data":"5dcffd4b9408dd4b200370724deda38f600c087ab5cf86511003deb879d48217"}
Feb 02 11:08:36 crc kubenswrapper[4909]: I0202 11:08:36.208053 4909 generic.go:334] "Generic (PLEG): container finished" podID="63af9f1a-85d5-48f0-823c-f6a8e9979005" containerID="5dcffd4b9408dd4b200370724deda38f600c087ab5cf86511003deb879d48217" exitCode=0
Feb 02 11:08:36 crc kubenswrapper[4909]: I0202 11:08:36.208144 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfxj9" event={"ID":"63af9f1a-85d5-48f0-823c-f6a8e9979005","Type":"ContainerDied","Data":"5dcffd4b9408dd4b200370724deda38f600c087ab5cf86511003deb879d48217"}
Feb 02 11:08:37 crc kubenswrapper[4909]: I0202 11:08:37.220147 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfxj9" event={"ID":"63af9f1a-85d5-48f0-823c-f6a8e9979005","Type":"ContainerStarted","Data":"94f7c7970744415cd640a9d2970886d92b73f43ef75eaf4594a6c3257dbb4056"}
Feb 02 11:08:37 crc kubenswrapper[4909]: I0202 11:08:37.243527 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kfxj9" podStartSLOduration=2.827522353 podStartE2EDuration="5.243496753s" podCreationTimestamp="2026-02-02 11:08:32 +0000 UTC" firstStartedPulling="2026-02-02 11:08:34.189797439 +0000 UTC m=+2239.935898174" lastFinishedPulling="2026-02-02 11:08:36.605771839 +0000 UTC m=+2242.351872574" observedRunningTime="2026-02-02 11:08:37.23988192 +0000 UTC m=+2242.985982675" watchObservedRunningTime="2026-02-02 11:08:37.243496753 +0000 UTC m=+2242.989597488"
Feb 02 11:08:42 crc kubenswrapper[4909]: I0202 11:08:42.750449 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kfxj9"
Feb 02 11:08:42 crc kubenswrapper[4909]: I0202 11:08:42.751498 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kfxj9"
Feb 02 11:08:42 crc kubenswrapper[4909]: I0202 11:08:42.799254 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kfxj9"
Feb 02 11:08:43 crc kubenswrapper[4909]: I0202 11:08:43.296349 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kfxj9"
Feb 02 11:08:43 crc kubenswrapper[4909]: I0202 11:08:43.337637 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kfxj9"]
Feb 02 11:08:45 crc kubenswrapper[4909]: I0202 11:08:45.270888 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kfxj9" podUID="63af9f1a-85d5-48f0-823c-f6a8e9979005" containerName="registry-server" containerID="cri-o://94f7c7970744415cd640a9d2970886d92b73f43ef75eaf4594a6c3257dbb4056" gracePeriod=2
Feb 02 11:08:45 crc kubenswrapper[4909]: I0202 11:08:45.637783 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kfxj9"
Feb 02 11:08:45 crc kubenswrapper[4909]: I0202 11:08:45.708066 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s6vl\" (UniqueName: \"kubernetes.io/projected/63af9f1a-85d5-48f0-823c-f6a8e9979005-kube-api-access-5s6vl\") pod \"63af9f1a-85d5-48f0-823c-f6a8e9979005\" (UID: \"63af9f1a-85d5-48f0-823c-f6a8e9979005\") "
Feb 02 11:08:45 crc kubenswrapper[4909]: I0202 11:08:45.708189 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63af9f1a-85d5-48f0-823c-f6a8e9979005-catalog-content\") pod \"63af9f1a-85d5-48f0-823c-f6a8e9979005\" (UID: \"63af9f1a-85d5-48f0-823c-f6a8e9979005\") "
Feb 02 11:08:45 crc kubenswrapper[4909]: I0202 11:08:45.708223 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63af9f1a-85d5-48f0-823c-f6a8e9979005-utilities\") pod \"63af9f1a-85d5-48f0-823c-f6a8e9979005\" (UID: \"63af9f1a-85d5-48f0-823c-f6a8e9979005\") "
Feb 02 11:08:45 crc kubenswrapper[4909]: I0202 11:08:45.709317 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63af9f1a-85d5-48f0-823c-f6a8e9979005-utilities" (OuterVolumeSpecName: "utilities") pod "63af9f1a-85d5-48f0-823c-f6a8e9979005" (UID: "63af9f1a-85d5-48f0-823c-f6a8e9979005"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:08:45 crc kubenswrapper[4909]: I0202 11:08:45.713930 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63af9f1a-85d5-48f0-823c-f6a8e9979005-kube-api-access-5s6vl" (OuterVolumeSpecName: "kube-api-access-5s6vl") pod "63af9f1a-85d5-48f0-823c-f6a8e9979005" (UID: "63af9f1a-85d5-48f0-823c-f6a8e9979005"). InnerVolumeSpecName "kube-api-access-5s6vl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:08:45 crc kubenswrapper[4909]: I0202 11:08:45.810327 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63af9f1a-85d5-48f0-823c-f6a8e9979005-utilities\") on node \"crc\" DevicePath \"\""
Feb 02 11:08:45 crc kubenswrapper[4909]: I0202 11:08:45.810381 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s6vl\" (UniqueName: \"kubernetes.io/projected/63af9f1a-85d5-48f0-823c-f6a8e9979005-kube-api-access-5s6vl\") on node \"crc\" DevicePath \"\""
Feb 02 11:08:45 crc kubenswrapper[4909]: I0202 11:08:45.867688 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63af9f1a-85d5-48f0-823c-f6a8e9979005-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63af9f1a-85d5-48f0-823c-f6a8e9979005" (UID: "63af9f1a-85d5-48f0-823c-f6a8e9979005"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:08:45 crc kubenswrapper[4909]: I0202 11:08:45.911377 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63af9f1a-85d5-48f0-823c-f6a8e9979005-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 02 11:08:46 crc kubenswrapper[4909]: I0202 11:08:46.016411 4909 scope.go:117] "RemoveContainer" containerID="042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca"
Feb 02 11:08:46 crc kubenswrapper[4909]: E0202 11:08:46.016736 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 11:08:46 crc kubenswrapper[4909]: I0202 11:08:46.283052 4909 generic.go:334] "Generic (PLEG): container finished" podID="63af9f1a-85d5-48f0-823c-f6a8e9979005" containerID="94f7c7970744415cd640a9d2970886d92b73f43ef75eaf4594a6c3257dbb4056" exitCode=0
Feb 02 11:08:46 crc kubenswrapper[4909]: I0202 11:08:46.283108 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfxj9" event={"ID":"63af9f1a-85d5-48f0-823c-f6a8e9979005","Type":"ContainerDied","Data":"94f7c7970744415cd640a9d2970886d92b73f43ef75eaf4594a6c3257dbb4056"}
Feb 02 11:08:46 crc kubenswrapper[4909]: I0202 11:08:46.283145 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kfxj9" event={"ID":"63af9f1a-85d5-48f0-823c-f6a8e9979005","Type":"ContainerDied","Data":"232fd6b839795562bca2ed7899b77a9c43cffdbafe7380d9493645285f87f209"}
Feb 02 11:08:46 crc kubenswrapper[4909]: I0202 11:08:46.283177 4909 scope.go:117] "RemoveContainer" containerID="94f7c7970744415cd640a9d2970886d92b73f43ef75eaf4594a6c3257dbb4056"
Feb 02 11:08:46 crc kubenswrapper[4909]: I0202 11:08:46.284930 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kfxj9"
Feb 02 11:08:46 crc kubenswrapper[4909]: I0202 11:08:46.306143 4909 scope.go:117] "RemoveContainer" containerID="5dcffd4b9408dd4b200370724deda38f600c087ab5cf86511003deb879d48217"
Feb 02 11:08:46 crc kubenswrapper[4909]: I0202 11:08:46.327267 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kfxj9"]
Feb 02 11:08:46 crc kubenswrapper[4909]: I0202 11:08:46.333259 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kfxj9"]
Feb 02 11:08:46 crc kubenswrapper[4909]: I0202 11:08:46.334370 4909 scope.go:117] "RemoveContainer" containerID="f6955ef8e4fcef661c38dad00800dc0cb6bf28dfee2ac4754c705a3c08247d97"
Feb 02 11:08:46 crc kubenswrapper[4909]: I0202 11:08:46.353870 4909 scope.go:117] "RemoveContainer" containerID="94f7c7970744415cd640a9d2970886d92b73f43ef75eaf4594a6c3257dbb4056"
Feb 02 11:08:46 crc kubenswrapper[4909]: E0202 11:08:46.354365 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94f7c7970744415cd640a9d2970886d92b73f43ef75eaf4594a6c3257dbb4056\": container with ID starting with 94f7c7970744415cd640a9d2970886d92b73f43ef75eaf4594a6c3257dbb4056 not found: ID does not exist" containerID="94f7c7970744415cd640a9d2970886d92b73f43ef75eaf4594a6c3257dbb4056"
Feb 02 11:08:46 crc kubenswrapper[4909]: I0202 11:08:46.354425 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94f7c7970744415cd640a9d2970886d92b73f43ef75eaf4594a6c3257dbb4056"} err="failed to get container status \"94f7c7970744415cd640a9d2970886d92b73f43ef75eaf4594a6c3257dbb4056\": rpc error: code = NotFound desc = could not find container \"94f7c7970744415cd640a9d2970886d92b73f43ef75eaf4594a6c3257dbb4056\": container with ID starting with 94f7c7970744415cd640a9d2970886d92b73f43ef75eaf4594a6c3257dbb4056 not found: ID does not exist"
Feb 02 11:08:46 crc kubenswrapper[4909]: I0202 11:08:46.354454 4909 scope.go:117] "RemoveContainer" containerID="5dcffd4b9408dd4b200370724deda38f600c087ab5cf86511003deb879d48217"
Feb 02 11:08:46 crc kubenswrapper[4909]: E0202 11:08:46.354764 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dcffd4b9408dd4b200370724deda38f600c087ab5cf86511003deb879d48217\": container with ID starting with 5dcffd4b9408dd4b200370724deda38f600c087ab5cf86511003deb879d48217 not found: ID does not exist" containerID="5dcffd4b9408dd4b200370724deda38f600c087ab5cf86511003deb879d48217"
Feb 02 11:08:46 crc kubenswrapper[4909]: I0202 11:08:46.354782 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dcffd4b9408dd4b200370724deda38f600c087ab5cf86511003deb879d48217"} err="failed to get container status \"5dcffd4b9408dd4b200370724deda38f600c087ab5cf86511003deb879d48217\": rpc error: code = NotFound desc = could not find container \"5dcffd4b9408dd4b200370724deda38f600c087ab5cf86511003deb879d48217\": container with ID starting with 5dcffd4b9408dd4b200370724deda38f600c087ab5cf86511003deb879d48217 not found: ID does not exist"
Feb 02 11:08:46 crc kubenswrapper[4909]: I0202 11:08:46.354797 4909 scope.go:117] "RemoveContainer" containerID="f6955ef8e4fcef661c38dad00800dc0cb6bf28dfee2ac4754c705a3c08247d97"
Feb 02 11:08:46 crc kubenswrapper[4909]: E0202 11:08:46.355093 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6955ef8e4fcef661c38dad00800dc0cb6bf28dfee2ac4754c705a3c08247d97\": container with ID starting with f6955ef8e4fcef661c38dad00800dc0cb6bf28dfee2ac4754c705a3c08247d97 not found: ID does not exist" containerID="f6955ef8e4fcef661c38dad00800dc0cb6bf28dfee2ac4754c705a3c08247d97"
Feb 02 11:08:46 crc kubenswrapper[4909]: I0202 11:08:46.355122 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6955ef8e4fcef661c38dad00800dc0cb6bf28dfee2ac4754c705a3c08247d97"} err="failed to get container status \"f6955ef8e4fcef661c38dad00800dc0cb6bf28dfee2ac4754c705a3c08247d97\": rpc error: code = NotFound desc = could not find container \"f6955ef8e4fcef661c38dad00800dc0cb6bf28dfee2ac4754c705a3c08247d97\": container with ID starting with f6955ef8e4fcef661c38dad00800dc0cb6bf28dfee2ac4754c705a3c08247d97 not found: ID does not exist"
Feb 02 11:08:47 crc kubenswrapper[4909]: I0202 11:08:47.025116 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63af9f1a-85d5-48f0-823c-f6a8e9979005" path="/var/lib/kubelet/pods/63af9f1a-85d5-48f0-823c-f6a8e9979005/volumes"
Feb 02 11:08:58 crc kubenswrapper[4909]: I0202 11:08:58.016260 4909 scope.go:117] "RemoveContainer" containerID="042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca"
Feb 02 11:08:58 crc kubenswrapper[4909]: E0202 11:08:58.016990 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 11:09:09 crc kubenswrapper[4909]: I0202 11:09:09.017213 4909 scope.go:117] "RemoveContainer" containerID="042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca"
Feb 02 11:09:09 crc kubenswrapper[4909]: E0202 11:09:09.018042 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 11:09:24 crc kubenswrapper[4909]: I0202 11:09:24.016746 4909 scope.go:117] "RemoveContainer" containerID="042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca"
Feb 02 11:09:24 crc kubenswrapper[4909]: E0202 11:09:24.017462 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 11:09:36 crc kubenswrapper[4909]: I0202 11:09:36.015978 4909 scope.go:117] "RemoveContainer" containerID="042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca"
Feb 02 11:09:36 crc kubenswrapper[4909]: E0202 11:09:36.016793 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 11:09:50 crc kubenswrapper[4909]: I0202 11:09:50.016296 4909 scope.go:117] "RemoveContainer" containerID="042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca"
Feb 02 11:09:50 crc kubenswrapper[4909]: E0202 11:09:50.017097 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 11:10:01 crc kubenswrapper[4909]: I0202 11:10:01.016925 4909 scope.go:117] "RemoveContainer" containerID="042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca"
Feb 02 11:10:01 crc kubenswrapper[4909]: E0202 11:10:01.017722 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 11:10:12 crc kubenswrapper[4909]: I0202 11:10:12.016279 4909 scope.go:117] "RemoveContainer" containerID="042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca"
Feb 02 11:10:12 crc kubenswrapper[4909]: E0202 11:10:12.017479 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 11:10:27 crc kubenswrapper[4909]: I0202 11:10:27.021547 4909 scope.go:117] "RemoveContainer" containerID="042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca"
Feb 02 11:10:27 crc kubenswrapper[4909]: E0202 11:10:27.022330 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 11:10:39 crc kubenswrapper[4909]: I0202 11:10:39.017041 4909 scope.go:117] "RemoveContainer" containerID="042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca"
Feb 02 11:10:39 crc kubenswrapper[4909]: E0202 11:10:39.017870 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 11:10:53 crc kubenswrapper[4909]: I0202 11:10:53.016674 4909 scope.go:117] "RemoveContainer" containerID="042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca"
Feb 02 11:10:53 crc kubenswrapper[4909]: E0202 11:10:53.017408 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 11:11:05 crc kubenswrapper[4909]: I0202 11:11:05.016656 4909 scope.go:117] "RemoveContainer" containerID="042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca"
Feb 02 11:11:05 crc kubenswrapper[4909]: E0202 11:11:05.017482 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 11:11:19 crc kubenswrapper[4909]: I0202 11:11:19.016906 4909 scope.go:117] "RemoveContainer" containerID="042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca"
Feb 02 11:11:19 crc kubenswrapper[4909]: E0202 11:11:19.017641 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 11:11:34 crc kubenswrapper[4909]: I0202 11:11:34.016014 4909 scope.go:117] "RemoveContainer" containerID="042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca"
Feb 02 11:11:34 crc kubenswrapper[4909]: E0202 11:11:34.016773 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 11:11:48 crc kubenswrapper[4909]: I0202 11:11:48.016906 4909 scope.go:117] "RemoveContainer" containerID="042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca"
Feb 02 11:11:48 crc kubenswrapper[4909]: E0202 11:11:48.017617 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 11:12:03 crc kubenswrapper[4909]: I0202 11:12:03.017176 4909 scope.go:117] "RemoveContainer" containerID="042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca"
Feb 02 11:12:03 crc kubenswrapper[4909]: E0202 11:12:03.017888 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 11:12:16 crc kubenswrapper[4909]: I0202 11:12:16.015905 4909 scope.go:117] "RemoveContainer" containerID="042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca"
Feb 02 11:12:16 crc kubenswrapper[4909]: E0202 11:12:16.016551 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 11:12:19 crc kubenswrapper[4909]: I0202 11:12:19.506246 4909 scope.go:117] "RemoveContainer" containerID="87e3fdb0f8721003f3e91d8f1d6fc77c232ea56137481a5d3e54c6402cca3e87"
Feb 02 11:12:19 crc kubenswrapper[4909]: I0202 11:12:19.528924 4909 scope.go:117] "RemoveContainer" 
containerID="95ae11736efd89a045f6d1207f8b4fd2b3bc92ae16c15ab37abae58b7f48d2ca" Feb 02 11:12:19 crc kubenswrapper[4909]: I0202 11:12:19.548234 4909 scope.go:117] "RemoveContainer" containerID="54c10dd959fa3837729735c778f514c6dd53be2254c3ce6be9716e21bb79c4e0" Feb 02 11:12:29 crc kubenswrapper[4909]: I0202 11:12:29.016307 4909 scope.go:117] "RemoveContainer" containerID="042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca" Feb 02 11:12:29 crc kubenswrapper[4909]: E0202 11:12:29.017145 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:12:42 crc kubenswrapper[4909]: I0202 11:12:42.017511 4909 scope.go:117] "RemoveContainer" containerID="042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca" Feb 02 11:12:42 crc kubenswrapper[4909]: E0202 11:12:42.018240 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:12:57 crc kubenswrapper[4909]: I0202 11:12:57.017667 4909 scope.go:117] "RemoveContainer" containerID="042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca" Feb 02 11:12:57 crc kubenswrapper[4909]: E0202 11:12:57.018755 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:13:09 crc kubenswrapper[4909]: I0202 11:13:09.016643 4909 scope.go:117] "RemoveContainer" containerID="042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca" Feb 02 11:13:09 crc kubenswrapper[4909]: E0202 11:13:09.018317 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:13:22 crc kubenswrapper[4909]: I0202 11:13:22.017255 4909 scope.go:117] "RemoveContainer" containerID="042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca" Feb 02 11:13:23 crc kubenswrapper[4909]: I0202 11:13:23.099753 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"183542b16bfbba399a64fca4f881f43f3b512381f9b281b3fed5ed8cb6447db0"} Feb 02 11:14:40 crc kubenswrapper[4909]: I0202 11:14:40.740789 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kzw2c"] Feb 02 11:14:40 crc kubenswrapper[4909]: E0202 11:14:40.741738 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63af9f1a-85d5-48f0-823c-f6a8e9979005" containerName="extract-content" Feb 02 11:14:40 crc kubenswrapper[4909]: I0202 11:14:40.741754 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="63af9f1a-85d5-48f0-823c-f6a8e9979005" 
containerName="extract-content" Feb 02 11:14:40 crc kubenswrapper[4909]: E0202 11:14:40.741768 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63af9f1a-85d5-48f0-823c-f6a8e9979005" containerName="registry-server" Feb 02 11:14:40 crc kubenswrapper[4909]: I0202 11:14:40.741776 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="63af9f1a-85d5-48f0-823c-f6a8e9979005" containerName="registry-server" Feb 02 11:14:40 crc kubenswrapper[4909]: E0202 11:14:40.741790 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63af9f1a-85d5-48f0-823c-f6a8e9979005" containerName="extract-utilities" Feb 02 11:14:40 crc kubenswrapper[4909]: I0202 11:14:40.741797 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="63af9f1a-85d5-48f0-823c-f6a8e9979005" containerName="extract-utilities" Feb 02 11:14:40 crc kubenswrapper[4909]: I0202 11:14:40.741995 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="63af9f1a-85d5-48f0-823c-f6a8e9979005" containerName="registry-server" Feb 02 11:14:40 crc kubenswrapper[4909]: I0202 11:14:40.743145 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kzw2c" Feb 02 11:14:40 crc kubenswrapper[4909]: I0202 11:14:40.748718 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kzw2c"] Feb 02 11:14:40 crc kubenswrapper[4909]: I0202 11:14:40.802480 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96-utilities\") pod \"community-operators-kzw2c\" (UID: \"fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96\") " pod="openshift-marketplace/community-operators-kzw2c" Feb 02 11:14:40 crc kubenswrapper[4909]: I0202 11:14:40.802536 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96-catalog-content\") pod \"community-operators-kzw2c\" (UID: \"fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96\") " pod="openshift-marketplace/community-operators-kzw2c" Feb 02 11:14:40 crc kubenswrapper[4909]: I0202 11:14:40.802559 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqj6n\" (UniqueName: \"kubernetes.io/projected/fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96-kube-api-access-mqj6n\") pod \"community-operators-kzw2c\" (UID: \"fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96\") " pod="openshift-marketplace/community-operators-kzw2c" Feb 02 11:14:40 crc kubenswrapper[4909]: I0202 11:14:40.904216 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96-utilities\") pod \"community-operators-kzw2c\" (UID: \"fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96\") " pod="openshift-marketplace/community-operators-kzw2c" Feb 02 11:14:40 crc kubenswrapper[4909]: I0202 11:14:40.904270 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96-catalog-content\") pod \"community-operators-kzw2c\" (UID: \"fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96\") " pod="openshift-marketplace/community-operators-kzw2c" Feb 02 11:14:40 crc kubenswrapper[4909]: I0202 11:14:40.904297 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqj6n\" (UniqueName: \"kubernetes.io/projected/fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96-kube-api-access-mqj6n\") pod \"community-operators-kzw2c\" (UID: \"fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96\") " pod="openshift-marketplace/community-operators-kzw2c" Feb 02 11:14:40 crc kubenswrapper[4909]: I0202 11:14:40.904852 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96-utilities\") pod \"community-operators-kzw2c\" (UID: \"fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96\") " pod="openshift-marketplace/community-operators-kzw2c" Feb 02 11:14:40 crc kubenswrapper[4909]: I0202 11:14:40.904911 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96-catalog-content\") pod \"community-operators-kzw2c\" (UID: \"fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96\") " pod="openshift-marketplace/community-operators-kzw2c" Feb 02 11:14:40 crc kubenswrapper[4909]: I0202 11:14:40.924398 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqj6n\" (UniqueName: \"kubernetes.io/projected/fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96-kube-api-access-mqj6n\") pod \"community-operators-kzw2c\" (UID: \"fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96\") " pod="openshift-marketplace/community-operators-kzw2c" Feb 02 11:14:41 crc kubenswrapper[4909]: I0202 11:14:41.060616 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kzw2c" Feb 02 11:14:41 crc kubenswrapper[4909]: I0202 11:14:41.554376 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kzw2c"] Feb 02 11:14:41 crc kubenswrapper[4909]: I0202 11:14:41.629364 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzw2c" event={"ID":"fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96","Type":"ContainerStarted","Data":"afc4c1984ea8f9864e9189ee05bd615ddb7b158ae750d9dff8761b9d9159e28e"} Feb 02 11:14:42 crc kubenswrapper[4909]: I0202 11:14:42.636899 4909 generic.go:334] "Generic (PLEG): container finished" podID="fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96" containerID="08bc2c0558c2261fa7baad496fa1a85f278d1c517ea19e63fd32d32d2a6b3bc0" exitCode=0 Feb 02 11:14:42 crc kubenswrapper[4909]: I0202 11:14:42.636947 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzw2c" event={"ID":"fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96","Type":"ContainerDied","Data":"08bc2c0558c2261fa7baad496fa1a85f278d1c517ea19e63fd32d32d2a6b3bc0"} Feb 02 11:14:42 crc kubenswrapper[4909]: I0202 11:14:42.638493 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 11:14:43 crc kubenswrapper[4909]: I0202 11:14:43.645590 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzw2c" event={"ID":"fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96","Type":"ContainerStarted","Data":"19d283715841c374838548c9dd189b23454336570b1241f30f3047f328490ec7"} Feb 02 11:14:44 crc kubenswrapper[4909]: I0202 11:14:44.653010 4909 generic.go:334] "Generic (PLEG): container finished" podID="fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96" containerID="19d283715841c374838548c9dd189b23454336570b1241f30f3047f328490ec7" exitCode=0 Feb 02 11:14:44 crc kubenswrapper[4909]: I0202 11:14:44.653055 4909 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-kzw2c" event={"ID":"fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96","Type":"ContainerDied","Data":"19d283715841c374838548c9dd189b23454336570b1241f30f3047f328490ec7"} Feb 02 11:14:45 crc kubenswrapper[4909]: I0202 11:14:45.660640 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzw2c" event={"ID":"fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96","Type":"ContainerStarted","Data":"8c3c1be8ab2e9a344c35e02420b27dae48515a50e8c34a90be8961a5e2bf9233"} Feb 02 11:14:45 crc kubenswrapper[4909]: I0202 11:14:45.680608 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kzw2c" podStartSLOduration=3.214144048 podStartE2EDuration="5.680589488s" podCreationTimestamp="2026-02-02 11:14:40 +0000 UTC" firstStartedPulling="2026-02-02 11:14:42.638232447 +0000 UTC m=+2608.384333182" lastFinishedPulling="2026-02-02 11:14:45.104677887 +0000 UTC m=+2610.850778622" observedRunningTime="2026-02-02 11:14:45.676956384 +0000 UTC m=+2611.423057119" watchObservedRunningTime="2026-02-02 11:14:45.680589488 +0000 UTC m=+2611.426690223" Feb 02 11:14:51 crc kubenswrapper[4909]: I0202 11:14:51.061785 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kzw2c" Feb 02 11:14:51 crc kubenswrapper[4909]: I0202 11:14:51.062395 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kzw2c" Feb 02 11:14:51 crc kubenswrapper[4909]: I0202 11:14:51.101655 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kzw2c" Feb 02 11:14:51 crc kubenswrapper[4909]: I0202 11:14:51.743366 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kzw2c" Feb 02 11:14:51 crc kubenswrapper[4909]: I0202 
11:14:51.782111 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kzw2c"] Feb 02 11:14:53 crc kubenswrapper[4909]: I0202 11:14:53.717876 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kzw2c" podUID="fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96" containerName="registry-server" containerID="cri-o://8c3c1be8ab2e9a344c35e02420b27dae48515a50e8c34a90be8961a5e2bf9233" gracePeriod=2 Feb 02 11:14:54 crc kubenswrapper[4909]: I0202 11:14:54.659079 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kzw2c" Feb 02 11:14:54 crc kubenswrapper[4909]: I0202 11:14:54.696973 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96-utilities\") pod \"fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96\" (UID: \"fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96\") " Feb 02 11:14:54 crc kubenswrapper[4909]: I0202 11:14:54.697130 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96-catalog-content\") pod \"fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96\" (UID: \"fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96\") " Feb 02 11:14:54 crc kubenswrapper[4909]: I0202 11:14:54.697274 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqj6n\" (UniqueName: \"kubernetes.io/projected/fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96-kube-api-access-mqj6n\") pod \"fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96\" (UID: \"fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96\") " Feb 02 11:14:54 crc kubenswrapper[4909]: I0202 11:14:54.701032 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96-utilities" (OuterVolumeSpecName: 
"utilities") pod "fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96" (UID: "fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:14:54 crc kubenswrapper[4909]: I0202 11:14:54.713298 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96-kube-api-access-mqj6n" (OuterVolumeSpecName: "kube-api-access-mqj6n") pod "fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96" (UID: "fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96"). InnerVolumeSpecName "kube-api-access-mqj6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:14:54 crc kubenswrapper[4909]: I0202 11:14:54.729846 4909 generic.go:334] "Generic (PLEG): container finished" podID="fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96" containerID="8c3c1be8ab2e9a344c35e02420b27dae48515a50e8c34a90be8961a5e2bf9233" exitCode=0 Feb 02 11:14:54 crc kubenswrapper[4909]: I0202 11:14:54.729888 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzw2c" event={"ID":"fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96","Type":"ContainerDied","Data":"8c3c1be8ab2e9a344c35e02420b27dae48515a50e8c34a90be8961a5e2bf9233"} Feb 02 11:14:54 crc kubenswrapper[4909]: I0202 11:14:54.729913 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kzw2c" event={"ID":"fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96","Type":"ContainerDied","Data":"afc4c1984ea8f9864e9189ee05bd615ddb7b158ae750d9dff8761b9d9159e28e"} Feb 02 11:14:54 crc kubenswrapper[4909]: I0202 11:14:54.729932 4909 scope.go:117] "RemoveContainer" containerID="8c3c1be8ab2e9a344c35e02420b27dae48515a50e8c34a90be8961a5e2bf9233" Feb 02 11:14:54 crc kubenswrapper[4909]: I0202 11:14:54.730658 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kzw2c" Feb 02 11:14:54 crc kubenswrapper[4909]: I0202 11:14:54.749574 4909 scope.go:117] "RemoveContainer" containerID="19d283715841c374838548c9dd189b23454336570b1241f30f3047f328490ec7" Feb 02 11:14:54 crc kubenswrapper[4909]: I0202 11:14:54.754753 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96" (UID: "fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:14:54 crc kubenswrapper[4909]: I0202 11:14:54.769184 4909 scope.go:117] "RemoveContainer" containerID="08bc2c0558c2261fa7baad496fa1a85f278d1c517ea19e63fd32d32d2a6b3bc0" Feb 02 11:14:54 crc kubenswrapper[4909]: I0202 11:14:54.789689 4909 scope.go:117] "RemoveContainer" containerID="8c3c1be8ab2e9a344c35e02420b27dae48515a50e8c34a90be8961a5e2bf9233" Feb 02 11:14:54 crc kubenswrapper[4909]: E0202 11:14:54.790273 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c3c1be8ab2e9a344c35e02420b27dae48515a50e8c34a90be8961a5e2bf9233\": container with ID starting with 8c3c1be8ab2e9a344c35e02420b27dae48515a50e8c34a90be8961a5e2bf9233 not found: ID does not exist" containerID="8c3c1be8ab2e9a344c35e02420b27dae48515a50e8c34a90be8961a5e2bf9233" Feb 02 11:14:54 crc kubenswrapper[4909]: I0202 11:14:54.790323 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c3c1be8ab2e9a344c35e02420b27dae48515a50e8c34a90be8961a5e2bf9233"} err="failed to get container status \"8c3c1be8ab2e9a344c35e02420b27dae48515a50e8c34a90be8961a5e2bf9233\": rpc error: code = NotFound desc = could not find container \"8c3c1be8ab2e9a344c35e02420b27dae48515a50e8c34a90be8961a5e2bf9233\": 
container with ID starting with 8c3c1be8ab2e9a344c35e02420b27dae48515a50e8c34a90be8961a5e2bf9233 not found: ID does not exist" Feb 02 11:14:54 crc kubenswrapper[4909]: I0202 11:14:54.790351 4909 scope.go:117] "RemoveContainer" containerID="19d283715841c374838548c9dd189b23454336570b1241f30f3047f328490ec7" Feb 02 11:14:54 crc kubenswrapper[4909]: E0202 11:14:54.790726 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19d283715841c374838548c9dd189b23454336570b1241f30f3047f328490ec7\": container with ID starting with 19d283715841c374838548c9dd189b23454336570b1241f30f3047f328490ec7 not found: ID does not exist" containerID="19d283715841c374838548c9dd189b23454336570b1241f30f3047f328490ec7" Feb 02 11:14:54 crc kubenswrapper[4909]: I0202 11:14:54.790765 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19d283715841c374838548c9dd189b23454336570b1241f30f3047f328490ec7"} err="failed to get container status \"19d283715841c374838548c9dd189b23454336570b1241f30f3047f328490ec7\": rpc error: code = NotFound desc = could not find container \"19d283715841c374838548c9dd189b23454336570b1241f30f3047f328490ec7\": container with ID starting with 19d283715841c374838548c9dd189b23454336570b1241f30f3047f328490ec7 not found: ID does not exist" Feb 02 11:14:54 crc kubenswrapper[4909]: I0202 11:14:54.790791 4909 scope.go:117] "RemoveContainer" containerID="08bc2c0558c2261fa7baad496fa1a85f278d1c517ea19e63fd32d32d2a6b3bc0" Feb 02 11:14:54 crc kubenswrapper[4909]: E0202 11:14:54.791120 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08bc2c0558c2261fa7baad496fa1a85f278d1c517ea19e63fd32d32d2a6b3bc0\": container with ID starting with 08bc2c0558c2261fa7baad496fa1a85f278d1c517ea19e63fd32d32d2a6b3bc0 not found: ID does not exist" 
containerID="08bc2c0558c2261fa7baad496fa1a85f278d1c517ea19e63fd32d32d2a6b3bc0" Feb 02 11:14:54 crc kubenswrapper[4909]: I0202 11:14:54.791138 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08bc2c0558c2261fa7baad496fa1a85f278d1c517ea19e63fd32d32d2a6b3bc0"} err="failed to get container status \"08bc2c0558c2261fa7baad496fa1a85f278d1c517ea19e63fd32d32d2a6b3bc0\": rpc error: code = NotFound desc = could not find container \"08bc2c0558c2261fa7baad496fa1a85f278d1c517ea19e63fd32d32d2a6b3bc0\": container with ID starting with 08bc2c0558c2261fa7baad496fa1a85f278d1c517ea19e63fd32d32d2a6b3bc0 not found: ID does not exist" Feb 02 11:14:54 crc kubenswrapper[4909]: I0202 11:14:54.798975 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:14:54 crc kubenswrapper[4909]: I0202 11:14:54.799086 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:14:54 crc kubenswrapper[4909]: I0202 11:14:54.799153 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqj6n\" (UniqueName: \"kubernetes.io/projected/fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96-kube-api-access-mqj6n\") on node \"crc\" DevicePath \"\"" Feb 02 11:14:55 crc kubenswrapper[4909]: I0202 11:14:55.058966 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kzw2c"] Feb 02 11:14:55 crc kubenswrapper[4909]: I0202 11:14:55.064095 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kzw2c"] Feb 02 11:14:57 crc kubenswrapper[4909]: I0202 11:14:57.026011 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96" path="/var/lib/kubelet/pods/fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96/volumes" Feb 02 11:15:00 crc kubenswrapper[4909]: I0202 11:15:00.140445 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500515-8jw4h"] Feb 02 11:15:00 crc kubenswrapper[4909]: E0202 11:15:00.140822 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96" containerName="registry-server" Feb 02 11:15:00 crc kubenswrapper[4909]: I0202 11:15:00.140837 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96" containerName="registry-server" Feb 02 11:15:00 crc kubenswrapper[4909]: E0202 11:15:00.140853 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96" containerName="extract-utilities" Feb 02 11:15:00 crc kubenswrapper[4909]: I0202 11:15:00.140860 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96" containerName="extract-utilities" Feb 02 11:15:00 crc kubenswrapper[4909]: E0202 11:15:00.140878 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96" containerName="extract-content" Feb 02 11:15:00 crc kubenswrapper[4909]: I0202 11:15:00.140885 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96" containerName="extract-content" Feb 02 11:15:00 crc kubenswrapper[4909]: I0202 11:15:00.141072 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa23dabd-2e2f-4ac8-b3f5-668fbcf37e96" containerName="registry-server" Feb 02 11:15:00 crc kubenswrapper[4909]: I0202 11:15:00.141607 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-8jw4h" Feb 02 11:15:00 crc kubenswrapper[4909]: I0202 11:15:00.143482 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 02 11:15:00 crc kubenswrapper[4909]: I0202 11:15:00.144370 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 02 11:15:00 crc kubenswrapper[4909]: I0202 11:15:00.160765 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500515-8jw4h"] Feb 02 11:15:00 crc kubenswrapper[4909]: I0202 11:15:00.171947 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtd2l\" (UniqueName: \"kubernetes.io/projected/1ca8f19f-9709-47f7-8207-7e98f2eec922-kube-api-access-qtd2l\") pod \"collect-profiles-29500515-8jw4h\" (UID: \"1ca8f19f-9709-47f7-8207-7e98f2eec922\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-8jw4h" Feb 02 11:15:00 crc kubenswrapper[4909]: I0202 11:15:00.172172 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ca8f19f-9709-47f7-8207-7e98f2eec922-config-volume\") pod \"collect-profiles-29500515-8jw4h\" (UID: \"1ca8f19f-9709-47f7-8207-7e98f2eec922\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-8jw4h" Feb 02 11:15:00 crc kubenswrapper[4909]: I0202 11:15:00.172288 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ca8f19f-9709-47f7-8207-7e98f2eec922-secret-volume\") pod \"collect-profiles-29500515-8jw4h\" (UID: \"1ca8f19f-9709-47f7-8207-7e98f2eec922\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-8jw4h" Feb 02 11:15:00 crc kubenswrapper[4909]: I0202 11:15:00.273405 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtd2l\" (UniqueName: \"kubernetes.io/projected/1ca8f19f-9709-47f7-8207-7e98f2eec922-kube-api-access-qtd2l\") pod \"collect-profiles-29500515-8jw4h\" (UID: \"1ca8f19f-9709-47f7-8207-7e98f2eec922\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-8jw4h" Feb 02 11:15:00 crc kubenswrapper[4909]: I0202 11:15:00.273468 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ca8f19f-9709-47f7-8207-7e98f2eec922-config-volume\") pod \"collect-profiles-29500515-8jw4h\" (UID: \"1ca8f19f-9709-47f7-8207-7e98f2eec922\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-8jw4h" Feb 02 11:15:00 crc kubenswrapper[4909]: I0202 11:15:00.273532 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ca8f19f-9709-47f7-8207-7e98f2eec922-secret-volume\") pod \"collect-profiles-29500515-8jw4h\" (UID: \"1ca8f19f-9709-47f7-8207-7e98f2eec922\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-8jw4h" Feb 02 11:15:00 crc kubenswrapper[4909]: I0202 11:15:00.274665 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ca8f19f-9709-47f7-8207-7e98f2eec922-config-volume\") pod \"collect-profiles-29500515-8jw4h\" (UID: \"1ca8f19f-9709-47f7-8207-7e98f2eec922\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-8jw4h" Feb 02 11:15:00 crc kubenswrapper[4909]: I0202 11:15:00.279576 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/1ca8f19f-9709-47f7-8207-7e98f2eec922-secret-volume\") pod \"collect-profiles-29500515-8jw4h\" (UID: \"1ca8f19f-9709-47f7-8207-7e98f2eec922\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-8jw4h" Feb 02 11:15:00 crc kubenswrapper[4909]: I0202 11:15:00.290135 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtd2l\" (UniqueName: \"kubernetes.io/projected/1ca8f19f-9709-47f7-8207-7e98f2eec922-kube-api-access-qtd2l\") pod \"collect-profiles-29500515-8jw4h\" (UID: \"1ca8f19f-9709-47f7-8207-7e98f2eec922\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-8jw4h" Feb 02 11:15:00 crc kubenswrapper[4909]: I0202 11:15:00.459520 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-8jw4h" Feb 02 11:15:00 crc kubenswrapper[4909]: I0202 11:15:00.874094 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500515-8jw4h"] Feb 02 11:15:01 crc kubenswrapper[4909]: I0202 11:15:01.775964 4909 generic.go:334] "Generic (PLEG): container finished" podID="1ca8f19f-9709-47f7-8207-7e98f2eec922" containerID="439dc9403e649a83be0f9cae449b08589f73d864776a962e1987a3a3dba74aae" exitCode=0 Feb 02 11:15:01 crc kubenswrapper[4909]: I0202 11:15:01.776019 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-8jw4h" event={"ID":"1ca8f19f-9709-47f7-8207-7e98f2eec922","Type":"ContainerDied","Data":"439dc9403e649a83be0f9cae449b08589f73d864776a962e1987a3a3dba74aae"} Feb 02 11:15:01 crc kubenswrapper[4909]: I0202 11:15:01.776309 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-8jw4h" 
event={"ID":"1ca8f19f-9709-47f7-8207-7e98f2eec922","Type":"ContainerStarted","Data":"a5bac1122ee3bd54baee66acef0c9f0e7385393c46e5f3d21c5a464b66fbd161"} Feb 02 11:15:03 crc kubenswrapper[4909]: I0202 11:15:03.007689 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-8jw4h" Feb 02 11:15:03 crc kubenswrapper[4909]: I0202 11:15:03.116970 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ca8f19f-9709-47f7-8207-7e98f2eec922-config-volume\") pod \"1ca8f19f-9709-47f7-8207-7e98f2eec922\" (UID: \"1ca8f19f-9709-47f7-8207-7e98f2eec922\") " Feb 02 11:15:03 crc kubenswrapper[4909]: I0202 11:15:03.117245 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ca8f19f-9709-47f7-8207-7e98f2eec922-secret-volume\") pod \"1ca8f19f-9709-47f7-8207-7e98f2eec922\" (UID: \"1ca8f19f-9709-47f7-8207-7e98f2eec922\") " Feb 02 11:15:03 crc kubenswrapper[4909]: I0202 11:15:03.117350 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtd2l\" (UniqueName: \"kubernetes.io/projected/1ca8f19f-9709-47f7-8207-7e98f2eec922-kube-api-access-qtd2l\") pod \"1ca8f19f-9709-47f7-8207-7e98f2eec922\" (UID: \"1ca8f19f-9709-47f7-8207-7e98f2eec922\") " Feb 02 11:15:03 crc kubenswrapper[4909]: I0202 11:15:03.117828 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ca8f19f-9709-47f7-8207-7e98f2eec922-config-volume" (OuterVolumeSpecName: "config-volume") pod "1ca8f19f-9709-47f7-8207-7e98f2eec922" (UID: "1ca8f19f-9709-47f7-8207-7e98f2eec922"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:15:03 crc kubenswrapper[4909]: I0202 11:15:03.122986 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ca8f19f-9709-47f7-8207-7e98f2eec922-kube-api-access-qtd2l" (OuterVolumeSpecName: "kube-api-access-qtd2l") pod "1ca8f19f-9709-47f7-8207-7e98f2eec922" (UID: "1ca8f19f-9709-47f7-8207-7e98f2eec922"). InnerVolumeSpecName "kube-api-access-qtd2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:15:03 crc kubenswrapper[4909]: I0202 11:15:03.123356 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ca8f19f-9709-47f7-8207-7e98f2eec922-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1ca8f19f-9709-47f7-8207-7e98f2eec922" (UID: "1ca8f19f-9709-47f7-8207-7e98f2eec922"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:15:03 crc kubenswrapper[4909]: I0202 11:15:03.219258 4909 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1ca8f19f-9709-47f7-8207-7e98f2eec922-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 02 11:15:03 crc kubenswrapper[4909]: I0202 11:15:03.219293 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtd2l\" (UniqueName: \"kubernetes.io/projected/1ca8f19f-9709-47f7-8207-7e98f2eec922-kube-api-access-qtd2l\") on node \"crc\" DevicePath \"\"" Feb 02 11:15:03 crc kubenswrapper[4909]: I0202 11:15:03.219303 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1ca8f19f-9709-47f7-8207-7e98f2eec922-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 11:15:03 crc kubenswrapper[4909]: I0202 11:15:03.793456 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-8jw4h" 
event={"ID":"1ca8f19f-9709-47f7-8207-7e98f2eec922","Type":"ContainerDied","Data":"a5bac1122ee3bd54baee66acef0c9f0e7385393c46e5f3d21c5a464b66fbd161"} Feb 02 11:15:03 crc kubenswrapper[4909]: I0202 11:15:03.793497 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5bac1122ee3bd54baee66acef0c9f0e7385393c46e5f3d21c5a464b66fbd161" Feb 02 11:15:03 crc kubenswrapper[4909]: I0202 11:15:03.793516 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500515-8jw4h" Feb 02 11:15:04 crc kubenswrapper[4909]: I0202 11:15:04.080272 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500470-c2btb"] Feb 02 11:15:04 crc kubenswrapper[4909]: I0202 11:15:04.084525 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500470-c2btb"] Feb 02 11:15:05 crc kubenswrapper[4909]: I0202 11:15:05.028786 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50e06f1d-ae35-4177-afd0-55fc1112f0a7" path="/var/lib/kubelet/pods/50e06f1d-ae35-4177-afd0-55fc1112f0a7/volumes" Feb 02 11:15:19 crc kubenswrapper[4909]: I0202 11:15:19.627498 4909 scope.go:117] "RemoveContainer" containerID="f2ce9a446a9f4c9bdabeb2a92038d6a2aff0a57b0b772e5fde0eed499f1ff4ee" Feb 02 11:15:49 crc kubenswrapper[4909]: I0202 11:15:49.511145 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:15:49 crc kubenswrapper[4909]: I0202 11:15:49.511728 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:16:19 crc kubenswrapper[4909]: I0202 11:16:19.510604 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:16:19 crc kubenswrapper[4909]: I0202 11:16:19.511092 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:16:46 crc kubenswrapper[4909]: I0202 11:16:46.474393 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zrpsf"] Feb 02 11:16:46 crc kubenswrapper[4909]: E0202 11:16:46.475712 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca8f19f-9709-47f7-8207-7e98f2eec922" containerName="collect-profiles" Feb 02 11:16:46 crc kubenswrapper[4909]: I0202 11:16:46.475732 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca8f19f-9709-47f7-8207-7e98f2eec922" containerName="collect-profiles" Feb 02 11:16:46 crc kubenswrapper[4909]: I0202 11:16:46.475935 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ca8f19f-9709-47f7-8207-7e98f2eec922" containerName="collect-profiles" Feb 02 11:16:46 crc kubenswrapper[4909]: I0202 11:16:46.477317 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zrpsf" Feb 02 11:16:46 crc kubenswrapper[4909]: I0202 11:16:46.488000 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zrpsf"] Feb 02 11:16:46 crc kubenswrapper[4909]: I0202 11:16:46.579311 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84009fbe-f5f0-4426-8104-b942425e4522-catalog-content\") pod \"redhat-operators-zrpsf\" (UID: \"84009fbe-f5f0-4426-8104-b942425e4522\") " pod="openshift-marketplace/redhat-operators-zrpsf" Feb 02 11:16:46 crc kubenswrapper[4909]: I0202 11:16:46.579393 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn742\" (UniqueName: \"kubernetes.io/projected/84009fbe-f5f0-4426-8104-b942425e4522-kube-api-access-bn742\") pod \"redhat-operators-zrpsf\" (UID: \"84009fbe-f5f0-4426-8104-b942425e4522\") " pod="openshift-marketplace/redhat-operators-zrpsf" Feb 02 11:16:46 crc kubenswrapper[4909]: I0202 11:16:46.579444 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84009fbe-f5f0-4426-8104-b942425e4522-utilities\") pod \"redhat-operators-zrpsf\" (UID: \"84009fbe-f5f0-4426-8104-b942425e4522\") " pod="openshift-marketplace/redhat-operators-zrpsf" Feb 02 11:16:46 crc kubenswrapper[4909]: I0202 11:16:46.681155 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84009fbe-f5f0-4426-8104-b942425e4522-catalog-content\") pod \"redhat-operators-zrpsf\" (UID: \"84009fbe-f5f0-4426-8104-b942425e4522\") " pod="openshift-marketplace/redhat-operators-zrpsf" Feb 02 11:16:46 crc kubenswrapper[4909]: I0202 11:16:46.681235 4909 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-bn742\" (UniqueName: \"kubernetes.io/projected/84009fbe-f5f0-4426-8104-b942425e4522-kube-api-access-bn742\") pod \"redhat-operators-zrpsf\" (UID: \"84009fbe-f5f0-4426-8104-b942425e4522\") " pod="openshift-marketplace/redhat-operators-zrpsf" Feb 02 11:16:46 crc kubenswrapper[4909]: I0202 11:16:46.681288 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84009fbe-f5f0-4426-8104-b942425e4522-utilities\") pod \"redhat-operators-zrpsf\" (UID: \"84009fbe-f5f0-4426-8104-b942425e4522\") " pod="openshift-marketplace/redhat-operators-zrpsf" Feb 02 11:16:46 crc kubenswrapper[4909]: I0202 11:16:46.682001 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84009fbe-f5f0-4426-8104-b942425e4522-utilities\") pod \"redhat-operators-zrpsf\" (UID: \"84009fbe-f5f0-4426-8104-b942425e4522\") " pod="openshift-marketplace/redhat-operators-zrpsf" Feb 02 11:16:46 crc kubenswrapper[4909]: I0202 11:16:46.682784 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84009fbe-f5f0-4426-8104-b942425e4522-catalog-content\") pod \"redhat-operators-zrpsf\" (UID: \"84009fbe-f5f0-4426-8104-b942425e4522\") " pod="openshift-marketplace/redhat-operators-zrpsf" Feb 02 11:16:46 crc kubenswrapper[4909]: I0202 11:16:46.709528 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn742\" (UniqueName: \"kubernetes.io/projected/84009fbe-f5f0-4426-8104-b942425e4522-kube-api-access-bn742\") pod \"redhat-operators-zrpsf\" (UID: \"84009fbe-f5f0-4426-8104-b942425e4522\") " pod="openshift-marketplace/redhat-operators-zrpsf" Feb 02 11:16:46 crc kubenswrapper[4909]: I0202 11:16:46.843977 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zrpsf" Feb 02 11:16:47 crc kubenswrapper[4909]: I0202 11:16:47.287304 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zrpsf"] Feb 02 11:16:47 crc kubenswrapper[4909]: I0202 11:16:47.459539 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrpsf" event={"ID":"84009fbe-f5f0-4426-8104-b942425e4522","Type":"ContainerStarted","Data":"622c0d46e2b4f0fa6ead1c067432bda8dd9586ada2f59b3da4e260ee8e1f04ce"} Feb 02 11:16:47 crc kubenswrapper[4909]: I0202 11:16:47.460050 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrpsf" event={"ID":"84009fbe-f5f0-4426-8104-b942425e4522","Type":"ContainerStarted","Data":"f68e2ba2aa75f5b1cccdd13437dcef2b4a6572b4d03f62c2e5be3f0562cd79b6"} Feb 02 11:16:48 crc kubenswrapper[4909]: I0202 11:16:48.467163 4909 generic.go:334] "Generic (PLEG): container finished" podID="84009fbe-f5f0-4426-8104-b942425e4522" containerID="622c0d46e2b4f0fa6ead1c067432bda8dd9586ada2f59b3da4e260ee8e1f04ce" exitCode=0 Feb 02 11:16:48 crc kubenswrapper[4909]: I0202 11:16:48.467215 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrpsf" event={"ID":"84009fbe-f5f0-4426-8104-b942425e4522","Type":"ContainerDied","Data":"622c0d46e2b4f0fa6ead1c067432bda8dd9586ada2f59b3da4e260ee8e1f04ce"} Feb 02 11:16:49 crc kubenswrapper[4909]: I0202 11:16:49.477930 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrpsf" event={"ID":"84009fbe-f5f0-4426-8104-b942425e4522","Type":"ContainerStarted","Data":"afc5af276c066ce576ceaefedbe5ae6ef64914291fce8cd98bb1ffb9d7dfb4ac"} Feb 02 11:16:49 crc kubenswrapper[4909]: I0202 11:16:49.511531 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:16:49 crc kubenswrapper[4909]: I0202 11:16:49.511595 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:16:49 crc kubenswrapper[4909]: I0202 11:16:49.511641 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 11:16:49 crc kubenswrapper[4909]: I0202 11:16:49.512242 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"183542b16bfbba399a64fca4f881f43f3b512381f9b281b3fed5ed8cb6447db0"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 11:16:49 crc kubenswrapper[4909]: I0202 11:16:49.512316 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://183542b16bfbba399a64fca4f881f43f3b512381f9b281b3fed5ed8cb6447db0" gracePeriod=600 Feb 02 11:16:50 crc kubenswrapper[4909]: I0202 11:16:50.486403 4909 generic.go:334] "Generic (PLEG): container finished" podID="84009fbe-f5f0-4426-8104-b942425e4522" containerID="afc5af276c066ce576ceaefedbe5ae6ef64914291fce8cd98bb1ffb9d7dfb4ac" exitCode=0 Feb 02 11:16:50 crc kubenswrapper[4909]: I0202 11:16:50.486772 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrpsf" 
event={"ID":"84009fbe-f5f0-4426-8104-b942425e4522","Type":"ContainerDied","Data":"afc5af276c066ce576ceaefedbe5ae6ef64914291fce8cd98bb1ffb9d7dfb4ac"} Feb 02 11:16:50 crc kubenswrapper[4909]: I0202 11:16:50.491643 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"183542b16bfbba399a64fca4f881f43f3b512381f9b281b3fed5ed8cb6447db0"} Feb 02 11:16:50 crc kubenswrapper[4909]: I0202 11:16:50.491772 4909 scope.go:117] "RemoveContainer" containerID="042cf1adf9a9870ec7eef86fabe40de975e5e001615487eec3af4c0c6423f6ca" Feb 02 11:16:50 crc kubenswrapper[4909]: I0202 11:16:50.491495 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="183542b16bfbba399a64fca4f881f43f3b512381f9b281b3fed5ed8cb6447db0" exitCode=0 Feb 02 11:16:50 crc kubenswrapper[4909]: I0202 11:16:50.495269 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26"} Feb 02 11:16:51 crc kubenswrapper[4909]: I0202 11:16:51.506342 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrpsf" event={"ID":"84009fbe-f5f0-4426-8104-b942425e4522","Type":"ContainerStarted","Data":"ac11369a4b79a050f7a8d7bc9962d94c805eec304df2f4aa1a464ebd6d18c2be"} Feb 02 11:16:51 crc kubenswrapper[4909]: I0202 11:16:51.526391 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zrpsf" podStartSLOduration=3.114536822 podStartE2EDuration="5.526372466s" podCreationTimestamp="2026-02-02 11:16:46 +0000 UTC" firstStartedPulling="2026-02-02 11:16:48.468571665 +0000 UTC m=+2734.214672400" lastFinishedPulling="2026-02-02 
11:16:50.880407309 +0000 UTC m=+2736.626508044" observedRunningTime="2026-02-02 11:16:51.52126089 +0000 UTC m=+2737.267361625" watchObservedRunningTime="2026-02-02 11:16:51.526372466 +0000 UTC m=+2737.272473201" Feb 02 11:16:56 crc kubenswrapper[4909]: I0202 11:16:56.844333 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zrpsf" Feb 02 11:16:56 crc kubenswrapper[4909]: I0202 11:16:56.844907 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zrpsf" Feb 02 11:16:56 crc kubenswrapper[4909]: I0202 11:16:56.889357 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zrpsf" Feb 02 11:16:57 crc kubenswrapper[4909]: I0202 11:16:57.581689 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zrpsf" Feb 02 11:16:57 crc kubenswrapper[4909]: I0202 11:16:57.624655 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zrpsf"] Feb 02 11:16:59 crc kubenswrapper[4909]: I0202 11:16:59.555507 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zrpsf" podUID="84009fbe-f5f0-4426-8104-b942425e4522" containerName="registry-server" containerID="cri-o://ac11369a4b79a050f7a8d7bc9962d94c805eec304df2f4aa1a464ebd6d18c2be" gracePeriod=2 Feb 02 11:17:00 crc kubenswrapper[4909]: I0202 11:17:00.002171 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zrpsf" Feb 02 11:17:00 crc kubenswrapper[4909]: I0202 11:17:00.166090 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84009fbe-f5f0-4426-8104-b942425e4522-utilities\") pod \"84009fbe-f5f0-4426-8104-b942425e4522\" (UID: \"84009fbe-f5f0-4426-8104-b942425e4522\") " Feb 02 11:17:00 crc kubenswrapper[4909]: I0202 11:17:00.166164 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84009fbe-f5f0-4426-8104-b942425e4522-catalog-content\") pod \"84009fbe-f5f0-4426-8104-b942425e4522\" (UID: \"84009fbe-f5f0-4426-8104-b942425e4522\") " Feb 02 11:17:00 crc kubenswrapper[4909]: I0202 11:17:00.166257 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn742\" (UniqueName: \"kubernetes.io/projected/84009fbe-f5f0-4426-8104-b942425e4522-kube-api-access-bn742\") pod \"84009fbe-f5f0-4426-8104-b942425e4522\" (UID: \"84009fbe-f5f0-4426-8104-b942425e4522\") " Feb 02 11:17:00 crc kubenswrapper[4909]: I0202 11:17:00.167175 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84009fbe-f5f0-4426-8104-b942425e4522-utilities" (OuterVolumeSpecName: "utilities") pod "84009fbe-f5f0-4426-8104-b942425e4522" (UID: "84009fbe-f5f0-4426-8104-b942425e4522"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:17:00 crc kubenswrapper[4909]: I0202 11:17:00.171040 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84009fbe-f5f0-4426-8104-b942425e4522-kube-api-access-bn742" (OuterVolumeSpecName: "kube-api-access-bn742") pod "84009fbe-f5f0-4426-8104-b942425e4522" (UID: "84009fbe-f5f0-4426-8104-b942425e4522"). InnerVolumeSpecName "kube-api-access-bn742". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:17:00 crc kubenswrapper[4909]: I0202 11:17:00.267919 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn742\" (UniqueName: \"kubernetes.io/projected/84009fbe-f5f0-4426-8104-b942425e4522-kube-api-access-bn742\") on node \"crc\" DevicePath \"\"" Feb 02 11:17:00 crc kubenswrapper[4909]: I0202 11:17:00.267955 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84009fbe-f5f0-4426-8104-b942425e4522-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:17:00 crc kubenswrapper[4909]: I0202 11:17:00.563453 4909 generic.go:334] "Generic (PLEG): container finished" podID="84009fbe-f5f0-4426-8104-b942425e4522" containerID="ac11369a4b79a050f7a8d7bc9962d94c805eec304df2f4aa1a464ebd6d18c2be" exitCode=0 Feb 02 11:17:00 crc kubenswrapper[4909]: I0202 11:17:00.563493 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrpsf" event={"ID":"84009fbe-f5f0-4426-8104-b942425e4522","Type":"ContainerDied","Data":"ac11369a4b79a050f7a8d7bc9962d94c805eec304df2f4aa1a464ebd6d18c2be"} Feb 02 11:17:00 crc kubenswrapper[4909]: I0202 11:17:00.563504 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zrpsf" Feb 02 11:17:00 crc kubenswrapper[4909]: I0202 11:17:00.563518 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zrpsf" event={"ID":"84009fbe-f5f0-4426-8104-b942425e4522","Type":"ContainerDied","Data":"f68e2ba2aa75f5b1cccdd13437dcef2b4a6572b4d03f62c2e5be3f0562cd79b6"} Feb 02 11:17:00 crc kubenswrapper[4909]: I0202 11:17:00.563532 4909 scope.go:117] "RemoveContainer" containerID="ac11369a4b79a050f7a8d7bc9962d94c805eec304df2f4aa1a464ebd6d18c2be" Feb 02 11:17:00 crc kubenswrapper[4909]: I0202 11:17:00.581463 4909 scope.go:117] "RemoveContainer" containerID="afc5af276c066ce576ceaefedbe5ae6ef64914291fce8cd98bb1ffb9d7dfb4ac" Feb 02 11:17:00 crc kubenswrapper[4909]: I0202 11:17:00.605080 4909 scope.go:117] "RemoveContainer" containerID="622c0d46e2b4f0fa6ead1c067432bda8dd9586ada2f59b3da4e260ee8e1f04ce" Feb 02 11:17:00 crc kubenswrapper[4909]: I0202 11:17:00.630265 4909 scope.go:117] "RemoveContainer" containerID="ac11369a4b79a050f7a8d7bc9962d94c805eec304df2f4aa1a464ebd6d18c2be" Feb 02 11:17:00 crc kubenswrapper[4909]: E0202 11:17:00.630916 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac11369a4b79a050f7a8d7bc9962d94c805eec304df2f4aa1a464ebd6d18c2be\": container with ID starting with ac11369a4b79a050f7a8d7bc9962d94c805eec304df2f4aa1a464ebd6d18c2be not found: ID does not exist" containerID="ac11369a4b79a050f7a8d7bc9962d94c805eec304df2f4aa1a464ebd6d18c2be" Feb 02 11:17:00 crc kubenswrapper[4909]: I0202 11:17:00.630948 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac11369a4b79a050f7a8d7bc9962d94c805eec304df2f4aa1a464ebd6d18c2be"} err="failed to get container status \"ac11369a4b79a050f7a8d7bc9962d94c805eec304df2f4aa1a464ebd6d18c2be\": rpc error: code = NotFound desc = could not find container 
\"ac11369a4b79a050f7a8d7bc9962d94c805eec304df2f4aa1a464ebd6d18c2be\": container with ID starting with ac11369a4b79a050f7a8d7bc9962d94c805eec304df2f4aa1a464ebd6d18c2be not found: ID does not exist" Feb 02 11:17:00 crc kubenswrapper[4909]: I0202 11:17:00.630970 4909 scope.go:117] "RemoveContainer" containerID="afc5af276c066ce576ceaefedbe5ae6ef64914291fce8cd98bb1ffb9d7dfb4ac" Feb 02 11:17:00 crc kubenswrapper[4909]: E0202 11:17:00.631354 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afc5af276c066ce576ceaefedbe5ae6ef64914291fce8cd98bb1ffb9d7dfb4ac\": container with ID starting with afc5af276c066ce576ceaefedbe5ae6ef64914291fce8cd98bb1ffb9d7dfb4ac not found: ID does not exist" containerID="afc5af276c066ce576ceaefedbe5ae6ef64914291fce8cd98bb1ffb9d7dfb4ac" Feb 02 11:17:00 crc kubenswrapper[4909]: I0202 11:17:00.631373 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afc5af276c066ce576ceaefedbe5ae6ef64914291fce8cd98bb1ffb9d7dfb4ac"} err="failed to get container status \"afc5af276c066ce576ceaefedbe5ae6ef64914291fce8cd98bb1ffb9d7dfb4ac\": rpc error: code = NotFound desc = could not find container \"afc5af276c066ce576ceaefedbe5ae6ef64914291fce8cd98bb1ffb9d7dfb4ac\": container with ID starting with afc5af276c066ce576ceaefedbe5ae6ef64914291fce8cd98bb1ffb9d7dfb4ac not found: ID does not exist" Feb 02 11:17:00 crc kubenswrapper[4909]: I0202 11:17:00.631386 4909 scope.go:117] "RemoveContainer" containerID="622c0d46e2b4f0fa6ead1c067432bda8dd9586ada2f59b3da4e260ee8e1f04ce" Feb 02 11:17:00 crc kubenswrapper[4909]: E0202 11:17:00.631764 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"622c0d46e2b4f0fa6ead1c067432bda8dd9586ada2f59b3da4e260ee8e1f04ce\": container with ID starting with 622c0d46e2b4f0fa6ead1c067432bda8dd9586ada2f59b3da4e260ee8e1f04ce not found: ID does not exist" 
containerID="622c0d46e2b4f0fa6ead1c067432bda8dd9586ada2f59b3da4e260ee8e1f04ce" Feb 02 11:17:00 crc kubenswrapper[4909]: I0202 11:17:00.631785 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"622c0d46e2b4f0fa6ead1c067432bda8dd9586ada2f59b3da4e260ee8e1f04ce"} err="failed to get container status \"622c0d46e2b4f0fa6ead1c067432bda8dd9586ada2f59b3da4e260ee8e1f04ce\": rpc error: code = NotFound desc = could not find container \"622c0d46e2b4f0fa6ead1c067432bda8dd9586ada2f59b3da4e260ee8e1f04ce\": container with ID starting with 622c0d46e2b4f0fa6ead1c067432bda8dd9586ada2f59b3da4e260ee8e1f04ce not found: ID does not exist" Feb 02 11:17:00 crc kubenswrapper[4909]: I0202 11:17:00.958355 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84009fbe-f5f0-4426-8104-b942425e4522-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "84009fbe-f5f0-4426-8104-b942425e4522" (UID: "84009fbe-f5f0-4426-8104-b942425e4522"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:17:00 crc kubenswrapper[4909]: I0202 11:17:00.979412 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84009fbe-f5f0-4426-8104-b942425e4522-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:17:01 crc kubenswrapper[4909]: I0202 11:17:01.187845 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zrpsf"] Feb 02 11:17:01 crc kubenswrapper[4909]: I0202 11:17:01.194774 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zrpsf"] Feb 02 11:17:03 crc kubenswrapper[4909]: I0202 11:17:03.023921 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84009fbe-f5f0-4426-8104-b942425e4522" path="/var/lib/kubelet/pods/84009fbe-f5f0-4426-8104-b942425e4522/volumes" Feb 02 11:18:49 crc kubenswrapper[4909]: I0202 11:18:49.510689 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:18:49 crc kubenswrapper[4909]: I0202 11:18:49.511153 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:18:59 crc kubenswrapper[4909]: I0202 11:18:59.979980 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4rpcq"] Feb 02 11:18:59 crc kubenswrapper[4909]: E0202 11:18:59.980930 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="84009fbe-f5f0-4426-8104-b942425e4522" containerName="extract-utilities" Feb 02 11:18:59 crc kubenswrapper[4909]: I0202 11:18:59.980949 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="84009fbe-f5f0-4426-8104-b942425e4522" containerName="extract-utilities" Feb 02 11:18:59 crc kubenswrapper[4909]: E0202 11:18:59.980978 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84009fbe-f5f0-4426-8104-b942425e4522" containerName="extract-content" Feb 02 11:18:59 crc kubenswrapper[4909]: I0202 11:18:59.980987 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="84009fbe-f5f0-4426-8104-b942425e4522" containerName="extract-content" Feb 02 11:18:59 crc kubenswrapper[4909]: E0202 11:18:59.981001 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84009fbe-f5f0-4426-8104-b942425e4522" containerName="registry-server" Feb 02 11:18:59 crc kubenswrapper[4909]: I0202 11:18:59.981008 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="84009fbe-f5f0-4426-8104-b942425e4522" containerName="registry-server" Feb 02 11:18:59 crc kubenswrapper[4909]: I0202 11:18:59.981184 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="84009fbe-f5f0-4426-8104-b942425e4522" containerName="registry-server" Feb 02 11:18:59 crc kubenswrapper[4909]: I0202 11:18:59.982481 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4rpcq" Feb 02 11:18:59 crc kubenswrapper[4909]: I0202 11:18:59.986569 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4rpcq"] Feb 02 11:19:00 crc kubenswrapper[4909]: I0202 11:19:00.069113 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd4c4bca-6832-45d2-b626-183a59d5642c-catalog-content\") pod \"certified-operators-4rpcq\" (UID: \"fd4c4bca-6832-45d2-b626-183a59d5642c\") " pod="openshift-marketplace/certified-operators-4rpcq" Feb 02 11:19:00 crc kubenswrapper[4909]: I0202 11:19:00.069169 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cccj4\" (UniqueName: \"kubernetes.io/projected/fd4c4bca-6832-45d2-b626-183a59d5642c-kube-api-access-cccj4\") pod \"certified-operators-4rpcq\" (UID: \"fd4c4bca-6832-45d2-b626-183a59d5642c\") " pod="openshift-marketplace/certified-operators-4rpcq" Feb 02 11:19:00 crc kubenswrapper[4909]: I0202 11:19:00.069189 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd4c4bca-6832-45d2-b626-183a59d5642c-utilities\") pod \"certified-operators-4rpcq\" (UID: \"fd4c4bca-6832-45d2-b626-183a59d5642c\") " pod="openshift-marketplace/certified-operators-4rpcq" Feb 02 11:19:00 crc kubenswrapper[4909]: I0202 11:19:00.170171 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd4c4bca-6832-45d2-b626-183a59d5642c-catalog-content\") pod \"certified-operators-4rpcq\" (UID: \"fd4c4bca-6832-45d2-b626-183a59d5642c\") " pod="openshift-marketplace/certified-operators-4rpcq" Feb 02 11:19:00 crc kubenswrapper[4909]: I0202 11:19:00.170225 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cccj4\" (UniqueName: \"kubernetes.io/projected/fd4c4bca-6832-45d2-b626-183a59d5642c-kube-api-access-cccj4\") pod \"certified-operators-4rpcq\" (UID: \"fd4c4bca-6832-45d2-b626-183a59d5642c\") " pod="openshift-marketplace/certified-operators-4rpcq" Feb 02 11:19:00 crc kubenswrapper[4909]: I0202 11:19:00.170240 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd4c4bca-6832-45d2-b626-183a59d5642c-utilities\") pod \"certified-operators-4rpcq\" (UID: \"fd4c4bca-6832-45d2-b626-183a59d5642c\") " pod="openshift-marketplace/certified-operators-4rpcq" Feb 02 11:19:00 crc kubenswrapper[4909]: I0202 11:19:00.170702 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd4c4bca-6832-45d2-b626-183a59d5642c-catalog-content\") pod \"certified-operators-4rpcq\" (UID: \"fd4c4bca-6832-45d2-b626-183a59d5642c\") " pod="openshift-marketplace/certified-operators-4rpcq" Feb 02 11:19:00 crc kubenswrapper[4909]: I0202 11:19:00.171180 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd4c4bca-6832-45d2-b626-183a59d5642c-utilities\") pod \"certified-operators-4rpcq\" (UID: \"fd4c4bca-6832-45d2-b626-183a59d5642c\") " pod="openshift-marketplace/certified-operators-4rpcq" Feb 02 11:19:00 crc kubenswrapper[4909]: I0202 11:19:00.192091 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cccj4\" (UniqueName: \"kubernetes.io/projected/fd4c4bca-6832-45d2-b626-183a59d5642c-kube-api-access-cccj4\") pod \"certified-operators-4rpcq\" (UID: \"fd4c4bca-6832-45d2-b626-183a59d5642c\") " pod="openshift-marketplace/certified-operators-4rpcq" Feb 02 11:19:00 crc kubenswrapper[4909]: I0202 11:19:00.301349 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4rpcq" Feb 02 11:19:00 crc kubenswrapper[4909]: I0202 11:19:00.793066 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4rpcq"] Feb 02 11:19:01 crc kubenswrapper[4909]: I0202 11:19:01.120115 4909 generic.go:334] "Generic (PLEG): container finished" podID="fd4c4bca-6832-45d2-b626-183a59d5642c" containerID="327fb033be016e108daafbd3229ebedc878a52c31b8a0df049c6b0d6db36d170" exitCode=0 Feb 02 11:19:01 crc kubenswrapper[4909]: I0202 11:19:01.120157 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rpcq" event={"ID":"fd4c4bca-6832-45d2-b626-183a59d5642c","Type":"ContainerDied","Data":"327fb033be016e108daafbd3229ebedc878a52c31b8a0df049c6b0d6db36d170"} Feb 02 11:19:01 crc kubenswrapper[4909]: I0202 11:19:01.120389 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rpcq" event={"ID":"fd4c4bca-6832-45d2-b626-183a59d5642c","Type":"ContainerStarted","Data":"bcd7bfec8832608e2b0dad66f34dafb34bb484f99664948da6095e533682bb76"} Feb 02 11:19:03 crc kubenswrapper[4909]: I0202 11:19:03.132596 4909 generic.go:334] "Generic (PLEG): container finished" podID="fd4c4bca-6832-45d2-b626-183a59d5642c" containerID="fd2a4c7440320b0240a7fc5f6488f375852727086ad415492a312f12ec500089" exitCode=0 Feb 02 11:19:03 crc kubenswrapper[4909]: I0202 11:19:03.132790 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rpcq" event={"ID":"fd4c4bca-6832-45d2-b626-183a59d5642c","Type":"ContainerDied","Data":"fd2a4c7440320b0240a7fc5f6488f375852727086ad415492a312f12ec500089"} Feb 02 11:19:04 crc kubenswrapper[4909]: I0202 11:19:04.140167 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rpcq" 
event={"ID":"fd4c4bca-6832-45d2-b626-183a59d5642c","Type":"ContainerStarted","Data":"09129f55c76aa302d1a6a7579d4d16d7fd85dd871f8c9c05bb369937d2285f35"} Feb 02 11:19:04 crc kubenswrapper[4909]: I0202 11:19:04.157696 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4rpcq" podStartSLOduration=2.556774073 podStartE2EDuration="5.157680956s" podCreationTimestamp="2026-02-02 11:18:59 +0000 UTC" firstStartedPulling="2026-02-02 11:19:01.121557302 +0000 UTC m=+2866.867658037" lastFinishedPulling="2026-02-02 11:19:03.722464185 +0000 UTC m=+2869.468564920" observedRunningTime="2026-02-02 11:19:04.154110375 +0000 UTC m=+2869.900211110" watchObservedRunningTime="2026-02-02 11:19:04.157680956 +0000 UTC m=+2869.903781691" Feb 02 11:19:10 crc kubenswrapper[4909]: I0202 11:19:10.301666 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4rpcq" Feb 02 11:19:10 crc kubenswrapper[4909]: I0202 11:19:10.302439 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4rpcq" Feb 02 11:19:10 crc kubenswrapper[4909]: I0202 11:19:10.345620 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4rpcq" Feb 02 11:19:11 crc kubenswrapper[4909]: I0202 11:19:11.219849 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4rpcq" Feb 02 11:19:11 crc kubenswrapper[4909]: I0202 11:19:11.270084 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4rpcq"] Feb 02 11:19:13 crc kubenswrapper[4909]: I0202 11:19:13.195186 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4rpcq" podUID="fd4c4bca-6832-45d2-b626-183a59d5642c" containerName="registry-server" 
containerID="cri-o://09129f55c76aa302d1a6a7579d4d16d7fd85dd871f8c9c05bb369937d2285f35" gracePeriod=2 Feb 02 11:19:13 crc kubenswrapper[4909]: I0202 11:19:13.553038 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4rpcq" Feb 02 11:19:13 crc kubenswrapper[4909]: I0202 11:19:13.664687 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd4c4bca-6832-45d2-b626-183a59d5642c-utilities\") pod \"fd4c4bca-6832-45d2-b626-183a59d5642c\" (UID: \"fd4c4bca-6832-45d2-b626-183a59d5642c\") " Feb 02 11:19:13 crc kubenswrapper[4909]: I0202 11:19:13.664761 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd4c4bca-6832-45d2-b626-183a59d5642c-catalog-content\") pod \"fd4c4bca-6832-45d2-b626-183a59d5642c\" (UID: \"fd4c4bca-6832-45d2-b626-183a59d5642c\") " Feb 02 11:19:13 crc kubenswrapper[4909]: I0202 11:19:13.664937 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cccj4\" (UniqueName: \"kubernetes.io/projected/fd4c4bca-6832-45d2-b626-183a59d5642c-kube-api-access-cccj4\") pod \"fd4c4bca-6832-45d2-b626-183a59d5642c\" (UID: \"fd4c4bca-6832-45d2-b626-183a59d5642c\") " Feb 02 11:19:13 crc kubenswrapper[4909]: I0202 11:19:13.665462 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd4c4bca-6832-45d2-b626-183a59d5642c-utilities" (OuterVolumeSpecName: "utilities") pod "fd4c4bca-6832-45d2-b626-183a59d5642c" (UID: "fd4c4bca-6832-45d2-b626-183a59d5642c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:19:13 crc kubenswrapper[4909]: I0202 11:19:13.666357 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd4c4bca-6832-45d2-b626-183a59d5642c-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:19:13 crc kubenswrapper[4909]: I0202 11:19:13.671140 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd4c4bca-6832-45d2-b626-183a59d5642c-kube-api-access-cccj4" (OuterVolumeSpecName: "kube-api-access-cccj4") pod "fd4c4bca-6832-45d2-b626-183a59d5642c" (UID: "fd4c4bca-6832-45d2-b626-183a59d5642c"). InnerVolumeSpecName "kube-api-access-cccj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:19:13 crc kubenswrapper[4909]: I0202 11:19:13.714067 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd4c4bca-6832-45d2-b626-183a59d5642c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd4c4bca-6832-45d2-b626-183a59d5642c" (UID: "fd4c4bca-6832-45d2-b626-183a59d5642c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:19:13 crc kubenswrapper[4909]: I0202 11:19:13.767647 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cccj4\" (UniqueName: \"kubernetes.io/projected/fd4c4bca-6832-45d2-b626-183a59d5642c-kube-api-access-cccj4\") on node \"crc\" DevicePath \"\"" Feb 02 11:19:13 crc kubenswrapper[4909]: I0202 11:19:13.767678 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd4c4bca-6832-45d2-b626-183a59d5642c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:19:14 crc kubenswrapper[4909]: I0202 11:19:14.208444 4909 generic.go:334] "Generic (PLEG): container finished" podID="fd4c4bca-6832-45d2-b626-183a59d5642c" containerID="09129f55c76aa302d1a6a7579d4d16d7fd85dd871f8c9c05bb369937d2285f35" exitCode=0 Feb 02 11:19:14 crc kubenswrapper[4909]: I0202 11:19:14.208486 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rpcq" event={"ID":"fd4c4bca-6832-45d2-b626-183a59d5642c","Type":"ContainerDied","Data":"09129f55c76aa302d1a6a7579d4d16d7fd85dd871f8c9c05bb369937d2285f35"} Feb 02 11:19:14 crc kubenswrapper[4909]: I0202 11:19:14.208515 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4rpcq" Feb 02 11:19:14 crc kubenswrapper[4909]: I0202 11:19:14.208530 4909 scope.go:117] "RemoveContainer" containerID="09129f55c76aa302d1a6a7579d4d16d7fd85dd871f8c9c05bb369937d2285f35" Feb 02 11:19:14 crc kubenswrapper[4909]: I0202 11:19:14.208517 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4rpcq" event={"ID":"fd4c4bca-6832-45d2-b626-183a59d5642c","Type":"ContainerDied","Data":"bcd7bfec8832608e2b0dad66f34dafb34bb484f99664948da6095e533682bb76"} Feb 02 11:19:14 crc kubenswrapper[4909]: I0202 11:19:14.229109 4909 scope.go:117] "RemoveContainer" containerID="fd2a4c7440320b0240a7fc5f6488f375852727086ad415492a312f12ec500089" Feb 02 11:19:14 crc kubenswrapper[4909]: I0202 11:19:14.241399 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4rpcq"] Feb 02 11:19:14 crc kubenswrapper[4909]: I0202 11:19:14.247970 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4rpcq"] Feb 02 11:19:14 crc kubenswrapper[4909]: I0202 11:19:14.268385 4909 scope.go:117] "RemoveContainer" containerID="327fb033be016e108daafbd3229ebedc878a52c31b8a0df049c6b0d6db36d170" Feb 02 11:19:14 crc kubenswrapper[4909]: I0202 11:19:14.283398 4909 scope.go:117] "RemoveContainer" containerID="09129f55c76aa302d1a6a7579d4d16d7fd85dd871f8c9c05bb369937d2285f35" Feb 02 11:19:14 crc kubenswrapper[4909]: E0202 11:19:14.283768 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09129f55c76aa302d1a6a7579d4d16d7fd85dd871f8c9c05bb369937d2285f35\": container with ID starting with 09129f55c76aa302d1a6a7579d4d16d7fd85dd871f8c9c05bb369937d2285f35 not found: ID does not exist" containerID="09129f55c76aa302d1a6a7579d4d16d7fd85dd871f8c9c05bb369937d2285f35" Feb 02 11:19:14 crc kubenswrapper[4909]: I0202 11:19:14.283798 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09129f55c76aa302d1a6a7579d4d16d7fd85dd871f8c9c05bb369937d2285f35"} err="failed to get container status \"09129f55c76aa302d1a6a7579d4d16d7fd85dd871f8c9c05bb369937d2285f35\": rpc error: code = NotFound desc = could not find container \"09129f55c76aa302d1a6a7579d4d16d7fd85dd871f8c9c05bb369937d2285f35\": container with ID starting with 09129f55c76aa302d1a6a7579d4d16d7fd85dd871f8c9c05bb369937d2285f35 not found: ID does not exist" Feb 02 11:19:14 crc kubenswrapper[4909]: I0202 11:19:14.283915 4909 scope.go:117] "RemoveContainer" containerID="fd2a4c7440320b0240a7fc5f6488f375852727086ad415492a312f12ec500089" Feb 02 11:19:14 crc kubenswrapper[4909]: E0202 11:19:14.284315 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd2a4c7440320b0240a7fc5f6488f375852727086ad415492a312f12ec500089\": container with ID starting with fd2a4c7440320b0240a7fc5f6488f375852727086ad415492a312f12ec500089 not found: ID does not exist" containerID="fd2a4c7440320b0240a7fc5f6488f375852727086ad415492a312f12ec500089" Feb 02 11:19:14 crc kubenswrapper[4909]: I0202 11:19:14.284366 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd2a4c7440320b0240a7fc5f6488f375852727086ad415492a312f12ec500089"} err="failed to get container status \"fd2a4c7440320b0240a7fc5f6488f375852727086ad415492a312f12ec500089\": rpc error: code = NotFound desc = could not find container \"fd2a4c7440320b0240a7fc5f6488f375852727086ad415492a312f12ec500089\": container with ID starting with fd2a4c7440320b0240a7fc5f6488f375852727086ad415492a312f12ec500089 not found: ID does not exist" Feb 02 11:19:14 crc kubenswrapper[4909]: I0202 11:19:14.284395 4909 scope.go:117] "RemoveContainer" containerID="327fb033be016e108daafbd3229ebedc878a52c31b8a0df049c6b0d6db36d170" Feb 02 11:19:14 crc kubenswrapper[4909]: E0202 
11:19:14.284703 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"327fb033be016e108daafbd3229ebedc878a52c31b8a0df049c6b0d6db36d170\": container with ID starting with 327fb033be016e108daafbd3229ebedc878a52c31b8a0df049c6b0d6db36d170 not found: ID does not exist" containerID="327fb033be016e108daafbd3229ebedc878a52c31b8a0df049c6b0d6db36d170" Feb 02 11:19:14 crc kubenswrapper[4909]: I0202 11:19:14.284729 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"327fb033be016e108daafbd3229ebedc878a52c31b8a0df049c6b0d6db36d170"} err="failed to get container status \"327fb033be016e108daafbd3229ebedc878a52c31b8a0df049c6b0d6db36d170\": rpc error: code = NotFound desc = could not find container \"327fb033be016e108daafbd3229ebedc878a52c31b8a0df049c6b0d6db36d170\": container with ID starting with 327fb033be016e108daafbd3229ebedc878a52c31b8a0df049c6b0d6db36d170 not found: ID does not exist" Feb 02 11:19:15 crc kubenswrapper[4909]: I0202 11:19:15.041221 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd4c4bca-6832-45d2-b626-183a59d5642c" path="/var/lib/kubelet/pods/fd4c4bca-6832-45d2-b626-183a59d5642c/volumes" Feb 02 11:19:19 crc kubenswrapper[4909]: I0202 11:19:19.510900 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:19:19 crc kubenswrapper[4909]: I0202 11:19:19.511230 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 02 11:19:49 crc kubenswrapper[4909]: I0202 11:19:49.511555 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:19:49 crc kubenswrapper[4909]: I0202 11:19:49.512220 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:19:49 crc kubenswrapper[4909]: I0202 11:19:49.512277 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 11:19:49 crc kubenswrapper[4909]: I0202 11:19:49.513007 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 11:19:49 crc kubenswrapper[4909]: I0202 11:19:49.513064 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26" gracePeriod=600 Feb 02 11:19:49 crc kubenswrapper[4909]: E0202 11:19:49.647061 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:19:50 crc kubenswrapper[4909]: I0202 11:19:50.464049 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26" exitCode=0 Feb 02 11:19:50 crc kubenswrapper[4909]: I0202 11:19:50.464095 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26"} Feb 02 11:19:50 crc kubenswrapper[4909]: I0202 11:19:50.464130 4909 scope.go:117] "RemoveContainer" containerID="183542b16bfbba399a64fca4f881f43f3b512381f9b281b3fed5ed8cb6447db0" Feb 02 11:19:50 crc kubenswrapper[4909]: I0202 11:19:50.464790 4909 scope.go:117] "RemoveContainer" containerID="84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26" Feb 02 11:19:50 crc kubenswrapper[4909]: E0202 11:19:50.465862 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:20:03 crc kubenswrapper[4909]: I0202 11:20:03.016093 4909 scope.go:117] "RemoveContainer" containerID="84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26" Feb 02 11:20:03 crc kubenswrapper[4909]: E0202 11:20:03.016752 4909 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:20:15 crc kubenswrapper[4909]: I0202 11:20:15.021900 4909 scope.go:117] "RemoveContainer" containerID="84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26" Feb 02 11:20:15 crc kubenswrapper[4909]: E0202 11:20:15.022438 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:20:30 crc kubenswrapper[4909]: I0202 11:20:30.016693 4909 scope.go:117] "RemoveContainer" containerID="84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26" Feb 02 11:20:30 crc kubenswrapper[4909]: E0202 11:20:30.018540 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:20:42 crc kubenswrapper[4909]: I0202 11:20:42.016484 4909 scope.go:117] "RemoveContainer" containerID="84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26" Feb 02 11:20:42 crc kubenswrapper[4909]: E0202 11:20:42.017255 4909 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:20:56 crc kubenswrapper[4909]: I0202 11:20:56.015906 4909 scope.go:117] "RemoveContainer" containerID="84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26" Feb 02 11:20:56 crc kubenswrapper[4909]: E0202 11:20:56.016677 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:21:08 crc kubenswrapper[4909]: I0202 11:21:08.016656 4909 scope.go:117] "RemoveContainer" containerID="84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26" Feb 02 11:21:08 crc kubenswrapper[4909]: E0202 11:21:08.017227 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:21:21 crc kubenswrapper[4909]: I0202 11:21:21.016299 4909 scope.go:117] "RemoveContainer" containerID="84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26" Feb 02 11:21:21 crc kubenswrapper[4909]: E0202 
11:21:21.016847 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:21:34 crc kubenswrapper[4909]: I0202 11:21:34.016647 4909 scope.go:117] "RemoveContainer" containerID="84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26" Feb 02 11:21:34 crc kubenswrapper[4909]: E0202 11:21:34.017485 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:21:45 crc kubenswrapper[4909]: I0202 11:21:45.019987 4909 scope.go:117] "RemoveContainer" containerID="84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26" Feb 02 11:21:45 crc kubenswrapper[4909]: E0202 11:21:45.021963 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:21:59 crc kubenswrapper[4909]: I0202 11:21:59.017511 4909 scope.go:117] "RemoveContainer" containerID="84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26" Feb 02 11:21:59 crc 
kubenswrapper[4909]: E0202 11:21:59.031651 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:22:14 crc kubenswrapper[4909]: I0202 11:22:14.016502 4909 scope.go:117] "RemoveContainer" containerID="84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26" Feb 02 11:22:14 crc kubenswrapper[4909]: E0202 11:22:14.017261 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:22:25 crc kubenswrapper[4909]: I0202 11:22:25.020467 4909 scope.go:117] "RemoveContainer" containerID="84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26" Feb 02 11:22:25 crc kubenswrapper[4909]: E0202 11:22:25.021307 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:22:26 crc kubenswrapper[4909]: I0202 11:22:26.179574 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9gzbc"] Feb 02 
11:22:26 crc kubenswrapper[4909]: E0202 11:22:26.181081 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd4c4bca-6832-45d2-b626-183a59d5642c" containerName="extract-content" Feb 02 11:22:26 crc kubenswrapper[4909]: I0202 11:22:26.181163 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd4c4bca-6832-45d2-b626-183a59d5642c" containerName="extract-content" Feb 02 11:22:26 crc kubenswrapper[4909]: E0202 11:22:26.181232 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd4c4bca-6832-45d2-b626-183a59d5642c" containerName="extract-utilities" Feb 02 11:22:26 crc kubenswrapper[4909]: I0202 11:22:26.181379 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd4c4bca-6832-45d2-b626-183a59d5642c" containerName="extract-utilities" Feb 02 11:22:26 crc kubenswrapper[4909]: E0202 11:22:26.181455 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd4c4bca-6832-45d2-b626-183a59d5642c" containerName="registry-server" Feb 02 11:22:26 crc kubenswrapper[4909]: I0202 11:22:26.181510 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd4c4bca-6832-45d2-b626-183a59d5642c" containerName="registry-server" Feb 02 11:22:26 crc kubenswrapper[4909]: I0202 11:22:26.181686 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd4c4bca-6832-45d2-b626-183a59d5642c" containerName="registry-server" Feb 02 11:22:26 crc kubenswrapper[4909]: I0202 11:22:26.182767 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9gzbc" Feb 02 11:22:26 crc kubenswrapper[4909]: I0202 11:22:26.208204 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9gzbc"] Feb 02 11:22:26 crc kubenswrapper[4909]: I0202 11:22:26.336680 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92d6h\" (UniqueName: \"kubernetes.io/projected/d0922c33-f5bf-401b-9dee-57c3811e3357-kube-api-access-92d6h\") pod \"redhat-marketplace-9gzbc\" (UID: \"d0922c33-f5bf-401b-9dee-57c3811e3357\") " pod="openshift-marketplace/redhat-marketplace-9gzbc" Feb 02 11:22:26 crc kubenswrapper[4909]: I0202 11:22:26.337092 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0922c33-f5bf-401b-9dee-57c3811e3357-catalog-content\") pod \"redhat-marketplace-9gzbc\" (UID: \"d0922c33-f5bf-401b-9dee-57c3811e3357\") " pod="openshift-marketplace/redhat-marketplace-9gzbc" Feb 02 11:22:26 crc kubenswrapper[4909]: I0202 11:22:26.337193 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0922c33-f5bf-401b-9dee-57c3811e3357-utilities\") pod \"redhat-marketplace-9gzbc\" (UID: \"d0922c33-f5bf-401b-9dee-57c3811e3357\") " pod="openshift-marketplace/redhat-marketplace-9gzbc" Feb 02 11:22:26 crc kubenswrapper[4909]: I0202 11:22:26.438884 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92d6h\" (UniqueName: \"kubernetes.io/projected/d0922c33-f5bf-401b-9dee-57c3811e3357-kube-api-access-92d6h\") pod \"redhat-marketplace-9gzbc\" (UID: \"d0922c33-f5bf-401b-9dee-57c3811e3357\") " pod="openshift-marketplace/redhat-marketplace-9gzbc" Feb 02 11:22:26 crc kubenswrapper[4909]: I0202 11:22:26.438972 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0922c33-f5bf-401b-9dee-57c3811e3357-catalog-content\") pod \"redhat-marketplace-9gzbc\" (UID: \"d0922c33-f5bf-401b-9dee-57c3811e3357\") " pod="openshift-marketplace/redhat-marketplace-9gzbc" Feb 02 11:22:26 crc kubenswrapper[4909]: I0202 11:22:26.439008 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0922c33-f5bf-401b-9dee-57c3811e3357-utilities\") pod \"redhat-marketplace-9gzbc\" (UID: \"d0922c33-f5bf-401b-9dee-57c3811e3357\") " pod="openshift-marketplace/redhat-marketplace-9gzbc" Feb 02 11:22:26 crc kubenswrapper[4909]: I0202 11:22:26.439576 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0922c33-f5bf-401b-9dee-57c3811e3357-catalog-content\") pod \"redhat-marketplace-9gzbc\" (UID: \"d0922c33-f5bf-401b-9dee-57c3811e3357\") " pod="openshift-marketplace/redhat-marketplace-9gzbc" Feb 02 11:22:26 crc kubenswrapper[4909]: I0202 11:22:26.439604 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0922c33-f5bf-401b-9dee-57c3811e3357-utilities\") pod \"redhat-marketplace-9gzbc\" (UID: \"d0922c33-f5bf-401b-9dee-57c3811e3357\") " pod="openshift-marketplace/redhat-marketplace-9gzbc" Feb 02 11:22:26 crc kubenswrapper[4909]: I0202 11:22:26.460979 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92d6h\" (UniqueName: \"kubernetes.io/projected/d0922c33-f5bf-401b-9dee-57c3811e3357-kube-api-access-92d6h\") pod \"redhat-marketplace-9gzbc\" (UID: \"d0922c33-f5bf-401b-9dee-57c3811e3357\") " pod="openshift-marketplace/redhat-marketplace-9gzbc" Feb 02 11:22:26 crc kubenswrapper[4909]: I0202 11:22:26.500074 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9gzbc" Feb 02 11:22:26 crc kubenswrapper[4909]: I0202 11:22:26.748097 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9gzbc"] Feb 02 11:22:27 crc kubenswrapper[4909]: I0202 11:22:27.491750 4909 generic.go:334] "Generic (PLEG): container finished" podID="d0922c33-f5bf-401b-9dee-57c3811e3357" containerID="430bb22dbab23bebaa6369530befa13b1e1da7e3bc40aa749da36590423f8ea6" exitCode=0 Feb 02 11:22:27 crc kubenswrapper[4909]: I0202 11:22:27.491789 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9gzbc" event={"ID":"d0922c33-f5bf-401b-9dee-57c3811e3357","Type":"ContainerDied","Data":"430bb22dbab23bebaa6369530befa13b1e1da7e3bc40aa749da36590423f8ea6"} Feb 02 11:22:27 crc kubenswrapper[4909]: I0202 11:22:27.492155 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9gzbc" event={"ID":"d0922c33-f5bf-401b-9dee-57c3811e3357","Type":"ContainerStarted","Data":"0b738acbd2f3173b271b4cc0b94ac06e124fbea904adca2589fdb616362a93be"} Feb 02 11:22:27 crc kubenswrapper[4909]: I0202 11:22:27.493945 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 11:22:29 crc kubenswrapper[4909]: I0202 11:22:29.506907 4909 generic.go:334] "Generic (PLEG): container finished" podID="d0922c33-f5bf-401b-9dee-57c3811e3357" containerID="c091abecaa4fa80786472d5c8b5003d6caa48588363fdefa361f889d3b2eabc3" exitCode=0 Feb 02 11:22:29 crc kubenswrapper[4909]: I0202 11:22:29.506961 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9gzbc" event={"ID":"d0922c33-f5bf-401b-9dee-57c3811e3357","Type":"ContainerDied","Data":"c091abecaa4fa80786472d5c8b5003d6caa48588363fdefa361f889d3b2eabc3"} Feb 02 11:22:30 crc kubenswrapper[4909]: I0202 11:22:30.519969 4909 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-9gzbc" event={"ID":"d0922c33-f5bf-401b-9dee-57c3811e3357","Type":"ContainerStarted","Data":"dc447027ffdb3bb59a354a81b90b4b56827a36a60cc42f2c961461a7035d5eaf"} Feb 02 11:22:30 crc kubenswrapper[4909]: I0202 11:22:30.544549 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9gzbc" podStartSLOduration=1.995012909 podStartE2EDuration="4.544529926s" podCreationTimestamp="2026-02-02 11:22:26 +0000 UTC" firstStartedPulling="2026-02-02 11:22:27.493736535 +0000 UTC m=+3073.239837270" lastFinishedPulling="2026-02-02 11:22:30.043253552 +0000 UTC m=+3075.789354287" observedRunningTime="2026-02-02 11:22:30.53940871 +0000 UTC m=+3076.285509445" watchObservedRunningTime="2026-02-02 11:22:30.544529926 +0000 UTC m=+3076.290630661" Feb 02 11:22:36 crc kubenswrapper[4909]: I0202 11:22:36.500570 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9gzbc" Feb 02 11:22:36 crc kubenswrapper[4909]: I0202 11:22:36.501028 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9gzbc" Feb 02 11:22:36 crc kubenswrapper[4909]: I0202 11:22:36.542754 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9gzbc" Feb 02 11:22:36 crc kubenswrapper[4909]: I0202 11:22:36.607872 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9gzbc" Feb 02 11:22:36 crc kubenswrapper[4909]: I0202 11:22:36.780318 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9gzbc"] Feb 02 11:22:38 crc kubenswrapper[4909]: I0202 11:22:38.582244 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9gzbc" 
podUID="d0922c33-f5bf-401b-9dee-57c3811e3357" containerName="registry-server" containerID="cri-o://dc447027ffdb3bb59a354a81b90b4b56827a36a60cc42f2c961461a7035d5eaf" gracePeriod=2 Feb 02 11:22:38 crc kubenswrapper[4909]: I0202 11:22:38.940855 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9gzbc" Feb 02 11:22:39 crc kubenswrapper[4909]: I0202 11:22:39.016056 4909 scope.go:117] "RemoveContainer" containerID="84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26" Feb 02 11:22:39 crc kubenswrapper[4909]: E0202 11:22:39.016374 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:22:39 crc kubenswrapper[4909]: I0202 11:22:39.030432 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0922c33-f5bf-401b-9dee-57c3811e3357-utilities\") pod \"d0922c33-f5bf-401b-9dee-57c3811e3357\" (UID: \"d0922c33-f5bf-401b-9dee-57c3811e3357\") " Feb 02 11:22:39 crc kubenswrapper[4909]: I0202 11:22:39.030497 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92d6h\" (UniqueName: \"kubernetes.io/projected/d0922c33-f5bf-401b-9dee-57c3811e3357-kube-api-access-92d6h\") pod \"d0922c33-f5bf-401b-9dee-57c3811e3357\" (UID: \"d0922c33-f5bf-401b-9dee-57c3811e3357\") " Feb 02 11:22:39 crc kubenswrapper[4909]: I0202 11:22:39.030567 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d0922c33-f5bf-401b-9dee-57c3811e3357-catalog-content\") pod \"d0922c33-f5bf-401b-9dee-57c3811e3357\" (UID: \"d0922c33-f5bf-401b-9dee-57c3811e3357\") " Feb 02 11:22:39 crc kubenswrapper[4909]: I0202 11:22:39.031499 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0922c33-f5bf-401b-9dee-57c3811e3357-utilities" (OuterVolumeSpecName: "utilities") pod "d0922c33-f5bf-401b-9dee-57c3811e3357" (UID: "d0922c33-f5bf-401b-9dee-57c3811e3357"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:22:39 crc kubenswrapper[4909]: I0202 11:22:39.037164 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0922c33-f5bf-401b-9dee-57c3811e3357-kube-api-access-92d6h" (OuterVolumeSpecName: "kube-api-access-92d6h") pod "d0922c33-f5bf-401b-9dee-57c3811e3357" (UID: "d0922c33-f5bf-401b-9dee-57c3811e3357"). InnerVolumeSpecName "kube-api-access-92d6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:22:39 crc kubenswrapper[4909]: I0202 11:22:39.054325 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0922c33-f5bf-401b-9dee-57c3811e3357-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0922c33-f5bf-401b-9dee-57c3811e3357" (UID: "d0922c33-f5bf-401b-9dee-57c3811e3357"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:22:39 crc kubenswrapper[4909]: I0202 11:22:39.132703 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0922c33-f5bf-401b-9dee-57c3811e3357-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:22:39 crc kubenswrapper[4909]: I0202 11:22:39.132735 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92d6h\" (UniqueName: \"kubernetes.io/projected/d0922c33-f5bf-401b-9dee-57c3811e3357-kube-api-access-92d6h\") on node \"crc\" DevicePath \"\"" Feb 02 11:22:39 crc kubenswrapper[4909]: I0202 11:22:39.132746 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0922c33-f5bf-401b-9dee-57c3811e3357-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:22:39 crc kubenswrapper[4909]: I0202 11:22:39.590588 4909 generic.go:334] "Generic (PLEG): container finished" podID="d0922c33-f5bf-401b-9dee-57c3811e3357" containerID="dc447027ffdb3bb59a354a81b90b4b56827a36a60cc42f2c961461a7035d5eaf" exitCode=0 Feb 02 11:22:39 crc kubenswrapper[4909]: I0202 11:22:39.590626 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9gzbc" event={"ID":"d0922c33-f5bf-401b-9dee-57c3811e3357","Type":"ContainerDied","Data":"dc447027ffdb3bb59a354a81b90b4b56827a36a60cc42f2c961461a7035d5eaf"} Feb 02 11:22:39 crc kubenswrapper[4909]: I0202 11:22:39.590678 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9gzbc" Feb 02 11:22:39 crc kubenswrapper[4909]: I0202 11:22:39.590701 4909 scope.go:117] "RemoveContainer" containerID="dc447027ffdb3bb59a354a81b90b4b56827a36a60cc42f2c961461a7035d5eaf" Feb 02 11:22:39 crc kubenswrapper[4909]: I0202 11:22:39.590665 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9gzbc" event={"ID":"d0922c33-f5bf-401b-9dee-57c3811e3357","Type":"ContainerDied","Data":"0b738acbd2f3173b271b4cc0b94ac06e124fbea904adca2589fdb616362a93be"} Feb 02 11:22:39 crc kubenswrapper[4909]: I0202 11:22:39.608234 4909 scope.go:117] "RemoveContainer" containerID="c091abecaa4fa80786472d5c8b5003d6caa48588363fdefa361f889d3b2eabc3" Feb 02 11:22:39 crc kubenswrapper[4909]: I0202 11:22:39.622784 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9gzbc"] Feb 02 11:22:39 crc kubenswrapper[4909]: I0202 11:22:39.629382 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9gzbc"] Feb 02 11:22:39 crc kubenswrapper[4909]: I0202 11:22:39.643558 4909 scope.go:117] "RemoveContainer" containerID="430bb22dbab23bebaa6369530befa13b1e1da7e3bc40aa749da36590423f8ea6" Feb 02 11:22:39 crc kubenswrapper[4909]: I0202 11:22:39.656297 4909 scope.go:117] "RemoveContainer" containerID="dc447027ffdb3bb59a354a81b90b4b56827a36a60cc42f2c961461a7035d5eaf" Feb 02 11:22:39 crc kubenswrapper[4909]: E0202 11:22:39.656739 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc447027ffdb3bb59a354a81b90b4b56827a36a60cc42f2c961461a7035d5eaf\": container with ID starting with dc447027ffdb3bb59a354a81b90b4b56827a36a60cc42f2c961461a7035d5eaf not found: ID does not exist" containerID="dc447027ffdb3bb59a354a81b90b4b56827a36a60cc42f2c961461a7035d5eaf" Feb 02 11:22:39 crc kubenswrapper[4909]: I0202 11:22:39.656781 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc447027ffdb3bb59a354a81b90b4b56827a36a60cc42f2c961461a7035d5eaf"} err="failed to get container status \"dc447027ffdb3bb59a354a81b90b4b56827a36a60cc42f2c961461a7035d5eaf\": rpc error: code = NotFound desc = could not find container \"dc447027ffdb3bb59a354a81b90b4b56827a36a60cc42f2c961461a7035d5eaf\": container with ID starting with dc447027ffdb3bb59a354a81b90b4b56827a36a60cc42f2c961461a7035d5eaf not found: ID does not exist" Feb 02 11:22:39 crc kubenswrapper[4909]: I0202 11:22:39.656824 4909 scope.go:117] "RemoveContainer" containerID="c091abecaa4fa80786472d5c8b5003d6caa48588363fdefa361f889d3b2eabc3" Feb 02 11:22:39 crc kubenswrapper[4909]: E0202 11:22:39.657139 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c091abecaa4fa80786472d5c8b5003d6caa48588363fdefa361f889d3b2eabc3\": container with ID starting with c091abecaa4fa80786472d5c8b5003d6caa48588363fdefa361f889d3b2eabc3 not found: ID does not exist" containerID="c091abecaa4fa80786472d5c8b5003d6caa48588363fdefa361f889d3b2eabc3" Feb 02 11:22:39 crc kubenswrapper[4909]: I0202 11:22:39.657187 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c091abecaa4fa80786472d5c8b5003d6caa48588363fdefa361f889d3b2eabc3"} err="failed to get container status \"c091abecaa4fa80786472d5c8b5003d6caa48588363fdefa361f889d3b2eabc3\": rpc error: code = NotFound desc = could not find container \"c091abecaa4fa80786472d5c8b5003d6caa48588363fdefa361f889d3b2eabc3\": container with ID starting with c091abecaa4fa80786472d5c8b5003d6caa48588363fdefa361f889d3b2eabc3 not found: ID does not exist" Feb 02 11:22:39 crc kubenswrapper[4909]: I0202 11:22:39.657221 4909 scope.go:117] "RemoveContainer" containerID="430bb22dbab23bebaa6369530befa13b1e1da7e3bc40aa749da36590423f8ea6" Feb 02 11:22:39 crc kubenswrapper[4909]: E0202 
11:22:39.657567 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"430bb22dbab23bebaa6369530befa13b1e1da7e3bc40aa749da36590423f8ea6\": container with ID starting with 430bb22dbab23bebaa6369530befa13b1e1da7e3bc40aa749da36590423f8ea6 not found: ID does not exist" containerID="430bb22dbab23bebaa6369530befa13b1e1da7e3bc40aa749da36590423f8ea6" Feb 02 11:22:39 crc kubenswrapper[4909]: I0202 11:22:39.657596 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"430bb22dbab23bebaa6369530befa13b1e1da7e3bc40aa749da36590423f8ea6"} err="failed to get container status \"430bb22dbab23bebaa6369530befa13b1e1da7e3bc40aa749da36590423f8ea6\": rpc error: code = NotFound desc = could not find container \"430bb22dbab23bebaa6369530befa13b1e1da7e3bc40aa749da36590423f8ea6\": container with ID starting with 430bb22dbab23bebaa6369530befa13b1e1da7e3bc40aa749da36590423f8ea6 not found: ID does not exist" Feb 02 11:22:41 crc kubenswrapper[4909]: I0202 11:22:41.036604 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0922c33-f5bf-401b-9dee-57c3811e3357" path="/var/lib/kubelet/pods/d0922c33-f5bf-401b-9dee-57c3811e3357/volumes" Feb 02 11:22:54 crc kubenswrapper[4909]: I0202 11:22:54.016462 4909 scope.go:117] "RemoveContainer" containerID="84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26" Feb 02 11:22:54 crc kubenswrapper[4909]: E0202 11:22:54.017175 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:23:09 crc kubenswrapper[4909]: I0202 11:23:09.017228 
4909 scope.go:117] "RemoveContainer" containerID="84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26" Feb 02 11:23:09 crc kubenswrapper[4909]: E0202 11:23:09.018118 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:23:23 crc kubenswrapper[4909]: I0202 11:23:23.017613 4909 scope.go:117] "RemoveContainer" containerID="84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26" Feb 02 11:23:23 crc kubenswrapper[4909]: E0202 11:23:23.018328 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:23:34 crc kubenswrapper[4909]: I0202 11:23:34.016424 4909 scope.go:117] "RemoveContainer" containerID="84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26" Feb 02 11:23:34 crc kubenswrapper[4909]: E0202 11:23:34.017217 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:23:48 crc kubenswrapper[4909]: I0202 
11:23:48.016904 4909 scope.go:117] "RemoveContainer" containerID="84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26" Feb 02 11:23:48 crc kubenswrapper[4909]: E0202 11:23:48.018030 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:23:59 crc kubenswrapper[4909]: I0202 11:23:59.016705 4909 scope.go:117] "RemoveContainer" containerID="84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26" Feb 02 11:23:59 crc kubenswrapper[4909]: E0202 11:23:59.017442 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:24:14 crc kubenswrapper[4909]: I0202 11:24:14.016129 4909 scope.go:117] "RemoveContainer" containerID="84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26" Feb 02 11:24:14 crc kubenswrapper[4909]: E0202 11:24:14.016875 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:24:28 crc 
kubenswrapper[4909]: I0202 11:24:28.016982 4909 scope.go:117] "RemoveContainer" containerID="84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26" Feb 02 11:24:28 crc kubenswrapper[4909]: E0202 11:24:28.017888 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:24:41 crc kubenswrapper[4909]: I0202 11:24:41.016701 4909 scope.go:117] "RemoveContainer" containerID="84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26" Feb 02 11:24:41 crc kubenswrapper[4909]: E0202 11:24:41.017447 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:24:55 crc kubenswrapper[4909]: I0202 11:24:55.021247 4909 scope.go:117] "RemoveContainer" containerID="84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26" Feb 02 11:24:55 crc kubenswrapper[4909]: I0202 11:24:55.775555 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"401ec58a81f865f1d469f4269c75522eae0c401e9af17536ebba685dc73b468f"} Feb 02 11:27:13 crc kubenswrapper[4909]: I0202 11:27:13.155332 4909 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-r6xpp"] Feb 02 11:27:13 crc kubenswrapper[4909]: E0202 11:27:13.156264 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0922c33-f5bf-401b-9dee-57c3811e3357" containerName="registry-server" Feb 02 11:27:13 crc kubenswrapper[4909]: I0202 11:27:13.156277 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0922c33-f5bf-401b-9dee-57c3811e3357" containerName="registry-server" Feb 02 11:27:13 crc kubenswrapper[4909]: E0202 11:27:13.156285 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0922c33-f5bf-401b-9dee-57c3811e3357" containerName="extract-content" Feb 02 11:27:13 crc kubenswrapper[4909]: I0202 11:27:13.156291 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0922c33-f5bf-401b-9dee-57c3811e3357" containerName="extract-content" Feb 02 11:27:13 crc kubenswrapper[4909]: E0202 11:27:13.156314 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0922c33-f5bf-401b-9dee-57c3811e3357" containerName="extract-utilities" Feb 02 11:27:13 crc kubenswrapper[4909]: I0202 11:27:13.156320 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0922c33-f5bf-401b-9dee-57c3811e3357" containerName="extract-utilities" Feb 02 11:27:13 crc kubenswrapper[4909]: I0202 11:27:13.156443 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0922c33-f5bf-401b-9dee-57c3811e3357" containerName="registry-server" Feb 02 11:27:13 crc kubenswrapper[4909]: I0202 11:27:13.157394 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r6xpp" Feb 02 11:27:13 crc kubenswrapper[4909]: I0202 11:27:13.160916 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r6xpp"] Feb 02 11:27:13 crc kubenswrapper[4909]: I0202 11:27:13.174990 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwn57\" (UniqueName: \"kubernetes.io/projected/c7d817e1-51e1-4db3-b8cc-9043cbaf5b69-kube-api-access-rwn57\") pod \"redhat-operators-r6xpp\" (UID: \"c7d817e1-51e1-4db3-b8cc-9043cbaf5b69\") " pod="openshift-marketplace/redhat-operators-r6xpp" Feb 02 11:27:13 crc kubenswrapper[4909]: I0202 11:27:13.175040 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7d817e1-51e1-4db3-b8cc-9043cbaf5b69-catalog-content\") pod \"redhat-operators-r6xpp\" (UID: \"c7d817e1-51e1-4db3-b8cc-9043cbaf5b69\") " pod="openshift-marketplace/redhat-operators-r6xpp" Feb 02 11:27:13 crc kubenswrapper[4909]: I0202 11:27:13.175159 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7d817e1-51e1-4db3-b8cc-9043cbaf5b69-utilities\") pod \"redhat-operators-r6xpp\" (UID: \"c7d817e1-51e1-4db3-b8cc-9043cbaf5b69\") " pod="openshift-marketplace/redhat-operators-r6xpp" Feb 02 11:27:13 crc kubenswrapper[4909]: I0202 11:27:13.276732 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7d817e1-51e1-4db3-b8cc-9043cbaf5b69-utilities\") pod \"redhat-operators-r6xpp\" (UID: \"c7d817e1-51e1-4db3-b8cc-9043cbaf5b69\") " pod="openshift-marketplace/redhat-operators-r6xpp" Feb 02 11:27:13 crc kubenswrapper[4909]: I0202 11:27:13.277176 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rwn57\" (UniqueName: \"kubernetes.io/projected/c7d817e1-51e1-4db3-b8cc-9043cbaf5b69-kube-api-access-rwn57\") pod \"redhat-operators-r6xpp\" (UID: \"c7d817e1-51e1-4db3-b8cc-9043cbaf5b69\") " pod="openshift-marketplace/redhat-operators-r6xpp" Feb 02 11:27:13 crc kubenswrapper[4909]: I0202 11:27:13.277303 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7d817e1-51e1-4db3-b8cc-9043cbaf5b69-utilities\") pod \"redhat-operators-r6xpp\" (UID: \"c7d817e1-51e1-4db3-b8cc-9043cbaf5b69\") " pod="openshift-marketplace/redhat-operators-r6xpp" Feb 02 11:27:13 crc kubenswrapper[4909]: I0202 11:27:13.277572 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7d817e1-51e1-4db3-b8cc-9043cbaf5b69-catalog-content\") pod \"redhat-operators-r6xpp\" (UID: \"c7d817e1-51e1-4db3-b8cc-9043cbaf5b69\") " pod="openshift-marketplace/redhat-operators-r6xpp" Feb 02 11:27:13 crc kubenswrapper[4909]: I0202 11:27:13.277586 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7d817e1-51e1-4db3-b8cc-9043cbaf5b69-catalog-content\") pod \"redhat-operators-r6xpp\" (UID: \"c7d817e1-51e1-4db3-b8cc-9043cbaf5b69\") " pod="openshift-marketplace/redhat-operators-r6xpp" Feb 02 11:27:13 crc kubenswrapper[4909]: I0202 11:27:13.300890 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwn57\" (UniqueName: \"kubernetes.io/projected/c7d817e1-51e1-4db3-b8cc-9043cbaf5b69-kube-api-access-rwn57\") pod \"redhat-operators-r6xpp\" (UID: \"c7d817e1-51e1-4db3-b8cc-9043cbaf5b69\") " pod="openshift-marketplace/redhat-operators-r6xpp" Feb 02 11:27:13 crc kubenswrapper[4909]: I0202 11:27:13.486589 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r6xpp" Feb 02 11:27:13 crc kubenswrapper[4909]: I0202 11:27:13.929894 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r6xpp"] Feb 02 11:27:13 crc kubenswrapper[4909]: I0202 11:27:13.992000 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6xpp" event={"ID":"c7d817e1-51e1-4db3-b8cc-9043cbaf5b69","Type":"ContainerStarted","Data":"c55b983ff331dd9eb985693ca28461f29fb6192dea12ada983fae95b6254a8c5"} Feb 02 11:27:15 crc kubenswrapper[4909]: I0202 11:27:15.001264 4909 generic.go:334] "Generic (PLEG): container finished" podID="c7d817e1-51e1-4db3-b8cc-9043cbaf5b69" containerID="53187f5b069747244609102c4ecf34eb912c291acd36439bc7a2bc1e59dcb705" exitCode=0 Feb 02 11:27:15 crc kubenswrapper[4909]: I0202 11:27:15.001771 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6xpp" event={"ID":"c7d817e1-51e1-4db3-b8cc-9043cbaf5b69","Type":"ContainerDied","Data":"53187f5b069747244609102c4ecf34eb912c291acd36439bc7a2bc1e59dcb705"} Feb 02 11:27:17 crc kubenswrapper[4909]: I0202 11:27:17.024158 4909 generic.go:334] "Generic (PLEG): container finished" podID="c7d817e1-51e1-4db3-b8cc-9043cbaf5b69" containerID="ffbc4c0e265a16769dd77eeadb7414373d6694f857fe842bec69377a48e1e4cc" exitCode=0 Feb 02 11:27:17 crc kubenswrapper[4909]: I0202 11:27:17.034228 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6xpp" event={"ID":"c7d817e1-51e1-4db3-b8cc-9043cbaf5b69","Type":"ContainerDied","Data":"ffbc4c0e265a16769dd77eeadb7414373d6694f857fe842bec69377a48e1e4cc"} Feb 02 11:27:18 crc kubenswrapper[4909]: I0202 11:27:18.033156 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6xpp" 
event={"ID":"c7d817e1-51e1-4db3-b8cc-9043cbaf5b69","Type":"ContainerStarted","Data":"276482d6bdccaf17ffa08a17d8ccd08b1602db2984a5d4de13907eb00af262cd"} Feb 02 11:27:18 crc kubenswrapper[4909]: I0202 11:27:18.051184 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r6xpp" podStartSLOduration=2.619574773 podStartE2EDuration="5.051165921s" podCreationTimestamp="2026-02-02 11:27:13 +0000 UTC" firstStartedPulling="2026-02-02 11:27:15.003670573 +0000 UTC m=+3360.749771308" lastFinishedPulling="2026-02-02 11:27:17.435261721 +0000 UTC m=+3363.181362456" observedRunningTime="2026-02-02 11:27:18.048543607 +0000 UTC m=+3363.794644342" watchObservedRunningTime="2026-02-02 11:27:18.051165921 +0000 UTC m=+3363.797266676" Feb 02 11:27:19 crc kubenswrapper[4909]: I0202 11:27:19.511480 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:27:19 crc kubenswrapper[4909]: I0202 11:27:19.511822 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:27:23 crc kubenswrapper[4909]: I0202 11:27:23.487154 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r6xpp" Feb 02 11:27:23 crc kubenswrapper[4909]: I0202 11:27:23.489063 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r6xpp" Feb 02 11:27:23 crc kubenswrapper[4909]: I0202 11:27:23.534831 4909 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r6xpp" Feb 02 11:27:24 crc kubenswrapper[4909]: I0202 11:27:24.114945 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r6xpp" Feb 02 11:27:24 crc kubenswrapper[4909]: I0202 11:27:24.159911 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r6xpp"] Feb 02 11:27:26 crc kubenswrapper[4909]: I0202 11:27:26.076630 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r6xpp" podUID="c7d817e1-51e1-4db3-b8cc-9043cbaf5b69" containerName="registry-server" containerID="cri-o://276482d6bdccaf17ffa08a17d8ccd08b1602db2984a5d4de13907eb00af262cd" gracePeriod=2 Feb 02 11:27:26 crc kubenswrapper[4909]: I0202 11:27:26.437291 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r6xpp" Feb 02 11:27:26 crc kubenswrapper[4909]: I0202 11:27:26.551763 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwn57\" (UniqueName: \"kubernetes.io/projected/c7d817e1-51e1-4db3-b8cc-9043cbaf5b69-kube-api-access-rwn57\") pod \"c7d817e1-51e1-4db3-b8cc-9043cbaf5b69\" (UID: \"c7d817e1-51e1-4db3-b8cc-9043cbaf5b69\") " Feb 02 11:27:26 crc kubenswrapper[4909]: I0202 11:27:26.551853 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7d817e1-51e1-4db3-b8cc-9043cbaf5b69-catalog-content\") pod \"c7d817e1-51e1-4db3-b8cc-9043cbaf5b69\" (UID: \"c7d817e1-51e1-4db3-b8cc-9043cbaf5b69\") " Feb 02 11:27:26 crc kubenswrapper[4909]: I0202 11:27:26.551957 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7d817e1-51e1-4db3-b8cc-9043cbaf5b69-utilities\") pod 
\"c7d817e1-51e1-4db3-b8cc-9043cbaf5b69\" (UID: \"c7d817e1-51e1-4db3-b8cc-9043cbaf5b69\") " Feb 02 11:27:26 crc kubenswrapper[4909]: I0202 11:27:26.553157 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7d817e1-51e1-4db3-b8cc-9043cbaf5b69-utilities" (OuterVolumeSpecName: "utilities") pod "c7d817e1-51e1-4db3-b8cc-9043cbaf5b69" (UID: "c7d817e1-51e1-4db3-b8cc-9043cbaf5b69"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:27:26 crc kubenswrapper[4909]: I0202 11:27:26.558389 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7d817e1-51e1-4db3-b8cc-9043cbaf5b69-kube-api-access-rwn57" (OuterVolumeSpecName: "kube-api-access-rwn57") pod "c7d817e1-51e1-4db3-b8cc-9043cbaf5b69" (UID: "c7d817e1-51e1-4db3-b8cc-9043cbaf5b69"). InnerVolumeSpecName "kube-api-access-rwn57". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:27:26 crc kubenswrapper[4909]: I0202 11:27:26.653169 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7d817e1-51e1-4db3-b8cc-9043cbaf5b69-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:27:26 crc kubenswrapper[4909]: I0202 11:27:26.653430 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwn57\" (UniqueName: \"kubernetes.io/projected/c7d817e1-51e1-4db3-b8cc-9043cbaf5b69-kube-api-access-rwn57\") on node \"crc\" DevicePath \"\"" Feb 02 11:27:27 crc kubenswrapper[4909]: I0202 11:27:27.085156 4909 generic.go:334] "Generic (PLEG): container finished" podID="c7d817e1-51e1-4db3-b8cc-9043cbaf5b69" containerID="276482d6bdccaf17ffa08a17d8ccd08b1602db2984a5d4de13907eb00af262cd" exitCode=0 Feb 02 11:27:27 crc kubenswrapper[4909]: I0202 11:27:27.085201 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6xpp" 
event={"ID":"c7d817e1-51e1-4db3-b8cc-9043cbaf5b69","Type":"ContainerDied","Data":"276482d6bdccaf17ffa08a17d8ccd08b1602db2984a5d4de13907eb00af262cd"} Feb 02 11:27:27 crc kubenswrapper[4909]: I0202 11:27:27.085219 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r6xpp" Feb 02 11:27:27 crc kubenswrapper[4909]: I0202 11:27:27.085231 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6xpp" event={"ID":"c7d817e1-51e1-4db3-b8cc-9043cbaf5b69","Type":"ContainerDied","Data":"c55b983ff331dd9eb985693ca28461f29fb6192dea12ada983fae95b6254a8c5"} Feb 02 11:27:27 crc kubenswrapper[4909]: I0202 11:27:27.085249 4909 scope.go:117] "RemoveContainer" containerID="276482d6bdccaf17ffa08a17d8ccd08b1602db2984a5d4de13907eb00af262cd" Feb 02 11:27:27 crc kubenswrapper[4909]: I0202 11:27:27.103647 4909 scope.go:117] "RemoveContainer" containerID="ffbc4c0e265a16769dd77eeadb7414373d6694f857fe842bec69377a48e1e4cc" Feb 02 11:27:27 crc kubenswrapper[4909]: I0202 11:27:27.122790 4909 scope.go:117] "RemoveContainer" containerID="53187f5b069747244609102c4ecf34eb912c291acd36439bc7a2bc1e59dcb705" Feb 02 11:27:27 crc kubenswrapper[4909]: I0202 11:27:27.152102 4909 scope.go:117] "RemoveContainer" containerID="276482d6bdccaf17ffa08a17d8ccd08b1602db2984a5d4de13907eb00af262cd" Feb 02 11:27:27 crc kubenswrapper[4909]: E0202 11:27:27.153006 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"276482d6bdccaf17ffa08a17d8ccd08b1602db2984a5d4de13907eb00af262cd\": container with ID starting with 276482d6bdccaf17ffa08a17d8ccd08b1602db2984a5d4de13907eb00af262cd not found: ID does not exist" containerID="276482d6bdccaf17ffa08a17d8ccd08b1602db2984a5d4de13907eb00af262cd" Feb 02 11:27:27 crc kubenswrapper[4909]: I0202 11:27:27.153078 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"276482d6bdccaf17ffa08a17d8ccd08b1602db2984a5d4de13907eb00af262cd"} err="failed to get container status \"276482d6bdccaf17ffa08a17d8ccd08b1602db2984a5d4de13907eb00af262cd\": rpc error: code = NotFound desc = could not find container \"276482d6bdccaf17ffa08a17d8ccd08b1602db2984a5d4de13907eb00af262cd\": container with ID starting with 276482d6bdccaf17ffa08a17d8ccd08b1602db2984a5d4de13907eb00af262cd not found: ID does not exist" Feb 02 11:27:27 crc kubenswrapper[4909]: I0202 11:27:27.153105 4909 scope.go:117] "RemoveContainer" containerID="ffbc4c0e265a16769dd77eeadb7414373d6694f857fe842bec69377a48e1e4cc" Feb 02 11:27:27 crc kubenswrapper[4909]: E0202 11:27:27.153548 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffbc4c0e265a16769dd77eeadb7414373d6694f857fe842bec69377a48e1e4cc\": container with ID starting with ffbc4c0e265a16769dd77eeadb7414373d6694f857fe842bec69377a48e1e4cc not found: ID does not exist" containerID="ffbc4c0e265a16769dd77eeadb7414373d6694f857fe842bec69377a48e1e4cc" Feb 02 11:27:27 crc kubenswrapper[4909]: I0202 11:27:27.153570 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffbc4c0e265a16769dd77eeadb7414373d6694f857fe842bec69377a48e1e4cc"} err="failed to get container status \"ffbc4c0e265a16769dd77eeadb7414373d6694f857fe842bec69377a48e1e4cc\": rpc error: code = NotFound desc = could not find container \"ffbc4c0e265a16769dd77eeadb7414373d6694f857fe842bec69377a48e1e4cc\": container with ID starting with ffbc4c0e265a16769dd77eeadb7414373d6694f857fe842bec69377a48e1e4cc not found: ID does not exist" Feb 02 11:27:27 crc kubenswrapper[4909]: I0202 11:27:27.153584 4909 scope.go:117] "RemoveContainer" containerID="53187f5b069747244609102c4ecf34eb912c291acd36439bc7a2bc1e59dcb705" Feb 02 11:27:27 crc kubenswrapper[4909]: E0202 11:27:27.153868 4909 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"53187f5b069747244609102c4ecf34eb912c291acd36439bc7a2bc1e59dcb705\": container with ID starting with 53187f5b069747244609102c4ecf34eb912c291acd36439bc7a2bc1e59dcb705 not found: ID does not exist" containerID="53187f5b069747244609102c4ecf34eb912c291acd36439bc7a2bc1e59dcb705" Feb 02 11:27:27 crc kubenswrapper[4909]: I0202 11:27:27.153908 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53187f5b069747244609102c4ecf34eb912c291acd36439bc7a2bc1e59dcb705"} err="failed to get container status \"53187f5b069747244609102c4ecf34eb912c291acd36439bc7a2bc1e59dcb705\": rpc error: code = NotFound desc = could not find container \"53187f5b069747244609102c4ecf34eb912c291acd36439bc7a2bc1e59dcb705\": container with ID starting with 53187f5b069747244609102c4ecf34eb912c291acd36439bc7a2bc1e59dcb705 not found: ID does not exist" Feb 02 11:27:27 crc kubenswrapper[4909]: I0202 11:27:27.840699 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7d817e1-51e1-4db3-b8cc-9043cbaf5b69-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7d817e1-51e1-4db3-b8cc-9043cbaf5b69" (UID: "c7d817e1-51e1-4db3-b8cc-9043cbaf5b69"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:27:27 crc kubenswrapper[4909]: I0202 11:27:27.871020 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7d817e1-51e1-4db3-b8cc-9043cbaf5b69-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:27:28 crc kubenswrapper[4909]: I0202 11:27:28.016048 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r6xpp"] Feb 02 11:27:28 crc kubenswrapper[4909]: I0202 11:27:28.022147 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r6xpp"] Feb 02 11:27:29 crc kubenswrapper[4909]: I0202 11:27:29.026140 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7d817e1-51e1-4db3-b8cc-9043cbaf5b69" path="/var/lib/kubelet/pods/c7d817e1-51e1-4db3-b8cc-9043cbaf5b69/volumes" Feb 02 11:27:49 crc kubenswrapper[4909]: I0202 11:27:49.510988 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:27:49 crc kubenswrapper[4909]: I0202 11:27:49.512071 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:28:19 crc kubenswrapper[4909]: I0202 11:28:19.510984 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Feb 02 11:28:19 crc kubenswrapper[4909]: I0202 11:28:19.511900 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:28:19 crc kubenswrapper[4909]: I0202 11:28:19.511971 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 11:28:19 crc kubenswrapper[4909]: I0202 11:28:19.512924 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"401ec58a81f865f1d469f4269c75522eae0c401e9af17536ebba685dc73b468f"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 11:28:19 crc kubenswrapper[4909]: I0202 11:28:19.513124 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://401ec58a81f865f1d469f4269c75522eae0c401e9af17536ebba685dc73b468f" gracePeriod=600 Feb 02 11:28:20 crc kubenswrapper[4909]: I0202 11:28:20.486314 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="401ec58a81f865f1d469f4269c75522eae0c401e9af17536ebba685dc73b468f" exitCode=0 Feb 02 11:28:20 crc kubenswrapper[4909]: I0202 11:28:20.486404 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" 
event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"401ec58a81f865f1d469f4269c75522eae0c401e9af17536ebba685dc73b468f"} Feb 02 11:28:20 crc kubenswrapper[4909]: I0202 11:28:20.486669 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7"} Feb 02 11:28:20 crc kubenswrapper[4909]: I0202 11:28:20.486717 4909 scope.go:117] "RemoveContainer" containerID="84ad2af5870e9b6151b069eb5c0c54620b9f1c2ca47e5dfb3d01466a784d9f26" Feb 02 11:28:29 crc kubenswrapper[4909]: I0202 11:28:29.407656 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5pb52"] Feb 02 11:28:29 crc kubenswrapper[4909]: E0202 11:28:29.409455 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d817e1-51e1-4db3-b8cc-9043cbaf5b69" containerName="registry-server" Feb 02 11:28:29 crc kubenswrapper[4909]: I0202 11:28:29.409476 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d817e1-51e1-4db3-b8cc-9043cbaf5b69" containerName="registry-server" Feb 02 11:28:29 crc kubenswrapper[4909]: E0202 11:28:29.409509 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d817e1-51e1-4db3-b8cc-9043cbaf5b69" containerName="extract-content" Feb 02 11:28:29 crc kubenswrapper[4909]: I0202 11:28:29.409518 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d817e1-51e1-4db3-b8cc-9043cbaf5b69" containerName="extract-content" Feb 02 11:28:29 crc kubenswrapper[4909]: E0202 11:28:29.409535 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d817e1-51e1-4db3-b8cc-9043cbaf5b69" containerName="extract-utilities" Feb 02 11:28:29 crc kubenswrapper[4909]: I0202 11:28:29.409543 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d817e1-51e1-4db3-b8cc-9043cbaf5b69" 
containerName="extract-utilities" Feb 02 11:28:29 crc kubenswrapper[4909]: I0202 11:28:29.409834 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7d817e1-51e1-4db3-b8cc-9043cbaf5b69" containerName="registry-server" Feb 02 11:28:29 crc kubenswrapper[4909]: I0202 11:28:29.411314 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5pb52" Feb 02 11:28:29 crc kubenswrapper[4909]: I0202 11:28:29.421753 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5pb52"] Feb 02 11:28:29 crc kubenswrapper[4909]: I0202 11:28:29.470792 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33c21b1d-4c73-4561-ac5e-f6c446dc27cf-catalog-content\") pod \"community-operators-5pb52\" (UID: \"33c21b1d-4c73-4561-ac5e-f6c446dc27cf\") " pod="openshift-marketplace/community-operators-5pb52" Feb 02 11:28:29 crc kubenswrapper[4909]: I0202 11:28:29.470886 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncjkp\" (UniqueName: \"kubernetes.io/projected/33c21b1d-4c73-4561-ac5e-f6c446dc27cf-kube-api-access-ncjkp\") pod \"community-operators-5pb52\" (UID: \"33c21b1d-4c73-4561-ac5e-f6c446dc27cf\") " pod="openshift-marketplace/community-operators-5pb52" Feb 02 11:28:29 crc kubenswrapper[4909]: I0202 11:28:29.470944 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33c21b1d-4c73-4561-ac5e-f6c446dc27cf-utilities\") pod \"community-operators-5pb52\" (UID: \"33c21b1d-4c73-4561-ac5e-f6c446dc27cf\") " pod="openshift-marketplace/community-operators-5pb52" Feb 02 11:28:29 crc kubenswrapper[4909]: I0202 11:28:29.571439 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33c21b1d-4c73-4561-ac5e-f6c446dc27cf-catalog-content\") pod \"community-operators-5pb52\" (UID: \"33c21b1d-4c73-4561-ac5e-f6c446dc27cf\") " pod="openshift-marketplace/community-operators-5pb52" Feb 02 11:28:29 crc kubenswrapper[4909]: I0202 11:28:29.571870 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncjkp\" (UniqueName: \"kubernetes.io/projected/33c21b1d-4c73-4561-ac5e-f6c446dc27cf-kube-api-access-ncjkp\") pod \"community-operators-5pb52\" (UID: \"33c21b1d-4c73-4561-ac5e-f6c446dc27cf\") " pod="openshift-marketplace/community-operators-5pb52" Feb 02 11:28:29 crc kubenswrapper[4909]: I0202 11:28:29.571932 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33c21b1d-4c73-4561-ac5e-f6c446dc27cf-utilities\") pod \"community-operators-5pb52\" (UID: \"33c21b1d-4c73-4561-ac5e-f6c446dc27cf\") " pod="openshift-marketplace/community-operators-5pb52" Feb 02 11:28:29 crc kubenswrapper[4909]: I0202 11:28:29.572031 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33c21b1d-4c73-4561-ac5e-f6c446dc27cf-catalog-content\") pod \"community-operators-5pb52\" (UID: \"33c21b1d-4c73-4561-ac5e-f6c446dc27cf\") " pod="openshift-marketplace/community-operators-5pb52" Feb 02 11:28:29 crc kubenswrapper[4909]: I0202 11:28:29.572397 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33c21b1d-4c73-4561-ac5e-f6c446dc27cf-utilities\") pod \"community-operators-5pb52\" (UID: \"33c21b1d-4c73-4561-ac5e-f6c446dc27cf\") " pod="openshift-marketplace/community-operators-5pb52" Feb 02 11:28:29 crc kubenswrapper[4909]: I0202 11:28:29.590507 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncjkp\" (UniqueName: 
\"kubernetes.io/projected/33c21b1d-4c73-4561-ac5e-f6c446dc27cf-kube-api-access-ncjkp\") pod \"community-operators-5pb52\" (UID: \"33c21b1d-4c73-4561-ac5e-f6c446dc27cf\") " pod="openshift-marketplace/community-operators-5pb52" Feb 02 11:28:29 crc kubenswrapper[4909]: I0202 11:28:29.792466 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5pb52" Feb 02 11:28:30 crc kubenswrapper[4909]: I0202 11:28:30.241878 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5pb52"] Feb 02 11:28:30 crc kubenswrapper[4909]: I0202 11:28:30.561258 4909 generic.go:334] "Generic (PLEG): container finished" podID="33c21b1d-4c73-4561-ac5e-f6c446dc27cf" containerID="bd3e746b23bc4fb7a24d1fcf39cddf39b52fa2dbaede04e9f903ae0e74b19c9c" exitCode=0 Feb 02 11:28:30 crc kubenswrapper[4909]: I0202 11:28:30.561321 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5pb52" event={"ID":"33c21b1d-4c73-4561-ac5e-f6c446dc27cf","Type":"ContainerDied","Data":"bd3e746b23bc4fb7a24d1fcf39cddf39b52fa2dbaede04e9f903ae0e74b19c9c"} Feb 02 11:28:30 crc kubenswrapper[4909]: I0202 11:28:30.561390 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5pb52" event={"ID":"33c21b1d-4c73-4561-ac5e-f6c446dc27cf","Type":"ContainerStarted","Data":"1a7ef19e58586f1998435cc00a0962a9f665b984c3ec9ced42e99d931d37bbb3"} Feb 02 11:28:30 crc kubenswrapper[4909]: I0202 11:28:30.562692 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 11:28:31 crc kubenswrapper[4909]: I0202 11:28:31.575228 4909 generic.go:334] "Generic (PLEG): container finished" podID="33c21b1d-4c73-4561-ac5e-f6c446dc27cf" containerID="79310c95176e2159ea3e30b46bb5c87f2637d96f010fb9a6dc81d6b34100315a" exitCode=0 Feb 02 11:28:31 crc kubenswrapper[4909]: I0202 11:28:31.575303 4909 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5pb52" event={"ID":"33c21b1d-4c73-4561-ac5e-f6c446dc27cf","Type":"ContainerDied","Data":"79310c95176e2159ea3e30b46bb5c87f2637d96f010fb9a6dc81d6b34100315a"} Feb 02 11:28:32 crc kubenswrapper[4909]: I0202 11:28:32.588140 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5pb52" event={"ID":"33c21b1d-4c73-4561-ac5e-f6c446dc27cf","Type":"ContainerStarted","Data":"8c5f1d21026482b79f6eb3b8dff0df1d874ed0f39adebabdeb9dff475cbee4cd"} Feb 02 11:28:32 crc kubenswrapper[4909]: I0202 11:28:32.612667 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5pb52" podStartSLOduration=2.0365895 podStartE2EDuration="3.612642659s" podCreationTimestamp="2026-02-02 11:28:29 +0000 UTC" firstStartedPulling="2026-02-02 11:28:30.562504161 +0000 UTC m=+3436.308604896" lastFinishedPulling="2026-02-02 11:28:32.13855732 +0000 UTC m=+3437.884658055" observedRunningTime="2026-02-02 11:28:32.61019846 +0000 UTC m=+3438.356299205" watchObservedRunningTime="2026-02-02 11:28:32.612642659 +0000 UTC m=+3438.358743394" Feb 02 11:28:39 crc kubenswrapper[4909]: I0202 11:28:39.793212 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5pb52" Feb 02 11:28:39 crc kubenswrapper[4909]: I0202 11:28:39.794160 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5pb52" Feb 02 11:28:39 crc kubenswrapper[4909]: I0202 11:28:39.833116 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5pb52" Feb 02 11:28:39 crc kubenswrapper[4909]: I0202 11:28:39.904089 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5pb52" Feb 02 11:28:40 crc kubenswrapper[4909]: I0202 
11:28:40.061348 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5pb52"] Feb 02 11:28:41 crc kubenswrapper[4909]: I0202 11:28:41.881909 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5pb52" podUID="33c21b1d-4c73-4561-ac5e-f6c446dc27cf" containerName="registry-server" containerID="cri-o://8c5f1d21026482b79f6eb3b8dff0df1d874ed0f39adebabdeb9dff475cbee4cd" gracePeriod=2 Feb 02 11:28:42 crc kubenswrapper[4909]: I0202 11:28:42.239866 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5pb52" Feb 02 11:28:42 crc kubenswrapper[4909]: I0202 11:28:42.384654 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33c21b1d-4c73-4561-ac5e-f6c446dc27cf-utilities\") pod \"33c21b1d-4c73-4561-ac5e-f6c446dc27cf\" (UID: \"33c21b1d-4c73-4561-ac5e-f6c446dc27cf\") " Feb 02 11:28:42 crc kubenswrapper[4909]: I0202 11:28:42.384715 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33c21b1d-4c73-4561-ac5e-f6c446dc27cf-catalog-content\") pod \"33c21b1d-4c73-4561-ac5e-f6c446dc27cf\" (UID: \"33c21b1d-4c73-4561-ac5e-f6c446dc27cf\") " Feb 02 11:28:42 crc kubenswrapper[4909]: I0202 11:28:42.384818 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncjkp\" (UniqueName: \"kubernetes.io/projected/33c21b1d-4c73-4561-ac5e-f6c446dc27cf-kube-api-access-ncjkp\") pod \"33c21b1d-4c73-4561-ac5e-f6c446dc27cf\" (UID: \"33c21b1d-4c73-4561-ac5e-f6c446dc27cf\") " Feb 02 11:28:42 crc kubenswrapper[4909]: I0202 11:28:42.386034 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33c21b1d-4c73-4561-ac5e-f6c446dc27cf-utilities" (OuterVolumeSpecName: 
"utilities") pod "33c21b1d-4c73-4561-ac5e-f6c446dc27cf" (UID: "33c21b1d-4c73-4561-ac5e-f6c446dc27cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:28:42 crc kubenswrapper[4909]: I0202 11:28:42.389982 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33c21b1d-4c73-4561-ac5e-f6c446dc27cf-kube-api-access-ncjkp" (OuterVolumeSpecName: "kube-api-access-ncjkp") pod "33c21b1d-4c73-4561-ac5e-f6c446dc27cf" (UID: "33c21b1d-4c73-4561-ac5e-f6c446dc27cf"). InnerVolumeSpecName "kube-api-access-ncjkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:28:42 crc kubenswrapper[4909]: I0202 11:28:42.486203 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33c21b1d-4c73-4561-ac5e-f6c446dc27cf-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:28:42 crc kubenswrapper[4909]: I0202 11:28:42.486237 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncjkp\" (UniqueName: \"kubernetes.io/projected/33c21b1d-4c73-4561-ac5e-f6c446dc27cf-kube-api-access-ncjkp\") on node \"crc\" DevicePath \"\"" Feb 02 11:28:42 crc kubenswrapper[4909]: I0202 11:28:42.702953 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33c21b1d-4c73-4561-ac5e-f6c446dc27cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33c21b1d-4c73-4561-ac5e-f6c446dc27cf" (UID: "33c21b1d-4c73-4561-ac5e-f6c446dc27cf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:28:42 crc kubenswrapper[4909]: I0202 11:28:42.789703 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33c21b1d-4c73-4561-ac5e-f6c446dc27cf-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:28:42 crc kubenswrapper[4909]: I0202 11:28:42.896971 4909 generic.go:334] "Generic (PLEG): container finished" podID="33c21b1d-4c73-4561-ac5e-f6c446dc27cf" containerID="8c5f1d21026482b79f6eb3b8dff0df1d874ed0f39adebabdeb9dff475cbee4cd" exitCode=0 Feb 02 11:28:42 crc kubenswrapper[4909]: I0202 11:28:42.897036 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5pb52" event={"ID":"33c21b1d-4c73-4561-ac5e-f6c446dc27cf","Type":"ContainerDied","Data":"8c5f1d21026482b79f6eb3b8dff0df1d874ed0f39adebabdeb9dff475cbee4cd"} Feb 02 11:28:42 crc kubenswrapper[4909]: I0202 11:28:42.897070 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5pb52" event={"ID":"33c21b1d-4c73-4561-ac5e-f6c446dc27cf","Type":"ContainerDied","Data":"1a7ef19e58586f1998435cc00a0962a9f665b984c3ec9ced42e99d931d37bbb3"} Feb 02 11:28:42 crc kubenswrapper[4909]: I0202 11:28:42.897090 4909 scope.go:117] "RemoveContainer" containerID="8c5f1d21026482b79f6eb3b8dff0df1d874ed0f39adebabdeb9dff475cbee4cd" Feb 02 11:28:42 crc kubenswrapper[4909]: I0202 11:28:42.897293 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5pb52" Feb 02 11:28:42 crc kubenswrapper[4909]: I0202 11:28:42.915006 4909 scope.go:117] "RemoveContainer" containerID="79310c95176e2159ea3e30b46bb5c87f2637d96f010fb9a6dc81d6b34100315a" Feb 02 11:28:42 crc kubenswrapper[4909]: I0202 11:28:42.928833 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5pb52"] Feb 02 11:28:42 crc kubenswrapper[4909]: I0202 11:28:42.936268 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5pb52"] Feb 02 11:28:42 crc kubenswrapper[4909]: I0202 11:28:42.947249 4909 scope.go:117] "RemoveContainer" containerID="bd3e746b23bc4fb7a24d1fcf39cddf39b52fa2dbaede04e9f903ae0e74b19c9c" Feb 02 11:28:42 crc kubenswrapper[4909]: I0202 11:28:42.961376 4909 scope.go:117] "RemoveContainer" containerID="8c5f1d21026482b79f6eb3b8dff0df1d874ed0f39adebabdeb9dff475cbee4cd" Feb 02 11:28:42 crc kubenswrapper[4909]: E0202 11:28:42.961850 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c5f1d21026482b79f6eb3b8dff0df1d874ed0f39adebabdeb9dff475cbee4cd\": container with ID starting with 8c5f1d21026482b79f6eb3b8dff0df1d874ed0f39adebabdeb9dff475cbee4cd not found: ID does not exist" containerID="8c5f1d21026482b79f6eb3b8dff0df1d874ed0f39adebabdeb9dff475cbee4cd" Feb 02 11:28:42 crc kubenswrapper[4909]: I0202 11:28:42.961879 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c5f1d21026482b79f6eb3b8dff0df1d874ed0f39adebabdeb9dff475cbee4cd"} err="failed to get container status \"8c5f1d21026482b79f6eb3b8dff0df1d874ed0f39adebabdeb9dff475cbee4cd\": rpc error: code = NotFound desc = could not find container \"8c5f1d21026482b79f6eb3b8dff0df1d874ed0f39adebabdeb9dff475cbee4cd\": container with ID starting with 8c5f1d21026482b79f6eb3b8dff0df1d874ed0f39adebabdeb9dff475cbee4cd not 
found: ID does not exist" Feb 02 11:28:42 crc kubenswrapper[4909]: I0202 11:28:42.961899 4909 scope.go:117] "RemoveContainer" containerID="79310c95176e2159ea3e30b46bb5c87f2637d96f010fb9a6dc81d6b34100315a" Feb 02 11:28:42 crc kubenswrapper[4909]: E0202 11:28:42.962207 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79310c95176e2159ea3e30b46bb5c87f2637d96f010fb9a6dc81d6b34100315a\": container with ID starting with 79310c95176e2159ea3e30b46bb5c87f2637d96f010fb9a6dc81d6b34100315a not found: ID does not exist" containerID="79310c95176e2159ea3e30b46bb5c87f2637d96f010fb9a6dc81d6b34100315a" Feb 02 11:28:42 crc kubenswrapper[4909]: I0202 11:28:42.962245 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79310c95176e2159ea3e30b46bb5c87f2637d96f010fb9a6dc81d6b34100315a"} err="failed to get container status \"79310c95176e2159ea3e30b46bb5c87f2637d96f010fb9a6dc81d6b34100315a\": rpc error: code = NotFound desc = could not find container \"79310c95176e2159ea3e30b46bb5c87f2637d96f010fb9a6dc81d6b34100315a\": container with ID starting with 79310c95176e2159ea3e30b46bb5c87f2637d96f010fb9a6dc81d6b34100315a not found: ID does not exist" Feb 02 11:28:42 crc kubenswrapper[4909]: I0202 11:28:42.962270 4909 scope.go:117] "RemoveContainer" containerID="bd3e746b23bc4fb7a24d1fcf39cddf39b52fa2dbaede04e9f903ae0e74b19c9c" Feb 02 11:28:42 crc kubenswrapper[4909]: E0202 11:28:42.962528 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd3e746b23bc4fb7a24d1fcf39cddf39b52fa2dbaede04e9f903ae0e74b19c9c\": container with ID starting with bd3e746b23bc4fb7a24d1fcf39cddf39b52fa2dbaede04e9f903ae0e74b19c9c not found: ID does not exist" containerID="bd3e746b23bc4fb7a24d1fcf39cddf39b52fa2dbaede04e9f903ae0e74b19c9c" Feb 02 11:28:42 crc kubenswrapper[4909]: I0202 11:28:42.962546 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd3e746b23bc4fb7a24d1fcf39cddf39b52fa2dbaede04e9f903ae0e74b19c9c"} err="failed to get container status \"bd3e746b23bc4fb7a24d1fcf39cddf39b52fa2dbaede04e9f903ae0e74b19c9c\": rpc error: code = NotFound desc = could not find container \"bd3e746b23bc4fb7a24d1fcf39cddf39b52fa2dbaede04e9f903ae0e74b19c9c\": container with ID starting with bd3e746b23bc4fb7a24d1fcf39cddf39b52fa2dbaede04e9f903ae0e74b19c9c not found: ID does not exist" Feb 02 11:28:43 crc kubenswrapper[4909]: I0202 11:28:43.026305 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33c21b1d-4c73-4561-ac5e-f6c446dc27cf" path="/var/lib/kubelet/pods/33c21b1d-4c73-4561-ac5e-f6c446dc27cf/volumes" Feb 02 11:30:00 crc kubenswrapper[4909]: I0202 11:30:00.146628 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500530-xv575"] Feb 02 11:30:00 crc kubenswrapper[4909]: E0202 11:30:00.148688 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33c21b1d-4c73-4561-ac5e-f6c446dc27cf" containerName="extract-utilities" Feb 02 11:30:00 crc kubenswrapper[4909]: I0202 11:30:00.148755 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="33c21b1d-4c73-4561-ac5e-f6c446dc27cf" containerName="extract-utilities" Feb 02 11:30:00 crc kubenswrapper[4909]: E0202 11:30:00.148844 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33c21b1d-4c73-4561-ac5e-f6c446dc27cf" containerName="registry-server" Feb 02 11:30:00 crc kubenswrapper[4909]: I0202 11:30:00.148858 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="33c21b1d-4c73-4561-ac5e-f6c446dc27cf" containerName="registry-server" Feb 02 11:30:00 crc kubenswrapper[4909]: E0202 11:30:00.148901 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33c21b1d-4c73-4561-ac5e-f6c446dc27cf" containerName="extract-content" Feb 02 11:30:00 crc 
kubenswrapper[4909]: I0202 11:30:00.148912 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="33c21b1d-4c73-4561-ac5e-f6c446dc27cf" containerName="extract-content" Feb 02 11:30:00 crc kubenswrapper[4909]: I0202 11:30:00.149270 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="33c21b1d-4c73-4561-ac5e-f6c446dc27cf" containerName="registry-server" Feb 02 11:30:00 crc kubenswrapper[4909]: I0202 11:30:00.150202 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-xv575" Feb 02 11:30:00 crc kubenswrapper[4909]: I0202 11:30:00.153697 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 02 11:30:00 crc kubenswrapper[4909]: I0202 11:30:00.153729 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500530-xv575"] Feb 02 11:30:00 crc kubenswrapper[4909]: I0202 11:30:00.153697 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 02 11:30:00 crc kubenswrapper[4909]: I0202 11:30:00.274648 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7d28527-82aa-4c77-b150-d94c4e9c4f32-config-volume\") pod \"collect-profiles-29500530-xv575\" (UID: \"f7d28527-82aa-4c77-b150-d94c4e9c4f32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-xv575" Feb 02 11:30:00 crc kubenswrapper[4909]: I0202 11:30:00.274704 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxwjm\" (UniqueName: \"kubernetes.io/projected/f7d28527-82aa-4c77-b150-d94c4e9c4f32-kube-api-access-xxwjm\") pod \"collect-profiles-29500530-xv575\" (UID: 
\"f7d28527-82aa-4c77-b150-d94c4e9c4f32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-xv575" Feb 02 11:30:00 crc kubenswrapper[4909]: I0202 11:30:00.274757 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7d28527-82aa-4c77-b150-d94c4e9c4f32-secret-volume\") pod \"collect-profiles-29500530-xv575\" (UID: \"f7d28527-82aa-4c77-b150-d94c4e9c4f32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-xv575" Feb 02 11:30:00 crc kubenswrapper[4909]: I0202 11:30:00.376962 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxwjm\" (UniqueName: \"kubernetes.io/projected/f7d28527-82aa-4c77-b150-d94c4e9c4f32-kube-api-access-xxwjm\") pod \"collect-profiles-29500530-xv575\" (UID: \"f7d28527-82aa-4c77-b150-d94c4e9c4f32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-xv575" Feb 02 11:30:00 crc kubenswrapper[4909]: I0202 11:30:00.377054 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7d28527-82aa-4c77-b150-d94c4e9c4f32-secret-volume\") pod \"collect-profiles-29500530-xv575\" (UID: \"f7d28527-82aa-4c77-b150-d94c4e9c4f32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-xv575" Feb 02 11:30:00 crc kubenswrapper[4909]: I0202 11:30:00.377126 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7d28527-82aa-4c77-b150-d94c4e9c4f32-config-volume\") pod \"collect-profiles-29500530-xv575\" (UID: \"f7d28527-82aa-4c77-b150-d94c4e9c4f32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-xv575" Feb 02 11:30:00 crc kubenswrapper[4909]: I0202 11:30:00.378124 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/f7d28527-82aa-4c77-b150-d94c4e9c4f32-config-volume\") pod \"collect-profiles-29500530-xv575\" (UID: \"f7d28527-82aa-4c77-b150-d94c4e9c4f32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-xv575" Feb 02 11:30:00 crc kubenswrapper[4909]: I0202 11:30:00.385601 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7d28527-82aa-4c77-b150-d94c4e9c4f32-secret-volume\") pod \"collect-profiles-29500530-xv575\" (UID: \"f7d28527-82aa-4c77-b150-d94c4e9c4f32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-xv575" Feb 02 11:30:00 crc kubenswrapper[4909]: I0202 11:30:00.394134 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxwjm\" (UniqueName: \"kubernetes.io/projected/f7d28527-82aa-4c77-b150-d94c4e9c4f32-kube-api-access-xxwjm\") pod \"collect-profiles-29500530-xv575\" (UID: \"f7d28527-82aa-4c77-b150-d94c4e9c4f32\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-xv575" Feb 02 11:30:00 crc kubenswrapper[4909]: I0202 11:30:00.479626 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-xv575" Feb 02 11:30:00 crc kubenswrapper[4909]: I0202 11:30:00.885869 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500530-xv575"] Feb 02 11:30:01 crc kubenswrapper[4909]: I0202 11:30:01.138514 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-xv575" event={"ID":"f7d28527-82aa-4c77-b150-d94c4e9c4f32","Type":"ContainerStarted","Data":"29fcba3fe2ea1b72f8d6aa8a8a4c017a890fec1a0a7a6c956786a50d42a118bc"} Feb 02 11:30:01 crc kubenswrapper[4909]: I0202 11:30:01.138568 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-xv575" event={"ID":"f7d28527-82aa-4c77-b150-d94c4e9c4f32","Type":"ContainerStarted","Data":"8f38404b743d5982cfd5886fa7064c1dd4c654c55c0d920e2e3ebc62bd431a68"} Feb 02 11:30:02 crc kubenswrapper[4909]: I0202 11:30:02.144957 4909 generic.go:334] "Generic (PLEG): container finished" podID="f7d28527-82aa-4c77-b150-d94c4e9c4f32" containerID="29fcba3fe2ea1b72f8d6aa8a8a4c017a890fec1a0a7a6c956786a50d42a118bc" exitCode=0 Feb 02 11:30:02 crc kubenswrapper[4909]: I0202 11:30:02.145004 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-xv575" event={"ID":"f7d28527-82aa-4c77-b150-d94c4e9c4f32","Type":"ContainerDied","Data":"29fcba3fe2ea1b72f8d6aa8a8a4c017a890fec1a0a7a6c956786a50d42a118bc"} Feb 02 11:30:03 crc kubenswrapper[4909]: I0202 11:30:03.457540 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-xv575" Feb 02 11:30:03 crc kubenswrapper[4909]: I0202 11:30:03.627691 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7d28527-82aa-4c77-b150-d94c4e9c4f32-config-volume\") pod \"f7d28527-82aa-4c77-b150-d94c4e9c4f32\" (UID: \"f7d28527-82aa-4c77-b150-d94c4e9c4f32\") " Feb 02 11:30:03 crc kubenswrapper[4909]: I0202 11:30:03.627742 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7d28527-82aa-4c77-b150-d94c4e9c4f32-secret-volume\") pod \"f7d28527-82aa-4c77-b150-d94c4e9c4f32\" (UID: \"f7d28527-82aa-4c77-b150-d94c4e9c4f32\") " Feb 02 11:30:03 crc kubenswrapper[4909]: I0202 11:30:03.627864 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxwjm\" (UniqueName: \"kubernetes.io/projected/f7d28527-82aa-4c77-b150-d94c4e9c4f32-kube-api-access-xxwjm\") pod \"f7d28527-82aa-4c77-b150-d94c4e9c4f32\" (UID: \"f7d28527-82aa-4c77-b150-d94c4e9c4f32\") " Feb 02 11:30:03 crc kubenswrapper[4909]: I0202 11:30:03.628697 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7d28527-82aa-4c77-b150-d94c4e9c4f32-config-volume" (OuterVolumeSpecName: "config-volume") pod "f7d28527-82aa-4c77-b150-d94c4e9c4f32" (UID: "f7d28527-82aa-4c77-b150-d94c4e9c4f32"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:30:03 crc kubenswrapper[4909]: I0202 11:30:03.633337 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7d28527-82aa-4c77-b150-d94c4e9c4f32-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f7d28527-82aa-4c77-b150-d94c4e9c4f32" (UID: "f7d28527-82aa-4c77-b150-d94c4e9c4f32"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:30:03 crc kubenswrapper[4909]: I0202 11:30:03.634059 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7d28527-82aa-4c77-b150-d94c4e9c4f32-kube-api-access-xxwjm" (OuterVolumeSpecName: "kube-api-access-xxwjm") pod "f7d28527-82aa-4c77-b150-d94c4e9c4f32" (UID: "f7d28527-82aa-4c77-b150-d94c4e9c4f32"). InnerVolumeSpecName "kube-api-access-xxwjm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:30:03 crc kubenswrapper[4909]: I0202 11:30:03.729133 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7d28527-82aa-4c77-b150-d94c4e9c4f32-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 11:30:03 crc kubenswrapper[4909]: I0202 11:30:03.729427 4909 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f7d28527-82aa-4c77-b150-d94c4e9c4f32-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 02 11:30:03 crc kubenswrapper[4909]: I0202 11:30:03.729510 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxwjm\" (UniqueName: \"kubernetes.io/projected/f7d28527-82aa-4c77-b150-d94c4e9c4f32-kube-api-access-xxwjm\") on node \"crc\" DevicePath \"\"" Feb 02 11:30:04 crc kubenswrapper[4909]: I0202 11:30:04.157557 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-xv575" event={"ID":"f7d28527-82aa-4c77-b150-d94c4e9c4f32","Type":"ContainerDied","Data":"8f38404b743d5982cfd5886fa7064c1dd4c654c55c0d920e2e3ebc62bd431a68"} Feb 02 11:30:04 crc kubenswrapper[4909]: I0202 11:30:04.157599 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f38404b743d5982cfd5886fa7064c1dd4c654c55c0d920e2e3ebc62bd431a68" Feb 02 11:30:04 crc kubenswrapper[4909]: I0202 11:30:04.157632 4909 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500530-xv575" Feb 02 11:30:04 crc kubenswrapper[4909]: I0202 11:30:04.519577 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500485-vsktj"] Feb 02 11:30:04 crc kubenswrapper[4909]: I0202 11:30:04.528068 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500485-vsktj"] Feb 02 11:30:05 crc kubenswrapper[4909]: I0202 11:30:05.024506 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="995fb696-26f8-4ae2-9552-14bea880b2ff" path="/var/lib/kubelet/pods/995fb696-26f8-4ae2-9552-14bea880b2ff/volumes" Feb 02 11:30:06 crc kubenswrapper[4909]: I0202 11:30:06.755056 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p9blt"] Feb 02 11:30:06 crc kubenswrapper[4909]: E0202 11:30:06.756248 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7d28527-82aa-4c77-b150-d94c4e9c4f32" containerName="collect-profiles" Feb 02 11:30:06 crc kubenswrapper[4909]: I0202 11:30:06.756270 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7d28527-82aa-4c77-b150-d94c4e9c4f32" containerName="collect-profiles" Feb 02 11:30:06 crc kubenswrapper[4909]: I0202 11:30:06.756456 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7d28527-82aa-4c77-b150-d94c4e9c4f32" containerName="collect-profiles" Feb 02 11:30:06 crc kubenswrapper[4909]: I0202 11:30:06.757476 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p9blt" Feb 02 11:30:06 crc kubenswrapper[4909]: I0202 11:30:06.767023 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p9blt"] Feb 02 11:30:06 crc kubenswrapper[4909]: I0202 11:30:06.867532 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc40ec40-c511-435c-9324-0d81db7d3416-utilities\") pod \"certified-operators-p9blt\" (UID: \"cc40ec40-c511-435c-9324-0d81db7d3416\") " pod="openshift-marketplace/certified-operators-p9blt" Feb 02 11:30:06 crc kubenswrapper[4909]: I0202 11:30:06.867683 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2p74\" (UniqueName: \"kubernetes.io/projected/cc40ec40-c511-435c-9324-0d81db7d3416-kube-api-access-z2p74\") pod \"certified-operators-p9blt\" (UID: \"cc40ec40-c511-435c-9324-0d81db7d3416\") " pod="openshift-marketplace/certified-operators-p9blt" Feb 02 11:30:06 crc kubenswrapper[4909]: I0202 11:30:06.867730 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc40ec40-c511-435c-9324-0d81db7d3416-catalog-content\") pod \"certified-operators-p9blt\" (UID: \"cc40ec40-c511-435c-9324-0d81db7d3416\") " pod="openshift-marketplace/certified-operators-p9blt" Feb 02 11:30:06 crc kubenswrapper[4909]: I0202 11:30:06.969274 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2p74\" (UniqueName: \"kubernetes.io/projected/cc40ec40-c511-435c-9324-0d81db7d3416-kube-api-access-z2p74\") pod \"certified-operators-p9blt\" (UID: \"cc40ec40-c511-435c-9324-0d81db7d3416\") " pod="openshift-marketplace/certified-operators-p9blt" Feb 02 11:30:06 crc kubenswrapper[4909]: I0202 11:30:06.969318 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc40ec40-c511-435c-9324-0d81db7d3416-catalog-content\") pod \"certified-operators-p9blt\" (UID: \"cc40ec40-c511-435c-9324-0d81db7d3416\") " pod="openshift-marketplace/certified-operators-p9blt" Feb 02 11:30:06 crc kubenswrapper[4909]: I0202 11:30:06.969405 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc40ec40-c511-435c-9324-0d81db7d3416-utilities\") pod \"certified-operators-p9blt\" (UID: \"cc40ec40-c511-435c-9324-0d81db7d3416\") " pod="openshift-marketplace/certified-operators-p9blt" Feb 02 11:30:06 crc kubenswrapper[4909]: I0202 11:30:06.969895 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc40ec40-c511-435c-9324-0d81db7d3416-utilities\") pod \"certified-operators-p9blt\" (UID: \"cc40ec40-c511-435c-9324-0d81db7d3416\") " pod="openshift-marketplace/certified-operators-p9blt" Feb 02 11:30:06 crc kubenswrapper[4909]: I0202 11:30:06.970066 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc40ec40-c511-435c-9324-0d81db7d3416-catalog-content\") pod \"certified-operators-p9blt\" (UID: \"cc40ec40-c511-435c-9324-0d81db7d3416\") " pod="openshift-marketplace/certified-operators-p9blt" Feb 02 11:30:06 crc kubenswrapper[4909]: I0202 11:30:06.988330 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2p74\" (UniqueName: \"kubernetes.io/projected/cc40ec40-c511-435c-9324-0d81db7d3416-kube-api-access-z2p74\") pod \"certified-operators-p9blt\" (UID: \"cc40ec40-c511-435c-9324-0d81db7d3416\") " pod="openshift-marketplace/certified-operators-p9blt" Feb 02 11:30:07 crc kubenswrapper[4909]: I0202 11:30:07.091989 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-p9blt" Feb 02 11:30:07 crc kubenswrapper[4909]: I0202 11:30:07.632677 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p9blt"] Feb 02 11:30:08 crc kubenswrapper[4909]: I0202 11:30:08.195589 4909 generic.go:334] "Generic (PLEG): container finished" podID="cc40ec40-c511-435c-9324-0d81db7d3416" containerID="981440b522a1f9200aead6c46910474fb5be19ca58356d3c90e07e6df1adc716" exitCode=0 Feb 02 11:30:08 crc kubenswrapper[4909]: I0202 11:30:08.195628 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p9blt" event={"ID":"cc40ec40-c511-435c-9324-0d81db7d3416","Type":"ContainerDied","Data":"981440b522a1f9200aead6c46910474fb5be19ca58356d3c90e07e6df1adc716"} Feb 02 11:30:08 crc kubenswrapper[4909]: I0202 11:30:08.195651 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p9blt" event={"ID":"cc40ec40-c511-435c-9324-0d81db7d3416","Type":"ContainerStarted","Data":"e54e7be4803e99da8b3afdaa3313a8a85b75d88abeb62b41ff58ff5fcd2a740d"} Feb 02 11:30:09 crc kubenswrapper[4909]: I0202 11:30:09.203614 4909 generic.go:334] "Generic (PLEG): container finished" podID="cc40ec40-c511-435c-9324-0d81db7d3416" containerID="1db5bbc045e35d3d2184d3e33a75822531066b676a9e2f35a0f8589ee40b6961" exitCode=0 Feb 02 11:30:09 crc kubenswrapper[4909]: I0202 11:30:09.203968 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p9blt" event={"ID":"cc40ec40-c511-435c-9324-0d81db7d3416","Type":"ContainerDied","Data":"1db5bbc045e35d3d2184d3e33a75822531066b676a9e2f35a0f8589ee40b6961"} Feb 02 11:30:10 crc kubenswrapper[4909]: I0202 11:30:10.212246 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p9blt" 
event={"ID":"cc40ec40-c511-435c-9324-0d81db7d3416","Type":"ContainerStarted","Data":"07b0cdc0949a1c99300fbaaaabfe72171fe7b3d8593a42d8a6fa1de64f2c24f2"} Feb 02 11:30:10 crc kubenswrapper[4909]: I0202 11:30:10.234969 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p9blt" podStartSLOduration=2.871526478 podStartE2EDuration="4.234944798s" podCreationTimestamp="2026-02-02 11:30:06 +0000 UTC" firstStartedPulling="2026-02-02 11:30:08.197135361 +0000 UTC m=+3533.943236096" lastFinishedPulling="2026-02-02 11:30:09.560553681 +0000 UTC m=+3535.306654416" observedRunningTime="2026-02-02 11:30:10.229127222 +0000 UTC m=+3535.975227957" watchObservedRunningTime="2026-02-02 11:30:10.234944798 +0000 UTC m=+3535.981045533" Feb 02 11:30:17 crc kubenswrapper[4909]: I0202 11:30:17.092708 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p9blt" Feb 02 11:30:17 crc kubenswrapper[4909]: I0202 11:30:17.093295 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p9blt" Feb 02 11:30:17 crc kubenswrapper[4909]: I0202 11:30:17.131754 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p9blt" Feb 02 11:30:17 crc kubenswrapper[4909]: I0202 11:30:17.289973 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p9blt" Feb 02 11:30:17 crc kubenswrapper[4909]: I0202 11:30:17.362758 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p9blt"] Feb 02 11:30:19 crc kubenswrapper[4909]: I0202 11:30:19.266068 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p9blt" podUID="cc40ec40-c511-435c-9324-0d81db7d3416" containerName="registry-server" 
containerID="cri-o://07b0cdc0949a1c99300fbaaaabfe72171fe7b3d8593a42d8a6fa1de64f2c24f2" gracePeriod=2 Feb 02 11:30:19 crc kubenswrapper[4909]: I0202 11:30:19.511447 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:30:19 crc kubenswrapper[4909]: I0202 11:30:19.511502 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:30:19 crc kubenswrapper[4909]: I0202 11:30:19.728673 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p9blt" Feb 02 11:30:19 crc kubenswrapper[4909]: I0202 11:30:19.864489 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc40ec40-c511-435c-9324-0d81db7d3416-utilities\") pod \"cc40ec40-c511-435c-9324-0d81db7d3416\" (UID: \"cc40ec40-c511-435c-9324-0d81db7d3416\") " Feb 02 11:30:19 crc kubenswrapper[4909]: I0202 11:30:19.864918 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc40ec40-c511-435c-9324-0d81db7d3416-catalog-content\") pod \"cc40ec40-c511-435c-9324-0d81db7d3416\" (UID: \"cc40ec40-c511-435c-9324-0d81db7d3416\") " Feb 02 11:30:19 crc kubenswrapper[4909]: I0202 11:30:19.865001 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2p74\" (UniqueName: 
\"kubernetes.io/projected/cc40ec40-c511-435c-9324-0d81db7d3416-kube-api-access-z2p74\") pod \"cc40ec40-c511-435c-9324-0d81db7d3416\" (UID: \"cc40ec40-c511-435c-9324-0d81db7d3416\") " Feb 02 11:30:19 crc kubenswrapper[4909]: I0202 11:30:19.865845 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc40ec40-c511-435c-9324-0d81db7d3416-utilities" (OuterVolumeSpecName: "utilities") pod "cc40ec40-c511-435c-9324-0d81db7d3416" (UID: "cc40ec40-c511-435c-9324-0d81db7d3416"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:30:19 crc kubenswrapper[4909]: I0202 11:30:19.870041 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc40ec40-c511-435c-9324-0d81db7d3416-kube-api-access-z2p74" (OuterVolumeSpecName: "kube-api-access-z2p74") pod "cc40ec40-c511-435c-9324-0d81db7d3416" (UID: "cc40ec40-c511-435c-9324-0d81db7d3416"). InnerVolumeSpecName "kube-api-access-z2p74". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:30:19 crc kubenswrapper[4909]: I0202 11:30:19.904835 4909 scope.go:117] "RemoveContainer" containerID="c84ec98d2cc34adc0a281112c51ec90de99eb89f53680dc68950becfac8ae100" Feb 02 11:30:19 crc kubenswrapper[4909]: I0202 11:30:19.917696 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc40ec40-c511-435c-9324-0d81db7d3416-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc40ec40-c511-435c-9324-0d81db7d3416" (UID: "cc40ec40-c511-435c-9324-0d81db7d3416"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:30:19 crc kubenswrapper[4909]: I0202 11:30:19.969233 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc40ec40-c511-435c-9324-0d81db7d3416-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:30:19 crc kubenswrapper[4909]: I0202 11:30:19.969279 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc40ec40-c511-435c-9324-0d81db7d3416-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:30:19 crc kubenswrapper[4909]: I0202 11:30:19.969300 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2p74\" (UniqueName: \"kubernetes.io/projected/cc40ec40-c511-435c-9324-0d81db7d3416-kube-api-access-z2p74\") on node \"crc\" DevicePath \"\"" Feb 02 11:30:20 crc kubenswrapper[4909]: I0202 11:30:20.277592 4909 generic.go:334] "Generic (PLEG): container finished" podID="cc40ec40-c511-435c-9324-0d81db7d3416" containerID="07b0cdc0949a1c99300fbaaaabfe72171fe7b3d8593a42d8a6fa1de64f2c24f2" exitCode=0 Feb 02 11:30:20 crc kubenswrapper[4909]: I0202 11:30:20.277637 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p9blt" event={"ID":"cc40ec40-c511-435c-9324-0d81db7d3416","Type":"ContainerDied","Data":"07b0cdc0949a1c99300fbaaaabfe72171fe7b3d8593a42d8a6fa1de64f2c24f2"} Feb 02 11:30:20 crc kubenswrapper[4909]: I0202 11:30:20.277676 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p9blt" event={"ID":"cc40ec40-c511-435c-9324-0d81db7d3416","Type":"ContainerDied","Data":"e54e7be4803e99da8b3afdaa3313a8a85b75d88abeb62b41ff58ff5fcd2a740d"} Feb 02 11:30:20 crc kubenswrapper[4909]: I0202 11:30:20.277694 4909 scope.go:117] "RemoveContainer" containerID="07b0cdc0949a1c99300fbaaaabfe72171fe7b3d8593a42d8a6fa1de64f2c24f2" Feb 02 11:30:20 crc kubenswrapper[4909]: I0202 
11:30:20.277719 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p9blt" Feb 02 11:30:20 crc kubenswrapper[4909]: I0202 11:30:20.302435 4909 scope.go:117] "RemoveContainer" containerID="1db5bbc045e35d3d2184d3e33a75822531066b676a9e2f35a0f8589ee40b6961" Feb 02 11:30:20 crc kubenswrapper[4909]: I0202 11:30:20.325311 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p9blt"] Feb 02 11:30:20 crc kubenswrapper[4909]: I0202 11:30:20.331641 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p9blt"] Feb 02 11:30:20 crc kubenswrapper[4909]: I0202 11:30:20.354269 4909 scope.go:117] "RemoveContainer" containerID="981440b522a1f9200aead6c46910474fb5be19ca58356d3c90e07e6df1adc716" Feb 02 11:30:20 crc kubenswrapper[4909]: I0202 11:30:20.371946 4909 scope.go:117] "RemoveContainer" containerID="07b0cdc0949a1c99300fbaaaabfe72171fe7b3d8593a42d8a6fa1de64f2c24f2" Feb 02 11:30:20 crc kubenswrapper[4909]: E0202 11:30:20.372543 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07b0cdc0949a1c99300fbaaaabfe72171fe7b3d8593a42d8a6fa1de64f2c24f2\": container with ID starting with 07b0cdc0949a1c99300fbaaaabfe72171fe7b3d8593a42d8a6fa1de64f2c24f2 not found: ID does not exist" containerID="07b0cdc0949a1c99300fbaaaabfe72171fe7b3d8593a42d8a6fa1de64f2c24f2" Feb 02 11:30:20 crc kubenswrapper[4909]: I0202 11:30:20.372579 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07b0cdc0949a1c99300fbaaaabfe72171fe7b3d8593a42d8a6fa1de64f2c24f2"} err="failed to get container status \"07b0cdc0949a1c99300fbaaaabfe72171fe7b3d8593a42d8a6fa1de64f2c24f2\": rpc error: code = NotFound desc = could not find container \"07b0cdc0949a1c99300fbaaaabfe72171fe7b3d8593a42d8a6fa1de64f2c24f2\": container with ID starting with 
07b0cdc0949a1c99300fbaaaabfe72171fe7b3d8593a42d8a6fa1de64f2c24f2 not found: ID does not exist" Feb 02 11:30:20 crc kubenswrapper[4909]: I0202 11:30:20.372609 4909 scope.go:117] "RemoveContainer" containerID="1db5bbc045e35d3d2184d3e33a75822531066b676a9e2f35a0f8589ee40b6961" Feb 02 11:30:20 crc kubenswrapper[4909]: E0202 11:30:20.373214 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1db5bbc045e35d3d2184d3e33a75822531066b676a9e2f35a0f8589ee40b6961\": container with ID starting with 1db5bbc045e35d3d2184d3e33a75822531066b676a9e2f35a0f8589ee40b6961 not found: ID does not exist" containerID="1db5bbc045e35d3d2184d3e33a75822531066b676a9e2f35a0f8589ee40b6961" Feb 02 11:30:20 crc kubenswrapper[4909]: I0202 11:30:20.373284 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1db5bbc045e35d3d2184d3e33a75822531066b676a9e2f35a0f8589ee40b6961"} err="failed to get container status \"1db5bbc045e35d3d2184d3e33a75822531066b676a9e2f35a0f8589ee40b6961\": rpc error: code = NotFound desc = could not find container \"1db5bbc045e35d3d2184d3e33a75822531066b676a9e2f35a0f8589ee40b6961\": container with ID starting with 1db5bbc045e35d3d2184d3e33a75822531066b676a9e2f35a0f8589ee40b6961 not found: ID does not exist" Feb 02 11:30:20 crc kubenswrapper[4909]: I0202 11:30:20.373326 4909 scope.go:117] "RemoveContainer" containerID="981440b522a1f9200aead6c46910474fb5be19ca58356d3c90e07e6df1adc716" Feb 02 11:30:20 crc kubenswrapper[4909]: E0202 11:30:20.374060 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"981440b522a1f9200aead6c46910474fb5be19ca58356d3c90e07e6df1adc716\": container with ID starting with 981440b522a1f9200aead6c46910474fb5be19ca58356d3c90e07e6df1adc716 not found: ID does not exist" containerID="981440b522a1f9200aead6c46910474fb5be19ca58356d3c90e07e6df1adc716" Feb 02 11:30:20 crc 
kubenswrapper[4909]: I0202 11:30:20.374177 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"981440b522a1f9200aead6c46910474fb5be19ca58356d3c90e07e6df1adc716"} err="failed to get container status \"981440b522a1f9200aead6c46910474fb5be19ca58356d3c90e07e6df1adc716\": rpc error: code = NotFound desc = could not find container \"981440b522a1f9200aead6c46910474fb5be19ca58356d3c90e07e6df1adc716\": container with ID starting with 981440b522a1f9200aead6c46910474fb5be19ca58356d3c90e07e6df1adc716 not found: ID does not exist" Feb 02 11:30:21 crc kubenswrapper[4909]: I0202 11:30:21.025853 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc40ec40-c511-435c-9324-0d81db7d3416" path="/var/lib/kubelet/pods/cc40ec40-c511-435c-9324-0d81db7d3416/volumes" Feb 02 11:30:49 crc kubenswrapper[4909]: I0202 11:30:49.511164 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:30:49 crc kubenswrapper[4909]: I0202 11:30:49.511622 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:31:19 crc kubenswrapper[4909]: I0202 11:31:19.510794 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:31:19 crc kubenswrapper[4909]: I0202 11:31:19.511345 4909 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:31:19 crc kubenswrapper[4909]: I0202 11:31:19.511385 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 11:31:19 crc kubenswrapper[4909]: I0202 11:31:19.512186 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 11:31:19 crc kubenswrapper[4909]: I0202 11:31:19.512289 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7" gracePeriod=600 Feb 02 11:31:19 crc kubenswrapper[4909]: E0202 11:31:19.629273 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:31:19 crc kubenswrapper[4909]: I0202 11:31:19.688750 4909 generic.go:334] "Generic (PLEG): container finished" 
podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7" exitCode=0 Feb 02 11:31:19 crc kubenswrapper[4909]: I0202 11:31:19.688848 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7"} Feb 02 11:31:19 crc kubenswrapper[4909]: I0202 11:31:19.688905 4909 scope.go:117] "RemoveContainer" containerID="401ec58a81f865f1d469f4269c75522eae0c401e9af17536ebba685dc73b468f" Feb 02 11:31:19 crc kubenswrapper[4909]: I0202 11:31:19.689408 4909 scope.go:117] "RemoveContainer" containerID="f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7" Feb 02 11:31:19 crc kubenswrapper[4909]: E0202 11:31:19.689996 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:31:31 crc kubenswrapper[4909]: I0202 11:31:31.016522 4909 scope.go:117] "RemoveContainer" containerID="f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7" Feb 02 11:31:31 crc kubenswrapper[4909]: E0202 11:31:31.017289 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 
11:31:46 crc kubenswrapper[4909]: I0202 11:31:46.016070 4909 scope.go:117] "RemoveContainer" containerID="f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7" Feb 02 11:31:46 crc kubenswrapper[4909]: E0202 11:31:46.016730 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:32:00 crc kubenswrapper[4909]: I0202 11:32:00.017061 4909 scope.go:117] "RemoveContainer" containerID="f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7" Feb 02 11:32:00 crc kubenswrapper[4909]: E0202 11:32:00.017957 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:32:12 crc kubenswrapper[4909]: I0202 11:32:12.017301 4909 scope.go:117] "RemoveContainer" containerID="f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7" Feb 02 11:32:12 crc kubenswrapper[4909]: E0202 11:32:12.018743 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" 
podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:32:23 crc kubenswrapper[4909]: I0202 11:32:23.017318 4909 scope.go:117] "RemoveContainer" containerID="f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7" Feb 02 11:32:23 crc kubenswrapper[4909]: E0202 11:32:23.018183 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:32:35 crc kubenswrapper[4909]: I0202 11:32:35.021439 4909 scope.go:117] "RemoveContainer" containerID="f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7" Feb 02 11:32:35 crc kubenswrapper[4909]: E0202 11:32:35.022379 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:32:48 crc kubenswrapper[4909]: I0202 11:32:48.017101 4909 scope.go:117] "RemoveContainer" containerID="f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7" Feb 02 11:32:48 crc kubenswrapper[4909]: E0202 11:32:48.017779 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:32:51 crc kubenswrapper[4909]: I0202 11:32:51.249565 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5b9s2"] Feb 02 11:32:51 crc kubenswrapper[4909]: E0202 11:32:51.250195 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc40ec40-c511-435c-9324-0d81db7d3416" containerName="extract-utilities" Feb 02 11:32:51 crc kubenswrapper[4909]: I0202 11:32:51.250211 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc40ec40-c511-435c-9324-0d81db7d3416" containerName="extract-utilities" Feb 02 11:32:51 crc kubenswrapper[4909]: E0202 11:32:51.250222 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc40ec40-c511-435c-9324-0d81db7d3416" containerName="registry-server" Feb 02 11:32:51 crc kubenswrapper[4909]: I0202 11:32:51.250228 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc40ec40-c511-435c-9324-0d81db7d3416" containerName="registry-server" Feb 02 11:32:51 crc kubenswrapper[4909]: E0202 11:32:51.250237 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc40ec40-c511-435c-9324-0d81db7d3416" containerName="extract-content" Feb 02 11:32:51 crc kubenswrapper[4909]: I0202 11:32:51.250245 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc40ec40-c511-435c-9324-0d81db7d3416" containerName="extract-content" Feb 02 11:32:51 crc kubenswrapper[4909]: I0202 11:32:51.250435 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc40ec40-c511-435c-9324-0d81db7d3416" containerName="registry-server" Feb 02 11:32:51 crc kubenswrapper[4909]: I0202 11:32:51.251477 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5b9s2" Feb 02 11:32:51 crc kubenswrapper[4909]: I0202 11:32:51.265672 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5b9s2"] Feb 02 11:32:51 crc kubenswrapper[4909]: I0202 11:32:51.367418 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8zd7\" (UniqueName: \"kubernetes.io/projected/074166e0-8f28-4d68-af23-f416c61980de-kube-api-access-k8zd7\") pod \"redhat-marketplace-5b9s2\" (UID: \"074166e0-8f28-4d68-af23-f416c61980de\") " pod="openshift-marketplace/redhat-marketplace-5b9s2" Feb 02 11:32:51 crc kubenswrapper[4909]: I0202 11:32:51.367533 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/074166e0-8f28-4d68-af23-f416c61980de-catalog-content\") pod \"redhat-marketplace-5b9s2\" (UID: \"074166e0-8f28-4d68-af23-f416c61980de\") " pod="openshift-marketplace/redhat-marketplace-5b9s2" Feb 02 11:32:51 crc kubenswrapper[4909]: I0202 11:32:51.367602 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/074166e0-8f28-4d68-af23-f416c61980de-utilities\") pod \"redhat-marketplace-5b9s2\" (UID: \"074166e0-8f28-4d68-af23-f416c61980de\") " pod="openshift-marketplace/redhat-marketplace-5b9s2" Feb 02 11:32:51 crc kubenswrapper[4909]: I0202 11:32:51.468928 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8zd7\" (UniqueName: \"kubernetes.io/projected/074166e0-8f28-4d68-af23-f416c61980de-kube-api-access-k8zd7\") pod \"redhat-marketplace-5b9s2\" (UID: \"074166e0-8f28-4d68-af23-f416c61980de\") " pod="openshift-marketplace/redhat-marketplace-5b9s2" Feb 02 11:32:51 crc kubenswrapper[4909]: I0202 11:32:51.469264 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/074166e0-8f28-4d68-af23-f416c61980de-catalog-content\") pod \"redhat-marketplace-5b9s2\" (UID: \"074166e0-8f28-4d68-af23-f416c61980de\") " pod="openshift-marketplace/redhat-marketplace-5b9s2" Feb 02 11:32:51 crc kubenswrapper[4909]: I0202 11:32:51.469385 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/074166e0-8f28-4d68-af23-f416c61980de-utilities\") pod \"redhat-marketplace-5b9s2\" (UID: \"074166e0-8f28-4d68-af23-f416c61980de\") " pod="openshift-marketplace/redhat-marketplace-5b9s2" Feb 02 11:32:51 crc kubenswrapper[4909]: I0202 11:32:51.469861 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/074166e0-8f28-4d68-af23-f416c61980de-catalog-content\") pod \"redhat-marketplace-5b9s2\" (UID: \"074166e0-8f28-4d68-af23-f416c61980de\") " pod="openshift-marketplace/redhat-marketplace-5b9s2" Feb 02 11:32:51 crc kubenswrapper[4909]: I0202 11:32:51.469883 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/074166e0-8f28-4d68-af23-f416c61980de-utilities\") pod \"redhat-marketplace-5b9s2\" (UID: \"074166e0-8f28-4d68-af23-f416c61980de\") " pod="openshift-marketplace/redhat-marketplace-5b9s2" Feb 02 11:32:51 crc kubenswrapper[4909]: I0202 11:32:51.487690 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8zd7\" (UniqueName: \"kubernetes.io/projected/074166e0-8f28-4d68-af23-f416c61980de-kube-api-access-k8zd7\") pod \"redhat-marketplace-5b9s2\" (UID: \"074166e0-8f28-4d68-af23-f416c61980de\") " pod="openshift-marketplace/redhat-marketplace-5b9s2" Feb 02 11:32:51 crc kubenswrapper[4909]: I0202 11:32:51.575743 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5b9s2" Feb 02 11:32:52 crc kubenswrapper[4909]: I0202 11:32:52.014407 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5b9s2"] Feb 02 11:32:52 crc kubenswrapper[4909]: I0202 11:32:52.288540 4909 generic.go:334] "Generic (PLEG): container finished" podID="074166e0-8f28-4d68-af23-f416c61980de" containerID="8c3727e4eca566ba07722eec8ca04b04c554036d52251c844b2d423d27c55020" exitCode=0 Feb 02 11:32:52 crc kubenswrapper[4909]: I0202 11:32:52.288587 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5b9s2" event={"ID":"074166e0-8f28-4d68-af23-f416c61980de","Type":"ContainerDied","Data":"8c3727e4eca566ba07722eec8ca04b04c554036d52251c844b2d423d27c55020"} Feb 02 11:32:52 crc kubenswrapper[4909]: I0202 11:32:52.288634 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5b9s2" event={"ID":"074166e0-8f28-4d68-af23-f416c61980de","Type":"ContainerStarted","Data":"a923a4c6f21cb5a3161f520fc23e24cc04766260a68d81ca0d3fc8eff5ed14cf"} Feb 02 11:32:53 crc kubenswrapper[4909]: I0202 11:32:53.296869 4909 generic.go:334] "Generic (PLEG): container finished" podID="074166e0-8f28-4d68-af23-f416c61980de" containerID="ad0fa2ea1ecbdc6505f01bd29c38f52d9b0d3e528faa3e93a0d74ffe0cc8cf13" exitCode=0 Feb 02 11:32:53 crc kubenswrapper[4909]: I0202 11:32:53.297031 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5b9s2" event={"ID":"074166e0-8f28-4d68-af23-f416c61980de","Type":"ContainerDied","Data":"ad0fa2ea1ecbdc6505f01bd29c38f52d9b0d3e528faa3e93a0d74ffe0cc8cf13"} Feb 02 11:32:54 crc kubenswrapper[4909]: I0202 11:32:54.306393 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5b9s2" 
event={"ID":"074166e0-8f28-4d68-af23-f416c61980de","Type":"ContainerStarted","Data":"16500a88b01fc6d81d26fa38ae7556b0d2662d909a63992c9ff63f606bff2d97"} Feb 02 11:32:54 crc kubenswrapper[4909]: I0202 11:32:54.342700 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5b9s2" podStartSLOduration=1.89671052 podStartE2EDuration="3.34267777s" podCreationTimestamp="2026-02-02 11:32:51 +0000 UTC" firstStartedPulling="2026-02-02 11:32:52.289911466 +0000 UTC m=+3698.036012201" lastFinishedPulling="2026-02-02 11:32:53.735878716 +0000 UTC m=+3699.481979451" observedRunningTime="2026-02-02 11:32:54.335542367 +0000 UTC m=+3700.081643112" watchObservedRunningTime="2026-02-02 11:32:54.34267777 +0000 UTC m=+3700.088778505" Feb 02 11:32:59 crc kubenswrapper[4909]: I0202 11:32:59.016487 4909 scope.go:117] "RemoveContainer" containerID="f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7" Feb 02 11:32:59 crc kubenswrapper[4909]: E0202 11:32:59.018490 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:33:01 crc kubenswrapper[4909]: I0202 11:33:01.576907 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5b9s2" Feb 02 11:33:01 crc kubenswrapper[4909]: I0202 11:33:01.577328 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5b9s2" Feb 02 11:33:01 crc kubenswrapper[4909]: I0202 11:33:01.648792 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-5b9s2" Feb 02 11:33:02 crc kubenswrapper[4909]: I0202 11:33:02.397109 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5b9s2" Feb 02 11:33:02 crc kubenswrapper[4909]: I0202 11:33:02.442156 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5b9s2"] Feb 02 11:33:04 crc kubenswrapper[4909]: I0202 11:33:04.366021 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5b9s2" podUID="074166e0-8f28-4d68-af23-f416c61980de" containerName="registry-server" containerID="cri-o://16500a88b01fc6d81d26fa38ae7556b0d2662d909a63992c9ff63f606bff2d97" gracePeriod=2 Feb 02 11:33:04 crc kubenswrapper[4909]: I0202 11:33:04.749836 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5b9s2" Feb 02 11:33:04 crc kubenswrapper[4909]: I0202 11:33:04.854545 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/074166e0-8f28-4d68-af23-f416c61980de-utilities\") pod \"074166e0-8f28-4d68-af23-f416c61980de\" (UID: \"074166e0-8f28-4d68-af23-f416c61980de\") " Feb 02 11:33:04 crc kubenswrapper[4909]: I0202 11:33:04.854630 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8zd7\" (UniqueName: \"kubernetes.io/projected/074166e0-8f28-4d68-af23-f416c61980de-kube-api-access-k8zd7\") pod \"074166e0-8f28-4d68-af23-f416c61980de\" (UID: \"074166e0-8f28-4d68-af23-f416c61980de\") " Feb 02 11:33:04 crc kubenswrapper[4909]: I0202 11:33:04.854699 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/074166e0-8f28-4d68-af23-f416c61980de-catalog-content\") pod 
\"074166e0-8f28-4d68-af23-f416c61980de\" (UID: \"074166e0-8f28-4d68-af23-f416c61980de\") " Feb 02 11:33:04 crc kubenswrapper[4909]: I0202 11:33:04.857505 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/074166e0-8f28-4d68-af23-f416c61980de-utilities" (OuterVolumeSpecName: "utilities") pod "074166e0-8f28-4d68-af23-f416c61980de" (UID: "074166e0-8f28-4d68-af23-f416c61980de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:33:04 crc kubenswrapper[4909]: I0202 11:33:04.865009 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/074166e0-8f28-4d68-af23-f416c61980de-kube-api-access-k8zd7" (OuterVolumeSpecName: "kube-api-access-k8zd7") pod "074166e0-8f28-4d68-af23-f416c61980de" (UID: "074166e0-8f28-4d68-af23-f416c61980de"). InnerVolumeSpecName "kube-api-access-k8zd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:33:04 crc kubenswrapper[4909]: I0202 11:33:04.882399 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/074166e0-8f28-4d68-af23-f416c61980de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "074166e0-8f28-4d68-af23-f416c61980de" (UID: "074166e0-8f28-4d68-af23-f416c61980de"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:33:04 crc kubenswrapper[4909]: I0202 11:33:04.956159 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/074166e0-8f28-4d68-af23-f416c61980de-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:33:04 crc kubenswrapper[4909]: I0202 11:33:04.956192 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8zd7\" (UniqueName: \"kubernetes.io/projected/074166e0-8f28-4d68-af23-f416c61980de-kube-api-access-k8zd7\") on node \"crc\" DevicePath \"\"" Feb 02 11:33:04 crc kubenswrapper[4909]: I0202 11:33:04.956203 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/074166e0-8f28-4d68-af23-f416c61980de-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:33:05 crc kubenswrapper[4909]: I0202 11:33:05.374216 4909 generic.go:334] "Generic (PLEG): container finished" podID="074166e0-8f28-4d68-af23-f416c61980de" containerID="16500a88b01fc6d81d26fa38ae7556b0d2662d909a63992c9ff63f606bff2d97" exitCode=0 Feb 02 11:33:05 crc kubenswrapper[4909]: I0202 11:33:05.374279 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5b9s2" Feb 02 11:33:05 crc kubenswrapper[4909]: I0202 11:33:05.374312 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5b9s2" event={"ID":"074166e0-8f28-4d68-af23-f416c61980de","Type":"ContainerDied","Data":"16500a88b01fc6d81d26fa38ae7556b0d2662d909a63992c9ff63f606bff2d97"} Feb 02 11:33:05 crc kubenswrapper[4909]: I0202 11:33:05.374718 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5b9s2" event={"ID":"074166e0-8f28-4d68-af23-f416c61980de","Type":"ContainerDied","Data":"a923a4c6f21cb5a3161f520fc23e24cc04766260a68d81ca0d3fc8eff5ed14cf"} Feb 02 11:33:05 crc kubenswrapper[4909]: I0202 11:33:05.374753 4909 scope.go:117] "RemoveContainer" containerID="16500a88b01fc6d81d26fa38ae7556b0d2662d909a63992c9ff63f606bff2d97" Feb 02 11:33:05 crc kubenswrapper[4909]: I0202 11:33:05.394568 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5b9s2"] Feb 02 11:33:05 crc kubenswrapper[4909]: I0202 11:33:05.404507 4909 scope.go:117] "RemoveContainer" containerID="ad0fa2ea1ecbdc6505f01bd29c38f52d9b0d3e528faa3e93a0d74ffe0cc8cf13" Feb 02 11:33:05 crc kubenswrapper[4909]: I0202 11:33:05.406165 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5b9s2"] Feb 02 11:33:05 crc kubenswrapper[4909]: I0202 11:33:05.423060 4909 scope.go:117] "RemoveContainer" containerID="8c3727e4eca566ba07722eec8ca04b04c554036d52251c844b2d423d27c55020" Feb 02 11:33:05 crc kubenswrapper[4909]: I0202 11:33:05.448112 4909 scope.go:117] "RemoveContainer" containerID="16500a88b01fc6d81d26fa38ae7556b0d2662d909a63992c9ff63f606bff2d97" Feb 02 11:33:05 crc kubenswrapper[4909]: E0202 11:33:05.448897 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"16500a88b01fc6d81d26fa38ae7556b0d2662d909a63992c9ff63f606bff2d97\": container with ID starting with 16500a88b01fc6d81d26fa38ae7556b0d2662d909a63992c9ff63f606bff2d97 not found: ID does not exist" containerID="16500a88b01fc6d81d26fa38ae7556b0d2662d909a63992c9ff63f606bff2d97" Feb 02 11:33:05 crc kubenswrapper[4909]: I0202 11:33:05.448932 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16500a88b01fc6d81d26fa38ae7556b0d2662d909a63992c9ff63f606bff2d97"} err="failed to get container status \"16500a88b01fc6d81d26fa38ae7556b0d2662d909a63992c9ff63f606bff2d97\": rpc error: code = NotFound desc = could not find container \"16500a88b01fc6d81d26fa38ae7556b0d2662d909a63992c9ff63f606bff2d97\": container with ID starting with 16500a88b01fc6d81d26fa38ae7556b0d2662d909a63992c9ff63f606bff2d97 not found: ID does not exist" Feb 02 11:33:05 crc kubenswrapper[4909]: I0202 11:33:05.448957 4909 scope.go:117] "RemoveContainer" containerID="ad0fa2ea1ecbdc6505f01bd29c38f52d9b0d3e528faa3e93a0d74ffe0cc8cf13" Feb 02 11:33:05 crc kubenswrapper[4909]: E0202 11:33:05.449326 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad0fa2ea1ecbdc6505f01bd29c38f52d9b0d3e528faa3e93a0d74ffe0cc8cf13\": container with ID starting with ad0fa2ea1ecbdc6505f01bd29c38f52d9b0d3e528faa3e93a0d74ffe0cc8cf13 not found: ID does not exist" containerID="ad0fa2ea1ecbdc6505f01bd29c38f52d9b0d3e528faa3e93a0d74ffe0cc8cf13" Feb 02 11:33:05 crc kubenswrapper[4909]: I0202 11:33:05.449352 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad0fa2ea1ecbdc6505f01bd29c38f52d9b0d3e528faa3e93a0d74ffe0cc8cf13"} err="failed to get container status \"ad0fa2ea1ecbdc6505f01bd29c38f52d9b0d3e528faa3e93a0d74ffe0cc8cf13\": rpc error: code = NotFound desc = could not find container \"ad0fa2ea1ecbdc6505f01bd29c38f52d9b0d3e528faa3e93a0d74ffe0cc8cf13\": container with ID 
starting with ad0fa2ea1ecbdc6505f01bd29c38f52d9b0d3e528faa3e93a0d74ffe0cc8cf13 not found: ID does not exist" Feb 02 11:33:05 crc kubenswrapper[4909]: I0202 11:33:05.449369 4909 scope.go:117] "RemoveContainer" containerID="8c3727e4eca566ba07722eec8ca04b04c554036d52251c844b2d423d27c55020" Feb 02 11:33:05 crc kubenswrapper[4909]: E0202 11:33:05.449575 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c3727e4eca566ba07722eec8ca04b04c554036d52251c844b2d423d27c55020\": container with ID starting with 8c3727e4eca566ba07722eec8ca04b04c554036d52251c844b2d423d27c55020 not found: ID does not exist" containerID="8c3727e4eca566ba07722eec8ca04b04c554036d52251c844b2d423d27c55020" Feb 02 11:33:05 crc kubenswrapper[4909]: I0202 11:33:05.449598 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c3727e4eca566ba07722eec8ca04b04c554036d52251c844b2d423d27c55020"} err="failed to get container status \"8c3727e4eca566ba07722eec8ca04b04c554036d52251c844b2d423d27c55020\": rpc error: code = NotFound desc = could not find container \"8c3727e4eca566ba07722eec8ca04b04c554036d52251c844b2d423d27c55020\": container with ID starting with 8c3727e4eca566ba07722eec8ca04b04c554036d52251c844b2d423d27c55020 not found: ID does not exist" Feb 02 11:33:07 crc kubenswrapper[4909]: I0202 11:33:07.025832 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="074166e0-8f28-4d68-af23-f416c61980de" path="/var/lib/kubelet/pods/074166e0-8f28-4d68-af23-f416c61980de/volumes" Feb 02 11:33:12 crc kubenswrapper[4909]: I0202 11:33:12.016581 4909 scope.go:117] "RemoveContainer" containerID="f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7" Feb 02 11:33:12 crc kubenswrapper[4909]: E0202 11:33:12.017347 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:33:25 crc kubenswrapper[4909]: I0202 11:33:25.022550 4909 scope.go:117] "RemoveContainer" containerID="f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7" Feb 02 11:33:25 crc kubenswrapper[4909]: E0202 11:33:25.023282 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:33:40 crc kubenswrapper[4909]: I0202 11:33:40.016620 4909 scope.go:117] "RemoveContainer" containerID="f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7" Feb 02 11:33:40 crc kubenswrapper[4909]: E0202 11:33:40.017367 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:33:55 crc kubenswrapper[4909]: I0202 11:33:55.020924 4909 scope.go:117] "RemoveContainer" containerID="f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7" Feb 02 11:33:55 crc kubenswrapper[4909]: E0202 11:33:55.021779 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:34:08 crc kubenswrapper[4909]: I0202 11:34:08.016570 4909 scope.go:117] "RemoveContainer" containerID="f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7" Feb 02 11:34:08 crc kubenswrapper[4909]: E0202 11:34:08.017333 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:34:23 crc kubenswrapper[4909]: I0202 11:34:23.017015 4909 scope.go:117] "RemoveContainer" containerID="f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7" Feb 02 11:34:23 crc kubenswrapper[4909]: E0202 11:34:23.017698 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:34:34 crc kubenswrapper[4909]: I0202 11:34:34.016304 4909 scope.go:117] "RemoveContainer" containerID="f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7" Feb 02 11:34:34 crc kubenswrapper[4909]: E0202 11:34:34.017049 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:34:47 crc kubenswrapper[4909]: I0202 11:34:47.016551 4909 scope.go:117] "RemoveContainer" containerID="f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7" Feb 02 11:34:47 crc kubenswrapper[4909]: E0202 11:34:47.017363 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:35:02 crc kubenswrapper[4909]: I0202 11:35:02.017645 4909 scope.go:117] "RemoveContainer" containerID="f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7" Feb 02 11:35:02 crc kubenswrapper[4909]: E0202 11:35:02.019119 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:35:16 crc kubenswrapper[4909]: I0202 11:35:16.017154 4909 scope.go:117] "RemoveContainer" containerID="f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7" Feb 02 11:35:16 crc kubenswrapper[4909]: E0202 11:35:16.018117 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:35:27 crc kubenswrapper[4909]: I0202 11:35:27.017016 4909 scope.go:117] "RemoveContainer" containerID="f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7" Feb 02 11:35:27 crc kubenswrapper[4909]: E0202 11:35:27.018386 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:35:38 crc kubenswrapper[4909]: I0202 11:35:38.083502 4909 scope.go:117] "RemoveContainer" containerID="f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7" Feb 02 11:35:38 crc kubenswrapper[4909]: E0202 11:35:38.084308 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:35:51 crc kubenswrapper[4909]: I0202 11:35:51.016869 4909 scope.go:117] "RemoveContainer" containerID="f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7" Feb 02 11:35:51 crc kubenswrapper[4909]: E0202 11:35:51.017606 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:36:06 crc kubenswrapper[4909]: I0202 11:36:06.016109 4909 scope.go:117] "RemoveContainer" containerID="f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7" Feb 02 11:36:06 crc kubenswrapper[4909]: E0202 11:36:06.016909 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:36:20 crc kubenswrapper[4909]: I0202 11:36:20.016033 4909 scope.go:117] "RemoveContainer" containerID="f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7" Feb 02 11:36:20 crc kubenswrapper[4909]: I0202 11:36:20.753555 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"36a72ceaf0b148007a050e96b5dec8b56b8c314c0db5036a29c4b45ee6f11fbd"} Feb 02 11:37:46 crc kubenswrapper[4909]: I0202 11:37:46.496607 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-44k54"] Feb 02 11:37:46 crc kubenswrapper[4909]: E0202 11:37:46.498771 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="074166e0-8f28-4d68-af23-f416c61980de" containerName="extract-utilities" Feb 02 11:37:46 crc kubenswrapper[4909]: I0202 11:37:46.498907 4909 
state_mem.go:107] "Deleted CPUSet assignment" podUID="074166e0-8f28-4d68-af23-f416c61980de" containerName="extract-utilities" Feb 02 11:37:46 crc kubenswrapper[4909]: E0202 11:37:46.499007 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="074166e0-8f28-4d68-af23-f416c61980de" containerName="registry-server" Feb 02 11:37:46 crc kubenswrapper[4909]: I0202 11:37:46.499088 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="074166e0-8f28-4d68-af23-f416c61980de" containerName="registry-server" Feb 02 11:37:46 crc kubenswrapper[4909]: E0202 11:37:46.499165 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="074166e0-8f28-4d68-af23-f416c61980de" containerName="extract-content" Feb 02 11:37:46 crc kubenswrapper[4909]: I0202 11:37:46.499236 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="074166e0-8f28-4d68-af23-f416c61980de" containerName="extract-content" Feb 02 11:37:46 crc kubenswrapper[4909]: I0202 11:37:46.499472 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="074166e0-8f28-4d68-af23-f416c61980de" containerName="registry-server" Feb 02 11:37:46 crc kubenswrapper[4909]: I0202 11:37:46.500724 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-44k54" Feb 02 11:37:46 crc kubenswrapper[4909]: I0202 11:37:46.508396 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-44k54"] Feb 02 11:37:46 crc kubenswrapper[4909]: I0202 11:37:46.685446 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzthf\" (UniqueName: \"kubernetes.io/projected/158b5a3c-619b-4fca-a512-acdba20255d0-kube-api-access-hzthf\") pod \"redhat-operators-44k54\" (UID: \"158b5a3c-619b-4fca-a512-acdba20255d0\") " pod="openshift-marketplace/redhat-operators-44k54" Feb 02 11:37:46 crc kubenswrapper[4909]: I0202 11:37:46.685560 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/158b5a3c-619b-4fca-a512-acdba20255d0-catalog-content\") pod \"redhat-operators-44k54\" (UID: \"158b5a3c-619b-4fca-a512-acdba20255d0\") " pod="openshift-marketplace/redhat-operators-44k54" Feb 02 11:37:46 crc kubenswrapper[4909]: I0202 11:37:46.685639 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/158b5a3c-619b-4fca-a512-acdba20255d0-utilities\") pod \"redhat-operators-44k54\" (UID: \"158b5a3c-619b-4fca-a512-acdba20255d0\") " pod="openshift-marketplace/redhat-operators-44k54" Feb 02 11:37:46 crc kubenswrapper[4909]: I0202 11:37:46.786698 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzthf\" (UniqueName: \"kubernetes.io/projected/158b5a3c-619b-4fca-a512-acdba20255d0-kube-api-access-hzthf\") pod \"redhat-operators-44k54\" (UID: \"158b5a3c-619b-4fca-a512-acdba20255d0\") " pod="openshift-marketplace/redhat-operators-44k54" Feb 02 11:37:46 crc kubenswrapper[4909]: I0202 11:37:46.787397 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/158b5a3c-619b-4fca-a512-acdba20255d0-catalog-content\") pod \"redhat-operators-44k54\" (UID: \"158b5a3c-619b-4fca-a512-acdba20255d0\") " pod="openshift-marketplace/redhat-operators-44k54" Feb 02 11:37:46 crc kubenswrapper[4909]: I0202 11:37:46.787528 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/158b5a3c-619b-4fca-a512-acdba20255d0-utilities\") pod \"redhat-operators-44k54\" (UID: \"158b5a3c-619b-4fca-a512-acdba20255d0\") " pod="openshift-marketplace/redhat-operators-44k54" Feb 02 11:37:46 crc kubenswrapper[4909]: I0202 11:37:46.788093 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/158b5a3c-619b-4fca-a512-acdba20255d0-catalog-content\") pod \"redhat-operators-44k54\" (UID: \"158b5a3c-619b-4fca-a512-acdba20255d0\") " pod="openshift-marketplace/redhat-operators-44k54" Feb 02 11:37:46 crc kubenswrapper[4909]: I0202 11:37:46.788190 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/158b5a3c-619b-4fca-a512-acdba20255d0-utilities\") pod \"redhat-operators-44k54\" (UID: \"158b5a3c-619b-4fca-a512-acdba20255d0\") " pod="openshift-marketplace/redhat-operators-44k54" Feb 02 11:37:46 crc kubenswrapper[4909]: I0202 11:37:46.811136 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzthf\" (UniqueName: \"kubernetes.io/projected/158b5a3c-619b-4fca-a512-acdba20255d0-kube-api-access-hzthf\") pod \"redhat-operators-44k54\" (UID: \"158b5a3c-619b-4fca-a512-acdba20255d0\") " pod="openshift-marketplace/redhat-operators-44k54" Feb 02 11:37:46 crc kubenswrapper[4909]: I0202 11:37:46.900140 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-44k54" Feb 02 11:37:47 crc kubenswrapper[4909]: I0202 11:37:47.720000 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-44k54"] Feb 02 11:37:48 crc kubenswrapper[4909]: I0202 11:37:48.373455 4909 generic.go:334] "Generic (PLEG): container finished" podID="158b5a3c-619b-4fca-a512-acdba20255d0" containerID="054b2340d04b8d8187ee88221c278e9a166c475c209544529fe57408af842ce3" exitCode=0 Feb 02 11:37:48 crc kubenswrapper[4909]: I0202 11:37:48.373593 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-44k54" event={"ID":"158b5a3c-619b-4fca-a512-acdba20255d0","Type":"ContainerDied","Data":"054b2340d04b8d8187ee88221c278e9a166c475c209544529fe57408af842ce3"} Feb 02 11:37:48 crc kubenswrapper[4909]: I0202 11:37:48.373797 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-44k54" event={"ID":"158b5a3c-619b-4fca-a512-acdba20255d0","Type":"ContainerStarted","Data":"9d42133388a83a3a13e38c6fef840c66b58783ac5f9689ec7be2e69e0ba36638"} Feb 02 11:37:48 crc kubenswrapper[4909]: I0202 11:37:48.375591 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 11:37:50 crc kubenswrapper[4909]: I0202 11:37:50.386615 4909 generic.go:334] "Generic (PLEG): container finished" podID="158b5a3c-619b-4fca-a512-acdba20255d0" containerID="4654c8ab8b08d26618ffa6c2fca40ceb7c0deefdf2d6c7c6ce7e1c223837ae4e" exitCode=0 Feb 02 11:37:50 crc kubenswrapper[4909]: I0202 11:37:50.386719 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-44k54" event={"ID":"158b5a3c-619b-4fca-a512-acdba20255d0","Type":"ContainerDied","Data":"4654c8ab8b08d26618ffa6c2fca40ceb7c0deefdf2d6c7c6ce7e1c223837ae4e"} Feb 02 11:37:51 crc kubenswrapper[4909]: I0202 11:37:51.396274 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-44k54" event={"ID":"158b5a3c-619b-4fca-a512-acdba20255d0","Type":"ContainerStarted","Data":"c757b59781f115191c41a7f460b77a8265c46a2078473a8c1e0ffd61a9b7d031"} Feb 02 11:37:51 crc kubenswrapper[4909]: I0202 11:37:51.412191 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-44k54" podStartSLOduration=3.022339327 podStartE2EDuration="5.412171997s" podCreationTimestamp="2026-02-02 11:37:46 +0000 UTC" firstStartedPulling="2026-02-02 11:37:48.375390543 +0000 UTC m=+3994.121491278" lastFinishedPulling="2026-02-02 11:37:50.765223213 +0000 UTC m=+3996.511323948" observedRunningTime="2026-02-02 11:37:51.411306473 +0000 UTC m=+3997.157407208" watchObservedRunningTime="2026-02-02 11:37:51.412171997 +0000 UTC m=+3997.158272732" Feb 02 11:37:56 crc kubenswrapper[4909]: I0202 11:37:56.901145 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-44k54" Feb 02 11:37:56 crc kubenswrapper[4909]: I0202 11:37:56.901738 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-44k54" Feb 02 11:37:56 crc kubenswrapper[4909]: I0202 11:37:56.942439 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-44k54" Feb 02 11:37:57 crc kubenswrapper[4909]: I0202 11:37:57.485533 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-44k54" Feb 02 11:37:57 crc kubenswrapper[4909]: I0202 11:37:57.542856 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-44k54"] Feb 02 11:37:59 crc kubenswrapper[4909]: I0202 11:37:59.451125 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-44k54" podUID="158b5a3c-619b-4fca-a512-acdba20255d0" 
containerName="registry-server" containerID="cri-o://c757b59781f115191c41a7f460b77a8265c46a2078473a8c1e0ffd61a9b7d031" gracePeriod=2 Feb 02 11:37:59 crc kubenswrapper[4909]: I0202 11:37:59.862239 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-44k54" Feb 02 11:38:00 crc kubenswrapper[4909]: I0202 11:38:00.055279 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/158b5a3c-619b-4fca-a512-acdba20255d0-catalog-content\") pod \"158b5a3c-619b-4fca-a512-acdba20255d0\" (UID: \"158b5a3c-619b-4fca-a512-acdba20255d0\") " Feb 02 11:38:00 crc kubenswrapper[4909]: I0202 11:38:00.055415 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzthf\" (UniqueName: \"kubernetes.io/projected/158b5a3c-619b-4fca-a512-acdba20255d0-kube-api-access-hzthf\") pod \"158b5a3c-619b-4fca-a512-acdba20255d0\" (UID: \"158b5a3c-619b-4fca-a512-acdba20255d0\") " Feb 02 11:38:00 crc kubenswrapper[4909]: I0202 11:38:00.055466 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/158b5a3c-619b-4fca-a512-acdba20255d0-utilities\") pod \"158b5a3c-619b-4fca-a512-acdba20255d0\" (UID: \"158b5a3c-619b-4fca-a512-acdba20255d0\") " Feb 02 11:38:00 crc kubenswrapper[4909]: I0202 11:38:00.056996 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/158b5a3c-619b-4fca-a512-acdba20255d0-utilities" (OuterVolumeSpecName: "utilities") pod "158b5a3c-619b-4fca-a512-acdba20255d0" (UID: "158b5a3c-619b-4fca-a512-acdba20255d0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:38:00 crc kubenswrapper[4909]: I0202 11:38:00.061498 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/158b5a3c-619b-4fca-a512-acdba20255d0-kube-api-access-hzthf" (OuterVolumeSpecName: "kube-api-access-hzthf") pod "158b5a3c-619b-4fca-a512-acdba20255d0" (UID: "158b5a3c-619b-4fca-a512-acdba20255d0"). InnerVolumeSpecName "kube-api-access-hzthf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:38:00 crc kubenswrapper[4909]: I0202 11:38:00.157561 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzthf\" (UniqueName: \"kubernetes.io/projected/158b5a3c-619b-4fca-a512-acdba20255d0-kube-api-access-hzthf\") on node \"crc\" DevicePath \"\"" Feb 02 11:38:00 crc kubenswrapper[4909]: I0202 11:38:00.157869 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/158b5a3c-619b-4fca-a512-acdba20255d0-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:38:00 crc kubenswrapper[4909]: I0202 11:38:00.195451 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/158b5a3c-619b-4fca-a512-acdba20255d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "158b5a3c-619b-4fca-a512-acdba20255d0" (UID: "158b5a3c-619b-4fca-a512-acdba20255d0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:38:00 crc kubenswrapper[4909]: I0202 11:38:00.259481 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/158b5a3c-619b-4fca-a512-acdba20255d0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:38:00 crc kubenswrapper[4909]: I0202 11:38:00.458539 4909 generic.go:334] "Generic (PLEG): container finished" podID="158b5a3c-619b-4fca-a512-acdba20255d0" containerID="c757b59781f115191c41a7f460b77a8265c46a2078473a8c1e0ffd61a9b7d031" exitCode=0 Feb 02 11:38:00 crc kubenswrapper[4909]: I0202 11:38:00.458586 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-44k54" event={"ID":"158b5a3c-619b-4fca-a512-acdba20255d0","Type":"ContainerDied","Data":"c757b59781f115191c41a7f460b77a8265c46a2078473a8c1e0ffd61a9b7d031"} Feb 02 11:38:00 crc kubenswrapper[4909]: I0202 11:38:00.458623 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-44k54" event={"ID":"158b5a3c-619b-4fca-a512-acdba20255d0","Type":"ContainerDied","Data":"9d42133388a83a3a13e38c6fef840c66b58783ac5f9689ec7be2e69e0ba36638"} Feb 02 11:38:00 crc kubenswrapper[4909]: I0202 11:38:00.458642 4909 scope.go:117] "RemoveContainer" containerID="c757b59781f115191c41a7f460b77a8265c46a2078473a8c1e0ffd61a9b7d031" Feb 02 11:38:00 crc kubenswrapper[4909]: I0202 11:38:00.459627 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-44k54" Feb 02 11:38:00 crc kubenswrapper[4909]: I0202 11:38:00.474561 4909 scope.go:117] "RemoveContainer" containerID="4654c8ab8b08d26618ffa6c2fca40ceb7c0deefdf2d6c7c6ce7e1c223837ae4e" Feb 02 11:38:00 crc kubenswrapper[4909]: I0202 11:38:00.492450 4909 scope.go:117] "RemoveContainer" containerID="054b2340d04b8d8187ee88221c278e9a166c475c209544529fe57408af842ce3" Feb 02 11:38:00 crc kubenswrapper[4909]: I0202 11:38:00.507759 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-44k54"] Feb 02 11:38:00 crc kubenswrapper[4909]: I0202 11:38:00.513131 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-44k54"] Feb 02 11:38:00 crc kubenswrapper[4909]: I0202 11:38:00.518249 4909 scope.go:117] "RemoveContainer" containerID="c757b59781f115191c41a7f460b77a8265c46a2078473a8c1e0ffd61a9b7d031" Feb 02 11:38:00 crc kubenswrapper[4909]: E0202 11:38:00.518753 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c757b59781f115191c41a7f460b77a8265c46a2078473a8c1e0ffd61a9b7d031\": container with ID starting with c757b59781f115191c41a7f460b77a8265c46a2078473a8c1e0ffd61a9b7d031 not found: ID does not exist" containerID="c757b59781f115191c41a7f460b77a8265c46a2078473a8c1e0ffd61a9b7d031" Feb 02 11:38:00 crc kubenswrapper[4909]: I0202 11:38:00.518842 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c757b59781f115191c41a7f460b77a8265c46a2078473a8c1e0ffd61a9b7d031"} err="failed to get container status \"c757b59781f115191c41a7f460b77a8265c46a2078473a8c1e0ffd61a9b7d031\": rpc error: code = NotFound desc = could not find container \"c757b59781f115191c41a7f460b77a8265c46a2078473a8c1e0ffd61a9b7d031\": container with ID starting with c757b59781f115191c41a7f460b77a8265c46a2078473a8c1e0ffd61a9b7d031 not found: ID does 
not exist" Feb 02 11:38:00 crc kubenswrapper[4909]: I0202 11:38:00.518867 4909 scope.go:117] "RemoveContainer" containerID="4654c8ab8b08d26618ffa6c2fca40ceb7c0deefdf2d6c7c6ce7e1c223837ae4e" Feb 02 11:38:00 crc kubenswrapper[4909]: E0202 11:38:00.519233 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4654c8ab8b08d26618ffa6c2fca40ceb7c0deefdf2d6c7c6ce7e1c223837ae4e\": container with ID starting with 4654c8ab8b08d26618ffa6c2fca40ceb7c0deefdf2d6c7c6ce7e1c223837ae4e not found: ID does not exist" containerID="4654c8ab8b08d26618ffa6c2fca40ceb7c0deefdf2d6c7c6ce7e1c223837ae4e" Feb 02 11:38:00 crc kubenswrapper[4909]: I0202 11:38:00.519262 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4654c8ab8b08d26618ffa6c2fca40ceb7c0deefdf2d6c7c6ce7e1c223837ae4e"} err="failed to get container status \"4654c8ab8b08d26618ffa6c2fca40ceb7c0deefdf2d6c7c6ce7e1c223837ae4e\": rpc error: code = NotFound desc = could not find container \"4654c8ab8b08d26618ffa6c2fca40ceb7c0deefdf2d6c7c6ce7e1c223837ae4e\": container with ID starting with 4654c8ab8b08d26618ffa6c2fca40ceb7c0deefdf2d6c7c6ce7e1c223837ae4e not found: ID does not exist" Feb 02 11:38:00 crc kubenswrapper[4909]: I0202 11:38:00.519283 4909 scope.go:117] "RemoveContainer" containerID="054b2340d04b8d8187ee88221c278e9a166c475c209544529fe57408af842ce3" Feb 02 11:38:00 crc kubenswrapper[4909]: E0202 11:38:00.519503 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"054b2340d04b8d8187ee88221c278e9a166c475c209544529fe57408af842ce3\": container with ID starting with 054b2340d04b8d8187ee88221c278e9a166c475c209544529fe57408af842ce3 not found: ID does not exist" containerID="054b2340d04b8d8187ee88221c278e9a166c475c209544529fe57408af842ce3" Feb 02 11:38:00 crc kubenswrapper[4909]: I0202 11:38:00.519562 4909 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"054b2340d04b8d8187ee88221c278e9a166c475c209544529fe57408af842ce3"} err="failed to get container status \"054b2340d04b8d8187ee88221c278e9a166c475c209544529fe57408af842ce3\": rpc error: code = NotFound desc = could not find container \"054b2340d04b8d8187ee88221c278e9a166c475c209544529fe57408af842ce3\": container with ID starting with 054b2340d04b8d8187ee88221c278e9a166c475c209544529fe57408af842ce3 not found: ID does not exist" Feb 02 11:38:01 crc kubenswrapper[4909]: I0202 11:38:01.027554 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="158b5a3c-619b-4fca-a512-acdba20255d0" path="/var/lib/kubelet/pods/158b5a3c-619b-4fca-a512-acdba20255d0/volumes" Feb 02 11:38:49 crc kubenswrapper[4909]: I0202 11:38:49.511655 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:38:49 crc kubenswrapper[4909]: I0202 11:38:49.512520 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:39:09 crc kubenswrapper[4909]: I0202 11:39:09.830940 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7h856"] Feb 02 11:39:09 crc kubenswrapper[4909]: E0202 11:39:09.831773 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="158b5a3c-619b-4fca-a512-acdba20255d0" containerName="extract-utilities" Feb 02 11:39:09 crc kubenswrapper[4909]: I0202 11:39:09.831789 4909 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="158b5a3c-619b-4fca-a512-acdba20255d0" containerName="extract-utilities" Feb 02 11:39:09 crc kubenswrapper[4909]: E0202 11:39:09.831824 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="158b5a3c-619b-4fca-a512-acdba20255d0" containerName="extract-content" Feb 02 11:39:09 crc kubenswrapper[4909]: I0202 11:39:09.831832 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="158b5a3c-619b-4fca-a512-acdba20255d0" containerName="extract-content" Feb 02 11:39:09 crc kubenswrapper[4909]: E0202 11:39:09.831844 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="158b5a3c-619b-4fca-a512-acdba20255d0" containerName="registry-server" Feb 02 11:39:09 crc kubenswrapper[4909]: I0202 11:39:09.831851 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="158b5a3c-619b-4fca-a512-acdba20255d0" containerName="registry-server" Feb 02 11:39:09 crc kubenswrapper[4909]: I0202 11:39:09.832022 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="158b5a3c-619b-4fca-a512-acdba20255d0" containerName="registry-server" Feb 02 11:39:09 crc kubenswrapper[4909]: I0202 11:39:09.833190 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7h856" Feb 02 11:39:09 crc kubenswrapper[4909]: I0202 11:39:09.855839 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7h856"] Feb 02 11:39:09 crc kubenswrapper[4909]: I0202 11:39:09.995360 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/382e7f57-dfa1-4ec3-a490-677197365b23-catalog-content\") pod \"community-operators-7h856\" (UID: \"382e7f57-dfa1-4ec3-a490-677197365b23\") " pod="openshift-marketplace/community-operators-7h856" Feb 02 11:39:09 crc kubenswrapper[4909]: I0202 11:39:09.995656 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd222\" (UniqueName: \"kubernetes.io/projected/382e7f57-dfa1-4ec3-a490-677197365b23-kube-api-access-zd222\") pod \"community-operators-7h856\" (UID: \"382e7f57-dfa1-4ec3-a490-677197365b23\") " pod="openshift-marketplace/community-operators-7h856" Feb 02 11:39:09 crc kubenswrapper[4909]: I0202 11:39:09.995781 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/382e7f57-dfa1-4ec3-a490-677197365b23-utilities\") pod \"community-operators-7h856\" (UID: \"382e7f57-dfa1-4ec3-a490-677197365b23\") " pod="openshift-marketplace/community-operators-7h856" Feb 02 11:39:10 crc kubenswrapper[4909]: I0202 11:39:10.097230 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/382e7f57-dfa1-4ec3-a490-677197365b23-catalog-content\") pod \"community-operators-7h856\" (UID: \"382e7f57-dfa1-4ec3-a490-677197365b23\") " pod="openshift-marketplace/community-operators-7h856" Feb 02 11:39:10 crc kubenswrapper[4909]: I0202 11:39:10.097603 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zd222\" (UniqueName: \"kubernetes.io/projected/382e7f57-dfa1-4ec3-a490-677197365b23-kube-api-access-zd222\") pod \"community-operators-7h856\" (UID: \"382e7f57-dfa1-4ec3-a490-677197365b23\") " pod="openshift-marketplace/community-operators-7h856"
Feb 02 11:39:10 crc kubenswrapper[4909]: I0202 11:39:10.097761 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/382e7f57-dfa1-4ec3-a490-677197365b23-utilities\") pod \"community-operators-7h856\" (UID: \"382e7f57-dfa1-4ec3-a490-677197365b23\") " pod="openshift-marketplace/community-operators-7h856"
Feb 02 11:39:10 crc kubenswrapper[4909]: I0202 11:39:10.097775 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/382e7f57-dfa1-4ec3-a490-677197365b23-catalog-content\") pod \"community-operators-7h856\" (UID: \"382e7f57-dfa1-4ec3-a490-677197365b23\") " pod="openshift-marketplace/community-operators-7h856"
Feb 02 11:39:10 crc kubenswrapper[4909]: I0202 11:39:10.098037 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/382e7f57-dfa1-4ec3-a490-677197365b23-utilities\") pod \"community-operators-7h856\" (UID: \"382e7f57-dfa1-4ec3-a490-677197365b23\") " pod="openshift-marketplace/community-operators-7h856"
Feb 02 11:39:10 crc kubenswrapper[4909]: I0202 11:39:10.117423 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd222\" (UniqueName: \"kubernetes.io/projected/382e7f57-dfa1-4ec3-a490-677197365b23-kube-api-access-zd222\") pod \"community-operators-7h856\" (UID: \"382e7f57-dfa1-4ec3-a490-677197365b23\") " pod="openshift-marketplace/community-operators-7h856"
Feb 02 11:39:10 crc kubenswrapper[4909]: I0202 11:39:10.153402 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7h856"
Feb 02 11:39:10 crc kubenswrapper[4909]: I0202 11:39:10.434566 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7h856"]
Feb 02 11:39:10 crc kubenswrapper[4909]: I0202 11:39:10.930377 4909 generic.go:334] "Generic (PLEG): container finished" podID="382e7f57-dfa1-4ec3-a490-677197365b23" containerID="02b6ad83660ba7c343f18ed421186c93d6cb1ddf8145a645a10a27e8ed7d9e5e" exitCode=0
Feb 02 11:39:10 crc kubenswrapper[4909]: I0202 11:39:10.930482 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7h856" event={"ID":"382e7f57-dfa1-4ec3-a490-677197365b23","Type":"ContainerDied","Data":"02b6ad83660ba7c343f18ed421186c93d6cb1ddf8145a645a10a27e8ed7d9e5e"}
Feb 02 11:39:10 crc kubenswrapper[4909]: I0202 11:39:10.930693 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7h856" event={"ID":"382e7f57-dfa1-4ec3-a490-677197365b23","Type":"ContainerStarted","Data":"02bd77b75de4afb558d4638f9516503e68d6f99f7d5609c2b2b038c0c5368fa2"}
Feb 02 11:39:11 crc kubenswrapper[4909]: I0202 11:39:11.937054 4909 generic.go:334] "Generic (PLEG): container finished" podID="382e7f57-dfa1-4ec3-a490-677197365b23" containerID="055ef4762bebf9dad38c3aca0dcdcc03775cff45c3f6d61a7b3ceb5543210c80" exitCode=0
Feb 02 11:39:11 crc kubenswrapper[4909]: I0202 11:39:11.937126 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7h856" event={"ID":"382e7f57-dfa1-4ec3-a490-677197365b23","Type":"ContainerDied","Data":"055ef4762bebf9dad38c3aca0dcdcc03775cff45c3f6d61a7b3ceb5543210c80"}
Feb 02 11:39:12 crc kubenswrapper[4909]: I0202 11:39:12.944796 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7h856" event={"ID":"382e7f57-dfa1-4ec3-a490-677197365b23","Type":"ContainerStarted","Data":"c49c56d3ec602e0881811177e9bcb31f2ea479730beb5a4b5c8eabc2925c4f08"}
Feb 02 11:39:12 crc kubenswrapper[4909]: I0202 11:39:12.965192 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7h856" podStartSLOduration=2.379380066 podStartE2EDuration="3.9651739s" podCreationTimestamp="2026-02-02 11:39:09 +0000 UTC" firstStartedPulling="2026-02-02 11:39:10.932295801 +0000 UTC m=+4076.678396546" lastFinishedPulling="2026-02-02 11:39:12.518089645 +0000 UTC m=+4078.264190380" observedRunningTime="2026-02-02 11:39:12.962448814 +0000 UTC m=+4078.708549549" watchObservedRunningTime="2026-02-02 11:39:12.9651739 +0000 UTC m=+4078.711274635"
Feb 02 11:39:19 crc kubenswrapper[4909]: I0202 11:39:19.511134 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 11:39:19 crc kubenswrapper[4909]: I0202 11:39:19.511739 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 11:39:20 crc kubenswrapper[4909]: I0202 11:39:20.154298 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7h856"
Feb 02 11:39:20 crc kubenswrapper[4909]: I0202 11:39:20.154406 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7h856"
Feb 02 11:39:20 crc kubenswrapper[4909]: I0202 11:39:20.199933 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7h856"
Feb 02 11:39:21 crc kubenswrapper[4909]: I0202 11:39:21.065993 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7h856"
Feb 02 11:39:21 crc kubenswrapper[4909]: I0202 11:39:21.107121 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7h856"]
Feb 02 11:39:23 crc kubenswrapper[4909]: I0202 11:39:23.041167 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7h856" podUID="382e7f57-dfa1-4ec3-a490-677197365b23" containerName="registry-server" containerID="cri-o://c49c56d3ec602e0881811177e9bcb31f2ea479730beb5a4b5c8eabc2925c4f08" gracePeriod=2
Feb 02 11:39:23 crc kubenswrapper[4909]: I0202 11:39:23.446736 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7h856"
Feb 02 11:39:23 crc kubenswrapper[4909]: I0202 11:39:23.576136 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/382e7f57-dfa1-4ec3-a490-677197365b23-catalog-content\") pod \"382e7f57-dfa1-4ec3-a490-677197365b23\" (UID: \"382e7f57-dfa1-4ec3-a490-677197365b23\") "
Feb 02 11:39:23 crc kubenswrapper[4909]: I0202 11:39:23.576183 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/382e7f57-dfa1-4ec3-a490-677197365b23-utilities\") pod \"382e7f57-dfa1-4ec3-a490-677197365b23\" (UID: \"382e7f57-dfa1-4ec3-a490-677197365b23\") "
Feb 02 11:39:23 crc kubenswrapper[4909]: I0202 11:39:23.576229 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd222\" (UniqueName: \"kubernetes.io/projected/382e7f57-dfa1-4ec3-a490-677197365b23-kube-api-access-zd222\") pod \"382e7f57-dfa1-4ec3-a490-677197365b23\" (UID: \"382e7f57-dfa1-4ec3-a490-677197365b23\") "
Feb 02 11:39:23 crc kubenswrapper[4909]: I0202 11:39:23.578122 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/382e7f57-dfa1-4ec3-a490-677197365b23-utilities" (OuterVolumeSpecName: "utilities") pod "382e7f57-dfa1-4ec3-a490-677197365b23" (UID: "382e7f57-dfa1-4ec3-a490-677197365b23"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:39:23 crc kubenswrapper[4909]: I0202 11:39:23.581765 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/382e7f57-dfa1-4ec3-a490-677197365b23-kube-api-access-zd222" (OuterVolumeSpecName: "kube-api-access-zd222") pod "382e7f57-dfa1-4ec3-a490-677197365b23" (UID: "382e7f57-dfa1-4ec3-a490-677197365b23"). InnerVolumeSpecName "kube-api-access-zd222". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:39:23 crc kubenswrapper[4909]: I0202 11:39:23.626008 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/382e7f57-dfa1-4ec3-a490-677197365b23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "382e7f57-dfa1-4ec3-a490-677197365b23" (UID: "382e7f57-dfa1-4ec3-a490-677197365b23"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:39:23 crc kubenswrapper[4909]: I0202 11:39:23.678116 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/382e7f57-dfa1-4ec3-a490-677197365b23-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 02 11:39:23 crc kubenswrapper[4909]: I0202 11:39:23.678148 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/382e7f57-dfa1-4ec3-a490-677197365b23-utilities\") on node \"crc\" DevicePath \"\""
Feb 02 11:39:23 crc kubenswrapper[4909]: I0202 11:39:23.678165 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd222\" (UniqueName: \"kubernetes.io/projected/382e7f57-dfa1-4ec3-a490-677197365b23-kube-api-access-zd222\") on node \"crc\" DevicePath \"\""
Feb 02 11:39:24 crc kubenswrapper[4909]: I0202 11:39:24.048745 4909 generic.go:334] "Generic (PLEG): container finished" podID="382e7f57-dfa1-4ec3-a490-677197365b23" containerID="c49c56d3ec602e0881811177e9bcb31f2ea479730beb5a4b5c8eabc2925c4f08" exitCode=0
Feb 02 11:39:24 crc kubenswrapper[4909]: I0202 11:39:24.048789 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7h856" event={"ID":"382e7f57-dfa1-4ec3-a490-677197365b23","Type":"ContainerDied","Data":"c49c56d3ec602e0881811177e9bcb31f2ea479730beb5a4b5c8eabc2925c4f08"}
Feb 02 11:39:24 crc kubenswrapper[4909]: I0202 11:39:24.048831 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7h856" event={"ID":"382e7f57-dfa1-4ec3-a490-677197365b23","Type":"ContainerDied","Data":"02bd77b75de4afb558d4638f9516503e68d6f99f7d5609c2b2b038c0c5368fa2"}
Feb 02 11:39:24 crc kubenswrapper[4909]: I0202 11:39:24.048849 4909 scope.go:117] "RemoveContainer" containerID="c49c56d3ec602e0881811177e9bcb31f2ea479730beb5a4b5c8eabc2925c4f08"
Feb 02 11:39:24 crc kubenswrapper[4909]: I0202 11:39:24.048970 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7h856"
Feb 02 11:39:24 crc kubenswrapper[4909]: I0202 11:39:24.069831 4909 scope.go:117] "RemoveContainer" containerID="055ef4762bebf9dad38c3aca0dcdcc03775cff45c3f6d61a7b3ceb5543210c80"
Feb 02 11:39:24 crc kubenswrapper[4909]: I0202 11:39:24.081452 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7h856"]
Feb 02 11:39:24 crc kubenswrapper[4909]: I0202 11:39:24.094532 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7h856"]
Feb 02 11:39:24 crc kubenswrapper[4909]: I0202 11:39:24.116302 4909 scope.go:117] "RemoveContainer" containerID="02b6ad83660ba7c343f18ed421186c93d6cb1ddf8145a645a10a27e8ed7d9e5e"
Feb 02 11:39:24 crc kubenswrapper[4909]: I0202 11:39:24.132779 4909 scope.go:117] "RemoveContainer" containerID="c49c56d3ec602e0881811177e9bcb31f2ea479730beb5a4b5c8eabc2925c4f08"
Feb 02 11:39:24 crc kubenswrapper[4909]: E0202 11:39:24.133200 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c49c56d3ec602e0881811177e9bcb31f2ea479730beb5a4b5c8eabc2925c4f08\": container with ID starting with c49c56d3ec602e0881811177e9bcb31f2ea479730beb5a4b5c8eabc2925c4f08 not found: ID does not exist" containerID="c49c56d3ec602e0881811177e9bcb31f2ea479730beb5a4b5c8eabc2925c4f08"
Feb 02 11:39:24 crc kubenswrapper[4909]: I0202 11:39:24.133234 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c49c56d3ec602e0881811177e9bcb31f2ea479730beb5a4b5c8eabc2925c4f08"} err="failed to get container status \"c49c56d3ec602e0881811177e9bcb31f2ea479730beb5a4b5c8eabc2925c4f08\": rpc error: code = NotFound desc = could not find container \"c49c56d3ec602e0881811177e9bcb31f2ea479730beb5a4b5c8eabc2925c4f08\": container with ID starting with c49c56d3ec602e0881811177e9bcb31f2ea479730beb5a4b5c8eabc2925c4f08 not found: ID does not exist"
Feb 02 11:39:24 crc kubenswrapper[4909]: I0202 11:39:24.133301 4909 scope.go:117] "RemoveContainer" containerID="055ef4762bebf9dad38c3aca0dcdcc03775cff45c3f6d61a7b3ceb5543210c80"
Feb 02 11:39:24 crc kubenswrapper[4909]: E0202 11:39:24.133573 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"055ef4762bebf9dad38c3aca0dcdcc03775cff45c3f6d61a7b3ceb5543210c80\": container with ID starting with 055ef4762bebf9dad38c3aca0dcdcc03775cff45c3f6d61a7b3ceb5543210c80 not found: ID does not exist" containerID="055ef4762bebf9dad38c3aca0dcdcc03775cff45c3f6d61a7b3ceb5543210c80"
Feb 02 11:39:24 crc kubenswrapper[4909]: I0202 11:39:24.133616 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"055ef4762bebf9dad38c3aca0dcdcc03775cff45c3f6d61a7b3ceb5543210c80"} err="failed to get container status \"055ef4762bebf9dad38c3aca0dcdcc03775cff45c3f6d61a7b3ceb5543210c80\": rpc error: code = NotFound desc = could not find container \"055ef4762bebf9dad38c3aca0dcdcc03775cff45c3f6d61a7b3ceb5543210c80\": container with ID starting with 055ef4762bebf9dad38c3aca0dcdcc03775cff45c3f6d61a7b3ceb5543210c80 not found: ID does not exist"
Feb 02 11:39:24 crc kubenswrapper[4909]: I0202 11:39:24.133631 4909 scope.go:117] "RemoveContainer" containerID="02b6ad83660ba7c343f18ed421186c93d6cb1ddf8145a645a10a27e8ed7d9e5e"
Feb 02 11:39:24 crc kubenswrapper[4909]: E0202 11:39:24.133879 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02b6ad83660ba7c343f18ed421186c93d6cb1ddf8145a645a10a27e8ed7d9e5e\": container with ID starting with 02b6ad83660ba7c343f18ed421186c93d6cb1ddf8145a645a10a27e8ed7d9e5e not found: ID does not exist" containerID="02b6ad83660ba7c343f18ed421186c93d6cb1ddf8145a645a10a27e8ed7d9e5e"
Feb 02 11:39:24 crc kubenswrapper[4909]: I0202 11:39:24.133921 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02b6ad83660ba7c343f18ed421186c93d6cb1ddf8145a645a10a27e8ed7d9e5e"} err="failed to get container status \"02b6ad83660ba7c343f18ed421186c93d6cb1ddf8145a645a10a27e8ed7d9e5e\": rpc error: code = NotFound desc = could not find container \"02b6ad83660ba7c343f18ed421186c93d6cb1ddf8145a645a10a27e8ed7d9e5e\": container with ID starting with 02b6ad83660ba7c343f18ed421186c93d6cb1ddf8145a645a10a27e8ed7d9e5e not found: ID does not exist"
Feb 02 11:39:25 crc kubenswrapper[4909]: I0202 11:39:25.025055 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="382e7f57-dfa1-4ec3-a490-677197365b23" path="/var/lib/kubelet/pods/382e7f57-dfa1-4ec3-a490-677197365b23/volumes"
Feb 02 11:39:49 crc kubenswrapper[4909]: I0202 11:39:49.511541 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 11:39:49 crc kubenswrapper[4909]: I0202 11:39:49.512437 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 11:39:49 crc kubenswrapper[4909]: I0202 11:39:49.512860 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z"
Feb 02 11:39:49 crc kubenswrapper[4909]: I0202 11:39:49.513603 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"36a72ceaf0b148007a050e96b5dec8b56b8c314c0db5036a29c4b45ee6f11fbd"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 02 11:39:49 crc kubenswrapper[4909]: I0202 11:39:49.513666 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://36a72ceaf0b148007a050e96b5dec8b56b8c314c0db5036a29c4b45ee6f11fbd" gracePeriod=600
Feb 02 11:39:50 crc kubenswrapper[4909]: I0202 11:39:50.480381 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="36a72ceaf0b148007a050e96b5dec8b56b8c314c0db5036a29c4b45ee6f11fbd" exitCode=0
Feb 02 11:39:50 crc kubenswrapper[4909]: I0202 11:39:50.480490 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"36a72ceaf0b148007a050e96b5dec8b56b8c314c0db5036a29c4b45ee6f11fbd"}
Feb 02 11:39:50 crc kubenswrapper[4909]: I0202 11:39:50.480845 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520"}
Feb 02 11:39:50 crc kubenswrapper[4909]: I0202 11:39:50.480879 4909 scope.go:117] "RemoveContainer" containerID="f76faa89778a347ae7e25a692d7d4a805e86009f4cd8fce1562eea878f6e8fe7"
Feb 02 11:40:24 crc kubenswrapper[4909]: I0202 11:40:24.518059 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jgf8f"]
Feb 02 11:40:24 crc kubenswrapper[4909]: E0202 11:40:24.519037 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="382e7f57-dfa1-4ec3-a490-677197365b23" containerName="extract-utilities"
Feb 02 11:40:24 crc kubenswrapper[4909]: I0202 11:40:24.519056 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="382e7f57-dfa1-4ec3-a490-677197365b23" containerName="extract-utilities"
Feb 02 11:40:24 crc kubenswrapper[4909]: E0202 11:40:24.519084 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="382e7f57-dfa1-4ec3-a490-677197365b23" containerName="extract-content"
Feb 02 11:40:24 crc kubenswrapper[4909]: I0202 11:40:24.519092 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="382e7f57-dfa1-4ec3-a490-677197365b23" containerName="extract-content"
Feb 02 11:40:24 crc kubenswrapper[4909]: E0202 11:40:24.519108 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="382e7f57-dfa1-4ec3-a490-677197365b23" containerName="registry-server"
Feb 02 11:40:24 crc kubenswrapper[4909]: I0202 11:40:24.519116 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="382e7f57-dfa1-4ec3-a490-677197365b23" containerName="registry-server"
Feb 02 11:40:24 crc kubenswrapper[4909]: I0202 11:40:24.519304 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="382e7f57-dfa1-4ec3-a490-677197365b23" containerName="registry-server"
Feb 02 11:40:24 crc kubenswrapper[4909]: I0202 11:40:24.520568 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jgf8f"
Feb 02 11:40:24 crc kubenswrapper[4909]: I0202 11:40:24.541609 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jgf8f"]
Feb 02 11:40:24 crc kubenswrapper[4909]: I0202 11:40:24.713771 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6t87\" (UniqueName: \"kubernetes.io/projected/55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a-kube-api-access-z6t87\") pod \"certified-operators-jgf8f\" (UID: \"55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a\") " pod="openshift-marketplace/certified-operators-jgf8f"
Feb 02 11:40:24 crc kubenswrapper[4909]: I0202 11:40:24.714123 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a-utilities\") pod \"certified-operators-jgf8f\" (UID: \"55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a\") " pod="openshift-marketplace/certified-operators-jgf8f"
Feb 02 11:40:24 crc kubenswrapper[4909]: I0202 11:40:24.714215 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a-catalog-content\") pod \"certified-operators-jgf8f\" (UID: \"55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a\") " pod="openshift-marketplace/certified-operators-jgf8f"
Feb 02 11:40:24 crc kubenswrapper[4909]: I0202 11:40:24.815315 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6t87\" (UniqueName: \"kubernetes.io/projected/55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a-kube-api-access-z6t87\") pod \"certified-operators-jgf8f\" (UID: \"55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a\") " pod="openshift-marketplace/certified-operators-jgf8f"
Feb 02 11:40:24 crc kubenswrapper[4909]: I0202 11:40:24.815384 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a-utilities\") pod \"certified-operators-jgf8f\" (UID: \"55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a\") " pod="openshift-marketplace/certified-operators-jgf8f"
Feb 02 11:40:24 crc kubenswrapper[4909]: I0202 11:40:24.815419 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a-catalog-content\") pod \"certified-operators-jgf8f\" (UID: \"55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a\") " pod="openshift-marketplace/certified-operators-jgf8f"
Feb 02 11:40:24 crc kubenswrapper[4909]: I0202 11:40:24.816083 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a-utilities\") pod \"certified-operators-jgf8f\" (UID: \"55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a\") " pod="openshift-marketplace/certified-operators-jgf8f"
Feb 02 11:40:24 crc kubenswrapper[4909]: I0202 11:40:24.816146 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a-catalog-content\") pod \"certified-operators-jgf8f\" (UID: \"55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a\") " pod="openshift-marketplace/certified-operators-jgf8f"
Feb 02 11:40:24 crc kubenswrapper[4909]: I0202 11:40:24.839888 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6t87\" (UniqueName: \"kubernetes.io/projected/55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a-kube-api-access-z6t87\") pod \"certified-operators-jgf8f\" (UID: \"55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a\") " pod="openshift-marketplace/certified-operators-jgf8f"
Feb 02 11:40:24 crc kubenswrapper[4909]: I0202 11:40:24.844178 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jgf8f"
Feb 02 11:40:25 crc kubenswrapper[4909]: I0202 11:40:25.124604 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jgf8f"]
Feb 02 11:40:25 crc kubenswrapper[4909]: I0202 11:40:25.739535 4909 generic.go:334] "Generic (PLEG): container finished" podID="55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a" containerID="510e3dd2b726b99142319281994cb2252b299270907944175c56af037b78dd91" exitCode=0
Feb 02 11:40:25 crc kubenswrapper[4909]: I0202 11:40:25.739841 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgf8f" event={"ID":"55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a","Type":"ContainerDied","Data":"510e3dd2b726b99142319281994cb2252b299270907944175c56af037b78dd91"}
Feb 02 11:40:25 crc kubenswrapper[4909]: I0202 11:40:25.740018 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgf8f" event={"ID":"55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a","Type":"ContainerStarted","Data":"ab2833f4984302caffeb46ba46ba28ea20329423bc92a9185e2f3f9be07122e9"}
Feb 02 11:40:26 crc kubenswrapper[4909]: I0202 11:40:26.750904 4909 generic.go:334] "Generic (PLEG): container finished" podID="55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a" containerID="ff2572c5ba49bbb375c02a53a8e746e5f649b59fa8302262722f21affe4ca4a4" exitCode=0
Feb 02 11:40:26 crc kubenswrapper[4909]: I0202 11:40:26.750948 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgf8f" event={"ID":"55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a","Type":"ContainerDied","Data":"ff2572c5ba49bbb375c02a53a8e746e5f649b59fa8302262722f21affe4ca4a4"}
Feb 02 11:40:27 crc kubenswrapper[4909]: I0202 11:40:27.759957 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgf8f" event={"ID":"55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a","Type":"ContainerStarted","Data":"c2509609f19bd38b18fa49a0ff7c281304e05bf4575f22cc994e398275c70191"}
Feb 02 11:40:27 crc kubenswrapper[4909]: I0202 11:40:27.780760 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jgf8f" podStartSLOduration=2.298461255 podStartE2EDuration="3.780738255s" podCreationTimestamp="2026-02-02 11:40:24 +0000 UTC" firstStartedPulling="2026-02-02 11:40:25.741445017 +0000 UTC m=+4151.487545752" lastFinishedPulling="2026-02-02 11:40:27.223722017 +0000 UTC m=+4152.969822752" observedRunningTime="2026-02-02 11:40:27.776468605 +0000 UTC m=+4153.522569340" watchObservedRunningTime="2026-02-02 11:40:27.780738255 +0000 UTC m=+4153.526839000"
Feb 02 11:40:34 crc kubenswrapper[4909]: I0202 11:40:34.844981 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jgf8f"
Feb 02 11:40:34 crc kubenswrapper[4909]: I0202 11:40:34.845536 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jgf8f"
Feb 02 11:40:34 crc kubenswrapper[4909]: I0202 11:40:34.911259 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jgf8f"
Feb 02 11:40:35 crc kubenswrapper[4909]: I0202 11:40:35.873842 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jgf8f"
Feb 02 11:40:35 crc kubenswrapper[4909]: I0202 11:40:35.923479 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jgf8f"]
Feb 02 11:40:37 crc kubenswrapper[4909]: I0202 11:40:37.838007 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jgf8f" podUID="55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a" containerName="registry-server" containerID="cri-o://c2509609f19bd38b18fa49a0ff7c281304e05bf4575f22cc994e398275c70191" gracePeriod=2
Feb 02 11:40:38 crc kubenswrapper[4909]: I0202 11:40:38.200230 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jgf8f"
Feb 02 11:40:38 crc kubenswrapper[4909]: I0202 11:40:38.368622 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a-catalog-content\") pod \"55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a\" (UID: \"55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a\") "
Feb 02 11:40:38 crc kubenswrapper[4909]: I0202 11:40:38.368661 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a-utilities\") pod \"55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a\" (UID: \"55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a\") "
Feb 02 11:40:38 crc kubenswrapper[4909]: I0202 11:40:38.368732 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6t87\" (UniqueName: \"kubernetes.io/projected/55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a-kube-api-access-z6t87\") pod \"55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a\" (UID: \"55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a\") "
Feb 02 11:40:38 crc kubenswrapper[4909]: I0202 11:40:38.369875 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a-utilities" (OuterVolumeSpecName: "utilities") pod "55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a" (UID: "55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:40:38 crc kubenswrapper[4909]: I0202 11:40:38.374169 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a-kube-api-access-z6t87" (OuterVolumeSpecName: "kube-api-access-z6t87") pod "55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a" (UID: "55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a"). InnerVolumeSpecName "kube-api-access-z6t87". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:40:38 crc kubenswrapper[4909]: I0202 11:40:38.425060 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a" (UID: "55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:40:38 crc kubenswrapper[4909]: I0202 11:40:38.470732 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6t87\" (UniqueName: \"kubernetes.io/projected/55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a-kube-api-access-z6t87\") on node \"crc\" DevicePath \"\""
Feb 02 11:40:38 crc kubenswrapper[4909]: I0202 11:40:38.470764 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 02 11:40:38 crc kubenswrapper[4909]: I0202 11:40:38.470775 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a-utilities\") on node \"crc\" DevicePath \"\""
Feb 02 11:40:38 crc kubenswrapper[4909]: I0202 11:40:38.845829 4909 generic.go:334] "Generic (PLEG): container finished" podID="55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a" containerID="c2509609f19bd38b18fa49a0ff7c281304e05bf4575f22cc994e398275c70191" exitCode=0
Feb 02 11:40:38 crc kubenswrapper[4909]: I0202 11:40:38.846207 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgf8f" event={"ID":"55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a","Type":"ContainerDied","Data":"c2509609f19bd38b18fa49a0ff7c281304e05bf4575f22cc994e398275c70191"}
Feb 02 11:40:38 crc kubenswrapper[4909]: I0202 11:40:38.846323 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgf8f" event={"ID":"55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a","Type":"ContainerDied","Data":"ab2833f4984302caffeb46ba46ba28ea20329423bc92a9185e2f3f9be07122e9"}
Feb 02 11:40:38 crc kubenswrapper[4909]: I0202 11:40:38.846340 4909 scope.go:117] "RemoveContainer" containerID="c2509609f19bd38b18fa49a0ff7c281304e05bf4575f22cc994e398275c70191"
Feb 02 11:40:38 crc kubenswrapper[4909]: I0202 11:40:38.846269 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jgf8f"
Feb 02 11:40:38 crc kubenswrapper[4909]: I0202 11:40:38.873522 4909 scope.go:117] "RemoveContainer" containerID="ff2572c5ba49bbb375c02a53a8e746e5f649b59fa8302262722f21affe4ca4a4"
Feb 02 11:40:38 crc kubenswrapper[4909]: I0202 11:40:38.889792 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jgf8f"]
Feb 02 11:40:38 crc kubenswrapper[4909]: I0202 11:40:38.896304 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jgf8f"]
Feb 02 11:40:39 crc kubenswrapper[4909]: I0202 11:40:39.025354 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a" path="/var/lib/kubelet/pods/55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a/volumes"
Feb 02 11:40:39 crc kubenswrapper[4909]: I0202 11:40:39.102336 4909 scope.go:117] "RemoveContainer" containerID="510e3dd2b726b99142319281994cb2252b299270907944175c56af037b78dd91"
Feb 02 11:40:39 crc kubenswrapper[4909]: I0202 11:40:39.136228 4909 scope.go:117] "RemoveContainer" containerID="c2509609f19bd38b18fa49a0ff7c281304e05bf4575f22cc994e398275c70191"
Feb 02 11:40:39 crc kubenswrapper[4909]: E0202 11:40:39.136751 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2509609f19bd38b18fa49a0ff7c281304e05bf4575f22cc994e398275c70191\": container with ID starting with c2509609f19bd38b18fa49a0ff7c281304e05bf4575f22cc994e398275c70191 not found: ID does not exist" containerID="c2509609f19bd38b18fa49a0ff7c281304e05bf4575f22cc994e398275c70191"
Feb 02 11:40:39 crc kubenswrapper[4909]: I0202 11:40:39.136800 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2509609f19bd38b18fa49a0ff7c281304e05bf4575f22cc994e398275c70191"} err="failed to get container status \"c2509609f19bd38b18fa49a0ff7c281304e05bf4575f22cc994e398275c70191\": rpc error: code = NotFound desc = could not find container \"c2509609f19bd38b18fa49a0ff7c281304e05bf4575f22cc994e398275c70191\": container with ID starting with c2509609f19bd38b18fa49a0ff7c281304e05bf4575f22cc994e398275c70191 not found: ID does not exist"
Feb 02 11:40:39 crc kubenswrapper[4909]: I0202 11:40:39.136861 4909 scope.go:117] "RemoveContainer" containerID="ff2572c5ba49bbb375c02a53a8e746e5f649b59fa8302262722f21affe4ca4a4"
Feb 02 11:40:39 crc kubenswrapper[4909]: E0202 11:40:39.137394 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff2572c5ba49bbb375c02a53a8e746e5f649b59fa8302262722f21affe4ca4a4\": container with ID starting with ff2572c5ba49bbb375c02a53a8e746e5f649b59fa8302262722f21affe4ca4a4 not found: ID does not exist" containerID="ff2572c5ba49bbb375c02a53a8e746e5f649b59fa8302262722f21affe4ca4a4"
Feb 02 11:40:39 crc kubenswrapper[4909]: I0202 11:40:39.137429 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff2572c5ba49bbb375c02a53a8e746e5f649b59fa8302262722f21affe4ca4a4"} err="failed to get container status \"ff2572c5ba49bbb375c02a53a8e746e5f649b59fa8302262722f21affe4ca4a4\": rpc error: code = NotFound desc = could not find container \"ff2572c5ba49bbb375c02a53a8e746e5f649b59fa8302262722f21affe4ca4a4\": container with ID starting with ff2572c5ba49bbb375c02a53a8e746e5f649b59fa8302262722f21affe4ca4a4 not found: ID does not exist"
Feb 02 11:40:39 crc kubenswrapper[4909]: I0202 11:40:39.137456 4909 scope.go:117] "RemoveContainer" containerID="510e3dd2b726b99142319281994cb2252b299270907944175c56af037b78dd91"
Feb 02 11:40:39 crc kubenswrapper[4909]: E0202 11:40:39.137675 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"510e3dd2b726b99142319281994cb2252b299270907944175c56af037b78dd91\": container with ID starting with 510e3dd2b726b99142319281994cb2252b299270907944175c56af037b78dd91 not found: ID does not exist" containerID="510e3dd2b726b99142319281994cb2252b299270907944175c56af037b78dd91"
Feb 02 11:40:39 crc kubenswrapper[4909]: I0202 11:40:39.137698 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"510e3dd2b726b99142319281994cb2252b299270907944175c56af037b78dd91"} err="failed to get container status \"510e3dd2b726b99142319281994cb2252b299270907944175c56af037b78dd91\": rpc error: code = NotFound desc = could not find container \"510e3dd2b726b99142319281994cb2252b299270907944175c56af037b78dd91\": container with ID starting with 510e3dd2b726b99142319281994cb2252b299270907944175c56af037b78dd91 not found: ID does not exist"
Feb 02 11:41:49 crc kubenswrapper[4909]: I0202 11:41:49.510897 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 11:41:49 crc kubenswrapper[4909]: I0202 11:41:49.511536 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 11:42:19 crc kubenswrapper[4909]: I0202 11:42:19.511244 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 11:42:19 crc kubenswrapper[4909]: I0202 11:42:19.513373 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 11:42:49 crc kubenswrapper[4909]: I0202 11:42:49.511355 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 11:42:49 crc kubenswrapper[4909]: I0202 11:42:49.511849 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 11:42:49 crc kubenswrapper[4909]: I0202 11:42:49.511894 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z"
Feb 02 11:42:49 crc kubenswrapper[4909]: I0202 11:42:49.512401 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 02 11:42:49 crc kubenswrapper[4909]: I0202 11:42:49.512457 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" 
podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520" gracePeriod=600 Feb 02 11:42:49 crc kubenswrapper[4909]: E0202 11:42:49.647075 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:42:49 crc kubenswrapper[4909]: I0202 11:42:49.863111 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520" exitCode=0 Feb 02 11:42:49 crc kubenswrapper[4909]: I0202 11:42:49.863161 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520"} Feb 02 11:42:49 crc kubenswrapper[4909]: I0202 11:42:49.863201 4909 scope.go:117] "RemoveContainer" containerID="36a72ceaf0b148007a050e96b5dec8b56b8c314c0db5036a29c4b45ee6f11fbd" Feb 02 11:42:49 crc kubenswrapper[4909]: I0202 11:42:49.864220 4909 scope.go:117] "RemoveContainer" containerID="c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520" Feb 02 11:42:49 crc kubenswrapper[4909]: E0202 11:42:49.864918 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:43:05 crc kubenswrapper[4909]: I0202 11:43:05.020800 4909 scope.go:117] "RemoveContainer" containerID="c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520" Feb 02 11:43:05 crc kubenswrapper[4909]: E0202 11:43:05.021647 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:43:18 crc kubenswrapper[4909]: I0202 11:43:18.017097 4909 scope.go:117] "RemoveContainer" containerID="c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520" Feb 02 11:43:18 crc kubenswrapper[4909]: E0202 11:43:18.017794 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:43:31 crc kubenswrapper[4909]: I0202 11:43:31.017147 4909 scope.go:117] "RemoveContainer" containerID="c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520" Feb 02 11:43:31 crc kubenswrapper[4909]: E0202 11:43:31.018151 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:43:46 crc kubenswrapper[4909]: I0202 11:43:46.015918 4909 scope.go:117] "RemoveContainer" containerID="c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520" Feb 02 11:43:46 crc kubenswrapper[4909]: E0202 11:43:46.016671 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:43:59 crc kubenswrapper[4909]: I0202 11:43:59.016786 4909 scope.go:117] "RemoveContainer" containerID="c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520" Feb 02 11:43:59 crc kubenswrapper[4909]: E0202 11:43:59.017685 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:44:04 crc kubenswrapper[4909]: I0202 11:44:04.345550 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4x4qs"] Feb 02 11:44:04 crc kubenswrapper[4909]: E0202 11:44:04.346470 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a" containerName="extract-content" Feb 02 11:44:04 crc 
kubenswrapper[4909]: I0202 11:44:04.346487 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a" containerName="extract-content" Feb 02 11:44:04 crc kubenswrapper[4909]: E0202 11:44:04.346517 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a" containerName="extract-utilities" Feb 02 11:44:04 crc kubenswrapper[4909]: I0202 11:44:04.346523 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a" containerName="extract-utilities" Feb 02 11:44:04 crc kubenswrapper[4909]: E0202 11:44:04.346532 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a" containerName="registry-server" Feb 02 11:44:04 crc kubenswrapper[4909]: I0202 11:44:04.346538 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a" containerName="registry-server" Feb 02 11:44:04 crc kubenswrapper[4909]: I0202 11:44:04.346685 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="55c7e44e-a3ce-4079-85bf-6eeeb91bdd7a" containerName="registry-server" Feb 02 11:44:04 crc kubenswrapper[4909]: I0202 11:44:04.347731 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4x4qs" Feb 02 11:44:04 crc kubenswrapper[4909]: I0202 11:44:04.361285 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4x4qs"] Feb 02 11:44:04 crc kubenswrapper[4909]: I0202 11:44:04.503333 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39e0f570-77dc-4b88-a2b4-bc213f0fae96-catalog-content\") pod \"redhat-marketplace-4x4qs\" (UID: \"39e0f570-77dc-4b88-a2b4-bc213f0fae96\") " pod="openshift-marketplace/redhat-marketplace-4x4qs" Feb 02 11:44:04 crc kubenswrapper[4909]: I0202 11:44:04.503482 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39e0f570-77dc-4b88-a2b4-bc213f0fae96-utilities\") pod \"redhat-marketplace-4x4qs\" (UID: \"39e0f570-77dc-4b88-a2b4-bc213f0fae96\") " pod="openshift-marketplace/redhat-marketplace-4x4qs" Feb 02 11:44:04 crc kubenswrapper[4909]: I0202 11:44:04.503609 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqjm2\" (UniqueName: \"kubernetes.io/projected/39e0f570-77dc-4b88-a2b4-bc213f0fae96-kube-api-access-wqjm2\") pod \"redhat-marketplace-4x4qs\" (UID: \"39e0f570-77dc-4b88-a2b4-bc213f0fae96\") " pod="openshift-marketplace/redhat-marketplace-4x4qs" Feb 02 11:44:04 crc kubenswrapper[4909]: I0202 11:44:04.604495 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39e0f570-77dc-4b88-a2b4-bc213f0fae96-utilities\") pod \"redhat-marketplace-4x4qs\" (UID: \"39e0f570-77dc-4b88-a2b4-bc213f0fae96\") " pod="openshift-marketplace/redhat-marketplace-4x4qs" Feb 02 11:44:04 crc kubenswrapper[4909]: I0202 11:44:04.604617 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-wqjm2\" (UniqueName: \"kubernetes.io/projected/39e0f570-77dc-4b88-a2b4-bc213f0fae96-kube-api-access-wqjm2\") pod \"redhat-marketplace-4x4qs\" (UID: \"39e0f570-77dc-4b88-a2b4-bc213f0fae96\") " pod="openshift-marketplace/redhat-marketplace-4x4qs" Feb 02 11:44:04 crc kubenswrapper[4909]: I0202 11:44:04.604651 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39e0f570-77dc-4b88-a2b4-bc213f0fae96-catalog-content\") pod \"redhat-marketplace-4x4qs\" (UID: \"39e0f570-77dc-4b88-a2b4-bc213f0fae96\") " pod="openshift-marketplace/redhat-marketplace-4x4qs" Feb 02 11:44:04 crc kubenswrapper[4909]: I0202 11:44:04.605098 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39e0f570-77dc-4b88-a2b4-bc213f0fae96-catalog-content\") pod \"redhat-marketplace-4x4qs\" (UID: \"39e0f570-77dc-4b88-a2b4-bc213f0fae96\") " pod="openshift-marketplace/redhat-marketplace-4x4qs" Feb 02 11:44:04 crc kubenswrapper[4909]: I0202 11:44:04.605135 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39e0f570-77dc-4b88-a2b4-bc213f0fae96-utilities\") pod \"redhat-marketplace-4x4qs\" (UID: \"39e0f570-77dc-4b88-a2b4-bc213f0fae96\") " pod="openshift-marketplace/redhat-marketplace-4x4qs" Feb 02 11:44:04 crc kubenswrapper[4909]: I0202 11:44:04.653781 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqjm2\" (UniqueName: \"kubernetes.io/projected/39e0f570-77dc-4b88-a2b4-bc213f0fae96-kube-api-access-wqjm2\") pod \"redhat-marketplace-4x4qs\" (UID: \"39e0f570-77dc-4b88-a2b4-bc213f0fae96\") " pod="openshift-marketplace/redhat-marketplace-4x4qs" Feb 02 11:44:04 crc kubenswrapper[4909]: I0202 11:44:04.669050 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4x4qs" Feb 02 11:44:05 crc kubenswrapper[4909]: I0202 11:44:05.160037 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4x4qs"] Feb 02 11:44:05 crc kubenswrapper[4909]: I0202 11:44:05.369265 4909 generic.go:334] "Generic (PLEG): container finished" podID="39e0f570-77dc-4b88-a2b4-bc213f0fae96" containerID="080426c8ae9833ea53560b885a317c368d13416d74f49d2e3eb4638ede0d8cf4" exitCode=0 Feb 02 11:44:05 crc kubenswrapper[4909]: I0202 11:44:05.369360 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4x4qs" event={"ID":"39e0f570-77dc-4b88-a2b4-bc213f0fae96","Type":"ContainerDied","Data":"080426c8ae9833ea53560b885a317c368d13416d74f49d2e3eb4638ede0d8cf4"} Feb 02 11:44:05 crc kubenswrapper[4909]: I0202 11:44:05.369557 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4x4qs" event={"ID":"39e0f570-77dc-4b88-a2b4-bc213f0fae96","Type":"ContainerStarted","Data":"910627d8d7d1f29792f033440e2a1195b41c0e94c257871ba1a824317ddda020"} Feb 02 11:44:05 crc kubenswrapper[4909]: I0202 11:44:05.371142 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 11:44:07 crc kubenswrapper[4909]: I0202 11:44:07.386895 4909 generic.go:334] "Generic (PLEG): container finished" podID="39e0f570-77dc-4b88-a2b4-bc213f0fae96" containerID="34c6c58060adbcca75775cfa9db0c47dde4b82597ad4663cc26e6999040178d4" exitCode=0 Feb 02 11:44:07 crc kubenswrapper[4909]: I0202 11:44:07.386946 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4x4qs" event={"ID":"39e0f570-77dc-4b88-a2b4-bc213f0fae96","Type":"ContainerDied","Data":"34c6c58060adbcca75775cfa9db0c47dde4b82597ad4663cc26e6999040178d4"} Feb 02 11:44:13 crc kubenswrapper[4909]: I0202 11:44:13.016981 4909 scope.go:117] "RemoveContainer" 
containerID="c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520" Feb 02 11:44:13 crc kubenswrapper[4909]: E0202 11:44:13.017583 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:44:13 crc kubenswrapper[4909]: I0202 11:44:13.442623 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4x4qs" event={"ID":"39e0f570-77dc-4b88-a2b4-bc213f0fae96","Type":"ContainerStarted","Data":"fe865a0b70534aa385cbaf1c22f44642d6074f721946ca7609702ef331c7f77e"} Feb 02 11:44:13 crc kubenswrapper[4909]: I0202 11:44:13.462558 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4x4qs" podStartSLOduration=2.536014065 podStartE2EDuration="9.462540284s" podCreationTimestamp="2026-02-02 11:44:04 +0000 UTC" firstStartedPulling="2026-02-02 11:44:05.370932707 +0000 UTC m=+4371.117033442" lastFinishedPulling="2026-02-02 11:44:12.297458916 +0000 UTC m=+4378.043559661" observedRunningTime="2026-02-02 11:44:13.458444879 +0000 UTC m=+4379.204545614" watchObservedRunningTime="2026-02-02 11:44:13.462540284 +0000 UTC m=+4379.208641019" Feb 02 11:44:14 crc kubenswrapper[4909]: I0202 11:44:14.670063 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4x4qs" Feb 02 11:44:14 crc kubenswrapper[4909]: I0202 11:44:14.670127 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4x4qs" Feb 02 11:44:14 crc kubenswrapper[4909]: I0202 11:44:14.713197 4909 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4x4qs" Feb 02 11:44:24 crc kubenswrapper[4909]: I0202 11:44:24.711603 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4x4qs" Feb 02 11:44:24 crc kubenswrapper[4909]: I0202 11:44:24.757525 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4x4qs"] Feb 02 11:44:25 crc kubenswrapper[4909]: I0202 11:44:25.518575 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4x4qs" podUID="39e0f570-77dc-4b88-a2b4-bc213f0fae96" containerName="registry-server" containerID="cri-o://fe865a0b70534aa385cbaf1c22f44642d6074f721946ca7609702ef331c7f77e" gracePeriod=2 Feb 02 11:44:25 crc kubenswrapper[4909]: I0202 11:44:25.888007 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4x4qs" Feb 02 11:44:26 crc kubenswrapper[4909]: I0202 11:44:26.001823 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqjm2\" (UniqueName: \"kubernetes.io/projected/39e0f570-77dc-4b88-a2b4-bc213f0fae96-kube-api-access-wqjm2\") pod \"39e0f570-77dc-4b88-a2b4-bc213f0fae96\" (UID: \"39e0f570-77dc-4b88-a2b4-bc213f0fae96\") " Feb 02 11:44:26 crc kubenswrapper[4909]: I0202 11:44:26.001941 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39e0f570-77dc-4b88-a2b4-bc213f0fae96-utilities\") pod \"39e0f570-77dc-4b88-a2b4-bc213f0fae96\" (UID: \"39e0f570-77dc-4b88-a2b4-bc213f0fae96\") " Feb 02 11:44:26 crc kubenswrapper[4909]: I0202 11:44:26.002062 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/39e0f570-77dc-4b88-a2b4-bc213f0fae96-catalog-content\") pod \"39e0f570-77dc-4b88-a2b4-bc213f0fae96\" (UID: \"39e0f570-77dc-4b88-a2b4-bc213f0fae96\") " Feb 02 11:44:26 crc kubenswrapper[4909]: I0202 11:44:26.002967 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39e0f570-77dc-4b88-a2b4-bc213f0fae96-utilities" (OuterVolumeSpecName: "utilities") pod "39e0f570-77dc-4b88-a2b4-bc213f0fae96" (UID: "39e0f570-77dc-4b88-a2b4-bc213f0fae96"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:44:26 crc kubenswrapper[4909]: I0202 11:44:26.009037 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39e0f570-77dc-4b88-a2b4-bc213f0fae96-kube-api-access-wqjm2" (OuterVolumeSpecName: "kube-api-access-wqjm2") pod "39e0f570-77dc-4b88-a2b4-bc213f0fae96" (UID: "39e0f570-77dc-4b88-a2b4-bc213f0fae96"). InnerVolumeSpecName "kube-api-access-wqjm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:44:26 crc kubenswrapper[4909]: I0202 11:44:26.029691 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39e0f570-77dc-4b88-a2b4-bc213f0fae96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "39e0f570-77dc-4b88-a2b4-bc213f0fae96" (UID: "39e0f570-77dc-4b88-a2b4-bc213f0fae96"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:44:26 crc kubenswrapper[4909]: I0202 11:44:26.103611 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/39e0f570-77dc-4b88-a2b4-bc213f0fae96-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:44:26 crc kubenswrapper[4909]: I0202 11:44:26.103654 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqjm2\" (UniqueName: \"kubernetes.io/projected/39e0f570-77dc-4b88-a2b4-bc213f0fae96-kube-api-access-wqjm2\") on node \"crc\" DevicePath \"\"" Feb 02 11:44:26 crc kubenswrapper[4909]: I0202 11:44:26.103670 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/39e0f570-77dc-4b88-a2b4-bc213f0fae96-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:44:26 crc kubenswrapper[4909]: I0202 11:44:26.530969 4909 generic.go:334] "Generic (PLEG): container finished" podID="39e0f570-77dc-4b88-a2b4-bc213f0fae96" containerID="fe865a0b70534aa385cbaf1c22f44642d6074f721946ca7609702ef331c7f77e" exitCode=0 Feb 02 11:44:26 crc kubenswrapper[4909]: I0202 11:44:26.531032 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4x4qs" event={"ID":"39e0f570-77dc-4b88-a2b4-bc213f0fae96","Type":"ContainerDied","Data":"fe865a0b70534aa385cbaf1c22f44642d6074f721946ca7609702ef331c7f77e"} Feb 02 11:44:26 crc kubenswrapper[4909]: I0202 11:44:26.531071 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4x4qs" event={"ID":"39e0f570-77dc-4b88-a2b4-bc213f0fae96","Type":"ContainerDied","Data":"910627d8d7d1f29792f033440e2a1195b41c0e94c257871ba1a824317ddda020"} Feb 02 11:44:26 crc kubenswrapper[4909]: I0202 11:44:26.531096 4909 scope.go:117] "RemoveContainer" containerID="fe865a0b70534aa385cbaf1c22f44642d6074f721946ca7609702ef331c7f77e" Feb 02 11:44:26 crc kubenswrapper[4909]: I0202 
11:44:26.531299 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4x4qs" Feb 02 11:44:26 crc kubenswrapper[4909]: I0202 11:44:26.562759 4909 scope.go:117] "RemoveContainer" containerID="34c6c58060adbcca75775cfa9db0c47dde4b82597ad4663cc26e6999040178d4" Feb 02 11:44:26 crc kubenswrapper[4909]: I0202 11:44:26.569433 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4x4qs"] Feb 02 11:44:26 crc kubenswrapper[4909]: I0202 11:44:26.577482 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4x4qs"] Feb 02 11:44:26 crc kubenswrapper[4909]: I0202 11:44:26.587908 4909 scope.go:117] "RemoveContainer" containerID="080426c8ae9833ea53560b885a317c368d13416d74f49d2e3eb4638ede0d8cf4" Feb 02 11:44:26 crc kubenswrapper[4909]: I0202 11:44:26.614133 4909 scope.go:117] "RemoveContainer" containerID="fe865a0b70534aa385cbaf1c22f44642d6074f721946ca7609702ef331c7f77e" Feb 02 11:44:26 crc kubenswrapper[4909]: E0202 11:44:26.614716 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe865a0b70534aa385cbaf1c22f44642d6074f721946ca7609702ef331c7f77e\": container with ID starting with fe865a0b70534aa385cbaf1c22f44642d6074f721946ca7609702ef331c7f77e not found: ID does not exist" containerID="fe865a0b70534aa385cbaf1c22f44642d6074f721946ca7609702ef331c7f77e" Feb 02 11:44:26 crc kubenswrapper[4909]: I0202 11:44:26.614782 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe865a0b70534aa385cbaf1c22f44642d6074f721946ca7609702ef331c7f77e"} err="failed to get container status \"fe865a0b70534aa385cbaf1c22f44642d6074f721946ca7609702ef331c7f77e\": rpc error: code = NotFound desc = could not find container \"fe865a0b70534aa385cbaf1c22f44642d6074f721946ca7609702ef331c7f77e\": container with ID starting with 
fe865a0b70534aa385cbaf1c22f44642d6074f721946ca7609702ef331c7f77e not found: ID does not exist" Feb 02 11:44:26 crc kubenswrapper[4909]: I0202 11:44:26.614837 4909 scope.go:117] "RemoveContainer" containerID="34c6c58060adbcca75775cfa9db0c47dde4b82597ad4663cc26e6999040178d4" Feb 02 11:44:26 crc kubenswrapper[4909]: E0202 11:44:26.615307 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34c6c58060adbcca75775cfa9db0c47dde4b82597ad4663cc26e6999040178d4\": container with ID starting with 34c6c58060adbcca75775cfa9db0c47dde4b82597ad4663cc26e6999040178d4 not found: ID does not exist" containerID="34c6c58060adbcca75775cfa9db0c47dde4b82597ad4663cc26e6999040178d4" Feb 02 11:44:26 crc kubenswrapper[4909]: I0202 11:44:26.615348 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34c6c58060adbcca75775cfa9db0c47dde4b82597ad4663cc26e6999040178d4"} err="failed to get container status \"34c6c58060adbcca75775cfa9db0c47dde4b82597ad4663cc26e6999040178d4\": rpc error: code = NotFound desc = could not find container \"34c6c58060adbcca75775cfa9db0c47dde4b82597ad4663cc26e6999040178d4\": container with ID starting with 34c6c58060adbcca75775cfa9db0c47dde4b82597ad4663cc26e6999040178d4 not found: ID does not exist" Feb 02 11:44:26 crc kubenswrapper[4909]: I0202 11:44:26.615378 4909 scope.go:117] "RemoveContainer" containerID="080426c8ae9833ea53560b885a317c368d13416d74f49d2e3eb4638ede0d8cf4" Feb 02 11:44:26 crc kubenswrapper[4909]: E0202 11:44:26.615685 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"080426c8ae9833ea53560b885a317c368d13416d74f49d2e3eb4638ede0d8cf4\": container with ID starting with 080426c8ae9833ea53560b885a317c368d13416d74f49d2e3eb4638ede0d8cf4 not found: ID does not exist" containerID="080426c8ae9833ea53560b885a317c368d13416d74f49d2e3eb4638ede0d8cf4" Feb 02 11:44:26 crc 
kubenswrapper[4909]: I0202 11:44:26.615726 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"080426c8ae9833ea53560b885a317c368d13416d74f49d2e3eb4638ede0d8cf4"} err="failed to get container status \"080426c8ae9833ea53560b885a317c368d13416d74f49d2e3eb4638ede0d8cf4\": rpc error: code = NotFound desc = could not find container \"080426c8ae9833ea53560b885a317c368d13416d74f49d2e3eb4638ede0d8cf4\": container with ID starting with 080426c8ae9833ea53560b885a317c368d13416d74f49d2e3eb4638ede0d8cf4 not found: ID does not exist" Feb 02 11:44:27 crc kubenswrapper[4909]: I0202 11:44:27.024998 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39e0f570-77dc-4b88-a2b4-bc213f0fae96" path="/var/lib/kubelet/pods/39e0f570-77dc-4b88-a2b4-bc213f0fae96/volumes" Feb 02 11:44:28 crc kubenswrapper[4909]: I0202 11:44:28.016711 4909 scope.go:117] "RemoveContainer" containerID="c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520" Feb 02 11:44:28 crc kubenswrapper[4909]: E0202 11:44:28.017423 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:44:43 crc kubenswrapper[4909]: I0202 11:44:43.016237 4909 scope.go:117] "RemoveContainer" containerID="c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520" Feb 02 11:44:43 crc kubenswrapper[4909]: E0202 11:44:43.017031 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:44:54 crc kubenswrapper[4909]: I0202 11:44:54.016674 4909 scope.go:117] "RemoveContainer" containerID="c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520" Feb 02 11:44:54 crc kubenswrapper[4909]: E0202 11:44:54.017460 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:45:00 crc kubenswrapper[4909]: I0202 11:45:00.176772 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500545-8tm2p"] Feb 02 11:45:00 crc kubenswrapper[4909]: E0202 11:45:00.177664 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39e0f570-77dc-4b88-a2b4-bc213f0fae96" containerName="extract-content" Feb 02 11:45:00 crc kubenswrapper[4909]: I0202 11:45:00.177678 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="39e0f570-77dc-4b88-a2b4-bc213f0fae96" containerName="extract-content" Feb 02 11:45:00 crc kubenswrapper[4909]: E0202 11:45:00.177707 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39e0f570-77dc-4b88-a2b4-bc213f0fae96" containerName="extract-utilities" Feb 02 11:45:00 crc kubenswrapper[4909]: I0202 11:45:00.177714 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="39e0f570-77dc-4b88-a2b4-bc213f0fae96" containerName="extract-utilities" Feb 02 11:45:00 crc kubenswrapper[4909]: E0202 11:45:00.177730 4909 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="39e0f570-77dc-4b88-a2b4-bc213f0fae96" containerName="registry-server" Feb 02 11:45:00 crc kubenswrapper[4909]: I0202 11:45:00.177737 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="39e0f570-77dc-4b88-a2b4-bc213f0fae96" containerName="registry-server" Feb 02 11:45:00 crc kubenswrapper[4909]: I0202 11:45:00.177907 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="39e0f570-77dc-4b88-a2b4-bc213f0fae96" containerName="registry-server" Feb 02 11:45:00 crc kubenswrapper[4909]: I0202 11:45:00.178400 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-8tm2p" Feb 02 11:45:00 crc kubenswrapper[4909]: I0202 11:45:00.181120 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 02 11:45:00 crc kubenswrapper[4909]: I0202 11:45:00.183197 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 02 11:45:00 crc kubenswrapper[4909]: I0202 11:45:00.193053 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500545-8tm2p"] Feb 02 11:45:00 crc kubenswrapper[4909]: I0202 11:45:00.276530 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bae9c104-1064-429b-b67a-2b6fca33d38c-config-volume\") pod \"collect-profiles-29500545-8tm2p\" (UID: \"bae9c104-1064-429b-b67a-2b6fca33d38c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-8tm2p" Feb 02 11:45:00 crc kubenswrapper[4909]: I0202 11:45:00.276593 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/bae9c104-1064-429b-b67a-2b6fca33d38c-secret-volume\") pod \"collect-profiles-29500545-8tm2p\" (UID: \"bae9c104-1064-429b-b67a-2b6fca33d38c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-8tm2p" Feb 02 11:45:00 crc kubenswrapper[4909]: I0202 11:45:00.276645 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npzf5\" (UniqueName: \"kubernetes.io/projected/bae9c104-1064-429b-b67a-2b6fca33d38c-kube-api-access-npzf5\") pod \"collect-profiles-29500545-8tm2p\" (UID: \"bae9c104-1064-429b-b67a-2b6fca33d38c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-8tm2p" Feb 02 11:45:00 crc kubenswrapper[4909]: I0202 11:45:00.378135 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bae9c104-1064-429b-b67a-2b6fca33d38c-secret-volume\") pod \"collect-profiles-29500545-8tm2p\" (UID: \"bae9c104-1064-429b-b67a-2b6fca33d38c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-8tm2p" Feb 02 11:45:00 crc kubenswrapper[4909]: I0202 11:45:00.378444 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npzf5\" (UniqueName: \"kubernetes.io/projected/bae9c104-1064-429b-b67a-2b6fca33d38c-kube-api-access-npzf5\") pod \"collect-profiles-29500545-8tm2p\" (UID: \"bae9c104-1064-429b-b67a-2b6fca33d38c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-8tm2p" Feb 02 11:45:00 crc kubenswrapper[4909]: I0202 11:45:00.378598 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bae9c104-1064-429b-b67a-2b6fca33d38c-config-volume\") pod \"collect-profiles-29500545-8tm2p\" (UID: \"bae9c104-1064-429b-b67a-2b6fca33d38c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-8tm2p" Feb 02 11:45:00 crc 
kubenswrapper[4909]: I0202 11:45:00.379470 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bae9c104-1064-429b-b67a-2b6fca33d38c-config-volume\") pod \"collect-profiles-29500545-8tm2p\" (UID: \"bae9c104-1064-429b-b67a-2b6fca33d38c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-8tm2p" Feb 02 11:45:00 crc kubenswrapper[4909]: I0202 11:45:00.384481 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bae9c104-1064-429b-b67a-2b6fca33d38c-secret-volume\") pod \"collect-profiles-29500545-8tm2p\" (UID: \"bae9c104-1064-429b-b67a-2b6fca33d38c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-8tm2p" Feb 02 11:45:00 crc kubenswrapper[4909]: I0202 11:45:00.395631 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npzf5\" (UniqueName: \"kubernetes.io/projected/bae9c104-1064-429b-b67a-2b6fca33d38c-kube-api-access-npzf5\") pod \"collect-profiles-29500545-8tm2p\" (UID: \"bae9c104-1064-429b-b67a-2b6fca33d38c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-8tm2p" Feb 02 11:45:00 crc kubenswrapper[4909]: I0202 11:45:00.530337 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-8tm2p" Feb 02 11:45:00 crc kubenswrapper[4909]: I0202 11:45:00.730748 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500545-8tm2p"] Feb 02 11:45:00 crc kubenswrapper[4909]: I0202 11:45:00.744902 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-8tm2p" event={"ID":"bae9c104-1064-429b-b67a-2b6fca33d38c","Type":"ContainerStarted","Data":"5e61fb0df1a19680b5baa90f0dec55befb0e7204ea1d75ac948c631787dee5f4"} Feb 02 11:45:01 crc kubenswrapper[4909]: I0202 11:45:01.753297 4909 generic.go:334] "Generic (PLEG): container finished" podID="bae9c104-1064-429b-b67a-2b6fca33d38c" containerID="dcc5795e21b5e3c7234f9ac6df8bd69227be7ed5bf67eb183dda28cde0d54e14" exitCode=0 Feb 02 11:45:01 crc kubenswrapper[4909]: I0202 11:45:01.753371 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-8tm2p" event={"ID":"bae9c104-1064-429b-b67a-2b6fca33d38c","Type":"ContainerDied","Data":"dcc5795e21b5e3c7234f9ac6df8bd69227be7ed5bf67eb183dda28cde0d54e14"} Feb 02 11:45:03 crc kubenswrapper[4909]: I0202 11:45:03.041283 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-8tm2p" Feb 02 11:45:03 crc kubenswrapper[4909]: I0202 11:45:03.122601 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npzf5\" (UniqueName: \"kubernetes.io/projected/bae9c104-1064-429b-b67a-2b6fca33d38c-kube-api-access-npzf5\") pod \"bae9c104-1064-429b-b67a-2b6fca33d38c\" (UID: \"bae9c104-1064-429b-b67a-2b6fca33d38c\") " Feb 02 11:45:03 crc kubenswrapper[4909]: I0202 11:45:03.122683 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bae9c104-1064-429b-b67a-2b6fca33d38c-secret-volume\") pod \"bae9c104-1064-429b-b67a-2b6fca33d38c\" (UID: \"bae9c104-1064-429b-b67a-2b6fca33d38c\") " Feb 02 11:45:03 crc kubenswrapper[4909]: I0202 11:45:03.122825 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bae9c104-1064-429b-b67a-2b6fca33d38c-config-volume\") pod \"bae9c104-1064-429b-b67a-2b6fca33d38c\" (UID: \"bae9c104-1064-429b-b67a-2b6fca33d38c\") " Feb 02 11:45:03 crc kubenswrapper[4909]: I0202 11:45:03.123825 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bae9c104-1064-429b-b67a-2b6fca33d38c-config-volume" (OuterVolumeSpecName: "config-volume") pod "bae9c104-1064-429b-b67a-2b6fca33d38c" (UID: "bae9c104-1064-429b-b67a-2b6fca33d38c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:45:03 crc kubenswrapper[4909]: I0202 11:45:03.128233 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bae9c104-1064-429b-b67a-2b6fca33d38c-kube-api-access-npzf5" (OuterVolumeSpecName: "kube-api-access-npzf5") pod "bae9c104-1064-429b-b67a-2b6fca33d38c" (UID: "bae9c104-1064-429b-b67a-2b6fca33d38c"). 
InnerVolumeSpecName "kube-api-access-npzf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:45:03 crc kubenswrapper[4909]: I0202 11:45:03.128319 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bae9c104-1064-429b-b67a-2b6fca33d38c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bae9c104-1064-429b-b67a-2b6fca33d38c" (UID: "bae9c104-1064-429b-b67a-2b6fca33d38c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:45:03 crc kubenswrapper[4909]: I0202 11:45:03.225039 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bae9c104-1064-429b-b67a-2b6fca33d38c-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 11:45:03 crc kubenswrapper[4909]: I0202 11:45:03.225324 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npzf5\" (UniqueName: \"kubernetes.io/projected/bae9c104-1064-429b-b67a-2b6fca33d38c-kube-api-access-npzf5\") on node \"crc\" DevicePath \"\"" Feb 02 11:45:03 crc kubenswrapper[4909]: I0202 11:45:03.225338 4909 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bae9c104-1064-429b-b67a-2b6fca33d38c-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 02 11:45:03 crc kubenswrapper[4909]: I0202 11:45:03.771271 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-8tm2p" event={"ID":"bae9c104-1064-429b-b67a-2b6fca33d38c","Type":"ContainerDied","Data":"5e61fb0df1a19680b5baa90f0dec55befb0e7204ea1d75ac948c631787dee5f4"} Feb 02 11:45:03 crc kubenswrapper[4909]: I0202 11:45:03.771316 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e61fb0df1a19680b5baa90f0dec55befb0e7204ea1d75ac948c631787dee5f4" Feb 02 11:45:03 crc kubenswrapper[4909]: I0202 11:45:03.771380 4909 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500545-8tm2p" Feb 02 11:45:04 crc kubenswrapper[4909]: I0202 11:45:04.120703 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500500-dr6ld"] Feb 02 11:45:04 crc kubenswrapper[4909]: I0202 11:45:04.127404 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500500-dr6ld"] Feb 02 11:45:05 crc kubenswrapper[4909]: I0202 11:45:05.025288 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5908b427-7fb4-4548-bb65-68c1f0f2ec7e" path="/var/lib/kubelet/pods/5908b427-7fb4-4548-bb65-68c1f0f2ec7e/volumes" Feb 02 11:45:06 crc kubenswrapper[4909]: I0202 11:45:06.016656 4909 scope.go:117] "RemoveContainer" containerID="c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520" Feb 02 11:45:06 crc kubenswrapper[4909]: E0202 11:45:06.016934 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:45:20 crc kubenswrapper[4909]: I0202 11:45:20.192174 4909 scope.go:117] "RemoveContainer" containerID="de856560113eeb3fb23d80f25fdd30eb5948e9ee2d5c213f010b5d04ef2e8877" Feb 02 11:45:21 crc kubenswrapper[4909]: I0202 11:45:21.017160 4909 scope.go:117] "RemoveContainer" containerID="c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520" Feb 02 11:45:21 crc kubenswrapper[4909]: E0202 11:45:21.017658 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:45:33 crc kubenswrapper[4909]: I0202 11:45:33.017603 4909 scope.go:117] "RemoveContainer" containerID="c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520" Feb 02 11:45:33 crc kubenswrapper[4909]: E0202 11:45:33.018311 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:45:46 crc kubenswrapper[4909]: I0202 11:45:46.016284 4909 scope.go:117] "RemoveContainer" containerID="c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520" Feb 02 11:45:46 crc kubenswrapper[4909]: E0202 11:45:46.017198 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:45:58 crc kubenswrapper[4909]: I0202 11:45:58.017348 4909 scope.go:117] "RemoveContainer" containerID="c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520" Feb 02 11:45:58 crc kubenswrapper[4909]: E0202 11:45:58.018758 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:46:13 crc kubenswrapper[4909]: I0202 11:46:13.017631 4909 scope.go:117] "RemoveContainer" containerID="c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520" Feb 02 11:46:13 crc kubenswrapper[4909]: E0202 11:46:13.018586 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:46:25 crc kubenswrapper[4909]: I0202 11:46:25.020257 4909 scope.go:117] "RemoveContainer" containerID="c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520" Feb 02 11:46:25 crc kubenswrapper[4909]: E0202 11:46:25.021010 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:46:39 crc kubenswrapper[4909]: I0202 11:46:39.017115 4909 scope.go:117] "RemoveContainer" containerID="c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520" Feb 02 11:46:39 crc kubenswrapper[4909]: E0202 11:46:39.017788 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:46:51 crc kubenswrapper[4909]: I0202 11:46:51.016837 4909 scope.go:117] "RemoveContainer" containerID="c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520" Feb 02 11:46:51 crc kubenswrapper[4909]: E0202 11:46:51.017444 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:47:02 crc kubenswrapper[4909]: I0202 11:47:02.016224 4909 scope.go:117] "RemoveContainer" containerID="c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520" Feb 02 11:47:02 crc kubenswrapper[4909]: E0202 11:47:02.016966 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:47:13 crc kubenswrapper[4909]: I0202 11:47:13.017350 4909 scope.go:117] "RemoveContainer" containerID="c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520" Feb 02 11:47:13 crc kubenswrapper[4909]: E0202 11:47:13.018060 4909 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:47:27 crc kubenswrapper[4909]: I0202 11:47:27.016366 4909 scope.go:117] "RemoveContainer" containerID="c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520" Feb 02 11:47:27 crc kubenswrapper[4909]: E0202 11:47:27.017234 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:47:42 crc kubenswrapper[4909]: I0202 11:47:42.016436 4909 scope.go:117] "RemoveContainer" containerID="c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520" Feb 02 11:47:42 crc kubenswrapper[4909]: E0202 11:47:42.017365 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:47:56 crc kubenswrapper[4909]: I0202 11:47:56.016485 4909 scope.go:117] "RemoveContainer" containerID="c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520" Feb 02 11:47:57 crc kubenswrapper[4909]: I0202 11:47:57.044828 4909 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"38ee1278ab1a7ba78b053e71d2e465c14550aab95c120841b2bbca3648b7d0d2"} Feb 02 11:48:06 crc kubenswrapper[4909]: I0202 11:48:06.887686 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-898q2"] Feb 02 11:48:06 crc kubenswrapper[4909]: E0202 11:48:06.888605 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bae9c104-1064-429b-b67a-2b6fca33d38c" containerName="collect-profiles" Feb 02 11:48:06 crc kubenswrapper[4909]: I0202 11:48:06.888620 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bae9c104-1064-429b-b67a-2b6fca33d38c" containerName="collect-profiles" Feb 02 11:48:06 crc kubenswrapper[4909]: I0202 11:48:06.888794 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="bae9c104-1064-429b-b67a-2b6fca33d38c" containerName="collect-profiles" Feb 02 11:48:06 crc kubenswrapper[4909]: I0202 11:48:06.891215 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-898q2" Feb 02 11:48:06 crc kubenswrapper[4909]: I0202 11:48:06.904385 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-898q2"] Feb 02 11:48:07 crc kubenswrapper[4909]: I0202 11:48:07.082766 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5859e252-e453-4f14-996f-47f543b77cad-utilities\") pod \"redhat-operators-898q2\" (UID: \"5859e252-e453-4f14-996f-47f543b77cad\") " pod="openshift-marketplace/redhat-operators-898q2" Feb 02 11:48:07 crc kubenswrapper[4909]: I0202 11:48:07.082851 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kxck\" (UniqueName: \"kubernetes.io/projected/5859e252-e453-4f14-996f-47f543b77cad-kube-api-access-5kxck\") pod \"redhat-operators-898q2\" (UID: \"5859e252-e453-4f14-996f-47f543b77cad\") " pod="openshift-marketplace/redhat-operators-898q2" Feb 02 11:48:07 crc kubenswrapper[4909]: I0202 11:48:07.082874 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5859e252-e453-4f14-996f-47f543b77cad-catalog-content\") pod \"redhat-operators-898q2\" (UID: \"5859e252-e453-4f14-996f-47f543b77cad\") " pod="openshift-marketplace/redhat-operators-898q2" Feb 02 11:48:07 crc kubenswrapper[4909]: I0202 11:48:07.184623 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5859e252-e453-4f14-996f-47f543b77cad-utilities\") pod \"redhat-operators-898q2\" (UID: \"5859e252-e453-4f14-996f-47f543b77cad\") " pod="openshift-marketplace/redhat-operators-898q2" Feb 02 11:48:07 crc kubenswrapper[4909]: I0202 11:48:07.184680 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5kxck\" (UniqueName: \"kubernetes.io/projected/5859e252-e453-4f14-996f-47f543b77cad-kube-api-access-5kxck\") pod \"redhat-operators-898q2\" (UID: \"5859e252-e453-4f14-996f-47f543b77cad\") " pod="openshift-marketplace/redhat-operators-898q2" Feb 02 11:48:07 crc kubenswrapper[4909]: I0202 11:48:07.184703 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5859e252-e453-4f14-996f-47f543b77cad-catalog-content\") pod \"redhat-operators-898q2\" (UID: \"5859e252-e453-4f14-996f-47f543b77cad\") " pod="openshift-marketplace/redhat-operators-898q2" Feb 02 11:48:07 crc kubenswrapper[4909]: I0202 11:48:07.185505 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5859e252-e453-4f14-996f-47f543b77cad-catalog-content\") pod \"redhat-operators-898q2\" (UID: \"5859e252-e453-4f14-996f-47f543b77cad\") " pod="openshift-marketplace/redhat-operators-898q2" Feb 02 11:48:07 crc kubenswrapper[4909]: I0202 11:48:07.185551 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5859e252-e453-4f14-996f-47f543b77cad-utilities\") pod \"redhat-operators-898q2\" (UID: \"5859e252-e453-4f14-996f-47f543b77cad\") " pod="openshift-marketplace/redhat-operators-898q2" Feb 02 11:48:07 crc kubenswrapper[4909]: I0202 11:48:07.206187 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kxck\" (UniqueName: \"kubernetes.io/projected/5859e252-e453-4f14-996f-47f543b77cad-kube-api-access-5kxck\") pod \"redhat-operators-898q2\" (UID: \"5859e252-e453-4f14-996f-47f543b77cad\") " pod="openshift-marketplace/redhat-operators-898q2" Feb 02 11:48:07 crc kubenswrapper[4909]: I0202 11:48:07.217435 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-898q2" Feb 02 11:48:07 crc kubenswrapper[4909]: I0202 11:48:07.685940 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-898q2"] Feb 02 11:48:08 crc kubenswrapper[4909]: I0202 11:48:08.116648 4909 generic.go:334] "Generic (PLEG): container finished" podID="5859e252-e453-4f14-996f-47f543b77cad" containerID="65edc26efac27f1b750f4c4d83cfac32201a7322690fd193f59543a06fed4cbb" exitCode=0 Feb 02 11:48:08 crc kubenswrapper[4909]: I0202 11:48:08.116691 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-898q2" event={"ID":"5859e252-e453-4f14-996f-47f543b77cad","Type":"ContainerDied","Data":"65edc26efac27f1b750f4c4d83cfac32201a7322690fd193f59543a06fed4cbb"} Feb 02 11:48:08 crc kubenswrapper[4909]: I0202 11:48:08.116719 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-898q2" event={"ID":"5859e252-e453-4f14-996f-47f543b77cad","Type":"ContainerStarted","Data":"62d3c883c03c1a980442ec6b73113bc3aad89e8628c22bb9b96f44031a9a4e26"} Feb 02 11:48:09 crc kubenswrapper[4909]: I0202 11:48:09.141389 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-898q2" event={"ID":"5859e252-e453-4f14-996f-47f543b77cad","Type":"ContainerStarted","Data":"27f9597db346cc67fe8b08484999fb4cccf18ec493ab1ca2fb8cc58efd417b0c"} Feb 02 11:48:10 crc kubenswrapper[4909]: I0202 11:48:10.148930 4909 generic.go:334] "Generic (PLEG): container finished" podID="5859e252-e453-4f14-996f-47f543b77cad" containerID="27f9597db346cc67fe8b08484999fb4cccf18ec493ab1ca2fb8cc58efd417b0c" exitCode=0 Feb 02 11:48:10 crc kubenswrapper[4909]: I0202 11:48:10.148971 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-898q2" 
event={"ID":"5859e252-e453-4f14-996f-47f543b77cad","Type":"ContainerDied","Data":"27f9597db346cc67fe8b08484999fb4cccf18ec493ab1ca2fb8cc58efd417b0c"} Feb 02 11:48:11 crc kubenswrapper[4909]: I0202 11:48:11.157900 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-898q2" event={"ID":"5859e252-e453-4f14-996f-47f543b77cad","Type":"ContainerStarted","Data":"6da8681a2c13a8b8b0240b37d45e71378697a4989dc503b42962969e9ef852ff"} Feb 02 11:48:11 crc kubenswrapper[4909]: I0202 11:48:11.179033 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-898q2" podStartSLOduration=2.717720672 podStartE2EDuration="5.179013243s" podCreationTimestamp="2026-02-02 11:48:06 +0000 UTC" firstStartedPulling="2026-02-02 11:48:08.11800793 +0000 UTC m=+4613.864108665" lastFinishedPulling="2026-02-02 11:48:10.579300491 +0000 UTC m=+4616.325401236" observedRunningTime="2026-02-02 11:48:11.175914036 +0000 UTC m=+4616.922014791" watchObservedRunningTime="2026-02-02 11:48:11.179013243 +0000 UTC m=+4616.925113978" Feb 02 11:48:17 crc kubenswrapper[4909]: I0202 11:48:17.218482 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-898q2" Feb 02 11:48:17 crc kubenswrapper[4909]: I0202 11:48:17.219264 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-898q2" Feb 02 11:48:17 crc kubenswrapper[4909]: I0202 11:48:17.263757 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-898q2" Feb 02 11:48:18 crc kubenswrapper[4909]: I0202 11:48:18.248943 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-898q2" Feb 02 11:48:18 crc kubenswrapper[4909]: I0202 11:48:18.300102 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-898q2"] Feb 02 11:48:20 crc kubenswrapper[4909]: I0202 11:48:20.222598 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-898q2" podUID="5859e252-e453-4f14-996f-47f543b77cad" containerName="registry-server" containerID="cri-o://6da8681a2c13a8b8b0240b37d45e71378697a4989dc503b42962969e9ef852ff" gracePeriod=2 Feb 02 11:48:21 crc kubenswrapper[4909]: I0202 11:48:21.085842 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-898q2" Feb 02 11:48:21 crc kubenswrapper[4909]: I0202 11:48:21.203999 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5859e252-e453-4f14-996f-47f543b77cad-catalog-content\") pod \"5859e252-e453-4f14-996f-47f543b77cad\" (UID: \"5859e252-e453-4f14-996f-47f543b77cad\") " Feb 02 11:48:21 crc kubenswrapper[4909]: I0202 11:48:21.204054 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kxck\" (UniqueName: \"kubernetes.io/projected/5859e252-e453-4f14-996f-47f543b77cad-kube-api-access-5kxck\") pod \"5859e252-e453-4f14-996f-47f543b77cad\" (UID: \"5859e252-e453-4f14-996f-47f543b77cad\") " Feb 02 11:48:21 crc kubenswrapper[4909]: I0202 11:48:21.204125 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5859e252-e453-4f14-996f-47f543b77cad-utilities\") pod \"5859e252-e453-4f14-996f-47f543b77cad\" (UID: \"5859e252-e453-4f14-996f-47f543b77cad\") " Feb 02 11:48:21 crc kubenswrapper[4909]: I0202 11:48:21.205031 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5859e252-e453-4f14-996f-47f543b77cad-utilities" (OuterVolumeSpecName: "utilities") pod "5859e252-e453-4f14-996f-47f543b77cad" (UID: 
"5859e252-e453-4f14-996f-47f543b77cad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:48:21 crc kubenswrapper[4909]: I0202 11:48:21.215633 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5859e252-e453-4f14-996f-47f543b77cad-kube-api-access-5kxck" (OuterVolumeSpecName: "kube-api-access-5kxck") pod "5859e252-e453-4f14-996f-47f543b77cad" (UID: "5859e252-e453-4f14-996f-47f543b77cad"). InnerVolumeSpecName "kube-api-access-5kxck". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:48:21 crc kubenswrapper[4909]: I0202 11:48:21.231016 4909 generic.go:334] "Generic (PLEG): container finished" podID="5859e252-e453-4f14-996f-47f543b77cad" containerID="6da8681a2c13a8b8b0240b37d45e71378697a4989dc503b42962969e9ef852ff" exitCode=0 Feb 02 11:48:21 crc kubenswrapper[4909]: I0202 11:48:21.231063 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-898q2" Feb 02 11:48:21 crc kubenswrapper[4909]: I0202 11:48:21.231066 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-898q2" event={"ID":"5859e252-e453-4f14-996f-47f543b77cad","Type":"ContainerDied","Data":"6da8681a2c13a8b8b0240b37d45e71378697a4989dc503b42962969e9ef852ff"} Feb 02 11:48:21 crc kubenswrapper[4909]: I0202 11:48:21.231103 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-898q2" event={"ID":"5859e252-e453-4f14-996f-47f543b77cad","Type":"ContainerDied","Data":"62d3c883c03c1a980442ec6b73113bc3aad89e8628c22bb9b96f44031a9a4e26"} Feb 02 11:48:21 crc kubenswrapper[4909]: I0202 11:48:21.231125 4909 scope.go:117] "RemoveContainer" containerID="6da8681a2c13a8b8b0240b37d45e71378697a4989dc503b42962969e9ef852ff" Feb 02 11:48:21 crc kubenswrapper[4909]: I0202 11:48:21.249655 4909 scope.go:117] "RemoveContainer" 
containerID="27f9597db346cc67fe8b08484999fb4cccf18ec493ab1ca2fb8cc58efd417b0c" Feb 02 11:48:21 crc kubenswrapper[4909]: I0202 11:48:21.280459 4909 scope.go:117] "RemoveContainer" containerID="65edc26efac27f1b750f4c4d83cfac32201a7322690fd193f59543a06fed4cbb" Feb 02 11:48:21 crc kubenswrapper[4909]: I0202 11:48:21.294966 4909 scope.go:117] "RemoveContainer" containerID="6da8681a2c13a8b8b0240b37d45e71378697a4989dc503b42962969e9ef852ff" Feb 02 11:48:21 crc kubenswrapper[4909]: E0202 11:48:21.295408 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6da8681a2c13a8b8b0240b37d45e71378697a4989dc503b42962969e9ef852ff\": container with ID starting with 6da8681a2c13a8b8b0240b37d45e71378697a4989dc503b42962969e9ef852ff not found: ID does not exist" containerID="6da8681a2c13a8b8b0240b37d45e71378697a4989dc503b42962969e9ef852ff" Feb 02 11:48:21 crc kubenswrapper[4909]: I0202 11:48:21.295440 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6da8681a2c13a8b8b0240b37d45e71378697a4989dc503b42962969e9ef852ff"} err="failed to get container status \"6da8681a2c13a8b8b0240b37d45e71378697a4989dc503b42962969e9ef852ff\": rpc error: code = NotFound desc = could not find container \"6da8681a2c13a8b8b0240b37d45e71378697a4989dc503b42962969e9ef852ff\": container with ID starting with 6da8681a2c13a8b8b0240b37d45e71378697a4989dc503b42962969e9ef852ff not found: ID does not exist" Feb 02 11:48:21 crc kubenswrapper[4909]: I0202 11:48:21.295463 4909 scope.go:117] "RemoveContainer" containerID="27f9597db346cc67fe8b08484999fb4cccf18ec493ab1ca2fb8cc58efd417b0c" Feb 02 11:48:21 crc kubenswrapper[4909]: E0202 11:48:21.295799 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27f9597db346cc67fe8b08484999fb4cccf18ec493ab1ca2fb8cc58efd417b0c\": container with ID starting with 
27f9597db346cc67fe8b08484999fb4cccf18ec493ab1ca2fb8cc58efd417b0c not found: ID does not exist" containerID="27f9597db346cc67fe8b08484999fb4cccf18ec493ab1ca2fb8cc58efd417b0c" Feb 02 11:48:21 crc kubenswrapper[4909]: I0202 11:48:21.295890 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27f9597db346cc67fe8b08484999fb4cccf18ec493ab1ca2fb8cc58efd417b0c"} err="failed to get container status \"27f9597db346cc67fe8b08484999fb4cccf18ec493ab1ca2fb8cc58efd417b0c\": rpc error: code = NotFound desc = could not find container \"27f9597db346cc67fe8b08484999fb4cccf18ec493ab1ca2fb8cc58efd417b0c\": container with ID starting with 27f9597db346cc67fe8b08484999fb4cccf18ec493ab1ca2fb8cc58efd417b0c not found: ID does not exist" Feb 02 11:48:21 crc kubenswrapper[4909]: I0202 11:48:21.295915 4909 scope.go:117] "RemoveContainer" containerID="65edc26efac27f1b750f4c4d83cfac32201a7322690fd193f59543a06fed4cbb" Feb 02 11:48:21 crc kubenswrapper[4909]: E0202 11:48:21.296159 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65edc26efac27f1b750f4c4d83cfac32201a7322690fd193f59543a06fed4cbb\": container with ID starting with 65edc26efac27f1b750f4c4d83cfac32201a7322690fd193f59543a06fed4cbb not found: ID does not exist" containerID="65edc26efac27f1b750f4c4d83cfac32201a7322690fd193f59543a06fed4cbb" Feb 02 11:48:21 crc kubenswrapper[4909]: I0202 11:48:21.296186 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65edc26efac27f1b750f4c4d83cfac32201a7322690fd193f59543a06fed4cbb"} err="failed to get container status \"65edc26efac27f1b750f4c4d83cfac32201a7322690fd193f59543a06fed4cbb\": rpc error: code = NotFound desc = could not find container \"65edc26efac27f1b750f4c4d83cfac32201a7322690fd193f59543a06fed4cbb\": container with ID starting with 65edc26efac27f1b750f4c4d83cfac32201a7322690fd193f59543a06fed4cbb not found: ID does not 
exist" Feb 02 11:48:21 crc kubenswrapper[4909]: I0202 11:48:21.305083 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5859e252-e453-4f14-996f-47f543b77cad-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:48:21 crc kubenswrapper[4909]: I0202 11:48:21.305108 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kxck\" (UniqueName: \"kubernetes.io/projected/5859e252-e453-4f14-996f-47f543b77cad-kube-api-access-5kxck\") on node \"crc\" DevicePath \"\"" Feb 02 11:48:21 crc kubenswrapper[4909]: I0202 11:48:21.326212 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5859e252-e453-4f14-996f-47f543b77cad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5859e252-e453-4f14-996f-47f543b77cad" (UID: "5859e252-e453-4f14-996f-47f543b77cad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:48:21 crc kubenswrapper[4909]: I0202 11:48:21.405868 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5859e252-e453-4f14-996f-47f543b77cad-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:48:21 crc kubenswrapper[4909]: I0202 11:48:21.565697 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-898q2"] Feb 02 11:48:21 crc kubenswrapper[4909]: I0202 11:48:21.574461 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-898q2"] Feb 02 11:48:23 crc kubenswrapper[4909]: I0202 11:48:23.026852 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5859e252-e453-4f14-996f-47f543b77cad" path="/var/lib/kubelet/pods/5859e252-e453-4f14-996f-47f543b77cad/volumes" Feb 02 11:50:00 crc kubenswrapper[4909]: I0202 11:50:00.301635 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["crc-storage/crc-storage-crc-5zcch"] Feb 02 11:50:00 crc kubenswrapper[4909]: I0202 11:50:00.307500 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-5zcch"] Feb 02 11:50:00 crc kubenswrapper[4909]: I0202 11:50:00.402453 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-tltcd"] Feb 02 11:50:00 crc kubenswrapper[4909]: E0202 11:50:00.402767 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5859e252-e453-4f14-996f-47f543b77cad" containerName="extract-content" Feb 02 11:50:00 crc kubenswrapper[4909]: I0202 11:50:00.402791 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5859e252-e453-4f14-996f-47f543b77cad" containerName="extract-content" Feb 02 11:50:00 crc kubenswrapper[4909]: E0202 11:50:00.402835 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5859e252-e453-4f14-996f-47f543b77cad" containerName="extract-utilities" Feb 02 11:50:00 crc kubenswrapper[4909]: I0202 11:50:00.402845 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5859e252-e453-4f14-996f-47f543b77cad" containerName="extract-utilities" Feb 02 11:50:00 crc kubenswrapper[4909]: E0202 11:50:00.402865 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5859e252-e453-4f14-996f-47f543b77cad" containerName="registry-server" Feb 02 11:50:00 crc kubenswrapper[4909]: I0202 11:50:00.402874 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5859e252-e453-4f14-996f-47f543b77cad" containerName="registry-server" Feb 02 11:50:00 crc kubenswrapper[4909]: I0202 11:50:00.403038 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5859e252-e453-4f14-996f-47f543b77cad" containerName="registry-server" Feb 02 11:50:00 crc kubenswrapper[4909]: I0202 11:50:00.403602 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-tltcd" Feb 02 11:50:00 crc kubenswrapper[4909]: I0202 11:50:00.407836 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 02 11:50:00 crc kubenswrapper[4909]: I0202 11:50:00.407881 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 02 11:50:00 crc kubenswrapper[4909]: I0202 11:50:00.407934 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 02 11:50:00 crc kubenswrapper[4909]: I0202 11:50:00.408532 4909 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-d4gwn" Feb 02 11:50:00 crc kubenswrapper[4909]: I0202 11:50:00.413201 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-tltcd"] Feb 02 11:50:00 crc kubenswrapper[4909]: I0202 11:50:00.515598 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/cf7c1766-e554-49f6-8832-9ec1f6fb5478-crc-storage\") pod \"crc-storage-crc-tltcd\" (UID: \"cf7c1766-e554-49f6-8832-9ec1f6fb5478\") " pod="crc-storage/crc-storage-crc-tltcd" Feb 02 11:50:00 crc kubenswrapper[4909]: I0202 11:50:00.515772 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/cf7c1766-e554-49f6-8832-9ec1f6fb5478-node-mnt\") pod \"crc-storage-crc-tltcd\" (UID: \"cf7c1766-e554-49f6-8832-9ec1f6fb5478\") " pod="crc-storage/crc-storage-crc-tltcd" Feb 02 11:50:00 crc kubenswrapper[4909]: I0202 11:50:00.515832 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnlz7\" (UniqueName: \"kubernetes.io/projected/cf7c1766-e554-49f6-8832-9ec1f6fb5478-kube-api-access-xnlz7\") pod \"crc-storage-crc-tltcd\" (UID: 
\"cf7c1766-e554-49f6-8832-9ec1f6fb5478\") " pod="crc-storage/crc-storage-crc-tltcd" Feb 02 11:50:00 crc kubenswrapper[4909]: I0202 11:50:00.617105 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/cf7c1766-e554-49f6-8832-9ec1f6fb5478-crc-storage\") pod \"crc-storage-crc-tltcd\" (UID: \"cf7c1766-e554-49f6-8832-9ec1f6fb5478\") " pod="crc-storage/crc-storage-crc-tltcd" Feb 02 11:50:00 crc kubenswrapper[4909]: I0202 11:50:00.617463 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/cf7c1766-e554-49f6-8832-9ec1f6fb5478-node-mnt\") pod \"crc-storage-crc-tltcd\" (UID: \"cf7c1766-e554-49f6-8832-9ec1f6fb5478\") " pod="crc-storage/crc-storage-crc-tltcd" Feb 02 11:50:00 crc kubenswrapper[4909]: I0202 11:50:00.617602 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnlz7\" (UniqueName: \"kubernetes.io/projected/cf7c1766-e554-49f6-8832-9ec1f6fb5478-kube-api-access-xnlz7\") pod \"crc-storage-crc-tltcd\" (UID: \"cf7c1766-e554-49f6-8832-9ec1f6fb5478\") " pod="crc-storage/crc-storage-crc-tltcd" Feb 02 11:50:00 crc kubenswrapper[4909]: I0202 11:50:00.617787 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/cf7c1766-e554-49f6-8832-9ec1f6fb5478-node-mnt\") pod \"crc-storage-crc-tltcd\" (UID: \"cf7c1766-e554-49f6-8832-9ec1f6fb5478\") " pod="crc-storage/crc-storage-crc-tltcd" Feb 02 11:50:00 crc kubenswrapper[4909]: I0202 11:50:00.618226 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/cf7c1766-e554-49f6-8832-9ec1f6fb5478-crc-storage\") pod \"crc-storage-crc-tltcd\" (UID: \"cf7c1766-e554-49f6-8832-9ec1f6fb5478\") " pod="crc-storage/crc-storage-crc-tltcd" Feb 02 11:50:00 crc kubenswrapper[4909]: I0202 11:50:00.637290 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnlz7\" (UniqueName: \"kubernetes.io/projected/cf7c1766-e554-49f6-8832-9ec1f6fb5478-kube-api-access-xnlz7\") pod \"crc-storage-crc-tltcd\" (UID: \"cf7c1766-e554-49f6-8832-9ec1f6fb5478\") " pod="crc-storage/crc-storage-crc-tltcd" Feb 02 11:50:00 crc kubenswrapper[4909]: I0202 11:50:00.737348 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-tltcd" Feb 02 11:50:01 crc kubenswrapper[4909]: I0202 11:50:01.025962 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f2752f4-33ed-4162-a9b6-481c1fc80957" path="/var/lib/kubelet/pods/3f2752f4-33ed-4162-a9b6-481c1fc80957/volumes" Feb 02 11:50:01 crc kubenswrapper[4909]: I0202 11:50:01.148193 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-tltcd"] Feb 02 11:50:01 crc kubenswrapper[4909]: I0202 11:50:01.165891 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 11:50:01 crc kubenswrapper[4909]: I0202 11:50:01.900653 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tltcd" event={"ID":"cf7c1766-e554-49f6-8832-9ec1f6fb5478","Type":"ContainerStarted","Data":"59e7b291e9fa579cffe6a5085cab33a158144b59d037995f955f9fb7e2f0ba0a"} Feb 02 11:50:01 crc kubenswrapper[4909]: I0202 11:50:01.900700 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tltcd" event={"ID":"cf7c1766-e554-49f6-8832-9ec1f6fb5478","Type":"ContainerStarted","Data":"abfb52110d7d724f33d5c52028e9d42af0228b72524a288899f027994d8e0441"} Feb 02 11:50:01 crc kubenswrapper[4909]: I0202 11:50:01.925177 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="crc-storage/crc-storage-crc-tltcd" podStartSLOduration=1.459997698 podStartE2EDuration="1.925153636s" podCreationTimestamp="2026-02-02 11:50:00 +0000 UTC" 
firstStartedPulling="2026-02-02 11:50:01.165606222 +0000 UTC m=+4726.911706957" lastFinishedPulling="2026-02-02 11:50:01.63076216 +0000 UTC m=+4727.376862895" observedRunningTime="2026-02-02 11:50:01.915072951 +0000 UTC m=+4727.661173696" watchObservedRunningTime="2026-02-02 11:50:01.925153636 +0000 UTC m=+4727.671254371" Feb 02 11:50:02 crc kubenswrapper[4909]: I0202 11:50:02.913581 4909 generic.go:334] "Generic (PLEG): container finished" podID="cf7c1766-e554-49f6-8832-9ec1f6fb5478" containerID="59e7b291e9fa579cffe6a5085cab33a158144b59d037995f955f9fb7e2f0ba0a" exitCode=0 Feb 02 11:50:02 crc kubenswrapper[4909]: I0202 11:50:02.913626 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tltcd" event={"ID":"cf7c1766-e554-49f6-8832-9ec1f6fb5478","Type":"ContainerDied","Data":"59e7b291e9fa579cffe6a5085cab33a158144b59d037995f955f9fb7e2f0ba0a"} Feb 02 11:50:04 crc kubenswrapper[4909]: I0202 11:50:04.197952 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-tltcd" Feb 02 11:50:04 crc kubenswrapper[4909]: I0202 11:50:04.263957 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/cf7c1766-e554-49f6-8832-9ec1f6fb5478-node-mnt\") pod \"cf7c1766-e554-49f6-8832-9ec1f6fb5478\" (UID: \"cf7c1766-e554-49f6-8832-9ec1f6fb5478\") " Feb 02 11:50:04 crc kubenswrapper[4909]: I0202 11:50:04.264039 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnlz7\" (UniqueName: \"kubernetes.io/projected/cf7c1766-e554-49f6-8832-9ec1f6fb5478-kube-api-access-xnlz7\") pod \"cf7c1766-e554-49f6-8832-9ec1f6fb5478\" (UID: \"cf7c1766-e554-49f6-8832-9ec1f6fb5478\") " Feb 02 11:50:04 crc kubenswrapper[4909]: I0202 11:50:04.264053 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf7c1766-e554-49f6-8832-9ec1f6fb5478-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "cf7c1766-e554-49f6-8832-9ec1f6fb5478" (UID: "cf7c1766-e554-49f6-8832-9ec1f6fb5478"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 11:50:04 crc kubenswrapper[4909]: I0202 11:50:04.264123 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/cf7c1766-e554-49f6-8832-9ec1f6fb5478-crc-storage\") pod \"cf7c1766-e554-49f6-8832-9ec1f6fb5478\" (UID: \"cf7c1766-e554-49f6-8832-9ec1f6fb5478\") " Feb 02 11:50:04 crc kubenswrapper[4909]: I0202 11:50:04.264362 4909 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/cf7c1766-e554-49f6-8832-9ec1f6fb5478-node-mnt\") on node \"crc\" DevicePath \"\"" Feb 02 11:50:04 crc kubenswrapper[4909]: I0202 11:50:04.269020 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf7c1766-e554-49f6-8832-9ec1f6fb5478-kube-api-access-xnlz7" (OuterVolumeSpecName: "kube-api-access-xnlz7") pod "cf7c1766-e554-49f6-8832-9ec1f6fb5478" (UID: "cf7c1766-e554-49f6-8832-9ec1f6fb5478"). InnerVolumeSpecName "kube-api-access-xnlz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:50:04 crc kubenswrapper[4909]: I0202 11:50:04.281365 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf7c1766-e554-49f6-8832-9ec1f6fb5478-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "cf7c1766-e554-49f6-8832-9ec1f6fb5478" (UID: "cf7c1766-e554-49f6-8832-9ec1f6fb5478"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:50:04 crc kubenswrapper[4909]: I0202 11:50:04.365421 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnlz7\" (UniqueName: \"kubernetes.io/projected/cf7c1766-e554-49f6-8832-9ec1f6fb5478-kube-api-access-xnlz7\") on node \"crc\" DevicePath \"\"" Feb 02 11:50:04 crc kubenswrapper[4909]: I0202 11:50:04.365780 4909 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/cf7c1766-e554-49f6-8832-9ec1f6fb5478-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 02 11:50:04 crc kubenswrapper[4909]: I0202 11:50:04.926163 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tltcd" event={"ID":"cf7c1766-e554-49f6-8832-9ec1f6fb5478","Type":"ContainerDied","Data":"abfb52110d7d724f33d5c52028e9d42af0228b72524a288899f027994d8e0441"} Feb 02 11:50:04 crc kubenswrapper[4909]: I0202 11:50:04.926508 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abfb52110d7d724f33d5c52028e9d42af0228b72524a288899f027994d8e0441" Feb 02 11:50:04 crc kubenswrapper[4909]: I0202 11:50:04.926222 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-tltcd" Feb 02 11:50:06 crc kubenswrapper[4909]: I0202 11:50:06.001564 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-tltcd"] Feb 02 11:50:06 crc kubenswrapper[4909]: I0202 11:50:06.005550 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-tltcd"] Feb 02 11:50:06 crc kubenswrapper[4909]: I0202 11:50:06.159575 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-brndn"] Feb 02 11:50:06 crc kubenswrapper[4909]: E0202 11:50:06.159930 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf7c1766-e554-49f6-8832-9ec1f6fb5478" containerName="storage" Feb 02 11:50:06 crc kubenswrapper[4909]: I0202 11:50:06.159952 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf7c1766-e554-49f6-8832-9ec1f6fb5478" containerName="storage" Feb 02 11:50:06 crc kubenswrapper[4909]: I0202 11:50:06.160148 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf7c1766-e554-49f6-8832-9ec1f6fb5478" containerName="storage" Feb 02 11:50:06 crc kubenswrapper[4909]: I0202 11:50:06.160683 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-brndn" Feb 02 11:50:06 crc kubenswrapper[4909]: I0202 11:50:06.162635 4909 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-d4gwn" Feb 02 11:50:06 crc kubenswrapper[4909]: I0202 11:50:06.162701 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 02 11:50:06 crc kubenswrapper[4909]: I0202 11:50:06.163464 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 02 11:50:06 crc kubenswrapper[4909]: I0202 11:50:06.164905 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 02 11:50:06 crc kubenswrapper[4909]: I0202 11:50:06.173854 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-brndn"] Feb 02 11:50:06 crc kubenswrapper[4909]: I0202 11:50:06.287399 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a2659183-db85-4168-b544-3a090bae030c-crc-storage\") pod \"crc-storage-crc-brndn\" (UID: \"a2659183-db85-4168-b544-3a090bae030c\") " pod="crc-storage/crc-storage-crc-brndn" Feb 02 11:50:06 crc kubenswrapper[4909]: I0202 11:50:06.287719 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a2659183-db85-4168-b544-3a090bae030c-node-mnt\") pod \"crc-storage-crc-brndn\" (UID: \"a2659183-db85-4168-b544-3a090bae030c\") " pod="crc-storage/crc-storage-crc-brndn" Feb 02 11:50:06 crc kubenswrapper[4909]: I0202 11:50:06.287919 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pskkr\" (UniqueName: \"kubernetes.io/projected/a2659183-db85-4168-b544-3a090bae030c-kube-api-access-pskkr\") pod \"crc-storage-crc-brndn\" (UID: 
\"a2659183-db85-4168-b544-3a090bae030c\") " pod="crc-storage/crc-storage-crc-brndn" Feb 02 11:50:06 crc kubenswrapper[4909]: I0202 11:50:06.388956 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a2659183-db85-4168-b544-3a090bae030c-crc-storage\") pod \"crc-storage-crc-brndn\" (UID: \"a2659183-db85-4168-b544-3a090bae030c\") " pod="crc-storage/crc-storage-crc-brndn" Feb 02 11:50:06 crc kubenswrapper[4909]: I0202 11:50:06.389315 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a2659183-db85-4168-b544-3a090bae030c-node-mnt\") pod \"crc-storage-crc-brndn\" (UID: \"a2659183-db85-4168-b544-3a090bae030c\") " pod="crc-storage/crc-storage-crc-brndn" Feb 02 11:50:06 crc kubenswrapper[4909]: I0202 11:50:06.389414 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pskkr\" (UniqueName: \"kubernetes.io/projected/a2659183-db85-4168-b544-3a090bae030c-kube-api-access-pskkr\") pod \"crc-storage-crc-brndn\" (UID: \"a2659183-db85-4168-b544-3a090bae030c\") " pod="crc-storage/crc-storage-crc-brndn" Feb 02 11:50:06 crc kubenswrapper[4909]: I0202 11:50:06.389656 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a2659183-db85-4168-b544-3a090bae030c-node-mnt\") pod \"crc-storage-crc-brndn\" (UID: \"a2659183-db85-4168-b544-3a090bae030c\") " pod="crc-storage/crc-storage-crc-brndn" Feb 02 11:50:06 crc kubenswrapper[4909]: I0202 11:50:06.390038 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a2659183-db85-4168-b544-3a090bae030c-crc-storage\") pod \"crc-storage-crc-brndn\" (UID: \"a2659183-db85-4168-b544-3a090bae030c\") " pod="crc-storage/crc-storage-crc-brndn" Feb 02 11:50:06 crc kubenswrapper[4909]: I0202 11:50:06.417697 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pskkr\" (UniqueName: \"kubernetes.io/projected/a2659183-db85-4168-b544-3a090bae030c-kube-api-access-pskkr\") pod \"crc-storage-crc-brndn\" (UID: \"a2659183-db85-4168-b544-3a090bae030c\") " pod="crc-storage/crc-storage-crc-brndn" Feb 02 11:50:06 crc kubenswrapper[4909]: I0202 11:50:06.479493 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-brndn" Feb 02 11:50:06 crc kubenswrapper[4909]: I0202 11:50:06.879124 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-brndn"] Feb 02 11:50:06 crc kubenswrapper[4909]: I0202 11:50:06.937778 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-brndn" event={"ID":"a2659183-db85-4168-b544-3a090bae030c","Type":"ContainerStarted","Data":"2b7d0a6d1f9f4f73947d8195e3f612cbbaa314fd4b3d4664cd2528f8660989da"} Feb 02 11:50:07 crc kubenswrapper[4909]: I0202 11:50:07.024537 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf7c1766-e554-49f6-8832-9ec1f6fb5478" path="/var/lib/kubelet/pods/cf7c1766-e554-49f6-8832-9ec1f6fb5478/volumes" Feb 02 11:50:07 crc kubenswrapper[4909]: I0202 11:50:07.945127 4909 generic.go:334] "Generic (PLEG): container finished" podID="a2659183-db85-4168-b544-3a090bae030c" containerID="f03da60544f47cd0b4af6523a1aae90fe707acefd070682e4596481ca56d6c37" exitCode=0 Feb 02 11:50:07 crc kubenswrapper[4909]: I0202 11:50:07.945192 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-brndn" event={"ID":"a2659183-db85-4168-b544-3a090bae030c","Type":"ContainerDied","Data":"f03da60544f47cd0b4af6523a1aae90fe707acefd070682e4596481ca56d6c37"} Feb 02 11:50:09 crc kubenswrapper[4909]: I0202 11:50:09.228356 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-brndn" Feb 02 11:50:09 crc kubenswrapper[4909]: I0202 11:50:09.331770 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a2659183-db85-4168-b544-3a090bae030c-node-mnt\") pod \"a2659183-db85-4168-b544-3a090bae030c\" (UID: \"a2659183-db85-4168-b544-3a090bae030c\") " Feb 02 11:50:09 crc kubenswrapper[4909]: I0202 11:50:09.331914 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a2659183-db85-4168-b544-3a090bae030c-crc-storage\") pod \"a2659183-db85-4168-b544-3a090bae030c\" (UID: \"a2659183-db85-4168-b544-3a090bae030c\") " Feb 02 11:50:09 crc kubenswrapper[4909]: I0202 11:50:09.331932 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2659183-db85-4168-b544-3a090bae030c-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "a2659183-db85-4168-b544-3a090bae030c" (UID: "a2659183-db85-4168-b544-3a090bae030c"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 11:50:09 crc kubenswrapper[4909]: I0202 11:50:09.331971 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pskkr\" (UniqueName: \"kubernetes.io/projected/a2659183-db85-4168-b544-3a090bae030c-kube-api-access-pskkr\") pod \"a2659183-db85-4168-b544-3a090bae030c\" (UID: \"a2659183-db85-4168-b544-3a090bae030c\") " Feb 02 11:50:09 crc kubenswrapper[4909]: I0202 11:50:09.332144 4909 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a2659183-db85-4168-b544-3a090bae030c-node-mnt\") on node \"crc\" DevicePath \"\"" Feb 02 11:50:09 crc kubenswrapper[4909]: I0202 11:50:09.337416 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2659183-db85-4168-b544-3a090bae030c-kube-api-access-pskkr" (OuterVolumeSpecName: "kube-api-access-pskkr") pod "a2659183-db85-4168-b544-3a090bae030c" (UID: "a2659183-db85-4168-b544-3a090bae030c"). InnerVolumeSpecName "kube-api-access-pskkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:50:09 crc kubenswrapper[4909]: I0202 11:50:09.349314 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2659183-db85-4168-b544-3a090bae030c-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "a2659183-db85-4168-b544-3a090bae030c" (UID: "a2659183-db85-4168-b544-3a090bae030c"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:50:09 crc kubenswrapper[4909]: I0202 11:50:09.434046 4909 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a2659183-db85-4168-b544-3a090bae030c-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 02 11:50:09 crc kubenswrapper[4909]: I0202 11:50:09.434094 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pskkr\" (UniqueName: \"kubernetes.io/projected/a2659183-db85-4168-b544-3a090bae030c-kube-api-access-pskkr\") on node \"crc\" DevicePath \"\"" Feb 02 11:50:09 crc kubenswrapper[4909]: I0202 11:50:09.958746 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-brndn" event={"ID":"a2659183-db85-4168-b544-3a090bae030c","Type":"ContainerDied","Data":"2b7d0a6d1f9f4f73947d8195e3f612cbbaa314fd4b3d4664cd2528f8660989da"} Feb 02 11:50:09 crc kubenswrapper[4909]: I0202 11:50:09.958790 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b7d0a6d1f9f4f73947d8195e3f612cbbaa314fd4b3d4664cd2528f8660989da" Feb 02 11:50:09 crc kubenswrapper[4909]: I0202 11:50:09.958796 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-brndn" Feb 02 11:50:19 crc kubenswrapper[4909]: I0202 11:50:19.511274 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:50:19 crc kubenswrapper[4909]: I0202 11:50:19.511845 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:50:20 crc kubenswrapper[4909]: I0202 11:50:20.327114 4909 scope.go:117] "RemoveContainer" containerID="1c4c423389ba1bc8c182ae50d3353d3b414a7fea0a568cc4c2a881d651e4a59d" Feb 02 11:50:40 crc kubenswrapper[4909]: I0202 11:50:40.293585 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9jrww"] Feb 02 11:50:40 crc kubenswrapper[4909]: E0202 11:50:40.294383 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2659183-db85-4168-b544-3a090bae030c" containerName="storage" Feb 02 11:50:40 crc kubenswrapper[4909]: I0202 11:50:40.294396 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2659183-db85-4168-b544-3a090bae030c" containerName="storage" Feb 02 11:50:40 crc kubenswrapper[4909]: I0202 11:50:40.294524 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2659183-db85-4168-b544-3a090bae030c" containerName="storage" Feb 02 11:50:40 crc kubenswrapper[4909]: I0202 11:50:40.295495 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9jrww" Feb 02 11:50:40 crc kubenswrapper[4909]: I0202 11:50:40.311783 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9jrww"] Feb 02 11:50:40 crc kubenswrapper[4909]: I0202 11:50:40.465825 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c12aff3-9ea6-4212-8ad7-7a9365660dc5-catalog-content\") pod \"certified-operators-9jrww\" (UID: \"4c12aff3-9ea6-4212-8ad7-7a9365660dc5\") " pod="openshift-marketplace/certified-operators-9jrww" Feb 02 11:50:40 crc kubenswrapper[4909]: I0202 11:50:40.465884 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbkbb\" (UniqueName: \"kubernetes.io/projected/4c12aff3-9ea6-4212-8ad7-7a9365660dc5-kube-api-access-wbkbb\") pod \"certified-operators-9jrww\" (UID: \"4c12aff3-9ea6-4212-8ad7-7a9365660dc5\") " pod="openshift-marketplace/certified-operators-9jrww" Feb 02 11:50:40 crc kubenswrapper[4909]: I0202 11:50:40.465935 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c12aff3-9ea6-4212-8ad7-7a9365660dc5-utilities\") pod \"certified-operators-9jrww\" (UID: \"4c12aff3-9ea6-4212-8ad7-7a9365660dc5\") " pod="openshift-marketplace/certified-operators-9jrww" Feb 02 11:50:40 crc kubenswrapper[4909]: I0202 11:50:40.566882 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c12aff3-9ea6-4212-8ad7-7a9365660dc5-utilities\") pod \"certified-operators-9jrww\" (UID: \"4c12aff3-9ea6-4212-8ad7-7a9365660dc5\") " pod="openshift-marketplace/certified-operators-9jrww" Feb 02 11:50:40 crc kubenswrapper[4909]: I0202 11:50:40.566952 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c12aff3-9ea6-4212-8ad7-7a9365660dc5-catalog-content\") pod \"certified-operators-9jrww\" (UID: \"4c12aff3-9ea6-4212-8ad7-7a9365660dc5\") " pod="openshift-marketplace/certified-operators-9jrww" Feb 02 11:50:40 crc kubenswrapper[4909]: I0202 11:50:40.567001 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbkbb\" (UniqueName: \"kubernetes.io/projected/4c12aff3-9ea6-4212-8ad7-7a9365660dc5-kube-api-access-wbkbb\") pod \"certified-operators-9jrww\" (UID: \"4c12aff3-9ea6-4212-8ad7-7a9365660dc5\") " pod="openshift-marketplace/certified-operators-9jrww" Feb 02 11:50:40 crc kubenswrapper[4909]: I0202 11:50:40.567851 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c12aff3-9ea6-4212-8ad7-7a9365660dc5-utilities\") pod \"certified-operators-9jrww\" (UID: \"4c12aff3-9ea6-4212-8ad7-7a9365660dc5\") " pod="openshift-marketplace/certified-operators-9jrww" Feb 02 11:50:40 crc kubenswrapper[4909]: I0202 11:50:40.567969 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c12aff3-9ea6-4212-8ad7-7a9365660dc5-catalog-content\") pod \"certified-operators-9jrww\" (UID: \"4c12aff3-9ea6-4212-8ad7-7a9365660dc5\") " pod="openshift-marketplace/certified-operators-9jrww" Feb 02 11:50:40 crc kubenswrapper[4909]: I0202 11:50:40.589318 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbkbb\" (UniqueName: \"kubernetes.io/projected/4c12aff3-9ea6-4212-8ad7-7a9365660dc5-kube-api-access-wbkbb\") pod \"certified-operators-9jrww\" (UID: \"4c12aff3-9ea6-4212-8ad7-7a9365660dc5\") " pod="openshift-marketplace/certified-operators-9jrww" Feb 02 11:50:40 crc kubenswrapper[4909]: I0202 11:50:40.670857 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9jrww" Feb 02 11:50:41 crc kubenswrapper[4909]: I0202 11:50:41.229938 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9jrww"] Feb 02 11:50:42 crc kubenswrapper[4909]: I0202 11:50:42.162912 4909 generic.go:334] "Generic (PLEG): container finished" podID="4c12aff3-9ea6-4212-8ad7-7a9365660dc5" containerID="1f4301533b4390ba5069be6017181037773cef6b4faace01dda27e868ea20fbf" exitCode=0 Feb 02 11:50:42 crc kubenswrapper[4909]: I0202 11:50:42.163028 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9jrww" event={"ID":"4c12aff3-9ea6-4212-8ad7-7a9365660dc5","Type":"ContainerDied","Data":"1f4301533b4390ba5069be6017181037773cef6b4faace01dda27e868ea20fbf"} Feb 02 11:50:42 crc kubenswrapper[4909]: I0202 11:50:42.163459 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9jrww" event={"ID":"4c12aff3-9ea6-4212-8ad7-7a9365660dc5","Type":"ContainerStarted","Data":"9e4f30b482c695dbf212d6e0734ee15409f3e0df248ae600ec92ef03d5660781"} Feb 02 11:50:43 crc kubenswrapper[4909]: I0202 11:50:43.171099 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9jrww" event={"ID":"4c12aff3-9ea6-4212-8ad7-7a9365660dc5","Type":"ContainerStarted","Data":"7edf27243e9fb2ce3fdc30a3d073ae32a163dfe076752b3641b77ca350b20537"} Feb 02 11:50:44 crc kubenswrapper[4909]: I0202 11:50:44.179526 4909 generic.go:334] "Generic (PLEG): container finished" podID="4c12aff3-9ea6-4212-8ad7-7a9365660dc5" containerID="7edf27243e9fb2ce3fdc30a3d073ae32a163dfe076752b3641b77ca350b20537" exitCode=0 Feb 02 11:50:44 crc kubenswrapper[4909]: I0202 11:50:44.179574 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9jrww" 
event={"ID":"4c12aff3-9ea6-4212-8ad7-7a9365660dc5","Type":"ContainerDied","Data":"7edf27243e9fb2ce3fdc30a3d073ae32a163dfe076752b3641b77ca350b20537"} Feb 02 11:50:45 crc kubenswrapper[4909]: I0202 11:50:45.191767 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9jrww" event={"ID":"4c12aff3-9ea6-4212-8ad7-7a9365660dc5","Type":"ContainerStarted","Data":"c25812bb55d4cfeb92ad51b235e9cdcc5627e1274a91ed9321e6251836e75824"} Feb 02 11:50:45 crc kubenswrapper[4909]: I0202 11:50:45.212488 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9jrww" podStartSLOduration=2.817454741 podStartE2EDuration="5.212465812s" podCreationTimestamp="2026-02-02 11:50:40 +0000 UTC" firstStartedPulling="2026-02-02 11:50:42.164778507 +0000 UTC m=+4767.910879242" lastFinishedPulling="2026-02-02 11:50:44.559789578 +0000 UTC m=+4770.305890313" observedRunningTime="2026-02-02 11:50:45.211292979 +0000 UTC m=+4770.957393714" watchObservedRunningTime="2026-02-02 11:50:45.212465812 +0000 UTC m=+4770.958566557" Feb 02 11:50:49 crc kubenswrapper[4909]: I0202 11:50:49.511210 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:50:49 crc kubenswrapper[4909]: I0202 11:50:49.511548 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:50:50 crc kubenswrapper[4909]: I0202 11:50:50.670909 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-9jrww" Feb 02 11:50:50 crc kubenswrapper[4909]: I0202 11:50:50.670956 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9jrww" Feb 02 11:50:50 crc kubenswrapper[4909]: I0202 11:50:50.708007 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9jrww" Feb 02 11:50:51 crc kubenswrapper[4909]: I0202 11:50:51.258388 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9jrww" Feb 02 11:50:51 crc kubenswrapper[4909]: I0202 11:50:51.305919 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9jrww"] Feb 02 11:50:53 crc kubenswrapper[4909]: I0202 11:50:53.235395 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9jrww" podUID="4c12aff3-9ea6-4212-8ad7-7a9365660dc5" containerName="registry-server" containerID="cri-o://c25812bb55d4cfeb92ad51b235e9cdcc5627e1274a91ed9321e6251836e75824" gracePeriod=2 Feb 02 11:50:53 crc kubenswrapper[4909]: I0202 11:50:53.626471 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9jrww" Feb 02 11:50:53 crc kubenswrapper[4909]: I0202 11:50:53.746550 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c12aff3-9ea6-4212-8ad7-7a9365660dc5-catalog-content\") pod \"4c12aff3-9ea6-4212-8ad7-7a9365660dc5\" (UID: \"4c12aff3-9ea6-4212-8ad7-7a9365660dc5\") " Feb 02 11:50:53 crc kubenswrapper[4909]: I0202 11:50:53.746596 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbkbb\" (UniqueName: \"kubernetes.io/projected/4c12aff3-9ea6-4212-8ad7-7a9365660dc5-kube-api-access-wbkbb\") pod \"4c12aff3-9ea6-4212-8ad7-7a9365660dc5\" (UID: \"4c12aff3-9ea6-4212-8ad7-7a9365660dc5\") " Feb 02 11:50:53 crc kubenswrapper[4909]: I0202 11:50:53.746739 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c12aff3-9ea6-4212-8ad7-7a9365660dc5-utilities\") pod \"4c12aff3-9ea6-4212-8ad7-7a9365660dc5\" (UID: \"4c12aff3-9ea6-4212-8ad7-7a9365660dc5\") " Feb 02 11:50:53 crc kubenswrapper[4909]: I0202 11:50:53.747618 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c12aff3-9ea6-4212-8ad7-7a9365660dc5-utilities" (OuterVolumeSpecName: "utilities") pod "4c12aff3-9ea6-4212-8ad7-7a9365660dc5" (UID: "4c12aff3-9ea6-4212-8ad7-7a9365660dc5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:50:53 crc kubenswrapper[4909]: I0202 11:50:53.751920 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c12aff3-9ea6-4212-8ad7-7a9365660dc5-kube-api-access-wbkbb" (OuterVolumeSpecName: "kube-api-access-wbkbb") pod "4c12aff3-9ea6-4212-8ad7-7a9365660dc5" (UID: "4c12aff3-9ea6-4212-8ad7-7a9365660dc5"). InnerVolumeSpecName "kube-api-access-wbkbb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:50:53 crc kubenswrapper[4909]: I0202 11:50:53.795868 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c12aff3-9ea6-4212-8ad7-7a9365660dc5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4c12aff3-9ea6-4212-8ad7-7a9365660dc5" (UID: "4c12aff3-9ea6-4212-8ad7-7a9365660dc5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:50:53 crc kubenswrapper[4909]: I0202 11:50:53.848687 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c12aff3-9ea6-4212-8ad7-7a9365660dc5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:50:53 crc kubenswrapper[4909]: I0202 11:50:53.848732 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbkbb\" (UniqueName: \"kubernetes.io/projected/4c12aff3-9ea6-4212-8ad7-7a9365660dc5-kube-api-access-wbkbb\") on node \"crc\" DevicePath \"\"" Feb 02 11:50:53 crc kubenswrapper[4909]: I0202 11:50:53.848742 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c12aff3-9ea6-4212-8ad7-7a9365660dc5-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:50:54 crc kubenswrapper[4909]: I0202 11:50:54.243741 4909 generic.go:334] "Generic (PLEG): container finished" podID="4c12aff3-9ea6-4212-8ad7-7a9365660dc5" containerID="c25812bb55d4cfeb92ad51b235e9cdcc5627e1274a91ed9321e6251836e75824" exitCode=0 Feb 02 11:50:54 crc kubenswrapper[4909]: I0202 11:50:54.243785 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9jrww" event={"ID":"4c12aff3-9ea6-4212-8ad7-7a9365660dc5","Type":"ContainerDied","Data":"c25812bb55d4cfeb92ad51b235e9cdcc5627e1274a91ed9321e6251836e75824"} Feb 02 11:50:54 crc kubenswrapper[4909]: I0202 11:50:54.243838 4909 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-9jrww" event={"ID":"4c12aff3-9ea6-4212-8ad7-7a9365660dc5","Type":"ContainerDied","Data":"9e4f30b482c695dbf212d6e0734ee15409f3e0df248ae600ec92ef03d5660781"} Feb 02 11:50:54 crc kubenswrapper[4909]: I0202 11:50:54.243835 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9jrww" Feb 02 11:50:54 crc kubenswrapper[4909]: I0202 11:50:54.243852 4909 scope.go:117] "RemoveContainer" containerID="c25812bb55d4cfeb92ad51b235e9cdcc5627e1274a91ed9321e6251836e75824" Feb 02 11:50:54 crc kubenswrapper[4909]: I0202 11:50:54.260911 4909 scope.go:117] "RemoveContainer" containerID="7edf27243e9fb2ce3fdc30a3d073ae32a163dfe076752b3641b77ca350b20537" Feb 02 11:50:54 crc kubenswrapper[4909]: I0202 11:50:54.294957 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9jrww"] Feb 02 11:50:54 crc kubenswrapper[4909]: I0202 11:50:54.298761 4909 scope.go:117] "RemoveContainer" containerID="1f4301533b4390ba5069be6017181037773cef6b4faace01dda27e868ea20fbf" Feb 02 11:50:54 crc kubenswrapper[4909]: I0202 11:50:54.310907 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9jrww"] Feb 02 11:50:54 crc kubenswrapper[4909]: I0202 11:50:54.325279 4909 scope.go:117] "RemoveContainer" containerID="c25812bb55d4cfeb92ad51b235e9cdcc5627e1274a91ed9321e6251836e75824" Feb 02 11:50:54 crc kubenswrapper[4909]: E0202 11:50:54.325830 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c25812bb55d4cfeb92ad51b235e9cdcc5627e1274a91ed9321e6251836e75824\": container with ID starting with c25812bb55d4cfeb92ad51b235e9cdcc5627e1274a91ed9321e6251836e75824 not found: ID does not exist" containerID="c25812bb55d4cfeb92ad51b235e9cdcc5627e1274a91ed9321e6251836e75824" Feb 02 11:50:54 crc kubenswrapper[4909]: I0202 
11:50:54.325932 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c25812bb55d4cfeb92ad51b235e9cdcc5627e1274a91ed9321e6251836e75824"} err="failed to get container status \"c25812bb55d4cfeb92ad51b235e9cdcc5627e1274a91ed9321e6251836e75824\": rpc error: code = NotFound desc = could not find container \"c25812bb55d4cfeb92ad51b235e9cdcc5627e1274a91ed9321e6251836e75824\": container with ID starting with c25812bb55d4cfeb92ad51b235e9cdcc5627e1274a91ed9321e6251836e75824 not found: ID does not exist" Feb 02 11:50:54 crc kubenswrapper[4909]: I0202 11:50:54.326019 4909 scope.go:117] "RemoveContainer" containerID="7edf27243e9fb2ce3fdc30a3d073ae32a163dfe076752b3641b77ca350b20537" Feb 02 11:50:54 crc kubenswrapper[4909]: E0202 11:50:54.326441 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7edf27243e9fb2ce3fdc30a3d073ae32a163dfe076752b3641b77ca350b20537\": container with ID starting with 7edf27243e9fb2ce3fdc30a3d073ae32a163dfe076752b3641b77ca350b20537 not found: ID does not exist" containerID="7edf27243e9fb2ce3fdc30a3d073ae32a163dfe076752b3641b77ca350b20537" Feb 02 11:50:54 crc kubenswrapper[4909]: I0202 11:50:54.326500 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7edf27243e9fb2ce3fdc30a3d073ae32a163dfe076752b3641b77ca350b20537"} err="failed to get container status \"7edf27243e9fb2ce3fdc30a3d073ae32a163dfe076752b3641b77ca350b20537\": rpc error: code = NotFound desc = could not find container \"7edf27243e9fb2ce3fdc30a3d073ae32a163dfe076752b3641b77ca350b20537\": container with ID starting with 7edf27243e9fb2ce3fdc30a3d073ae32a163dfe076752b3641b77ca350b20537 not found: ID does not exist" Feb 02 11:50:54 crc kubenswrapper[4909]: I0202 11:50:54.326529 4909 scope.go:117] "RemoveContainer" containerID="1f4301533b4390ba5069be6017181037773cef6b4faace01dda27e868ea20fbf" Feb 02 11:50:54 crc 
kubenswrapper[4909]: E0202 11:50:54.326995 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f4301533b4390ba5069be6017181037773cef6b4faace01dda27e868ea20fbf\": container with ID starting with 1f4301533b4390ba5069be6017181037773cef6b4faace01dda27e868ea20fbf not found: ID does not exist" containerID="1f4301533b4390ba5069be6017181037773cef6b4faace01dda27e868ea20fbf" Feb 02 11:50:54 crc kubenswrapper[4909]: I0202 11:50:54.327086 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f4301533b4390ba5069be6017181037773cef6b4faace01dda27e868ea20fbf"} err="failed to get container status \"1f4301533b4390ba5069be6017181037773cef6b4faace01dda27e868ea20fbf\": rpc error: code = NotFound desc = could not find container \"1f4301533b4390ba5069be6017181037773cef6b4faace01dda27e868ea20fbf\": container with ID starting with 1f4301533b4390ba5069be6017181037773cef6b4faace01dda27e868ea20fbf not found: ID does not exist" Feb 02 11:50:55 crc kubenswrapper[4909]: I0202 11:50:55.026176 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c12aff3-9ea6-4212-8ad7-7a9365660dc5" path="/var/lib/kubelet/pods/4c12aff3-9ea6-4212-8ad7-7a9365660dc5/volumes" Feb 02 11:51:19 crc kubenswrapper[4909]: I0202 11:51:19.510588 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:51:19 crc kubenswrapper[4909]: I0202 11:51:19.511151 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 02 11:51:19 crc kubenswrapper[4909]: I0202 11:51:19.511205 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 11:51:19 crc kubenswrapper[4909]: I0202 11:51:19.511868 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"38ee1278ab1a7ba78b053e71d2e465c14550aab95c120841b2bbca3648b7d0d2"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 11:51:19 crc kubenswrapper[4909]: I0202 11:51:19.511947 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://38ee1278ab1a7ba78b053e71d2e465c14550aab95c120841b2bbca3648b7d0d2" gracePeriod=600 Feb 02 11:51:20 crc kubenswrapper[4909]: I0202 11:51:20.401826 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="38ee1278ab1a7ba78b053e71d2e465c14550aab95c120841b2bbca3648b7d0d2" exitCode=0 Feb 02 11:51:20 crc kubenswrapper[4909]: I0202 11:51:20.401847 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"38ee1278ab1a7ba78b053e71d2e465c14550aab95c120841b2bbca3648b7d0d2"} Feb 02 11:51:20 crc kubenswrapper[4909]: I0202 11:51:20.402244 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" 
event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83"} Feb 02 11:51:20 crc kubenswrapper[4909]: I0202 11:51:20.402271 4909 scope.go:117] "RemoveContainer" containerID="c222d735032dfb758c7b4e7474bc0d0418866a09f9aa0b4d182310565d5c9520" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.394361 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f98b88745-mwhjj"] Feb 02 11:52:06 crc kubenswrapper[4909]: E0202 11:52:06.395973 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c12aff3-9ea6-4212-8ad7-7a9365660dc5" containerName="extract-utilities" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.396061 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c12aff3-9ea6-4212-8ad7-7a9365660dc5" containerName="extract-utilities" Feb 02 11:52:06 crc kubenswrapper[4909]: E0202 11:52:06.396140 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c12aff3-9ea6-4212-8ad7-7a9365660dc5" containerName="extract-content" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.396196 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c12aff3-9ea6-4212-8ad7-7a9365660dc5" containerName="extract-content" Feb 02 11:52:06 crc kubenswrapper[4909]: E0202 11:52:06.396261 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c12aff3-9ea6-4212-8ad7-7a9365660dc5" containerName="registry-server" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.396319 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c12aff3-9ea6-4212-8ad7-7a9365660dc5" containerName="registry-server" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.396529 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c12aff3-9ea6-4212-8ad7-7a9365660dc5" containerName="registry-server" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.397309 4909 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-6f98b88745-mwhjj" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.401681 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-vnpzc" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.401866 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.401902 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.402356 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.406553 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9d69655f7-tc4wb"] Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.408139 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9d69655f7-tc4wb" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.410077 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.415550 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f98b88745-mwhjj"] Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.426034 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9d69655f7-tc4wb"] Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.521152 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx5b6\" (UniqueName: \"kubernetes.io/projected/a42877f1-8a2d-4fb4-ba49-d543db736628-kube-api-access-fx5b6\") pod \"dnsmasq-dns-9d69655f7-tc4wb\" (UID: \"a42877f1-8a2d-4fb4-ba49-d543db736628\") " pod="openstack/dnsmasq-dns-9d69655f7-tc4wb" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.521201 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd7zx\" (UniqueName: \"kubernetes.io/projected/9b924e3d-6edf-442a-b1a1-249025ce0cc6-kube-api-access-gd7zx\") pod \"dnsmasq-dns-6f98b88745-mwhjj\" (UID: \"9b924e3d-6edf-442a-b1a1-249025ce0cc6\") " pod="openstack/dnsmasq-dns-6f98b88745-mwhjj" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.521361 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b924e3d-6edf-442a-b1a1-249025ce0cc6-config\") pod \"dnsmasq-dns-6f98b88745-mwhjj\" (UID: \"9b924e3d-6edf-442a-b1a1-249025ce0cc6\") " pod="openstack/dnsmasq-dns-6f98b88745-mwhjj" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.521487 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/a42877f1-8a2d-4fb4-ba49-d543db736628-dns-svc\") pod \"dnsmasq-dns-9d69655f7-tc4wb\" (UID: \"a42877f1-8a2d-4fb4-ba49-d543db736628\") " pod="openstack/dnsmasq-dns-9d69655f7-tc4wb" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.521524 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a42877f1-8a2d-4fb4-ba49-d543db736628-config\") pod \"dnsmasq-dns-9d69655f7-tc4wb\" (UID: \"a42877f1-8a2d-4fb4-ba49-d543db736628\") " pod="openstack/dnsmasq-dns-9d69655f7-tc4wb" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.622364 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx5b6\" (UniqueName: \"kubernetes.io/projected/a42877f1-8a2d-4fb4-ba49-d543db736628-kube-api-access-fx5b6\") pod \"dnsmasq-dns-9d69655f7-tc4wb\" (UID: \"a42877f1-8a2d-4fb4-ba49-d543db736628\") " pod="openstack/dnsmasq-dns-9d69655f7-tc4wb" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.622428 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd7zx\" (UniqueName: \"kubernetes.io/projected/9b924e3d-6edf-442a-b1a1-249025ce0cc6-kube-api-access-gd7zx\") pod \"dnsmasq-dns-6f98b88745-mwhjj\" (UID: \"9b924e3d-6edf-442a-b1a1-249025ce0cc6\") " pod="openstack/dnsmasq-dns-6f98b88745-mwhjj" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.622495 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b924e3d-6edf-442a-b1a1-249025ce0cc6-config\") pod \"dnsmasq-dns-6f98b88745-mwhjj\" (UID: \"9b924e3d-6edf-442a-b1a1-249025ce0cc6\") " pod="openstack/dnsmasq-dns-6f98b88745-mwhjj" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.622560 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/a42877f1-8a2d-4fb4-ba49-d543db736628-dns-svc\") pod \"dnsmasq-dns-9d69655f7-tc4wb\" (UID: \"a42877f1-8a2d-4fb4-ba49-d543db736628\") " pod="openstack/dnsmasq-dns-9d69655f7-tc4wb" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.622579 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a42877f1-8a2d-4fb4-ba49-d543db736628-config\") pod \"dnsmasq-dns-9d69655f7-tc4wb\" (UID: \"a42877f1-8a2d-4fb4-ba49-d543db736628\") " pod="openstack/dnsmasq-dns-9d69655f7-tc4wb" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.623550 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a42877f1-8a2d-4fb4-ba49-d543db736628-config\") pod \"dnsmasq-dns-9d69655f7-tc4wb\" (UID: \"a42877f1-8a2d-4fb4-ba49-d543db736628\") " pod="openstack/dnsmasq-dns-9d69655f7-tc4wb" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.623577 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b924e3d-6edf-442a-b1a1-249025ce0cc6-config\") pod \"dnsmasq-dns-6f98b88745-mwhjj\" (UID: \"9b924e3d-6edf-442a-b1a1-249025ce0cc6\") " pod="openstack/dnsmasq-dns-6f98b88745-mwhjj" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.623948 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a42877f1-8a2d-4fb4-ba49-d543db736628-dns-svc\") pod \"dnsmasq-dns-9d69655f7-tc4wb\" (UID: \"a42877f1-8a2d-4fb4-ba49-d543db736628\") " pod="openstack/dnsmasq-dns-9d69655f7-tc4wb" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.639557 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx5b6\" (UniqueName: \"kubernetes.io/projected/a42877f1-8a2d-4fb4-ba49-d543db736628-kube-api-access-fx5b6\") pod \"dnsmasq-dns-9d69655f7-tc4wb\" (UID: 
\"a42877f1-8a2d-4fb4-ba49-d543db736628\") " pod="openstack/dnsmasq-dns-9d69655f7-tc4wb" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.641717 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd7zx\" (UniqueName: \"kubernetes.io/projected/9b924e3d-6edf-442a-b1a1-249025ce0cc6-kube-api-access-gd7zx\") pod \"dnsmasq-dns-6f98b88745-mwhjj\" (UID: \"9b924e3d-6edf-442a-b1a1-249025ce0cc6\") " pod="openstack/dnsmasq-dns-6f98b88745-mwhjj" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.718034 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f98b88745-mwhjj" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.721180 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f98b88745-mwhjj"] Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.731255 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9d69655f7-tc4wb" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.745536 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-4n6l5"] Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.747045 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c4c8f55b5-4n6l5" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.760309 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-4n6l5"] Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.929576 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61b2e1d8-667b-48d8-b33f-8798b6c08a9d-dns-svc\") pod \"dnsmasq-dns-7c4c8f55b5-4n6l5\" (UID: \"61b2e1d8-667b-48d8-b33f-8798b6c08a9d\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-4n6l5" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.930029 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61b2e1d8-667b-48d8-b33f-8798b6c08a9d-config\") pod \"dnsmasq-dns-7c4c8f55b5-4n6l5\" (UID: \"61b2e1d8-667b-48d8-b33f-8798b6c08a9d\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-4n6l5" Feb 02 11:52:06 crc kubenswrapper[4909]: I0202 11:52:06.930067 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv8zs\" (UniqueName: \"kubernetes.io/projected/61b2e1d8-667b-48d8-b33f-8798b6c08a9d-kube-api-access-gv8zs\") pod \"dnsmasq-dns-7c4c8f55b5-4n6l5\" (UID: \"61b2e1d8-667b-48d8-b33f-8798b6c08a9d\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-4n6l5" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.031415 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61b2e1d8-667b-48d8-b33f-8798b6c08a9d-config\") pod \"dnsmasq-dns-7c4c8f55b5-4n6l5\" (UID: \"61b2e1d8-667b-48d8-b33f-8798b6c08a9d\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-4n6l5" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.031468 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv8zs\" (UniqueName: 
\"kubernetes.io/projected/61b2e1d8-667b-48d8-b33f-8798b6c08a9d-kube-api-access-gv8zs\") pod \"dnsmasq-dns-7c4c8f55b5-4n6l5\" (UID: \"61b2e1d8-667b-48d8-b33f-8798b6c08a9d\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-4n6l5" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.031514 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61b2e1d8-667b-48d8-b33f-8798b6c08a9d-dns-svc\") pod \"dnsmasq-dns-7c4c8f55b5-4n6l5\" (UID: \"61b2e1d8-667b-48d8-b33f-8798b6c08a9d\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-4n6l5" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.032427 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61b2e1d8-667b-48d8-b33f-8798b6c08a9d-dns-svc\") pod \"dnsmasq-dns-7c4c8f55b5-4n6l5\" (UID: \"61b2e1d8-667b-48d8-b33f-8798b6c08a9d\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-4n6l5" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.033428 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61b2e1d8-667b-48d8-b33f-8798b6c08a9d-config\") pod \"dnsmasq-dns-7c4c8f55b5-4n6l5\" (UID: \"61b2e1d8-667b-48d8-b33f-8798b6c08a9d\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-4n6l5" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.072169 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-4n6l5"] Feb 02 11:52:07 crc kubenswrapper[4909]: E0202 11:52:07.072674 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-gv8zs], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7c4c8f55b5-4n6l5" podUID="61b2e1d8-667b-48d8-b33f-8798b6c08a9d" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.082627 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv8zs\" (UniqueName: 
\"kubernetes.io/projected/61b2e1d8-667b-48d8-b33f-8798b6c08a9d-kube-api-access-gv8zs\") pod \"dnsmasq-dns-7c4c8f55b5-4n6l5\" (UID: \"61b2e1d8-667b-48d8-b33f-8798b6c08a9d\") " pod="openstack/dnsmasq-dns-7c4c8f55b5-4n6l5" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.100130 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589cf688cc-2cbbb"] Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.101827 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589cf688cc-2cbbb" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.113384 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589cf688cc-2cbbb"] Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.233759 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a9a11fb-cc8f-4e38-94e2-71adebf5db3f-config\") pod \"dnsmasq-dns-589cf688cc-2cbbb\" (UID: \"3a9a11fb-cc8f-4e38-94e2-71adebf5db3f\") " pod="openstack/dnsmasq-dns-589cf688cc-2cbbb" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.233866 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a9a11fb-cc8f-4e38-94e2-71adebf5db3f-dns-svc\") pod \"dnsmasq-dns-589cf688cc-2cbbb\" (UID: \"3a9a11fb-cc8f-4e38-94e2-71adebf5db3f\") " pod="openstack/dnsmasq-dns-589cf688cc-2cbbb" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.233938 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqltw\" (UniqueName: \"kubernetes.io/projected/3a9a11fb-cc8f-4e38-94e2-71adebf5db3f-kube-api-access-qqltw\") pod \"dnsmasq-dns-589cf688cc-2cbbb\" (UID: \"3a9a11fb-cc8f-4e38-94e2-71adebf5db3f\") " pod="openstack/dnsmasq-dns-589cf688cc-2cbbb" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.287522 
4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f98b88745-mwhjj"] Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.335151 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a9a11fb-cc8f-4e38-94e2-71adebf5db3f-dns-svc\") pod \"dnsmasq-dns-589cf688cc-2cbbb\" (UID: \"3a9a11fb-cc8f-4e38-94e2-71adebf5db3f\") " pod="openstack/dnsmasq-dns-589cf688cc-2cbbb" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.335359 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqltw\" (UniqueName: \"kubernetes.io/projected/3a9a11fb-cc8f-4e38-94e2-71adebf5db3f-kube-api-access-qqltw\") pod \"dnsmasq-dns-589cf688cc-2cbbb\" (UID: \"3a9a11fb-cc8f-4e38-94e2-71adebf5db3f\") " pod="openstack/dnsmasq-dns-589cf688cc-2cbbb" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.335478 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a9a11fb-cc8f-4e38-94e2-71adebf5db3f-config\") pod \"dnsmasq-dns-589cf688cc-2cbbb\" (UID: \"3a9a11fb-cc8f-4e38-94e2-71adebf5db3f\") " pod="openstack/dnsmasq-dns-589cf688cc-2cbbb" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.336256 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a9a11fb-cc8f-4e38-94e2-71adebf5db3f-dns-svc\") pod \"dnsmasq-dns-589cf688cc-2cbbb\" (UID: \"3a9a11fb-cc8f-4e38-94e2-71adebf5db3f\") " pod="openstack/dnsmasq-dns-589cf688cc-2cbbb" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.336946 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a9a11fb-cc8f-4e38-94e2-71adebf5db3f-config\") pod \"dnsmasq-dns-589cf688cc-2cbbb\" (UID: \"3a9a11fb-cc8f-4e38-94e2-71adebf5db3f\") " pod="openstack/dnsmasq-dns-589cf688cc-2cbbb" Feb 02 11:52:07 
crc kubenswrapper[4909]: I0202 11:52:07.353994 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqltw\" (UniqueName: \"kubernetes.io/projected/3a9a11fb-cc8f-4e38-94e2-71adebf5db3f-kube-api-access-qqltw\") pod \"dnsmasq-dns-589cf688cc-2cbbb\" (UID: \"3a9a11fb-cc8f-4e38-94e2-71adebf5db3f\") " pod="openstack/dnsmasq-dns-589cf688cc-2cbbb" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.370878 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9d69655f7-tc4wb"] Feb 02 11:52:07 crc kubenswrapper[4909]: W0202 11:52:07.374693 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda42877f1_8a2d_4fb4_ba49_d543db736628.slice/crio-0f277178e226c713dadc96a4cbb0914bac45b6c3c8d757459d01e84986d77b2b WatchSource:0}: Error finding container 0f277178e226c713dadc96a4cbb0914bac45b6c3c8d757459d01e84986d77b2b: Status 404 returned error can't find the container with id 0f277178e226c713dadc96a4cbb0914bac45b6c3c8d757459d01e84986d77b2b Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.436902 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589cf688cc-2cbbb" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.718470 4909 generic.go:334] "Generic (PLEG): container finished" podID="9b924e3d-6edf-442a-b1a1-249025ce0cc6" containerID="f654e58e76a92e6bf3e68cc2d9f76a36906bd7e44cf5bc84496a7d01095ec9c5" exitCode=0 Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.718846 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f98b88745-mwhjj" event={"ID":"9b924e3d-6edf-442a-b1a1-249025ce0cc6","Type":"ContainerDied","Data":"f654e58e76a92e6bf3e68cc2d9f76a36906bd7e44cf5bc84496a7d01095ec9c5"} Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.718922 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f98b88745-mwhjj" event={"ID":"9b924e3d-6edf-442a-b1a1-249025ce0cc6","Type":"ContainerStarted","Data":"a41e04e68981322a9ff016dcbb1b48421fb34956bf0bac1126458e0cb4bca84e"} Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.722437 4909 generic.go:334] "Generic (PLEG): container finished" podID="a42877f1-8a2d-4fb4-ba49-d543db736628" containerID="2a255cd05480ddd3a794bcc74733690e918acfefdc4da3942e85848aea7c76f1" exitCode=0 Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.722507 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d69655f7-tc4wb" event={"ID":"a42877f1-8a2d-4fb4-ba49-d543db736628","Type":"ContainerDied","Data":"2a255cd05480ddd3a794bcc74733690e918acfefdc4da3942e85848aea7c76f1"} Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.723370 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c4c8f55b5-4n6l5" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.723371 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d69655f7-tc4wb" event={"ID":"a42877f1-8a2d-4fb4-ba49-d543db736628","Type":"ContainerStarted","Data":"0f277178e226c713dadc96a4cbb0914bac45b6c3c8d757459d01e84986d77b2b"} Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.816668 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c4c8f55b5-4n6l5" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.865636 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589cf688cc-2cbbb"] Feb 02 11:52:07 crc kubenswrapper[4909]: W0202 11:52:07.879058 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a9a11fb_cc8f_4e38_94e2_71adebf5db3f.slice/crio-eaee1ef6c86429639eceb68452a4b6113582ba34388cd7d5484ecdb59be2c8c7 WatchSource:0}: Error finding container eaee1ef6c86429639eceb68452a4b6113582ba34388cd7d5484ecdb59be2c8c7: Status 404 returned error can't find the container with id eaee1ef6c86429639eceb68452a4b6113582ba34388cd7d5484ecdb59be2c8c7 Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.924531 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.927014 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.929712 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.929843 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.930032 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.930056 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.930078 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.930135 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.930203 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-4pnr5" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.943707 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61b2e1d8-667b-48d8-b33f-8798b6c08a9d-config\") pod \"61b2e1d8-667b-48d8-b33f-8798b6c08a9d\" (UID: \"61b2e1d8-667b-48d8-b33f-8798b6c08a9d\") " Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.943836 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61b2e1d8-667b-48d8-b33f-8798b6c08a9d-dns-svc\") pod \"61b2e1d8-667b-48d8-b33f-8798b6c08a9d\" (UID: \"61b2e1d8-667b-48d8-b33f-8798b6c08a9d\") " Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.943944 4909 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gv8zs\" (UniqueName: \"kubernetes.io/projected/61b2e1d8-667b-48d8-b33f-8798b6c08a9d-kube-api-access-gv8zs\") pod \"61b2e1d8-667b-48d8-b33f-8798b6c08a9d\" (UID: \"61b2e1d8-667b-48d8-b33f-8798b6c08a9d\") " Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.947605 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61b2e1d8-667b-48d8-b33f-8798b6c08a9d-config" (OuterVolumeSpecName: "config") pod "61b2e1d8-667b-48d8-b33f-8798b6c08a9d" (UID: "61b2e1d8-667b-48d8-b33f-8798b6c08a9d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.948382 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61b2e1d8-667b-48d8-b33f-8798b6c08a9d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "61b2e1d8-667b-48d8-b33f-8798b6c08a9d" (UID: "61b2e1d8-667b-48d8-b33f-8798b6c08a9d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.951616 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61b2e1d8-667b-48d8-b33f-8798b6c08a9d-kube-api-access-gv8zs" (OuterVolumeSpecName: "kube-api-access-gv8zs") pod "61b2e1d8-667b-48d8-b33f-8798b6c08a9d" (UID: "61b2e1d8-667b-48d8-b33f-8798b6c08a9d"). InnerVolumeSpecName "kube-api-access-gv8zs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:52:07 crc kubenswrapper[4909]: I0202 11:52:07.952892 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 11:52:07 crc kubenswrapper[4909]: E0202 11:52:07.979218 4909 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Feb 02 11:52:07 crc kubenswrapper[4909]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/a42877f1-8a2d-4fb4-ba49-d543db736628/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 02 11:52:07 crc kubenswrapper[4909]: > podSandboxID="0f277178e226c713dadc96a4cbb0914bac45b6c3c8d757459d01e84986d77b2b" Feb 02 11:52:07 crc kubenswrapper[4909]: E0202 11:52:07.979390 4909 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 02 11:52:07 crc kubenswrapper[4909]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n697h54dhb7h666h69h76h59ch55ch65ch596h8h79h5c8h57hc8hfch5d7h697h79h698h5fch644hf9h54chbfh655hfchcbh5f8h646h5f7h89q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fx5b6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-9d69655f7-tc4wb_openstack(a42877f1-8a2d-4fb4-ba49-d543db736628): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/a42877f1-8a2d-4fb4-ba49-d543db736628/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 02 11:52:07 crc kubenswrapper[4909]: > logger="UnhandledError" Feb 02 11:52:07 crc kubenswrapper[4909]: E0202 11:52:07.982023 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/a42877f1-8a2d-4fb4-ba49-d543db736628/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-9d69655f7-tc4wb" podUID="a42877f1-8a2d-4fb4-ba49-d543db736628" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.012033 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f98b88745-mwhjj" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.046058 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.046507 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.046632 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.046718 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.046860 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" 
Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.046938 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.046975 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-config-data\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.047093 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5d7c2715-3877-4edd-a9c3-c1c47c45414c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d7c2715-3877-4edd-a9c3-c1c47c45414c\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.047161 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2gsx\" (UniqueName: \"kubernetes.io/projected/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-kube-api-access-m2gsx\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.047253 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 
11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.047329 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.047500 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61b2e1d8-667b-48d8-b33f-8798b6c08a9d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.047527 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gv8zs\" (UniqueName: \"kubernetes.io/projected/61b2e1d8-667b-48d8-b33f-8798b6c08a9d-kube-api-access-gv8zs\") on node \"crc\" DevicePath \"\"" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.047542 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61b2e1d8-667b-48d8-b33f-8798b6c08a9d-config\") on node \"crc\" DevicePath \"\"" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.148375 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd7zx\" (UniqueName: \"kubernetes.io/projected/9b924e3d-6edf-442a-b1a1-249025ce0cc6-kube-api-access-gd7zx\") pod \"9b924e3d-6edf-442a-b1a1-249025ce0cc6\" (UID: \"9b924e3d-6edf-442a-b1a1-249025ce0cc6\") " Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.148489 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b924e3d-6edf-442a-b1a1-249025ce0cc6-config\") pod \"9b924e3d-6edf-442a-b1a1-249025ce0cc6\" (UID: \"9b924e3d-6edf-442a-b1a1-249025ce0cc6\") " Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.148735 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-5d7c2715-3877-4edd-a9c3-c1c47c45414c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d7c2715-3877-4edd-a9c3-c1c47c45414c\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.148764 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2gsx\" (UniqueName: \"kubernetes.io/projected/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-kube-api-access-m2gsx\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.148834 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.148883 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.148953 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.148977 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.149003 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.149034 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.149750 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.149880 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.149988 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " 
pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.150002 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-config-data\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.150670 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.151046 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-config-data\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.153717 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.154608 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.154633 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.157358 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b924e3d-6edf-442a-b1a1-249025ce0cc6-kube-api-access-gd7zx" (OuterVolumeSpecName: "kube-api-access-gd7zx") pod "9b924e3d-6edf-442a-b1a1-249025ce0cc6" (UID: "9b924e3d-6edf-442a-b1a1-249025ce0cc6"). InnerVolumeSpecName "kube-api-access-gd7zx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.157580 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.158352 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.159108 4909 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.159141 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5d7c2715-3877-4edd-a9c3-c1c47c45414c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d7c2715-3877-4edd-a9c3-c1c47c45414c\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b52abcf93a6cd0d5cc492e6e261fafb6a914b2b22fe9664718a6f49ac518689b/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.161598 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.177093 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2gsx\" (UniqueName: \"kubernetes.io/projected/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-kube-api-access-m2gsx\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.193330 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5d7c2715-3877-4edd-a9c3-c1c47c45414c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d7c2715-3877-4edd-a9c3-c1c47c45414c\") pod \"rabbitmq-server-0\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.198695 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b924e3d-6edf-442a-b1a1-249025ce0cc6-config" (OuterVolumeSpecName: "config") pod "9b924e3d-6edf-442a-b1a1-249025ce0cc6" 
(UID: "9b924e3d-6edf-442a-b1a1-249025ce0cc6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.218251 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 11:52:08 crc kubenswrapper[4909]: E0202 11:52:08.218641 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b924e3d-6edf-442a-b1a1-249025ce0cc6" containerName="init" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.218664 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b924e3d-6edf-442a-b1a1-249025ce0cc6" containerName="init" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.218834 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b924e3d-6edf-442a-b1a1-249025ce0cc6" containerName="init" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.219539 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.222947 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.223061 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.223126 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.223508 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.224516 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-h7vb2" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.224784 4909 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.224859 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.237043 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.252383 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b924e3d-6edf-442a-b1a1-249025ce0cc6-config\") on node \"crc\" DevicePath \"\"" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.252415 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd7zx\" (UniqueName: \"kubernetes.io/projected/9b924e3d-6edf-442a-b1a1-249025ce0cc6-kube-api-access-gd7zx\") on node \"crc\" DevicePath \"\"" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.289832 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.353468 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/624ff2b7-be29-48d9-9463-93df24dda1d7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.353522 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/624ff2b7-be29-48d9-9463-93df24dda1d7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.353543 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/624ff2b7-be29-48d9-9463-93df24dda1d7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.353565 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/624ff2b7-be29-48d9-9463-93df24dda1d7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.353594 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/624ff2b7-be29-48d9-9463-93df24dda1d7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.353621 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq57j\" (UniqueName: \"kubernetes.io/projected/624ff2b7-be29-48d9-9463-93df24dda1d7-kube-api-access-dq57j\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.353642 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/624ff2b7-be29-48d9-9463-93df24dda1d7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.353664 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0ccc338b-a914-4822-8d73-052e6efdf0f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ccc338b-a914-4822-8d73-052e6efdf0f1\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.353683 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/624ff2b7-be29-48d9-9463-93df24dda1d7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.353719 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/624ff2b7-be29-48d9-9463-93df24dda1d7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.353738 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/624ff2b7-be29-48d9-9463-93df24dda1d7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.456868 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0ccc338b-a914-4822-8d73-052e6efdf0f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ccc338b-a914-4822-8d73-052e6efdf0f1\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.456928 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/624ff2b7-be29-48d9-9463-93df24dda1d7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.457009 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/624ff2b7-be29-48d9-9463-93df24dda1d7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.457074 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/624ff2b7-be29-48d9-9463-93df24dda1d7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.457101 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/624ff2b7-be29-48d9-9463-93df24dda1d7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.457130 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/624ff2b7-be29-48d9-9463-93df24dda1d7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.457253 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/624ff2b7-be29-48d9-9463-93df24dda1d7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.457314 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/624ff2b7-be29-48d9-9463-93df24dda1d7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.457344 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/624ff2b7-be29-48d9-9463-93df24dda1d7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc 
kubenswrapper[4909]: I0202 11:52:08.457368 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq57j\" (UniqueName: \"kubernetes.io/projected/624ff2b7-be29-48d9-9463-93df24dda1d7-kube-api-access-dq57j\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.457392 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/624ff2b7-be29-48d9-9463-93df24dda1d7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.457577 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/624ff2b7-be29-48d9-9463-93df24dda1d7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.458733 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/624ff2b7-be29-48d9-9463-93df24dda1d7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.459275 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/624ff2b7-be29-48d9-9463-93df24dda1d7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.459334 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/624ff2b7-be29-48d9-9463-93df24dda1d7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.462019 4909 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.462071 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0ccc338b-a914-4822-8d73-052e6efdf0f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ccc338b-a914-4822-8d73-052e6efdf0f1\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6437e80e58251d3d49db1d0b9bf2c70f2dad0ff12b7b4b9fb88a0164972ee98d/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.462144 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/624ff2b7-be29-48d9-9463-93df24dda1d7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.463070 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/624ff2b7-be29-48d9-9463-93df24dda1d7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.465477 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/624ff2b7-be29-48d9-9463-93df24dda1d7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.465479 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/624ff2b7-be29-48d9-9463-93df24dda1d7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.465530 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/624ff2b7-be29-48d9-9463-93df24dda1d7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.479286 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq57j\" (UniqueName: \"kubernetes.io/projected/624ff2b7-be29-48d9-9463-93df24dda1d7-kube-api-access-dq57j\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.490694 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0ccc338b-a914-4822-8d73-052e6efdf0f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ccc338b-a914-4822-8d73-052e6efdf0f1\") pod \"rabbitmq-cell1-server-0\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.537242 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.727503 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.732415 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f98b88745-mwhjj" event={"ID":"9b924e3d-6edf-442a-b1a1-249025ce0cc6","Type":"ContainerDied","Data":"a41e04e68981322a9ff016dcbb1b48421fb34956bf0bac1126458e0cb4bca84e"} Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.732464 4909 scope.go:117] "RemoveContainer" containerID="f654e58e76a92e6bf3e68cc2d9f76a36906bd7e44cf5bc84496a7d01095ec9c5" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.732599 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f98b88745-mwhjj" Feb 02 11:52:08 crc kubenswrapper[4909]: W0202 11:52:08.737855 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0848fba1_b8b4_4abd_9d4f_5adc86a46cd5.slice/crio-4f868ca3c7e2e59b0831c3fc8af10a700cd801c9a298d14e7b0087c15d0f4a03 WatchSource:0}: Error finding container 4f868ca3c7e2e59b0831c3fc8af10a700cd801c9a298d14e7b0087c15d0f4a03: Status 404 returned error can't find the container with id 4f868ca3c7e2e59b0831c3fc8af10a700cd801c9a298d14e7b0087c15d0f4a03 Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.738338 4909 generic.go:334] "Generic (PLEG): container finished" podID="3a9a11fb-cc8f-4e38-94e2-71adebf5db3f" containerID="c065a4f388f734685d5f47dfc4fde120e8c15370641c310079476921f40b4695" exitCode=0 Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.738433 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c4c8f55b5-4n6l5" Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.739512 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589cf688cc-2cbbb" event={"ID":"3a9a11fb-cc8f-4e38-94e2-71adebf5db3f","Type":"ContainerDied","Data":"c065a4f388f734685d5f47dfc4fde120e8c15370641c310079476921f40b4695"} Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.739556 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589cf688cc-2cbbb" event={"ID":"3a9a11fb-cc8f-4e38-94e2-71adebf5db3f","Type":"ContainerStarted","Data":"eaee1ef6c86429639eceb68452a4b6113582ba34388cd7d5484ecdb59be2c8c7"} Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.894039 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f98b88745-mwhjj"] Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.903150 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f98b88745-mwhjj"] Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.921153 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-4n6l5"] Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.929338 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c4c8f55b5-4n6l5"] Feb 02 11:52:08 crc kubenswrapper[4909]: I0202 11:52:08.973254 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 11:52:08 crc kubenswrapper[4909]: W0202 11:52:08.980361 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod624ff2b7_be29_48d9_9463_93df24dda1d7.slice/crio-d9a503a4ad3211e4c81d38ba7d1ec444b6ff7cbd81ff34c710b4f805989688a2 WatchSource:0}: Error finding container d9a503a4ad3211e4c81d38ba7d1ec444b6ff7cbd81ff34c710b4f805989688a2: Status 404 returned error can't find the container with id 
d9a503a4ad3211e4c81d38ba7d1ec444b6ff7cbd81ff34c710b4f805989688a2 Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.026022 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61b2e1d8-667b-48d8-b33f-8798b6c08a9d" path="/var/lib/kubelet/pods/61b2e1d8-667b-48d8-b33f-8798b6c08a9d/volumes" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.026452 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b924e3d-6edf-442a-b1a1-249025ce0cc6" path="/var/lib/kubelet/pods/9b924e3d-6edf-442a-b1a1-249025ce0cc6/volumes" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.261041 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.262297 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.264582 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.264770 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.266589 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-tz5fb" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.266735 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.273177 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.274519 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.372482 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9svh9\" (UniqueName: \"kubernetes.io/projected/4de2bb2a-bd6a-4a2d-b885-dacdc62949d9-kube-api-access-9svh9\") pod \"openstack-galera-0\" (UID: \"4de2bb2a-bd6a-4a2d-b885-dacdc62949d9\") " pod="openstack/openstack-galera-0" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.372543 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de2bb2a-bd6a-4a2d-b885-dacdc62949d9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4de2bb2a-bd6a-4a2d-b885-dacdc62949d9\") " pod="openstack/openstack-galera-0" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.372565 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4de2bb2a-bd6a-4a2d-b885-dacdc62949d9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4de2bb2a-bd6a-4a2d-b885-dacdc62949d9\") " pod="openstack/openstack-galera-0" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.372606 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4de2bb2a-bd6a-4a2d-b885-dacdc62949d9-config-data-default\") pod \"openstack-galera-0\" (UID: \"4de2bb2a-bd6a-4a2d-b885-dacdc62949d9\") " pod="openstack/openstack-galera-0" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.372668 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4de2bb2a-bd6a-4a2d-b885-dacdc62949d9-kolla-config\") pod \"openstack-galera-0\" (UID: \"4de2bb2a-bd6a-4a2d-b885-dacdc62949d9\") " pod="openstack/openstack-galera-0" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.372693 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-cd36d609-a8b7-4ce9-a4a5-10ed69631c2c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd36d609-a8b7-4ce9-a4a5-10ed69631c2c\") pod \"openstack-galera-0\" (UID: \"4de2bb2a-bd6a-4a2d-b885-dacdc62949d9\") " pod="openstack/openstack-galera-0" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.372713 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de2bb2a-bd6a-4a2d-b885-dacdc62949d9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4de2bb2a-bd6a-4a2d-b885-dacdc62949d9\") " pod="openstack/openstack-galera-0" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.372732 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4de2bb2a-bd6a-4a2d-b885-dacdc62949d9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4de2bb2a-bd6a-4a2d-b885-dacdc62949d9\") " pod="openstack/openstack-galera-0" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.474564 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4de2bb2a-bd6a-4a2d-b885-dacdc62949d9-config-data-default\") pod \"openstack-galera-0\" (UID: \"4de2bb2a-bd6a-4a2d-b885-dacdc62949d9\") " pod="openstack/openstack-galera-0" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.474626 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4de2bb2a-bd6a-4a2d-b885-dacdc62949d9-kolla-config\") pod \"openstack-galera-0\" (UID: \"4de2bb2a-bd6a-4a2d-b885-dacdc62949d9\") " pod="openstack/openstack-galera-0" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.474663 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-cd36d609-a8b7-4ce9-a4a5-10ed69631c2c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd36d609-a8b7-4ce9-a4a5-10ed69631c2c\") pod \"openstack-galera-0\" (UID: \"4de2bb2a-bd6a-4a2d-b885-dacdc62949d9\") " pod="openstack/openstack-galera-0" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.474693 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de2bb2a-bd6a-4a2d-b885-dacdc62949d9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4de2bb2a-bd6a-4a2d-b885-dacdc62949d9\") " pod="openstack/openstack-galera-0" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.474718 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4de2bb2a-bd6a-4a2d-b885-dacdc62949d9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4de2bb2a-bd6a-4a2d-b885-dacdc62949d9\") " pod="openstack/openstack-galera-0" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.474781 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9svh9\" (UniqueName: \"kubernetes.io/projected/4de2bb2a-bd6a-4a2d-b885-dacdc62949d9-kube-api-access-9svh9\") pod \"openstack-galera-0\" (UID: \"4de2bb2a-bd6a-4a2d-b885-dacdc62949d9\") " pod="openstack/openstack-galera-0" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.475713 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4de2bb2a-bd6a-4a2d-b885-dacdc62949d9-kolla-config\") pod \"openstack-galera-0\" (UID: \"4de2bb2a-bd6a-4a2d-b885-dacdc62949d9\") " pod="openstack/openstack-galera-0" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.475772 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/4de2bb2a-bd6a-4a2d-b885-dacdc62949d9-config-data-default\") pod \"openstack-galera-0\" (UID: \"4de2bb2a-bd6a-4a2d-b885-dacdc62949d9\") " pod="openstack/openstack-galera-0" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.476946 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de2bb2a-bd6a-4a2d-b885-dacdc62949d9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4de2bb2a-bd6a-4a2d-b885-dacdc62949d9\") " pod="openstack/openstack-galera-0" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.477000 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4de2bb2a-bd6a-4a2d-b885-dacdc62949d9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4de2bb2a-bd6a-4a2d-b885-dacdc62949d9\") " pod="openstack/openstack-galera-0" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.477103 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4de2bb2a-bd6a-4a2d-b885-dacdc62949d9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4de2bb2a-bd6a-4a2d-b885-dacdc62949d9\") " pod="openstack/openstack-galera-0" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.478495 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4de2bb2a-bd6a-4a2d-b885-dacdc62949d9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4de2bb2a-bd6a-4a2d-b885-dacdc62949d9\") " pod="openstack/openstack-galera-0" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.480446 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de2bb2a-bd6a-4a2d-b885-dacdc62949d9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: 
\"4de2bb2a-bd6a-4a2d-b885-dacdc62949d9\") " pod="openstack/openstack-galera-0" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.480459 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de2bb2a-bd6a-4a2d-b885-dacdc62949d9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4de2bb2a-bd6a-4a2d-b885-dacdc62949d9\") " pod="openstack/openstack-galera-0" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.481943 4909 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.481990 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-cd36d609-a8b7-4ce9-a4a5-10ed69631c2c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd36d609-a8b7-4ce9-a4a5-10ed69631c2c\") pod \"openstack-galera-0\" (UID: \"4de2bb2a-bd6a-4a2d-b885-dacdc62949d9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/485a00d02ba8daf455ec82682cb8e2368d9103dbd8ee5054c4b32986a9aefc4c/globalmount\"" pod="openstack/openstack-galera-0" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.497131 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9svh9\" (UniqueName: \"kubernetes.io/projected/4de2bb2a-bd6a-4a2d-b885-dacdc62949d9-kube-api-access-9svh9\") pod \"openstack-galera-0\" (UID: \"4de2bb2a-bd6a-4a2d-b885-dacdc62949d9\") " pod="openstack/openstack-galera-0" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.517310 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-cd36d609-a8b7-4ce9-a4a5-10ed69631c2c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cd36d609-a8b7-4ce9-a4a5-10ed69631c2c\") pod \"openstack-galera-0\" (UID: \"4de2bb2a-bd6a-4a2d-b885-dacdc62949d9\") " pod="openstack/openstack-galera-0" 
Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.578312 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.746775 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5","Type":"ContainerStarted","Data":"1741581c9e2297db3465851611c9fd1d26c3d106a1de2cb85e50f51ff9452ce5"} Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.746849 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5","Type":"ContainerStarted","Data":"4f868ca3c7e2e59b0831c3fc8af10a700cd801c9a298d14e7b0087c15d0f4a03"} Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.751142 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d69655f7-tc4wb" event={"ID":"a42877f1-8a2d-4fb4-ba49-d543db736628","Type":"ContainerStarted","Data":"e58b76b6f1bac3e2be25667a46fb297dd22f6b3f946fefcd4d0b705612593e65"} Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.751870 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9d69655f7-tc4wb" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.754648 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589cf688cc-2cbbb" event={"ID":"3a9a11fb-cc8f-4e38-94e2-71adebf5db3f","Type":"ContainerStarted","Data":"09d4434446d977b6cc2e803f996a4168803b6153b835b3a169b966f7f5211f6a"} Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.755344 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-589cf688cc-2cbbb" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.756447 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"624ff2b7-be29-48d9-9463-93df24dda1d7","Type":"ContainerStarted","Data":"d9a503a4ad3211e4c81d38ba7d1ec444b6ff7cbd81ff34c710b4f805989688a2"} Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.819385 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9d69655f7-tc4wb" podStartSLOduration=3.8193632060000002 podStartE2EDuration="3.819363206s" podCreationTimestamp="2026-02-02 11:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:52:09.795825689 +0000 UTC m=+4855.541926424" watchObservedRunningTime="2026-02-02 11:52:09.819363206 +0000 UTC m=+4855.565463941" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.822563 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-589cf688cc-2cbbb" podStartSLOduration=2.822549747 podStartE2EDuration="2.822549747s" podCreationTimestamp="2026-02-02 11:52:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:52:09.813978534 +0000 UTC m=+4855.560079269" watchObservedRunningTime="2026-02-02 11:52:09.822549747 +0000 UTC m=+4855.568650482" Feb 02 11:52:09 crc kubenswrapper[4909]: I0202 11:52:09.848991 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.721009 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.722746 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.725185 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-rvpjz" Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.725537 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.725716 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.728407 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.734105 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.768016 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4de2bb2a-bd6a-4a2d-b885-dacdc62949d9","Type":"ContainerStarted","Data":"5cf466b1ed1986ebcd25bd756f37aa8d90647eb6d8814177f94ceea4e66cdb52"} Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.768054 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4de2bb2a-bd6a-4a2d-b885-dacdc62949d9","Type":"ContainerStarted","Data":"ba2436817ea1ce440074ba81108c71d31b27048f525f782c38dd4927de5979c9"} Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.771199 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"624ff2b7-be29-48d9-9463-93df24dda1d7","Type":"ContainerStarted","Data":"45c0d335d028939c778b94f6a331c64a03b66943f60eec25baaa9a8400f748e6"} Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.795286 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"pvc-5ac64620-4ef4-450c-bb90-9699e42b9f2b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5ac64620-4ef4-450c-bb90-9699e42b9f2b\") pod \"openstack-cell1-galera-0\" (UID: \"80206191-1878-4ffb-a98e-5d62e577218d\") " pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.795352 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mdkv\" (UniqueName: \"kubernetes.io/projected/80206191-1878-4ffb-a98e-5d62e577218d-kube-api-access-5mdkv\") pod \"openstack-cell1-galera-0\" (UID: \"80206191-1878-4ffb-a98e-5d62e577218d\") " pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.795378 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/80206191-1878-4ffb-a98e-5d62e577218d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"80206191-1878-4ffb-a98e-5d62e577218d\") " pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.795400 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80206191-1878-4ffb-a98e-5d62e577218d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"80206191-1878-4ffb-a98e-5d62e577218d\") " pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.795420 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80206191-1878-4ffb-a98e-5d62e577218d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"80206191-1878-4ffb-a98e-5d62e577218d\") " pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.795456 
4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/80206191-1878-4ffb-a98e-5d62e577218d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"80206191-1878-4ffb-a98e-5d62e577218d\") " pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.795487 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/80206191-1878-4ffb-a98e-5d62e577218d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"80206191-1878-4ffb-a98e-5d62e577218d\") " pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.795514 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/80206191-1878-4ffb-a98e-5d62e577218d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"80206191-1878-4ffb-a98e-5d62e577218d\") " pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.897165 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/80206191-1878-4ffb-a98e-5d62e577218d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"80206191-1878-4ffb-a98e-5d62e577218d\") " pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.897273 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/80206191-1878-4ffb-a98e-5d62e577218d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"80206191-1878-4ffb-a98e-5d62e577218d\") " pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.897350 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5ac64620-4ef4-450c-bb90-9699e42b9f2b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5ac64620-4ef4-450c-bb90-9699e42b9f2b\") pod \"openstack-cell1-galera-0\" (UID: \"80206191-1878-4ffb-a98e-5d62e577218d\") " pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.897442 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mdkv\" (UniqueName: \"kubernetes.io/projected/80206191-1878-4ffb-a98e-5d62e577218d-kube-api-access-5mdkv\") pod \"openstack-cell1-galera-0\" (UID: \"80206191-1878-4ffb-a98e-5d62e577218d\") " pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.897489 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/80206191-1878-4ffb-a98e-5d62e577218d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"80206191-1878-4ffb-a98e-5d62e577218d\") " pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.897557 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80206191-1878-4ffb-a98e-5d62e577218d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"80206191-1878-4ffb-a98e-5d62e577218d\") " pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.897603 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80206191-1878-4ffb-a98e-5d62e577218d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"80206191-1878-4ffb-a98e-5d62e577218d\") " pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.897822 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/80206191-1878-4ffb-a98e-5d62e577218d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"80206191-1878-4ffb-a98e-5d62e577218d\") " pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.898377 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/80206191-1878-4ffb-a98e-5d62e577218d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"80206191-1878-4ffb-a98e-5d62e577218d\") " pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.898519 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/80206191-1878-4ffb-a98e-5d62e577218d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"80206191-1878-4ffb-a98e-5d62e577218d\") " pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.898664 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/80206191-1878-4ffb-a98e-5d62e577218d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"80206191-1878-4ffb-a98e-5d62e577218d\") " pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.901301 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80206191-1878-4ffb-a98e-5d62e577218d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"80206191-1878-4ffb-a98e-5d62e577218d\") " pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.903310 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/80206191-1878-4ffb-a98e-5d62e577218d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"80206191-1878-4ffb-a98e-5d62e577218d\") " pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.910285 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/80206191-1878-4ffb-a98e-5d62e577218d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"80206191-1878-4ffb-a98e-5d62e577218d\") " pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.910439 4909 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.910481 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5ac64620-4ef4-450c-bb90-9699e42b9f2b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5ac64620-4ef4-450c-bb90-9699e42b9f2b\") pod \"openstack-cell1-galera-0\" (UID: \"80206191-1878-4ffb-a98e-5d62e577218d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7fffdb7f79a7fa53c24e4f8fb8258f1a61a63db907e6435d482bb94b2c03b9b3/globalmount\"" pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.917323 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mdkv\" (UniqueName: \"kubernetes.io/projected/80206191-1878-4ffb-a98e-5d62e577218d-kube-api-access-5mdkv\") pod \"openstack-cell1-galera-0\" (UID: \"80206191-1878-4ffb-a98e-5d62e577218d\") " pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:10 crc kubenswrapper[4909]: I0202 11:52:10.940473 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5ac64620-4ef4-450c-bb90-9699e42b9f2b\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5ac64620-4ef4-450c-bb90-9699e42b9f2b\") pod \"openstack-cell1-galera-0\" (UID: \"80206191-1878-4ffb-a98e-5d62e577218d\") " pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:11 crc kubenswrapper[4909]: I0202 11:52:11.040520 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:11 crc kubenswrapper[4909]: I0202 11:52:11.164137 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 02 11:52:11 crc kubenswrapper[4909]: I0202 11:52:11.165014 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 02 11:52:11 crc kubenswrapper[4909]: I0202 11:52:11.167377 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-8gfjm" Feb 02 11:52:11 crc kubenswrapper[4909]: I0202 11:52:11.167628 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 02 11:52:11 crc kubenswrapper[4909]: I0202 11:52:11.167775 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 02 11:52:11 crc kubenswrapper[4909]: I0202 11:52:11.203699 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/290858d2-96ee-4354-81b5-426df3bcfba5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"290858d2-96ee-4354-81b5-426df3bcfba5\") " pod="openstack/memcached-0" Feb 02 11:52:11 crc kubenswrapper[4909]: I0202 11:52:11.203795 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/290858d2-96ee-4354-81b5-426df3bcfba5-config-data\") pod \"memcached-0\" (UID: \"290858d2-96ee-4354-81b5-426df3bcfba5\") " pod="openstack/memcached-0" Feb 02 11:52:11 crc 
kubenswrapper[4909]: I0202 11:52:11.203833 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/290858d2-96ee-4354-81b5-426df3bcfba5-kolla-config\") pod \"memcached-0\" (UID: \"290858d2-96ee-4354-81b5-426df3bcfba5\") " pod="openstack/memcached-0" Feb 02 11:52:11 crc kubenswrapper[4909]: I0202 11:52:11.203957 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5d2f\" (UniqueName: \"kubernetes.io/projected/290858d2-96ee-4354-81b5-426df3bcfba5-kube-api-access-s5d2f\") pod \"memcached-0\" (UID: \"290858d2-96ee-4354-81b5-426df3bcfba5\") " pod="openstack/memcached-0" Feb 02 11:52:11 crc kubenswrapper[4909]: I0202 11:52:11.204009 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/290858d2-96ee-4354-81b5-426df3bcfba5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"290858d2-96ee-4354-81b5-426df3bcfba5\") " pod="openstack/memcached-0" Feb 02 11:52:11 crc kubenswrapper[4909]: I0202 11:52:11.225178 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 02 11:52:11 crc kubenswrapper[4909]: I0202 11:52:11.305946 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/290858d2-96ee-4354-81b5-426df3bcfba5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"290858d2-96ee-4354-81b5-426df3bcfba5\") " pod="openstack/memcached-0" Feb 02 11:52:11 crc kubenswrapper[4909]: I0202 11:52:11.306028 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/290858d2-96ee-4354-81b5-426df3bcfba5-config-data\") pod \"memcached-0\" (UID: \"290858d2-96ee-4354-81b5-426df3bcfba5\") " pod="openstack/memcached-0" Feb 02 11:52:11 crc 
kubenswrapper[4909]: I0202 11:52:11.306054 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/290858d2-96ee-4354-81b5-426df3bcfba5-kolla-config\") pod \"memcached-0\" (UID: \"290858d2-96ee-4354-81b5-426df3bcfba5\") " pod="openstack/memcached-0" Feb 02 11:52:11 crc kubenswrapper[4909]: I0202 11:52:11.306076 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5d2f\" (UniqueName: \"kubernetes.io/projected/290858d2-96ee-4354-81b5-426df3bcfba5-kube-api-access-s5d2f\") pod \"memcached-0\" (UID: \"290858d2-96ee-4354-81b5-426df3bcfba5\") " pod="openstack/memcached-0" Feb 02 11:52:11 crc kubenswrapper[4909]: I0202 11:52:11.306095 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/290858d2-96ee-4354-81b5-426df3bcfba5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"290858d2-96ee-4354-81b5-426df3bcfba5\") " pod="openstack/memcached-0" Feb 02 11:52:11 crc kubenswrapper[4909]: I0202 11:52:11.307201 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/290858d2-96ee-4354-81b5-426df3bcfba5-kolla-config\") pod \"memcached-0\" (UID: \"290858d2-96ee-4354-81b5-426df3bcfba5\") " pod="openstack/memcached-0" Feb 02 11:52:11 crc kubenswrapper[4909]: I0202 11:52:11.307210 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/290858d2-96ee-4354-81b5-426df3bcfba5-config-data\") pod \"memcached-0\" (UID: \"290858d2-96ee-4354-81b5-426df3bcfba5\") " pod="openstack/memcached-0" Feb 02 11:52:11 crc kubenswrapper[4909]: I0202 11:52:11.310503 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/290858d2-96ee-4354-81b5-426df3bcfba5-combined-ca-bundle\") 
pod \"memcached-0\" (UID: \"290858d2-96ee-4354-81b5-426df3bcfba5\") " pod="openstack/memcached-0" Feb 02 11:52:11 crc kubenswrapper[4909]: I0202 11:52:11.316241 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/290858d2-96ee-4354-81b5-426df3bcfba5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"290858d2-96ee-4354-81b5-426df3bcfba5\") " pod="openstack/memcached-0" Feb 02 11:52:11 crc kubenswrapper[4909]: I0202 11:52:11.321672 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5d2f\" (UniqueName: \"kubernetes.io/projected/290858d2-96ee-4354-81b5-426df3bcfba5-kube-api-access-s5d2f\") pod \"memcached-0\" (UID: \"290858d2-96ee-4354-81b5-426df3bcfba5\") " pod="openstack/memcached-0" Feb 02 11:52:11 crc kubenswrapper[4909]: I0202 11:52:11.484002 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 02 11:52:11 crc kubenswrapper[4909]: I0202 11:52:11.484265 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 02 11:52:11 crc kubenswrapper[4909]: W0202 11:52:11.493875 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80206191_1878_4ffb_a98e_5d62e577218d.slice/crio-e9492caa74a502542b491d21c91c4d9777c1714dd1afecba605c598ef40d5d5a WatchSource:0}: Error finding container e9492caa74a502542b491d21c91c4d9777c1714dd1afecba605c598ef40d5d5a: Status 404 returned error can't find the container with id e9492caa74a502542b491d21c91c4d9777c1714dd1afecba605c598ef40d5d5a Feb 02 11:52:11 crc kubenswrapper[4909]: I0202 11:52:11.778984 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"80206191-1878-4ffb-a98e-5d62e577218d","Type":"ContainerStarted","Data":"5d43e4cd78c7087af51787a3ac2585e4293f8e728c295bfff799c8a6fc9b7d63"} Feb 02 11:52:11 crc kubenswrapper[4909]: I0202 11:52:11.779360 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"80206191-1878-4ffb-a98e-5d62e577218d","Type":"ContainerStarted","Data":"e9492caa74a502542b491d21c91c4d9777c1714dd1afecba605c598ef40d5d5a"} Feb 02 11:52:11 crc kubenswrapper[4909]: I0202 11:52:11.903120 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 02 11:52:12 crc kubenswrapper[4909]: I0202 11:52:12.788751 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"290858d2-96ee-4354-81b5-426df3bcfba5","Type":"ContainerStarted","Data":"6a20f09c50c4d5963458bfc29858980188e726a83bf8a199a597be3094925793"} Feb 02 11:52:12 crc kubenswrapper[4909]: I0202 11:52:12.789060 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"290858d2-96ee-4354-81b5-426df3bcfba5","Type":"ContainerStarted","Data":"0b4926cf4ad58f8c7a34643fbb4c586e2fed9270f23295fe9fd51130a454c0c9"} Feb 02 11:52:12 crc kubenswrapper[4909]: 
I0202 11:52:12.806486 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=1.80646901 podStartE2EDuration="1.80646901s" podCreationTimestamp="2026-02-02 11:52:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:52:12.806046788 +0000 UTC m=+4858.552147553" watchObservedRunningTime="2026-02-02 11:52:12.80646901 +0000 UTC m=+4858.552569745" Feb 02 11:52:13 crc kubenswrapper[4909]: I0202 11:52:13.797159 4909 generic.go:334] "Generic (PLEG): container finished" podID="4de2bb2a-bd6a-4a2d-b885-dacdc62949d9" containerID="5cf466b1ed1986ebcd25bd756f37aa8d90647eb6d8814177f94ceea4e66cdb52" exitCode=0 Feb 02 11:52:13 crc kubenswrapper[4909]: I0202 11:52:13.797251 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4de2bb2a-bd6a-4a2d-b885-dacdc62949d9","Type":"ContainerDied","Data":"5cf466b1ed1986ebcd25bd756f37aa8d90647eb6d8814177f94ceea4e66cdb52"} Feb 02 11:52:13 crc kubenswrapper[4909]: I0202 11:52:13.797593 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 02 11:52:14 crc kubenswrapper[4909]: I0202 11:52:14.806091 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4de2bb2a-bd6a-4a2d-b885-dacdc62949d9","Type":"ContainerStarted","Data":"44be60ec451b7288e63e923aabcb72bce60f77e38dca866edce709e7e5162036"} Feb 02 11:52:14 crc kubenswrapper[4909]: I0202 11:52:14.828947 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=6.828911476 podStartE2EDuration="6.828911476s" podCreationTimestamp="2026-02-02 11:52:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:52:14.825444567 +0000 UTC 
m=+4860.571545302" watchObservedRunningTime="2026-02-02 11:52:14.828911476 +0000 UTC m=+4860.575012221" Feb 02 11:52:15 crc kubenswrapper[4909]: I0202 11:52:15.814185 4909 generic.go:334] "Generic (PLEG): container finished" podID="80206191-1878-4ffb-a98e-5d62e577218d" containerID="5d43e4cd78c7087af51787a3ac2585e4293f8e728c295bfff799c8a6fc9b7d63" exitCode=0 Feb 02 11:52:15 crc kubenswrapper[4909]: I0202 11:52:15.814298 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"80206191-1878-4ffb-a98e-5d62e577218d","Type":"ContainerDied","Data":"5d43e4cd78c7087af51787a3ac2585e4293f8e728c295bfff799c8a6fc9b7d63"} Feb 02 11:52:16 crc kubenswrapper[4909]: I0202 11:52:16.732701 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9d69655f7-tc4wb" Feb 02 11:52:16 crc kubenswrapper[4909]: I0202 11:52:16.822387 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"80206191-1878-4ffb-a98e-5d62e577218d","Type":"ContainerStarted","Data":"0123b1bc93c5291648e80acc125e91574d8fdc17a13c4bb54337208af1fb7387"} Feb 02 11:52:16 crc kubenswrapper[4909]: I0202 11:52:16.846638 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.846614798 podStartE2EDuration="7.846614798s" podCreationTimestamp="2026-02-02 11:52:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:52:16.841415721 +0000 UTC m=+4862.587516456" watchObservedRunningTime="2026-02-02 11:52:16.846614798 +0000 UTC m=+4862.592715533" Feb 02 11:52:17 crc kubenswrapper[4909]: I0202 11:52:17.439017 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-589cf688cc-2cbbb" Feb 02 11:52:17 crc kubenswrapper[4909]: I0202 11:52:17.494186 4909 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9d69655f7-tc4wb"] Feb 02 11:52:17 crc kubenswrapper[4909]: I0202 11:52:17.494422 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9d69655f7-tc4wb" podUID="a42877f1-8a2d-4fb4-ba49-d543db736628" containerName="dnsmasq-dns" containerID="cri-o://e58b76b6f1bac3e2be25667a46fb297dd22f6b3f946fefcd4d0b705612593e65" gracePeriod=10 Feb 02 11:52:17 crc kubenswrapper[4909]: I0202 11:52:17.831570 4909 generic.go:334] "Generic (PLEG): container finished" podID="a42877f1-8a2d-4fb4-ba49-d543db736628" containerID="e58b76b6f1bac3e2be25667a46fb297dd22f6b3f946fefcd4d0b705612593e65" exitCode=0 Feb 02 11:52:17 crc kubenswrapper[4909]: I0202 11:52:17.831650 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d69655f7-tc4wb" event={"ID":"a42877f1-8a2d-4fb4-ba49-d543db736628","Type":"ContainerDied","Data":"e58b76b6f1bac3e2be25667a46fb297dd22f6b3f946fefcd4d0b705612593e65"} Feb 02 11:52:17 crc kubenswrapper[4909]: I0202 11:52:17.955835 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9d69655f7-tc4wb" Feb 02 11:52:18 crc kubenswrapper[4909]: I0202 11:52:18.109889 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fx5b6\" (UniqueName: \"kubernetes.io/projected/a42877f1-8a2d-4fb4-ba49-d543db736628-kube-api-access-fx5b6\") pod \"a42877f1-8a2d-4fb4-ba49-d543db736628\" (UID: \"a42877f1-8a2d-4fb4-ba49-d543db736628\") " Feb 02 11:52:18 crc kubenswrapper[4909]: I0202 11:52:18.109973 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a42877f1-8a2d-4fb4-ba49-d543db736628-dns-svc\") pod \"a42877f1-8a2d-4fb4-ba49-d543db736628\" (UID: \"a42877f1-8a2d-4fb4-ba49-d543db736628\") " Feb 02 11:52:18 crc kubenswrapper[4909]: I0202 11:52:18.110079 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a42877f1-8a2d-4fb4-ba49-d543db736628-config\") pod \"a42877f1-8a2d-4fb4-ba49-d543db736628\" (UID: \"a42877f1-8a2d-4fb4-ba49-d543db736628\") " Feb 02 11:52:18 crc kubenswrapper[4909]: I0202 11:52:18.120289 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a42877f1-8a2d-4fb4-ba49-d543db736628-kube-api-access-fx5b6" (OuterVolumeSpecName: "kube-api-access-fx5b6") pod "a42877f1-8a2d-4fb4-ba49-d543db736628" (UID: "a42877f1-8a2d-4fb4-ba49-d543db736628"). InnerVolumeSpecName "kube-api-access-fx5b6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:52:18 crc kubenswrapper[4909]: I0202 11:52:18.145401 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a42877f1-8a2d-4fb4-ba49-d543db736628-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a42877f1-8a2d-4fb4-ba49-d543db736628" (UID: "a42877f1-8a2d-4fb4-ba49-d543db736628"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:52:18 crc kubenswrapper[4909]: I0202 11:52:18.146492 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a42877f1-8a2d-4fb4-ba49-d543db736628-config" (OuterVolumeSpecName: "config") pod "a42877f1-8a2d-4fb4-ba49-d543db736628" (UID: "a42877f1-8a2d-4fb4-ba49-d543db736628"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:52:18 crc kubenswrapper[4909]: I0202 11:52:18.211897 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fx5b6\" (UniqueName: \"kubernetes.io/projected/a42877f1-8a2d-4fb4-ba49-d543db736628-kube-api-access-fx5b6\") on node \"crc\" DevicePath \"\"" Feb 02 11:52:18 crc kubenswrapper[4909]: I0202 11:52:18.211945 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a42877f1-8a2d-4fb4-ba49-d543db736628-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 11:52:18 crc kubenswrapper[4909]: I0202 11:52:18.211958 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a42877f1-8a2d-4fb4-ba49-d543db736628-config\") on node \"crc\" DevicePath \"\"" Feb 02 11:52:18 crc kubenswrapper[4909]: I0202 11:52:18.840588 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d69655f7-tc4wb" event={"ID":"a42877f1-8a2d-4fb4-ba49-d543db736628","Type":"ContainerDied","Data":"0f277178e226c713dadc96a4cbb0914bac45b6c3c8d757459d01e84986d77b2b"} Feb 02 11:52:18 crc kubenswrapper[4909]: I0202 11:52:18.841020 4909 scope.go:117] "RemoveContainer" containerID="e58b76b6f1bac3e2be25667a46fb297dd22f6b3f946fefcd4d0b705612593e65" Feb 02 11:52:18 crc kubenswrapper[4909]: I0202 11:52:18.841177 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9d69655f7-tc4wb" Feb 02 11:52:18 crc kubenswrapper[4909]: I0202 11:52:18.864517 4909 scope.go:117] "RemoveContainer" containerID="2a255cd05480ddd3a794bcc74733690e918acfefdc4da3942e85848aea7c76f1" Feb 02 11:52:18 crc kubenswrapper[4909]: I0202 11:52:18.874635 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9d69655f7-tc4wb"] Feb 02 11:52:18 crc kubenswrapper[4909]: I0202 11:52:18.878972 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9d69655f7-tc4wb"] Feb 02 11:52:19 crc kubenswrapper[4909]: I0202 11:52:19.026509 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a42877f1-8a2d-4fb4-ba49-d543db736628" path="/var/lib/kubelet/pods/a42877f1-8a2d-4fb4-ba49-d543db736628/volumes" Feb 02 11:52:19 crc kubenswrapper[4909]: I0202 11:52:19.578772 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 02 11:52:19 crc kubenswrapper[4909]: I0202 11:52:19.578942 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 02 11:52:21 crc kubenswrapper[4909]: I0202 11:52:21.040831 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:21 crc kubenswrapper[4909]: I0202 11:52:21.041200 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:21 crc kubenswrapper[4909]: I0202 11:52:21.106142 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:21 crc kubenswrapper[4909]: I0202 11:52:21.486183 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 02 11:52:21 crc kubenswrapper[4909]: I0202 11:52:21.892029 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/openstack-galera-0" Feb 02 11:52:21 crc kubenswrapper[4909]: I0202 11:52:21.945264 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 02 11:52:21 crc kubenswrapper[4909]: I0202 11:52:21.974264 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 02 11:52:28 crc kubenswrapper[4909]: I0202 11:52:28.240754 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-96kf5"] Feb 02 11:52:28 crc kubenswrapper[4909]: E0202 11:52:28.242863 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a42877f1-8a2d-4fb4-ba49-d543db736628" containerName="init" Feb 02 11:52:28 crc kubenswrapper[4909]: I0202 11:52:28.242970 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42877f1-8a2d-4fb4-ba49-d543db736628" containerName="init" Feb 02 11:52:28 crc kubenswrapper[4909]: E0202 11:52:28.243048 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a42877f1-8a2d-4fb4-ba49-d543db736628" containerName="dnsmasq-dns" Feb 02 11:52:28 crc kubenswrapper[4909]: I0202 11:52:28.243183 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42877f1-8a2d-4fb4-ba49-d543db736628" containerName="dnsmasq-dns" Feb 02 11:52:28 crc kubenswrapper[4909]: I0202 11:52:28.243462 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a42877f1-8a2d-4fb4-ba49-d543db736628" containerName="dnsmasq-dns" Feb 02 11:52:28 crc kubenswrapper[4909]: I0202 11:52:28.258002 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-96kf5"] Feb 02 11:52:28 crc kubenswrapper[4909]: I0202 11:52:28.258156 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-96kf5" Feb 02 11:52:28 crc kubenswrapper[4909]: I0202 11:52:28.260254 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 02 11:52:28 crc kubenswrapper[4909]: I0202 11:52:28.366267 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92d768e9-ac64-4aa6-a8a4-e3f0ac51113c-operator-scripts\") pod \"root-account-create-update-96kf5\" (UID: \"92d768e9-ac64-4aa6-a8a4-e3f0ac51113c\") " pod="openstack/root-account-create-update-96kf5" Feb 02 11:52:28 crc kubenswrapper[4909]: I0202 11:52:28.366322 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv4gq\" (UniqueName: \"kubernetes.io/projected/92d768e9-ac64-4aa6-a8a4-e3f0ac51113c-kube-api-access-pv4gq\") pod \"root-account-create-update-96kf5\" (UID: \"92d768e9-ac64-4aa6-a8a4-e3f0ac51113c\") " pod="openstack/root-account-create-update-96kf5" Feb 02 11:52:28 crc kubenswrapper[4909]: I0202 11:52:28.467421 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv4gq\" (UniqueName: \"kubernetes.io/projected/92d768e9-ac64-4aa6-a8a4-e3f0ac51113c-kube-api-access-pv4gq\") pod \"root-account-create-update-96kf5\" (UID: \"92d768e9-ac64-4aa6-a8a4-e3f0ac51113c\") " pod="openstack/root-account-create-update-96kf5" Feb 02 11:52:28 crc kubenswrapper[4909]: I0202 11:52:28.467544 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92d768e9-ac64-4aa6-a8a4-e3f0ac51113c-operator-scripts\") pod \"root-account-create-update-96kf5\" (UID: \"92d768e9-ac64-4aa6-a8a4-e3f0ac51113c\") " pod="openstack/root-account-create-update-96kf5" Feb 02 11:52:28 crc kubenswrapper[4909]: I0202 11:52:28.468242 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92d768e9-ac64-4aa6-a8a4-e3f0ac51113c-operator-scripts\") pod \"root-account-create-update-96kf5\" (UID: \"92d768e9-ac64-4aa6-a8a4-e3f0ac51113c\") " pod="openstack/root-account-create-update-96kf5" Feb 02 11:52:28 crc kubenswrapper[4909]: I0202 11:52:28.486717 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv4gq\" (UniqueName: \"kubernetes.io/projected/92d768e9-ac64-4aa6-a8a4-e3f0ac51113c-kube-api-access-pv4gq\") pod \"root-account-create-update-96kf5\" (UID: \"92d768e9-ac64-4aa6-a8a4-e3f0ac51113c\") " pod="openstack/root-account-create-update-96kf5" Feb 02 11:52:28 crc kubenswrapper[4909]: I0202 11:52:28.579832 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-96kf5" Feb 02 11:52:28 crc kubenswrapper[4909]: I0202 11:52:28.966460 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-96kf5"] Feb 02 11:52:28 crc kubenswrapper[4909]: W0202 11:52:28.975110 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92d768e9_ac64_4aa6_a8a4_e3f0ac51113c.slice/crio-fa0da93f2ad3ce2e1b0a099d930a6de51d034b2c7e860cf7eec534101513b142 WatchSource:0}: Error finding container fa0da93f2ad3ce2e1b0a099d930a6de51d034b2c7e860cf7eec534101513b142: Status 404 returned error can't find the container with id fa0da93f2ad3ce2e1b0a099d930a6de51d034b2c7e860cf7eec534101513b142 Feb 02 11:52:29 crc kubenswrapper[4909]: I0202 11:52:29.928116 4909 generic.go:334] "Generic (PLEG): container finished" podID="92d768e9-ac64-4aa6-a8a4-e3f0ac51113c" containerID="810c31497df0350394d341a52016e7f54b2e46fbbb38ffebba0774ccbc29b1ff" exitCode=0 Feb 02 11:52:29 crc kubenswrapper[4909]: I0202 11:52:29.928156 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/root-account-create-update-96kf5" event={"ID":"92d768e9-ac64-4aa6-a8a4-e3f0ac51113c","Type":"ContainerDied","Data":"810c31497df0350394d341a52016e7f54b2e46fbbb38ffebba0774ccbc29b1ff"} Feb 02 11:52:29 crc kubenswrapper[4909]: I0202 11:52:29.928182 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-96kf5" event={"ID":"92d768e9-ac64-4aa6-a8a4-e3f0ac51113c","Type":"ContainerStarted","Data":"fa0da93f2ad3ce2e1b0a099d930a6de51d034b2c7e860cf7eec534101513b142"} Feb 02 11:52:31 crc kubenswrapper[4909]: I0202 11:52:31.205828 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-96kf5" Feb 02 11:52:31 crc kubenswrapper[4909]: I0202 11:52:31.305631 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92d768e9-ac64-4aa6-a8a4-e3f0ac51113c-operator-scripts\") pod \"92d768e9-ac64-4aa6-a8a4-e3f0ac51113c\" (UID: \"92d768e9-ac64-4aa6-a8a4-e3f0ac51113c\") " Feb 02 11:52:31 crc kubenswrapper[4909]: I0202 11:52:31.305742 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pv4gq\" (UniqueName: \"kubernetes.io/projected/92d768e9-ac64-4aa6-a8a4-e3f0ac51113c-kube-api-access-pv4gq\") pod \"92d768e9-ac64-4aa6-a8a4-e3f0ac51113c\" (UID: \"92d768e9-ac64-4aa6-a8a4-e3f0ac51113c\") " Feb 02 11:52:31 crc kubenswrapper[4909]: I0202 11:52:31.306437 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92d768e9-ac64-4aa6-a8a4-e3f0ac51113c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "92d768e9-ac64-4aa6-a8a4-e3f0ac51113c" (UID: "92d768e9-ac64-4aa6-a8a4-e3f0ac51113c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:52:31 crc kubenswrapper[4909]: I0202 11:52:31.311288 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92d768e9-ac64-4aa6-a8a4-e3f0ac51113c-kube-api-access-pv4gq" (OuterVolumeSpecName: "kube-api-access-pv4gq") pod "92d768e9-ac64-4aa6-a8a4-e3f0ac51113c" (UID: "92d768e9-ac64-4aa6-a8a4-e3f0ac51113c"). InnerVolumeSpecName "kube-api-access-pv4gq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:52:31 crc kubenswrapper[4909]: I0202 11:52:31.407744 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pv4gq\" (UniqueName: \"kubernetes.io/projected/92d768e9-ac64-4aa6-a8a4-e3f0ac51113c-kube-api-access-pv4gq\") on node \"crc\" DevicePath \"\"" Feb 02 11:52:31 crc kubenswrapper[4909]: I0202 11:52:31.407786 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92d768e9-ac64-4aa6-a8a4-e3f0ac51113c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 11:52:31 crc kubenswrapper[4909]: I0202 11:52:31.944574 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-96kf5" event={"ID":"92d768e9-ac64-4aa6-a8a4-e3f0ac51113c","Type":"ContainerDied","Data":"fa0da93f2ad3ce2e1b0a099d930a6de51d034b2c7e860cf7eec534101513b142"} Feb 02 11:52:31 crc kubenswrapper[4909]: I0202 11:52:31.944931 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa0da93f2ad3ce2e1b0a099d930a6de51d034b2c7e860cf7eec534101513b142" Feb 02 11:52:31 crc kubenswrapper[4909]: I0202 11:52:31.944978 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-96kf5" Feb 02 11:52:34 crc kubenswrapper[4909]: I0202 11:52:34.696280 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-96kf5"] Feb 02 11:52:34 crc kubenswrapper[4909]: I0202 11:52:34.702280 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-96kf5"] Feb 02 11:52:35 crc kubenswrapper[4909]: I0202 11:52:35.034738 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92d768e9-ac64-4aa6-a8a4-e3f0ac51113c" path="/var/lib/kubelet/pods/92d768e9-ac64-4aa6-a8a4-e3f0ac51113c/volumes" Feb 02 11:52:39 crc kubenswrapper[4909]: I0202 11:52:39.747954 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-s4czs"] Feb 02 11:52:39 crc kubenswrapper[4909]: E0202 11:52:39.749006 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92d768e9-ac64-4aa6-a8a4-e3f0ac51113c" containerName="mariadb-account-create-update" Feb 02 11:52:39 crc kubenswrapper[4909]: I0202 11:52:39.749023 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="92d768e9-ac64-4aa6-a8a4-e3f0ac51113c" containerName="mariadb-account-create-update" Feb 02 11:52:39 crc kubenswrapper[4909]: I0202 11:52:39.749212 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="92d768e9-ac64-4aa6-a8a4-e3f0ac51113c" containerName="mariadb-account-create-update" Feb 02 11:52:39 crc kubenswrapper[4909]: I0202 11:52:39.749803 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-s4czs" Feb 02 11:52:39 crc kubenswrapper[4909]: I0202 11:52:39.753544 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 02 11:52:39 crc kubenswrapper[4909]: I0202 11:52:39.754460 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-s4czs"] Feb 02 11:52:39 crc kubenswrapper[4909]: I0202 11:52:39.833590 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7faf3248-fd71-4f4e-960f-a89402aa822c-operator-scripts\") pod \"root-account-create-update-s4czs\" (UID: \"7faf3248-fd71-4f4e-960f-a89402aa822c\") " pod="openstack/root-account-create-update-s4czs" Feb 02 11:52:39 crc kubenswrapper[4909]: I0202 11:52:39.833691 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfp24\" (UniqueName: \"kubernetes.io/projected/7faf3248-fd71-4f4e-960f-a89402aa822c-kube-api-access-qfp24\") pod \"root-account-create-update-s4czs\" (UID: \"7faf3248-fd71-4f4e-960f-a89402aa822c\") " pod="openstack/root-account-create-update-s4czs" Feb 02 11:52:39 crc kubenswrapper[4909]: I0202 11:52:39.935387 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfp24\" (UniqueName: \"kubernetes.io/projected/7faf3248-fd71-4f4e-960f-a89402aa822c-kube-api-access-qfp24\") pod \"root-account-create-update-s4czs\" (UID: \"7faf3248-fd71-4f4e-960f-a89402aa822c\") " pod="openstack/root-account-create-update-s4czs" Feb 02 11:52:39 crc kubenswrapper[4909]: I0202 11:52:39.935483 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7faf3248-fd71-4f4e-960f-a89402aa822c-operator-scripts\") pod \"root-account-create-update-s4czs\" (UID: 
\"7faf3248-fd71-4f4e-960f-a89402aa822c\") " pod="openstack/root-account-create-update-s4czs" Feb 02 11:52:39 crc kubenswrapper[4909]: I0202 11:52:39.936236 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7faf3248-fd71-4f4e-960f-a89402aa822c-operator-scripts\") pod \"root-account-create-update-s4czs\" (UID: \"7faf3248-fd71-4f4e-960f-a89402aa822c\") " pod="openstack/root-account-create-update-s4czs" Feb 02 11:52:39 crc kubenswrapper[4909]: I0202 11:52:39.954773 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfp24\" (UniqueName: \"kubernetes.io/projected/7faf3248-fd71-4f4e-960f-a89402aa822c-kube-api-access-qfp24\") pod \"root-account-create-update-s4czs\" (UID: \"7faf3248-fd71-4f4e-960f-a89402aa822c\") " pod="openstack/root-account-create-update-s4czs" Feb 02 11:52:40 crc kubenswrapper[4909]: I0202 11:52:40.076348 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-s4czs" Feb 02 11:52:40 crc kubenswrapper[4909]: I0202 11:52:40.467374 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-s4czs"] Feb 02 11:52:40 crc kubenswrapper[4909]: I0202 11:52:40.731650 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-s4czs" event={"ID":"7faf3248-fd71-4f4e-960f-a89402aa822c","Type":"ContainerStarted","Data":"27a6db43f1898ac9e2ebe7807adbde6f140ea3ebfe5b9a9f7253a3946a7a11a9"} Feb 02 11:52:40 crc kubenswrapper[4909]: I0202 11:52:40.732021 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-s4czs" event={"ID":"7faf3248-fd71-4f4e-960f-a89402aa822c","Type":"ContainerStarted","Data":"6070497a08c670451c695c3bdf3823cde29b69e4d823ed941b1eedee4bccfe28"} Feb 02 11:52:40 crc kubenswrapper[4909]: I0202 11:52:40.746287 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-s4czs" podStartSLOduration=1.746267815 podStartE2EDuration="1.746267815s" podCreationTimestamp="2026-02-02 11:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:52:40.743179268 +0000 UTC m=+4886.489280013" watchObservedRunningTime="2026-02-02 11:52:40.746267815 +0000 UTC m=+4886.492368550" Feb 02 11:52:41 crc kubenswrapper[4909]: I0202 11:52:41.744867 4909 generic.go:334] "Generic (PLEG): container finished" podID="624ff2b7-be29-48d9-9463-93df24dda1d7" containerID="45c0d335d028939c778b94f6a331c64a03b66943f60eec25baaa9a8400f748e6" exitCode=0 Feb 02 11:52:41 crc kubenswrapper[4909]: I0202 11:52:41.744988 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"624ff2b7-be29-48d9-9463-93df24dda1d7","Type":"ContainerDied","Data":"45c0d335d028939c778b94f6a331c64a03b66943f60eec25baaa9a8400f748e6"} Feb 02 11:52:41 crc kubenswrapper[4909]: I0202 11:52:41.750770 4909 generic.go:334] "Generic (PLEG): container finished" podID="0848fba1-b8b4-4abd-9d4f-5adc86a46cd5" containerID="1741581c9e2297db3465851611c9fd1d26c3d106a1de2cb85e50f51ff9452ce5" exitCode=0 Feb 02 11:52:41 crc kubenswrapper[4909]: I0202 11:52:41.750881 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5","Type":"ContainerDied","Data":"1741581c9e2297db3465851611c9fd1d26c3d106a1de2cb85e50f51ff9452ce5"} Feb 02 11:52:41 crc kubenswrapper[4909]: I0202 11:52:41.760468 4909 generic.go:334] "Generic (PLEG): container finished" podID="7faf3248-fd71-4f4e-960f-a89402aa822c" containerID="27a6db43f1898ac9e2ebe7807adbde6f140ea3ebfe5b9a9f7253a3946a7a11a9" exitCode=0 Feb 02 11:52:41 crc kubenswrapper[4909]: I0202 11:52:41.760525 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-s4czs" event={"ID":"7faf3248-fd71-4f4e-960f-a89402aa822c","Type":"ContainerDied","Data":"27a6db43f1898ac9e2ebe7807adbde6f140ea3ebfe5b9a9f7253a3946a7a11a9"} Feb 02 11:52:42 crc kubenswrapper[4909]: I0202 11:52:42.768178 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"624ff2b7-be29-48d9-9463-93df24dda1d7","Type":"ContainerStarted","Data":"2055745f6c676430942066d93a582a91b21eea78d7486123dabb15bfa5a1c765"} Feb 02 11:52:42 crc kubenswrapper[4909]: I0202 11:52:42.768740 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:52:42 crc kubenswrapper[4909]: I0202 11:52:42.770087 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5","Type":"ContainerStarted","Data":"173d4a6d65cb2b93ed6b1337136f8b36a6f1e7de5d5226fa0d8ef32279162cfe"} Feb 02 11:52:42 crc kubenswrapper[4909]: I0202 11:52:42.770378 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 02 11:52:42 crc kubenswrapper[4909]: I0202 11:52:42.794667 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=35.794649738 podStartE2EDuration="35.794649738s" podCreationTimestamp="2026-02-02 11:52:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:52:42.788942586 +0000 UTC m=+4888.535043331" watchObservedRunningTime="2026-02-02 11:52:42.794649738 +0000 UTC m=+4888.540750473" Feb 02 11:52:42 crc kubenswrapper[4909]: I0202 11:52:42.815203 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.815183711 podStartE2EDuration="36.815183711s" podCreationTimestamp="2026-02-02 11:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:52:42.81162456 +0000 UTC m=+4888.557725295" watchObservedRunningTime="2026-02-02 11:52:42.815183711 +0000 UTC m=+4888.561284436" Feb 02 11:52:43 crc kubenswrapper[4909]: I0202 11:52:43.076350 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-s4czs" Feb 02 11:52:43 crc kubenswrapper[4909]: I0202 11:52:43.178601 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfp24\" (UniqueName: \"kubernetes.io/projected/7faf3248-fd71-4f4e-960f-a89402aa822c-kube-api-access-qfp24\") pod \"7faf3248-fd71-4f4e-960f-a89402aa822c\" (UID: \"7faf3248-fd71-4f4e-960f-a89402aa822c\") " Feb 02 11:52:43 crc kubenswrapper[4909]: I0202 11:52:43.178665 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7faf3248-fd71-4f4e-960f-a89402aa822c-operator-scripts\") pod \"7faf3248-fd71-4f4e-960f-a89402aa822c\" (UID: \"7faf3248-fd71-4f4e-960f-a89402aa822c\") " Feb 02 11:52:43 crc kubenswrapper[4909]: I0202 11:52:43.179172 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7faf3248-fd71-4f4e-960f-a89402aa822c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7faf3248-fd71-4f4e-960f-a89402aa822c" (UID: "7faf3248-fd71-4f4e-960f-a89402aa822c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:52:43 crc kubenswrapper[4909]: I0202 11:52:43.184098 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7faf3248-fd71-4f4e-960f-a89402aa822c-kube-api-access-qfp24" (OuterVolumeSpecName: "kube-api-access-qfp24") pod "7faf3248-fd71-4f4e-960f-a89402aa822c" (UID: "7faf3248-fd71-4f4e-960f-a89402aa822c"). InnerVolumeSpecName "kube-api-access-qfp24". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:52:43 crc kubenswrapper[4909]: I0202 11:52:43.280309 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfp24\" (UniqueName: \"kubernetes.io/projected/7faf3248-fd71-4f4e-960f-a89402aa822c-kube-api-access-qfp24\") on node \"crc\" DevicePath \"\"" Feb 02 11:52:43 crc kubenswrapper[4909]: I0202 11:52:43.280339 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7faf3248-fd71-4f4e-960f-a89402aa822c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 11:52:43 crc kubenswrapper[4909]: I0202 11:52:43.779529 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-s4czs" Feb 02 11:52:43 crc kubenswrapper[4909]: I0202 11:52:43.780971 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-s4czs" event={"ID":"7faf3248-fd71-4f4e-960f-a89402aa822c","Type":"ContainerDied","Data":"6070497a08c670451c695c3bdf3823cde29b69e4d823ed941b1eedee4bccfe28"} Feb 02 11:52:43 crc kubenswrapper[4909]: I0202 11:52:43.781030 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6070497a08c670451c695c3bdf3823cde29b69e4d823ed941b1eedee4bccfe28" Feb 02 11:52:58 crc kubenswrapper[4909]: I0202 11:52:58.295069 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 02 11:52:58 crc kubenswrapper[4909]: I0202 11:52:58.541004 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:01 crc kubenswrapper[4909]: I0202 11:53:01.778253 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54dc9c94cc-vq5fj"] Feb 02 11:53:01 crc kubenswrapper[4909]: E0202 11:53:01.778838 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7faf3248-fd71-4f4e-960f-a89402aa822c" containerName="mariadb-account-create-update" Feb 02 11:53:01 crc kubenswrapper[4909]: I0202 11:53:01.778857 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7faf3248-fd71-4f4e-960f-a89402aa822c" containerName="mariadb-account-create-update" Feb 02 11:53:01 crc kubenswrapper[4909]: I0202 11:53:01.779022 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7faf3248-fd71-4f4e-960f-a89402aa822c" containerName="mariadb-account-create-update" Feb 02 11:53:01 crc kubenswrapper[4909]: I0202 11:53:01.780011 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54dc9c94cc-vq5fj" Feb 02 11:53:01 crc kubenswrapper[4909]: I0202 11:53:01.787510 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c9d8c83-78b4-40fe-9680-b9fecabd3728-config\") pod \"dnsmasq-dns-54dc9c94cc-vq5fj\" (UID: \"8c9d8c83-78b4-40fe-9680-b9fecabd3728\") " pod="openstack/dnsmasq-dns-54dc9c94cc-vq5fj" Feb 02 11:53:01 crc kubenswrapper[4909]: I0202 11:53:01.787717 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmtkv\" (UniqueName: \"kubernetes.io/projected/8c9d8c83-78b4-40fe-9680-b9fecabd3728-kube-api-access-cmtkv\") pod \"dnsmasq-dns-54dc9c94cc-vq5fj\" (UID: \"8c9d8c83-78b4-40fe-9680-b9fecabd3728\") " pod="openstack/dnsmasq-dns-54dc9c94cc-vq5fj" Feb 02 11:53:01 crc kubenswrapper[4909]: I0202 11:53:01.787741 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c9d8c83-78b4-40fe-9680-b9fecabd3728-dns-svc\") pod \"dnsmasq-dns-54dc9c94cc-vq5fj\" (UID: \"8c9d8c83-78b4-40fe-9680-b9fecabd3728\") " pod="openstack/dnsmasq-dns-54dc9c94cc-vq5fj" Feb 02 11:53:01 crc kubenswrapper[4909]: I0202 11:53:01.835634 4909 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54dc9c94cc-vq5fj"] Feb 02 11:53:01 crc kubenswrapper[4909]: I0202 11:53:01.888760 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c9d8c83-78b4-40fe-9680-b9fecabd3728-config\") pod \"dnsmasq-dns-54dc9c94cc-vq5fj\" (UID: \"8c9d8c83-78b4-40fe-9680-b9fecabd3728\") " pod="openstack/dnsmasq-dns-54dc9c94cc-vq5fj" Feb 02 11:53:01 crc kubenswrapper[4909]: I0202 11:53:01.889068 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmtkv\" (UniqueName: \"kubernetes.io/projected/8c9d8c83-78b4-40fe-9680-b9fecabd3728-kube-api-access-cmtkv\") pod \"dnsmasq-dns-54dc9c94cc-vq5fj\" (UID: \"8c9d8c83-78b4-40fe-9680-b9fecabd3728\") " pod="openstack/dnsmasq-dns-54dc9c94cc-vq5fj" Feb 02 11:53:01 crc kubenswrapper[4909]: I0202 11:53:01.889118 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c9d8c83-78b4-40fe-9680-b9fecabd3728-dns-svc\") pod \"dnsmasq-dns-54dc9c94cc-vq5fj\" (UID: \"8c9d8c83-78b4-40fe-9680-b9fecabd3728\") " pod="openstack/dnsmasq-dns-54dc9c94cc-vq5fj" Feb 02 11:53:01 crc kubenswrapper[4909]: I0202 11:53:01.889933 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c9d8c83-78b4-40fe-9680-b9fecabd3728-config\") pod \"dnsmasq-dns-54dc9c94cc-vq5fj\" (UID: \"8c9d8c83-78b4-40fe-9680-b9fecabd3728\") " pod="openstack/dnsmasq-dns-54dc9c94cc-vq5fj" Feb 02 11:53:01 crc kubenswrapper[4909]: I0202 11:53:01.890061 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c9d8c83-78b4-40fe-9680-b9fecabd3728-dns-svc\") pod \"dnsmasq-dns-54dc9c94cc-vq5fj\" (UID: \"8c9d8c83-78b4-40fe-9680-b9fecabd3728\") " pod="openstack/dnsmasq-dns-54dc9c94cc-vq5fj" Feb 02 11:53:01 crc 
kubenswrapper[4909]: I0202 11:53:01.908900 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmtkv\" (UniqueName: \"kubernetes.io/projected/8c9d8c83-78b4-40fe-9680-b9fecabd3728-kube-api-access-cmtkv\") pod \"dnsmasq-dns-54dc9c94cc-vq5fj\" (UID: \"8c9d8c83-78b4-40fe-9680-b9fecabd3728\") " pod="openstack/dnsmasq-dns-54dc9c94cc-vq5fj" Feb 02 11:53:02 crc kubenswrapper[4909]: I0202 11:53:02.099227 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54dc9c94cc-vq5fj" Feb 02 11:53:02 crc kubenswrapper[4909]: I0202 11:53:02.492600 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 11:53:02 crc kubenswrapper[4909]: I0202 11:53:02.534347 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54dc9c94cc-vq5fj"] Feb 02 11:53:02 crc kubenswrapper[4909]: W0202 11:53:02.549033 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c9d8c83_78b4_40fe_9680_b9fecabd3728.slice/crio-de78327881cd7dba573819a3b3556786e4b5095f3d80f87b1eb8da6d9fdeab62 WatchSource:0}: Error finding container de78327881cd7dba573819a3b3556786e4b5095f3d80f87b1eb8da6d9fdeab62: Status 404 returned error can't find the container with id de78327881cd7dba573819a3b3556786e4b5095f3d80f87b1eb8da6d9fdeab62 Feb 02 11:53:02 crc kubenswrapper[4909]: I0202 11:53:02.928979 4909 generic.go:334] "Generic (PLEG): container finished" podID="8c9d8c83-78b4-40fe-9680-b9fecabd3728" containerID="168050b6beedc681e89d77de5c6c0a44d2f7bd4a0462b09729eeb933bbfd9506" exitCode=0 Feb 02 11:53:02 crc kubenswrapper[4909]: I0202 11:53:02.929086 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54dc9c94cc-vq5fj" event={"ID":"8c9d8c83-78b4-40fe-9680-b9fecabd3728","Type":"ContainerDied","Data":"168050b6beedc681e89d77de5c6c0a44d2f7bd4a0462b09729eeb933bbfd9506"} Feb 02 
11:53:02 crc kubenswrapper[4909]: I0202 11:53:02.930153 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54dc9c94cc-vq5fj" event={"ID":"8c9d8c83-78b4-40fe-9680-b9fecabd3728","Type":"ContainerStarted","Data":"de78327881cd7dba573819a3b3556786e4b5095f3d80f87b1eb8da6d9fdeab62"} Feb 02 11:53:03 crc kubenswrapper[4909]: I0202 11:53:03.192434 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 11:53:03 crc kubenswrapper[4909]: I0202 11:53:03.937551 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54dc9c94cc-vq5fj" event={"ID":"8c9d8c83-78b4-40fe-9680-b9fecabd3728","Type":"ContainerStarted","Data":"9314a08470357547443738a9d12fcedbbd12b5cc7af79f59752923e34d32701c"} Feb 02 11:53:03 crc kubenswrapper[4909]: I0202 11:53:03.937703 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54dc9c94cc-vq5fj" Feb 02 11:53:03 crc kubenswrapper[4909]: I0202 11:53:03.967556 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54dc9c94cc-vq5fj" podStartSLOduration=2.967531338 podStartE2EDuration="2.967531338s" podCreationTimestamp="2026-02-02 11:53:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:53:03.96092111 +0000 UTC m=+4909.707021865" watchObservedRunningTime="2026-02-02 11:53:03.967531338 +0000 UTC m=+4909.713632073" Feb 02 11:53:06 crc kubenswrapper[4909]: I0202 11:53:06.524322 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="0848fba1-b8b4-4abd-9d4f-5adc86a46cd5" containerName="rabbitmq" containerID="cri-o://173d4a6d65cb2b93ed6b1337136f8b36a6f1e7de5d5226fa0d8ef32279162cfe" gracePeriod=604796 Feb 02 11:53:07 crc kubenswrapper[4909]: I0202 11:53:07.216481 4909 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="624ff2b7-be29-48d9-9463-93df24dda1d7" containerName="rabbitmq" containerID="cri-o://2055745f6c676430942066d93a582a91b21eea78d7486123dabb15bfa5a1c765" gracePeriod=604796 Feb 02 11:53:07 crc kubenswrapper[4909]: I0202 11:53:07.618441 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rsprd"] Feb 02 11:53:07 crc kubenswrapper[4909]: I0202 11:53:07.619897 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rsprd" Feb 02 11:53:07 crc kubenswrapper[4909]: I0202 11:53:07.644305 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rsprd"] Feb 02 11:53:07 crc kubenswrapper[4909]: I0202 11:53:07.671378 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xt6h\" (UniqueName: \"kubernetes.io/projected/3121abcc-c55d-493c-afa5-2912418cd7c1-kube-api-access-4xt6h\") pod \"community-operators-rsprd\" (UID: \"3121abcc-c55d-493c-afa5-2912418cd7c1\") " pod="openshift-marketplace/community-operators-rsprd" Feb 02 11:53:07 crc kubenswrapper[4909]: I0202 11:53:07.671505 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3121abcc-c55d-493c-afa5-2912418cd7c1-catalog-content\") pod \"community-operators-rsprd\" (UID: \"3121abcc-c55d-493c-afa5-2912418cd7c1\") " pod="openshift-marketplace/community-operators-rsprd" Feb 02 11:53:07 crc kubenswrapper[4909]: I0202 11:53:07.671531 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3121abcc-c55d-493c-afa5-2912418cd7c1-utilities\") pod \"community-operators-rsprd\" (UID: \"3121abcc-c55d-493c-afa5-2912418cd7c1\") " 
pod="openshift-marketplace/community-operators-rsprd" Feb 02 11:53:07 crc kubenswrapper[4909]: I0202 11:53:07.772392 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3121abcc-c55d-493c-afa5-2912418cd7c1-catalog-content\") pod \"community-operators-rsprd\" (UID: \"3121abcc-c55d-493c-afa5-2912418cd7c1\") " pod="openshift-marketplace/community-operators-rsprd" Feb 02 11:53:07 crc kubenswrapper[4909]: I0202 11:53:07.772452 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3121abcc-c55d-493c-afa5-2912418cd7c1-utilities\") pod \"community-operators-rsprd\" (UID: \"3121abcc-c55d-493c-afa5-2912418cd7c1\") " pod="openshift-marketplace/community-operators-rsprd" Feb 02 11:53:07 crc kubenswrapper[4909]: I0202 11:53:07.772520 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xt6h\" (UniqueName: \"kubernetes.io/projected/3121abcc-c55d-493c-afa5-2912418cd7c1-kube-api-access-4xt6h\") pod \"community-operators-rsprd\" (UID: \"3121abcc-c55d-493c-afa5-2912418cd7c1\") " pod="openshift-marketplace/community-operators-rsprd" Feb 02 11:53:07 crc kubenswrapper[4909]: I0202 11:53:07.773010 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3121abcc-c55d-493c-afa5-2912418cd7c1-catalog-content\") pod \"community-operators-rsprd\" (UID: \"3121abcc-c55d-493c-afa5-2912418cd7c1\") " pod="openshift-marketplace/community-operators-rsprd" Feb 02 11:53:07 crc kubenswrapper[4909]: I0202 11:53:07.773119 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3121abcc-c55d-493c-afa5-2912418cd7c1-utilities\") pod \"community-operators-rsprd\" (UID: \"3121abcc-c55d-493c-afa5-2912418cd7c1\") " 
pod="openshift-marketplace/community-operators-rsprd" Feb 02 11:53:07 crc kubenswrapper[4909]: I0202 11:53:07.792130 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xt6h\" (UniqueName: \"kubernetes.io/projected/3121abcc-c55d-493c-afa5-2912418cd7c1-kube-api-access-4xt6h\") pod \"community-operators-rsprd\" (UID: \"3121abcc-c55d-493c-afa5-2912418cd7c1\") " pod="openshift-marketplace/community-operators-rsprd" Feb 02 11:53:07 crc kubenswrapper[4909]: I0202 11:53:07.940537 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rsprd" Feb 02 11:53:08 crc kubenswrapper[4909]: I0202 11:53:08.264306 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rsprd"] Feb 02 11:53:08 crc kubenswrapper[4909]: W0202 11:53:08.264510 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3121abcc_c55d_493c_afa5_2912418cd7c1.slice/crio-d6e857697adc4f891b5a51d5f1d26916ba2144fc72e06d1a7d8660b203b13084 WatchSource:0}: Error finding container d6e857697adc4f891b5a51d5f1d26916ba2144fc72e06d1a7d8660b203b13084: Status 404 returned error can't find the container with id d6e857697adc4f891b5a51d5f1d26916ba2144fc72e06d1a7d8660b203b13084 Feb 02 11:53:08 crc kubenswrapper[4909]: I0202 11:53:08.290991 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="0848fba1-b8b4-4abd-9d4f-5adc86a46cd5" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.243:5671: connect: connection refused" Feb 02 11:53:08 crc kubenswrapper[4909]: I0202 11:53:08.539077 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="624ff2b7-be29-48d9-9463-93df24dda1d7" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.244:5671: connect: connection refused" Feb 02 
11:53:08 crc kubenswrapper[4909]: I0202 11:53:08.971719 4909 generic.go:334] "Generic (PLEG): container finished" podID="3121abcc-c55d-493c-afa5-2912418cd7c1" containerID="619326215baa645338c6134513aa5a58d34136edc7789165cb8adc5b4cea22df" exitCode=0 Feb 02 11:53:08 crc kubenswrapper[4909]: I0202 11:53:08.971757 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rsprd" event={"ID":"3121abcc-c55d-493c-afa5-2912418cd7c1","Type":"ContainerDied","Data":"619326215baa645338c6134513aa5a58d34136edc7789165cb8adc5b4cea22df"} Feb 02 11:53:08 crc kubenswrapper[4909]: I0202 11:53:08.971796 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rsprd" event={"ID":"3121abcc-c55d-493c-afa5-2912418cd7c1","Type":"ContainerStarted","Data":"d6e857697adc4f891b5a51d5f1d26916ba2144fc72e06d1a7d8660b203b13084"} Feb 02 11:53:09 crc kubenswrapper[4909]: I0202 11:53:09.981031 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rsprd" event={"ID":"3121abcc-c55d-493c-afa5-2912418cd7c1","Type":"ContainerStarted","Data":"0c321f96456a4470f204fe167c7a4ad5bf39b771a48d4274b44fab15ab2a4a0f"} Feb 02 11:53:10 crc kubenswrapper[4909]: I0202 11:53:10.988727 4909 generic.go:334] "Generic (PLEG): container finished" podID="3121abcc-c55d-493c-afa5-2912418cd7c1" containerID="0c321f96456a4470f204fe167c7a4ad5bf39b771a48d4274b44fab15ab2a4a0f" exitCode=0 Feb 02 11:53:10 crc kubenswrapper[4909]: I0202 11:53:10.988765 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rsprd" event={"ID":"3121abcc-c55d-493c-afa5-2912418cd7c1","Type":"ContainerDied","Data":"0c321f96456a4470f204fe167c7a4ad5bf39b771a48d4274b44fab15ab2a4a0f"} Feb 02 11:53:11 crc kubenswrapper[4909]: I0202 11:53:11.999554 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rsprd" 
event={"ID":"3121abcc-c55d-493c-afa5-2912418cd7c1","Type":"ContainerStarted","Data":"44988263f3cce49dc8bb73c35fb95740260229d1922f7bba2cfe377559fa4abc"} Feb 02 11:53:12 crc kubenswrapper[4909]: I0202 11:53:12.027724 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rsprd" podStartSLOduration=2.53416027 podStartE2EDuration="5.027683302s" podCreationTimestamp="2026-02-02 11:53:07 +0000 UTC" firstStartedPulling="2026-02-02 11:53:08.973375532 +0000 UTC m=+4914.719476267" lastFinishedPulling="2026-02-02 11:53:11.466898564 +0000 UTC m=+4917.212999299" observedRunningTime="2026-02-02 11:53:12.02019947 +0000 UTC m=+4917.766300205" watchObservedRunningTime="2026-02-02 11:53:12.027683302 +0000 UTC m=+4917.773784037" Feb 02 11:53:12 crc kubenswrapper[4909]: I0202 11:53:12.101104 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54dc9c94cc-vq5fj" Feb 02 11:53:12 crc kubenswrapper[4909]: I0202 11:53:12.145311 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589cf688cc-2cbbb"] Feb 02 11:53:12 crc kubenswrapper[4909]: I0202 11:53:12.145603 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-589cf688cc-2cbbb" podUID="3a9a11fb-cc8f-4e38-94e2-71adebf5db3f" containerName="dnsmasq-dns" containerID="cri-o://09d4434446d977b6cc2e803f996a4168803b6153b835b3a169b966f7f5211f6a" gracePeriod=10 Feb 02 11:53:12 crc kubenswrapper[4909]: I0202 11:53:12.554977 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589cf688cc-2cbbb" Feb 02 11:53:12 crc kubenswrapper[4909]: I0202 11:53:12.690606 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a9a11fb-cc8f-4e38-94e2-71adebf5db3f-config\") pod \"3a9a11fb-cc8f-4e38-94e2-71adebf5db3f\" (UID: \"3a9a11fb-cc8f-4e38-94e2-71adebf5db3f\") " Feb 02 11:53:12 crc kubenswrapper[4909]: I0202 11:53:12.690657 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqltw\" (UniqueName: \"kubernetes.io/projected/3a9a11fb-cc8f-4e38-94e2-71adebf5db3f-kube-api-access-qqltw\") pod \"3a9a11fb-cc8f-4e38-94e2-71adebf5db3f\" (UID: \"3a9a11fb-cc8f-4e38-94e2-71adebf5db3f\") " Feb 02 11:53:12 crc kubenswrapper[4909]: I0202 11:53:12.690701 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a9a11fb-cc8f-4e38-94e2-71adebf5db3f-dns-svc\") pod \"3a9a11fb-cc8f-4e38-94e2-71adebf5db3f\" (UID: \"3a9a11fb-cc8f-4e38-94e2-71adebf5db3f\") " Feb 02 11:53:12 crc kubenswrapper[4909]: I0202 11:53:12.696280 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a9a11fb-cc8f-4e38-94e2-71adebf5db3f-kube-api-access-qqltw" (OuterVolumeSpecName: "kube-api-access-qqltw") pod "3a9a11fb-cc8f-4e38-94e2-71adebf5db3f" (UID: "3a9a11fb-cc8f-4e38-94e2-71adebf5db3f"). InnerVolumeSpecName "kube-api-access-qqltw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:53:12 crc kubenswrapper[4909]: I0202 11:53:12.729270 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a9a11fb-cc8f-4e38-94e2-71adebf5db3f-config" (OuterVolumeSpecName: "config") pod "3a9a11fb-cc8f-4e38-94e2-71adebf5db3f" (UID: "3a9a11fb-cc8f-4e38-94e2-71adebf5db3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:53:12 crc kubenswrapper[4909]: I0202 11:53:12.730391 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a9a11fb-cc8f-4e38-94e2-71adebf5db3f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3a9a11fb-cc8f-4e38-94e2-71adebf5db3f" (UID: "3a9a11fb-cc8f-4e38-94e2-71adebf5db3f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:53:12 crc kubenswrapper[4909]: I0202 11:53:12.792926 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a9a11fb-cc8f-4e38-94e2-71adebf5db3f-config\") on node \"crc\" DevicePath \"\"" Feb 02 11:53:12 crc kubenswrapper[4909]: I0202 11:53:12.792973 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqltw\" (UniqueName: \"kubernetes.io/projected/3a9a11fb-cc8f-4e38-94e2-71adebf5db3f-kube-api-access-qqltw\") on node \"crc\" DevicePath \"\"" Feb 02 11:53:12 crc kubenswrapper[4909]: I0202 11:53:12.792989 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a9a11fb-cc8f-4e38-94e2-71adebf5db3f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.008979 4909 generic.go:334] "Generic (PLEG): container finished" podID="3a9a11fb-cc8f-4e38-94e2-71adebf5db3f" containerID="09d4434446d977b6cc2e803f996a4168803b6153b835b3a169b966f7f5211f6a" exitCode=0 Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.009058 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589cf688cc-2cbbb" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.009072 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589cf688cc-2cbbb" event={"ID":"3a9a11fb-cc8f-4e38-94e2-71adebf5db3f","Type":"ContainerDied","Data":"09d4434446d977b6cc2e803f996a4168803b6153b835b3a169b966f7f5211f6a"} Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.009143 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589cf688cc-2cbbb" event={"ID":"3a9a11fb-cc8f-4e38-94e2-71adebf5db3f","Type":"ContainerDied","Data":"eaee1ef6c86429639eceb68452a4b6113582ba34388cd7d5484ecdb59be2c8c7"} Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.009170 4909 scope.go:117] "RemoveContainer" containerID="09d4434446d977b6cc2e803f996a4168803b6153b835b3a169b966f7f5211f6a" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.011452 4909 generic.go:334] "Generic (PLEG): container finished" podID="0848fba1-b8b4-4abd-9d4f-5adc86a46cd5" containerID="173d4a6d65cb2b93ed6b1337136f8b36a6f1e7de5d5226fa0d8ef32279162cfe" exitCode=0 Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.012032 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5","Type":"ContainerDied","Data":"173d4a6d65cb2b93ed6b1337136f8b36a6f1e7de5d5226fa0d8ef32279162cfe"} Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.012071 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5","Type":"ContainerDied","Data":"4f868ca3c7e2e59b0831c3fc8af10a700cd801c9a298d14e7b0087c15d0f4a03"} Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.012081 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f868ca3c7e2e59b0831c3fc8af10a700cd801c9a298d14e7b0087c15d0f4a03" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 
11:53:13.053650 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.063634 4909 scope.go:117] "RemoveContainer" containerID="c065a4f388f734685d5f47dfc4fde120e8c15370641c310079476921f40b4695" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.099261 4909 scope.go:117] "RemoveContainer" containerID="09d4434446d977b6cc2e803f996a4168803b6153b835b3a169b966f7f5211f6a" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.099726 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-server-conf\") pod \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.099757 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-rabbitmq-plugins\") pod \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.099799 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-plugins-conf\") pod \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.099850 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2gsx\" (UniqueName: \"kubernetes.io/projected/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-kube-api-access-m2gsx\") pod \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.099960 4909 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d7c2715-3877-4edd-a9c3-c1c47c45414c\") pod \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.099983 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-rabbitmq-confd\") pod \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.100001 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-config-data\") pod \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.100033 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-rabbitmq-tls\") pod \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.100052 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-erlang-cookie-secret\") pod \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.100095 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-rabbitmq-erlang-cookie\") pod 
\"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.100125 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-pod-info\") pod \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\" (UID: \"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5\") " Feb 02 11:53:13 crc kubenswrapper[4909]: E0202 11:53:13.100741 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09d4434446d977b6cc2e803f996a4168803b6153b835b3a169b966f7f5211f6a\": container with ID starting with 09d4434446d977b6cc2e803f996a4168803b6153b835b3a169b966f7f5211f6a not found: ID does not exist" containerID="09d4434446d977b6cc2e803f996a4168803b6153b835b3a169b966f7f5211f6a" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.100776 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09d4434446d977b6cc2e803f996a4168803b6153b835b3a169b966f7f5211f6a"} err="failed to get container status \"09d4434446d977b6cc2e803f996a4168803b6153b835b3a169b966f7f5211f6a\": rpc error: code = NotFound desc = could not find container \"09d4434446d977b6cc2e803f996a4168803b6153b835b3a169b966f7f5211f6a\": container with ID starting with 09d4434446d977b6cc2e803f996a4168803b6153b835b3a169b966f7f5211f6a not found: ID does not exist" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.100817 4909 scope.go:117] "RemoveContainer" containerID="c065a4f388f734685d5f47dfc4fde120e8c15370641c310079476921f40b4695" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.103256 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "0848fba1-b8b4-4abd-9d4f-5adc86a46cd5" (UID: 
"0848fba1-b8b4-4abd-9d4f-5adc86a46cd5"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:53:13 crc kubenswrapper[4909]: E0202 11:53:13.103678 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c065a4f388f734685d5f47dfc4fde120e8c15370641c310079476921f40b4695\": container with ID starting with c065a4f388f734685d5f47dfc4fde120e8c15370641c310079476921f40b4695 not found: ID does not exist" containerID="c065a4f388f734685d5f47dfc4fde120e8c15370641c310079476921f40b4695" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.103702 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c065a4f388f734685d5f47dfc4fde120e8c15370641c310079476921f40b4695"} err="failed to get container status \"c065a4f388f734685d5f47dfc4fde120e8c15370641c310079476921f40b4695\": rpc error: code = NotFound desc = could not find container \"c065a4f388f734685d5f47dfc4fde120e8c15370641c310079476921f40b4695\": container with ID starting with c065a4f388f734685d5f47dfc4fde120e8c15370641c310079476921f40b4695 not found: ID does not exist" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.105106 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "0848fba1-b8b4-4abd-9d4f-5adc86a46cd5" (UID: "0848fba1-b8b4-4abd-9d4f-5adc86a46cd5"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.105865 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-pod-info" (OuterVolumeSpecName: "pod-info") pod "0848fba1-b8b4-4abd-9d4f-5adc86a46cd5" (UID: "0848fba1-b8b4-4abd-9d4f-5adc86a46cd5"). 
InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.108097 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "0848fba1-b8b4-4abd-9d4f-5adc86a46cd5" (UID: "0848fba1-b8b4-4abd-9d4f-5adc86a46cd5"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.112944 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-kube-api-access-m2gsx" (OuterVolumeSpecName: "kube-api-access-m2gsx") pod "0848fba1-b8b4-4abd-9d4f-5adc86a46cd5" (UID: "0848fba1-b8b4-4abd-9d4f-5adc86a46cd5"). InnerVolumeSpecName "kube-api-access-m2gsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.113085 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "0848fba1-b8b4-4abd-9d4f-5adc86a46cd5" (UID: "0848fba1-b8b4-4abd-9d4f-5adc86a46cd5"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.115546 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "0848fba1-b8b4-4abd-9d4f-5adc86a46cd5" (UID: "0848fba1-b8b4-4abd-9d4f-5adc86a46cd5"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.125466 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-config-data" (OuterVolumeSpecName: "config-data") pod "0848fba1-b8b4-4abd-9d4f-5adc86a46cd5" (UID: "0848fba1-b8b4-4abd-9d4f-5adc86a46cd5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.156667 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d7c2715-3877-4edd-a9c3-c1c47c45414c" (OuterVolumeSpecName: "persistence") pod "0848fba1-b8b4-4abd-9d4f-5adc86a46cd5" (UID: "0848fba1-b8b4-4abd-9d4f-5adc86a46cd5"). InnerVolumeSpecName "pvc-5d7c2715-3877-4edd-a9c3-c1c47c45414c". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.169768 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-server-conf" (OuterVolumeSpecName: "server-conf") pod "0848fba1-b8b4-4abd-9d4f-5adc86a46cd5" (UID: "0848fba1-b8b4-4abd-9d4f-5adc86a46cd5"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.199661 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "0848fba1-b8b4-4abd-9d4f-5adc86a46cd5" (UID: "0848fba1-b8b4-4abd-9d4f-5adc86a46cd5"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.201927 4909 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.201960 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2gsx\" (UniqueName: \"kubernetes.io/projected/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-kube-api-access-m2gsx\") on node \"crc\" DevicePath \"\"" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.202006 4909 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-5d7c2715-3877-4edd-a9c3-c1c47c45414c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d7c2715-3877-4edd-a9c3-c1c47c45414c\") on node \"crc\" " Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.202020 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.202033 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.202044 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.202056 4909 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 02 
11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.202067 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.202081 4909 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-pod-info\") on node \"crc\" DevicePath \"\"" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.202090 4909 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-server-conf\") on node \"crc\" DevicePath \"\"" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.202100 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.217647 4909 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.217863 4909 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-5d7c2715-3877-4edd-a9c3-c1c47c45414c" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d7c2715-3877-4edd-a9c3-c1c47c45414c") on node "crc" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.303358 4909 reconciler_common.go:293] "Volume detached for volume \"pvc-5d7c2715-3877-4edd-a9c3-c1c47c45414c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d7c2715-3877-4edd-a9c3-c1c47c45414c\") on node \"crc\" DevicePath \"\"" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.742725 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.810033 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq57j\" (UniqueName: \"kubernetes.io/projected/624ff2b7-be29-48d9-9463-93df24dda1d7-kube-api-access-dq57j\") pod \"624ff2b7-be29-48d9-9463-93df24dda1d7\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.810096 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/624ff2b7-be29-48d9-9463-93df24dda1d7-rabbitmq-erlang-cookie\") pod \"624ff2b7-be29-48d9-9463-93df24dda1d7\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.810134 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/624ff2b7-be29-48d9-9463-93df24dda1d7-pod-info\") pod \"624ff2b7-be29-48d9-9463-93df24dda1d7\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.810159 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/624ff2b7-be29-48d9-9463-93df24dda1d7-rabbitmq-plugins\") pod \"624ff2b7-be29-48d9-9463-93df24dda1d7\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.810257 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ccc338b-a914-4822-8d73-052e6efdf0f1\") pod \"624ff2b7-be29-48d9-9463-93df24dda1d7\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.810294 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/624ff2b7-be29-48d9-9463-93df24dda1d7-server-conf\") pod \"624ff2b7-be29-48d9-9463-93df24dda1d7\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.810344 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/624ff2b7-be29-48d9-9463-93df24dda1d7-rabbitmq-confd\") pod \"624ff2b7-be29-48d9-9463-93df24dda1d7\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.810361 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/624ff2b7-be29-48d9-9463-93df24dda1d7-plugins-conf\") pod \"624ff2b7-be29-48d9-9463-93df24dda1d7\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.810378 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/624ff2b7-be29-48d9-9463-93df24dda1d7-rabbitmq-tls\") pod \"624ff2b7-be29-48d9-9463-93df24dda1d7\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.810397 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/624ff2b7-be29-48d9-9463-93df24dda1d7-config-data\") pod \"624ff2b7-be29-48d9-9463-93df24dda1d7\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.810412 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/624ff2b7-be29-48d9-9463-93df24dda1d7-erlang-cookie-secret\") pod \"624ff2b7-be29-48d9-9463-93df24dda1d7\" (UID: \"624ff2b7-be29-48d9-9463-93df24dda1d7\") " Feb 02 
11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.813007 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/624ff2b7-be29-48d9-9463-93df24dda1d7-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "624ff2b7-be29-48d9-9463-93df24dda1d7" (UID: "624ff2b7-be29-48d9-9463-93df24dda1d7"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.813285 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/624ff2b7-be29-48d9-9463-93df24dda1d7-pod-info" (OuterVolumeSpecName: "pod-info") pod "624ff2b7-be29-48d9-9463-93df24dda1d7" (UID: "624ff2b7-be29-48d9-9463-93df24dda1d7"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.814672 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/624ff2b7-be29-48d9-9463-93df24dda1d7-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "624ff2b7-be29-48d9-9463-93df24dda1d7" (UID: "624ff2b7-be29-48d9-9463-93df24dda1d7"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.818417 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/624ff2b7-be29-48d9-9463-93df24dda1d7-kube-api-access-dq57j" (OuterVolumeSpecName: "kube-api-access-dq57j") pod "624ff2b7-be29-48d9-9463-93df24dda1d7" (UID: "624ff2b7-be29-48d9-9463-93df24dda1d7"). InnerVolumeSpecName "kube-api-access-dq57j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.823182 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/624ff2b7-be29-48d9-9463-93df24dda1d7-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "624ff2b7-be29-48d9-9463-93df24dda1d7" (UID: "624ff2b7-be29-48d9-9463-93df24dda1d7"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.823944 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/624ff2b7-be29-48d9-9463-93df24dda1d7-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "624ff2b7-be29-48d9-9463-93df24dda1d7" (UID: "624ff2b7-be29-48d9-9463-93df24dda1d7"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.835759 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624ff2b7-be29-48d9-9463-93df24dda1d7-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "624ff2b7-be29-48d9-9463-93df24dda1d7" (UID: "624ff2b7-be29-48d9-9463-93df24dda1d7"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.855568 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/624ff2b7-be29-48d9-9463-93df24dda1d7-config-data" (OuterVolumeSpecName: "config-data") pod "624ff2b7-be29-48d9-9463-93df24dda1d7" (UID: "624ff2b7-be29-48d9-9463-93df24dda1d7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.855642 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ccc338b-a914-4822-8d73-052e6efdf0f1" (OuterVolumeSpecName: "persistence") pod "624ff2b7-be29-48d9-9463-93df24dda1d7" (UID: "624ff2b7-be29-48d9-9463-93df24dda1d7"). InnerVolumeSpecName "pvc-0ccc338b-a914-4822-8d73-052e6efdf0f1". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.863755 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/624ff2b7-be29-48d9-9463-93df24dda1d7-server-conf" (OuterVolumeSpecName: "server-conf") pod "624ff2b7-be29-48d9-9463-93df24dda1d7" (UID: "624ff2b7-be29-48d9-9463-93df24dda1d7"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.903096 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/624ff2b7-be29-48d9-9463-93df24dda1d7-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "624ff2b7-be29-48d9-9463-93df24dda1d7" (UID: "624ff2b7-be29-48d9-9463-93df24dda1d7"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.911397 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/624ff2b7-be29-48d9-9463-93df24dda1d7-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.911600 4909 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/624ff2b7-be29-48d9-9463-93df24dda1d7-pod-info\") on node \"crc\" DevicePath \"\"" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.911693 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/624ff2b7-be29-48d9-9463-93df24dda1d7-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.911841 4909 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-0ccc338b-a914-4822-8d73-052e6efdf0f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ccc338b-a914-4822-8d73-052e6efdf0f1\") on node \"crc\" " Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.911924 4909 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/624ff2b7-be29-48d9-9463-93df24dda1d7-server-conf\") on node \"crc\" DevicePath \"\"" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.912000 4909 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/624ff2b7-be29-48d9-9463-93df24dda1d7-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.912074 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/624ff2b7-be29-48d9-9463-93df24dda1d7-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 02 11:53:13 
crc kubenswrapper[4909]: I0202 11:53:13.912133 4909 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/624ff2b7-be29-48d9-9463-93df24dda1d7-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.912212 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/624ff2b7-be29-48d9-9463-93df24dda1d7-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.912270 4909 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/624ff2b7-be29-48d9-9463-93df24dda1d7-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.912333 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq57j\" (UniqueName: \"kubernetes.io/projected/624ff2b7-be29-48d9-9463-93df24dda1d7-kube-api-access-dq57j\") on node \"crc\" DevicePath \"\"" Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.932686 4909 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 02 11:53:13 crc kubenswrapper[4909]: I0202 11:53:13.933081 4909 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-0ccc338b-a914-4822-8d73-052e6efdf0f1" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ccc338b-a914-4822-8d73-052e6efdf0f1") on node "crc" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.012837 4909 reconciler_common.go:293] "Volume detached for volume \"pvc-0ccc338b-a914-4822-8d73-052e6efdf0f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ccc338b-a914-4822-8d73-052e6efdf0f1\") on node \"crc\" DevicePath \"\"" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.020256 4909 generic.go:334] "Generic (PLEG): container finished" podID="624ff2b7-be29-48d9-9463-93df24dda1d7" containerID="2055745f6c676430942066d93a582a91b21eea78d7486123dabb15bfa5a1c765" exitCode=0 Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.020321 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.020346 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.020568 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"624ff2b7-be29-48d9-9463-93df24dda1d7","Type":"ContainerDied","Data":"2055745f6c676430942066d93a582a91b21eea78d7486123dabb15bfa5a1c765"} Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.020699 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"624ff2b7-be29-48d9-9463-93df24dda1d7","Type":"ContainerDied","Data":"d9a503a4ad3211e4c81d38ba7d1ec444b6ff7cbd81ff34c710b4f805989688a2"} Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.020762 4909 scope.go:117] "RemoveContainer" containerID="2055745f6c676430942066d93a582a91b21eea78d7486123dabb15bfa5a1c765" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.040554 4909 scope.go:117] "RemoveContainer" containerID="45c0d335d028939c778b94f6a331c64a03b66943f60eec25baaa9a8400f748e6" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.060391 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.086297 4909 scope.go:117] "RemoveContainer" containerID="2055745f6c676430942066d93a582a91b21eea78d7486123dabb15bfa5a1c765" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.086376 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 11:53:14 crc kubenswrapper[4909]: E0202 11:53:14.090478 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2055745f6c676430942066d93a582a91b21eea78d7486123dabb15bfa5a1c765\": container with ID starting with 2055745f6c676430942066d93a582a91b21eea78d7486123dabb15bfa5a1c765 not found: ID does not exist" containerID="2055745f6c676430942066d93a582a91b21eea78d7486123dabb15bfa5a1c765" Feb 02 11:53:14 
crc kubenswrapper[4909]: I0202 11:53:14.090529 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2055745f6c676430942066d93a582a91b21eea78d7486123dabb15bfa5a1c765"} err="failed to get container status \"2055745f6c676430942066d93a582a91b21eea78d7486123dabb15bfa5a1c765\": rpc error: code = NotFound desc = could not find container \"2055745f6c676430942066d93a582a91b21eea78d7486123dabb15bfa5a1c765\": container with ID starting with 2055745f6c676430942066d93a582a91b21eea78d7486123dabb15bfa5a1c765 not found: ID does not exist" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.090598 4909 scope.go:117] "RemoveContainer" containerID="45c0d335d028939c778b94f6a331c64a03b66943f60eec25baaa9a8400f748e6" Feb 02 11:53:14 crc kubenswrapper[4909]: E0202 11:53:14.091939 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45c0d335d028939c778b94f6a331c64a03b66943f60eec25baaa9a8400f748e6\": container with ID starting with 45c0d335d028939c778b94f6a331c64a03b66943f60eec25baaa9a8400f748e6 not found: ID does not exist" containerID="45c0d335d028939c778b94f6a331c64a03b66943f60eec25baaa9a8400f748e6" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.092070 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45c0d335d028939c778b94f6a331c64a03b66943f60eec25baaa9a8400f748e6"} err="failed to get container status \"45c0d335d028939c778b94f6a331c64a03b66943f60eec25baaa9a8400f748e6\": rpc error: code = NotFound desc = could not find container \"45c0d335d028939c778b94f6a331c64a03b66943f60eec25baaa9a8400f748e6\": container with ID starting with 45c0d335d028939c778b94f6a331c64a03b66943f60eec25baaa9a8400f748e6 not found: ID does not exist" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.098413 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 11:53:14 crc 
kubenswrapper[4909]: I0202 11:53:14.114184 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.125209 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 11:53:14 crc kubenswrapper[4909]: E0202 11:53:14.125720 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="624ff2b7-be29-48d9-9463-93df24dda1d7" containerName="setup-container" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.125743 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="624ff2b7-be29-48d9-9463-93df24dda1d7" containerName="setup-container" Feb 02 11:53:14 crc kubenswrapper[4909]: E0202 11:53:14.125758 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0848fba1-b8b4-4abd-9d4f-5adc86a46cd5" containerName="setup-container" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.125764 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0848fba1-b8b4-4abd-9d4f-5adc86a46cd5" containerName="setup-container" Feb 02 11:53:14 crc kubenswrapper[4909]: E0202 11:53:14.125777 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a9a11fb-cc8f-4e38-94e2-71adebf5db3f" containerName="init" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.125784 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a9a11fb-cc8f-4e38-94e2-71adebf5db3f" containerName="init" Feb 02 11:53:14 crc kubenswrapper[4909]: E0202 11:53:14.125794 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a9a11fb-cc8f-4e38-94e2-71adebf5db3f" containerName="dnsmasq-dns" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.125799 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a9a11fb-cc8f-4e38-94e2-71adebf5db3f" containerName="dnsmasq-dns" Feb 02 11:53:14 crc kubenswrapper[4909]: E0202 11:53:14.125832 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="624ff2b7-be29-48d9-9463-93df24dda1d7" containerName="rabbitmq" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.125840 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="624ff2b7-be29-48d9-9463-93df24dda1d7" containerName="rabbitmq" Feb 02 11:53:14 crc kubenswrapper[4909]: E0202 11:53:14.125852 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0848fba1-b8b4-4abd-9d4f-5adc86a46cd5" containerName="rabbitmq" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.125859 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0848fba1-b8b4-4abd-9d4f-5adc86a46cd5" containerName="rabbitmq" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.125986 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a9a11fb-cc8f-4e38-94e2-71adebf5db3f" containerName="dnsmasq-dns" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.126005 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="0848fba1-b8b4-4abd-9d4f-5adc86a46cd5" containerName="rabbitmq" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.126016 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="624ff2b7-be29-48d9-9463-93df24dda1d7" containerName="rabbitmq" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.126792 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.129766 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.131126 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.131472 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.131645 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.131874 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.132250 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.132378 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-4pnr5" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.133216 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.134607 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.137492 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.137726 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.138246 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.138405 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.138565 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-h7vb2" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.138742 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.138947 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.139976 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.147324 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.316680 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a578b144-a50a-4b91-9410-493990a51e5a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 
11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.317097 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0ccc338b-a914-4822-8d73-052e6efdf0f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ccc338b-a914-4822-8d73-052e6efdf0f1\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.317208 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a578b144-a50a-4b91-9410-493990a51e5a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.317348 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5d7c2715-3877-4edd-a9c3-c1c47c45414c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d7c2715-3877-4edd-a9c3-c1c47c45414c\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.317452 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/64589d75-4c87-4648-ad24-bfbd620e9f2d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.317552 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a578b144-a50a-4b91-9410-493990a51e5a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.317669 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a578b144-a50a-4b91-9410-493990a51e5a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.317771 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a578b144-a50a-4b91-9410-493990a51e5a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.317906 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a578b144-a50a-4b91-9410-493990a51e5a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.318029 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hcr2\" (UniqueName: \"kubernetes.io/projected/64589d75-4c87-4648-ad24-bfbd620e9f2d-kube-api-access-2hcr2\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.318150 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a578b144-a50a-4b91-9410-493990a51e5a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.318255 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhj54\" (UniqueName: \"kubernetes.io/projected/a578b144-a50a-4b91-9410-493990a51e5a-kube-api-access-bhj54\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.318370 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/64589d75-4c87-4648-ad24-bfbd620e9f2d-config-data\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.318483 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/64589d75-4c87-4648-ad24-bfbd620e9f2d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.318600 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/64589d75-4c87-4648-ad24-bfbd620e9f2d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.318688 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/64589d75-4c87-4648-ad24-bfbd620e9f2d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " 
pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.318793 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a578b144-a50a-4b91-9410-493990a51e5a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.318923 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/64589d75-4c87-4648-ad24-bfbd620e9f2d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.319029 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/64589d75-4c87-4648-ad24-bfbd620e9f2d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.319137 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a578b144-a50a-4b91-9410-493990a51e5a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.319248 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/64589d75-4c87-4648-ad24-bfbd620e9f2d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" 
Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.319375 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/64589d75-4c87-4648-ad24-bfbd620e9f2d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.420693 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/64589d75-4c87-4648-ad24-bfbd620e9f2d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.420744 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a578b144-a50a-4b91-9410-493990a51e5a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.420793 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a578b144-a50a-4b91-9410-493990a51e5a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.420830 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a578b144-a50a-4b91-9410-493990a51e5a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.420874 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a578b144-a50a-4b91-9410-493990a51e5a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.420918 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hcr2\" (UniqueName: \"kubernetes.io/projected/64589d75-4c87-4648-ad24-bfbd620e9f2d-kube-api-access-2hcr2\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.420948 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a578b144-a50a-4b91-9410-493990a51e5a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.420972 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhj54\" (UniqueName: \"kubernetes.io/projected/a578b144-a50a-4b91-9410-493990a51e5a-kube-api-access-bhj54\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.420997 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/64589d75-4c87-4648-ad24-bfbd620e9f2d-config-data\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.421018 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/64589d75-4c87-4648-ad24-bfbd620e9f2d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.421050 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/64589d75-4c87-4648-ad24-bfbd620e9f2d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.421071 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/64589d75-4c87-4648-ad24-bfbd620e9f2d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.421096 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a578b144-a50a-4b91-9410-493990a51e5a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.421147 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/64589d75-4c87-4648-ad24-bfbd620e9f2d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.421173 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/64589d75-4c87-4648-ad24-bfbd620e9f2d-pod-info\") pod \"rabbitmq-server-0\" (UID: 
\"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.421200 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a578b144-a50a-4b91-9410-493990a51e5a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.421231 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/64589d75-4c87-4648-ad24-bfbd620e9f2d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.421268 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/64589d75-4c87-4648-ad24-bfbd620e9f2d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.421288 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a578b144-a50a-4b91-9410-493990a51e5a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.421321 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0ccc338b-a914-4822-8d73-052e6efdf0f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ccc338b-a914-4822-8d73-052e6efdf0f1\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 
02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.421353 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a578b144-a50a-4b91-9410-493990a51e5a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.421394 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5d7c2715-3877-4edd-a9c3-c1c47c45414c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d7c2715-3877-4edd-a9c3-c1c47c45414c\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.423449 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/64589d75-4c87-4648-ad24-bfbd620e9f2d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.423969 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a578b144-a50a-4b91-9410-493990a51e5a-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.425267 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a578b144-a50a-4b91-9410-493990a51e5a-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.426104 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/64589d75-4c87-4648-ad24-bfbd620e9f2d-config-data\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.426856 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a578b144-a50a-4b91-9410-493990a51e5a-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.426989 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/64589d75-4c87-4648-ad24-bfbd620e9f2d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.427934 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a578b144-a50a-4b91-9410-493990a51e5a-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.428962 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a578b144-a50a-4b91-9410-493990a51e5a-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.429237 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/64589d75-4c87-4648-ad24-bfbd620e9f2d-rabbitmq-plugins\") pod 
\"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.429671 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/64589d75-4c87-4648-ad24-bfbd620e9f2d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.434491 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/64589d75-4c87-4648-ad24-bfbd620e9f2d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.435860 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a578b144-a50a-4b91-9410-493990a51e5a-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.437859 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a578b144-a50a-4b91-9410-493990a51e5a-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.438517 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a578b144-a50a-4b91-9410-493990a51e5a-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.438628 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/64589d75-4c87-4648-ad24-bfbd620e9f2d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.440705 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/64589d75-4c87-4648-ad24-bfbd620e9f2d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.441334 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a578b144-a50a-4b91-9410-493990a51e5a-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.444401 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/64589d75-4c87-4648-ad24-bfbd620e9f2d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.449942 4909 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.449990 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0ccc338b-a914-4822-8d73-052e6efdf0f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ccc338b-a914-4822-8d73-052e6efdf0f1\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6437e80e58251d3d49db1d0b9bf2c70f2dad0ff12b7b4b9fb88a0164972ee98d/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.450106 4909 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.450163 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5d7c2715-3877-4edd-a9c3-c1c47c45414c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d7c2715-3877-4edd-a9c3-c1c47c45414c\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b52abcf93a6cd0d5cc492e6e261fafb6a914b2b22fe9664718a6f49ac518689b/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.450588 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hcr2\" (UniqueName: \"kubernetes.io/projected/64589d75-4c87-4648-ad24-bfbd620e9f2d-kube-api-access-2hcr2\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.450927 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhj54\" (UniqueName: 
\"kubernetes.io/projected/a578b144-a50a-4b91-9410-493990a51e5a-kube-api-access-bhj54\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.480071 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5d7c2715-3877-4edd-a9c3-c1c47c45414c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5d7c2715-3877-4edd-a9c3-c1c47c45414c\") pod \"rabbitmq-server-0\" (UID: \"64589d75-4c87-4648-ad24-bfbd620e9f2d\") " pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.483149 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0ccc338b-a914-4822-8d73-052e6efdf0f1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0ccc338b-a914-4822-8d73-052e6efdf0f1\") pod \"rabbitmq-cell1-server-0\" (UID: \"a578b144-a50a-4b91-9410-493990a51e5a\") " pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.746530 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 02 11:53:14 crc kubenswrapper[4909]: I0202 11:53:14.755908 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 02 11:53:15 crc kubenswrapper[4909]: I0202 11:53:15.029581 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0848fba1-b8b4-4abd-9d4f-5adc86a46cd5" path="/var/lib/kubelet/pods/0848fba1-b8b4-4abd-9d4f-5adc86a46cd5/volumes" Feb 02 11:53:15 crc kubenswrapper[4909]: I0202 11:53:15.030929 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="624ff2b7-be29-48d9-9463-93df24dda1d7" path="/var/lib/kubelet/pods/624ff2b7-be29-48d9-9463-93df24dda1d7/volumes" Feb 02 11:53:15 crc kubenswrapper[4909]: I0202 11:53:15.230317 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 02 11:53:15 crc kubenswrapper[4909]: I0202 11:53:15.293608 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 02 11:53:15 crc kubenswrapper[4909]: W0202 11:53:15.301080 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda578b144_a50a_4b91_9410_493990a51e5a.slice/crio-588430c49719c5f87a2ae62aaf9ff22f1f4120f131e0e82bfdc2ec2764d03911 WatchSource:0}: Error finding container 588430c49719c5f87a2ae62aaf9ff22f1f4120f131e0e82bfdc2ec2764d03911: Status 404 returned error can't find the container with id 588430c49719c5f87a2ae62aaf9ff22f1f4120f131e0e82bfdc2ec2764d03911 Feb 02 11:53:16 crc kubenswrapper[4909]: I0202 11:53:16.037165 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"64589d75-4c87-4648-ad24-bfbd620e9f2d","Type":"ContainerStarted","Data":"dfc236b045eb596b00024acde76287ef5fe2e8438de1073f0a0b87ba210b69e4"} Feb 02 11:53:16 crc kubenswrapper[4909]: I0202 11:53:16.039580 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"a578b144-a50a-4b91-9410-493990a51e5a","Type":"ContainerStarted","Data":"588430c49719c5f87a2ae62aaf9ff22f1f4120f131e0e82bfdc2ec2764d03911"} Feb 02 11:53:17 crc kubenswrapper[4909]: I0202 11:53:17.047355 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a578b144-a50a-4b91-9410-493990a51e5a","Type":"ContainerStarted","Data":"78ef4200fd86fd774bf292d2eae8596863ccdac7f717f92bb37dad0e3242eb63"} Feb 02 11:53:17 crc kubenswrapper[4909]: I0202 11:53:17.048857 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"64589d75-4c87-4648-ad24-bfbd620e9f2d","Type":"ContainerStarted","Data":"3a13224ba734079693a54baab1f89462dab842fbf79715f1836c992ce5312885"} Feb 02 11:53:17 crc kubenswrapper[4909]: I0202 11:53:17.438316 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-589cf688cc-2cbbb" podUID="3a9a11fb-cc8f-4e38-94e2-71adebf5db3f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.242:5353: i/o timeout" Feb 02 11:53:17 crc kubenswrapper[4909]: I0202 11:53:17.940894 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rsprd" Feb 02 11:53:17 crc kubenswrapper[4909]: I0202 11:53:17.941267 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rsprd" Feb 02 11:53:17 crc kubenswrapper[4909]: I0202 11:53:17.984080 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rsprd" Feb 02 11:53:18 crc kubenswrapper[4909]: I0202 11:53:18.103394 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rsprd" Feb 02 11:53:18 crc kubenswrapper[4909]: I0202 11:53:18.217498 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-rsprd"] Feb 02 11:53:19 crc kubenswrapper[4909]: I0202 11:53:19.511138 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 11:53:19 crc kubenswrapper[4909]: I0202 11:53:19.511501 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 11:53:20 crc kubenswrapper[4909]: I0202 11:53:20.076137 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rsprd" podUID="3121abcc-c55d-493c-afa5-2912418cd7c1" containerName="registry-server" containerID="cri-o://44988263f3cce49dc8bb73c35fb95740260229d1922f7bba2cfe377559fa4abc" gracePeriod=2 Feb 02 11:53:20 crc kubenswrapper[4909]: I0202 11:53:20.438522 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rsprd" Feb 02 11:53:20 crc kubenswrapper[4909]: I0202 11:53:20.506272 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3121abcc-c55d-493c-afa5-2912418cd7c1-catalog-content\") pod \"3121abcc-c55d-493c-afa5-2912418cd7c1\" (UID: \"3121abcc-c55d-493c-afa5-2912418cd7c1\") " Feb 02 11:53:20 crc kubenswrapper[4909]: I0202 11:53:20.506352 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3121abcc-c55d-493c-afa5-2912418cd7c1-utilities\") pod \"3121abcc-c55d-493c-afa5-2912418cd7c1\" (UID: \"3121abcc-c55d-493c-afa5-2912418cd7c1\") " Feb 02 11:53:20 crc kubenswrapper[4909]: I0202 11:53:20.506428 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xt6h\" (UniqueName: \"kubernetes.io/projected/3121abcc-c55d-493c-afa5-2912418cd7c1-kube-api-access-4xt6h\") pod \"3121abcc-c55d-493c-afa5-2912418cd7c1\" (UID: \"3121abcc-c55d-493c-afa5-2912418cd7c1\") " Feb 02 11:53:20 crc kubenswrapper[4909]: I0202 11:53:20.508125 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3121abcc-c55d-493c-afa5-2912418cd7c1-utilities" (OuterVolumeSpecName: "utilities") pod "3121abcc-c55d-493c-afa5-2912418cd7c1" (UID: "3121abcc-c55d-493c-afa5-2912418cd7c1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:53:20 crc kubenswrapper[4909]: I0202 11:53:20.513307 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3121abcc-c55d-493c-afa5-2912418cd7c1-kube-api-access-4xt6h" (OuterVolumeSpecName: "kube-api-access-4xt6h") pod "3121abcc-c55d-493c-afa5-2912418cd7c1" (UID: "3121abcc-c55d-493c-afa5-2912418cd7c1"). InnerVolumeSpecName "kube-api-access-4xt6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:53:20 crc kubenswrapper[4909]: I0202 11:53:20.558933 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3121abcc-c55d-493c-afa5-2912418cd7c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3121abcc-c55d-493c-afa5-2912418cd7c1" (UID: "3121abcc-c55d-493c-afa5-2912418cd7c1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:53:20 crc kubenswrapper[4909]: I0202 11:53:20.608901 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3121abcc-c55d-493c-afa5-2912418cd7c1-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:53:20 crc kubenswrapper[4909]: I0202 11:53:20.608934 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xt6h\" (UniqueName: \"kubernetes.io/projected/3121abcc-c55d-493c-afa5-2912418cd7c1-kube-api-access-4xt6h\") on node \"crc\" DevicePath \"\"" Feb 02 11:53:20 crc kubenswrapper[4909]: I0202 11:53:20.608950 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3121abcc-c55d-493c-afa5-2912418cd7c1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:53:21 crc kubenswrapper[4909]: I0202 11:53:21.084092 4909 generic.go:334] "Generic (PLEG): container finished" podID="3121abcc-c55d-493c-afa5-2912418cd7c1" containerID="44988263f3cce49dc8bb73c35fb95740260229d1922f7bba2cfe377559fa4abc" exitCode=0 Feb 02 11:53:21 crc kubenswrapper[4909]: I0202 11:53:21.084138 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rsprd" event={"ID":"3121abcc-c55d-493c-afa5-2912418cd7c1","Type":"ContainerDied","Data":"44988263f3cce49dc8bb73c35fb95740260229d1922f7bba2cfe377559fa4abc"} Feb 02 11:53:21 crc kubenswrapper[4909]: I0202 11:53:21.084164 4909 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-rsprd" event={"ID":"3121abcc-c55d-493c-afa5-2912418cd7c1","Type":"ContainerDied","Data":"d6e857697adc4f891b5a51d5f1d26916ba2144fc72e06d1a7d8660b203b13084"} Feb 02 11:53:21 crc kubenswrapper[4909]: I0202 11:53:21.084185 4909 scope.go:117] "RemoveContainer" containerID="44988263f3cce49dc8bb73c35fb95740260229d1922f7bba2cfe377559fa4abc" Feb 02 11:53:21 crc kubenswrapper[4909]: I0202 11:53:21.084311 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rsprd" Feb 02 11:53:21 crc kubenswrapper[4909]: I0202 11:53:21.102107 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rsprd"] Feb 02 11:53:21 crc kubenswrapper[4909]: I0202 11:53:21.102189 4909 scope.go:117] "RemoveContainer" containerID="0c321f96456a4470f204fe167c7a4ad5bf39b771a48d4274b44fab15ab2a4a0f" Feb 02 11:53:21 crc kubenswrapper[4909]: I0202 11:53:21.107311 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rsprd"] Feb 02 11:53:21 crc kubenswrapper[4909]: I0202 11:53:21.121341 4909 scope.go:117] "RemoveContainer" containerID="619326215baa645338c6134513aa5a58d34136edc7789165cb8adc5b4cea22df" Feb 02 11:53:21 crc kubenswrapper[4909]: I0202 11:53:21.143528 4909 scope.go:117] "RemoveContainer" containerID="44988263f3cce49dc8bb73c35fb95740260229d1922f7bba2cfe377559fa4abc" Feb 02 11:53:21 crc kubenswrapper[4909]: E0202 11:53:21.144548 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44988263f3cce49dc8bb73c35fb95740260229d1922f7bba2cfe377559fa4abc\": container with ID starting with 44988263f3cce49dc8bb73c35fb95740260229d1922f7bba2cfe377559fa4abc not found: ID does not exist" containerID="44988263f3cce49dc8bb73c35fb95740260229d1922f7bba2cfe377559fa4abc" Feb 02 11:53:21 crc kubenswrapper[4909]: I0202 
11:53:21.144593 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44988263f3cce49dc8bb73c35fb95740260229d1922f7bba2cfe377559fa4abc"} err="failed to get container status \"44988263f3cce49dc8bb73c35fb95740260229d1922f7bba2cfe377559fa4abc\": rpc error: code = NotFound desc = could not find container \"44988263f3cce49dc8bb73c35fb95740260229d1922f7bba2cfe377559fa4abc\": container with ID starting with 44988263f3cce49dc8bb73c35fb95740260229d1922f7bba2cfe377559fa4abc not found: ID does not exist" Feb 02 11:53:21 crc kubenswrapper[4909]: I0202 11:53:21.144621 4909 scope.go:117] "RemoveContainer" containerID="0c321f96456a4470f204fe167c7a4ad5bf39b771a48d4274b44fab15ab2a4a0f" Feb 02 11:53:21 crc kubenswrapper[4909]: E0202 11:53:21.144902 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c321f96456a4470f204fe167c7a4ad5bf39b771a48d4274b44fab15ab2a4a0f\": container with ID starting with 0c321f96456a4470f204fe167c7a4ad5bf39b771a48d4274b44fab15ab2a4a0f not found: ID does not exist" containerID="0c321f96456a4470f204fe167c7a4ad5bf39b771a48d4274b44fab15ab2a4a0f" Feb 02 11:53:21 crc kubenswrapper[4909]: I0202 11:53:21.144922 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c321f96456a4470f204fe167c7a4ad5bf39b771a48d4274b44fab15ab2a4a0f"} err="failed to get container status \"0c321f96456a4470f204fe167c7a4ad5bf39b771a48d4274b44fab15ab2a4a0f\": rpc error: code = NotFound desc = could not find container \"0c321f96456a4470f204fe167c7a4ad5bf39b771a48d4274b44fab15ab2a4a0f\": container with ID starting with 0c321f96456a4470f204fe167c7a4ad5bf39b771a48d4274b44fab15ab2a4a0f not found: ID does not exist" Feb 02 11:53:21 crc kubenswrapper[4909]: I0202 11:53:21.144937 4909 scope.go:117] "RemoveContainer" containerID="619326215baa645338c6134513aa5a58d34136edc7789165cb8adc5b4cea22df" Feb 02 11:53:21 crc 
kubenswrapper[4909]: E0202 11:53:21.145183 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"619326215baa645338c6134513aa5a58d34136edc7789165cb8adc5b4cea22df\": container with ID starting with 619326215baa645338c6134513aa5a58d34136edc7789165cb8adc5b4cea22df not found: ID does not exist" containerID="619326215baa645338c6134513aa5a58d34136edc7789165cb8adc5b4cea22df" Feb 02 11:53:21 crc kubenswrapper[4909]: I0202 11:53:21.145210 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"619326215baa645338c6134513aa5a58d34136edc7789165cb8adc5b4cea22df"} err="failed to get container status \"619326215baa645338c6134513aa5a58d34136edc7789165cb8adc5b4cea22df\": rpc error: code = NotFound desc = could not find container \"619326215baa645338c6134513aa5a58d34136edc7789165cb8adc5b4cea22df\": container with ID starting with 619326215baa645338c6134513aa5a58d34136edc7789165cb8adc5b4cea22df not found: ID does not exist" Feb 02 11:53:23 crc kubenswrapper[4909]: I0202 11:53:23.024153 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3121abcc-c55d-493c-afa5-2912418cd7c1" path="/var/lib/kubelet/pods/3121abcc-c55d-493c-afa5-2912418cd7c1/volumes" Feb 02 11:53:43 crc kubenswrapper[4909]: I0202 11:53:43.044498 4909 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod3a9a11fb-cc8f-4e38-94e2-71adebf5db3f"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod3a9a11fb-cc8f-4e38-94e2-71adebf5db3f] : Timed out while waiting for systemd to remove kubepods-besteffort-pod3a9a11fb_cc8f_4e38_94e2_71adebf5db3f.slice" Feb 02 11:53:43 crc kubenswrapper[4909]: E0202 11:53:43.045021 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod3a9a11fb-cc8f-4e38-94e2-71adebf5db3f] : unable to destroy cgroup paths for cgroup 
[kubepods besteffort pod3a9a11fb-cc8f-4e38-94e2-71adebf5db3f] : Timed out while waiting for systemd to remove kubepods-besteffort-pod3a9a11fb_cc8f_4e38_94e2_71adebf5db3f.slice" pod="openstack/dnsmasq-dns-589cf688cc-2cbbb" podUID="3a9a11fb-cc8f-4e38-94e2-71adebf5db3f" Feb 02 11:53:43 crc kubenswrapper[4909]: I0202 11:53:43.221726 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589cf688cc-2cbbb" Feb 02 11:53:43 crc kubenswrapper[4909]: I0202 11:53:43.255560 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589cf688cc-2cbbb"] Feb 02 11:53:43 crc kubenswrapper[4909]: I0202 11:53:43.261461 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-589cf688cc-2cbbb"] Feb 02 11:53:45 crc kubenswrapper[4909]: I0202 11:53:45.026083 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a9a11fb-cc8f-4e38-94e2-71adebf5db3f" path="/var/lib/kubelet/pods/3a9a11fb-cc8f-4e38-94e2-71adebf5db3f/volumes" Feb 02 11:53:48 crc kubenswrapper[4909]: I0202 11:53:48.267315 4909 generic.go:334] "Generic (PLEG): container finished" podID="64589d75-4c87-4648-ad24-bfbd620e9f2d" containerID="3a13224ba734079693a54baab1f89462dab842fbf79715f1836c992ce5312885" exitCode=0 Feb 02 11:53:48 crc kubenswrapper[4909]: I0202 11:53:48.267422 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"64589d75-4c87-4648-ad24-bfbd620e9f2d","Type":"ContainerDied","Data":"3a13224ba734079693a54baab1f89462dab842fbf79715f1836c992ce5312885"} Feb 02 11:53:48 crc kubenswrapper[4909]: I0202 11:53:48.271098 4909 generic.go:334] "Generic (PLEG): container finished" podID="a578b144-a50a-4b91-9410-493990a51e5a" containerID="78ef4200fd86fd774bf292d2eae8596863ccdac7f717f92bb37dad0e3242eb63" exitCode=0 Feb 02 11:53:48 crc kubenswrapper[4909]: I0202 11:53:48.271152 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a578b144-a50a-4b91-9410-493990a51e5a","Type":"ContainerDied","Data":"78ef4200fd86fd774bf292d2eae8596863ccdac7f717f92bb37dad0e3242eb63"}
Feb 02 11:53:49 crc kubenswrapper[4909]: I0202 11:53:49.280983 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a578b144-a50a-4b91-9410-493990a51e5a","Type":"ContainerStarted","Data":"10b7d30a96461c3b6ab78eed9c0caca291861debd7e4843d76dcc5b860e15493"}
Feb 02 11:53:49 crc kubenswrapper[4909]: I0202 11:53:49.281758 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Feb 02 11:53:49 crc kubenswrapper[4909]: I0202 11:53:49.283365 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"64589d75-4c87-4648-ad24-bfbd620e9f2d","Type":"ContainerStarted","Data":"22a52c05f2a896e50a4162df5fd0fa16c294c4356344eda4ea2ab36ee904df4f"}
Feb 02 11:53:49 crc kubenswrapper[4909]: I0202 11:53:49.283554 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Feb 02 11:53:49 crc kubenswrapper[4909]: I0202 11:53:49.307356 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=35.307333728 podStartE2EDuration="35.307333728s" podCreationTimestamp="2026-02-02 11:53:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:53:49.305078054 +0000 UTC m=+4955.051178799" watchObservedRunningTime="2026-02-02 11:53:49.307333728 +0000 UTC m=+4955.053434463"
Feb 02 11:53:49 crc kubenswrapper[4909]: I0202 11:53:49.336388 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=35.336365352 podStartE2EDuration="35.336365352s" podCreationTimestamp="2026-02-02 11:53:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:53:49.327966414 +0000 UTC m=+4955.074067159" watchObservedRunningTime="2026-02-02 11:53:49.336365352 +0000 UTC m=+4955.082466087"
Feb 02 11:53:49 crc kubenswrapper[4909]: I0202 11:53:49.511600 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 11:53:49 crc kubenswrapper[4909]: I0202 11:53:49.511669 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 11:54:04 crc kubenswrapper[4909]: I0202 11:54:04.748982 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Feb 02 11:54:04 crc kubenswrapper[4909]: I0202 11:54:04.757948 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Feb 02 11:54:08 crc kubenswrapper[4909]: I0202 11:54:08.253147 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"]
Feb 02 11:54:08 crc kubenswrapper[4909]: E0202 11:54:08.253751 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3121abcc-c55d-493c-afa5-2912418cd7c1" containerName="extract-content"
Feb 02 11:54:08 crc kubenswrapper[4909]: I0202 11:54:08.253765 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3121abcc-c55d-493c-afa5-2912418cd7c1" containerName="extract-content"
Feb 02 11:54:08 crc kubenswrapper[4909]: E0202 11:54:08.253778 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3121abcc-c55d-493c-afa5-2912418cd7c1" containerName="registry-server"
Feb 02 11:54:08 crc kubenswrapper[4909]: I0202 11:54:08.253784 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3121abcc-c55d-493c-afa5-2912418cd7c1" containerName="registry-server"
Feb 02 11:54:08 crc kubenswrapper[4909]: E0202 11:54:08.253800 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3121abcc-c55d-493c-afa5-2912418cd7c1" containerName="extract-utilities"
Feb 02 11:54:08 crc kubenswrapper[4909]: I0202 11:54:08.253828 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3121abcc-c55d-493c-afa5-2912418cd7c1" containerName="extract-utilities"
Feb 02 11:54:08 crc kubenswrapper[4909]: I0202 11:54:08.253958 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="3121abcc-c55d-493c-afa5-2912418cd7c1" containerName="registry-server"
Feb 02 11:54:08 crc kubenswrapper[4909]: I0202 11:54:08.254463 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Feb 02 11:54:08 crc kubenswrapper[4909]: I0202 11:54:08.257601 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-dc4ld"
Feb 02 11:54:08 crc kubenswrapper[4909]: I0202 11:54:08.268137 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Feb 02 11:54:08 crc kubenswrapper[4909]: I0202 11:54:08.339066 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8stcl\" (UniqueName: \"kubernetes.io/projected/3ee8edb4-c8f4-442d-8c89-81748fe1f25e-kube-api-access-8stcl\") pod \"mariadb-client\" (UID: \"3ee8edb4-c8f4-442d-8c89-81748fe1f25e\") " pod="openstack/mariadb-client"
Feb 02 11:54:08 crc kubenswrapper[4909]: I0202 11:54:08.442084 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8stcl\" (UniqueName: \"kubernetes.io/projected/3ee8edb4-c8f4-442d-8c89-81748fe1f25e-kube-api-access-8stcl\") pod \"mariadb-client\" (UID: \"3ee8edb4-c8f4-442d-8c89-81748fe1f25e\") " pod="openstack/mariadb-client"
Feb 02 11:54:08 crc kubenswrapper[4909]: I0202 11:54:08.463360 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8stcl\" (UniqueName: \"kubernetes.io/projected/3ee8edb4-c8f4-442d-8c89-81748fe1f25e-kube-api-access-8stcl\") pod \"mariadb-client\" (UID: \"3ee8edb4-c8f4-442d-8c89-81748fe1f25e\") " pod="openstack/mariadb-client"
Feb 02 11:54:08 crc kubenswrapper[4909]: I0202 11:54:08.585394 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Feb 02 11:54:09 crc kubenswrapper[4909]: I0202 11:54:09.078961 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"]
Feb 02 11:54:09 crc kubenswrapper[4909]: W0202 11:54:09.084351 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ee8edb4_c8f4_442d_8c89_81748fe1f25e.slice/crio-63cd894a0b7bb7e23f613d11bb33f3e03d094a23247599eccb997a32a88ef9d5 WatchSource:0}: Error finding container 63cd894a0b7bb7e23f613d11bb33f3e03d094a23247599eccb997a32a88ef9d5: Status 404 returned error can't find the container with id 63cd894a0b7bb7e23f613d11bb33f3e03d094a23247599eccb997a32a88ef9d5
Feb 02 11:54:09 crc kubenswrapper[4909]: I0202 11:54:09.417799 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"3ee8edb4-c8f4-442d-8c89-81748fe1f25e","Type":"ContainerStarted","Data":"63cd894a0b7bb7e23f613d11bb33f3e03d094a23247599eccb997a32a88ef9d5"}
Feb 02 11:54:10 crc kubenswrapper[4909]: I0202 11:54:10.427116 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"3ee8edb4-c8f4-442d-8c89-81748fe1f25e","Type":"ContainerStarted","Data":"0b56a7af4d524d0a218f2cc995cc57d5a4f8075e3ce785850d5aa3eed270cfad"}
Feb 02 11:54:10 crc kubenswrapper[4909]: I0202 11:54:10.444571 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client" podStartSLOduration=1.786811218 podStartE2EDuration="2.444544807s" podCreationTimestamp="2026-02-02 11:54:08 +0000 UTC" firstStartedPulling="2026-02-02 11:54:09.087326793 +0000 UTC m=+4974.833427528" lastFinishedPulling="2026-02-02 11:54:09.745060382 +0000 UTC m=+4975.491161117" observedRunningTime="2026-02-02 11:54:10.443787925 +0000 UTC m=+4976.189888660" watchObservedRunningTime="2026-02-02 11:54:10.444544807 +0000 UTC m=+4976.190645542"
Feb 02 11:54:14 crc kubenswrapper[4909]: I0202 11:54:14.015356 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4cvbg"]
Feb 02 11:54:14 crc kubenswrapper[4909]: I0202 11:54:14.018010 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4cvbg"
Feb 02 11:54:14 crc kubenswrapper[4909]: I0202 11:54:14.029764 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4cvbg"]
Feb 02 11:54:14 crc kubenswrapper[4909]: I0202 11:54:14.127927 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7pvn\" (UniqueName: \"kubernetes.io/projected/d8f3129e-a33d-4814-8810-6820475178c5-kube-api-access-g7pvn\") pod \"redhat-marketplace-4cvbg\" (UID: \"d8f3129e-a33d-4814-8810-6820475178c5\") " pod="openshift-marketplace/redhat-marketplace-4cvbg"
Feb 02 11:54:14 crc kubenswrapper[4909]: I0202 11:54:14.128021 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8f3129e-a33d-4814-8810-6820475178c5-utilities\") pod \"redhat-marketplace-4cvbg\" (UID: \"d8f3129e-a33d-4814-8810-6820475178c5\") " pod="openshift-marketplace/redhat-marketplace-4cvbg"
Feb 02 11:54:14 crc kubenswrapper[4909]: I0202 11:54:14.128121 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8f3129e-a33d-4814-8810-6820475178c5-catalog-content\") pod \"redhat-marketplace-4cvbg\" (UID: \"d8f3129e-a33d-4814-8810-6820475178c5\") " pod="openshift-marketplace/redhat-marketplace-4cvbg"
Feb 02 11:54:14 crc kubenswrapper[4909]: I0202 11:54:14.229583 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8f3129e-a33d-4814-8810-6820475178c5-utilities\") pod \"redhat-marketplace-4cvbg\" (UID: \"d8f3129e-a33d-4814-8810-6820475178c5\") " pod="openshift-marketplace/redhat-marketplace-4cvbg"
Feb 02 11:54:14 crc kubenswrapper[4909]: I0202 11:54:14.229992 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8f3129e-a33d-4814-8810-6820475178c5-catalog-content\") pod \"redhat-marketplace-4cvbg\" (UID: \"d8f3129e-a33d-4814-8810-6820475178c5\") " pod="openshift-marketplace/redhat-marketplace-4cvbg"
Feb 02 11:54:14 crc kubenswrapper[4909]: I0202 11:54:14.230127 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7pvn\" (UniqueName: \"kubernetes.io/projected/d8f3129e-a33d-4814-8810-6820475178c5-kube-api-access-g7pvn\") pod \"redhat-marketplace-4cvbg\" (UID: \"d8f3129e-a33d-4814-8810-6820475178c5\") " pod="openshift-marketplace/redhat-marketplace-4cvbg"
Feb 02 11:54:14 crc kubenswrapper[4909]: I0202 11:54:14.230221 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8f3129e-a33d-4814-8810-6820475178c5-utilities\") pod \"redhat-marketplace-4cvbg\" (UID: \"d8f3129e-a33d-4814-8810-6820475178c5\") " pod="openshift-marketplace/redhat-marketplace-4cvbg"
Feb 02 11:54:14 crc kubenswrapper[4909]: I0202 11:54:14.230476 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8f3129e-a33d-4814-8810-6820475178c5-catalog-content\") pod \"redhat-marketplace-4cvbg\" (UID: \"d8f3129e-a33d-4814-8810-6820475178c5\") " pod="openshift-marketplace/redhat-marketplace-4cvbg"
Feb 02 11:54:14 crc kubenswrapper[4909]: I0202 11:54:14.254387 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7pvn\" (UniqueName: \"kubernetes.io/projected/d8f3129e-a33d-4814-8810-6820475178c5-kube-api-access-g7pvn\") pod \"redhat-marketplace-4cvbg\" (UID: \"d8f3129e-a33d-4814-8810-6820475178c5\") " pod="openshift-marketplace/redhat-marketplace-4cvbg"
Feb 02 11:54:14 crc kubenswrapper[4909]: I0202 11:54:14.337220 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4cvbg"
Feb 02 11:54:14 crc kubenswrapper[4909]: I0202 11:54:14.794071 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4cvbg"]
Feb 02 11:54:15 crc kubenswrapper[4909]: I0202 11:54:15.459906 4909 generic.go:334] "Generic (PLEG): container finished" podID="d8f3129e-a33d-4814-8810-6820475178c5" containerID="d4f2fef771147a39ab2e4d94d07d4bb70860e9f6e23fac03540f59144d051eb7" exitCode=0
Feb 02 11:54:15 crc kubenswrapper[4909]: I0202 11:54:15.459957 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4cvbg" event={"ID":"d8f3129e-a33d-4814-8810-6820475178c5","Type":"ContainerDied","Data":"d4f2fef771147a39ab2e4d94d07d4bb70860e9f6e23fac03540f59144d051eb7"}
Feb 02 11:54:15 crc kubenswrapper[4909]: I0202 11:54:15.459986 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4cvbg" event={"ID":"d8f3129e-a33d-4814-8810-6820475178c5","Type":"ContainerStarted","Data":"ecbd4dcc47d0d05f40676bc93847c04ea7c6690089e16d93c90fe5a148101e9e"}
Feb 02 11:54:16 crc kubenswrapper[4909]: I0202 11:54:16.473438 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4cvbg" event={"ID":"d8f3129e-a33d-4814-8810-6820475178c5","Type":"ContainerStarted","Data":"2b3ef707dc018ddf215ca57db5b0bef06a170a6bd3de78a448cf435a7c031d59"}
Feb 02 11:54:17 crc kubenswrapper[4909]: I0202 11:54:17.481088 4909 generic.go:334] "Generic (PLEG): container finished" podID="d8f3129e-a33d-4814-8810-6820475178c5" containerID="2b3ef707dc018ddf215ca57db5b0bef06a170a6bd3de78a448cf435a7c031d59" exitCode=0
Feb 02 11:54:17 crc kubenswrapper[4909]: I0202 11:54:17.481127 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4cvbg" event={"ID":"d8f3129e-a33d-4814-8810-6820475178c5","Type":"ContainerDied","Data":"2b3ef707dc018ddf215ca57db5b0bef06a170a6bd3de78a448cf435a7c031d59"}
Feb 02 11:54:18 crc kubenswrapper[4909]: I0202 11:54:18.491201 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4cvbg" event={"ID":"d8f3129e-a33d-4814-8810-6820475178c5","Type":"ContainerStarted","Data":"4abbdc61e696994c3d4b810769265af8fc2778a4c49d65857cb483945481c615"}
Feb 02 11:54:18 crc kubenswrapper[4909]: I0202 11:54:18.512989 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4cvbg" podStartSLOduration=3.040883935 podStartE2EDuration="5.512974367s" podCreationTimestamp="2026-02-02 11:54:13 +0000 UTC" firstStartedPulling="2026-02-02 11:54:15.461868338 +0000 UTC m=+4981.207969073" lastFinishedPulling="2026-02-02 11:54:17.93395877 +0000 UTC m=+4983.680059505" observedRunningTime="2026-02-02 11:54:18.508567152 +0000 UTC m=+4984.254667887" watchObservedRunningTime="2026-02-02 11:54:18.512974367 +0000 UTC m=+4984.259075102"
Feb 02 11:54:19 crc kubenswrapper[4909]: E0202 11:54:19.093854 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8f3129e_a33d_4814_8810_6820475178c5.slice/crio-d4f2fef771147a39ab2e4d94d07d4bb70860e9f6e23fac03540f59144d051eb7.scope\": RecentStats: unable to find data in memory cache]"
Feb 02 11:54:19 crc kubenswrapper[4909]: I0202 11:54:19.511504 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 11:54:19 crc kubenswrapper[4909]: I0202 11:54:19.511568 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 11:54:19 crc kubenswrapper[4909]: I0202 11:54:19.511618 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z"
Feb 02 11:54:19 crc kubenswrapper[4909]: I0202 11:54:19.512542 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 02 11:54:19 crc kubenswrapper[4909]: I0202 11:54:19.512608 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83" gracePeriod=600
Feb 02 11:54:19 crc kubenswrapper[4909]: E0202 11:54:19.639474 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 11:54:20 crc kubenswrapper[4909]: I0202 11:54:20.518299 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83" exitCode=0
Feb 02 11:54:20 crc kubenswrapper[4909]: I0202 11:54:20.518345 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83"}
Feb 02 11:54:20 crc kubenswrapper[4909]: I0202 11:54:20.518701 4909 scope.go:117] "RemoveContainer" containerID="38ee1278ab1a7ba78b053e71d2e465c14550aab95c120841b2bbca3648b7d0d2"
Feb 02 11:54:20 crc kubenswrapper[4909]: I0202 11:54:20.519356 4909 scope.go:117] "RemoveContainer" containerID="38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83"
Feb 02 11:54:20 crc kubenswrapper[4909]: E0202 11:54:20.519637 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 11:54:22 crc kubenswrapper[4909]: I0202 11:54:22.769991 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Feb 02 11:54:22 crc kubenswrapper[4909]: I0202 11:54:22.770487 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-client" podUID="3ee8edb4-c8f4-442d-8c89-81748fe1f25e" containerName="mariadb-client" containerID="cri-o://0b56a7af4d524d0a218f2cc995cc57d5a4f8075e3ce785850d5aa3eed270cfad" gracePeriod=30
Feb 02 11:54:23 crc kubenswrapper[4909]: I0202 11:54:23.251429 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Feb 02 11:54:23 crc kubenswrapper[4909]: I0202 11:54:23.393314 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8stcl\" (UniqueName: \"kubernetes.io/projected/3ee8edb4-c8f4-442d-8c89-81748fe1f25e-kube-api-access-8stcl\") pod \"3ee8edb4-c8f4-442d-8c89-81748fe1f25e\" (UID: \"3ee8edb4-c8f4-442d-8c89-81748fe1f25e\") "
Feb 02 11:54:23 crc kubenswrapper[4909]: I0202 11:54:23.400114 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ee8edb4-c8f4-442d-8c89-81748fe1f25e-kube-api-access-8stcl" (OuterVolumeSpecName: "kube-api-access-8stcl") pod "3ee8edb4-c8f4-442d-8c89-81748fe1f25e" (UID: "3ee8edb4-c8f4-442d-8c89-81748fe1f25e"). InnerVolumeSpecName "kube-api-access-8stcl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:54:23 crc kubenswrapper[4909]: I0202 11:54:23.495652 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8stcl\" (UniqueName: \"kubernetes.io/projected/3ee8edb4-c8f4-442d-8c89-81748fe1f25e-kube-api-access-8stcl\") on node \"crc\" DevicePath \"\""
Feb 02 11:54:23 crc kubenswrapper[4909]: I0202 11:54:23.542492 4909 generic.go:334] "Generic (PLEG): container finished" podID="3ee8edb4-c8f4-442d-8c89-81748fe1f25e" containerID="0b56a7af4d524d0a218f2cc995cc57d5a4f8075e3ce785850d5aa3eed270cfad" exitCode=143
Feb 02 11:54:23 crc kubenswrapper[4909]: I0202 11:54:23.542543 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"3ee8edb4-c8f4-442d-8c89-81748fe1f25e","Type":"ContainerDied","Data":"0b56a7af4d524d0a218f2cc995cc57d5a4f8075e3ce785850d5aa3eed270cfad"}
Feb 02 11:54:23 crc kubenswrapper[4909]: I0202 11:54:23.542573 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"3ee8edb4-c8f4-442d-8c89-81748fe1f25e","Type":"ContainerDied","Data":"63cd894a0b7bb7e23f613d11bb33f3e03d094a23247599eccb997a32a88ef9d5"}
Feb 02 11:54:23 crc kubenswrapper[4909]: I0202 11:54:23.542592 4909 scope.go:117] "RemoveContainer" containerID="0b56a7af4d524d0a218f2cc995cc57d5a4f8075e3ce785850d5aa3eed270cfad"
Feb 02 11:54:23 crc kubenswrapper[4909]: I0202 11:54:23.542700 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client"
Feb 02 11:54:23 crc kubenswrapper[4909]: I0202 11:54:23.562353 4909 scope.go:117] "RemoveContainer" containerID="0b56a7af4d524d0a218f2cc995cc57d5a4f8075e3ce785850d5aa3eed270cfad"
Feb 02 11:54:23 crc kubenswrapper[4909]: E0202 11:54:23.562975 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b56a7af4d524d0a218f2cc995cc57d5a4f8075e3ce785850d5aa3eed270cfad\": container with ID starting with 0b56a7af4d524d0a218f2cc995cc57d5a4f8075e3ce785850d5aa3eed270cfad not found: ID does not exist" containerID="0b56a7af4d524d0a218f2cc995cc57d5a4f8075e3ce785850d5aa3eed270cfad"
Feb 02 11:54:23 crc kubenswrapper[4909]: I0202 11:54:23.563021 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b56a7af4d524d0a218f2cc995cc57d5a4f8075e3ce785850d5aa3eed270cfad"} err="failed to get container status \"0b56a7af4d524d0a218f2cc995cc57d5a4f8075e3ce785850d5aa3eed270cfad\": rpc error: code = NotFound desc = could not find container \"0b56a7af4d524d0a218f2cc995cc57d5a4f8075e3ce785850d5aa3eed270cfad\": container with ID starting with 0b56a7af4d524d0a218f2cc995cc57d5a4f8075e3ce785850d5aa3eed270cfad not found: ID does not exist"
Feb 02 11:54:23 crc kubenswrapper[4909]: I0202 11:54:23.569604 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"]
Feb 02 11:54:23 crc kubenswrapper[4909]: I0202 11:54:23.576387 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"]
Feb 02 11:54:24 crc kubenswrapper[4909]: I0202 11:54:24.337936 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4cvbg"
Feb 02 11:54:24 crc kubenswrapper[4909]: I0202 11:54:24.338248 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4cvbg"
Feb 02 11:54:24 crc kubenswrapper[4909]: I0202 11:54:24.382911 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4cvbg"
Feb 02 11:54:24 crc kubenswrapper[4909]: I0202 11:54:24.594415 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4cvbg"
Feb 02 11:54:24 crc kubenswrapper[4909]: I0202 11:54:24.642564 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4cvbg"]
Feb 02 11:54:25 crc kubenswrapper[4909]: I0202 11:54:25.027172 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ee8edb4-c8f4-442d-8c89-81748fe1f25e" path="/var/lib/kubelet/pods/3ee8edb4-c8f4-442d-8c89-81748fe1f25e/volumes"
Feb 02 11:54:26 crc kubenswrapper[4909]: I0202 11:54:26.566590 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4cvbg" podUID="d8f3129e-a33d-4814-8810-6820475178c5" containerName="registry-server" containerID="cri-o://4abbdc61e696994c3d4b810769265af8fc2778a4c49d65857cb483945481c615" gracePeriod=2
Feb 02 11:54:26 crc kubenswrapper[4909]: I0202 11:54:26.979170 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4cvbg"
Feb 02 11:54:27 crc kubenswrapper[4909]: I0202 11:54:27.147694 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8f3129e-a33d-4814-8810-6820475178c5-utilities\") pod \"d8f3129e-a33d-4814-8810-6820475178c5\" (UID: \"d8f3129e-a33d-4814-8810-6820475178c5\") "
Feb 02 11:54:27 crc kubenswrapper[4909]: I0202 11:54:27.147848 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8f3129e-a33d-4814-8810-6820475178c5-catalog-content\") pod \"d8f3129e-a33d-4814-8810-6820475178c5\" (UID: \"d8f3129e-a33d-4814-8810-6820475178c5\") "
Feb 02 11:54:27 crc kubenswrapper[4909]: I0202 11:54:27.148042 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7pvn\" (UniqueName: \"kubernetes.io/projected/d8f3129e-a33d-4814-8810-6820475178c5-kube-api-access-g7pvn\") pod \"d8f3129e-a33d-4814-8810-6820475178c5\" (UID: \"d8f3129e-a33d-4814-8810-6820475178c5\") "
Feb 02 11:54:27 crc kubenswrapper[4909]: I0202 11:54:27.150489 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8f3129e-a33d-4814-8810-6820475178c5-utilities" (OuterVolumeSpecName: "utilities") pod "d8f3129e-a33d-4814-8810-6820475178c5" (UID: "d8f3129e-a33d-4814-8810-6820475178c5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:54:27 crc kubenswrapper[4909]: I0202 11:54:27.154400 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8f3129e-a33d-4814-8810-6820475178c5-kube-api-access-g7pvn" (OuterVolumeSpecName: "kube-api-access-g7pvn") pod "d8f3129e-a33d-4814-8810-6820475178c5" (UID: "d8f3129e-a33d-4814-8810-6820475178c5"). InnerVolumeSpecName "kube-api-access-g7pvn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 11:54:27 crc kubenswrapper[4909]: I0202 11:54:27.172668 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8f3129e-a33d-4814-8810-6820475178c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d8f3129e-a33d-4814-8810-6820475178c5" (UID: "d8f3129e-a33d-4814-8810-6820475178c5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 11:54:27 crc kubenswrapper[4909]: I0202 11:54:27.251853 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7pvn\" (UniqueName: \"kubernetes.io/projected/d8f3129e-a33d-4814-8810-6820475178c5-kube-api-access-g7pvn\") on node \"crc\" DevicePath \"\""
Feb 02 11:54:27 crc kubenswrapper[4909]: I0202 11:54:27.251890 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8f3129e-a33d-4814-8810-6820475178c5-utilities\") on node \"crc\" DevicePath \"\""
Feb 02 11:54:27 crc kubenswrapper[4909]: I0202 11:54:27.251900 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8f3129e-a33d-4814-8810-6820475178c5-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 02 11:54:27 crc kubenswrapper[4909]: I0202 11:54:27.575146 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4cvbg" event={"ID":"d8f3129e-a33d-4814-8810-6820475178c5","Type":"ContainerDied","Data":"4abbdc61e696994c3d4b810769265af8fc2778a4c49d65857cb483945481c615"}
Feb 02 11:54:27 crc kubenswrapper[4909]: I0202 11:54:27.575206 4909 scope.go:117] "RemoveContainer" containerID="4abbdc61e696994c3d4b810769265af8fc2778a4c49d65857cb483945481c615"
Feb 02 11:54:27 crc kubenswrapper[4909]: I0202 11:54:27.575151 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4cvbg"
Feb 02 11:54:27 crc kubenswrapper[4909]: I0202 11:54:27.575927 4909 generic.go:334] "Generic (PLEG): container finished" podID="d8f3129e-a33d-4814-8810-6820475178c5" containerID="4abbdc61e696994c3d4b810769265af8fc2778a4c49d65857cb483945481c615" exitCode=0
Feb 02 11:54:27 crc kubenswrapper[4909]: I0202 11:54:27.575963 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4cvbg" event={"ID":"d8f3129e-a33d-4814-8810-6820475178c5","Type":"ContainerDied","Data":"ecbd4dcc47d0d05f40676bc93847c04ea7c6690089e16d93c90fe5a148101e9e"}
Feb 02 11:54:27 crc kubenswrapper[4909]: I0202 11:54:27.594485 4909 scope.go:117] "RemoveContainer" containerID="2b3ef707dc018ddf215ca57db5b0bef06a170a6bd3de78a448cf435a7c031d59"
Feb 02 11:54:27 crc kubenswrapper[4909]: I0202 11:54:27.611601 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4cvbg"]
Feb 02 11:54:27 crc kubenswrapper[4909]: I0202 11:54:27.617745 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4cvbg"]
Feb 02 11:54:27 crc kubenswrapper[4909]: I0202 11:54:27.634941 4909 scope.go:117] "RemoveContainer" containerID="d4f2fef771147a39ab2e4d94d07d4bb70860e9f6e23fac03540f59144d051eb7"
Feb 02 11:54:27 crc kubenswrapper[4909]: I0202 11:54:27.652997 4909 scope.go:117] "RemoveContainer" containerID="4abbdc61e696994c3d4b810769265af8fc2778a4c49d65857cb483945481c615"
Feb 02 11:54:27 crc kubenswrapper[4909]: E0202 11:54:27.653505 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4abbdc61e696994c3d4b810769265af8fc2778a4c49d65857cb483945481c615\": container with ID starting with 4abbdc61e696994c3d4b810769265af8fc2778a4c49d65857cb483945481c615 not found: ID does not exist" containerID="4abbdc61e696994c3d4b810769265af8fc2778a4c49d65857cb483945481c615"
Feb 02 11:54:27 crc kubenswrapper[4909]: I0202 11:54:27.653544 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4abbdc61e696994c3d4b810769265af8fc2778a4c49d65857cb483945481c615"} err="failed to get container status \"4abbdc61e696994c3d4b810769265af8fc2778a4c49d65857cb483945481c615\": rpc error: code = NotFound desc = could not find container \"4abbdc61e696994c3d4b810769265af8fc2778a4c49d65857cb483945481c615\": container with ID starting with 4abbdc61e696994c3d4b810769265af8fc2778a4c49d65857cb483945481c615 not found: ID does not exist"
Feb 02 11:54:27 crc kubenswrapper[4909]: I0202 11:54:27.653567 4909 scope.go:117] "RemoveContainer" containerID="2b3ef707dc018ddf215ca57db5b0bef06a170a6bd3de78a448cf435a7c031d59"
Feb 02 11:54:27 crc kubenswrapper[4909]: E0202 11:54:27.654023 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b3ef707dc018ddf215ca57db5b0bef06a170a6bd3de78a448cf435a7c031d59\": container with ID starting with 2b3ef707dc018ddf215ca57db5b0bef06a170a6bd3de78a448cf435a7c031d59 not found: ID does not exist" containerID="2b3ef707dc018ddf215ca57db5b0bef06a170a6bd3de78a448cf435a7c031d59"
Feb 02 11:54:27 crc kubenswrapper[4909]: I0202 11:54:27.654069 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b3ef707dc018ddf215ca57db5b0bef06a170a6bd3de78a448cf435a7c031d59"} err="failed to get container status \"2b3ef707dc018ddf215ca57db5b0bef06a170a6bd3de78a448cf435a7c031d59\": rpc error: code = NotFound desc = could not find container \"2b3ef707dc018ddf215ca57db5b0bef06a170a6bd3de78a448cf435a7c031d59\": container with ID starting with 2b3ef707dc018ddf215ca57db5b0bef06a170a6bd3de78a448cf435a7c031d59 not found: ID does not exist"
Feb 02 11:54:27 crc kubenswrapper[4909]: I0202 11:54:27.654104 4909 scope.go:117] "RemoveContainer" containerID="d4f2fef771147a39ab2e4d94d07d4bb70860e9f6e23fac03540f59144d051eb7"
Feb 02 11:54:27 crc kubenswrapper[4909]: E0202 11:54:27.654442 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4f2fef771147a39ab2e4d94d07d4bb70860e9f6e23fac03540f59144d051eb7\": container with ID starting with d4f2fef771147a39ab2e4d94d07d4bb70860e9f6e23fac03540f59144d051eb7 not found: ID does not exist" containerID="d4f2fef771147a39ab2e4d94d07d4bb70860e9f6e23fac03540f59144d051eb7"
Feb 02 11:54:27 crc kubenswrapper[4909]: I0202 11:54:27.654487 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4f2fef771147a39ab2e4d94d07d4bb70860e9f6e23fac03540f59144d051eb7"} err="failed to get container status \"d4f2fef771147a39ab2e4d94d07d4bb70860e9f6e23fac03540f59144d051eb7\": rpc error: code = NotFound desc = could not find container \"d4f2fef771147a39ab2e4d94d07d4bb70860e9f6e23fac03540f59144d051eb7\": container with ID starting with d4f2fef771147a39ab2e4d94d07d4bb70860e9f6e23fac03540f59144d051eb7 not found: ID does not exist"
Feb 02 11:54:29 crc kubenswrapper[4909]: I0202 11:54:29.026202 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8f3129e-a33d-4814-8810-6820475178c5" path="/var/lib/kubelet/pods/d8f3129e-a33d-4814-8810-6820475178c5/volumes"
Feb 02 11:54:29 crc kubenswrapper[4909]: E0202 11:54:29.273082 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8f3129e_a33d_4814_8810_6820475178c5.slice/crio-d4f2fef771147a39ab2e4d94d07d4bb70860e9f6e23fac03540f59144d051eb7.scope\": RecentStats: unable to find data in memory cache]"
Feb 02 11:54:35 crc kubenswrapper[4909]: I0202 11:54:35.023145 4909 scope.go:117] "RemoveContainer" containerID="38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83"
Feb 02 11:54:35 crc 
kubenswrapper[4909]: E0202 11:54:35.023748 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:54:39 crc kubenswrapper[4909]: E0202 11:54:39.438440 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8f3129e_a33d_4814_8810_6820475178c5.slice/crio-d4f2fef771147a39ab2e4d94d07d4bb70860e9f6e23fac03540f59144d051eb7.scope\": RecentStats: unable to find data in memory cache]" Feb 02 11:54:49 crc kubenswrapper[4909]: I0202 11:54:49.016328 4909 scope.go:117] "RemoveContainer" containerID="38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83" Feb 02 11:54:49 crc kubenswrapper[4909]: E0202 11:54:49.017098 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:54:49 crc kubenswrapper[4909]: E0202 11:54:49.596379 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8f3129e_a33d_4814_8810_6820475178c5.slice/crio-d4f2fef771147a39ab2e4d94d07d4bb70860e9f6e23fac03540f59144d051eb7.scope\": RecentStats: unable to find data in memory cache]" Feb 02 11:54:59 crc 
kubenswrapper[4909]: E0202 11:54:59.768141 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8f3129e_a33d_4814_8810_6820475178c5.slice/crio-d4f2fef771147a39ab2e4d94d07d4bb70860e9f6e23fac03540f59144d051eb7.scope\": RecentStats: unable to find data in memory cache]" Feb 02 11:55:02 crc kubenswrapper[4909]: I0202 11:55:02.016293 4909 scope.go:117] "RemoveContainer" containerID="38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83" Feb 02 11:55:02 crc kubenswrapper[4909]: E0202 11:55:02.016962 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:55:09 crc kubenswrapper[4909]: E0202 11:55:09.940328 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8f3129e_a33d_4814_8810_6820475178c5.slice/crio-d4f2fef771147a39ab2e4d94d07d4bb70860e9f6e23fac03540f59144d051eb7.scope\": RecentStats: unable to find data in memory cache]" Feb 02 11:55:14 crc kubenswrapper[4909]: I0202 11:55:14.016753 4909 scope.go:117] "RemoveContainer" containerID="38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83" Feb 02 11:55:14 crc kubenswrapper[4909]: E0202 11:55:14.018192 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:55:27 crc kubenswrapper[4909]: I0202 11:55:27.017384 4909 scope.go:117] "RemoveContainer" containerID="38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83" Feb 02 11:55:27 crc kubenswrapper[4909]: E0202 11:55:27.019728 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:55:42 crc kubenswrapper[4909]: I0202 11:55:42.016326 4909 scope.go:117] "RemoveContainer" containerID="38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83" Feb 02 11:55:42 crc kubenswrapper[4909]: E0202 11:55:42.017157 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:55:54 crc kubenswrapper[4909]: I0202 11:55:54.016576 4909 scope.go:117] "RemoveContainer" containerID="38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83" Feb 02 11:55:54 crc kubenswrapper[4909]: E0202 11:55:54.017435 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:56:05 crc kubenswrapper[4909]: I0202 11:56:05.020166 4909 scope.go:117] "RemoveContainer" containerID="38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83" Feb 02 11:56:05 crc kubenswrapper[4909]: E0202 11:56:05.020887 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:56:18 crc kubenswrapper[4909]: I0202 11:56:18.016440 4909 scope.go:117] "RemoveContainer" containerID="38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83" Feb 02 11:56:18 crc kubenswrapper[4909]: E0202 11:56:18.017185 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:56:20 crc kubenswrapper[4909]: I0202 11:56:20.545055 4909 scope.go:117] "RemoveContainer" containerID="59e7b291e9fa579cffe6a5085cab33a158144b59d037995f955f9fb7e2f0ba0a" Feb 02 11:56:29 crc kubenswrapper[4909]: I0202 11:56:29.017031 4909 scope.go:117] "RemoveContainer" containerID="38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83" Feb 02 11:56:29 crc kubenswrapper[4909]: 
E0202 11:56:29.017899 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:56:44 crc kubenswrapper[4909]: I0202 11:56:44.016394 4909 scope.go:117] "RemoveContainer" containerID="38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83" Feb 02 11:56:44 crc kubenswrapper[4909]: E0202 11:56:44.017414 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:56:59 crc kubenswrapper[4909]: I0202 11:56:59.016680 4909 scope.go:117] "RemoveContainer" containerID="38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83" Feb 02 11:56:59 crc kubenswrapper[4909]: E0202 11:56:59.017481 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:57:14 crc kubenswrapper[4909]: I0202 11:57:14.016439 4909 scope.go:117] "RemoveContainer" containerID="38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83" Feb 02 11:57:14 crc 
kubenswrapper[4909]: E0202 11:57:14.017240 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:57:26 crc kubenswrapper[4909]: I0202 11:57:26.016273 4909 scope.go:117] "RemoveContainer" containerID="38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83" Feb 02 11:57:26 crc kubenswrapper[4909]: E0202 11:57:26.017065 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:57:38 crc kubenswrapper[4909]: I0202 11:57:38.016375 4909 scope.go:117] "RemoveContainer" containerID="38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83" Feb 02 11:57:38 crc kubenswrapper[4909]: E0202 11:57:38.017141 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:57:52 crc kubenswrapper[4909]: I0202 11:57:52.017635 4909 scope.go:117] "RemoveContainer" containerID="38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83" Feb 
02 11:57:52 crc kubenswrapper[4909]: E0202 11:57:52.018756 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:57:57 crc kubenswrapper[4909]: I0202 11:57:57.689535 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Feb 02 11:57:57 crc kubenswrapper[4909]: E0202 11:57:57.690428 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f3129e-a33d-4814-8810-6820475178c5" containerName="extract-utilities" Feb 02 11:57:57 crc kubenswrapper[4909]: I0202 11:57:57.690445 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f3129e-a33d-4814-8810-6820475178c5" containerName="extract-utilities" Feb 02 11:57:57 crc kubenswrapper[4909]: E0202 11:57:57.690468 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ee8edb4-c8f4-442d-8c89-81748fe1f25e" containerName="mariadb-client" Feb 02 11:57:57 crc kubenswrapper[4909]: I0202 11:57:57.690475 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ee8edb4-c8f4-442d-8c89-81748fe1f25e" containerName="mariadb-client" Feb 02 11:57:57 crc kubenswrapper[4909]: E0202 11:57:57.690498 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f3129e-a33d-4814-8810-6820475178c5" containerName="registry-server" Feb 02 11:57:57 crc kubenswrapper[4909]: I0202 11:57:57.690508 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f3129e-a33d-4814-8810-6820475178c5" containerName="registry-server" Feb 02 11:57:57 crc kubenswrapper[4909]: E0202 11:57:57.690522 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d8f3129e-a33d-4814-8810-6820475178c5" containerName="extract-content" Feb 02 11:57:57 crc kubenswrapper[4909]: I0202 11:57:57.690528 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f3129e-a33d-4814-8810-6820475178c5" containerName="extract-content" Feb 02 11:57:57 crc kubenswrapper[4909]: I0202 11:57:57.690674 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8f3129e-a33d-4814-8810-6820475178c5" containerName="registry-server" Feb 02 11:57:57 crc kubenswrapper[4909]: I0202 11:57:57.690695 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ee8edb4-c8f4-442d-8c89-81748fe1f25e" containerName="mariadb-client" Feb 02 11:57:57 crc kubenswrapper[4909]: I0202 11:57:57.691217 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Feb 02 11:57:57 crc kubenswrapper[4909]: I0202 11:57:57.693310 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-dc4ld" Feb 02 11:57:57 crc kubenswrapper[4909]: I0202 11:57:57.703492 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Feb 02 11:57:57 crc kubenswrapper[4909]: I0202 11:57:57.790735 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1b73b7ec-1c59-4e90-9a12-6aa3228363e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1b73b7ec-1c59-4e90-9a12-6aa3228363e1\") pod \"mariadb-copy-data\" (UID: \"e2846250-fddd-45b4-9dd2-f432fca93762\") " pod="openstack/mariadb-copy-data" Feb 02 11:57:57 crc kubenswrapper[4909]: I0202 11:57:57.790823 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfcck\" (UniqueName: \"kubernetes.io/projected/e2846250-fddd-45b4-9dd2-f432fca93762-kube-api-access-lfcck\") pod \"mariadb-copy-data\" (UID: \"e2846250-fddd-45b4-9dd2-f432fca93762\") " pod="openstack/mariadb-copy-data" 
Feb 02 11:57:57 crc kubenswrapper[4909]: I0202 11:57:57.892524 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1b73b7ec-1c59-4e90-9a12-6aa3228363e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1b73b7ec-1c59-4e90-9a12-6aa3228363e1\") pod \"mariadb-copy-data\" (UID: \"e2846250-fddd-45b4-9dd2-f432fca93762\") " pod="openstack/mariadb-copy-data" Feb 02 11:57:57 crc kubenswrapper[4909]: I0202 11:57:57.892598 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfcck\" (UniqueName: \"kubernetes.io/projected/e2846250-fddd-45b4-9dd2-f432fca93762-kube-api-access-lfcck\") pod \"mariadb-copy-data\" (UID: \"e2846250-fddd-45b4-9dd2-f432fca93762\") " pod="openstack/mariadb-copy-data" Feb 02 11:57:57 crc kubenswrapper[4909]: I0202 11:57:57.896078 4909 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 11:57:57 crc kubenswrapper[4909]: I0202 11:57:57.896115 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1b73b7ec-1c59-4e90-9a12-6aa3228363e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1b73b7ec-1c59-4e90-9a12-6aa3228363e1\") pod \"mariadb-copy-data\" (UID: \"e2846250-fddd-45b4-9dd2-f432fca93762\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/775e026747ac0dae2d85704fb431dd58d6e695ea2135c7b74d5d3d0b644f21d2/globalmount\"" pod="openstack/mariadb-copy-data" Feb 02 11:57:57 crc kubenswrapper[4909]: I0202 11:57:57.919654 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1b73b7ec-1c59-4e90-9a12-6aa3228363e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1b73b7ec-1c59-4e90-9a12-6aa3228363e1\") pod \"mariadb-copy-data\" (UID: \"e2846250-fddd-45b4-9dd2-f432fca93762\") " pod="openstack/mariadb-copy-data" Feb 02 11:57:57 crc 
kubenswrapper[4909]: I0202 11:57:57.921621 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfcck\" (UniqueName: \"kubernetes.io/projected/e2846250-fddd-45b4-9dd2-f432fca93762-kube-api-access-lfcck\") pod \"mariadb-copy-data\" (UID: \"e2846250-fddd-45b4-9dd2-f432fca93762\") " pod="openstack/mariadb-copy-data" Feb 02 11:57:58 crc kubenswrapper[4909]: I0202 11:57:58.053441 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Feb 02 11:57:58 crc kubenswrapper[4909]: I0202 11:57:58.606391 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Feb 02 11:57:59 crc kubenswrapper[4909]: I0202 11:57:59.352360 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"e2846250-fddd-45b4-9dd2-f432fca93762","Type":"ContainerStarted","Data":"9c27f823792342dbc1595f839bc948799613166a05cd2289d49cee190b5fcc0d"} Feb 02 11:57:59 crc kubenswrapper[4909]: I0202 11:57:59.352679 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"e2846250-fddd-45b4-9dd2-f432fca93762","Type":"ContainerStarted","Data":"c8410835049d3a4e81d4e06fbb1386477685edeb67be6ae4abcce306ffcdfe5d"} Feb 02 11:57:59 crc kubenswrapper[4909]: I0202 11:57:59.367149 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.36712852 podStartE2EDuration="3.36712852s" podCreationTimestamp="2026-02-02 11:57:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:57:59.366154803 +0000 UTC m=+5205.112255538" watchObservedRunningTime="2026-02-02 11:57:59.36712852 +0000 UTC m=+5205.113229255" Feb 02 11:58:02 crc kubenswrapper[4909]: I0202 11:58:02.008735 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Feb 02 
11:58:02 crc kubenswrapper[4909]: I0202 11:58:02.010306 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 02 11:58:02 crc kubenswrapper[4909]: I0202 11:58:02.015848 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 02 11:58:02 crc kubenswrapper[4909]: I0202 11:58:02.053907 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrhq9\" (UniqueName: \"kubernetes.io/projected/6de80c09-aed0-4046-a7a5-6139a0d40950-kube-api-access-xrhq9\") pod \"mariadb-client\" (UID: \"6de80c09-aed0-4046-a7a5-6139a0d40950\") " pod="openstack/mariadb-client" Feb 02 11:58:02 crc kubenswrapper[4909]: I0202 11:58:02.156696 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrhq9\" (UniqueName: \"kubernetes.io/projected/6de80c09-aed0-4046-a7a5-6139a0d40950-kube-api-access-xrhq9\") pod \"mariadb-client\" (UID: \"6de80c09-aed0-4046-a7a5-6139a0d40950\") " pod="openstack/mariadb-client" Feb 02 11:58:02 crc kubenswrapper[4909]: I0202 11:58:02.178648 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrhq9\" (UniqueName: \"kubernetes.io/projected/6de80c09-aed0-4046-a7a5-6139a0d40950-kube-api-access-xrhq9\") pod \"mariadb-client\" (UID: \"6de80c09-aed0-4046-a7a5-6139a0d40950\") " pod="openstack/mariadb-client" Feb 02 11:58:02 crc kubenswrapper[4909]: I0202 11:58:02.337783 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 02 11:58:02 crc kubenswrapper[4909]: I0202 11:58:02.814607 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 02 11:58:02 crc kubenswrapper[4909]: W0202 11:58:02.818558 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6de80c09_aed0_4046_a7a5_6139a0d40950.slice/crio-8690313c5a0584e3b87dfc1c6c3d86a3bb6f4553da3d2c317ee45161a41c957b WatchSource:0}: Error finding container 8690313c5a0584e3b87dfc1c6c3d86a3bb6f4553da3d2c317ee45161a41c957b: Status 404 returned error can't find the container with id 8690313c5a0584e3b87dfc1c6c3d86a3bb6f4553da3d2c317ee45161a41c957b Feb 02 11:58:03 crc kubenswrapper[4909]: I0202 11:58:03.380963 4909 generic.go:334] "Generic (PLEG): container finished" podID="6de80c09-aed0-4046-a7a5-6139a0d40950" containerID="d893751d08cdf79e417447ab69787b107e4abfb8df30d85f5101fdd6eac9ee14" exitCode=0 Feb 02 11:58:03 crc kubenswrapper[4909]: I0202 11:58:03.381282 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"6de80c09-aed0-4046-a7a5-6139a0d40950","Type":"ContainerDied","Data":"d893751d08cdf79e417447ab69787b107e4abfb8df30d85f5101fdd6eac9ee14"} Feb 02 11:58:03 crc kubenswrapper[4909]: I0202 11:58:03.381317 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"6de80c09-aed0-4046-a7a5-6139a0d40950","Type":"ContainerStarted","Data":"8690313c5a0584e3b87dfc1c6c3d86a3bb6f4553da3d2c317ee45161a41c957b"} Feb 02 11:58:04 crc kubenswrapper[4909]: I0202 11:58:04.658112 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 02 11:58:04 crc kubenswrapper[4909]: I0202 11:58:04.686087 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_6de80c09-aed0-4046-a7a5-6139a0d40950/mariadb-client/0.log" Feb 02 11:58:04 crc kubenswrapper[4909]: I0202 11:58:04.712646 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 02 11:58:04 crc kubenswrapper[4909]: I0202 11:58:04.718026 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Feb 02 11:58:04 crc kubenswrapper[4909]: I0202 11:58:04.799524 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrhq9\" (UniqueName: \"kubernetes.io/projected/6de80c09-aed0-4046-a7a5-6139a0d40950-kube-api-access-xrhq9\") pod \"6de80c09-aed0-4046-a7a5-6139a0d40950\" (UID: \"6de80c09-aed0-4046-a7a5-6139a0d40950\") " Feb 02 11:58:04 crc kubenswrapper[4909]: I0202 11:58:04.807250 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6de80c09-aed0-4046-a7a5-6139a0d40950-kube-api-access-xrhq9" (OuterVolumeSpecName: "kube-api-access-xrhq9") pod "6de80c09-aed0-4046-a7a5-6139a0d40950" (UID: "6de80c09-aed0-4046-a7a5-6139a0d40950"). InnerVolumeSpecName "kube-api-access-xrhq9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:58:04 crc kubenswrapper[4909]: I0202 11:58:04.830606 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Feb 02 11:58:04 crc kubenswrapper[4909]: E0202 11:58:04.831058 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6de80c09-aed0-4046-a7a5-6139a0d40950" containerName="mariadb-client" Feb 02 11:58:04 crc kubenswrapper[4909]: I0202 11:58:04.831081 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6de80c09-aed0-4046-a7a5-6139a0d40950" containerName="mariadb-client" Feb 02 11:58:04 crc kubenswrapper[4909]: I0202 11:58:04.831249 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="6de80c09-aed0-4046-a7a5-6139a0d40950" containerName="mariadb-client" Feb 02 11:58:04 crc kubenswrapper[4909]: I0202 11:58:04.831831 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 02 11:58:04 crc kubenswrapper[4909]: I0202 11:58:04.841324 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 02 11:58:04 crc kubenswrapper[4909]: I0202 11:58:04.901391 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrhq9\" (UniqueName: \"kubernetes.io/projected/6de80c09-aed0-4046-a7a5-6139a0d40950-kube-api-access-xrhq9\") on node \"crc\" DevicePath \"\"" Feb 02 11:58:05 crc kubenswrapper[4909]: I0202 11:58:05.003591 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd2qd\" (UniqueName: \"kubernetes.io/projected/aae3978f-219f-4634-87ce-cf6577c52eb4-kube-api-access-pd2qd\") pod \"mariadb-client\" (UID: \"aae3978f-219f-4634-87ce-cf6577c52eb4\") " pod="openstack/mariadb-client" Feb 02 11:58:05 crc kubenswrapper[4909]: I0202 11:58:05.025582 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6de80c09-aed0-4046-a7a5-6139a0d40950" 
path="/var/lib/kubelet/pods/6de80c09-aed0-4046-a7a5-6139a0d40950/volumes" Feb 02 11:58:05 crc kubenswrapper[4909]: I0202 11:58:05.105524 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd2qd\" (UniqueName: \"kubernetes.io/projected/aae3978f-219f-4634-87ce-cf6577c52eb4-kube-api-access-pd2qd\") pod \"mariadb-client\" (UID: \"aae3978f-219f-4634-87ce-cf6577c52eb4\") " pod="openstack/mariadb-client" Feb 02 11:58:05 crc kubenswrapper[4909]: I0202 11:58:05.127110 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd2qd\" (UniqueName: \"kubernetes.io/projected/aae3978f-219f-4634-87ce-cf6577c52eb4-kube-api-access-pd2qd\") pod \"mariadb-client\" (UID: \"aae3978f-219f-4634-87ce-cf6577c52eb4\") " pod="openstack/mariadb-client" Feb 02 11:58:05 crc kubenswrapper[4909]: I0202 11:58:05.150106 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 02 11:58:05 crc kubenswrapper[4909]: I0202 11:58:05.354267 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 02 11:58:05 crc kubenswrapper[4909]: I0202 11:58:05.398523 4909 scope.go:117] "RemoveContainer" containerID="d893751d08cdf79e417447ab69787b107e4abfb8df30d85f5101fdd6eac9ee14" Feb 02 11:58:05 crc kubenswrapper[4909]: I0202 11:58:05.398532 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 02 11:58:05 crc kubenswrapper[4909]: I0202 11:58:05.401062 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"aae3978f-219f-4634-87ce-cf6577c52eb4","Type":"ContainerStarted","Data":"86e7a5743671e4f5d6500813742fec4699fce8420ac097786aa3b0d5055bc273"} Feb 02 11:58:06 crc kubenswrapper[4909]: I0202 11:58:06.413384 4909 generic.go:334] "Generic (PLEG): container finished" podID="aae3978f-219f-4634-87ce-cf6577c52eb4" containerID="48c29c1509e33cdd6317cb7dc255f44fde9a51dcbf54f58d7f1d92c391208e1b" exitCode=0 Feb 02 11:58:06 crc kubenswrapper[4909]: I0202 11:58:06.413515 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"aae3978f-219f-4634-87ce-cf6577c52eb4","Type":"ContainerDied","Data":"48c29c1509e33cdd6317cb7dc255f44fde9a51dcbf54f58d7f1d92c391208e1b"} Feb 02 11:58:07 crc kubenswrapper[4909]: I0202 11:58:07.016486 4909 scope.go:117] "RemoveContainer" containerID="38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83" Feb 02 11:58:07 crc kubenswrapper[4909]: E0202 11:58:07.016978 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:58:07 crc kubenswrapper[4909]: I0202 11:58:07.728618 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 02 11:58:07 crc kubenswrapper[4909]: I0202 11:58:07.753926 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_aae3978f-219f-4634-87ce-cf6577c52eb4/mariadb-client/0.log" Feb 02 11:58:07 crc kubenswrapper[4909]: I0202 11:58:07.777729 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 02 11:58:07 crc kubenswrapper[4909]: I0202 11:58:07.783432 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Feb 02 11:58:07 crc kubenswrapper[4909]: I0202 11:58:07.842824 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd2qd\" (UniqueName: \"kubernetes.io/projected/aae3978f-219f-4634-87ce-cf6577c52eb4-kube-api-access-pd2qd\") pod \"aae3978f-219f-4634-87ce-cf6577c52eb4\" (UID: \"aae3978f-219f-4634-87ce-cf6577c52eb4\") " Feb 02 11:58:07 crc kubenswrapper[4909]: I0202 11:58:07.864236 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aae3978f-219f-4634-87ce-cf6577c52eb4-kube-api-access-pd2qd" (OuterVolumeSpecName: "kube-api-access-pd2qd") pod "aae3978f-219f-4634-87ce-cf6577c52eb4" (UID: "aae3978f-219f-4634-87ce-cf6577c52eb4"). InnerVolumeSpecName "kube-api-access-pd2qd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:58:07 crc kubenswrapper[4909]: I0202 11:58:07.944327 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd2qd\" (UniqueName: \"kubernetes.io/projected/aae3978f-219f-4634-87ce-cf6577c52eb4-kube-api-access-pd2qd\") on node \"crc\" DevicePath \"\"" Feb 02 11:58:08 crc kubenswrapper[4909]: I0202 11:58:08.426746 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86e7a5743671e4f5d6500813742fec4699fce8420ac097786aa3b0d5055bc273" Feb 02 11:58:08 crc kubenswrapper[4909]: I0202 11:58:08.426820 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 02 11:58:09 crc kubenswrapper[4909]: I0202 11:58:09.025415 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aae3978f-219f-4634-87ce-cf6577c52eb4" path="/var/lib/kubelet/pods/aae3978f-219f-4634-87ce-cf6577c52eb4/volumes" Feb 02 11:58:18 crc kubenswrapper[4909]: I0202 11:58:18.016786 4909 scope.go:117] "RemoveContainer" containerID="38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83" Feb 02 11:58:18 crc kubenswrapper[4909]: E0202 11:58:18.018075 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:58:20 crc kubenswrapper[4909]: I0202 11:58:20.658252 4909 scope.go:117] "RemoveContainer" containerID="1741581c9e2297db3465851611c9fd1d26c3d106a1de2cb85e50f51ff9452ce5" Feb 02 11:58:30 crc kubenswrapper[4909]: I0202 11:58:30.016198 4909 scope.go:117] "RemoveContainer" 
containerID="38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83" Feb 02 11:58:30 crc kubenswrapper[4909]: E0202 11:58:30.016960 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.740428 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 02 11:58:37 crc kubenswrapper[4909]: E0202 11:58:37.741294 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aae3978f-219f-4634-87ce-cf6577c52eb4" containerName="mariadb-client" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.741308 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="aae3978f-219f-4634-87ce-cf6577c52eb4" containerName="mariadb-client" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.741437 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="aae3978f-219f-4634-87ce-cf6577c52eb4" containerName="mariadb-client" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.743482 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.749410 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.749417 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.750301 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.750373 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.750391 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-cf4wc" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.753174 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.762741 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.764546 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.778312 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.779911 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.785640 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.801189 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.909696 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-84753cc4-263a-4c04-93c2-da2c3a57c3e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84753cc4-263a-4c04-93c2-da2c3a57c3e1\") pod \"ovsdbserver-nb-2\" (UID: \"0067f53b-b646-4cfb-82c6-71cc14a45dcb\") " pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.909749 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5aa0410b-2229-496d-a4df-7769afab71c3-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"5aa0410b-2229-496d-a4df-7769afab71c3\") " pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.909775 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0067f53b-b646-4cfb-82c6-71cc14a45dcb-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"0067f53b-b646-4cfb-82c6-71cc14a45dcb\") " pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.909817 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb9n8\" (UniqueName: \"kubernetes.io/projected/0067f53b-b646-4cfb-82c6-71cc14a45dcb-kube-api-access-tb9n8\") pod \"ovsdbserver-nb-2\" (UID: \"0067f53b-b646-4cfb-82c6-71cc14a45dcb\") " pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:37 crc kubenswrapper[4909]: 
I0202 11:58:37.909839 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36b9c7ae-e055-434d-b016-2dcca5daf712-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"36b9c7ae-e055-434d-b016-2dcca5daf712\") " pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.910045 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5aa0410b-2229-496d-a4df-7769afab71c3-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"5aa0410b-2229-496d-a4df-7769afab71c3\") " pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.910060 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4jtm\" (UniqueName: \"kubernetes.io/projected/5aa0410b-2229-496d-a4df-7769afab71c3-kube-api-access-d4jtm\") pod \"ovsdbserver-nb-0\" (UID: \"5aa0410b-2229-496d-a4df-7769afab71c3\") " pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.910199 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0067f53b-b646-4cfb-82c6-71cc14a45dcb-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"0067f53b-b646-4cfb-82c6-71cc14a45dcb\") " pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.910294 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/36b9c7ae-e055-434d-b016-2dcca5daf712-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"36b9c7ae-e055-434d-b016-2dcca5daf712\") " pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.910329 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0067f53b-b646-4cfb-82c6-71cc14a45dcb-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"0067f53b-b646-4cfb-82c6-71cc14a45dcb\") " pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.910378 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0b44b95e-b40b-4f6d-bb19-75adb6d761b0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0b44b95e-b40b-4f6d-bb19-75adb6d761b0\") pod \"ovsdbserver-nb-1\" (UID: \"36b9c7ae-e055-434d-b016-2dcca5daf712\") " pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.910399 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aa0410b-2229-496d-a4df-7769afab71c3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"5aa0410b-2229-496d-a4df-7769afab71c3\") " pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.910418 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0067f53b-b646-4cfb-82c6-71cc14a45dcb-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"0067f53b-b646-4cfb-82c6-71cc14a45dcb\") " pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.910441 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36b9c7ae-e055-434d-b016-2dcca5daf712-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"36b9c7ae-e055-434d-b016-2dcca5daf712\") " pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.910507 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aa0410b-2229-496d-a4df-7769afab71c3-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5aa0410b-2229-496d-a4df-7769afab71c3\") " pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.910530 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36b9c7ae-e055-434d-b016-2dcca5daf712-config\") pod \"ovsdbserver-nb-1\" (UID: \"36b9c7ae-e055-434d-b016-2dcca5daf712\") " pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.910558 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aa0410b-2229-496d-a4df-7769afab71c3-config\") pod \"ovsdbserver-nb-0\" (UID: \"5aa0410b-2229-496d-a4df-7769afab71c3\") " pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.910576 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/36b9c7ae-e055-434d-b016-2dcca5daf712-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"36b9c7ae-e055-434d-b016-2dcca5daf712\") " pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.910597 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpwgr\" (UniqueName: \"kubernetes.io/projected/36b9c7ae-e055-434d-b016-2dcca5daf712-kube-api-access-bpwgr\") pod \"ovsdbserver-nb-1\" (UID: \"36b9c7ae-e055-434d-b016-2dcca5daf712\") " pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.910613 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aa0410b-2229-496d-a4df-7769afab71c3-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5aa0410b-2229-496d-a4df-7769afab71c3\") " pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.910630 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0067f53b-b646-4cfb-82c6-71cc14a45dcb-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"0067f53b-b646-4cfb-82c6-71cc14a45dcb\") " pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.910662 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-43c1a8d5-c48a-495c-b18b-79b5fc03820f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43c1a8d5-c48a-495c-b18b-79b5fc03820f\") pod \"ovsdbserver-nb-0\" (UID: \"5aa0410b-2229-496d-a4df-7769afab71c3\") " pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.910686 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0067f53b-b646-4cfb-82c6-71cc14a45dcb-config\") pod \"ovsdbserver-nb-2\" (UID: \"0067f53b-b646-4cfb-82c6-71cc14a45dcb\") " pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:37 crc kubenswrapper[4909]: I0202 11:58:37.910704 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/36b9c7ae-e055-434d-b016-2dcca5daf712-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"36b9c7ae-e055-434d-b016-2dcca5daf712\") " pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.011794 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-84753cc4-263a-4c04-93c2-da2c3a57c3e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84753cc4-263a-4c04-93c2-da2c3a57c3e1\") pod \"ovsdbserver-nb-2\" (UID: \"0067f53b-b646-4cfb-82c6-71cc14a45dcb\") " pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.011862 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5aa0410b-2229-496d-a4df-7769afab71c3-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"5aa0410b-2229-496d-a4df-7769afab71c3\") " pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.011891 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0067f53b-b646-4cfb-82c6-71cc14a45dcb-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"0067f53b-b646-4cfb-82c6-71cc14a45dcb\") " pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.011913 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb9n8\" (UniqueName: \"kubernetes.io/projected/0067f53b-b646-4cfb-82c6-71cc14a45dcb-kube-api-access-tb9n8\") pod \"ovsdbserver-nb-2\" (UID: \"0067f53b-b646-4cfb-82c6-71cc14a45dcb\") " pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.011931 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36b9c7ae-e055-434d-b016-2dcca5daf712-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"36b9c7ae-e055-434d-b016-2dcca5daf712\") " pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.011951 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5aa0410b-2229-496d-a4df-7769afab71c3-ovsdb-rundir\") 
pod \"ovsdbserver-nb-0\" (UID: \"5aa0410b-2229-496d-a4df-7769afab71c3\") " pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.011968 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4jtm\" (UniqueName: \"kubernetes.io/projected/5aa0410b-2229-496d-a4df-7769afab71c3-kube-api-access-d4jtm\") pod \"ovsdbserver-nb-0\" (UID: \"5aa0410b-2229-496d-a4df-7769afab71c3\") " pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.011989 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0067f53b-b646-4cfb-82c6-71cc14a45dcb-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"0067f53b-b646-4cfb-82c6-71cc14a45dcb\") " pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.012010 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/36b9c7ae-e055-434d-b016-2dcca5daf712-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"36b9c7ae-e055-434d-b016-2dcca5daf712\") " pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.012028 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0067f53b-b646-4cfb-82c6-71cc14a45dcb-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"0067f53b-b646-4cfb-82c6-71cc14a45dcb\") " pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.012054 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0b44b95e-b40b-4f6d-bb19-75adb6d761b0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0b44b95e-b40b-4f6d-bb19-75adb6d761b0\") pod \"ovsdbserver-nb-1\" (UID: \"36b9c7ae-e055-434d-b016-2dcca5daf712\") " pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:38 crc 
kubenswrapper[4909]: I0202 11:58:38.012069 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aa0410b-2229-496d-a4df-7769afab71c3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"5aa0410b-2229-496d-a4df-7769afab71c3\") " pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.012084 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0067f53b-b646-4cfb-82c6-71cc14a45dcb-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"0067f53b-b646-4cfb-82c6-71cc14a45dcb\") " pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.012103 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36b9c7ae-e055-434d-b016-2dcca5daf712-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"36b9c7ae-e055-434d-b016-2dcca5daf712\") " pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.012121 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aa0410b-2229-496d-a4df-7769afab71c3-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5aa0410b-2229-496d-a4df-7769afab71c3\") " pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.012139 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36b9c7ae-e055-434d-b016-2dcca5daf712-config\") pod \"ovsdbserver-nb-1\" (UID: \"36b9c7ae-e055-434d-b016-2dcca5daf712\") " pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.012161 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5aa0410b-2229-496d-a4df-7769afab71c3-config\") pod \"ovsdbserver-nb-0\" (UID: \"5aa0410b-2229-496d-a4df-7769afab71c3\") " pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.012179 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/36b9c7ae-e055-434d-b016-2dcca5daf712-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"36b9c7ae-e055-434d-b016-2dcca5daf712\") " pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.012204 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpwgr\" (UniqueName: \"kubernetes.io/projected/36b9c7ae-e055-434d-b016-2dcca5daf712-kube-api-access-bpwgr\") pod \"ovsdbserver-nb-1\" (UID: \"36b9c7ae-e055-434d-b016-2dcca5daf712\") " pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.012220 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aa0410b-2229-496d-a4df-7769afab71c3-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5aa0410b-2229-496d-a4df-7769afab71c3\") " pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.012236 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0067f53b-b646-4cfb-82c6-71cc14a45dcb-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"0067f53b-b646-4cfb-82c6-71cc14a45dcb\") " pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.012286 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-43c1a8d5-c48a-495c-b18b-79b5fc03820f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43c1a8d5-c48a-495c-b18b-79b5fc03820f\") pod 
\"ovsdbserver-nb-0\" (UID: \"5aa0410b-2229-496d-a4df-7769afab71c3\") " pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.012310 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0067f53b-b646-4cfb-82c6-71cc14a45dcb-config\") pod \"ovsdbserver-nb-2\" (UID: \"0067f53b-b646-4cfb-82c6-71cc14a45dcb\") " pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.012335 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/36b9c7ae-e055-434d-b016-2dcca5daf712-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"36b9c7ae-e055-434d-b016-2dcca5daf712\") " pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.012839 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5aa0410b-2229-496d-a4df-7769afab71c3-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"5aa0410b-2229-496d-a4df-7769afab71c3\") " pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.013481 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5aa0410b-2229-496d-a4df-7769afab71c3-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"5aa0410b-2229-496d-a4df-7769afab71c3\") " pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.013621 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aa0410b-2229-496d-a4df-7769afab71c3-config\") pod \"ovsdbserver-nb-0\" (UID: \"5aa0410b-2229-496d-a4df-7769afab71c3\") " pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.013944 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0067f53b-b646-4cfb-82c6-71cc14a45dcb-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"0067f53b-b646-4cfb-82c6-71cc14a45dcb\") " pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.014011 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36b9c7ae-e055-434d-b016-2dcca5daf712-config\") pod \"ovsdbserver-nb-1\" (UID: \"36b9c7ae-e055-434d-b016-2dcca5daf712\") " pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.014447 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/36b9c7ae-e055-434d-b016-2dcca5daf712-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"36b9c7ae-e055-434d-b016-2dcca5daf712\") " pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.020638 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0067f53b-b646-4cfb-82c6-71cc14a45dcb-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"0067f53b-b646-4cfb-82c6-71cc14a45dcb\") " pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.021684 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0067f53b-b646-4cfb-82c6-71cc14a45dcb-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"0067f53b-b646-4cfb-82c6-71cc14a45dcb\") " pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.022498 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aa0410b-2229-496d-a4df-7769afab71c3-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5aa0410b-2229-496d-a4df-7769afab71c3\") " 
pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.023469 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36b9c7ae-e055-434d-b016-2dcca5daf712-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"36b9c7ae-e055-434d-b016-2dcca5daf712\") " pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.023872 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0067f53b-b646-4cfb-82c6-71cc14a45dcb-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"0067f53b-b646-4cfb-82c6-71cc14a45dcb\") " pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.027131 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/36b9c7ae-e055-434d-b016-2dcca5daf712-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"36b9c7ae-e055-434d-b016-2dcca5daf712\") " pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.028392 4909 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.028455 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-84753cc4-263a-4c04-93c2-da2c3a57c3e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84753cc4-263a-4c04-93c2-da2c3a57c3e1\") pod \"ovsdbserver-nb-2\" (UID: \"0067f53b-b646-4cfb-82c6-71cc14a45dcb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/eabe9a2882917c8d4152db3db8e3cea8be86cda7aaafc78ca3263d8ee501bf4f/globalmount\"" pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.028473 4909 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.028724 4909 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.028764 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0b44b95e-b40b-4f6d-bb19-75adb6d761b0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0b44b95e-b40b-4f6d-bb19-75adb6d761b0\") pod \"ovsdbserver-nb-1\" (UID: \"36b9c7ae-e055-434d-b016-2dcca5daf712\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fc66d6b238cdf9aef63a6f5182dd206cac400051f01596a8d48a5310b19d2fe2/globalmount\"" pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.029403 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0067f53b-b646-4cfb-82c6-71cc14a45dcb-config\") pod \"ovsdbserver-nb-2\" (UID: \"0067f53b-b646-4cfb-82c6-71cc14a45dcb\") " pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.030607 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0067f53b-b646-4cfb-82c6-71cc14a45dcb-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"0067f53b-b646-4cfb-82c6-71cc14a45dcb\") " pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.031139 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-43c1a8d5-c48a-495c-b18b-79b5fc03820f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43c1a8d5-c48a-495c-b18b-79b5fc03820f\") pod \"ovsdbserver-nb-0\" (UID: \"5aa0410b-2229-496d-a4df-7769afab71c3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6a53317b4f481f9f81b77fa06633bb5a22f98214d3a12e449f98d21a6bedf690/globalmount\"" pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.032329 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aa0410b-2229-496d-a4df-7769afab71c3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"5aa0410b-2229-496d-a4df-7769afab71c3\") " pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.032392 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/36b9c7ae-e055-434d-b016-2dcca5daf712-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"36b9c7ae-e055-434d-b016-2dcca5daf712\") " pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.032902 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpwgr\" (UniqueName: \"kubernetes.io/projected/36b9c7ae-e055-434d-b016-2dcca5daf712-kube-api-access-bpwgr\") pod \"ovsdbserver-nb-1\" (UID: \"36b9c7ae-e055-434d-b016-2dcca5daf712\") " pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 
11:58:38.036344 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/36b9c7ae-e055-434d-b016-2dcca5daf712-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"36b9c7ae-e055-434d-b016-2dcca5daf712\") " pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.039229 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5aa0410b-2229-496d-a4df-7769afab71c3-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5aa0410b-2229-496d-a4df-7769afab71c3\") " pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.049759 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb9n8\" (UniqueName: \"kubernetes.io/projected/0067f53b-b646-4cfb-82c6-71cc14a45dcb-kube-api-access-tb9n8\") pod \"ovsdbserver-nb-2\" (UID: \"0067f53b-b646-4cfb-82c6-71cc14a45dcb\") " pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.052466 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4jtm\" (UniqueName: \"kubernetes.io/projected/5aa0410b-2229-496d-a4df-7769afab71c3-kube-api-access-d4jtm\") pod \"ovsdbserver-nb-0\" (UID: \"5aa0410b-2229-496d-a4df-7769afab71c3\") " pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.074470 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-84753cc4-263a-4c04-93c2-da2c3a57c3e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-84753cc4-263a-4c04-93c2-da2c3a57c3e1\") pod \"ovsdbserver-nb-2\" (UID: \"0067f53b-b646-4cfb-82c6-71cc14a45dcb\") " pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.074586 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-43c1a8d5-c48a-495c-b18b-79b5fc03820f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43c1a8d5-c48a-495c-b18b-79b5fc03820f\") pod \"ovsdbserver-nb-0\" (UID: \"5aa0410b-2229-496d-a4df-7769afab71c3\") " pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.077851 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0b44b95e-b40b-4f6d-bb19-75adb6d761b0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0b44b95e-b40b-4f6d-bb19-75adb6d761b0\") pod \"ovsdbserver-nb-1\" (UID: \"36b9c7ae-e055-434d-b016-2dcca5daf712\") " pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.083160 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.101391 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:38 crc kubenswrapper[4909]: I0202 11:58:38.374299 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.264870 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.326350 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.327693 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.330516 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.334337 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.334358 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.334488 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-xnrqx" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.354340 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.371995 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.373555 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.383037 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.385106 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.392764 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.402684 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.458011 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qbtrr"] Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.464794 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qbtrr" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.472362 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qbtrr"] Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.481229 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b94aba5-26fe-4f14-9268-e8666aac57ec-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4b94aba5-26fe-4f14-9268-e8666aac57ec\") " pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.481281 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4b94aba5-26fe-4f14-9268-e8666aac57ec-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4b94aba5-26fe-4f14-9268-e8666aac57ec\") " pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.481333 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8d6c9d17-2880-49ad-b4e6-6fd38dabf489\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8d6c9d17-2880-49ad-b4e6-6fd38dabf489\") pod \"ovsdbserver-sb-0\" (UID: 
\"4b94aba5-26fe-4f14-9268-e8666aac57ec\") " pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.481358 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9hqn\" (UniqueName: \"kubernetes.io/projected/4b94aba5-26fe-4f14-9268-e8666aac57ec-kube-api-access-w9hqn\") pod \"ovsdbserver-sb-0\" (UID: \"4b94aba5-26fe-4f14-9268-e8666aac57ec\") " pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.481409 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b94aba5-26fe-4f14-9268-e8666aac57ec-config\") pod \"ovsdbserver-sb-0\" (UID: \"4b94aba5-26fe-4f14-9268-e8666aac57ec\") " pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.481436 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b94aba5-26fe-4f14-9268-e8666aac57ec-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4b94aba5-26fe-4f14-9268-e8666aac57ec\") " pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.481502 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b94aba5-26fe-4f14-9268-e8666aac57ec-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4b94aba5-26fe-4f14-9268-e8666aac57ec\") " pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.481554 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b94aba5-26fe-4f14-9268-e8666aac57ec-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4b94aba5-26fe-4f14-9268-e8666aac57ec\") " 
pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.583202 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/30354f86-0667-4f76-a9c7-0e54ab2c5d83-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"30354f86-0667-4f76-a9c7-0e54ab2c5d83\") " pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.583657 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/30354f86-0667-4f76-a9c7-0e54ab2c5d83-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"30354f86-0667-4f76-a9c7-0e54ab2c5d83\") " pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.583695 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2615f23e-9130-4801-af8f-a00a8b06b63f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2615f23e-9130-4801-af8f-a00a8b06b63f\") pod \"ovsdbserver-sb-1\" (UID: \"51928bd6-d607-4438-ae21-b49df832df5c\") " pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.583730 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c85aca97-8461-48fb-a4a1-981e77c72536\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c85aca97-8461-48fb-a4a1-981e77c72536\") pod \"ovsdbserver-sb-2\" (UID: \"30354f86-0667-4f76-a9c7-0e54ab2c5d83\") " pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.583754 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/51928bd6-d607-4438-ae21-b49df832df5c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: 
\"51928bd6-d607-4438-ae21-b49df832df5c\") " pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.583831 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce04e8dd-8db5-4467-b9dd-8f389fcf9820-catalog-content\") pod \"redhat-operators-qbtrr\" (UID: \"ce04e8dd-8db5-4467-b9dd-8f389fcf9820\") " pod="openshift-marketplace/redhat-operators-qbtrr" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.583868 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b94aba5-26fe-4f14-9268-e8666aac57ec-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4b94aba5-26fe-4f14-9268-e8666aac57ec\") " pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.583909 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4b94aba5-26fe-4f14-9268-e8666aac57ec-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4b94aba5-26fe-4f14-9268-e8666aac57ec\") " pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.583948 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/51928bd6-d607-4438-ae21-b49df832df5c-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"51928bd6-d607-4438-ae21-b49df832df5c\") " pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.583974 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce04e8dd-8db5-4467-b9dd-8f389fcf9820-utilities\") pod \"redhat-operators-qbtrr\" (UID: \"ce04e8dd-8db5-4467-b9dd-8f389fcf9820\") " pod="openshift-marketplace/redhat-operators-qbtrr" Feb 02 11:58:39 crc 
kubenswrapper[4909]: I0202 11:58:39.584001 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/30354f86-0667-4f76-a9c7-0e54ab2c5d83-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"30354f86-0667-4f76-a9c7-0e54ab2c5d83\") " pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.584031 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8d6c9d17-2880-49ad-b4e6-6fd38dabf489\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8d6c9d17-2880-49ad-b4e6-6fd38dabf489\") pod \"ovsdbserver-sb-0\" (UID: \"4b94aba5-26fe-4f14-9268-e8666aac57ec\") " pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.584061 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9hqn\" (UniqueName: \"kubernetes.io/projected/4b94aba5-26fe-4f14-9268-e8666aac57ec-kube-api-access-w9hqn\") pod \"ovsdbserver-sb-0\" (UID: \"4b94aba5-26fe-4f14-9268-e8666aac57ec\") " pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.584085 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/51928bd6-d607-4438-ae21-b49df832df5c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"51928bd6-d607-4438-ae21-b49df832df5c\") " pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.584110 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b94aba5-26fe-4f14-9268-e8666aac57ec-config\") pod \"ovsdbserver-sb-0\" (UID: \"4b94aba5-26fe-4f14-9268-e8666aac57ec\") " pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.584140 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30354f86-0667-4f76-a9c7-0e54ab2c5d83-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"30354f86-0667-4f76-a9c7-0e54ab2c5d83\") " pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.584160 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51928bd6-d607-4438-ae21-b49df832df5c-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"51928bd6-d607-4438-ae21-b49df832df5c\") " pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.584186 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b94aba5-26fe-4f14-9268-e8666aac57ec-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4b94aba5-26fe-4f14-9268-e8666aac57ec\") " pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.584222 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30354f86-0667-4f76-a9c7-0e54ab2c5d83-config\") pod \"ovsdbserver-sb-2\" (UID: \"30354f86-0667-4f76-a9c7-0e54ab2c5d83\") " pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.584249 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b94aba5-26fe-4f14-9268-e8666aac57ec-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4b94aba5-26fe-4f14-9268-e8666aac57ec\") " pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.584275 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30354f86-0667-4f76-a9c7-0e54ab2c5d83-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"30354f86-0667-4f76-a9c7-0e54ab2c5d83\") " pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.584307 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b94aba5-26fe-4f14-9268-e8666aac57ec-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4b94aba5-26fe-4f14-9268-e8666aac57ec\") " pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.584337 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51928bd6-d607-4438-ae21-b49df832df5c-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"51928bd6-d607-4438-ae21-b49df832df5c\") " pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.584366 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9vqq\" (UniqueName: \"kubernetes.io/projected/ce04e8dd-8db5-4467-b9dd-8f389fcf9820-kube-api-access-v9vqq\") pod \"redhat-operators-qbtrr\" (UID: \"ce04e8dd-8db5-4467-b9dd-8f389fcf9820\") " pod="openshift-marketplace/redhat-operators-qbtrr" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.584393 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch9cs\" (UniqueName: \"kubernetes.io/projected/30354f86-0667-4f76-a9c7-0e54ab2c5d83-kube-api-access-ch9cs\") pod \"ovsdbserver-sb-2\" (UID: \"30354f86-0667-4f76-a9c7-0e54ab2c5d83\") " pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.584423 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxgn2\" (UniqueName: 
\"kubernetes.io/projected/51928bd6-d607-4438-ae21-b49df832df5c-kube-api-access-mxgn2\") pod \"ovsdbserver-sb-1\" (UID: \"51928bd6-d607-4438-ae21-b49df832df5c\") " pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.584446 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51928bd6-d607-4438-ae21-b49df832df5c-config\") pod \"ovsdbserver-sb-1\" (UID: \"51928bd6-d607-4438-ae21-b49df832df5c\") " pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.586844 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b94aba5-26fe-4f14-9268-e8666aac57ec-config\") pod \"ovsdbserver-sb-0\" (UID: \"4b94aba5-26fe-4f14-9268-e8666aac57ec\") " pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.588612 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4b94aba5-26fe-4f14-9268-e8666aac57ec-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4b94aba5-26fe-4f14-9268-e8666aac57ec\") " pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.591186 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4b94aba5-26fe-4f14-9268-e8666aac57ec-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4b94aba5-26fe-4f14-9268-e8666aac57ec\") " pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.591753 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b94aba5-26fe-4f14-9268-e8666aac57ec-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4b94aba5-26fe-4f14-9268-e8666aac57ec\") " pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:39 crc 
kubenswrapper[4909]: I0202 11:58:39.594690 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b94aba5-26fe-4f14-9268-e8666aac57ec-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4b94aba5-26fe-4f14-9268-e8666aac57ec\") " pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.596031 4909 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.596083 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8d6c9d17-2880-49ad-b4e6-6fd38dabf489\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8d6c9d17-2880-49ad-b4e6-6fd38dabf489\") pod \"ovsdbserver-sb-0\" (UID: \"4b94aba5-26fe-4f14-9268-e8666aac57ec\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3f7407642613ccc4cc444c396f4494a891053b2e9c1ff6a3e43770eaf263f306/globalmount\"" pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.596237 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b94aba5-26fe-4f14-9268-e8666aac57ec-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4b94aba5-26fe-4f14-9268-e8666aac57ec\") " pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.609373 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9hqn\" (UniqueName: \"kubernetes.io/projected/4b94aba5-26fe-4f14-9268-e8666aac57ec-kube-api-access-w9hqn\") pod \"ovsdbserver-sb-0\" (UID: \"4b94aba5-26fe-4f14-9268-e8666aac57ec\") " pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.644938 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-8d6c9d17-2880-49ad-b4e6-6fd38dabf489\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8d6c9d17-2880-49ad-b4e6-6fd38dabf489\") pod \"ovsdbserver-sb-0\" (UID: \"4b94aba5-26fe-4f14-9268-e8666aac57ec\") " pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.686330 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/30354f86-0667-4f76-a9c7-0e54ab2c5d83-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"30354f86-0667-4f76-a9c7-0e54ab2c5d83\") " pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.686380 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/30354f86-0667-4f76-a9c7-0e54ab2c5d83-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"30354f86-0667-4f76-a9c7-0e54ab2c5d83\") " pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.686405 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2615f23e-9130-4801-af8f-a00a8b06b63f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2615f23e-9130-4801-af8f-a00a8b06b63f\") pod \"ovsdbserver-sb-1\" (UID: \"51928bd6-d607-4438-ae21-b49df832df5c\") " pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.686433 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c85aca97-8461-48fb-a4a1-981e77c72536\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c85aca97-8461-48fb-a4a1-981e77c72536\") pod \"ovsdbserver-sb-2\" (UID: \"30354f86-0667-4f76-a9c7-0e54ab2c5d83\") " pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.686449 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/51928bd6-d607-4438-ae21-b49df832df5c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"51928bd6-d607-4438-ae21-b49df832df5c\") " pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.686484 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce04e8dd-8db5-4467-b9dd-8f389fcf9820-catalog-content\") pod \"redhat-operators-qbtrr\" (UID: \"ce04e8dd-8db5-4467-b9dd-8f389fcf9820\") " pod="openshift-marketplace/redhat-operators-qbtrr" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.686522 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/51928bd6-d607-4438-ae21-b49df832df5c-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"51928bd6-d607-4438-ae21-b49df832df5c\") " pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.686539 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce04e8dd-8db5-4467-b9dd-8f389fcf9820-utilities\") pod \"redhat-operators-qbtrr\" (UID: \"ce04e8dd-8db5-4467-b9dd-8f389fcf9820\") " pod="openshift-marketplace/redhat-operators-qbtrr" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.686556 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/30354f86-0667-4f76-a9c7-0e54ab2c5d83-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"30354f86-0667-4f76-a9c7-0e54ab2c5d83\") " pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.686579 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/51928bd6-d607-4438-ae21-b49df832df5c-metrics-certs-tls-certs\") pod 
\"ovsdbserver-sb-1\" (UID: \"51928bd6-d607-4438-ae21-b49df832df5c\") " pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.686601 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51928bd6-d607-4438-ae21-b49df832df5c-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"51928bd6-d607-4438-ae21-b49df832df5c\") " pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.686619 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30354f86-0667-4f76-a9c7-0e54ab2c5d83-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"30354f86-0667-4f76-a9c7-0e54ab2c5d83\") " pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.686646 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30354f86-0667-4f76-a9c7-0e54ab2c5d83-config\") pod \"ovsdbserver-sb-2\" (UID: \"30354f86-0667-4f76-a9c7-0e54ab2c5d83\") " pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.686666 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30354f86-0667-4f76-a9c7-0e54ab2c5d83-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"30354f86-0667-4f76-a9c7-0e54ab2c5d83\") " pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.686688 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51928bd6-d607-4438-ae21-b49df832df5c-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"51928bd6-d607-4438-ae21-b49df832df5c\") " pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.686706 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-v9vqq\" (UniqueName: \"kubernetes.io/projected/ce04e8dd-8db5-4467-b9dd-8f389fcf9820-kube-api-access-v9vqq\") pod \"redhat-operators-qbtrr\" (UID: \"ce04e8dd-8db5-4467-b9dd-8f389fcf9820\") " pod="openshift-marketplace/redhat-operators-qbtrr" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.686727 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch9cs\" (UniqueName: \"kubernetes.io/projected/30354f86-0667-4f76-a9c7-0e54ab2c5d83-kube-api-access-ch9cs\") pod \"ovsdbserver-sb-2\" (UID: \"30354f86-0667-4f76-a9c7-0e54ab2c5d83\") " pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.686748 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51928bd6-d607-4438-ae21-b49df832df5c-config\") pod \"ovsdbserver-sb-1\" (UID: \"51928bd6-d607-4438-ae21-b49df832df5c\") " pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.686768 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxgn2\" (UniqueName: \"kubernetes.io/projected/51928bd6-d607-4438-ae21-b49df832df5c-kube-api-access-mxgn2\") pod \"ovsdbserver-sb-1\" (UID: \"51928bd6-d607-4438-ae21-b49df832df5c\") " pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.686800 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/30354f86-0667-4f76-a9c7-0e54ab2c5d83-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"30354f86-0667-4f76-a9c7-0e54ab2c5d83\") " pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.687167 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ce04e8dd-8db5-4467-b9dd-8f389fcf9820-catalog-content\") pod \"redhat-operators-qbtrr\" (UID: \"ce04e8dd-8db5-4467-b9dd-8f389fcf9820\") " pod="openshift-marketplace/redhat-operators-qbtrr" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.687195 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce04e8dd-8db5-4467-b9dd-8f389fcf9820-utilities\") pod \"redhat-operators-qbtrr\" (UID: \"ce04e8dd-8db5-4467-b9dd-8f389fcf9820\") " pod="openshift-marketplace/redhat-operators-qbtrr" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.688282 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51928bd6-d607-4438-ae21-b49df832df5c-config\") pod \"ovsdbserver-sb-1\" (UID: \"51928bd6-d607-4438-ae21-b49df832df5c\") " pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.688282 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/51928bd6-d607-4438-ae21-b49df832df5c-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"51928bd6-d607-4438-ae21-b49df832df5c\") " pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.688581 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/30354f86-0667-4f76-a9c7-0e54ab2c5d83-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"30354f86-0667-4f76-a9c7-0e54ab2c5d83\") " pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.688752 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/51928bd6-d607-4438-ae21-b49df832df5c-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"51928bd6-d607-4438-ae21-b49df832df5c\") " pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:39 crc 
kubenswrapper[4909]: I0202 11:58:39.689800 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30354f86-0667-4f76-a9c7-0e54ab2c5d83-config\") pod \"ovsdbserver-sb-2\" (UID: \"30354f86-0667-4f76-a9c7-0e54ab2c5d83\") " pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.692352 4909 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.692389 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2615f23e-9130-4801-af8f-a00a8b06b63f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2615f23e-9130-4801-af8f-a00a8b06b63f\") pod \"ovsdbserver-sb-1\" (UID: \"51928bd6-d607-4438-ae21-b49df832df5c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6f91ea30929dba831c8973f9626352517aab40248ffd85b9f68d02b7da11d5da/globalmount\"" pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.692780 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30354f86-0667-4f76-a9c7-0e54ab2c5d83-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"30354f86-0667-4f76-a9c7-0e54ab2c5d83\") " pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.692862 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/51928bd6-d607-4438-ae21-b49df832df5c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"51928bd6-d607-4438-ae21-b49df832df5c\") " pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.692908 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/30354f86-0667-4f76-a9c7-0e54ab2c5d83-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"30354f86-0667-4f76-a9c7-0e54ab2c5d83\") " pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.695663 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/51928bd6-d607-4438-ae21-b49df832df5c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"51928bd6-d607-4438-ae21-b49df832df5c\") " pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.696044 4909 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.696087 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c85aca97-8461-48fb-a4a1-981e77c72536\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c85aca97-8461-48fb-a4a1-981e77c72536\") pod \"ovsdbserver-sb-2\" (UID: \"30354f86-0667-4f76-a9c7-0e54ab2c5d83\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1abff88394ed85a0694a5aa7f0651071650610a6b8b8fc2741ecf8ff3a23ac14/globalmount\"" pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.699340 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/30354f86-0667-4f76-a9c7-0e54ab2c5d83-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"30354f86-0667-4f76-a9c7-0e54ab2c5d83\") " pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.703756 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51928bd6-d607-4438-ae21-b49df832df5c-combined-ca-bundle\") pod 
\"ovsdbserver-sb-1\" (UID: \"51928bd6-d607-4438-ae21-b49df832df5c\") " pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.707635 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch9cs\" (UniqueName: \"kubernetes.io/projected/30354f86-0667-4f76-a9c7-0e54ab2c5d83-kube-api-access-ch9cs\") pod \"ovsdbserver-sb-2\" (UID: \"30354f86-0667-4f76-a9c7-0e54ab2c5d83\") " pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.713929 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9vqq\" (UniqueName: \"kubernetes.io/projected/ce04e8dd-8db5-4467-b9dd-8f389fcf9820-kube-api-access-v9vqq\") pod \"redhat-operators-qbtrr\" (UID: \"ce04e8dd-8db5-4467-b9dd-8f389fcf9820\") " pod="openshift-marketplace/redhat-operators-qbtrr" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.715950 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxgn2\" (UniqueName: \"kubernetes.io/projected/51928bd6-d607-4438-ae21-b49df832df5c-kube-api-access-mxgn2\") pod \"ovsdbserver-sb-1\" (UID: \"51928bd6-d607-4438-ae21-b49df832df5c\") " pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.738223 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.748221 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c85aca97-8461-48fb-a4a1-981e77c72536\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c85aca97-8461-48fb-a4a1-981e77c72536\") pod \"ovsdbserver-sb-2\" (UID: \"30354f86-0667-4f76-a9c7-0e54ab2c5d83\") " pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.767573 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2615f23e-9130-4801-af8f-a00a8b06b63f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2615f23e-9130-4801-af8f-a00a8b06b63f\") pod \"ovsdbserver-sb-1\" (UID: \"51928bd6-d607-4438-ae21-b49df832df5c\") " pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.771821 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.800953 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.802409 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.808376 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qbtrr" Feb 02 11:58:39 crc kubenswrapper[4909]: I0202 11:58:39.961916 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 02 11:58:40 crc kubenswrapper[4909]: I0202 11:58:40.173225 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 02 11:58:40 crc kubenswrapper[4909]: I0202 11:58:40.217493 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"0067f53b-b646-4cfb-82c6-71cc14a45dcb","Type":"ContainerStarted","Data":"744da9584e5c1dacc4eb9b0a4041d324eca9a9ae284fd0688d1e229fd1693c38"} Feb 02 11:58:40 crc kubenswrapper[4909]: I0202 11:58:40.231032 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"36b9c7ae-e055-434d-b016-2dcca5daf712","Type":"ContainerStarted","Data":"020417bed22d64af0d1400304ef09d07b9f004ae39b3488468aaf7437049ce69"} Feb 02 11:58:40 crc kubenswrapper[4909]: I0202 11:58:40.231086 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"36b9c7ae-e055-434d-b016-2dcca5daf712","Type":"ContainerStarted","Data":"0d64cb2ca16d514fd33d0a4bd448334cb8c8c2bc59062ad48cb037089ee8bbe3"} Feb 02 11:58:40 crc kubenswrapper[4909]: I0202 11:58:40.231099 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"36b9c7ae-e055-434d-b016-2dcca5daf712","Type":"ContainerStarted","Data":"f35faf6f454d9922603bcd0bf4c13866fdbb534e8d7676d0da155b128eba9784"} Feb 02 11:58:40 crc kubenswrapper[4909]: I0202 11:58:40.242023 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5aa0410b-2229-496d-a4df-7769afab71c3","Type":"ContainerStarted","Data":"876a7ec91ead1d49bb878a91a5b4b022604279ccd5606a16e502160fdcc54640"} Feb 02 11:58:40 crc kubenswrapper[4909]: I0202 11:58:40.264159 4909 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=4.264140741 podStartE2EDuration="4.264140741s" podCreationTimestamp="2026-02-02 11:58:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:58:40.263548634 +0000 UTC m=+5246.009649369" watchObservedRunningTime="2026-02-02 11:58:40.264140741 +0000 UTC m=+5246.010241476" Feb 02 11:58:40 crc kubenswrapper[4909]: W0202 11:58:40.497714 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30354f86_0667_4f76_a9c7_0e54ab2c5d83.slice/crio-62d3b8da8bcfd4952565deeae447990673b8d2f4d04e0b0a733df5aae4220247 WatchSource:0}: Error finding container 62d3b8da8bcfd4952565deeae447990673b8d2f4d04e0b0a733df5aae4220247: Status 404 returned error can't find the container with id 62d3b8da8bcfd4952565deeae447990673b8d2f4d04e0b0a733df5aae4220247 Feb 02 11:58:40 crc kubenswrapper[4909]: I0202 11:58:40.498079 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 02 11:58:40 crc kubenswrapper[4909]: I0202 11:58:40.639416 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 02 11:58:40 crc kubenswrapper[4909]: I0202 11:58:40.649655 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qbtrr"] Feb 02 11:58:41 crc kubenswrapper[4909]: I0202 11:58:41.103703 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:41 crc kubenswrapper[4909]: I0202 11:58:41.251982 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5aa0410b-2229-496d-a4df-7769afab71c3","Type":"ContainerStarted","Data":"0d5588ad85d67a5894fb823fe55e8244f3efaea3c8bca2c6a89def9f137c15da"} Feb 02 11:58:41 crc kubenswrapper[4909]: I0202 
11:58:41.252038 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5aa0410b-2229-496d-a4df-7769afab71c3","Type":"ContainerStarted","Data":"4d10bc2e302e9042bd7725df8eda5f82cac06141fa7f4fca21bf8f345b7f4ee1"} Feb 02 11:58:41 crc kubenswrapper[4909]: I0202 11:58:41.254750 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"0067f53b-b646-4cfb-82c6-71cc14a45dcb","Type":"ContainerStarted","Data":"7131b8a28b551a10414207556de09c82899a3df156c8d1a1c9ce8b9023beabce"} Feb 02 11:58:41 crc kubenswrapper[4909]: I0202 11:58:41.254886 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"0067f53b-b646-4cfb-82c6-71cc14a45dcb","Type":"ContainerStarted","Data":"f999acd9b7ff5c32143c389037d993f86b5514285447fb51435ecc35b048e124"} Feb 02 11:58:41 crc kubenswrapper[4909]: I0202 11:58:41.256979 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"30354f86-0667-4f76-a9c7-0e54ab2c5d83","Type":"ContainerStarted","Data":"d6bead9de7bfa7025d524ed1e8de79e37dc76fde618d9c6d1abe9cbbb929650e"} Feb 02 11:58:41 crc kubenswrapper[4909]: I0202 11:58:41.257021 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"30354f86-0667-4f76-a9c7-0e54ab2c5d83","Type":"ContainerStarted","Data":"744b3e5c9b59a1362c68742e5d668fdd6903a9a404acd4e7e78f672f865217ea"} Feb 02 11:58:41 crc kubenswrapper[4909]: I0202 11:58:41.257036 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"30354f86-0667-4f76-a9c7-0e54ab2c5d83","Type":"ContainerStarted","Data":"62d3b8da8bcfd4952565deeae447990673b8d2f4d04e0b0a733df5aae4220247"} Feb 02 11:58:41 crc kubenswrapper[4909]: I0202 11:58:41.258743 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" 
event={"ID":"51928bd6-d607-4438-ae21-b49df832df5c","Type":"ContainerStarted","Data":"449fb21eebde317ebbf327bbb1dea82c871e00e1372bee52e32148131896853e"} Feb 02 11:58:41 crc kubenswrapper[4909]: I0202 11:58:41.258770 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"51928bd6-d607-4438-ae21-b49df832df5c","Type":"ContainerStarted","Data":"842f78ba63eced20744da3c8729e41481a12327678d5b5f186761ce41ec21444"} Feb 02 11:58:41 crc kubenswrapper[4909]: I0202 11:58:41.258780 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"51928bd6-d607-4438-ae21-b49df832df5c","Type":"ContainerStarted","Data":"7185c38e3093e3b112c4765d9aff9ccf0ab228fdf2a4d538ea02e04fa1338d06"} Feb 02 11:58:41 crc kubenswrapper[4909]: I0202 11:58:41.261598 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4b94aba5-26fe-4f14-9268-e8666aac57ec","Type":"ContainerStarted","Data":"00a9eb497c30db3911eda74f8a7b65d1cd38cb1f55414e54892298b93515d2b4"} Feb 02 11:58:41 crc kubenswrapper[4909]: I0202 11:58:41.261623 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4b94aba5-26fe-4f14-9268-e8666aac57ec","Type":"ContainerStarted","Data":"e226c38d431b5b74b31d92ab16a75a88f2d30655731c42405d964bf00a6193a7"} Feb 02 11:58:41 crc kubenswrapper[4909]: I0202 11:58:41.261634 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4b94aba5-26fe-4f14-9268-e8666aac57ec","Type":"ContainerStarted","Data":"ed71b4de025c42fe75d248fc8a9b3a4e810e85f34fd1e774b04c0069fe062a67"} Feb 02 11:58:41 crc kubenswrapper[4909]: I0202 11:58:41.263586 4909 generic.go:334] "Generic (PLEG): container finished" podID="ce04e8dd-8db5-4467-b9dd-8f389fcf9820" containerID="520080219ddb1421f91ff9ec598d763a1a7212879c49e0c58b11c51a63ad8403" exitCode=0 Feb 02 11:58:41 crc kubenswrapper[4909]: I0202 11:58:41.264472 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbtrr" event={"ID":"ce04e8dd-8db5-4467-b9dd-8f389fcf9820","Type":"ContainerDied","Data":"520080219ddb1421f91ff9ec598d763a1a7212879c49e0c58b11c51a63ad8403"} Feb 02 11:58:41 crc kubenswrapper[4909]: I0202 11:58:41.264492 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbtrr" event={"ID":"ce04e8dd-8db5-4467-b9dd-8f389fcf9820","Type":"ContainerStarted","Data":"bb9dc2fe3447234e0b93e8ee89018d7964d6e5c48b73517fef3c55622660a1f9"} Feb 02 11:58:41 crc kubenswrapper[4909]: I0202 11:58:41.265270 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 11:58:41 crc kubenswrapper[4909]: I0202 11:58:41.280671 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=5.280650673 podStartE2EDuration="5.280650673s" podCreationTimestamp="2026-02-02 11:58:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:58:41.275402447 +0000 UTC m=+5247.021503182" watchObservedRunningTime="2026-02-02 11:58:41.280650673 +0000 UTC m=+5247.026751408" Feb 02 11:58:41 crc kubenswrapper[4909]: I0202 11:58:41.338578 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.338562745 podStartE2EDuration="3.338562745s" podCreationTimestamp="2026-02-02 11:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:58:41.303146129 +0000 UTC m=+5247.049246854" watchObservedRunningTime="2026-02-02 11:58:41.338562745 +0000 UTC m=+5247.084663470" Feb 02 11:58:41 crc kubenswrapper[4909]: I0202 11:58:41.374913 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovsdbserver-nb-2" podStartSLOduration=5.374885346 podStartE2EDuration="5.374885346s" podCreationTimestamp="2026-02-02 11:58:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:58:41.34123991 +0000 UTC m=+5247.087340645" watchObservedRunningTime="2026-02-02 11:58:41.374885346 +0000 UTC m=+5247.120986081" Feb 02 11:58:41 crc kubenswrapper[4909]: I0202 11:58:41.375047 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:41 crc kubenswrapper[4909]: I0202 11:58:41.375494 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.375488303 podStartE2EDuration="3.375488303s" podCreationTimestamp="2026-02-02 11:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:58:41.371633346 +0000 UTC m=+5247.117734071" watchObservedRunningTime="2026-02-02 11:58:41.375488303 +0000 UTC m=+5247.121589038" Feb 02 11:58:41 crc kubenswrapper[4909]: I0202 11:58:41.430294 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.430274749 podStartE2EDuration="3.430274749s" podCreationTimestamp="2026-02-02 11:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:58:41.428358395 +0000 UTC m=+5247.174459130" watchObservedRunningTime="2026-02-02 11:58:41.430274749 +0000 UTC m=+5247.176375484" Feb 02 11:58:42 crc kubenswrapper[4909]: I0202 11:58:42.738735 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:42 crc kubenswrapper[4909]: I0202 11:58:42.772280 4909 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:42 crc kubenswrapper[4909]: I0202 11:58:42.804305 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:43 crc kubenswrapper[4909]: I0202 11:58:43.083569 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:43 crc kubenswrapper[4909]: I0202 11:58:43.102425 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:43 crc kubenswrapper[4909]: I0202 11:58:43.280287 4909 generic.go:334] "Generic (PLEG): container finished" podID="ce04e8dd-8db5-4467-b9dd-8f389fcf9820" containerID="73c2124f5468a82dd2ccdb9d50d5261fd4a6c36086b274b4de5514b364f26ce2" exitCode=0 Feb 02 11:58:43 crc kubenswrapper[4909]: I0202 11:58:43.280795 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbtrr" event={"ID":"ce04e8dd-8db5-4467-b9dd-8f389fcf9820","Type":"ContainerDied","Data":"73c2124f5468a82dd2ccdb9d50d5261fd4a6c36086b274b4de5514b364f26ce2"} Feb 02 11:58:43 crc kubenswrapper[4909]: I0202 11:58:43.374669 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:44 crc kubenswrapper[4909]: I0202 11:58:44.084334 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:44 crc kubenswrapper[4909]: I0202 11:58:44.125261 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:44 crc kubenswrapper[4909]: I0202 11:58:44.148791 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:44 crc kubenswrapper[4909]: I0202 11:58:44.290947 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbtrr" 
event={"ID":"ce04e8dd-8db5-4467-b9dd-8f389fcf9820","Type":"ContainerStarted","Data":"784b6b973617119ffb34a703a7738e00093570a653d9279f946420c70862a872"} Feb 02 11:58:44 crc kubenswrapper[4909]: I0202 11:58:44.314139 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qbtrr" podStartSLOduration=2.921207158 podStartE2EDuration="5.314121124s" podCreationTimestamp="2026-02-02 11:58:39 +0000 UTC" firstStartedPulling="2026-02-02 11:58:41.265074219 +0000 UTC m=+5247.011174954" lastFinishedPulling="2026-02-02 11:58:43.657988185 +0000 UTC m=+5249.404088920" observedRunningTime="2026-02-02 11:58:44.307384136 +0000 UTC m=+5250.053484881" watchObservedRunningTime="2026-02-02 11:58:44.314121124 +0000 UTC m=+5250.060221859" Feb 02 11:58:44 crc kubenswrapper[4909]: I0202 11:58:44.332145 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Feb 02 11:58:44 crc kubenswrapper[4909]: I0202 11:58:44.417010 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:44 crc kubenswrapper[4909]: I0202 11:58:44.574409 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fbfbc95b5-2lr7s"] Feb 02 11:58:44 crc kubenswrapper[4909]: I0202 11:58:44.576370 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fbfbc95b5-2lr7s" Feb 02 11:58:44 crc kubenswrapper[4909]: I0202 11:58:44.578855 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 02 11:58:44 crc kubenswrapper[4909]: I0202 11:58:44.583663 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fbfbc95b5-2lr7s"] Feb 02 11:58:44 crc kubenswrapper[4909]: I0202 11:58:44.695743 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjpq8\" (UniqueName: \"kubernetes.io/projected/2dc1a9f2-07f0-440f-a4df-7cf06a153f47-kube-api-access-cjpq8\") pod \"dnsmasq-dns-7fbfbc95b5-2lr7s\" (UID: \"2dc1a9f2-07f0-440f-a4df-7cf06a153f47\") " pod="openstack/dnsmasq-dns-7fbfbc95b5-2lr7s" Feb 02 11:58:44 crc kubenswrapper[4909]: I0202 11:58:44.695920 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2dc1a9f2-07f0-440f-a4df-7cf06a153f47-ovsdbserver-nb\") pod \"dnsmasq-dns-7fbfbc95b5-2lr7s\" (UID: \"2dc1a9f2-07f0-440f-a4df-7cf06a153f47\") " pod="openstack/dnsmasq-dns-7fbfbc95b5-2lr7s" Feb 02 11:58:44 crc kubenswrapper[4909]: I0202 11:58:44.696008 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2dc1a9f2-07f0-440f-a4df-7cf06a153f47-dns-svc\") pod \"dnsmasq-dns-7fbfbc95b5-2lr7s\" (UID: \"2dc1a9f2-07f0-440f-a4df-7cf06a153f47\") " pod="openstack/dnsmasq-dns-7fbfbc95b5-2lr7s" Feb 02 11:58:44 crc kubenswrapper[4909]: I0202 11:58:44.696064 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dc1a9f2-07f0-440f-a4df-7cf06a153f47-config\") pod \"dnsmasq-dns-7fbfbc95b5-2lr7s\" (UID: \"2dc1a9f2-07f0-440f-a4df-7cf06a153f47\") " 
pod="openstack/dnsmasq-dns-7fbfbc95b5-2lr7s" Feb 02 11:58:44 crc kubenswrapper[4909]: I0202 11:58:44.738550 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:44 crc kubenswrapper[4909]: I0202 11:58:44.772358 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:44 crc kubenswrapper[4909]: I0202 11:58:44.797853 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjpq8\" (UniqueName: \"kubernetes.io/projected/2dc1a9f2-07f0-440f-a4df-7cf06a153f47-kube-api-access-cjpq8\") pod \"dnsmasq-dns-7fbfbc95b5-2lr7s\" (UID: \"2dc1a9f2-07f0-440f-a4df-7cf06a153f47\") " pod="openstack/dnsmasq-dns-7fbfbc95b5-2lr7s" Feb 02 11:58:44 crc kubenswrapper[4909]: I0202 11:58:44.797979 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2dc1a9f2-07f0-440f-a4df-7cf06a153f47-ovsdbserver-nb\") pod \"dnsmasq-dns-7fbfbc95b5-2lr7s\" (UID: \"2dc1a9f2-07f0-440f-a4df-7cf06a153f47\") " pod="openstack/dnsmasq-dns-7fbfbc95b5-2lr7s" Feb 02 11:58:44 crc kubenswrapper[4909]: I0202 11:58:44.798057 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2dc1a9f2-07f0-440f-a4df-7cf06a153f47-dns-svc\") pod \"dnsmasq-dns-7fbfbc95b5-2lr7s\" (UID: \"2dc1a9f2-07f0-440f-a4df-7cf06a153f47\") " pod="openstack/dnsmasq-dns-7fbfbc95b5-2lr7s" Feb 02 11:58:44 crc kubenswrapper[4909]: I0202 11:58:44.798110 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dc1a9f2-07f0-440f-a4df-7cf06a153f47-config\") pod \"dnsmasq-dns-7fbfbc95b5-2lr7s\" (UID: \"2dc1a9f2-07f0-440f-a4df-7cf06a153f47\") " pod="openstack/dnsmasq-dns-7fbfbc95b5-2lr7s" Feb 02 11:58:44 crc kubenswrapper[4909]: I0202 11:58:44.799102 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2dc1a9f2-07f0-440f-a4df-7cf06a153f47-dns-svc\") pod \"dnsmasq-dns-7fbfbc95b5-2lr7s\" (UID: \"2dc1a9f2-07f0-440f-a4df-7cf06a153f47\") " pod="openstack/dnsmasq-dns-7fbfbc95b5-2lr7s" Feb 02 11:58:44 crc kubenswrapper[4909]: I0202 11:58:44.799104 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2dc1a9f2-07f0-440f-a4df-7cf06a153f47-ovsdbserver-nb\") pod \"dnsmasq-dns-7fbfbc95b5-2lr7s\" (UID: \"2dc1a9f2-07f0-440f-a4df-7cf06a153f47\") " pod="openstack/dnsmasq-dns-7fbfbc95b5-2lr7s" Feb 02 11:58:44 crc kubenswrapper[4909]: I0202 11:58:44.799307 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dc1a9f2-07f0-440f-a4df-7cf06a153f47-config\") pod \"dnsmasq-dns-7fbfbc95b5-2lr7s\" (UID: \"2dc1a9f2-07f0-440f-a4df-7cf06a153f47\") " pod="openstack/dnsmasq-dns-7fbfbc95b5-2lr7s" Feb 02 11:58:44 crc kubenswrapper[4909]: I0202 11:58:44.803707 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:44 crc kubenswrapper[4909]: I0202 11:58:44.818709 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjpq8\" (UniqueName: \"kubernetes.io/projected/2dc1a9f2-07f0-440f-a4df-7cf06a153f47-kube-api-access-cjpq8\") pod \"dnsmasq-dns-7fbfbc95b5-2lr7s\" (UID: \"2dc1a9f2-07f0-440f-a4df-7cf06a153f47\") " pod="openstack/dnsmasq-dns-7fbfbc95b5-2lr7s" Feb 02 11:58:44 crc kubenswrapper[4909]: I0202 11:58:44.905129 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fbfbc95b5-2lr7s" Feb 02 11:58:45 crc kubenswrapper[4909]: I0202 11:58:45.023675 4909 scope.go:117] "RemoveContainer" containerID="38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83" Feb 02 11:58:45 crc kubenswrapper[4909]: E0202 11:58:45.024412 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:58:45 crc kubenswrapper[4909]: I0202 11:58:45.339514 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fbfbc95b5-2lr7s"] Feb 02 11:58:45 crc kubenswrapper[4909]: I0202 11:58:45.356304 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 02 11:58:45 crc kubenswrapper[4909]: I0202 11:58:45.357643 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Feb 02 11:58:45 crc kubenswrapper[4909]: I0202 11:58:45.783606 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:45 crc kubenswrapper[4909]: I0202 11:58:45.826531 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:45 crc kubenswrapper[4909]: I0202 11:58:45.838179 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 02 11:58:45 crc kubenswrapper[4909]: I0202 11:58:45.852853 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:45 crc kubenswrapper[4909]: I0202 11:58:45.878168 4909 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Feb 02 11:58:45 crc kubenswrapper[4909]: I0202 11:58:45.902162 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Feb 02 11:58:46 crc kubenswrapper[4909]: I0202 11:58:46.109880 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fbfbc95b5-2lr7s"] Feb 02 11:58:46 crc kubenswrapper[4909]: I0202 11:58:46.144181 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f48d44b77-lkn6v"] Feb 02 11:58:46 crc kubenswrapper[4909]: I0202 11:58:46.145684 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f48d44b77-lkn6v" Feb 02 11:58:46 crc kubenswrapper[4909]: I0202 11:58:46.148433 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 02 11:58:46 crc kubenswrapper[4909]: I0202 11:58:46.162144 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f48d44b77-lkn6v"] Feb 02 11:58:46 crc kubenswrapper[4909]: I0202 11:58:46.246839 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3d6e381-a830-4c0f-9f33-294d0c937f1f-ovsdbserver-nb\") pod \"dnsmasq-dns-7f48d44b77-lkn6v\" (UID: \"c3d6e381-a830-4c0f-9f33-294d0c937f1f\") " pod="openstack/dnsmasq-dns-7f48d44b77-lkn6v" Feb 02 11:58:46 crc kubenswrapper[4909]: I0202 11:58:46.246920 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3d6e381-a830-4c0f-9f33-294d0c937f1f-dns-svc\") pod \"dnsmasq-dns-7f48d44b77-lkn6v\" (UID: \"c3d6e381-a830-4c0f-9f33-294d0c937f1f\") " pod="openstack/dnsmasq-dns-7f48d44b77-lkn6v" Feb 02 11:58:46 crc kubenswrapper[4909]: I0202 11:58:46.247206 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3d6e381-a830-4c0f-9f33-294d0c937f1f-config\") pod \"dnsmasq-dns-7f48d44b77-lkn6v\" (UID: \"c3d6e381-a830-4c0f-9f33-294d0c937f1f\") " pod="openstack/dnsmasq-dns-7f48d44b77-lkn6v" Feb 02 11:58:46 crc kubenswrapper[4909]: I0202 11:58:46.247270 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45kfb\" (UniqueName: \"kubernetes.io/projected/c3d6e381-a830-4c0f-9f33-294d0c937f1f-kube-api-access-45kfb\") pod \"dnsmasq-dns-7f48d44b77-lkn6v\" (UID: \"c3d6e381-a830-4c0f-9f33-294d0c937f1f\") " pod="openstack/dnsmasq-dns-7f48d44b77-lkn6v" Feb 02 11:58:46 crc kubenswrapper[4909]: I0202 11:58:46.247383 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3d6e381-a830-4c0f-9f33-294d0c937f1f-ovsdbserver-sb\") pod \"dnsmasq-dns-7f48d44b77-lkn6v\" (UID: \"c3d6e381-a830-4c0f-9f33-294d0c937f1f\") " pod="openstack/dnsmasq-dns-7f48d44b77-lkn6v" Feb 02 11:58:46 crc kubenswrapper[4909]: I0202 11:58:46.307116 4909 generic.go:334] "Generic (PLEG): container finished" podID="2dc1a9f2-07f0-440f-a4df-7cf06a153f47" containerID="2b840b46e9ce42de5095cbe1f6018166ea9d1c810d3a9d837c27454c42dd10d8" exitCode=0 Feb 02 11:58:46 crc kubenswrapper[4909]: I0202 11:58:46.307172 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fbfbc95b5-2lr7s" event={"ID":"2dc1a9f2-07f0-440f-a4df-7cf06a153f47","Type":"ContainerDied","Data":"2b840b46e9ce42de5095cbe1f6018166ea9d1c810d3a9d837c27454c42dd10d8"} Feb 02 11:58:46 crc kubenswrapper[4909]: I0202 11:58:46.307213 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fbfbc95b5-2lr7s" 
event={"ID":"2dc1a9f2-07f0-440f-a4df-7cf06a153f47","Type":"ContainerStarted","Data":"3fb92ae3a6b7987131e370f46aab44cf747f71f9f4027825f1198c511252b616"} Feb 02 11:58:46 crc kubenswrapper[4909]: I0202 11:58:46.350001 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3d6e381-a830-4c0f-9f33-294d0c937f1f-dns-svc\") pod \"dnsmasq-dns-7f48d44b77-lkn6v\" (UID: \"c3d6e381-a830-4c0f-9f33-294d0c937f1f\") " pod="openstack/dnsmasq-dns-7f48d44b77-lkn6v" Feb 02 11:58:46 crc kubenswrapper[4909]: I0202 11:58:46.350246 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3d6e381-a830-4c0f-9f33-294d0c937f1f-config\") pod \"dnsmasq-dns-7f48d44b77-lkn6v\" (UID: \"c3d6e381-a830-4c0f-9f33-294d0c937f1f\") " pod="openstack/dnsmasq-dns-7f48d44b77-lkn6v" Feb 02 11:58:46 crc kubenswrapper[4909]: I0202 11:58:46.350302 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45kfb\" (UniqueName: \"kubernetes.io/projected/c3d6e381-a830-4c0f-9f33-294d0c937f1f-kube-api-access-45kfb\") pod \"dnsmasq-dns-7f48d44b77-lkn6v\" (UID: \"c3d6e381-a830-4c0f-9f33-294d0c937f1f\") " pod="openstack/dnsmasq-dns-7f48d44b77-lkn6v" Feb 02 11:58:46 crc kubenswrapper[4909]: I0202 11:58:46.350334 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3d6e381-a830-4c0f-9f33-294d0c937f1f-ovsdbserver-sb\") pod \"dnsmasq-dns-7f48d44b77-lkn6v\" (UID: \"c3d6e381-a830-4c0f-9f33-294d0c937f1f\") " pod="openstack/dnsmasq-dns-7f48d44b77-lkn6v" Feb 02 11:58:46 crc kubenswrapper[4909]: I0202 11:58:46.350441 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3d6e381-a830-4c0f-9f33-294d0c937f1f-ovsdbserver-nb\") pod \"dnsmasq-dns-7f48d44b77-lkn6v\" (UID: 
\"c3d6e381-a830-4c0f-9f33-294d0c937f1f\") " pod="openstack/dnsmasq-dns-7f48d44b77-lkn6v" Feb 02 11:58:46 crc kubenswrapper[4909]: I0202 11:58:46.350520 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3d6e381-a830-4c0f-9f33-294d0c937f1f-dns-svc\") pod \"dnsmasq-dns-7f48d44b77-lkn6v\" (UID: \"c3d6e381-a830-4c0f-9f33-294d0c937f1f\") " pod="openstack/dnsmasq-dns-7f48d44b77-lkn6v" Feb 02 11:58:46 crc kubenswrapper[4909]: I0202 11:58:46.351493 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3d6e381-a830-4c0f-9f33-294d0c937f1f-ovsdbserver-sb\") pod \"dnsmasq-dns-7f48d44b77-lkn6v\" (UID: \"c3d6e381-a830-4c0f-9f33-294d0c937f1f\") " pod="openstack/dnsmasq-dns-7f48d44b77-lkn6v" Feb 02 11:58:46 crc kubenswrapper[4909]: I0202 11:58:46.351686 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3d6e381-a830-4c0f-9f33-294d0c937f1f-config\") pod \"dnsmasq-dns-7f48d44b77-lkn6v\" (UID: \"c3d6e381-a830-4c0f-9f33-294d0c937f1f\") " pod="openstack/dnsmasq-dns-7f48d44b77-lkn6v" Feb 02 11:58:46 crc kubenswrapper[4909]: I0202 11:58:46.352239 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3d6e381-a830-4c0f-9f33-294d0c937f1f-ovsdbserver-nb\") pod \"dnsmasq-dns-7f48d44b77-lkn6v\" (UID: \"c3d6e381-a830-4c0f-9f33-294d0c937f1f\") " pod="openstack/dnsmasq-dns-7f48d44b77-lkn6v" Feb 02 11:58:46 crc kubenswrapper[4909]: I0202 11:58:46.373155 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45kfb\" (UniqueName: \"kubernetes.io/projected/c3d6e381-a830-4c0f-9f33-294d0c937f1f-kube-api-access-45kfb\") pod \"dnsmasq-dns-7f48d44b77-lkn6v\" (UID: \"c3d6e381-a830-4c0f-9f33-294d0c937f1f\") " pod="openstack/dnsmasq-dns-7f48d44b77-lkn6v" Feb 02 11:58:46 crc 
kubenswrapper[4909]: I0202 11:58:46.469466 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f48d44b77-lkn6v" Feb 02 11:58:46 crc kubenswrapper[4909]: I0202 11:58:46.909041 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f48d44b77-lkn6v"] Feb 02 11:58:46 crc kubenswrapper[4909]: W0202 11:58:46.917426 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3d6e381_a830_4c0f_9f33_294d0c937f1f.slice/crio-9640480efce321725fe29a36fed2edf866076c5a2dc587212a828965929fe783 WatchSource:0}: Error finding container 9640480efce321725fe29a36fed2edf866076c5a2dc587212a828965929fe783: Status 404 returned error can't find the container with id 9640480efce321725fe29a36fed2edf866076c5a2dc587212a828965929fe783 Feb 02 11:58:47 crc kubenswrapper[4909]: I0202 11:58:47.315330 4909 generic.go:334] "Generic (PLEG): container finished" podID="c3d6e381-a830-4c0f-9f33-294d0c937f1f" containerID="8d62c73103866c88f5ec23e9d55f9e2cd720095a8ddeab7ae27794fbb420791e" exitCode=0 Feb 02 11:58:47 crc kubenswrapper[4909]: I0202 11:58:47.315417 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f48d44b77-lkn6v" event={"ID":"c3d6e381-a830-4c0f-9f33-294d0c937f1f","Type":"ContainerDied","Data":"8d62c73103866c88f5ec23e9d55f9e2cd720095a8ddeab7ae27794fbb420791e"} Feb 02 11:58:47 crc kubenswrapper[4909]: I0202 11:58:47.315459 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f48d44b77-lkn6v" event={"ID":"c3d6e381-a830-4c0f-9f33-294d0c937f1f","Type":"ContainerStarted","Data":"9640480efce321725fe29a36fed2edf866076c5a2dc587212a828965929fe783"} Feb 02 11:58:47 crc kubenswrapper[4909]: I0202 11:58:47.321259 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fbfbc95b5-2lr7s" 
event={"ID":"2dc1a9f2-07f0-440f-a4df-7cf06a153f47","Type":"ContainerStarted","Data":"3631230233db3448baedabd92e36be8bda64f62b7eab5dd97a70a4823cfc96ab"} Feb 02 11:58:47 crc kubenswrapper[4909]: I0202 11:58:47.321380 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fbfbc95b5-2lr7s" podUID="2dc1a9f2-07f0-440f-a4df-7cf06a153f47" containerName="dnsmasq-dns" containerID="cri-o://3631230233db3448baedabd92e36be8bda64f62b7eab5dd97a70a4823cfc96ab" gracePeriod=10 Feb 02 11:58:47 crc kubenswrapper[4909]: I0202 11:58:47.321442 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fbfbc95b5-2lr7s" Feb 02 11:58:47 crc kubenswrapper[4909]: I0202 11:58:47.351501 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fbfbc95b5-2lr7s" podStartSLOduration=3.351482192 podStartE2EDuration="3.351482192s" podCreationTimestamp="2026-02-02 11:58:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:58:47.350232028 +0000 UTC m=+5253.096332763" watchObservedRunningTime="2026-02-02 11:58:47.351482192 +0000 UTC m=+5253.097582927" Feb 02 11:58:47 crc kubenswrapper[4909]: I0202 11:58:47.761617 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fbfbc95b5-2lr7s" Feb 02 11:58:47 crc kubenswrapper[4909]: I0202 11:58:47.877557 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dc1a9f2-07f0-440f-a4df-7cf06a153f47-config\") pod \"2dc1a9f2-07f0-440f-a4df-7cf06a153f47\" (UID: \"2dc1a9f2-07f0-440f-a4df-7cf06a153f47\") " Feb 02 11:58:47 crc kubenswrapper[4909]: I0202 11:58:47.877709 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2dc1a9f2-07f0-440f-a4df-7cf06a153f47-ovsdbserver-nb\") pod \"2dc1a9f2-07f0-440f-a4df-7cf06a153f47\" (UID: \"2dc1a9f2-07f0-440f-a4df-7cf06a153f47\") " Feb 02 11:58:47 crc kubenswrapper[4909]: I0202 11:58:47.877841 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjpq8\" (UniqueName: \"kubernetes.io/projected/2dc1a9f2-07f0-440f-a4df-7cf06a153f47-kube-api-access-cjpq8\") pod \"2dc1a9f2-07f0-440f-a4df-7cf06a153f47\" (UID: \"2dc1a9f2-07f0-440f-a4df-7cf06a153f47\") " Feb 02 11:58:47 crc kubenswrapper[4909]: I0202 11:58:47.877862 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2dc1a9f2-07f0-440f-a4df-7cf06a153f47-dns-svc\") pod \"2dc1a9f2-07f0-440f-a4df-7cf06a153f47\" (UID: \"2dc1a9f2-07f0-440f-a4df-7cf06a153f47\") " Feb 02 11:58:47 crc kubenswrapper[4909]: I0202 11:58:47.883413 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dc1a9f2-07f0-440f-a4df-7cf06a153f47-kube-api-access-cjpq8" (OuterVolumeSpecName: "kube-api-access-cjpq8") pod "2dc1a9f2-07f0-440f-a4df-7cf06a153f47" (UID: "2dc1a9f2-07f0-440f-a4df-7cf06a153f47"). InnerVolumeSpecName "kube-api-access-cjpq8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:58:47 crc kubenswrapper[4909]: I0202 11:58:47.917177 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dc1a9f2-07f0-440f-a4df-7cf06a153f47-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2dc1a9f2-07f0-440f-a4df-7cf06a153f47" (UID: "2dc1a9f2-07f0-440f-a4df-7cf06a153f47"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:58:47 crc kubenswrapper[4909]: I0202 11:58:47.919783 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dc1a9f2-07f0-440f-a4df-7cf06a153f47-config" (OuterVolumeSpecName: "config") pod "2dc1a9f2-07f0-440f-a4df-7cf06a153f47" (UID: "2dc1a9f2-07f0-440f-a4df-7cf06a153f47"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:58:47 crc kubenswrapper[4909]: I0202 11:58:47.922367 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dc1a9f2-07f0-440f-a4df-7cf06a153f47-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2dc1a9f2-07f0-440f-a4df-7cf06a153f47" (UID: "2dc1a9f2-07f0-440f-a4df-7cf06a153f47"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:58:47 crc kubenswrapper[4909]: I0202 11:58:47.980337 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2dc1a9f2-07f0-440f-a4df-7cf06a153f47-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 11:58:47 crc kubenswrapper[4909]: I0202 11:58:47.980374 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjpq8\" (UniqueName: \"kubernetes.io/projected/2dc1a9f2-07f0-440f-a4df-7cf06a153f47-kube-api-access-cjpq8\") on node \"crc\" DevicePath \"\"" Feb 02 11:58:47 crc kubenswrapper[4909]: I0202 11:58:47.980391 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2dc1a9f2-07f0-440f-a4df-7cf06a153f47-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 11:58:47 crc kubenswrapper[4909]: I0202 11:58:47.980403 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dc1a9f2-07f0-440f-a4df-7cf06a153f47-config\") on node \"crc\" DevicePath \"\"" Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.295030 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Feb 02 11:58:48 crc kubenswrapper[4909]: E0202 11:58:48.296350 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dc1a9f2-07f0-440f-a4df-7cf06a153f47" containerName="dnsmasq-dns" Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.296386 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc1a9f2-07f0-440f-a4df-7cf06a153f47" containerName="dnsmasq-dns" Feb 02 11:58:48 crc kubenswrapper[4909]: E0202 11:58:48.296453 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dc1a9f2-07f0-440f-a4df-7cf06a153f47" containerName="init" Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.296460 4909 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2dc1a9f2-07f0-440f-a4df-7cf06a153f47" containerName="init" Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.296788 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dc1a9f2-07f0-440f-a4df-7cf06a153f47" containerName="dnsmasq-dns" Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.298014 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.300880 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.306363 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.331332 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f48d44b77-lkn6v" event={"ID":"c3d6e381-a830-4c0f-9f33-294d0c937f1f","Type":"ContainerStarted","Data":"f325f1f64fd6fd33ea1aead32ff0baeab14f7fe2e130aaf9a40b56c78d83aabb"} Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.331423 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f48d44b77-lkn6v" Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.333676 4909 generic.go:334] "Generic (PLEG): container finished" podID="2dc1a9f2-07f0-440f-a4df-7cf06a153f47" containerID="3631230233db3448baedabd92e36be8bda64f62b7eab5dd97a70a4823cfc96ab" exitCode=0 Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.333710 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fbfbc95b5-2lr7s" event={"ID":"2dc1a9f2-07f0-440f-a4df-7cf06a153f47","Type":"ContainerDied","Data":"3631230233db3448baedabd92e36be8bda64f62b7eab5dd97a70a4823cfc96ab"} Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.333732 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fbfbc95b5-2lr7s" 
event={"ID":"2dc1a9f2-07f0-440f-a4df-7cf06a153f47","Type":"ContainerDied","Data":"3fb92ae3a6b7987131e370f46aab44cf747f71f9f4027825f1198c511252b616"} Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.333749 4909 scope.go:117] "RemoveContainer" containerID="3631230233db3448baedabd92e36be8bda64f62b7eab5dd97a70a4823cfc96ab" Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.333740 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fbfbc95b5-2lr7s" Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.350165 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f48d44b77-lkn6v" podStartSLOduration=2.350118108 podStartE2EDuration="2.350118108s" podCreationTimestamp="2026-02-02 11:58:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:58:48.348114132 +0000 UTC m=+5254.094214877" watchObservedRunningTime="2026-02-02 11:58:48.350118108 +0000 UTC m=+5254.096218843" Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.355950 4909 scope.go:117] "RemoveContainer" containerID="2b840b46e9ce42de5095cbe1f6018166ea9d1c810d3a9d837c27454c42dd10d8" Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.378961 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fbfbc95b5-2lr7s"] Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.385087 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fbfbc95b5-2lr7s"] Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.385279 4909 scope.go:117] "RemoveContainer" containerID="3631230233db3448baedabd92e36be8bda64f62b7eab5dd97a70a4823cfc96ab" Feb 02 11:58:48 crc kubenswrapper[4909]: E0202 11:58:48.386212 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3631230233db3448baedabd92e36be8bda64f62b7eab5dd97a70a4823cfc96ab\": container with ID starting with 3631230233db3448baedabd92e36be8bda64f62b7eab5dd97a70a4823cfc96ab not found: ID does not exist" containerID="3631230233db3448baedabd92e36be8bda64f62b7eab5dd97a70a4823cfc96ab" Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.386246 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3631230233db3448baedabd92e36be8bda64f62b7eab5dd97a70a4823cfc96ab"} err="failed to get container status \"3631230233db3448baedabd92e36be8bda64f62b7eab5dd97a70a4823cfc96ab\": rpc error: code = NotFound desc = could not find container \"3631230233db3448baedabd92e36be8bda64f62b7eab5dd97a70a4823cfc96ab\": container with ID starting with 3631230233db3448baedabd92e36be8bda64f62b7eab5dd97a70a4823cfc96ab not found: ID does not exist" Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.386269 4909 scope.go:117] "RemoveContainer" containerID="2b840b46e9ce42de5095cbe1f6018166ea9d1c810d3a9d837c27454c42dd10d8" Feb 02 11:58:48 crc kubenswrapper[4909]: E0202 11:58:48.386962 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b840b46e9ce42de5095cbe1f6018166ea9d1c810d3a9d837c27454c42dd10d8\": container with ID starting with 2b840b46e9ce42de5095cbe1f6018166ea9d1c810d3a9d837c27454c42dd10d8 not found: ID does not exist" containerID="2b840b46e9ce42de5095cbe1f6018166ea9d1c810d3a9d837c27454c42dd10d8" Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.386987 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b840b46e9ce42de5095cbe1f6018166ea9d1c810d3a9d837c27454c42dd10d8"} err="failed to get container status \"2b840b46e9ce42de5095cbe1f6018166ea9d1c810d3a9d837c27454c42dd10d8\": rpc error: code = NotFound desc = could not find container \"2b840b46e9ce42de5095cbe1f6018166ea9d1c810d3a9d837c27454c42dd10d8\": container with ID 
starting with 2b840b46e9ce42de5095cbe1f6018166ea9d1c810d3a9d837c27454c42dd10d8 not found: ID does not exist" Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.390894 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6eda687f-6113-4809-8889-105099e40afa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eda687f-6113-4809-8889-105099e40afa\") pod \"ovn-copy-data\" (UID: \"799e15d2-1ed2-4524-8b75-582f98eb003e\") " pod="openstack/ovn-copy-data" Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.391110 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7xvw\" (UniqueName: \"kubernetes.io/projected/799e15d2-1ed2-4524-8b75-582f98eb003e-kube-api-access-l7xvw\") pod \"ovn-copy-data\" (UID: \"799e15d2-1ed2-4524-8b75-582f98eb003e\") " pod="openstack/ovn-copy-data" Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.391241 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/799e15d2-1ed2-4524-8b75-582f98eb003e-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"799e15d2-1ed2-4524-8b75-582f98eb003e\") " pod="openstack/ovn-copy-data" Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.493454 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6eda687f-6113-4809-8889-105099e40afa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eda687f-6113-4809-8889-105099e40afa\") pod \"ovn-copy-data\" (UID: \"799e15d2-1ed2-4524-8b75-582f98eb003e\") " pod="openstack/ovn-copy-data" Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.493548 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7xvw\" (UniqueName: \"kubernetes.io/projected/799e15d2-1ed2-4524-8b75-582f98eb003e-kube-api-access-l7xvw\") pod \"ovn-copy-data\" (UID: 
\"799e15d2-1ed2-4524-8b75-582f98eb003e\") " pod="openstack/ovn-copy-data" Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.493610 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/799e15d2-1ed2-4524-8b75-582f98eb003e-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"799e15d2-1ed2-4524-8b75-582f98eb003e\") " pod="openstack/ovn-copy-data" Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.501898 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/799e15d2-1ed2-4524-8b75-582f98eb003e-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"799e15d2-1ed2-4524-8b75-582f98eb003e\") " pod="openstack/ovn-copy-data" Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.502033 4909 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.502092 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6eda687f-6113-4809-8889-105099e40afa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eda687f-6113-4809-8889-105099e40afa\") pod \"ovn-copy-data\" (UID: \"799e15d2-1ed2-4524-8b75-582f98eb003e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/101e62cdbb0277661547122cc195cef3bbb53d526e8bdfccae6767edd6a80239/globalmount\"" pod="openstack/ovn-copy-data" Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.517771 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7xvw\" (UniqueName: \"kubernetes.io/projected/799e15d2-1ed2-4524-8b75-582f98eb003e-kube-api-access-l7xvw\") pod \"ovn-copy-data\" (UID: \"799e15d2-1ed2-4524-8b75-582f98eb003e\") " pod="openstack/ovn-copy-data" Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.544657 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6eda687f-6113-4809-8889-105099e40afa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eda687f-6113-4809-8889-105099e40afa\") pod \"ovn-copy-data\" (UID: \"799e15d2-1ed2-4524-8b75-582f98eb003e\") " pod="openstack/ovn-copy-data" Feb 02 11:58:48 crc kubenswrapper[4909]: I0202 11:58:48.626953 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Feb 02 11:58:49 crc kubenswrapper[4909]: I0202 11:58:49.027072 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dc1a9f2-07f0-440f-a4df-7cf06a153f47" path="/var/lib/kubelet/pods/2dc1a9f2-07f0-440f-a4df-7cf06a153f47/volumes" Feb 02 11:58:49 crc kubenswrapper[4909]: I0202 11:58:49.131713 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Feb 02 11:58:49 crc kubenswrapper[4909]: I0202 11:58:49.354122 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"799e15d2-1ed2-4524-8b75-582f98eb003e","Type":"ContainerStarted","Data":"c239e02f47721c51b53842592422846ddaa2f40b82d346837a9ccff31a493823"} Feb 02 11:58:49 crc kubenswrapper[4909]: I0202 11:58:49.809221 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qbtrr" Feb 02 11:58:49 crc kubenswrapper[4909]: I0202 11:58:49.809283 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qbtrr" Feb 02 11:58:49 crc kubenswrapper[4909]: I0202 11:58:49.857248 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qbtrr" Feb 02 11:58:50 crc kubenswrapper[4909]: I0202 11:58:50.361550 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" 
event={"ID":"799e15d2-1ed2-4524-8b75-582f98eb003e","Type":"ContainerStarted","Data":"70c1210df332139f3621467af18fc5eaa85b9b4ecb2c86f0c3418757926dcb95"} Feb 02 11:58:50 crc kubenswrapper[4909]: I0202 11:58:50.378119 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=2.9066219650000003 podStartE2EDuration="3.378097751s" podCreationTimestamp="2026-02-02 11:58:47 +0000 UTC" firstStartedPulling="2026-02-02 11:58:49.144291419 +0000 UTC m=+5254.890392154" lastFinishedPulling="2026-02-02 11:58:49.615767195 +0000 UTC m=+5255.361867940" observedRunningTime="2026-02-02 11:58:50.376794375 +0000 UTC m=+5256.122895110" watchObservedRunningTime="2026-02-02 11:58:50.378097751 +0000 UTC m=+5256.124198496" Feb 02 11:58:50 crc kubenswrapper[4909]: I0202 11:58:50.409512 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qbtrr" Feb 02 11:58:50 crc kubenswrapper[4909]: I0202 11:58:50.464246 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qbtrr"] Feb 02 11:58:52 crc kubenswrapper[4909]: I0202 11:58:52.375948 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qbtrr" podUID="ce04e8dd-8db5-4467-b9dd-8f389fcf9820" containerName="registry-server" containerID="cri-o://784b6b973617119ffb34a703a7738e00093570a653d9279f946420c70862a872" gracePeriod=2 Feb 02 11:58:52 crc kubenswrapper[4909]: I0202 11:58:52.794369 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qbtrr" Feb 02 11:58:52 crc kubenswrapper[4909]: I0202 11:58:52.867224 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce04e8dd-8db5-4467-b9dd-8f389fcf9820-utilities\") pod \"ce04e8dd-8db5-4467-b9dd-8f389fcf9820\" (UID: \"ce04e8dd-8db5-4467-b9dd-8f389fcf9820\") " Feb 02 11:58:52 crc kubenswrapper[4909]: I0202 11:58:52.867274 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9vqq\" (UniqueName: \"kubernetes.io/projected/ce04e8dd-8db5-4467-b9dd-8f389fcf9820-kube-api-access-v9vqq\") pod \"ce04e8dd-8db5-4467-b9dd-8f389fcf9820\" (UID: \"ce04e8dd-8db5-4467-b9dd-8f389fcf9820\") " Feb 02 11:58:52 crc kubenswrapper[4909]: I0202 11:58:52.867303 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce04e8dd-8db5-4467-b9dd-8f389fcf9820-catalog-content\") pod \"ce04e8dd-8db5-4467-b9dd-8f389fcf9820\" (UID: \"ce04e8dd-8db5-4467-b9dd-8f389fcf9820\") " Feb 02 11:58:52 crc kubenswrapper[4909]: I0202 11:58:52.868899 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce04e8dd-8db5-4467-b9dd-8f389fcf9820-utilities" (OuterVolumeSpecName: "utilities") pod "ce04e8dd-8db5-4467-b9dd-8f389fcf9820" (UID: "ce04e8dd-8db5-4467-b9dd-8f389fcf9820"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:58:52 crc kubenswrapper[4909]: I0202 11:58:52.873056 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce04e8dd-8db5-4467-b9dd-8f389fcf9820-kube-api-access-v9vqq" (OuterVolumeSpecName: "kube-api-access-v9vqq") pod "ce04e8dd-8db5-4467-b9dd-8f389fcf9820" (UID: "ce04e8dd-8db5-4467-b9dd-8f389fcf9820"). InnerVolumeSpecName "kube-api-access-v9vqq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:58:52 crc kubenswrapper[4909]: I0202 11:58:52.970005 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce04e8dd-8db5-4467-b9dd-8f389fcf9820-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 11:58:52 crc kubenswrapper[4909]: I0202 11:58:52.970047 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9vqq\" (UniqueName: \"kubernetes.io/projected/ce04e8dd-8db5-4467-b9dd-8f389fcf9820-kube-api-access-v9vqq\") on node \"crc\" DevicePath \"\"" Feb 02 11:58:53 crc kubenswrapper[4909]: I0202 11:58:53.011284 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce04e8dd-8db5-4467-b9dd-8f389fcf9820-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce04e8dd-8db5-4467-b9dd-8f389fcf9820" (UID: "ce04e8dd-8db5-4467-b9dd-8f389fcf9820"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 11:58:53 crc kubenswrapper[4909]: I0202 11:58:53.071193 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce04e8dd-8db5-4467-b9dd-8f389fcf9820-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 11:58:53 crc kubenswrapper[4909]: I0202 11:58:53.385516 4909 generic.go:334] "Generic (PLEG): container finished" podID="ce04e8dd-8db5-4467-b9dd-8f389fcf9820" containerID="784b6b973617119ffb34a703a7738e00093570a653d9279f946420c70862a872" exitCode=0 Feb 02 11:58:53 crc kubenswrapper[4909]: I0202 11:58:53.385566 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qbtrr" Feb 02 11:58:53 crc kubenswrapper[4909]: I0202 11:58:53.385580 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbtrr" event={"ID":"ce04e8dd-8db5-4467-b9dd-8f389fcf9820","Type":"ContainerDied","Data":"784b6b973617119ffb34a703a7738e00093570a653d9279f946420c70862a872"} Feb 02 11:58:53 crc kubenswrapper[4909]: I0202 11:58:53.386027 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qbtrr" event={"ID":"ce04e8dd-8db5-4467-b9dd-8f389fcf9820","Type":"ContainerDied","Data":"bb9dc2fe3447234e0b93e8ee89018d7964d6e5c48b73517fef3c55622660a1f9"} Feb 02 11:58:53 crc kubenswrapper[4909]: I0202 11:58:53.386064 4909 scope.go:117] "RemoveContainer" containerID="784b6b973617119ffb34a703a7738e00093570a653d9279f946420c70862a872" Feb 02 11:58:53 crc kubenswrapper[4909]: I0202 11:58:53.412716 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qbtrr"] Feb 02 11:58:53 crc kubenswrapper[4909]: I0202 11:58:53.415328 4909 scope.go:117] "RemoveContainer" containerID="73c2124f5468a82dd2ccdb9d50d5261fd4a6c36086b274b4de5514b364f26ce2" Feb 02 11:58:53 crc kubenswrapper[4909]: I0202 11:58:53.422542 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qbtrr"] Feb 02 11:58:53 crc kubenswrapper[4909]: I0202 11:58:53.438448 4909 scope.go:117] "RemoveContainer" containerID="520080219ddb1421f91ff9ec598d763a1a7212879c49e0c58b11c51a63ad8403" Feb 02 11:58:53 crc kubenswrapper[4909]: I0202 11:58:53.468164 4909 scope.go:117] "RemoveContainer" containerID="784b6b973617119ffb34a703a7738e00093570a653d9279f946420c70862a872" Feb 02 11:58:53 crc kubenswrapper[4909]: E0202 11:58:53.468748 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"784b6b973617119ffb34a703a7738e00093570a653d9279f946420c70862a872\": container with ID starting with 784b6b973617119ffb34a703a7738e00093570a653d9279f946420c70862a872 not found: ID does not exist" containerID="784b6b973617119ffb34a703a7738e00093570a653d9279f946420c70862a872" Feb 02 11:58:53 crc kubenswrapper[4909]: I0202 11:58:53.468787 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"784b6b973617119ffb34a703a7738e00093570a653d9279f946420c70862a872"} err="failed to get container status \"784b6b973617119ffb34a703a7738e00093570a653d9279f946420c70862a872\": rpc error: code = NotFound desc = could not find container \"784b6b973617119ffb34a703a7738e00093570a653d9279f946420c70862a872\": container with ID starting with 784b6b973617119ffb34a703a7738e00093570a653d9279f946420c70862a872 not found: ID does not exist" Feb 02 11:58:53 crc kubenswrapper[4909]: I0202 11:58:53.468829 4909 scope.go:117] "RemoveContainer" containerID="73c2124f5468a82dd2ccdb9d50d5261fd4a6c36086b274b4de5514b364f26ce2" Feb 02 11:58:53 crc kubenswrapper[4909]: E0202 11:58:53.469221 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73c2124f5468a82dd2ccdb9d50d5261fd4a6c36086b274b4de5514b364f26ce2\": container with ID starting with 73c2124f5468a82dd2ccdb9d50d5261fd4a6c36086b274b4de5514b364f26ce2 not found: ID does not exist" containerID="73c2124f5468a82dd2ccdb9d50d5261fd4a6c36086b274b4de5514b364f26ce2" Feb 02 11:58:53 crc kubenswrapper[4909]: I0202 11:58:53.469277 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73c2124f5468a82dd2ccdb9d50d5261fd4a6c36086b274b4de5514b364f26ce2"} err="failed to get container status \"73c2124f5468a82dd2ccdb9d50d5261fd4a6c36086b274b4de5514b364f26ce2\": rpc error: code = NotFound desc = could not find container \"73c2124f5468a82dd2ccdb9d50d5261fd4a6c36086b274b4de5514b364f26ce2\": container with ID 
starting with 73c2124f5468a82dd2ccdb9d50d5261fd4a6c36086b274b4de5514b364f26ce2 not found: ID does not exist" Feb 02 11:58:53 crc kubenswrapper[4909]: I0202 11:58:53.469311 4909 scope.go:117] "RemoveContainer" containerID="520080219ddb1421f91ff9ec598d763a1a7212879c49e0c58b11c51a63ad8403" Feb 02 11:58:53 crc kubenswrapper[4909]: E0202 11:58:53.469864 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"520080219ddb1421f91ff9ec598d763a1a7212879c49e0c58b11c51a63ad8403\": container with ID starting with 520080219ddb1421f91ff9ec598d763a1a7212879c49e0c58b11c51a63ad8403 not found: ID does not exist" containerID="520080219ddb1421f91ff9ec598d763a1a7212879c49e0c58b11c51a63ad8403" Feb 02 11:58:53 crc kubenswrapper[4909]: I0202 11:58:53.469909 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"520080219ddb1421f91ff9ec598d763a1a7212879c49e0c58b11c51a63ad8403"} err="failed to get container status \"520080219ddb1421f91ff9ec598d763a1a7212879c49e0c58b11c51a63ad8403\": rpc error: code = NotFound desc = could not find container \"520080219ddb1421f91ff9ec598d763a1a7212879c49e0c58b11c51a63ad8403\": container with ID starting with 520080219ddb1421f91ff9ec598d763a1a7212879c49e0c58b11c51a63ad8403 not found: ID does not exist" Feb 02 11:58:54 crc kubenswrapper[4909]: I0202 11:58:54.949259 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 02 11:58:54 crc kubenswrapper[4909]: E0202 11:58:54.949833 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce04e8dd-8db5-4467-b9dd-8f389fcf9820" containerName="extract-utilities" Feb 02 11:58:54 crc kubenswrapper[4909]: I0202 11:58:54.949847 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce04e8dd-8db5-4467-b9dd-8f389fcf9820" containerName="extract-utilities" Feb 02 11:58:54 crc kubenswrapper[4909]: E0202 11:58:54.949869 4909 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="ce04e8dd-8db5-4467-b9dd-8f389fcf9820" containerName="registry-server" Feb 02 11:58:54 crc kubenswrapper[4909]: I0202 11:58:54.949876 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce04e8dd-8db5-4467-b9dd-8f389fcf9820" containerName="registry-server" Feb 02 11:58:54 crc kubenswrapper[4909]: E0202 11:58:54.949889 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce04e8dd-8db5-4467-b9dd-8f389fcf9820" containerName="extract-content" Feb 02 11:58:54 crc kubenswrapper[4909]: I0202 11:58:54.949896 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce04e8dd-8db5-4467-b9dd-8f389fcf9820" containerName="extract-content" Feb 02 11:58:54 crc kubenswrapper[4909]: I0202 11:58:54.950029 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce04e8dd-8db5-4467-b9dd-8f389fcf9820" containerName="registry-server" Feb 02 11:58:54 crc kubenswrapper[4909]: I0202 11:58:54.950947 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 02 11:58:54 crc kubenswrapper[4909]: I0202 11:58:54.952703 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 02 11:58:54 crc kubenswrapper[4909]: I0202 11:58:54.952991 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 02 11:58:54 crc kubenswrapper[4909]: I0202 11:58:54.955266 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-gbt8m" Feb 02 11:58:54 crc kubenswrapper[4909]: I0202 11:58:54.963341 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 02 11:58:54 crc kubenswrapper[4909]: I0202 11:58:54.976459 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 02 11:58:55 crc kubenswrapper[4909]: I0202 11:58:55.057642 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce04e8dd-8db5-4467-b9dd-8f389fcf9820" path="/var/lib/kubelet/pods/ce04e8dd-8db5-4467-b9dd-8f389fcf9820/volumes" Feb 02 11:58:55 crc kubenswrapper[4909]: I0202 11:58:55.109159 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2-scripts\") pod \"ovn-northd-0\" (UID: \"2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2\") " pod="openstack/ovn-northd-0" Feb 02 11:58:55 crc kubenswrapper[4909]: I0202 11:58:55.109216 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2\") " pod="openstack/ovn-northd-0" Feb 02 11:58:55 crc kubenswrapper[4909]: I0202 11:58:55.109250 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x5bs\" (UniqueName: \"kubernetes.io/projected/2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2-kube-api-access-2x5bs\") pod \"ovn-northd-0\" (UID: \"2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2\") " pod="openstack/ovn-northd-0" Feb 02 11:58:55 crc kubenswrapper[4909]: I0202 11:58:55.109315 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2\") " pod="openstack/ovn-northd-0" Feb 02 11:58:55 crc kubenswrapper[4909]: I0202 11:58:55.109356 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2\") " pod="openstack/ovn-northd-0" Feb 02 11:58:55 crc kubenswrapper[4909]: I0202 11:58:55.109489 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2\") " pod="openstack/ovn-northd-0" Feb 02 11:58:55 crc kubenswrapper[4909]: I0202 11:58:55.109514 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2-config\") pod \"ovn-northd-0\" (UID: \"2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2\") " pod="openstack/ovn-northd-0" Feb 02 11:58:55 crc kubenswrapper[4909]: I0202 11:58:55.210663 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2-scripts\") pod \"ovn-northd-0\" (UID: \"2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2\") " pod="openstack/ovn-northd-0" Feb 02 11:58:55 crc kubenswrapper[4909]: I0202 11:58:55.210727 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2\") " pod="openstack/ovn-northd-0" Feb 02 11:58:55 crc kubenswrapper[4909]: I0202 11:58:55.210754 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x5bs\" (UniqueName: \"kubernetes.io/projected/2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2-kube-api-access-2x5bs\") pod \"ovn-northd-0\" (UID: \"2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2\") " pod="openstack/ovn-northd-0" Feb 02 11:58:55 crc kubenswrapper[4909]: I0202 11:58:55.210823 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2\") " pod="openstack/ovn-northd-0" Feb 02 11:58:55 crc kubenswrapper[4909]: I0202 11:58:55.210852 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2\") " pod="openstack/ovn-northd-0" Feb 02 11:58:55 crc kubenswrapper[4909]: I0202 11:58:55.210941 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2\") " pod="openstack/ovn-northd-0" Feb 
02 11:58:55 crc kubenswrapper[4909]: I0202 11:58:55.210963 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2-config\") pod \"ovn-northd-0\" (UID: \"2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2\") " pod="openstack/ovn-northd-0" Feb 02 11:58:55 crc kubenswrapper[4909]: I0202 11:58:55.211764 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2-config\") pod \"ovn-northd-0\" (UID: \"2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2\") " pod="openstack/ovn-northd-0" Feb 02 11:58:55 crc kubenswrapper[4909]: I0202 11:58:55.211784 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2-scripts\") pod \"ovn-northd-0\" (UID: \"2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2\") " pod="openstack/ovn-northd-0" Feb 02 11:58:55 crc kubenswrapper[4909]: I0202 11:58:55.212386 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2\") " pod="openstack/ovn-northd-0" Feb 02 11:58:55 crc kubenswrapper[4909]: I0202 11:58:55.218151 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2\") " pod="openstack/ovn-northd-0" Feb 02 11:58:55 crc kubenswrapper[4909]: I0202 11:58:55.227398 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2\") " pod="openstack/ovn-northd-0" Feb 02 11:58:55 crc kubenswrapper[4909]: I0202 11:58:55.228365 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2\") " pod="openstack/ovn-northd-0" Feb 02 11:58:55 crc kubenswrapper[4909]: I0202 11:58:55.230173 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x5bs\" (UniqueName: \"kubernetes.io/projected/2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2-kube-api-access-2x5bs\") pod \"ovn-northd-0\" (UID: \"2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2\") " pod="openstack/ovn-northd-0" Feb 02 11:58:55 crc kubenswrapper[4909]: I0202 11:58:55.277149 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 02 11:58:55 crc kubenswrapper[4909]: I0202 11:58:55.726907 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 02 11:58:55 crc kubenswrapper[4909]: W0202 11:58:55.733563 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b1a7f7b_d7aa_4a2f_b868_00e64797a3f2.slice/crio-95189e9a057c5caa0eb1cf468a12727df4b67e60351d2b1a248b66edb68787a1 WatchSource:0}: Error finding container 95189e9a057c5caa0eb1cf468a12727df4b67e60351d2b1a248b66edb68787a1: Status 404 returned error can't find the container with id 95189e9a057c5caa0eb1cf468a12727df4b67e60351d2b1a248b66edb68787a1 Feb 02 11:58:56 crc kubenswrapper[4909]: I0202 11:58:56.417153 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2","Type":"ContainerStarted","Data":"29606055fad26c149b71e1a24a5954b6b2b70cd15bdaaa13394a48dbb4dfb123"} Feb 02 11:58:56 crc kubenswrapper[4909]: 
I0202 11:58:56.417205 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2","Type":"ContainerStarted","Data":"51da461748346ece7970efd3cb0a8eacf71fc374c03381a428660f85d9d89723"} Feb 02 11:58:56 crc kubenswrapper[4909]: I0202 11:58:56.417220 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2","Type":"ContainerStarted","Data":"95189e9a057c5caa0eb1cf468a12727df4b67e60351d2b1a248b66edb68787a1"} Feb 02 11:58:56 crc kubenswrapper[4909]: I0202 11:58:56.417892 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 02 11:58:56 crc kubenswrapper[4909]: I0202 11:58:56.438952 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.438934541 podStartE2EDuration="2.438934541s" podCreationTimestamp="2026-02-02 11:58:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:58:56.435705232 +0000 UTC m=+5262.181805967" watchObservedRunningTime="2026-02-02 11:58:56.438934541 +0000 UTC m=+5262.185035276" Feb 02 11:58:56 crc kubenswrapper[4909]: I0202 11:58:56.481967 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f48d44b77-lkn6v" Feb 02 11:58:56 crc kubenswrapper[4909]: I0202 11:58:56.540854 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54dc9c94cc-vq5fj"] Feb 02 11:58:56 crc kubenswrapper[4909]: I0202 11:58:56.541502 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54dc9c94cc-vq5fj" podUID="8c9d8c83-78b4-40fe-9680-b9fecabd3728" containerName="dnsmasq-dns" containerID="cri-o://9314a08470357547443738a9d12fcedbbd12b5cc7af79f59752923e34d32701c" gracePeriod=10 Feb 02 11:58:57 crc 
kubenswrapper[4909]: I0202 11:58:57.040416 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54dc9c94cc-vq5fj" Feb 02 11:58:57 crc kubenswrapper[4909]: I0202 11:58:57.139482 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmtkv\" (UniqueName: \"kubernetes.io/projected/8c9d8c83-78b4-40fe-9680-b9fecabd3728-kube-api-access-cmtkv\") pod \"8c9d8c83-78b4-40fe-9680-b9fecabd3728\" (UID: \"8c9d8c83-78b4-40fe-9680-b9fecabd3728\") " Feb 02 11:58:57 crc kubenswrapper[4909]: I0202 11:58:57.139719 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c9d8c83-78b4-40fe-9680-b9fecabd3728-config\") pod \"8c9d8c83-78b4-40fe-9680-b9fecabd3728\" (UID: \"8c9d8c83-78b4-40fe-9680-b9fecabd3728\") " Feb 02 11:58:57 crc kubenswrapper[4909]: I0202 11:58:57.139768 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c9d8c83-78b4-40fe-9680-b9fecabd3728-dns-svc\") pod \"8c9d8c83-78b4-40fe-9680-b9fecabd3728\" (UID: \"8c9d8c83-78b4-40fe-9680-b9fecabd3728\") " Feb 02 11:58:57 crc kubenswrapper[4909]: I0202 11:58:57.148093 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c9d8c83-78b4-40fe-9680-b9fecabd3728-kube-api-access-cmtkv" (OuterVolumeSpecName: "kube-api-access-cmtkv") pod "8c9d8c83-78b4-40fe-9680-b9fecabd3728" (UID: "8c9d8c83-78b4-40fe-9680-b9fecabd3728"). InnerVolumeSpecName "kube-api-access-cmtkv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:58:57 crc kubenswrapper[4909]: I0202 11:58:57.176202 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c9d8c83-78b4-40fe-9680-b9fecabd3728-config" (OuterVolumeSpecName: "config") pod "8c9d8c83-78b4-40fe-9680-b9fecabd3728" (UID: "8c9d8c83-78b4-40fe-9680-b9fecabd3728"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:58:57 crc kubenswrapper[4909]: I0202 11:58:57.182270 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c9d8c83-78b4-40fe-9680-b9fecabd3728-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8c9d8c83-78b4-40fe-9680-b9fecabd3728" (UID: "8c9d8c83-78b4-40fe-9680-b9fecabd3728"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:58:57 crc kubenswrapper[4909]: I0202 11:58:57.242048 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c9d8c83-78b4-40fe-9680-b9fecabd3728-config\") on node \"crc\" DevicePath \"\"" Feb 02 11:58:57 crc kubenswrapper[4909]: I0202 11:58:57.242089 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c9d8c83-78b4-40fe-9680-b9fecabd3728-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 11:58:57 crc kubenswrapper[4909]: I0202 11:58:57.242102 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmtkv\" (UniqueName: \"kubernetes.io/projected/8c9d8c83-78b4-40fe-9680-b9fecabd3728-kube-api-access-cmtkv\") on node \"crc\" DevicePath \"\"" Feb 02 11:58:57 crc kubenswrapper[4909]: I0202 11:58:57.425251 4909 generic.go:334] "Generic (PLEG): container finished" podID="8c9d8c83-78b4-40fe-9680-b9fecabd3728" containerID="9314a08470357547443738a9d12fcedbbd12b5cc7af79f59752923e34d32701c" exitCode=0 Feb 02 11:58:57 crc kubenswrapper[4909]: I0202 
11:58:57.425300 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54dc9c94cc-vq5fj" Feb 02 11:58:57 crc kubenswrapper[4909]: I0202 11:58:57.425290 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54dc9c94cc-vq5fj" event={"ID":"8c9d8c83-78b4-40fe-9680-b9fecabd3728","Type":"ContainerDied","Data":"9314a08470357547443738a9d12fcedbbd12b5cc7af79f59752923e34d32701c"} Feb 02 11:58:57 crc kubenswrapper[4909]: I0202 11:58:57.425990 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54dc9c94cc-vq5fj" event={"ID":"8c9d8c83-78b4-40fe-9680-b9fecabd3728","Type":"ContainerDied","Data":"de78327881cd7dba573819a3b3556786e4b5095f3d80f87b1eb8da6d9fdeab62"} Feb 02 11:58:57 crc kubenswrapper[4909]: I0202 11:58:57.426010 4909 scope.go:117] "RemoveContainer" containerID="9314a08470357547443738a9d12fcedbbd12b5cc7af79f59752923e34d32701c" Feb 02 11:58:57 crc kubenswrapper[4909]: I0202 11:58:57.443253 4909 scope.go:117] "RemoveContainer" containerID="168050b6beedc681e89d77de5c6c0a44d2f7bd4a0462b09729eeb933bbfd9506" Feb 02 11:58:57 crc kubenswrapper[4909]: I0202 11:58:57.460670 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54dc9c94cc-vq5fj"] Feb 02 11:58:57 crc kubenswrapper[4909]: I0202 11:58:57.467236 4909 scope.go:117] "RemoveContainer" containerID="9314a08470357547443738a9d12fcedbbd12b5cc7af79f59752923e34d32701c" Feb 02 11:58:57 crc kubenswrapper[4909]: I0202 11:58:57.467239 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54dc9c94cc-vq5fj"] Feb 02 11:58:57 crc kubenswrapper[4909]: E0202 11:58:57.467680 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9314a08470357547443738a9d12fcedbbd12b5cc7af79f59752923e34d32701c\": container with ID starting with 9314a08470357547443738a9d12fcedbbd12b5cc7af79f59752923e34d32701c not found: 
ID does not exist" containerID="9314a08470357547443738a9d12fcedbbd12b5cc7af79f59752923e34d32701c" Feb 02 11:58:57 crc kubenswrapper[4909]: I0202 11:58:57.467718 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9314a08470357547443738a9d12fcedbbd12b5cc7af79f59752923e34d32701c"} err="failed to get container status \"9314a08470357547443738a9d12fcedbbd12b5cc7af79f59752923e34d32701c\": rpc error: code = NotFound desc = could not find container \"9314a08470357547443738a9d12fcedbbd12b5cc7af79f59752923e34d32701c\": container with ID starting with 9314a08470357547443738a9d12fcedbbd12b5cc7af79f59752923e34d32701c not found: ID does not exist" Feb 02 11:58:57 crc kubenswrapper[4909]: I0202 11:58:57.467743 4909 scope.go:117] "RemoveContainer" containerID="168050b6beedc681e89d77de5c6c0a44d2f7bd4a0462b09729eeb933bbfd9506" Feb 02 11:58:57 crc kubenswrapper[4909]: E0202 11:58:57.468227 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"168050b6beedc681e89d77de5c6c0a44d2f7bd4a0462b09729eeb933bbfd9506\": container with ID starting with 168050b6beedc681e89d77de5c6c0a44d2f7bd4a0462b09729eeb933bbfd9506 not found: ID does not exist" containerID="168050b6beedc681e89d77de5c6c0a44d2f7bd4a0462b09729eeb933bbfd9506" Feb 02 11:58:57 crc kubenswrapper[4909]: I0202 11:58:57.468268 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"168050b6beedc681e89d77de5c6c0a44d2f7bd4a0462b09729eeb933bbfd9506"} err="failed to get container status \"168050b6beedc681e89d77de5c6c0a44d2f7bd4a0462b09729eeb933bbfd9506\": rpc error: code = NotFound desc = could not find container \"168050b6beedc681e89d77de5c6c0a44d2f7bd4a0462b09729eeb933bbfd9506\": container with ID starting with 168050b6beedc681e89d77de5c6c0a44d2f7bd4a0462b09729eeb933bbfd9506 not found: ID does not exist" Feb 02 11:58:58 crc kubenswrapper[4909]: I0202 11:58:58.017529 4909 
scope.go:117] "RemoveContainer" containerID="38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83" Feb 02 11:58:58 crc kubenswrapper[4909]: E0202 11:58:58.017736 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:58:59 crc kubenswrapper[4909]: I0202 11:58:59.027068 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c9d8c83-78b4-40fe-9680-b9fecabd3728" path="/var/lib/kubelet/pods/8c9d8c83-78b4-40fe-9680-b9fecabd3728/volumes" Feb 02 11:58:59 crc kubenswrapper[4909]: I0202 11:58:59.995345 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-6sn6c"] Feb 02 11:58:59 crc kubenswrapper[4909]: E0202 11:58:59.995954 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c9d8c83-78b4-40fe-9680-b9fecabd3728" containerName="dnsmasq-dns" Feb 02 11:58:59 crc kubenswrapper[4909]: I0202 11:58:59.995971 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c9d8c83-78b4-40fe-9680-b9fecabd3728" containerName="dnsmasq-dns" Feb 02 11:58:59 crc kubenswrapper[4909]: E0202 11:58:59.995981 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c9d8c83-78b4-40fe-9680-b9fecabd3728" containerName="init" Feb 02 11:58:59 crc kubenswrapper[4909]: I0202 11:58:59.995987 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c9d8c83-78b4-40fe-9680-b9fecabd3728" containerName="init" Feb 02 11:58:59 crc kubenswrapper[4909]: I0202 11:58:59.996154 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c9d8c83-78b4-40fe-9680-b9fecabd3728" containerName="dnsmasq-dns" Feb 02 11:58:59 crc 
kubenswrapper[4909]: I0202 11:58:59.996661 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-6sn6c" Feb 02 11:59:00 crc kubenswrapper[4909]: I0202 11:59:00.004804 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-6sn6c"] Feb 02 11:59:00 crc kubenswrapper[4909]: I0202 11:59:00.086333 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2pkj\" (UniqueName: \"kubernetes.io/projected/5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0-kube-api-access-d2pkj\") pod \"keystone-db-create-6sn6c\" (UID: \"5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0\") " pod="openstack/keystone-db-create-6sn6c" Feb 02 11:59:00 crc kubenswrapper[4909]: I0202 11:59:00.086455 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0-operator-scripts\") pod \"keystone-db-create-6sn6c\" (UID: \"5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0\") " pod="openstack/keystone-db-create-6sn6c" Feb 02 11:59:00 crc kubenswrapper[4909]: I0202 11:59:00.100397 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-f700-account-create-update-cczfc"] Feb 02 11:59:00 crc kubenswrapper[4909]: I0202 11:59:00.101367 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-f700-account-create-update-cczfc" Feb 02 11:59:00 crc kubenswrapper[4909]: I0202 11:59:00.112483 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 02 11:59:00 crc kubenswrapper[4909]: I0202 11:59:00.114942 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f700-account-create-update-cczfc"] Feb 02 11:59:00 crc kubenswrapper[4909]: I0202 11:59:00.190068 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2pkj\" (UniqueName: \"kubernetes.io/projected/5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0-kube-api-access-d2pkj\") pod \"keystone-db-create-6sn6c\" (UID: \"5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0\") " pod="openstack/keystone-db-create-6sn6c" Feb 02 11:59:00 crc kubenswrapper[4909]: I0202 11:59:00.190170 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk2br\" (UniqueName: \"kubernetes.io/projected/411171cf-4974-4e9a-bcb5-683f05db89cd-kube-api-access-rk2br\") pod \"keystone-f700-account-create-update-cczfc\" (UID: \"411171cf-4974-4e9a-bcb5-683f05db89cd\") " pod="openstack/keystone-f700-account-create-update-cczfc" Feb 02 11:59:00 crc kubenswrapper[4909]: I0202 11:59:00.190202 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/411171cf-4974-4e9a-bcb5-683f05db89cd-operator-scripts\") pod \"keystone-f700-account-create-update-cczfc\" (UID: \"411171cf-4974-4e9a-bcb5-683f05db89cd\") " pod="openstack/keystone-f700-account-create-update-cczfc" Feb 02 11:59:00 crc kubenswrapper[4909]: I0202 11:59:00.190228 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0-operator-scripts\") pod \"keystone-db-create-6sn6c\" 
(UID: \"5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0\") " pod="openstack/keystone-db-create-6sn6c" Feb 02 11:59:00 crc kubenswrapper[4909]: I0202 11:59:00.191063 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0-operator-scripts\") pod \"keystone-db-create-6sn6c\" (UID: \"5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0\") " pod="openstack/keystone-db-create-6sn6c" Feb 02 11:59:00 crc kubenswrapper[4909]: I0202 11:59:00.212872 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2pkj\" (UniqueName: \"kubernetes.io/projected/5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0-kube-api-access-d2pkj\") pod \"keystone-db-create-6sn6c\" (UID: \"5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0\") " pod="openstack/keystone-db-create-6sn6c" Feb 02 11:59:00 crc kubenswrapper[4909]: I0202 11:59:00.291293 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk2br\" (UniqueName: \"kubernetes.io/projected/411171cf-4974-4e9a-bcb5-683f05db89cd-kube-api-access-rk2br\") pod \"keystone-f700-account-create-update-cczfc\" (UID: \"411171cf-4974-4e9a-bcb5-683f05db89cd\") " pod="openstack/keystone-f700-account-create-update-cczfc" Feb 02 11:59:00 crc kubenswrapper[4909]: I0202 11:59:00.291345 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/411171cf-4974-4e9a-bcb5-683f05db89cd-operator-scripts\") pod \"keystone-f700-account-create-update-cczfc\" (UID: \"411171cf-4974-4e9a-bcb5-683f05db89cd\") " pod="openstack/keystone-f700-account-create-update-cczfc" Feb 02 11:59:00 crc kubenswrapper[4909]: I0202 11:59:00.292117 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/411171cf-4974-4e9a-bcb5-683f05db89cd-operator-scripts\") pod 
\"keystone-f700-account-create-update-cczfc\" (UID: \"411171cf-4974-4e9a-bcb5-683f05db89cd\") " pod="openstack/keystone-f700-account-create-update-cczfc" Feb 02 11:59:00 crc kubenswrapper[4909]: I0202 11:59:00.311707 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-6sn6c" Feb 02 11:59:00 crc kubenswrapper[4909]: I0202 11:59:00.324173 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk2br\" (UniqueName: \"kubernetes.io/projected/411171cf-4974-4e9a-bcb5-683f05db89cd-kube-api-access-rk2br\") pod \"keystone-f700-account-create-update-cczfc\" (UID: \"411171cf-4974-4e9a-bcb5-683f05db89cd\") " pod="openstack/keystone-f700-account-create-update-cczfc" Feb 02 11:59:00 crc kubenswrapper[4909]: I0202 11:59:00.417198 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f700-account-create-update-cczfc" Feb 02 11:59:00 crc kubenswrapper[4909]: I0202 11:59:00.776741 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-6sn6c"] Feb 02 11:59:00 crc kubenswrapper[4909]: I0202 11:59:00.883974 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f700-account-create-update-cczfc"] Feb 02 11:59:00 crc kubenswrapper[4909]: W0202 11:59:00.884078 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod411171cf_4974_4e9a_bcb5_683f05db89cd.slice/crio-488761e7e00dcb33f23ecf489c4eb83848796e7297484f148abf1f0ca14f0d1b WatchSource:0}: Error finding container 488761e7e00dcb33f23ecf489c4eb83848796e7297484f148abf1f0ca14f0d1b: Status 404 returned error can't find the container with id 488761e7e00dcb33f23ecf489c4eb83848796e7297484f148abf1f0ca14f0d1b Feb 02 11:59:01 crc kubenswrapper[4909]: I0202 11:59:01.463626 4909 generic.go:334] "Generic (PLEG): container finished" podID="411171cf-4974-4e9a-bcb5-683f05db89cd" 
containerID="1456a2c0e7924fdc4e90027876166a4c836dbc12a2bb519dba885c39a8a6c9af" exitCode=0 Feb 02 11:59:01 crc kubenswrapper[4909]: I0202 11:59:01.463684 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f700-account-create-update-cczfc" event={"ID":"411171cf-4974-4e9a-bcb5-683f05db89cd","Type":"ContainerDied","Data":"1456a2c0e7924fdc4e90027876166a4c836dbc12a2bb519dba885c39a8a6c9af"} Feb 02 11:59:01 crc kubenswrapper[4909]: I0202 11:59:01.464017 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f700-account-create-update-cczfc" event={"ID":"411171cf-4974-4e9a-bcb5-683f05db89cd","Type":"ContainerStarted","Data":"488761e7e00dcb33f23ecf489c4eb83848796e7297484f148abf1f0ca14f0d1b"} Feb 02 11:59:01 crc kubenswrapper[4909]: I0202 11:59:01.465453 4909 generic.go:334] "Generic (PLEG): container finished" podID="5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0" containerID="964b1fd9c5945004adb5e8710440af70b154a112941bf9a64a64b0441cb9f2c9" exitCode=0 Feb 02 11:59:01 crc kubenswrapper[4909]: I0202 11:59:01.465497 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-6sn6c" event={"ID":"5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0","Type":"ContainerDied","Data":"964b1fd9c5945004adb5e8710440af70b154a112941bf9a64a64b0441cb9f2c9"} Feb 02 11:59:01 crc kubenswrapper[4909]: I0202 11:59:01.465521 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-6sn6c" event={"ID":"5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0","Type":"ContainerStarted","Data":"71c1ab2e523571b277ce3a6db2cbedc17aff2092464c35dc28098316d60d473b"} Feb 02 11:59:02 crc kubenswrapper[4909]: I0202 11:59:02.766128 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-6sn6c" Feb 02 11:59:02 crc kubenswrapper[4909]: I0202 11:59:02.839420 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2pkj\" (UniqueName: \"kubernetes.io/projected/5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0-kube-api-access-d2pkj\") pod \"5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0\" (UID: \"5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0\") " Feb 02 11:59:02 crc kubenswrapper[4909]: I0202 11:59:02.839523 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0-operator-scripts\") pod \"5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0\" (UID: \"5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0\") " Feb 02 11:59:02 crc kubenswrapper[4909]: I0202 11:59:02.840295 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0" (UID: "5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:59:02 crc kubenswrapper[4909]: I0202 11:59:02.845547 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0-kube-api-access-d2pkj" (OuterVolumeSpecName: "kube-api-access-d2pkj") pod "5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0" (UID: "5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0"). InnerVolumeSpecName "kube-api-access-d2pkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:59:02 crc kubenswrapper[4909]: I0202 11:59:02.888785 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-f700-account-create-update-cczfc" Feb 02 11:59:02 crc kubenswrapper[4909]: I0202 11:59:02.942083 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk2br\" (UniqueName: \"kubernetes.io/projected/411171cf-4974-4e9a-bcb5-683f05db89cd-kube-api-access-rk2br\") pod \"411171cf-4974-4e9a-bcb5-683f05db89cd\" (UID: \"411171cf-4974-4e9a-bcb5-683f05db89cd\") " Feb 02 11:59:02 crc kubenswrapper[4909]: I0202 11:59:02.942175 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/411171cf-4974-4e9a-bcb5-683f05db89cd-operator-scripts\") pod \"411171cf-4974-4e9a-bcb5-683f05db89cd\" (UID: \"411171cf-4974-4e9a-bcb5-683f05db89cd\") " Feb 02 11:59:02 crc kubenswrapper[4909]: I0202 11:59:02.942636 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2pkj\" (UniqueName: \"kubernetes.io/projected/5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0-kube-api-access-d2pkj\") on node \"crc\" DevicePath \"\"" Feb 02 11:59:02 crc kubenswrapper[4909]: I0202 11:59:02.942658 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 11:59:02 crc kubenswrapper[4909]: I0202 11:59:02.943265 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/411171cf-4974-4e9a-bcb5-683f05db89cd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "411171cf-4974-4e9a-bcb5-683f05db89cd" (UID: "411171cf-4974-4e9a-bcb5-683f05db89cd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:59:02 crc kubenswrapper[4909]: I0202 11:59:02.944904 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/411171cf-4974-4e9a-bcb5-683f05db89cd-kube-api-access-rk2br" (OuterVolumeSpecName: "kube-api-access-rk2br") pod "411171cf-4974-4e9a-bcb5-683f05db89cd" (UID: "411171cf-4974-4e9a-bcb5-683f05db89cd"). InnerVolumeSpecName "kube-api-access-rk2br". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:59:03 crc kubenswrapper[4909]: I0202 11:59:03.044689 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk2br\" (UniqueName: \"kubernetes.io/projected/411171cf-4974-4e9a-bcb5-683f05db89cd-kube-api-access-rk2br\") on node \"crc\" DevicePath \"\"" Feb 02 11:59:03 crc kubenswrapper[4909]: I0202 11:59:03.044717 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/411171cf-4974-4e9a-bcb5-683f05db89cd-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 11:59:03 crc kubenswrapper[4909]: I0202 11:59:03.481188 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-6sn6c" Feb 02 11:59:03 crc kubenswrapper[4909]: I0202 11:59:03.481183 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-6sn6c" event={"ID":"5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0","Type":"ContainerDied","Data":"71c1ab2e523571b277ce3a6db2cbedc17aff2092464c35dc28098316d60d473b"} Feb 02 11:59:03 crc kubenswrapper[4909]: I0202 11:59:03.481308 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71c1ab2e523571b277ce3a6db2cbedc17aff2092464c35dc28098316d60d473b" Feb 02 11:59:03 crc kubenswrapper[4909]: I0202 11:59:03.482387 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f700-account-create-update-cczfc" event={"ID":"411171cf-4974-4e9a-bcb5-683f05db89cd","Type":"ContainerDied","Data":"488761e7e00dcb33f23ecf489c4eb83848796e7297484f148abf1f0ca14f0d1b"} Feb 02 11:59:03 crc kubenswrapper[4909]: I0202 11:59:03.482416 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-f700-account-create-update-cczfc" Feb 02 11:59:03 crc kubenswrapper[4909]: I0202 11:59:03.482424 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="488761e7e00dcb33f23ecf489c4eb83848796e7297484f148abf1f0ca14f0d1b" Feb 02 11:59:05 crc kubenswrapper[4909]: I0202 11:59:05.330364 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 02 11:59:05 crc kubenswrapper[4909]: I0202 11:59:05.700132 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-2tcsj"] Feb 02 11:59:05 crc kubenswrapper[4909]: E0202 11:59:05.700640 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="411171cf-4974-4e9a-bcb5-683f05db89cd" containerName="mariadb-account-create-update" Feb 02 11:59:05 crc kubenswrapper[4909]: I0202 11:59:05.700665 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="411171cf-4974-4e9a-bcb5-683f05db89cd" containerName="mariadb-account-create-update" Feb 02 11:59:05 crc kubenswrapper[4909]: E0202 11:59:05.700688 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0" containerName="mariadb-database-create" Feb 02 11:59:05 crc kubenswrapper[4909]: I0202 11:59:05.700699 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0" containerName="mariadb-database-create" Feb 02 11:59:05 crc kubenswrapper[4909]: I0202 11:59:05.700929 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="411171cf-4974-4e9a-bcb5-683f05db89cd" containerName="mariadb-account-create-update" Feb 02 11:59:05 crc kubenswrapper[4909]: I0202 11:59:05.700958 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0" containerName="mariadb-database-create" Feb 02 11:59:05 crc kubenswrapper[4909]: I0202 11:59:05.701583 4909 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/keystone-db-sync-2tcsj" Feb 02 11:59:05 crc kubenswrapper[4909]: I0202 11:59:05.704225 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-g8l9j" Feb 02 11:59:05 crc kubenswrapper[4909]: I0202 11:59:05.704503 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 02 11:59:05 crc kubenswrapper[4909]: I0202 11:59:05.704679 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 02 11:59:05 crc kubenswrapper[4909]: I0202 11:59:05.705058 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 02 11:59:05 crc kubenswrapper[4909]: I0202 11:59:05.720068 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2tcsj"] Feb 02 11:59:05 crc kubenswrapper[4909]: I0202 11:59:05.790490 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm2zd\" (UniqueName: \"kubernetes.io/projected/ce660c78-0b8c-4d99-aa8f-f03338f8d887-kube-api-access-vm2zd\") pod \"keystone-db-sync-2tcsj\" (UID: \"ce660c78-0b8c-4d99-aa8f-f03338f8d887\") " pod="openstack/keystone-db-sync-2tcsj" Feb 02 11:59:05 crc kubenswrapper[4909]: I0202 11:59:05.790549 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce660c78-0b8c-4d99-aa8f-f03338f8d887-config-data\") pod \"keystone-db-sync-2tcsj\" (UID: \"ce660c78-0b8c-4d99-aa8f-f03338f8d887\") " pod="openstack/keystone-db-sync-2tcsj" Feb 02 11:59:05 crc kubenswrapper[4909]: I0202 11:59:05.790845 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce660c78-0b8c-4d99-aa8f-f03338f8d887-combined-ca-bundle\") pod \"keystone-db-sync-2tcsj\" 
(UID: \"ce660c78-0b8c-4d99-aa8f-f03338f8d887\") " pod="openstack/keystone-db-sync-2tcsj" Feb 02 11:59:05 crc kubenswrapper[4909]: I0202 11:59:05.896095 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce660c78-0b8c-4d99-aa8f-f03338f8d887-config-data\") pod \"keystone-db-sync-2tcsj\" (UID: \"ce660c78-0b8c-4d99-aa8f-f03338f8d887\") " pod="openstack/keystone-db-sync-2tcsj" Feb 02 11:59:05 crc kubenswrapper[4909]: I0202 11:59:05.896342 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce660c78-0b8c-4d99-aa8f-f03338f8d887-combined-ca-bundle\") pod \"keystone-db-sync-2tcsj\" (UID: \"ce660c78-0b8c-4d99-aa8f-f03338f8d887\") " pod="openstack/keystone-db-sync-2tcsj" Feb 02 11:59:05 crc kubenswrapper[4909]: I0202 11:59:05.896449 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm2zd\" (UniqueName: \"kubernetes.io/projected/ce660c78-0b8c-4d99-aa8f-f03338f8d887-kube-api-access-vm2zd\") pod \"keystone-db-sync-2tcsj\" (UID: \"ce660c78-0b8c-4d99-aa8f-f03338f8d887\") " pod="openstack/keystone-db-sync-2tcsj" Feb 02 11:59:05 crc kubenswrapper[4909]: I0202 11:59:05.903744 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce660c78-0b8c-4d99-aa8f-f03338f8d887-combined-ca-bundle\") pod \"keystone-db-sync-2tcsj\" (UID: \"ce660c78-0b8c-4d99-aa8f-f03338f8d887\") " pod="openstack/keystone-db-sync-2tcsj" Feb 02 11:59:05 crc kubenswrapper[4909]: I0202 11:59:05.904082 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce660c78-0b8c-4d99-aa8f-f03338f8d887-config-data\") pod \"keystone-db-sync-2tcsj\" (UID: \"ce660c78-0b8c-4d99-aa8f-f03338f8d887\") " pod="openstack/keystone-db-sync-2tcsj" Feb 02 11:59:05 crc 
kubenswrapper[4909]: I0202 11:59:05.915824 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm2zd\" (UniqueName: \"kubernetes.io/projected/ce660c78-0b8c-4d99-aa8f-f03338f8d887-kube-api-access-vm2zd\") pod \"keystone-db-sync-2tcsj\" (UID: \"ce660c78-0b8c-4d99-aa8f-f03338f8d887\") " pod="openstack/keystone-db-sync-2tcsj" Feb 02 11:59:06 crc kubenswrapper[4909]: I0202 11:59:06.021517 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2tcsj" Feb 02 11:59:06 crc kubenswrapper[4909]: I0202 11:59:06.453340 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2tcsj"] Feb 02 11:59:06 crc kubenswrapper[4909]: W0202 11:59:06.453543 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce660c78_0b8c_4d99_aa8f_f03338f8d887.slice/crio-9ab4dc2e4782e1d15168c74f10b8cfa43c1d857c1a14a3cc82e983d0fa3610a7 WatchSource:0}: Error finding container 9ab4dc2e4782e1d15168c74f10b8cfa43c1d857c1a14a3cc82e983d0fa3610a7: Status 404 returned error can't find the container with id 9ab4dc2e4782e1d15168c74f10b8cfa43c1d857c1a14a3cc82e983d0fa3610a7 Feb 02 11:59:06 crc kubenswrapper[4909]: I0202 11:59:06.509885 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2tcsj" event={"ID":"ce660c78-0b8c-4d99-aa8f-f03338f8d887","Type":"ContainerStarted","Data":"9ab4dc2e4782e1d15168c74f10b8cfa43c1d857c1a14a3cc82e983d0fa3610a7"} Feb 02 11:59:07 crc kubenswrapper[4909]: I0202 11:59:07.522580 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2tcsj" event={"ID":"ce660c78-0b8c-4d99-aa8f-f03338f8d887","Type":"ContainerStarted","Data":"b992f4fd6affa56df4ee86e6e874d8b121694670e52820573223a18390617a11"} Feb 02 11:59:07 crc kubenswrapper[4909]: I0202 11:59:07.545769 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/keystone-db-sync-2tcsj" podStartSLOduration=2.545742607 podStartE2EDuration="2.545742607s" podCreationTimestamp="2026-02-02 11:59:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:59:07.539599536 +0000 UTC m=+5273.285700271" watchObservedRunningTime="2026-02-02 11:59:07.545742607 +0000 UTC m=+5273.291843342" Feb 02 11:59:08 crc kubenswrapper[4909]: I0202 11:59:08.530704 4909 generic.go:334] "Generic (PLEG): container finished" podID="ce660c78-0b8c-4d99-aa8f-f03338f8d887" containerID="b992f4fd6affa56df4ee86e6e874d8b121694670e52820573223a18390617a11" exitCode=0 Feb 02 11:59:08 crc kubenswrapper[4909]: I0202 11:59:08.530758 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2tcsj" event={"ID":"ce660c78-0b8c-4d99-aa8f-f03338f8d887","Type":"ContainerDied","Data":"b992f4fd6affa56df4ee86e6e874d8b121694670e52820573223a18390617a11"} Feb 02 11:59:09 crc kubenswrapper[4909]: I0202 11:59:09.016361 4909 scope.go:117] "RemoveContainer" containerID="38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83" Feb 02 11:59:09 crc kubenswrapper[4909]: E0202 11:59:09.016683 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 11:59:09 crc kubenswrapper[4909]: I0202 11:59:09.854307 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-2tcsj" Feb 02 11:59:09 crc kubenswrapper[4909]: I0202 11:59:09.966024 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce660c78-0b8c-4d99-aa8f-f03338f8d887-config-data\") pod \"ce660c78-0b8c-4d99-aa8f-f03338f8d887\" (UID: \"ce660c78-0b8c-4d99-aa8f-f03338f8d887\") " Feb 02 11:59:09 crc kubenswrapper[4909]: I0202 11:59:09.966101 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm2zd\" (UniqueName: \"kubernetes.io/projected/ce660c78-0b8c-4d99-aa8f-f03338f8d887-kube-api-access-vm2zd\") pod \"ce660c78-0b8c-4d99-aa8f-f03338f8d887\" (UID: \"ce660c78-0b8c-4d99-aa8f-f03338f8d887\") " Feb 02 11:59:09 crc kubenswrapper[4909]: I0202 11:59:09.966130 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce660c78-0b8c-4d99-aa8f-f03338f8d887-combined-ca-bundle\") pod \"ce660c78-0b8c-4d99-aa8f-f03338f8d887\" (UID: \"ce660c78-0b8c-4d99-aa8f-f03338f8d887\") " Feb 02 11:59:09 crc kubenswrapper[4909]: I0202 11:59:09.971778 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce660c78-0b8c-4d99-aa8f-f03338f8d887-kube-api-access-vm2zd" (OuterVolumeSpecName: "kube-api-access-vm2zd") pod "ce660c78-0b8c-4d99-aa8f-f03338f8d887" (UID: "ce660c78-0b8c-4d99-aa8f-f03338f8d887"). InnerVolumeSpecName "kube-api-access-vm2zd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:59:09 crc kubenswrapper[4909]: I0202 11:59:09.992071 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce660c78-0b8c-4d99-aa8f-f03338f8d887-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce660c78-0b8c-4d99-aa8f-f03338f8d887" (UID: "ce660c78-0b8c-4d99-aa8f-f03338f8d887"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.009975 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce660c78-0b8c-4d99-aa8f-f03338f8d887-config-data" (OuterVolumeSpecName: "config-data") pod "ce660c78-0b8c-4d99-aa8f-f03338f8d887" (UID: "ce660c78-0b8c-4d99-aa8f-f03338f8d887"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.069125 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce660c78-0b8c-4d99-aa8f-f03338f8d887-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.069163 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm2zd\" (UniqueName: \"kubernetes.io/projected/ce660c78-0b8c-4d99-aa8f-f03338f8d887-kube-api-access-vm2zd\") on node \"crc\" DevicePath \"\"" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.069179 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce660c78-0b8c-4d99-aa8f-f03338f8d887-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.561038 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2tcsj" event={"ID":"ce660c78-0b8c-4d99-aa8f-f03338f8d887","Type":"ContainerDied","Data":"9ab4dc2e4782e1d15168c74f10b8cfa43c1d857c1a14a3cc82e983d0fa3610a7"} Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.561724 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ab4dc2e4782e1d15168c74f10b8cfa43c1d857c1a14a3cc82e983d0fa3610a7" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.561486 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-2tcsj" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.779273 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65fc77cdfc-77wl4"] Feb 02 11:59:10 crc kubenswrapper[4909]: E0202 11:59:10.779864 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce660c78-0b8c-4d99-aa8f-f03338f8d887" containerName="keystone-db-sync" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.779905 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce660c78-0b8c-4d99-aa8f-f03338f8d887" containerName="keystone-db-sync" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.780088 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce660c78-0b8c-4d99-aa8f-f03338f8d887" containerName="keystone-db-sync" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.781210 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65fc77cdfc-77wl4" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.803698 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65fc77cdfc-77wl4"] Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.834324 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-m5zsd"] Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.835737 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-m5zsd" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.838760 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.839058 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.839362 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.839586 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.839792 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-g8l9j" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.860735 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-m5zsd"] Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.888871 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-config-data\") pod \"keystone-bootstrap-m5zsd\" (UID: \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\") " pod="openstack/keystone-bootstrap-m5zsd" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.889321 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-scripts\") pod \"keystone-bootstrap-m5zsd\" (UID: \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\") " pod="openstack/keystone-bootstrap-m5zsd" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.889419 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/9155aa19-b965-4d0c-900d-053448dbe0af-dns-svc\") pod \"dnsmasq-dns-65fc77cdfc-77wl4\" (UID: \"9155aa19-b965-4d0c-900d-053448dbe0af\") " pod="openstack/dnsmasq-dns-65fc77cdfc-77wl4" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.889543 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-combined-ca-bundle\") pod \"keystone-bootstrap-m5zsd\" (UID: \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\") " pod="openstack/keystone-bootstrap-m5zsd" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.889643 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-credential-keys\") pod \"keystone-bootstrap-m5zsd\" (UID: \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\") " pod="openstack/keystone-bootstrap-m5zsd" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.889797 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9nfl\" (UniqueName: \"kubernetes.io/projected/396df48c-8fae-4e7b-9e40-7e0185de2ac8-kube-api-access-w9nfl\") pod \"keystone-bootstrap-m5zsd\" (UID: \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\") " pod="openstack/keystone-bootstrap-m5zsd" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.889973 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g2nb\" (UniqueName: \"kubernetes.io/projected/9155aa19-b965-4d0c-900d-053448dbe0af-kube-api-access-7g2nb\") pod \"dnsmasq-dns-65fc77cdfc-77wl4\" (UID: \"9155aa19-b965-4d0c-900d-053448dbe0af\") " pod="openstack/dnsmasq-dns-65fc77cdfc-77wl4" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.890062 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-fernet-keys\") pod \"keystone-bootstrap-m5zsd\" (UID: \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\") " pod="openstack/keystone-bootstrap-m5zsd" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.892778 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9155aa19-b965-4d0c-900d-053448dbe0af-config\") pod \"dnsmasq-dns-65fc77cdfc-77wl4\" (UID: \"9155aa19-b965-4d0c-900d-053448dbe0af\") " pod="openstack/dnsmasq-dns-65fc77cdfc-77wl4" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.893078 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9155aa19-b965-4d0c-900d-053448dbe0af-ovsdbserver-sb\") pod \"dnsmasq-dns-65fc77cdfc-77wl4\" (UID: \"9155aa19-b965-4d0c-900d-053448dbe0af\") " pod="openstack/dnsmasq-dns-65fc77cdfc-77wl4" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.893299 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9155aa19-b965-4d0c-900d-053448dbe0af-ovsdbserver-nb\") pod \"dnsmasq-dns-65fc77cdfc-77wl4\" (UID: \"9155aa19-b965-4d0c-900d-053448dbe0af\") " pod="openstack/dnsmasq-dns-65fc77cdfc-77wl4" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.994942 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-scripts\") pod \"keystone-bootstrap-m5zsd\" (UID: \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\") " pod="openstack/keystone-bootstrap-m5zsd" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.995004 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/9155aa19-b965-4d0c-900d-053448dbe0af-dns-svc\") pod \"dnsmasq-dns-65fc77cdfc-77wl4\" (UID: \"9155aa19-b965-4d0c-900d-053448dbe0af\") " pod="openstack/dnsmasq-dns-65fc77cdfc-77wl4" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.995047 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-combined-ca-bundle\") pod \"keystone-bootstrap-m5zsd\" (UID: \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\") " pod="openstack/keystone-bootstrap-m5zsd" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.995065 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-credential-keys\") pod \"keystone-bootstrap-m5zsd\" (UID: \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\") " pod="openstack/keystone-bootstrap-m5zsd" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.995108 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9nfl\" (UniqueName: \"kubernetes.io/projected/396df48c-8fae-4e7b-9e40-7e0185de2ac8-kube-api-access-w9nfl\") pod \"keystone-bootstrap-m5zsd\" (UID: \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\") " pod="openstack/keystone-bootstrap-m5zsd" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.995171 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g2nb\" (UniqueName: \"kubernetes.io/projected/9155aa19-b965-4d0c-900d-053448dbe0af-kube-api-access-7g2nb\") pod \"dnsmasq-dns-65fc77cdfc-77wl4\" (UID: \"9155aa19-b965-4d0c-900d-053448dbe0af\") " pod="openstack/dnsmasq-dns-65fc77cdfc-77wl4" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.995214 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-fernet-keys\") pod \"keystone-bootstrap-m5zsd\" (UID: \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\") " pod="openstack/keystone-bootstrap-m5zsd" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.995250 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9155aa19-b965-4d0c-900d-053448dbe0af-config\") pod \"dnsmasq-dns-65fc77cdfc-77wl4\" (UID: \"9155aa19-b965-4d0c-900d-053448dbe0af\") " pod="openstack/dnsmasq-dns-65fc77cdfc-77wl4" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.995280 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9155aa19-b965-4d0c-900d-053448dbe0af-ovsdbserver-sb\") pod \"dnsmasq-dns-65fc77cdfc-77wl4\" (UID: \"9155aa19-b965-4d0c-900d-053448dbe0af\") " pod="openstack/dnsmasq-dns-65fc77cdfc-77wl4" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.995339 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9155aa19-b965-4d0c-900d-053448dbe0af-ovsdbserver-nb\") pod \"dnsmasq-dns-65fc77cdfc-77wl4\" (UID: \"9155aa19-b965-4d0c-900d-053448dbe0af\") " pod="openstack/dnsmasq-dns-65fc77cdfc-77wl4" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.995364 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-config-data\") pod \"keystone-bootstrap-m5zsd\" (UID: \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\") " pod="openstack/keystone-bootstrap-m5zsd" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.996457 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9155aa19-b965-4d0c-900d-053448dbe0af-dns-svc\") pod \"dnsmasq-dns-65fc77cdfc-77wl4\" (UID: 
\"9155aa19-b965-4d0c-900d-053448dbe0af\") " pod="openstack/dnsmasq-dns-65fc77cdfc-77wl4" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.996487 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9155aa19-b965-4d0c-900d-053448dbe0af-config\") pod \"dnsmasq-dns-65fc77cdfc-77wl4\" (UID: \"9155aa19-b965-4d0c-900d-053448dbe0af\") " pod="openstack/dnsmasq-dns-65fc77cdfc-77wl4" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.996682 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9155aa19-b965-4d0c-900d-053448dbe0af-ovsdbserver-sb\") pod \"dnsmasq-dns-65fc77cdfc-77wl4\" (UID: \"9155aa19-b965-4d0c-900d-053448dbe0af\") " pod="openstack/dnsmasq-dns-65fc77cdfc-77wl4" Feb 02 11:59:10 crc kubenswrapper[4909]: I0202 11:59:10.998498 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9155aa19-b965-4d0c-900d-053448dbe0af-ovsdbserver-nb\") pod \"dnsmasq-dns-65fc77cdfc-77wl4\" (UID: \"9155aa19-b965-4d0c-900d-053448dbe0af\") " pod="openstack/dnsmasq-dns-65fc77cdfc-77wl4" Feb 02 11:59:11 crc kubenswrapper[4909]: I0202 11:59:10.999992 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-credential-keys\") pod \"keystone-bootstrap-m5zsd\" (UID: \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\") " pod="openstack/keystone-bootstrap-m5zsd" Feb 02 11:59:11 crc kubenswrapper[4909]: I0202 11:59:11.000091 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-combined-ca-bundle\") pod \"keystone-bootstrap-m5zsd\" (UID: \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\") " pod="openstack/keystone-bootstrap-m5zsd" Feb 02 11:59:11 crc 
kubenswrapper[4909]: I0202 11:59:11.000289 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-config-data\") pod \"keystone-bootstrap-m5zsd\" (UID: \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\") " pod="openstack/keystone-bootstrap-m5zsd" Feb 02 11:59:11 crc kubenswrapper[4909]: I0202 11:59:11.008019 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-fernet-keys\") pod \"keystone-bootstrap-m5zsd\" (UID: \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\") " pod="openstack/keystone-bootstrap-m5zsd" Feb 02 11:59:11 crc kubenswrapper[4909]: I0202 11:59:11.008348 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-scripts\") pod \"keystone-bootstrap-m5zsd\" (UID: \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\") " pod="openstack/keystone-bootstrap-m5zsd" Feb 02 11:59:11 crc kubenswrapper[4909]: I0202 11:59:11.019529 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g2nb\" (UniqueName: \"kubernetes.io/projected/9155aa19-b965-4d0c-900d-053448dbe0af-kube-api-access-7g2nb\") pod \"dnsmasq-dns-65fc77cdfc-77wl4\" (UID: \"9155aa19-b965-4d0c-900d-053448dbe0af\") " pod="openstack/dnsmasq-dns-65fc77cdfc-77wl4" Feb 02 11:59:11 crc kubenswrapper[4909]: I0202 11:59:11.019530 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9nfl\" (UniqueName: \"kubernetes.io/projected/396df48c-8fae-4e7b-9e40-7e0185de2ac8-kube-api-access-w9nfl\") pod \"keystone-bootstrap-m5zsd\" (UID: \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\") " pod="openstack/keystone-bootstrap-m5zsd" Feb 02 11:59:11 crc kubenswrapper[4909]: I0202 11:59:11.108191 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65fc77cdfc-77wl4" Feb 02 11:59:11 crc kubenswrapper[4909]: I0202 11:59:11.176997 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-m5zsd" Feb 02 11:59:11 crc kubenswrapper[4909]: I0202 11:59:11.622483 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65fc77cdfc-77wl4"] Feb 02 11:59:11 crc kubenswrapper[4909]: I0202 11:59:11.731063 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-m5zsd"] Feb 02 11:59:12 crc kubenswrapper[4909]: I0202 11:59:12.577495 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m5zsd" event={"ID":"396df48c-8fae-4e7b-9e40-7e0185de2ac8","Type":"ContainerStarted","Data":"4b2762d3e81516838c026874fac5de21b6e4eb372af75a58811af530077b7c16"} Feb 02 11:59:12 crc kubenswrapper[4909]: I0202 11:59:12.577956 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m5zsd" event={"ID":"396df48c-8fae-4e7b-9e40-7e0185de2ac8","Type":"ContainerStarted","Data":"2aa0001fe589a0e160968dea1aedaa0a2ae33d7741cd2a44cb136d38817a64d7"} Feb 02 11:59:12 crc kubenswrapper[4909]: I0202 11:59:12.579505 4909 generic.go:334] "Generic (PLEG): container finished" podID="9155aa19-b965-4d0c-900d-053448dbe0af" containerID="177d496d3499a8e2e3d0378841b96fb5a1a89415da4ee397d1d686e910cf3f89" exitCode=0 Feb 02 11:59:12 crc kubenswrapper[4909]: I0202 11:59:12.579542 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65fc77cdfc-77wl4" event={"ID":"9155aa19-b965-4d0c-900d-053448dbe0af","Type":"ContainerDied","Data":"177d496d3499a8e2e3d0378841b96fb5a1a89415da4ee397d1d686e910cf3f89"} Feb 02 11:59:12 crc kubenswrapper[4909]: I0202 11:59:12.579562 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65fc77cdfc-77wl4" 
event={"ID":"9155aa19-b965-4d0c-900d-053448dbe0af","Type":"ContainerStarted","Data":"cdf3001c5f2c70a9ded00d1aca6ec61a01bb3f335733170f6a592128ede2ed8e"} Feb 02 11:59:12 crc kubenswrapper[4909]: I0202 11:59:12.602123 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-m5zsd" podStartSLOduration=2.6021061210000003 podStartE2EDuration="2.602106121s" podCreationTimestamp="2026-02-02 11:59:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:59:12.598638425 +0000 UTC m=+5278.344739160" watchObservedRunningTime="2026-02-02 11:59:12.602106121 +0000 UTC m=+5278.348206876" Feb 02 11:59:13 crc kubenswrapper[4909]: I0202 11:59:13.590206 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65fc77cdfc-77wl4" event={"ID":"9155aa19-b965-4d0c-900d-053448dbe0af","Type":"ContainerStarted","Data":"e50a5af3f8c4c38ccdf23f9f356c02ab53d37a0ec4decd79846bb16017738094"} Feb 02 11:59:13 crc kubenswrapper[4909]: I0202 11:59:13.590531 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-65fc77cdfc-77wl4" Feb 02 11:59:13 crc kubenswrapper[4909]: I0202 11:59:13.607694 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-65fc77cdfc-77wl4" podStartSLOduration=3.607674168 podStartE2EDuration="3.607674168s" podCreationTimestamp="2026-02-02 11:59:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:59:13.60412988 +0000 UTC m=+5279.350230625" watchObservedRunningTime="2026-02-02 11:59:13.607674168 +0000 UTC m=+5279.353774893" Feb 02 11:59:15 crc kubenswrapper[4909]: I0202 11:59:15.605077 4909 generic.go:334] "Generic (PLEG): container finished" podID="396df48c-8fae-4e7b-9e40-7e0185de2ac8" 
containerID="4b2762d3e81516838c026874fac5de21b6e4eb372af75a58811af530077b7c16" exitCode=0 Feb 02 11:59:15 crc kubenswrapper[4909]: I0202 11:59:15.605165 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m5zsd" event={"ID":"396df48c-8fae-4e7b-9e40-7e0185de2ac8","Type":"ContainerDied","Data":"4b2762d3e81516838c026874fac5de21b6e4eb372af75a58811af530077b7c16"} Feb 02 11:59:16 crc kubenswrapper[4909]: I0202 11:59:16.940062 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-m5zsd" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.009921 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-combined-ca-bundle\") pod \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\" (UID: \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\") " Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.009985 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-config-data\") pod \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\" (UID: \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\") " Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.010033 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-credential-keys\") pod \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\" (UID: \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\") " Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.010065 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-fernet-keys\") pod \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\" (UID: \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\") " Feb 02 
11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.010094 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-scripts\") pod \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\" (UID: \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\") " Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.010145 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9nfl\" (UniqueName: \"kubernetes.io/projected/396df48c-8fae-4e7b-9e40-7e0185de2ac8-kube-api-access-w9nfl\") pod \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\" (UID: \"396df48c-8fae-4e7b-9e40-7e0185de2ac8\") " Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.015939 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-scripts" (OuterVolumeSpecName: "scripts") pod "396df48c-8fae-4e7b-9e40-7e0185de2ac8" (UID: "396df48c-8fae-4e7b-9e40-7e0185de2ac8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.016006 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/396df48c-8fae-4e7b-9e40-7e0185de2ac8-kube-api-access-w9nfl" (OuterVolumeSpecName: "kube-api-access-w9nfl") pod "396df48c-8fae-4e7b-9e40-7e0185de2ac8" (UID: "396df48c-8fae-4e7b-9e40-7e0185de2ac8"). InnerVolumeSpecName "kube-api-access-w9nfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.016545 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "396df48c-8fae-4e7b-9e40-7e0185de2ac8" (UID: "396df48c-8fae-4e7b-9e40-7e0185de2ac8"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.016559 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "396df48c-8fae-4e7b-9e40-7e0185de2ac8" (UID: "396df48c-8fae-4e7b-9e40-7e0185de2ac8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.033675 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-config-data" (OuterVolumeSpecName: "config-data") pod "396df48c-8fae-4e7b-9e40-7e0185de2ac8" (UID: "396df48c-8fae-4e7b-9e40-7e0185de2ac8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.042581 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "396df48c-8fae-4e7b-9e40-7e0185de2ac8" (UID: "396df48c-8fae-4e7b-9e40-7e0185de2ac8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.111771 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.111807 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.111865 4909 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.111875 4909 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.111884 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/396df48c-8fae-4e7b-9e40-7e0185de2ac8-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.111895 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9nfl\" (UniqueName: \"kubernetes.io/projected/396df48c-8fae-4e7b-9e40-7e0185de2ac8-kube-api-access-w9nfl\") on node \"crc\" DevicePath \"\"" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.620254 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m5zsd" event={"ID":"396df48c-8fae-4e7b-9e40-7e0185de2ac8","Type":"ContainerDied","Data":"2aa0001fe589a0e160968dea1aedaa0a2ae33d7741cd2a44cb136d38817a64d7"} Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 
11:59:17.620305 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2aa0001fe589a0e160968dea1aedaa0a2ae33d7741cd2a44cb136d38817a64d7" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.620309 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-m5zsd" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.688040 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-m5zsd"] Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.693940 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-m5zsd"] Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.780887 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-js4dl"] Feb 02 11:59:17 crc kubenswrapper[4909]: E0202 11:59:17.781297 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="396df48c-8fae-4e7b-9e40-7e0185de2ac8" containerName="keystone-bootstrap" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.781312 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="396df48c-8fae-4e7b-9e40-7e0185de2ac8" containerName="keystone-bootstrap" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.781494 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="396df48c-8fae-4e7b-9e40-7e0185de2ac8" containerName="keystone-bootstrap" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.782200 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-js4dl" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.793927 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-js4dl"] Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.832024 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.832327 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.832407 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.832506 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.832269 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zdzt\" (UniqueName: \"kubernetes.io/projected/822011f9-e6a3-4a9e-9efd-57a0615fbe69-kube-api-access-8zdzt\") pod \"keystone-bootstrap-js4dl\" (UID: \"822011f9-e6a3-4a9e-9efd-57a0615fbe69\") " pod="openstack/keystone-bootstrap-js4dl" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.832735 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-g8l9j" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.832843 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-combined-ca-bundle\") pod \"keystone-bootstrap-js4dl\" (UID: \"822011f9-e6a3-4a9e-9efd-57a0615fbe69\") " pod="openstack/keystone-bootstrap-js4dl" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.832874 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-fernet-keys\") pod \"keystone-bootstrap-js4dl\" (UID: \"822011f9-e6a3-4a9e-9efd-57a0615fbe69\") " pod="openstack/keystone-bootstrap-js4dl" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.833080 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-config-data\") pod \"keystone-bootstrap-js4dl\" (UID: \"822011f9-e6a3-4a9e-9efd-57a0615fbe69\") " pod="openstack/keystone-bootstrap-js4dl" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.833168 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-credential-keys\") pod \"keystone-bootstrap-js4dl\" (UID: \"822011f9-e6a3-4a9e-9efd-57a0615fbe69\") " pod="openstack/keystone-bootstrap-js4dl" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.833198 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-scripts\") pod \"keystone-bootstrap-js4dl\" (UID: \"822011f9-e6a3-4a9e-9efd-57a0615fbe69\") " pod="openstack/keystone-bootstrap-js4dl" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.934753 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-config-data\") pod \"keystone-bootstrap-js4dl\" (UID: \"822011f9-e6a3-4a9e-9efd-57a0615fbe69\") " pod="openstack/keystone-bootstrap-js4dl" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.934806 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-credential-keys\") pod \"keystone-bootstrap-js4dl\" (UID: \"822011f9-e6a3-4a9e-9efd-57a0615fbe69\") " pod="openstack/keystone-bootstrap-js4dl" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.934872 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-scripts\") pod \"keystone-bootstrap-js4dl\" (UID: \"822011f9-e6a3-4a9e-9efd-57a0615fbe69\") " pod="openstack/keystone-bootstrap-js4dl" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.934909 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zdzt\" (UniqueName: \"kubernetes.io/projected/822011f9-e6a3-4a9e-9efd-57a0615fbe69-kube-api-access-8zdzt\") pod \"keystone-bootstrap-js4dl\" (UID: \"822011f9-e6a3-4a9e-9efd-57a0615fbe69\") " pod="openstack/keystone-bootstrap-js4dl" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.935007 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-combined-ca-bundle\") pod \"keystone-bootstrap-js4dl\" (UID: \"822011f9-e6a3-4a9e-9efd-57a0615fbe69\") " pod="openstack/keystone-bootstrap-js4dl" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.935028 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-fernet-keys\") pod \"keystone-bootstrap-js4dl\" (UID: \"822011f9-e6a3-4a9e-9efd-57a0615fbe69\") " pod="openstack/keystone-bootstrap-js4dl" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.939194 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-scripts\") pod 
\"keystone-bootstrap-js4dl\" (UID: \"822011f9-e6a3-4a9e-9efd-57a0615fbe69\") " pod="openstack/keystone-bootstrap-js4dl" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.941079 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-config-data\") pod \"keystone-bootstrap-js4dl\" (UID: \"822011f9-e6a3-4a9e-9efd-57a0615fbe69\") " pod="openstack/keystone-bootstrap-js4dl" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.942895 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-fernet-keys\") pod \"keystone-bootstrap-js4dl\" (UID: \"822011f9-e6a3-4a9e-9efd-57a0615fbe69\") " pod="openstack/keystone-bootstrap-js4dl" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.943434 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-credential-keys\") pod \"keystone-bootstrap-js4dl\" (UID: \"822011f9-e6a3-4a9e-9efd-57a0615fbe69\") " pod="openstack/keystone-bootstrap-js4dl" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.950415 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-combined-ca-bundle\") pod \"keystone-bootstrap-js4dl\" (UID: \"822011f9-e6a3-4a9e-9efd-57a0615fbe69\") " pod="openstack/keystone-bootstrap-js4dl" Feb 02 11:59:17 crc kubenswrapper[4909]: I0202 11:59:17.954910 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zdzt\" (UniqueName: \"kubernetes.io/projected/822011f9-e6a3-4a9e-9efd-57a0615fbe69-kube-api-access-8zdzt\") pod \"keystone-bootstrap-js4dl\" (UID: \"822011f9-e6a3-4a9e-9efd-57a0615fbe69\") " pod="openstack/keystone-bootstrap-js4dl" Feb 02 
11:59:18 crc kubenswrapper[4909]: I0202 11:59:18.150292 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-js4dl" Feb 02 11:59:18 crc kubenswrapper[4909]: I0202 11:59:18.550220 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-js4dl"] Feb 02 11:59:18 crc kubenswrapper[4909]: W0202 11:59:18.553291 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod822011f9_e6a3_4a9e_9efd_57a0615fbe69.slice/crio-d72eb891f3080eba7600d509771afa7ab24385f781721f90f4358de27d44a91e WatchSource:0}: Error finding container d72eb891f3080eba7600d509771afa7ab24385f781721f90f4358de27d44a91e: Status 404 returned error can't find the container with id d72eb891f3080eba7600d509771afa7ab24385f781721f90f4358de27d44a91e Feb 02 11:59:18 crc kubenswrapper[4909]: I0202 11:59:18.630507 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-js4dl" event={"ID":"822011f9-e6a3-4a9e-9efd-57a0615fbe69","Type":"ContainerStarted","Data":"d72eb891f3080eba7600d509771afa7ab24385f781721f90f4358de27d44a91e"} Feb 02 11:59:19 crc kubenswrapper[4909]: I0202 11:59:19.024965 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="396df48c-8fae-4e7b-9e40-7e0185de2ac8" path="/var/lib/kubelet/pods/396df48c-8fae-4e7b-9e40-7e0185de2ac8/volumes" Feb 02 11:59:19 crc kubenswrapper[4909]: I0202 11:59:19.640190 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-js4dl" event={"ID":"822011f9-e6a3-4a9e-9efd-57a0615fbe69","Type":"ContainerStarted","Data":"656cfb0a9ef48145fe7201d7cc35dc862681571cc74b41c47675bab7c96240a9"} Feb 02 11:59:19 crc kubenswrapper[4909]: I0202 11:59:19.662261 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-js4dl" podStartSLOduration=2.662241656 podStartE2EDuration="2.662241656s" 
podCreationTimestamp="2026-02-02 11:59:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:59:19.661923147 +0000 UTC m=+5285.408023882" watchObservedRunningTime="2026-02-02 11:59:19.662241656 +0000 UTC m=+5285.408342391" Feb 02 11:59:20 crc kubenswrapper[4909]: I0202 11:59:20.719517 4909 scope.go:117] "RemoveContainer" containerID="810c31497df0350394d341a52016e7f54b2e46fbbb38ffebba0774ccbc29b1ff" Feb 02 11:59:20 crc kubenswrapper[4909]: I0202 11:59:20.748000 4909 scope.go:117] "RemoveContainer" containerID="173d4a6d65cb2b93ed6b1337136f8b36a6f1e7de5d5226fa0d8ef32279162cfe" Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.110620 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-65fc77cdfc-77wl4" Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.163375 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f48d44b77-lkn6v"] Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.164040 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f48d44b77-lkn6v" podUID="c3d6e381-a830-4c0f-9f33-294d0c937f1f" containerName="dnsmasq-dns" containerID="cri-o://f325f1f64fd6fd33ea1aead32ff0baeab14f7fe2e130aaf9a40b56c78d83aabb" gracePeriod=10 Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.575242 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f48d44b77-lkn6v" Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.653988 4909 generic.go:334] "Generic (PLEG): container finished" podID="822011f9-e6a3-4a9e-9efd-57a0615fbe69" containerID="656cfb0a9ef48145fe7201d7cc35dc862681571cc74b41c47675bab7c96240a9" exitCode=0 Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.654061 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-js4dl" event={"ID":"822011f9-e6a3-4a9e-9efd-57a0615fbe69","Type":"ContainerDied","Data":"656cfb0a9ef48145fe7201d7cc35dc862681571cc74b41c47675bab7c96240a9"} Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.655645 4909 generic.go:334] "Generic (PLEG): container finished" podID="c3d6e381-a830-4c0f-9f33-294d0c937f1f" containerID="f325f1f64fd6fd33ea1aead32ff0baeab14f7fe2e130aaf9a40b56c78d83aabb" exitCode=0 Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.655676 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f48d44b77-lkn6v" event={"ID":"c3d6e381-a830-4c0f-9f33-294d0c937f1f","Type":"ContainerDied","Data":"f325f1f64fd6fd33ea1aead32ff0baeab14f7fe2e130aaf9a40b56c78d83aabb"} Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.655692 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f48d44b77-lkn6v" Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.655705 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f48d44b77-lkn6v" event={"ID":"c3d6e381-a830-4c0f-9f33-294d0c937f1f","Type":"ContainerDied","Data":"9640480efce321725fe29a36fed2edf866076c5a2dc587212a828965929fe783"} Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.655729 4909 scope.go:117] "RemoveContainer" containerID="f325f1f64fd6fd33ea1aead32ff0baeab14f7fe2e130aaf9a40b56c78d83aabb" Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.679117 4909 scope.go:117] "RemoveContainer" containerID="8d62c73103866c88f5ec23e9d55f9e2cd720095a8ddeab7ae27794fbb420791e" Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.698248 4909 scope.go:117] "RemoveContainer" containerID="f325f1f64fd6fd33ea1aead32ff0baeab14f7fe2e130aaf9a40b56c78d83aabb" Feb 02 11:59:21 crc kubenswrapper[4909]: E0202 11:59:21.698630 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f325f1f64fd6fd33ea1aead32ff0baeab14f7fe2e130aaf9a40b56c78d83aabb\": container with ID starting with f325f1f64fd6fd33ea1aead32ff0baeab14f7fe2e130aaf9a40b56c78d83aabb not found: ID does not exist" containerID="f325f1f64fd6fd33ea1aead32ff0baeab14f7fe2e130aaf9a40b56c78d83aabb" Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.698668 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f325f1f64fd6fd33ea1aead32ff0baeab14f7fe2e130aaf9a40b56c78d83aabb"} err="failed to get container status \"f325f1f64fd6fd33ea1aead32ff0baeab14f7fe2e130aaf9a40b56c78d83aabb\": rpc error: code = NotFound desc = could not find container \"f325f1f64fd6fd33ea1aead32ff0baeab14f7fe2e130aaf9a40b56c78d83aabb\": container with ID starting with f325f1f64fd6fd33ea1aead32ff0baeab14f7fe2e130aaf9a40b56c78d83aabb not found: ID does not exist" Feb 02 11:59:21 crc 
kubenswrapper[4909]: I0202 11:59:21.698689 4909 scope.go:117] "RemoveContainer" containerID="8d62c73103866c88f5ec23e9d55f9e2cd720095a8ddeab7ae27794fbb420791e" Feb 02 11:59:21 crc kubenswrapper[4909]: E0202 11:59:21.699058 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d62c73103866c88f5ec23e9d55f9e2cd720095a8ddeab7ae27794fbb420791e\": container with ID starting with 8d62c73103866c88f5ec23e9d55f9e2cd720095a8ddeab7ae27794fbb420791e not found: ID does not exist" containerID="8d62c73103866c88f5ec23e9d55f9e2cd720095a8ddeab7ae27794fbb420791e" Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.699088 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d62c73103866c88f5ec23e9d55f9e2cd720095a8ddeab7ae27794fbb420791e"} err="failed to get container status \"8d62c73103866c88f5ec23e9d55f9e2cd720095a8ddeab7ae27794fbb420791e\": rpc error: code = NotFound desc = could not find container \"8d62c73103866c88f5ec23e9d55f9e2cd720095a8ddeab7ae27794fbb420791e\": container with ID starting with 8d62c73103866c88f5ec23e9d55f9e2cd720095a8ddeab7ae27794fbb420791e not found: ID does not exist" Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.717630 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3d6e381-a830-4c0f-9f33-294d0c937f1f-dns-svc\") pod \"c3d6e381-a830-4c0f-9f33-294d0c937f1f\" (UID: \"c3d6e381-a830-4c0f-9f33-294d0c937f1f\") " Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.717784 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3d6e381-a830-4c0f-9f33-294d0c937f1f-ovsdbserver-sb\") pod \"c3d6e381-a830-4c0f-9f33-294d0c937f1f\" (UID: \"c3d6e381-a830-4c0f-9f33-294d0c937f1f\") " Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.717864 4909 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3d6e381-a830-4c0f-9f33-294d0c937f1f-ovsdbserver-nb\") pod \"c3d6e381-a830-4c0f-9f33-294d0c937f1f\" (UID: \"c3d6e381-a830-4c0f-9f33-294d0c937f1f\") " Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.717910 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45kfb\" (UniqueName: \"kubernetes.io/projected/c3d6e381-a830-4c0f-9f33-294d0c937f1f-kube-api-access-45kfb\") pod \"c3d6e381-a830-4c0f-9f33-294d0c937f1f\" (UID: \"c3d6e381-a830-4c0f-9f33-294d0c937f1f\") " Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.718549 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3d6e381-a830-4c0f-9f33-294d0c937f1f-config\") pod \"c3d6e381-a830-4c0f-9f33-294d0c937f1f\" (UID: \"c3d6e381-a830-4c0f-9f33-294d0c937f1f\") " Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.722894 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3d6e381-a830-4c0f-9f33-294d0c937f1f-kube-api-access-45kfb" (OuterVolumeSpecName: "kube-api-access-45kfb") pod "c3d6e381-a830-4c0f-9f33-294d0c937f1f" (UID: "c3d6e381-a830-4c0f-9f33-294d0c937f1f"). InnerVolumeSpecName "kube-api-access-45kfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.754396 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3d6e381-a830-4c0f-9f33-294d0c937f1f-config" (OuterVolumeSpecName: "config") pod "c3d6e381-a830-4c0f-9f33-294d0c937f1f" (UID: "c3d6e381-a830-4c0f-9f33-294d0c937f1f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.754965 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3d6e381-a830-4c0f-9f33-294d0c937f1f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c3d6e381-a830-4c0f-9f33-294d0c937f1f" (UID: "c3d6e381-a830-4c0f-9f33-294d0c937f1f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.756638 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3d6e381-a830-4c0f-9f33-294d0c937f1f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c3d6e381-a830-4c0f-9f33-294d0c937f1f" (UID: "c3d6e381-a830-4c0f-9f33-294d0c937f1f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.760625 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3d6e381-a830-4c0f-9f33-294d0c937f1f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c3d6e381-a830-4c0f-9f33-294d0c937f1f" (UID: "c3d6e381-a830-4c0f-9f33-294d0c937f1f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.820706 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3d6e381-a830-4c0f-9f33-294d0c937f1f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.820749 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45kfb\" (UniqueName: \"kubernetes.io/projected/c3d6e381-a830-4c0f-9f33-294d0c937f1f-kube-api-access-45kfb\") on node \"crc\" DevicePath \"\"" Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.820770 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3d6e381-a830-4c0f-9f33-294d0c937f1f-config\") on node \"crc\" DevicePath \"\"" Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.820781 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3d6e381-a830-4c0f-9f33-294d0c937f1f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.820793 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3d6e381-a830-4c0f-9f33-294d0c937f1f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 11:59:21 crc kubenswrapper[4909]: I0202 11:59:21.997093 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f48d44b77-lkn6v"] Feb 02 11:59:22 crc kubenswrapper[4909]: I0202 11:59:22.003282 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f48d44b77-lkn6v"] Feb 02 11:59:22 crc kubenswrapper[4909]: I0202 11:59:22.016436 4909 scope.go:117] "RemoveContainer" containerID="38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83" Feb 02 11:59:22 crc kubenswrapper[4909]: I0202 11:59:22.665529 4909 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"c8a4532cb0e21b24c65ea2a45893f4d1a368d3a1bac20d18498ef08f25007082"} Feb 02 11:59:22 crc kubenswrapper[4909]: I0202 11:59:22.980789 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-js4dl" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.026119 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3d6e381-a830-4c0f-9f33-294d0c937f1f" path="/var/lib/kubelet/pods/c3d6e381-a830-4c0f-9f33-294d0c937f1f/volumes" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.042618 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-combined-ca-bundle\") pod \"822011f9-e6a3-4a9e-9efd-57a0615fbe69\" (UID: \"822011f9-e6a3-4a9e-9efd-57a0615fbe69\") " Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.042701 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-scripts\") pod \"822011f9-e6a3-4a9e-9efd-57a0615fbe69\" (UID: \"822011f9-e6a3-4a9e-9efd-57a0615fbe69\") " Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.042832 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-credential-keys\") pod \"822011f9-e6a3-4a9e-9efd-57a0615fbe69\" (UID: \"822011f9-e6a3-4a9e-9efd-57a0615fbe69\") " Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.042868 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zdzt\" (UniqueName: \"kubernetes.io/projected/822011f9-e6a3-4a9e-9efd-57a0615fbe69-kube-api-access-8zdzt\") pod 
\"822011f9-e6a3-4a9e-9efd-57a0615fbe69\" (UID: \"822011f9-e6a3-4a9e-9efd-57a0615fbe69\") " Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.042894 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-config-data\") pod \"822011f9-e6a3-4a9e-9efd-57a0615fbe69\" (UID: \"822011f9-e6a3-4a9e-9efd-57a0615fbe69\") " Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.042925 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-fernet-keys\") pod \"822011f9-e6a3-4a9e-9efd-57a0615fbe69\" (UID: \"822011f9-e6a3-4a9e-9efd-57a0615fbe69\") " Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.048652 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-scripts" (OuterVolumeSpecName: "scripts") pod "822011f9-e6a3-4a9e-9efd-57a0615fbe69" (UID: "822011f9-e6a3-4a9e-9efd-57a0615fbe69"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.048675 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "822011f9-e6a3-4a9e-9efd-57a0615fbe69" (UID: "822011f9-e6a3-4a9e-9efd-57a0615fbe69"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.063119 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "822011f9-e6a3-4a9e-9efd-57a0615fbe69" (UID: "822011f9-e6a3-4a9e-9efd-57a0615fbe69"). 
InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.064976 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/822011f9-e6a3-4a9e-9efd-57a0615fbe69-kube-api-access-8zdzt" (OuterVolumeSpecName: "kube-api-access-8zdzt") pod "822011f9-e6a3-4a9e-9efd-57a0615fbe69" (UID: "822011f9-e6a3-4a9e-9efd-57a0615fbe69"). InnerVolumeSpecName "kube-api-access-8zdzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.068162 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-config-data" (OuterVolumeSpecName: "config-data") pod "822011f9-e6a3-4a9e-9efd-57a0615fbe69" (UID: "822011f9-e6a3-4a9e-9efd-57a0615fbe69"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.073967 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "822011f9-e6a3-4a9e-9efd-57a0615fbe69" (UID: "822011f9-e6a3-4a9e-9efd-57a0615fbe69"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.144310 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.144333 4909 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.144342 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zdzt\" (UniqueName: \"kubernetes.io/projected/822011f9-e6a3-4a9e-9efd-57a0615fbe69-kube-api-access-8zdzt\") on node \"crc\" DevicePath \"\"" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.144351 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.144361 4909 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.144369 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/822011f9-e6a3-4a9e-9efd-57a0615fbe69-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.674499 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-js4dl" event={"ID":"822011f9-e6a3-4a9e-9efd-57a0615fbe69","Type":"ContainerDied","Data":"d72eb891f3080eba7600d509771afa7ab24385f781721f90f4358de27d44a91e"} Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 
11:59:23.674765 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d72eb891f3080eba7600d509771afa7ab24385f781721f90f4358de27d44a91e" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.674560 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-js4dl" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.758304 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-55fd6bf9fc-5zh8j"] Feb 02 11:59:23 crc kubenswrapper[4909]: E0202 11:59:23.758605 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3d6e381-a830-4c0f-9f33-294d0c937f1f" containerName="init" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.758622 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3d6e381-a830-4c0f-9f33-294d0c937f1f" containerName="init" Feb 02 11:59:23 crc kubenswrapper[4909]: E0202 11:59:23.758637 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="822011f9-e6a3-4a9e-9efd-57a0615fbe69" containerName="keystone-bootstrap" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.758645 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="822011f9-e6a3-4a9e-9efd-57a0615fbe69" containerName="keystone-bootstrap" Feb 02 11:59:23 crc kubenswrapper[4909]: E0202 11:59:23.758659 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3d6e381-a830-4c0f-9f33-294d0c937f1f" containerName="dnsmasq-dns" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.758666 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3d6e381-a830-4c0f-9f33-294d0c937f1f" containerName="dnsmasq-dns" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.758883 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="822011f9-e6a3-4a9e-9efd-57a0615fbe69" containerName="keystone-bootstrap" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.758901 4909 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="c3d6e381-a830-4c0f-9f33-294d0c937f1f" containerName="dnsmasq-dns" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.759382 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-55fd6bf9fc-5zh8j" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.761465 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.761558 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-g8l9j" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.762580 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.762764 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.763127 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.763409 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.772343 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-55fd6bf9fc-5zh8j"] Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.855186 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fb66e6a-8ffd-4afa-9764-42bfb3cef442-scripts\") pod \"keystone-55fd6bf9fc-5zh8j\" (UID: \"4fb66e6a-8ffd-4afa-9764-42bfb3cef442\") " pod="openstack/keystone-55fd6bf9fc-5zh8j" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.855274 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/4fb66e6a-8ffd-4afa-9764-42bfb3cef442-config-data\") pod \"keystone-55fd6bf9fc-5zh8j\" (UID: \"4fb66e6a-8ffd-4afa-9764-42bfb3cef442\") " pod="openstack/keystone-55fd6bf9fc-5zh8j" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.855310 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4fb66e6a-8ffd-4afa-9764-42bfb3cef442-credential-keys\") pod \"keystone-55fd6bf9fc-5zh8j\" (UID: \"4fb66e6a-8ffd-4afa-9764-42bfb3cef442\") " pod="openstack/keystone-55fd6bf9fc-5zh8j" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.855336 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fb66e6a-8ffd-4afa-9764-42bfb3cef442-internal-tls-certs\") pod \"keystone-55fd6bf9fc-5zh8j\" (UID: \"4fb66e6a-8ffd-4afa-9764-42bfb3cef442\") " pod="openstack/keystone-55fd6bf9fc-5zh8j" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.855423 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4fb66e6a-8ffd-4afa-9764-42bfb3cef442-fernet-keys\") pod \"keystone-55fd6bf9fc-5zh8j\" (UID: \"4fb66e6a-8ffd-4afa-9764-42bfb3cef442\") " pod="openstack/keystone-55fd6bf9fc-5zh8j" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.855465 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6jch\" (UniqueName: \"kubernetes.io/projected/4fb66e6a-8ffd-4afa-9764-42bfb3cef442-kube-api-access-k6jch\") pod \"keystone-55fd6bf9fc-5zh8j\" (UID: \"4fb66e6a-8ffd-4afa-9764-42bfb3cef442\") " pod="openstack/keystone-55fd6bf9fc-5zh8j" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.855495 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fb66e6a-8ffd-4afa-9764-42bfb3cef442-public-tls-certs\") pod \"keystone-55fd6bf9fc-5zh8j\" (UID: \"4fb66e6a-8ffd-4afa-9764-42bfb3cef442\") " pod="openstack/keystone-55fd6bf9fc-5zh8j" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.855532 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fb66e6a-8ffd-4afa-9764-42bfb3cef442-combined-ca-bundle\") pod \"keystone-55fd6bf9fc-5zh8j\" (UID: \"4fb66e6a-8ffd-4afa-9764-42bfb3cef442\") " pod="openstack/keystone-55fd6bf9fc-5zh8j" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.957506 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6jch\" (UniqueName: \"kubernetes.io/projected/4fb66e6a-8ffd-4afa-9764-42bfb3cef442-kube-api-access-k6jch\") pod \"keystone-55fd6bf9fc-5zh8j\" (UID: \"4fb66e6a-8ffd-4afa-9764-42bfb3cef442\") " pod="openstack/keystone-55fd6bf9fc-5zh8j" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.957558 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fb66e6a-8ffd-4afa-9764-42bfb3cef442-public-tls-certs\") pod \"keystone-55fd6bf9fc-5zh8j\" (UID: \"4fb66e6a-8ffd-4afa-9764-42bfb3cef442\") " pod="openstack/keystone-55fd6bf9fc-5zh8j" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.957668 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fb66e6a-8ffd-4afa-9764-42bfb3cef442-combined-ca-bundle\") pod \"keystone-55fd6bf9fc-5zh8j\" (UID: \"4fb66e6a-8ffd-4afa-9764-42bfb3cef442\") " pod="openstack/keystone-55fd6bf9fc-5zh8j" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.958517 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4fb66e6a-8ffd-4afa-9764-42bfb3cef442-scripts\") pod \"keystone-55fd6bf9fc-5zh8j\" (UID: \"4fb66e6a-8ffd-4afa-9764-42bfb3cef442\") " pod="openstack/keystone-55fd6bf9fc-5zh8j" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.958591 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fb66e6a-8ffd-4afa-9764-42bfb3cef442-config-data\") pod \"keystone-55fd6bf9fc-5zh8j\" (UID: \"4fb66e6a-8ffd-4afa-9764-42bfb3cef442\") " pod="openstack/keystone-55fd6bf9fc-5zh8j" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.958633 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4fb66e6a-8ffd-4afa-9764-42bfb3cef442-credential-keys\") pod \"keystone-55fd6bf9fc-5zh8j\" (UID: \"4fb66e6a-8ffd-4afa-9764-42bfb3cef442\") " pod="openstack/keystone-55fd6bf9fc-5zh8j" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.958654 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fb66e6a-8ffd-4afa-9764-42bfb3cef442-internal-tls-certs\") pod \"keystone-55fd6bf9fc-5zh8j\" (UID: \"4fb66e6a-8ffd-4afa-9764-42bfb3cef442\") " pod="openstack/keystone-55fd6bf9fc-5zh8j" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.958718 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4fb66e6a-8ffd-4afa-9764-42bfb3cef442-fernet-keys\") pod \"keystone-55fd6bf9fc-5zh8j\" (UID: \"4fb66e6a-8ffd-4afa-9764-42bfb3cef442\") " pod="openstack/keystone-55fd6bf9fc-5zh8j" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.966769 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fb66e6a-8ffd-4afa-9764-42bfb3cef442-config-data\") pod \"keystone-55fd6bf9fc-5zh8j\" (UID: 
\"4fb66e6a-8ffd-4afa-9764-42bfb3cef442\") " pod="openstack/keystone-55fd6bf9fc-5zh8j" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.969538 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fb66e6a-8ffd-4afa-9764-42bfb3cef442-scripts\") pod \"keystone-55fd6bf9fc-5zh8j\" (UID: \"4fb66e6a-8ffd-4afa-9764-42bfb3cef442\") " pod="openstack/keystone-55fd6bf9fc-5zh8j" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.969588 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fb66e6a-8ffd-4afa-9764-42bfb3cef442-public-tls-certs\") pod \"keystone-55fd6bf9fc-5zh8j\" (UID: \"4fb66e6a-8ffd-4afa-9764-42bfb3cef442\") " pod="openstack/keystone-55fd6bf9fc-5zh8j" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.969728 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4fb66e6a-8ffd-4afa-9764-42bfb3cef442-fernet-keys\") pod \"keystone-55fd6bf9fc-5zh8j\" (UID: \"4fb66e6a-8ffd-4afa-9764-42bfb3cef442\") " pod="openstack/keystone-55fd6bf9fc-5zh8j" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.969745 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4fb66e6a-8ffd-4afa-9764-42bfb3cef442-credential-keys\") pod \"keystone-55fd6bf9fc-5zh8j\" (UID: \"4fb66e6a-8ffd-4afa-9764-42bfb3cef442\") " pod="openstack/keystone-55fd6bf9fc-5zh8j" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.971255 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fb66e6a-8ffd-4afa-9764-42bfb3cef442-combined-ca-bundle\") pod \"keystone-55fd6bf9fc-5zh8j\" (UID: \"4fb66e6a-8ffd-4afa-9764-42bfb3cef442\") " pod="openstack/keystone-55fd6bf9fc-5zh8j" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 
11:59:23.973253 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4fb66e6a-8ffd-4afa-9764-42bfb3cef442-internal-tls-certs\") pod \"keystone-55fd6bf9fc-5zh8j\" (UID: \"4fb66e6a-8ffd-4afa-9764-42bfb3cef442\") " pod="openstack/keystone-55fd6bf9fc-5zh8j" Feb 02 11:59:23 crc kubenswrapper[4909]: I0202 11:59:23.984487 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6jch\" (UniqueName: \"kubernetes.io/projected/4fb66e6a-8ffd-4afa-9764-42bfb3cef442-kube-api-access-k6jch\") pod \"keystone-55fd6bf9fc-5zh8j\" (UID: \"4fb66e6a-8ffd-4afa-9764-42bfb3cef442\") " pod="openstack/keystone-55fd6bf9fc-5zh8j" Feb 02 11:59:24 crc kubenswrapper[4909]: I0202 11:59:24.074477 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-55fd6bf9fc-5zh8j" Feb 02 11:59:24 crc kubenswrapper[4909]: I0202 11:59:24.531846 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-55fd6bf9fc-5zh8j"] Feb 02 11:59:24 crc kubenswrapper[4909]: I0202 11:59:24.686083 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55fd6bf9fc-5zh8j" event={"ID":"4fb66e6a-8ffd-4afa-9764-42bfb3cef442","Type":"ContainerStarted","Data":"9b646e3c597cd3f18d25128c2433ee3c48b310c40fcd1759313fffbf5bc8c470"} Feb 02 11:59:25 crc kubenswrapper[4909]: I0202 11:59:25.698617 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55fd6bf9fc-5zh8j" event={"ID":"4fb66e6a-8ffd-4afa-9764-42bfb3cef442","Type":"ContainerStarted","Data":"a8fd31610b0d4422ff522831eb76c9493581721beac441ccc941e32efdb82feb"} Feb 02 11:59:25 crc kubenswrapper[4909]: I0202 11:59:25.699383 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-55fd6bf9fc-5zh8j" Feb 02 11:59:25 crc kubenswrapper[4909]: I0202 11:59:25.740536 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/keystone-55fd6bf9fc-5zh8j" podStartSLOduration=2.740513832 podStartE2EDuration="2.740513832s" podCreationTimestamp="2026-02-02 11:59:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 11:59:25.729890746 +0000 UTC m=+5291.475991711" watchObservedRunningTime="2026-02-02 11:59:25.740513832 +0000 UTC m=+5291.486614567" Feb 02 11:59:26 crc kubenswrapper[4909]: I0202 11:59:26.470945 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7f48d44b77-lkn6v" podUID="c3d6e381-a830-4c0f-9f33-294d0c937f1f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.11:5353: i/o timeout" Feb 02 11:59:55 crc kubenswrapper[4909]: I0202 11:59:55.605121 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-55fd6bf9fc-5zh8j" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.007922 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.009485 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.013190 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-zlrpf" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.013191 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.014457 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.019661 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.045777 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a710e525-42ba-4dd6-baf7-514f315b2c26-openstack-config\") pod \"openstackclient\" (UID: \"a710e525-42ba-4dd6-baf7-514f315b2c26\") " pod="openstack/openstackclient" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.046047 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zpbc\" (UniqueName: \"kubernetes.io/projected/a710e525-42ba-4dd6-baf7-514f315b2c26-kube-api-access-6zpbc\") pod \"openstackclient\" (UID: \"a710e525-42ba-4dd6-baf7-514f315b2c26\") " pod="openstack/openstackclient" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.046143 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a710e525-42ba-4dd6-baf7-514f315b2c26-openstack-config-secret\") pod \"openstackclient\" (UID: \"a710e525-42ba-4dd6-baf7-514f315b2c26\") " pod="openstack/openstackclient" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.046282 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a710e525-42ba-4dd6-baf7-514f315b2c26-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a710e525-42ba-4dd6-baf7-514f315b2c26\") " pod="openstack/openstackclient" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.135822 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500560-vx5lq"] Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.137040 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-vx5lq" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.141155 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.141184 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.148044 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1adc3405-27a3-4632-bc44-bb039b4764f7-secret-volume\") pod \"collect-profiles-29500560-vx5lq\" (UID: \"1adc3405-27a3-4632-bc44-bb039b4764f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-vx5lq" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.148163 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1adc3405-27a3-4632-bc44-bb039b4764f7-config-volume\") pod \"collect-profiles-29500560-vx5lq\" (UID: \"1adc3405-27a3-4632-bc44-bb039b4764f7\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-vx5lq" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.148231 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a710e525-42ba-4dd6-baf7-514f315b2c26-openstack-config\") pod \"openstackclient\" (UID: \"a710e525-42ba-4dd6-baf7-514f315b2c26\") " pod="openstack/openstackclient" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.148285 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zpbc\" (UniqueName: \"kubernetes.io/projected/a710e525-42ba-4dd6-baf7-514f315b2c26-kube-api-access-6zpbc\") pod \"openstackclient\" (UID: \"a710e525-42ba-4dd6-baf7-514f315b2c26\") " pod="openstack/openstackclient" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.148320 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a710e525-42ba-4dd6-baf7-514f315b2c26-openstack-config-secret\") pod \"openstackclient\" (UID: \"a710e525-42ba-4dd6-baf7-514f315b2c26\") " pod="openstack/openstackclient" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.148385 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkq95\" (UniqueName: \"kubernetes.io/projected/1adc3405-27a3-4632-bc44-bb039b4764f7-kube-api-access-fkq95\") pod \"collect-profiles-29500560-vx5lq\" (UID: \"1adc3405-27a3-4632-bc44-bb039b4764f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-vx5lq" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.148478 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a710e525-42ba-4dd6-baf7-514f315b2c26-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a710e525-42ba-4dd6-baf7-514f315b2c26\") " 
pod="openstack/openstackclient" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.149206 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a710e525-42ba-4dd6-baf7-514f315b2c26-openstack-config\") pod \"openstackclient\" (UID: \"a710e525-42ba-4dd6-baf7-514f315b2c26\") " pod="openstack/openstackclient" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.151269 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500560-vx5lq"] Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.157666 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a710e525-42ba-4dd6-baf7-514f315b2c26-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a710e525-42ba-4dd6-baf7-514f315b2c26\") " pod="openstack/openstackclient" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.158695 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a710e525-42ba-4dd6-baf7-514f315b2c26-openstack-config-secret\") pod \"openstackclient\" (UID: \"a710e525-42ba-4dd6-baf7-514f315b2c26\") " pod="openstack/openstackclient" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.171303 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zpbc\" (UniqueName: \"kubernetes.io/projected/a710e525-42ba-4dd6-baf7-514f315b2c26-kube-api-access-6zpbc\") pod \"openstackclient\" (UID: \"a710e525-42ba-4dd6-baf7-514f315b2c26\") " pod="openstack/openstackclient" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.250231 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1adc3405-27a3-4632-bc44-bb039b4764f7-config-volume\") pod \"collect-profiles-29500560-vx5lq\" (UID: 
\"1adc3405-27a3-4632-bc44-bb039b4764f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-vx5lq" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.250339 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkq95\" (UniqueName: \"kubernetes.io/projected/1adc3405-27a3-4632-bc44-bb039b4764f7-kube-api-access-fkq95\") pod \"collect-profiles-29500560-vx5lq\" (UID: \"1adc3405-27a3-4632-bc44-bb039b4764f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-vx5lq" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.250428 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1adc3405-27a3-4632-bc44-bb039b4764f7-secret-volume\") pod \"collect-profiles-29500560-vx5lq\" (UID: \"1adc3405-27a3-4632-bc44-bb039b4764f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-vx5lq" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.251497 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1adc3405-27a3-4632-bc44-bb039b4764f7-config-volume\") pod \"collect-profiles-29500560-vx5lq\" (UID: \"1adc3405-27a3-4632-bc44-bb039b4764f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-vx5lq" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.254641 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1adc3405-27a3-4632-bc44-bb039b4764f7-secret-volume\") pod \"collect-profiles-29500560-vx5lq\" (UID: \"1adc3405-27a3-4632-bc44-bb039b4764f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-vx5lq" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.267249 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkq95\" (UniqueName: 
\"kubernetes.io/projected/1adc3405-27a3-4632-bc44-bb039b4764f7-kube-api-access-fkq95\") pod \"collect-profiles-29500560-vx5lq\" (UID: \"1adc3405-27a3-4632-bc44-bb039b4764f7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-vx5lq" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.333830 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.459225 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-vx5lq" Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.753007 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.859734 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500560-vx5lq"] Feb 02 12:00:00 crc kubenswrapper[4909]: W0202 12:00:00.867487 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1adc3405_27a3_4632_bc44_bb039b4764f7.slice/crio-7534c6d18ea5ab2ee2f85fcf18512d6265819d93ad3e0ec9b05e6439b42c24e9 WatchSource:0}: Error finding container 7534c6d18ea5ab2ee2f85fcf18512d6265819d93ad3e0ec9b05e6439b42c24e9: Status 404 returned error can't find the container with id 7534c6d18ea5ab2ee2f85fcf18512d6265819d93ad3e0ec9b05e6439b42c24e9 Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.948259 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-vx5lq" event={"ID":"1adc3405-27a3-4632-bc44-bb039b4764f7","Type":"ContainerStarted","Data":"7534c6d18ea5ab2ee2f85fcf18512d6265819d93ad3e0ec9b05e6439b42c24e9"} Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.949742 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstackclient" event={"ID":"a710e525-42ba-4dd6-baf7-514f315b2c26","Type":"ContainerStarted","Data":"83f04bf35bbd5c3e95033ce0c843ef9043fd4dfff663e2b71a1315bd36aa94f8"} Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.949775 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a710e525-42ba-4dd6-baf7-514f315b2c26","Type":"ContainerStarted","Data":"e7a7c40ab8e53a22ff953809a325d20b3f25c9f4d3b8e663629fcd21520b639e"} Feb 02 12:00:00 crc kubenswrapper[4909]: I0202 12:00:00.972198 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.972176781 podStartE2EDuration="1.972176781s" podCreationTimestamp="2026-02-02 11:59:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:00:00.965254878 +0000 UTC m=+5326.711355613" watchObservedRunningTime="2026-02-02 12:00:00.972176781 +0000 UTC m=+5326.718277516" Feb 02 12:00:01 crc kubenswrapper[4909]: I0202 12:00:01.958763 4909 generic.go:334] "Generic (PLEG): container finished" podID="1adc3405-27a3-4632-bc44-bb039b4764f7" containerID="5f7f8e5dfb27c1152235d5f36d78d1bb515b4f28f5ea150c9413f7c280ba8024" exitCode=0 Feb 02 12:00:01 crc kubenswrapper[4909]: I0202 12:00:01.958863 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-vx5lq" event={"ID":"1adc3405-27a3-4632-bc44-bb039b4764f7","Type":"ContainerDied","Data":"5f7f8e5dfb27c1152235d5f36d78d1bb515b4f28f5ea150c9413f7c280ba8024"} Feb 02 12:00:03 crc kubenswrapper[4909]: I0202 12:00:03.243777 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-vx5lq" Feb 02 12:00:03 crc kubenswrapper[4909]: I0202 12:00:03.404870 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkq95\" (UniqueName: \"kubernetes.io/projected/1adc3405-27a3-4632-bc44-bb039b4764f7-kube-api-access-fkq95\") pod \"1adc3405-27a3-4632-bc44-bb039b4764f7\" (UID: \"1adc3405-27a3-4632-bc44-bb039b4764f7\") " Feb 02 12:00:03 crc kubenswrapper[4909]: I0202 12:00:03.404910 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1adc3405-27a3-4632-bc44-bb039b4764f7-secret-volume\") pod \"1adc3405-27a3-4632-bc44-bb039b4764f7\" (UID: \"1adc3405-27a3-4632-bc44-bb039b4764f7\") " Feb 02 12:00:03 crc kubenswrapper[4909]: I0202 12:00:03.404974 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1adc3405-27a3-4632-bc44-bb039b4764f7-config-volume\") pod \"1adc3405-27a3-4632-bc44-bb039b4764f7\" (UID: \"1adc3405-27a3-4632-bc44-bb039b4764f7\") " Feb 02 12:00:03 crc kubenswrapper[4909]: I0202 12:00:03.405676 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1adc3405-27a3-4632-bc44-bb039b4764f7-config-volume" (OuterVolumeSpecName: "config-volume") pod "1adc3405-27a3-4632-bc44-bb039b4764f7" (UID: "1adc3405-27a3-4632-bc44-bb039b4764f7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:00:03 crc kubenswrapper[4909]: I0202 12:00:03.410493 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1adc3405-27a3-4632-bc44-bb039b4764f7-kube-api-access-fkq95" (OuterVolumeSpecName: "kube-api-access-fkq95") pod "1adc3405-27a3-4632-bc44-bb039b4764f7" (UID: "1adc3405-27a3-4632-bc44-bb039b4764f7"). 
InnerVolumeSpecName "kube-api-access-fkq95". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:00:03 crc kubenswrapper[4909]: I0202 12:00:03.412994 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1adc3405-27a3-4632-bc44-bb039b4764f7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1adc3405-27a3-4632-bc44-bb039b4764f7" (UID: "1adc3405-27a3-4632-bc44-bb039b4764f7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:00:03 crc kubenswrapper[4909]: I0202 12:00:03.506308 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkq95\" (UniqueName: \"kubernetes.io/projected/1adc3405-27a3-4632-bc44-bb039b4764f7-kube-api-access-fkq95\") on node \"crc\" DevicePath \"\"" Feb 02 12:00:03 crc kubenswrapper[4909]: I0202 12:00:03.506343 4909 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1adc3405-27a3-4632-bc44-bb039b4764f7-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 02 12:00:03 crc kubenswrapper[4909]: I0202 12:00:03.506356 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1adc3405-27a3-4632-bc44-bb039b4764f7-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 12:00:03 crc kubenswrapper[4909]: I0202 12:00:03.973897 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-vx5lq" event={"ID":"1adc3405-27a3-4632-bc44-bb039b4764f7","Type":"ContainerDied","Data":"7534c6d18ea5ab2ee2f85fcf18512d6265819d93ad3e0ec9b05e6439b42c24e9"} Feb 02 12:00:03 crc kubenswrapper[4909]: I0202 12:00:03.973947 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500560-vx5lq" Feb 02 12:00:03 crc kubenswrapper[4909]: I0202 12:00:03.973964 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7534c6d18ea5ab2ee2f85fcf18512d6265819d93ad3e0ec9b05e6439b42c24e9" Feb 02 12:00:04 crc kubenswrapper[4909]: I0202 12:00:04.322226 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500515-8jw4h"] Feb 02 12:00:04 crc kubenswrapper[4909]: I0202 12:00:04.330834 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500515-8jw4h"] Feb 02 12:00:05 crc kubenswrapper[4909]: I0202 12:00:05.030107 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ca8f19f-9709-47f7-8207-7e98f2eec922" path="/var/lib/kubelet/pods/1ca8f19f-9709-47f7-8207-7e98f2eec922/volumes" Feb 02 12:00:20 crc kubenswrapper[4909]: I0202 12:00:20.841928 4909 scope.go:117] "RemoveContainer" containerID="439dc9403e649a83be0f9cae449b08589f73d864776a962e1987a3a3dba74aae" Feb 02 12:01:00 crc kubenswrapper[4909]: I0202 12:01:00.146078 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29500561-sck9k"] Feb 02 12:01:00 crc kubenswrapper[4909]: E0202 12:01:00.147351 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1adc3405-27a3-4632-bc44-bb039b4764f7" containerName="collect-profiles" Feb 02 12:01:00 crc kubenswrapper[4909]: I0202 12:01:00.147369 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1adc3405-27a3-4632-bc44-bb039b4764f7" containerName="collect-profiles" Feb 02 12:01:00 crc kubenswrapper[4909]: I0202 12:01:00.147519 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="1adc3405-27a3-4632-bc44-bb039b4764f7" containerName="collect-profiles" Feb 02 12:01:00 crc kubenswrapper[4909]: I0202 12:01:00.148122 4909 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/keystone-cron-29500561-sck9k" Feb 02 12:01:00 crc kubenswrapper[4909]: I0202 12:01:00.157384 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29500561-sck9k"] Feb 02 12:01:00 crc kubenswrapper[4909]: I0202 12:01:00.255197 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/966a7f05-d7b9-4141-bba9-76b00ac9f4a5-combined-ca-bundle\") pod \"keystone-cron-29500561-sck9k\" (UID: \"966a7f05-d7b9-4141-bba9-76b00ac9f4a5\") " pod="openstack/keystone-cron-29500561-sck9k" Feb 02 12:01:00 crc kubenswrapper[4909]: I0202 12:01:00.255266 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/966a7f05-d7b9-4141-bba9-76b00ac9f4a5-fernet-keys\") pod \"keystone-cron-29500561-sck9k\" (UID: \"966a7f05-d7b9-4141-bba9-76b00ac9f4a5\") " pod="openstack/keystone-cron-29500561-sck9k" Feb 02 12:01:00 crc kubenswrapper[4909]: I0202 12:01:00.255666 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/966a7f05-d7b9-4141-bba9-76b00ac9f4a5-config-data\") pod \"keystone-cron-29500561-sck9k\" (UID: \"966a7f05-d7b9-4141-bba9-76b00ac9f4a5\") " pod="openstack/keystone-cron-29500561-sck9k" Feb 02 12:01:00 crc kubenswrapper[4909]: I0202 12:01:00.255705 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b6tb\" (UniqueName: \"kubernetes.io/projected/966a7f05-d7b9-4141-bba9-76b00ac9f4a5-kube-api-access-4b6tb\") pod \"keystone-cron-29500561-sck9k\" (UID: \"966a7f05-d7b9-4141-bba9-76b00ac9f4a5\") " pod="openstack/keystone-cron-29500561-sck9k" Feb 02 12:01:00 crc kubenswrapper[4909]: I0202 12:01:00.356914 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/966a7f05-d7b9-4141-bba9-76b00ac9f4a5-config-data\") pod \"keystone-cron-29500561-sck9k\" (UID: \"966a7f05-d7b9-4141-bba9-76b00ac9f4a5\") " pod="openstack/keystone-cron-29500561-sck9k" Feb 02 12:01:00 crc kubenswrapper[4909]: I0202 12:01:00.357271 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b6tb\" (UniqueName: \"kubernetes.io/projected/966a7f05-d7b9-4141-bba9-76b00ac9f4a5-kube-api-access-4b6tb\") pod \"keystone-cron-29500561-sck9k\" (UID: \"966a7f05-d7b9-4141-bba9-76b00ac9f4a5\") " pod="openstack/keystone-cron-29500561-sck9k" Feb 02 12:01:00 crc kubenswrapper[4909]: I0202 12:01:00.357471 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/966a7f05-d7b9-4141-bba9-76b00ac9f4a5-combined-ca-bundle\") pod \"keystone-cron-29500561-sck9k\" (UID: \"966a7f05-d7b9-4141-bba9-76b00ac9f4a5\") " pod="openstack/keystone-cron-29500561-sck9k" Feb 02 12:01:00 crc kubenswrapper[4909]: I0202 12:01:00.358030 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/966a7f05-d7b9-4141-bba9-76b00ac9f4a5-fernet-keys\") pod \"keystone-cron-29500561-sck9k\" (UID: \"966a7f05-d7b9-4141-bba9-76b00ac9f4a5\") " pod="openstack/keystone-cron-29500561-sck9k" Feb 02 12:01:00 crc kubenswrapper[4909]: I0202 12:01:00.363558 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/966a7f05-d7b9-4141-bba9-76b00ac9f4a5-fernet-keys\") pod \"keystone-cron-29500561-sck9k\" (UID: \"966a7f05-d7b9-4141-bba9-76b00ac9f4a5\") " pod="openstack/keystone-cron-29500561-sck9k" Feb 02 12:01:00 crc kubenswrapper[4909]: I0202 12:01:00.378341 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/966a7f05-d7b9-4141-bba9-76b00ac9f4a5-combined-ca-bundle\") pod \"keystone-cron-29500561-sck9k\" (UID: \"966a7f05-d7b9-4141-bba9-76b00ac9f4a5\") " pod="openstack/keystone-cron-29500561-sck9k" Feb 02 12:01:00 crc kubenswrapper[4909]: I0202 12:01:00.378452 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b6tb\" (UniqueName: \"kubernetes.io/projected/966a7f05-d7b9-4141-bba9-76b00ac9f4a5-kube-api-access-4b6tb\") pod \"keystone-cron-29500561-sck9k\" (UID: \"966a7f05-d7b9-4141-bba9-76b00ac9f4a5\") " pod="openstack/keystone-cron-29500561-sck9k" Feb 02 12:01:00 crc kubenswrapper[4909]: I0202 12:01:00.378457 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/966a7f05-d7b9-4141-bba9-76b00ac9f4a5-config-data\") pod \"keystone-cron-29500561-sck9k\" (UID: \"966a7f05-d7b9-4141-bba9-76b00ac9f4a5\") " pod="openstack/keystone-cron-29500561-sck9k" Feb 02 12:01:00 crc kubenswrapper[4909]: I0202 12:01:00.471714 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29500561-sck9k" Feb 02 12:01:00 crc kubenswrapper[4909]: I0202 12:01:00.890877 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29500561-sck9k"] Feb 02 12:01:01 crc kubenswrapper[4909]: I0202 12:01:01.411880 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29500561-sck9k" event={"ID":"966a7f05-d7b9-4141-bba9-76b00ac9f4a5","Type":"ContainerStarted","Data":"4a78dab60345506a0bc2534ad5013b7779d2231ef232640aefc31a0f1cfb54ab"} Feb 02 12:01:01 crc kubenswrapper[4909]: I0202 12:01:01.411917 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29500561-sck9k" event={"ID":"966a7f05-d7b9-4141-bba9-76b00ac9f4a5","Type":"ContainerStarted","Data":"e1b5cba79588a68b928804fffe32222a891927e1765a4b7bff9ba438ebd0801c"} Feb 02 12:01:01 crc kubenswrapper[4909]: I0202 12:01:01.432732 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29500561-sck9k" podStartSLOduration=1.432716826 podStartE2EDuration="1.432716826s" podCreationTimestamp="2026-02-02 12:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:01:01.425764932 +0000 UTC m=+5387.171865687" watchObservedRunningTime="2026-02-02 12:01:01.432716826 +0000 UTC m=+5387.178817561" Feb 02 12:01:03 crc kubenswrapper[4909]: I0202 12:01:03.427591 4909 generic.go:334] "Generic (PLEG): container finished" podID="966a7f05-d7b9-4141-bba9-76b00ac9f4a5" containerID="4a78dab60345506a0bc2534ad5013b7779d2231ef232640aefc31a0f1cfb54ab" exitCode=0 Feb 02 12:01:03 crc kubenswrapper[4909]: I0202 12:01:03.427671 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29500561-sck9k" 
event={"ID":"966a7f05-d7b9-4141-bba9-76b00ac9f4a5","Type":"ContainerDied","Data":"4a78dab60345506a0bc2534ad5013b7779d2231ef232640aefc31a0f1cfb54ab"} Feb 02 12:01:04 crc kubenswrapper[4909]: I0202 12:01:04.777532 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29500561-sck9k" Feb 02 12:01:04 crc kubenswrapper[4909]: I0202 12:01:04.848070 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b6tb\" (UniqueName: \"kubernetes.io/projected/966a7f05-d7b9-4141-bba9-76b00ac9f4a5-kube-api-access-4b6tb\") pod \"966a7f05-d7b9-4141-bba9-76b00ac9f4a5\" (UID: \"966a7f05-d7b9-4141-bba9-76b00ac9f4a5\") " Feb 02 12:01:04 crc kubenswrapper[4909]: I0202 12:01:04.848139 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/966a7f05-d7b9-4141-bba9-76b00ac9f4a5-combined-ca-bundle\") pod \"966a7f05-d7b9-4141-bba9-76b00ac9f4a5\" (UID: \"966a7f05-d7b9-4141-bba9-76b00ac9f4a5\") " Feb 02 12:01:04 crc kubenswrapper[4909]: I0202 12:01:04.848248 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/966a7f05-d7b9-4141-bba9-76b00ac9f4a5-fernet-keys\") pod \"966a7f05-d7b9-4141-bba9-76b00ac9f4a5\" (UID: \"966a7f05-d7b9-4141-bba9-76b00ac9f4a5\") " Feb 02 12:01:04 crc kubenswrapper[4909]: I0202 12:01:04.848317 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/966a7f05-d7b9-4141-bba9-76b00ac9f4a5-config-data\") pod \"966a7f05-d7b9-4141-bba9-76b00ac9f4a5\" (UID: \"966a7f05-d7b9-4141-bba9-76b00ac9f4a5\") " Feb 02 12:01:04 crc kubenswrapper[4909]: I0202 12:01:04.857607 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/966a7f05-d7b9-4141-bba9-76b00ac9f4a5-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "966a7f05-d7b9-4141-bba9-76b00ac9f4a5" (UID: "966a7f05-d7b9-4141-bba9-76b00ac9f4a5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:01:04 crc kubenswrapper[4909]: I0202 12:01:04.857960 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/966a7f05-d7b9-4141-bba9-76b00ac9f4a5-kube-api-access-4b6tb" (OuterVolumeSpecName: "kube-api-access-4b6tb") pod "966a7f05-d7b9-4141-bba9-76b00ac9f4a5" (UID: "966a7f05-d7b9-4141-bba9-76b00ac9f4a5"). InnerVolumeSpecName "kube-api-access-4b6tb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:01:04 crc kubenswrapper[4909]: I0202 12:01:04.874195 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/966a7f05-d7b9-4141-bba9-76b00ac9f4a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "966a7f05-d7b9-4141-bba9-76b00ac9f4a5" (UID: "966a7f05-d7b9-4141-bba9-76b00ac9f4a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:01:04 crc kubenswrapper[4909]: I0202 12:01:04.899212 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/966a7f05-d7b9-4141-bba9-76b00ac9f4a5-config-data" (OuterVolumeSpecName: "config-data") pod "966a7f05-d7b9-4141-bba9-76b00ac9f4a5" (UID: "966a7f05-d7b9-4141-bba9-76b00ac9f4a5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:01:04 crc kubenswrapper[4909]: I0202 12:01:04.949489 4909 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/966a7f05-d7b9-4141-bba9-76b00ac9f4a5-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 02 12:01:04 crc kubenswrapper[4909]: I0202 12:01:04.949560 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/966a7f05-d7b9-4141-bba9-76b00ac9f4a5-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:01:04 crc kubenswrapper[4909]: I0202 12:01:04.949573 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b6tb\" (UniqueName: \"kubernetes.io/projected/966a7f05-d7b9-4141-bba9-76b00ac9f4a5-kube-api-access-4b6tb\") on node \"crc\" DevicePath \"\"" Feb 02 12:01:04 crc kubenswrapper[4909]: I0202 12:01:04.949582 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/966a7f05-d7b9-4141-bba9-76b00ac9f4a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:01:05 crc kubenswrapper[4909]: I0202 12:01:05.444276 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29500561-sck9k" event={"ID":"966a7f05-d7b9-4141-bba9-76b00ac9f4a5","Type":"ContainerDied","Data":"e1b5cba79588a68b928804fffe32222a891927e1765a4b7bff9ba438ebd0801c"} Feb 02 12:01:05 crc kubenswrapper[4909]: I0202 12:01:05.444335 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1b5cba79588a68b928804fffe32222a891927e1765a4b7bff9ba438ebd0801c" Feb 02 12:01:05 crc kubenswrapper[4909]: I0202 12:01:05.444367 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29500561-sck9k" Feb 02 12:01:39 crc kubenswrapper[4909]: I0202 12:01:39.574102 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-78x7k"] Feb 02 12:01:39 crc kubenswrapper[4909]: E0202 12:01:39.574834 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="966a7f05-d7b9-4141-bba9-76b00ac9f4a5" containerName="keystone-cron" Feb 02 12:01:39 crc kubenswrapper[4909]: I0202 12:01:39.574846 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="966a7f05-d7b9-4141-bba9-76b00ac9f4a5" containerName="keystone-cron" Feb 02 12:01:39 crc kubenswrapper[4909]: I0202 12:01:39.575004 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="966a7f05-d7b9-4141-bba9-76b00ac9f4a5" containerName="keystone-cron" Feb 02 12:01:39 crc kubenswrapper[4909]: I0202 12:01:39.575495 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-78x7k" Feb 02 12:01:39 crc kubenswrapper[4909]: I0202 12:01:39.584780 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-28c9-account-create-update-rsw9x"] Feb 02 12:01:39 crc kubenswrapper[4909]: I0202 12:01:39.587010 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-28c9-account-create-update-rsw9x" Feb 02 12:01:39 crc kubenswrapper[4909]: I0202 12:01:39.589468 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 02 12:01:39 crc kubenswrapper[4909]: I0202 12:01:39.599931 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-78x7k"] Feb 02 12:01:39 crc kubenswrapper[4909]: I0202 12:01:39.606981 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-28c9-account-create-update-rsw9x"] Feb 02 12:01:39 crc kubenswrapper[4909]: I0202 12:01:39.631971 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e69165bb-109c-41ad-84c0-bb7614862840-operator-scripts\") pod \"barbican-28c9-account-create-update-rsw9x\" (UID: \"e69165bb-109c-41ad-84c0-bb7614862840\") " pod="openstack/barbican-28c9-account-create-update-rsw9x" Feb 02 12:01:39 crc kubenswrapper[4909]: I0202 12:01:39.632016 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9djg\" (UniqueName: \"kubernetes.io/projected/2b9a3fd6-be55-42e2-b83f-077ccb698019-kube-api-access-r9djg\") pod \"barbican-db-create-78x7k\" (UID: \"2b9a3fd6-be55-42e2-b83f-077ccb698019\") " pod="openstack/barbican-db-create-78x7k" Feb 02 12:01:39 crc kubenswrapper[4909]: I0202 12:01:39.632063 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k8tp\" (UniqueName: \"kubernetes.io/projected/e69165bb-109c-41ad-84c0-bb7614862840-kube-api-access-7k8tp\") pod \"barbican-28c9-account-create-update-rsw9x\" (UID: \"e69165bb-109c-41ad-84c0-bb7614862840\") " pod="openstack/barbican-28c9-account-create-update-rsw9x" Feb 02 12:01:39 crc kubenswrapper[4909]: I0202 12:01:39.632340 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b9a3fd6-be55-42e2-b83f-077ccb698019-operator-scripts\") pod \"barbican-db-create-78x7k\" (UID: \"2b9a3fd6-be55-42e2-b83f-077ccb698019\") " pod="openstack/barbican-db-create-78x7k" Feb 02 12:01:39 crc kubenswrapper[4909]: I0202 12:01:39.733871 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e69165bb-109c-41ad-84c0-bb7614862840-operator-scripts\") pod \"barbican-28c9-account-create-update-rsw9x\" (UID: \"e69165bb-109c-41ad-84c0-bb7614862840\") " pod="openstack/barbican-28c9-account-create-update-rsw9x" Feb 02 12:01:39 crc kubenswrapper[4909]: I0202 12:01:39.733924 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9djg\" (UniqueName: \"kubernetes.io/projected/2b9a3fd6-be55-42e2-b83f-077ccb698019-kube-api-access-r9djg\") pod \"barbican-db-create-78x7k\" (UID: \"2b9a3fd6-be55-42e2-b83f-077ccb698019\") " pod="openstack/barbican-db-create-78x7k" Feb 02 12:01:39 crc kubenswrapper[4909]: I0202 12:01:39.733963 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k8tp\" (UniqueName: \"kubernetes.io/projected/e69165bb-109c-41ad-84c0-bb7614862840-kube-api-access-7k8tp\") pod \"barbican-28c9-account-create-update-rsw9x\" (UID: \"e69165bb-109c-41ad-84c0-bb7614862840\") " pod="openstack/barbican-28c9-account-create-update-rsw9x" Feb 02 12:01:39 crc kubenswrapper[4909]: I0202 12:01:39.734009 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b9a3fd6-be55-42e2-b83f-077ccb698019-operator-scripts\") pod \"barbican-db-create-78x7k\" (UID: \"2b9a3fd6-be55-42e2-b83f-077ccb698019\") " pod="openstack/barbican-db-create-78x7k" Feb 02 12:01:39 crc kubenswrapper[4909]: I0202 
12:01:39.735033 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b9a3fd6-be55-42e2-b83f-077ccb698019-operator-scripts\") pod \"barbican-db-create-78x7k\" (UID: \"2b9a3fd6-be55-42e2-b83f-077ccb698019\") " pod="openstack/barbican-db-create-78x7k" Feb 02 12:01:39 crc kubenswrapper[4909]: I0202 12:01:39.735036 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e69165bb-109c-41ad-84c0-bb7614862840-operator-scripts\") pod \"barbican-28c9-account-create-update-rsw9x\" (UID: \"e69165bb-109c-41ad-84c0-bb7614862840\") " pod="openstack/barbican-28c9-account-create-update-rsw9x" Feb 02 12:01:39 crc kubenswrapper[4909]: I0202 12:01:39.752871 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k8tp\" (UniqueName: \"kubernetes.io/projected/e69165bb-109c-41ad-84c0-bb7614862840-kube-api-access-7k8tp\") pod \"barbican-28c9-account-create-update-rsw9x\" (UID: \"e69165bb-109c-41ad-84c0-bb7614862840\") " pod="openstack/barbican-28c9-account-create-update-rsw9x" Feb 02 12:01:39 crc kubenswrapper[4909]: I0202 12:01:39.755326 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9djg\" (UniqueName: \"kubernetes.io/projected/2b9a3fd6-be55-42e2-b83f-077ccb698019-kube-api-access-r9djg\") pod \"barbican-db-create-78x7k\" (UID: \"2b9a3fd6-be55-42e2-b83f-077ccb698019\") " pod="openstack/barbican-db-create-78x7k" Feb 02 12:01:39 crc kubenswrapper[4909]: I0202 12:01:39.915140 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-78x7k" Feb 02 12:01:39 crc kubenswrapper[4909]: I0202 12:01:39.923199 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-28c9-account-create-update-rsw9x" Feb 02 12:01:40 crc kubenswrapper[4909]: I0202 12:01:40.384456 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-28c9-account-create-update-rsw9x"] Feb 02 12:01:40 crc kubenswrapper[4909]: I0202 12:01:40.475158 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-78x7k"] Feb 02 12:01:40 crc kubenswrapper[4909]: I0202 12:01:40.702741 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-28c9-account-create-update-rsw9x" event={"ID":"e69165bb-109c-41ad-84c0-bb7614862840","Type":"ContainerStarted","Data":"3f83b39fe092d064c3cacacc4bf52521ca277d5da3f333ee14e7391c542a6793"} Feb 02 12:01:40 crc kubenswrapper[4909]: I0202 12:01:40.702781 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-28c9-account-create-update-rsw9x" event={"ID":"e69165bb-109c-41ad-84c0-bb7614862840","Type":"ContainerStarted","Data":"0050d17c21a1831c50d984de5bd6af4c6499fb7ab7a500b096fd625f8bea38d4"} Feb 02 12:01:40 crc kubenswrapper[4909]: I0202 12:01:40.705356 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-78x7k" event={"ID":"2b9a3fd6-be55-42e2-b83f-077ccb698019","Type":"ContainerStarted","Data":"4bc2a9d643c3553bf9a59f4de5e4577950491b2d51652c90ef473ee3e1c85397"} Feb 02 12:01:40 crc kubenswrapper[4909]: I0202 12:01:40.705399 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-78x7k" event={"ID":"2b9a3fd6-be55-42e2-b83f-077ccb698019","Type":"ContainerStarted","Data":"176df5bcefa4c169733aeb254e9c3070162b86a568da6a79f5c5027472d61b35"} Feb 02 12:01:40 crc kubenswrapper[4909]: I0202 12:01:40.723551 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-28c9-account-create-update-rsw9x" podStartSLOduration=1.7235333449999999 podStartE2EDuration="1.723533345s" podCreationTimestamp="2026-02-02 
12:01:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:01:40.715969414 +0000 UTC m=+5426.462070149" watchObservedRunningTime="2026-02-02 12:01:40.723533345 +0000 UTC m=+5426.469634080" Feb 02 12:01:40 crc kubenswrapper[4909]: I0202 12:01:40.735187 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-78x7k" podStartSLOduration=1.735169449 podStartE2EDuration="1.735169449s" podCreationTimestamp="2026-02-02 12:01:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:01:40.73235669 +0000 UTC m=+5426.478457435" watchObservedRunningTime="2026-02-02 12:01:40.735169449 +0000 UTC m=+5426.481270184" Feb 02 12:01:41 crc kubenswrapper[4909]: I0202 12:01:41.714141 4909 generic.go:334] "Generic (PLEG): container finished" podID="2b9a3fd6-be55-42e2-b83f-077ccb698019" containerID="4bc2a9d643c3553bf9a59f4de5e4577950491b2d51652c90ef473ee3e1c85397" exitCode=0 Feb 02 12:01:41 crc kubenswrapper[4909]: I0202 12:01:41.714218 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-78x7k" event={"ID":"2b9a3fd6-be55-42e2-b83f-077ccb698019","Type":"ContainerDied","Data":"4bc2a9d643c3553bf9a59f4de5e4577950491b2d51652c90ef473ee3e1c85397"} Feb 02 12:01:41 crc kubenswrapper[4909]: I0202 12:01:41.716190 4909 generic.go:334] "Generic (PLEG): container finished" podID="e69165bb-109c-41ad-84c0-bb7614862840" containerID="3f83b39fe092d064c3cacacc4bf52521ca277d5da3f333ee14e7391c542a6793" exitCode=0 Feb 02 12:01:41 crc kubenswrapper[4909]: I0202 12:01:41.716220 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-28c9-account-create-update-rsw9x" event={"ID":"e69165bb-109c-41ad-84c0-bb7614862840","Type":"ContainerDied","Data":"3f83b39fe092d064c3cacacc4bf52521ca277d5da3f333ee14e7391c542a6793"} 
Feb 02 12:01:43 crc kubenswrapper[4909]: I0202 12:01:43.087451 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-28c9-account-create-update-rsw9x" Feb 02 12:01:43 crc kubenswrapper[4909]: I0202 12:01:43.093338 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-78x7k" Feb 02 12:01:43 crc kubenswrapper[4909]: I0202 12:01:43.197122 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e69165bb-109c-41ad-84c0-bb7614862840-operator-scripts\") pod \"e69165bb-109c-41ad-84c0-bb7614862840\" (UID: \"e69165bb-109c-41ad-84c0-bb7614862840\") " Feb 02 12:01:43 crc kubenswrapper[4909]: I0202 12:01:43.197163 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9djg\" (UniqueName: \"kubernetes.io/projected/2b9a3fd6-be55-42e2-b83f-077ccb698019-kube-api-access-r9djg\") pod \"2b9a3fd6-be55-42e2-b83f-077ccb698019\" (UID: \"2b9a3fd6-be55-42e2-b83f-077ccb698019\") " Feb 02 12:01:43 crc kubenswrapper[4909]: I0202 12:01:43.197211 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b9a3fd6-be55-42e2-b83f-077ccb698019-operator-scripts\") pod \"2b9a3fd6-be55-42e2-b83f-077ccb698019\" (UID: \"2b9a3fd6-be55-42e2-b83f-077ccb698019\") " Feb 02 12:01:43 crc kubenswrapper[4909]: I0202 12:01:43.197233 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k8tp\" (UniqueName: \"kubernetes.io/projected/e69165bb-109c-41ad-84c0-bb7614862840-kube-api-access-7k8tp\") pod \"e69165bb-109c-41ad-84c0-bb7614862840\" (UID: \"e69165bb-109c-41ad-84c0-bb7614862840\") " Feb 02 12:01:43 crc kubenswrapper[4909]: I0202 12:01:43.198273 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/e69165bb-109c-41ad-84c0-bb7614862840-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e69165bb-109c-41ad-84c0-bb7614862840" (UID: "e69165bb-109c-41ad-84c0-bb7614862840"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:01:43 crc kubenswrapper[4909]: I0202 12:01:43.198273 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b9a3fd6-be55-42e2-b83f-077ccb698019-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2b9a3fd6-be55-42e2-b83f-077ccb698019" (UID: "2b9a3fd6-be55-42e2-b83f-077ccb698019"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:01:43 crc kubenswrapper[4909]: I0202 12:01:43.202706 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e69165bb-109c-41ad-84c0-bb7614862840-kube-api-access-7k8tp" (OuterVolumeSpecName: "kube-api-access-7k8tp") pod "e69165bb-109c-41ad-84c0-bb7614862840" (UID: "e69165bb-109c-41ad-84c0-bb7614862840"). InnerVolumeSpecName "kube-api-access-7k8tp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:01:43 crc kubenswrapper[4909]: I0202 12:01:43.202970 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b9a3fd6-be55-42e2-b83f-077ccb698019-kube-api-access-r9djg" (OuterVolumeSpecName: "kube-api-access-r9djg") pod "2b9a3fd6-be55-42e2-b83f-077ccb698019" (UID: "2b9a3fd6-be55-42e2-b83f-077ccb698019"). InnerVolumeSpecName "kube-api-access-r9djg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:01:43 crc kubenswrapper[4909]: I0202 12:01:43.298069 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e69165bb-109c-41ad-84c0-bb7614862840-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:01:43 crc kubenswrapper[4909]: I0202 12:01:43.298102 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9djg\" (UniqueName: \"kubernetes.io/projected/2b9a3fd6-be55-42e2-b83f-077ccb698019-kube-api-access-r9djg\") on node \"crc\" DevicePath \"\"" Feb 02 12:01:43 crc kubenswrapper[4909]: I0202 12:01:43.298114 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b9a3fd6-be55-42e2-b83f-077ccb698019-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:01:43 crc kubenswrapper[4909]: I0202 12:01:43.298123 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k8tp\" (UniqueName: \"kubernetes.io/projected/e69165bb-109c-41ad-84c0-bb7614862840-kube-api-access-7k8tp\") on node \"crc\" DevicePath \"\"" Feb 02 12:01:43 crc kubenswrapper[4909]: I0202 12:01:43.736842 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-28c9-account-create-update-rsw9x" event={"ID":"e69165bb-109c-41ad-84c0-bb7614862840","Type":"ContainerDied","Data":"0050d17c21a1831c50d984de5bd6af4c6499fb7ab7a500b096fd625f8bea38d4"} Feb 02 12:01:43 crc kubenswrapper[4909]: I0202 12:01:43.737162 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0050d17c21a1831c50d984de5bd6af4c6499fb7ab7a500b096fd625f8bea38d4" Feb 02 12:01:43 crc kubenswrapper[4909]: I0202 12:01:43.736873 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-28c9-account-create-update-rsw9x" Feb 02 12:01:43 crc kubenswrapper[4909]: I0202 12:01:43.738456 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-78x7k" event={"ID":"2b9a3fd6-be55-42e2-b83f-077ccb698019","Type":"ContainerDied","Data":"176df5bcefa4c169733aeb254e9c3070162b86a568da6a79f5c5027472d61b35"} Feb 02 12:01:43 crc kubenswrapper[4909]: I0202 12:01:43.738487 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="176df5bcefa4c169733aeb254e9c3070162b86a568da6a79f5c5027472d61b35" Feb 02 12:01:43 crc kubenswrapper[4909]: I0202 12:01:43.738690 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-78x7k" Feb 02 12:01:44 crc kubenswrapper[4909]: I0202 12:01:44.868360 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-46w84"] Feb 02 12:01:44 crc kubenswrapper[4909]: E0202 12:01:44.868721 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9a3fd6-be55-42e2-b83f-077ccb698019" containerName="mariadb-database-create" Feb 02 12:01:44 crc kubenswrapper[4909]: I0202 12:01:44.868735 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9a3fd6-be55-42e2-b83f-077ccb698019" containerName="mariadb-database-create" Feb 02 12:01:44 crc kubenswrapper[4909]: E0202 12:01:44.868769 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e69165bb-109c-41ad-84c0-bb7614862840" containerName="mariadb-account-create-update" Feb 02 12:01:44 crc kubenswrapper[4909]: I0202 12:01:44.868775 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e69165bb-109c-41ad-84c0-bb7614862840" containerName="mariadb-account-create-update" Feb 02 12:01:44 crc kubenswrapper[4909]: I0202 12:01:44.868968 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e69165bb-109c-41ad-84c0-bb7614862840" 
containerName="mariadb-account-create-update" Feb 02 12:01:44 crc kubenswrapper[4909]: I0202 12:01:44.868986 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b9a3fd6-be55-42e2-b83f-077ccb698019" containerName="mariadb-database-create" Feb 02 12:01:44 crc kubenswrapper[4909]: I0202 12:01:44.869517 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-46w84" Feb 02 12:01:44 crc kubenswrapper[4909]: I0202 12:01:44.871950 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-gng9s" Feb 02 12:01:44 crc kubenswrapper[4909]: I0202 12:01:44.872778 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 02 12:01:44 crc kubenswrapper[4909]: I0202 12:01:44.923756 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-46w84"] Feb 02 12:01:45 crc kubenswrapper[4909]: I0202 12:01:45.027521 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0416ea7e-0584-4585-9d94-75f1df10d436-db-sync-config-data\") pod \"barbican-db-sync-46w84\" (UID: \"0416ea7e-0584-4585-9d94-75f1df10d436\") " pod="openstack/barbican-db-sync-46w84" Feb 02 12:01:45 crc kubenswrapper[4909]: I0202 12:01:45.027580 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wp79\" (UniqueName: \"kubernetes.io/projected/0416ea7e-0584-4585-9d94-75f1df10d436-kube-api-access-5wp79\") pod \"barbican-db-sync-46w84\" (UID: \"0416ea7e-0584-4585-9d94-75f1df10d436\") " pod="openstack/barbican-db-sync-46w84" Feb 02 12:01:45 crc kubenswrapper[4909]: I0202 12:01:45.027707 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0416ea7e-0584-4585-9d94-75f1df10d436-combined-ca-bundle\") pod \"barbican-db-sync-46w84\" (UID: \"0416ea7e-0584-4585-9d94-75f1df10d436\") " pod="openstack/barbican-db-sync-46w84" Feb 02 12:01:45 crc kubenswrapper[4909]: I0202 12:01:45.128875 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wp79\" (UniqueName: \"kubernetes.io/projected/0416ea7e-0584-4585-9d94-75f1df10d436-kube-api-access-5wp79\") pod \"barbican-db-sync-46w84\" (UID: \"0416ea7e-0584-4585-9d94-75f1df10d436\") " pod="openstack/barbican-db-sync-46w84" Feb 02 12:01:45 crc kubenswrapper[4909]: I0202 12:01:45.128993 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0416ea7e-0584-4585-9d94-75f1df10d436-combined-ca-bundle\") pod \"barbican-db-sync-46w84\" (UID: \"0416ea7e-0584-4585-9d94-75f1df10d436\") " pod="openstack/barbican-db-sync-46w84" Feb 02 12:01:45 crc kubenswrapper[4909]: I0202 12:01:45.129160 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0416ea7e-0584-4585-9d94-75f1df10d436-db-sync-config-data\") pod \"barbican-db-sync-46w84\" (UID: \"0416ea7e-0584-4585-9d94-75f1df10d436\") " pod="openstack/barbican-db-sync-46w84" Feb 02 12:01:45 crc kubenswrapper[4909]: I0202 12:01:45.136185 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0416ea7e-0584-4585-9d94-75f1df10d436-combined-ca-bundle\") pod \"barbican-db-sync-46w84\" (UID: \"0416ea7e-0584-4585-9d94-75f1df10d436\") " pod="openstack/barbican-db-sync-46w84" Feb 02 12:01:45 crc kubenswrapper[4909]: I0202 12:01:45.136956 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0416ea7e-0584-4585-9d94-75f1df10d436-db-sync-config-data\") pod 
\"barbican-db-sync-46w84\" (UID: \"0416ea7e-0584-4585-9d94-75f1df10d436\") " pod="openstack/barbican-db-sync-46w84" Feb 02 12:01:45 crc kubenswrapper[4909]: I0202 12:01:45.150765 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wp79\" (UniqueName: \"kubernetes.io/projected/0416ea7e-0584-4585-9d94-75f1df10d436-kube-api-access-5wp79\") pod \"barbican-db-sync-46w84\" (UID: \"0416ea7e-0584-4585-9d94-75f1df10d436\") " pod="openstack/barbican-db-sync-46w84" Feb 02 12:01:45 crc kubenswrapper[4909]: I0202 12:01:45.188017 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-46w84" Feb 02 12:01:45 crc kubenswrapper[4909]: I0202 12:01:45.634425 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-46w84"] Feb 02 12:01:45 crc kubenswrapper[4909]: I0202 12:01:45.754416 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-46w84" event={"ID":"0416ea7e-0584-4585-9d94-75f1df10d436","Type":"ContainerStarted","Data":"d8757547f1d40878c65b84d3f726d6ca99910624b9b8212e53696bc98589c6fb"} Feb 02 12:01:46 crc kubenswrapper[4909]: I0202 12:01:46.763412 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-46w84" event={"ID":"0416ea7e-0584-4585-9d94-75f1df10d436","Type":"ContainerStarted","Data":"aef0d7739f5850b7efa2c6deaa88d3aa390e0403fd0f633df0b8e60d8ff2d7ec"} Feb 02 12:01:46 crc kubenswrapper[4909]: I0202 12:01:46.782208 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-46w84" podStartSLOduration=2.782188596 podStartE2EDuration="2.782188596s" podCreationTimestamp="2026-02-02 12:01:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:01:46.778441512 +0000 UTC m=+5432.524542257" watchObservedRunningTime="2026-02-02 12:01:46.782188596 +0000 UTC 
m=+5432.528289331" Feb 02 12:01:47 crc kubenswrapper[4909]: I0202 12:01:47.774265 4909 generic.go:334] "Generic (PLEG): container finished" podID="0416ea7e-0584-4585-9d94-75f1df10d436" containerID="aef0d7739f5850b7efa2c6deaa88d3aa390e0403fd0f633df0b8e60d8ff2d7ec" exitCode=0 Feb 02 12:01:47 crc kubenswrapper[4909]: I0202 12:01:47.774335 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-46w84" event={"ID":"0416ea7e-0584-4585-9d94-75f1df10d436","Type":"ContainerDied","Data":"aef0d7739f5850b7efa2c6deaa88d3aa390e0403fd0f633df0b8e60d8ff2d7ec"} Feb 02 12:01:49 crc kubenswrapper[4909]: I0202 12:01:49.113564 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-46w84" Feb 02 12:01:49 crc kubenswrapper[4909]: I0202 12:01:49.199600 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0416ea7e-0584-4585-9d94-75f1df10d436-combined-ca-bundle\") pod \"0416ea7e-0584-4585-9d94-75f1df10d436\" (UID: \"0416ea7e-0584-4585-9d94-75f1df10d436\") " Feb 02 12:01:49 crc kubenswrapper[4909]: I0202 12:01:49.199739 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0416ea7e-0584-4585-9d94-75f1df10d436-db-sync-config-data\") pod \"0416ea7e-0584-4585-9d94-75f1df10d436\" (UID: \"0416ea7e-0584-4585-9d94-75f1df10d436\") " Feb 02 12:01:49 crc kubenswrapper[4909]: I0202 12:01:49.199840 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wp79\" (UniqueName: \"kubernetes.io/projected/0416ea7e-0584-4585-9d94-75f1df10d436-kube-api-access-5wp79\") pod \"0416ea7e-0584-4585-9d94-75f1df10d436\" (UID: \"0416ea7e-0584-4585-9d94-75f1df10d436\") " Feb 02 12:01:49 crc kubenswrapper[4909]: I0202 12:01:49.204566 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/0416ea7e-0584-4585-9d94-75f1df10d436-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0416ea7e-0584-4585-9d94-75f1df10d436" (UID: "0416ea7e-0584-4585-9d94-75f1df10d436"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:01:49 crc kubenswrapper[4909]: I0202 12:01:49.205103 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0416ea7e-0584-4585-9d94-75f1df10d436-kube-api-access-5wp79" (OuterVolumeSpecName: "kube-api-access-5wp79") pod "0416ea7e-0584-4585-9d94-75f1df10d436" (UID: "0416ea7e-0584-4585-9d94-75f1df10d436"). InnerVolumeSpecName "kube-api-access-5wp79". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:01:49 crc kubenswrapper[4909]: I0202 12:01:49.220444 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0416ea7e-0584-4585-9d94-75f1df10d436-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0416ea7e-0584-4585-9d94-75f1df10d436" (UID: "0416ea7e-0584-4585-9d94-75f1df10d436"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:01:49 crc kubenswrapper[4909]: I0202 12:01:49.302233 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wp79\" (UniqueName: \"kubernetes.io/projected/0416ea7e-0584-4585-9d94-75f1df10d436-kube-api-access-5wp79\") on node \"crc\" DevicePath \"\"" Feb 02 12:01:49 crc kubenswrapper[4909]: I0202 12:01:49.302513 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0416ea7e-0584-4585-9d94-75f1df10d436-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:01:49 crc kubenswrapper[4909]: I0202 12:01:49.302606 4909 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0416ea7e-0584-4585-9d94-75f1df10d436-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:01:49 crc kubenswrapper[4909]: I0202 12:01:49.511372 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:01:49 crc kubenswrapper[4909]: I0202 12:01:49.511438 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:01:49 crc kubenswrapper[4909]: I0202 12:01:49.791837 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-46w84" event={"ID":"0416ea7e-0584-4585-9d94-75f1df10d436","Type":"ContainerDied","Data":"d8757547f1d40878c65b84d3f726d6ca99910624b9b8212e53696bc98589c6fb"} Feb 02 12:01:49 crc kubenswrapper[4909]: I0202 12:01:49.791873 
4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8757547f1d40878c65b84d3f726d6ca99910624b9b8212e53696bc98589c6fb" Feb 02 12:01:49 crc kubenswrapper[4909]: I0202 12:01:49.791920 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-46w84" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.013332 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6ccd96d447-czqpq"] Feb 02 12:01:50 crc kubenswrapper[4909]: E0202 12:01:50.013745 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0416ea7e-0584-4585-9d94-75f1df10d436" containerName="barbican-db-sync" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.013771 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0416ea7e-0584-4585-9d94-75f1df10d436" containerName="barbican-db-sync" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.014017 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="0416ea7e-0584-4585-9d94-75f1df10d436" containerName="barbican-db-sync" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.015696 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6ccd96d447-czqpq" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.022086 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.022294 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.022408 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-gng9s" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.030275 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-bcfc899fd-2xfv2"] Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.031759 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-bcfc899fd-2xfv2" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.041684 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.047886 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6ccd96d447-czqpq"] Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.060483 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-bcfc899fd-2xfv2"] Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.113848 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzv8h\" (UniqueName: \"kubernetes.io/projected/adddacaf-04be-4f50-9300-76a900e90318-kube-api-access-kzv8h\") pod \"barbican-worker-6ccd96d447-czqpq\" (UID: \"adddacaf-04be-4f50-9300-76a900e90318\") " pod="openstack/barbican-worker-6ccd96d447-czqpq" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.113898 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/adddacaf-04be-4f50-9300-76a900e90318-config-data-custom\") pod \"barbican-worker-6ccd96d447-czqpq\" (UID: \"adddacaf-04be-4f50-9300-76a900e90318\") " pod="openstack/barbican-worker-6ccd96d447-czqpq" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.113918 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adddacaf-04be-4f50-9300-76a900e90318-combined-ca-bundle\") pod \"barbican-worker-6ccd96d447-czqpq\" (UID: \"adddacaf-04be-4f50-9300-76a900e90318\") " pod="openstack/barbican-worker-6ccd96d447-czqpq" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.113934 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b584395d-e8d0-4fab-8059-e9cc92550805-config-data\") pod \"barbican-keystone-listener-bcfc899fd-2xfv2\" (UID: \"b584395d-e8d0-4fab-8059-e9cc92550805\") " pod="openstack/barbican-keystone-listener-bcfc899fd-2xfv2" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.113956 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adddacaf-04be-4f50-9300-76a900e90318-config-data\") pod \"barbican-worker-6ccd96d447-czqpq\" (UID: \"adddacaf-04be-4f50-9300-76a900e90318\") " pod="openstack/barbican-worker-6ccd96d447-czqpq" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.113981 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b584395d-e8d0-4fab-8059-e9cc92550805-config-data-custom\") pod \"barbican-keystone-listener-bcfc899fd-2xfv2\" (UID: \"b584395d-e8d0-4fab-8059-e9cc92550805\") " 
pod="openstack/barbican-keystone-listener-bcfc899fd-2xfv2" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.114027 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adddacaf-04be-4f50-9300-76a900e90318-logs\") pod \"barbican-worker-6ccd96d447-czqpq\" (UID: \"adddacaf-04be-4f50-9300-76a900e90318\") " pod="openstack/barbican-worker-6ccd96d447-czqpq" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.114052 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vr5d\" (UniqueName: \"kubernetes.io/projected/b584395d-e8d0-4fab-8059-e9cc92550805-kube-api-access-5vr5d\") pod \"barbican-keystone-listener-bcfc899fd-2xfv2\" (UID: \"b584395d-e8d0-4fab-8059-e9cc92550805\") " pod="openstack/barbican-keystone-listener-bcfc899fd-2xfv2" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.114076 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b584395d-e8d0-4fab-8059-e9cc92550805-logs\") pod \"barbican-keystone-listener-bcfc899fd-2xfv2\" (UID: \"b584395d-e8d0-4fab-8059-e9cc92550805\") " pod="openstack/barbican-keystone-listener-bcfc899fd-2xfv2" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.114129 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b584395d-e8d0-4fab-8059-e9cc92550805-combined-ca-bundle\") pod \"barbican-keystone-listener-bcfc899fd-2xfv2\" (UID: \"b584395d-e8d0-4fab-8059-e9cc92550805\") " pod="openstack/barbican-keystone-listener-bcfc899fd-2xfv2" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.116111 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75dd8f6b7-gp78r"] Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.119176 4909 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75dd8f6b7-gp78r" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.152156 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75dd8f6b7-gp78r"] Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.215620 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/adddacaf-04be-4f50-9300-76a900e90318-config-data-custom\") pod \"barbican-worker-6ccd96d447-czqpq\" (UID: \"adddacaf-04be-4f50-9300-76a900e90318\") " pod="openstack/barbican-worker-6ccd96d447-czqpq" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.215980 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/940bab09-bdad-49c2-b9f5-8326b62f98be-ovsdbserver-nb\") pod \"dnsmasq-dns-75dd8f6b7-gp78r\" (UID: \"940bab09-bdad-49c2-b9f5-8326b62f98be\") " pod="openstack/dnsmasq-dns-75dd8f6b7-gp78r" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.216007 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adddacaf-04be-4f50-9300-76a900e90318-combined-ca-bundle\") pod \"barbican-worker-6ccd96d447-czqpq\" (UID: \"adddacaf-04be-4f50-9300-76a900e90318\") " pod="openstack/barbican-worker-6ccd96d447-czqpq" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.216025 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b584395d-e8d0-4fab-8059-e9cc92550805-config-data\") pod \"barbican-keystone-listener-bcfc899fd-2xfv2\" (UID: \"b584395d-e8d0-4fab-8059-e9cc92550805\") " pod="openstack/barbican-keystone-listener-bcfc899fd-2xfv2" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.216050 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adddacaf-04be-4f50-9300-76a900e90318-config-data\") pod \"barbican-worker-6ccd96d447-czqpq\" (UID: \"adddacaf-04be-4f50-9300-76a900e90318\") " pod="openstack/barbican-worker-6ccd96d447-czqpq" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.216075 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b584395d-e8d0-4fab-8059-e9cc92550805-config-data-custom\") pod \"barbican-keystone-listener-bcfc899fd-2xfv2\" (UID: \"b584395d-e8d0-4fab-8059-e9cc92550805\") " pod="openstack/barbican-keystone-listener-bcfc899fd-2xfv2" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.216115 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/940bab09-bdad-49c2-b9f5-8326b62f98be-dns-svc\") pod \"dnsmasq-dns-75dd8f6b7-gp78r\" (UID: \"940bab09-bdad-49c2-b9f5-8326b62f98be\") " pod="openstack/dnsmasq-dns-75dd8f6b7-gp78r" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.216134 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adddacaf-04be-4f50-9300-76a900e90318-logs\") pod \"barbican-worker-6ccd96d447-czqpq\" (UID: \"adddacaf-04be-4f50-9300-76a900e90318\") " pod="openstack/barbican-worker-6ccd96d447-czqpq" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.216156 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/940bab09-bdad-49c2-b9f5-8326b62f98be-config\") pod \"dnsmasq-dns-75dd8f6b7-gp78r\" (UID: \"940bab09-bdad-49c2-b9f5-8326b62f98be\") " pod="openstack/dnsmasq-dns-75dd8f6b7-gp78r" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.216180 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-5vr5d\" (UniqueName: \"kubernetes.io/projected/b584395d-e8d0-4fab-8059-e9cc92550805-kube-api-access-5vr5d\") pod \"barbican-keystone-listener-bcfc899fd-2xfv2\" (UID: \"b584395d-e8d0-4fab-8059-e9cc92550805\") " pod="openstack/barbican-keystone-listener-bcfc899fd-2xfv2" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.216197 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b584395d-e8d0-4fab-8059-e9cc92550805-logs\") pod \"barbican-keystone-listener-bcfc899fd-2xfv2\" (UID: \"b584395d-e8d0-4fab-8059-e9cc92550805\") " pod="openstack/barbican-keystone-listener-bcfc899fd-2xfv2" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.216237 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b584395d-e8d0-4fab-8059-e9cc92550805-combined-ca-bundle\") pod \"barbican-keystone-listener-bcfc899fd-2xfv2\" (UID: \"b584395d-e8d0-4fab-8059-e9cc92550805\") " pod="openstack/barbican-keystone-listener-bcfc899fd-2xfv2" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.216260 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49tzp\" (UniqueName: \"kubernetes.io/projected/940bab09-bdad-49c2-b9f5-8326b62f98be-kube-api-access-49tzp\") pod \"dnsmasq-dns-75dd8f6b7-gp78r\" (UID: \"940bab09-bdad-49c2-b9f5-8326b62f98be\") " pod="openstack/dnsmasq-dns-75dd8f6b7-gp78r" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.216297 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzv8h\" (UniqueName: \"kubernetes.io/projected/adddacaf-04be-4f50-9300-76a900e90318-kube-api-access-kzv8h\") pod \"barbican-worker-6ccd96d447-czqpq\" (UID: \"adddacaf-04be-4f50-9300-76a900e90318\") " pod="openstack/barbican-worker-6ccd96d447-czqpq" Feb 02 12:01:50 crc kubenswrapper[4909]: 
I0202 12:01:50.216317 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/940bab09-bdad-49c2-b9f5-8326b62f98be-ovsdbserver-sb\") pod \"dnsmasq-dns-75dd8f6b7-gp78r\" (UID: \"940bab09-bdad-49c2-b9f5-8326b62f98be\") " pod="openstack/dnsmasq-dns-75dd8f6b7-gp78r" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.217044 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adddacaf-04be-4f50-9300-76a900e90318-logs\") pod \"barbican-worker-6ccd96d447-czqpq\" (UID: \"adddacaf-04be-4f50-9300-76a900e90318\") " pod="openstack/barbican-worker-6ccd96d447-czqpq" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.218066 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b584395d-e8d0-4fab-8059-e9cc92550805-logs\") pod \"barbican-keystone-listener-bcfc899fd-2xfv2\" (UID: \"b584395d-e8d0-4fab-8059-e9cc92550805\") " pod="openstack/barbican-keystone-listener-bcfc899fd-2xfv2" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.220134 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/adddacaf-04be-4f50-9300-76a900e90318-config-data-custom\") pod \"barbican-worker-6ccd96d447-czqpq\" (UID: \"adddacaf-04be-4f50-9300-76a900e90318\") " pod="openstack/barbican-worker-6ccd96d447-czqpq" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.220996 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adddacaf-04be-4f50-9300-76a900e90318-combined-ca-bundle\") pod \"barbican-worker-6ccd96d447-czqpq\" (UID: \"adddacaf-04be-4f50-9300-76a900e90318\") " pod="openstack/barbican-worker-6ccd96d447-czqpq" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.223038 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adddacaf-04be-4f50-9300-76a900e90318-config-data\") pod \"barbican-worker-6ccd96d447-czqpq\" (UID: \"adddacaf-04be-4f50-9300-76a900e90318\") " pod="openstack/barbican-worker-6ccd96d447-czqpq" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.223339 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b584395d-e8d0-4fab-8059-e9cc92550805-config-data\") pod \"barbican-keystone-listener-bcfc899fd-2xfv2\" (UID: \"b584395d-e8d0-4fab-8059-e9cc92550805\") " pod="openstack/barbican-keystone-listener-bcfc899fd-2xfv2" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.223963 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b584395d-e8d0-4fab-8059-e9cc92550805-config-data-custom\") pod \"barbican-keystone-listener-bcfc899fd-2xfv2\" (UID: \"b584395d-e8d0-4fab-8059-e9cc92550805\") " pod="openstack/barbican-keystone-listener-bcfc899fd-2xfv2" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.236427 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b584395d-e8d0-4fab-8059-e9cc92550805-combined-ca-bundle\") pod \"barbican-keystone-listener-bcfc899fd-2xfv2\" (UID: \"b584395d-e8d0-4fab-8059-e9cc92550805\") " pod="openstack/barbican-keystone-listener-bcfc899fd-2xfv2" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.242587 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vr5d\" (UniqueName: \"kubernetes.io/projected/b584395d-e8d0-4fab-8059-e9cc92550805-kube-api-access-5vr5d\") pod \"barbican-keystone-listener-bcfc899fd-2xfv2\" (UID: \"b584395d-e8d0-4fab-8059-e9cc92550805\") " pod="openstack/barbican-keystone-listener-bcfc899fd-2xfv2" Feb 02 12:01:50 crc 
kubenswrapper[4909]: I0202 12:01:50.253058 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzv8h\" (UniqueName: \"kubernetes.io/projected/adddacaf-04be-4f50-9300-76a900e90318-kube-api-access-kzv8h\") pod \"barbican-worker-6ccd96d447-czqpq\" (UID: \"adddacaf-04be-4f50-9300-76a900e90318\") " pod="openstack/barbican-worker-6ccd96d447-czqpq" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.259523 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-84d4644d64-2k8dn"] Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.260830 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-84d4644d64-2k8dn" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.271705 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-84d4644d64-2k8dn"] Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.274117 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.317589 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49tzp\" (UniqueName: \"kubernetes.io/projected/940bab09-bdad-49c2-b9f5-8326b62f98be-kube-api-access-49tzp\") pod \"dnsmasq-dns-75dd8f6b7-gp78r\" (UID: \"940bab09-bdad-49c2-b9f5-8326b62f98be\") " pod="openstack/dnsmasq-dns-75dd8f6b7-gp78r" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.317639 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b1b64ca-849b-4264-83ca-84af3d522297-config-data\") pod \"barbican-api-84d4644d64-2k8dn\" (UID: \"0b1b64ca-849b-4264-83ca-84af3d522297\") " pod="openstack/barbican-api-84d4644d64-2k8dn" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.317669 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b1b64ca-849b-4264-83ca-84af3d522297-config-data-custom\") pod \"barbican-api-84d4644d64-2k8dn\" (UID: \"0b1b64ca-849b-4264-83ca-84af3d522297\") " pod="openstack/barbican-api-84d4644d64-2k8dn" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.317695 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/940bab09-bdad-49c2-b9f5-8326b62f98be-ovsdbserver-sb\") pod \"dnsmasq-dns-75dd8f6b7-gp78r\" (UID: \"940bab09-bdad-49c2-b9f5-8326b62f98be\") " pod="openstack/dnsmasq-dns-75dd8f6b7-gp78r" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.317710 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b1b64ca-849b-4264-83ca-84af3d522297-logs\") pod \"barbican-api-84d4644d64-2k8dn\" (UID: \"0b1b64ca-849b-4264-83ca-84af3d522297\") " pod="openstack/barbican-api-84d4644d64-2k8dn" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.317728 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/940bab09-bdad-49c2-b9f5-8326b62f98be-ovsdbserver-nb\") pod \"dnsmasq-dns-75dd8f6b7-gp78r\" (UID: \"940bab09-bdad-49c2-b9f5-8326b62f98be\") " pod="openstack/dnsmasq-dns-75dd8f6b7-gp78r" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.317749 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls87r\" (UniqueName: \"kubernetes.io/projected/0b1b64ca-849b-4264-83ca-84af3d522297-kube-api-access-ls87r\") pod \"barbican-api-84d4644d64-2k8dn\" (UID: \"0b1b64ca-849b-4264-83ca-84af3d522297\") " pod="openstack/barbican-api-84d4644d64-2k8dn" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.317789 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b1b64ca-849b-4264-83ca-84af3d522297-combined-ca-bundle\") pod \"barbican-api-84d4644d64-2k8dn\" (UID: \"0b1b64ca-849b-4264-83ca-84af3d522297\") " pod="openstack/barbican-api-84d4644d64-2k8dn" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.317851 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/940bab09-bdad-49c2-b9f5-8326b62f98be-dns-svc\") pod \"dnsmasq-dns-75dd8f6b7-gp78r\" (UID: \"940bab09-bdad-49c2-b9f5-8326b62f98be\") " pod="openstack/dnsmasq-dns-75dd8f6b7-gp78r" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.317874 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/940bab09-bdad-49c2-b9f5-8326b62f98be-config\") pod \"dnsmasq-dns-75dd8f6b7-gp78r\" (UID: \"940bab09-bdad-49c2-b9f5-8326b62f98be\") " pod="openstack/dnsmasq-dns-75dd8f6b7-gp78r" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.318662 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/940bab09-bdad-49c2-b9f5-8326b62f98be-config\") pod \"dnsmasq-dns-75dd8f6b7-gp78r\" (UID: \"940bab09-bdad-49c2-b9f5-8326b62f98be\") " pod="openstack/dnsmasq-dns-75dd8f6b7-gp78r" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.318833 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/940bab09-bdad-49c2-b9f5-8326b62f98be-ovsdbserver-sb\") pod \"dnsmasq-dns-75dd8f6b7-gp78r\" (UID: \"940bab09-bdad-49c2-b9f5-8326b62f98be\") " pod="openstack/dnsmasq-dns-75dd8f6b7-gp78r" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.319082 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/940bab09-bdad-49c2-b9f5-8326b62f98be-dns-svc\") pod \"dnsmasq-dns-75dd8f6b7-gp78r\" (UID: \"940bab09-bdad-49c2-b9f5-8326b62f98be\") " pod="openstack/dnsmasq-dns-75dd8f6b7-gp78r" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.319580 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/940bab09-bdad-49c2-b9f5-8326b62f98be-ovsdbserver-nb\") pod \"dnsmasq-dns-75dd8f6b7-gp78r\" (UID: \"940bab09-bdad-49c2-b9f5-8326b62f98be\") " pod="openstack/dnsmasq-dns-75dd8f6b7-gp78r" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.345731 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49tzp\" (UniqueName: \"kubernetes.io/projected/940bab09-bdad-49c2-b9f5-8326b62f98be-kube-api-access-49tzp\") pod \"dnsmasq-dns-75dd8f6b7-gp78r\" (UID: \"940bab09-bdad-49c2-b9f5-8326b62f98be\") " pod="openstack/dnsmasq-dns-75dd8f6b7-gp78r" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.357227 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6ccd96d447-czqpq" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.371825 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-bcfc899fd-2xfv2" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.419832 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b1b64ca-849b-4264-83ca-84af3d522297-config-data\") pod \"barbican-api-84d4644d64-2k8dn\" (UID: \"0b1b64ca-849b-4264-83ca-84af3d522297\") " pod="openstack/barbican-api-84d4644d64-2k8dn" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.419889 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b1b64ca-849b-4264-83ca-84af3d522297-config-data-custom\") pod \"barbican-api-84d4644d64-2k8dn\" (UID: \"0b1b64ca-849b-4264-83ca-84af3d522297\") " pod="openstack/barbican-api-84d4644d64-2k8dn" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.419928 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b1b64ca-849b-4264-83ca-84af3d522297-logs\") pod \"barbican-api-84d4644d64-2k8dn\" (UID: \"0b1b64ca-849b-4264-83ca-84af3d522297\") " pod="openstack/barbican-api-84d4644d64-2k8dn" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.419955 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls87r\" (UniqueName: \"kubernetes.io/projected/0b1b64ca-849b-4264-83ca-84af3d522297-kube-api-access-ls87r\") pod \"barbican-api-84d4644d64-2k8dn\" (UID: \"0b1b64ca-849b-4264-83ca-84af3d522297\") " pod="openstack/barbican-api-84d4644d64-2k8dn" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.420006 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b1b64ca-849b-4264-83ca-84af3d522297-combined-ca-bundle\") pod \"barbican-api-84d4644d64-2k8dn\" (UID: \"0b1b64ca-849b-4264-83ca-84af3d522297\") " 
pod="openstack/barbican-api-84d4644d64-2k8dn" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.420543 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b1b64ca-849b-4264-83ca-84af3d522297-logs\") pod \"barbican-api-84d4644d64-2k8dn\" (UID: \"0b1b64ca-849b-4264-83ca-84af3d522297\") " pod="openstack/barbican-api-84d4644d64-2k8dn" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.424431 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b1b64ca-849b-4264-83ca-84af3d522297-combined-ca-bundle\") pod \"barbican-api-84d4644d64-2k8dn\" (UID: \"0b1b64ca-849b-4264-83ca-84af3d522297\") " pod="openstack/barbican-api-84d4644d64-2k8dn" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.424527 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b1b64ca-849b-4264-83ca-84af3d522297-config-data-custom\") pod \"barbican-api-84d4644d64-2k8dn\" (UID: \"0b1b64ca-849b-4264-83ca-84af3d522297\") " pod="openstack/barbican-api-84d4644d64-2k8dn" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.428756 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b1b64ca-849b-4264-83ca-84af3d522297-config-data\") pod \"barbican-api-84d4644d64-2k8dn\" (UID: \"0b1b64ca-849b-4264-83ca-84af3d522297\") " pod="openstack/barbican-api-84d4644d64-2k8dn" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.440124 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls87r\" (UniqueName: \"kubernetes.io/projected/0b1b64ca-849b-4264-83ca-84af3d522297-kube-api-access-ls87r\") pod \"barbican-api-84d4644d64-2k8dn\" (UID: \"0b1b64ca-849b-4264-83ca-84af3d522297\") " pod="openstack/barbican-api-84d4644d64-2k8dn" Feb 02 12:01:50 crc kubenswrapper[4909]: 
I0202 12:01:50.454585 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75dd8f6b7-gp78r" Feb 02 12:01:50 crc kubenswrapper[4909]: I0202 12:01:50.617481 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-84d4644d64-2k8dn" Feb 02 12:01:51 crc kubenswrapper[4909]: I0202 12:01:51.037879 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-bcfc899fd-2xfv2"] Feb 02 12:01:51 crc kubenswrapper[4909]: I0202 12:01:51.143100 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6ccd96d447-czqpq"] Feb 02 12:01:51 crc kubenswrapper[4909]: W0202 12:01:51.152971 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadddacaf_04be_4f50_9300_76a900e90318.slice/crio-dec8054d1e3530d6359fcbf4fd8ea0afbc3c91cb590962046982a7e60eb35e5a WatchSource:0}: Error finding container dec8054d1e3530d6359fcbf4fd8ea0afbc3c91cb590962046982a7e60eb35e5a: Status 404 returned error can't find the container with id dec8054d1e3530d6359fcbf4fd8ea0afbc3c91cb590962046982a7e60eb35e5a Feb 02 12:01:51 crc kubenswrapper[4909]: I0202 12:01:51.212914 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-84d4644d64-2k8dn"] Feb 02 12:01:51 crc kubenswrapper[4909]: I0202 12:01:51.222152 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75dd8f6b7-gp78r"] Feb 02 12:01:51 crc kubenswrapper[4909]: W0202 12:01:51.227832 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod940bab09_bdad_49c2_b9f5_8326b62f98be.slice/crio-6b5733fde5ef9ae73b58096ce605f0778868391c399c2df0135263f79a668d4f WatchSource:0}: Error finding container 6b5733fde5ef9ae73b58096ce605f0778868391c399c2df0135263f79a668d4f: Status 404 returned error can't find the 
container with id 6b5733fde5ef9ae73b58096ce605f0778868391c399c2df0135263f79a668d4f Feb 02 12:01:51 crc kubenswrapper[4909]: I0202 12:01:51.823567 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6ccd96d447-czqpq" event={"ID":"adddacaf-04be-4f50-9300-76a900e90318","Type":"ContainerStarted","Data":"41cca897ab0094df63b660d007cf0f697aeb28c926e9f13727008525ab946cb4"} Feb 02 12:01:51 crc kubenswrapper[4909]: I0202 12:01:51.824015 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6ccd96d447-czqpq" event={"ID":"adddacaf-04be-4f50-9300-76a900e90318","Type":"ContainerStarted","Data":"6437b12b5fca8783778716132ee2a6548e1cfae13951a08c8bc46e8da278ce77"} Feb 02 12:01:51 crc kubenswrapper[4909]: I0202 12:01:51.824033 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6ccd96d447-czqpq" event={"ID":"adddacaf-04be-4f50-9300-76a900e90318","Type":"ContainerStarted","Data":"dec8054d1e3530d6359fcbf4fd8ea0afbc3c91cb590962046982a7e60eb35e5a"} Feb 02 12:01:51 crc kubenswrapper[4909]: I0202 12:01:51.830645 4909 generic.go:334] "Generic (PLEG): container finished" podID="940bab09-bdad-49c2-b9f5-8326b62f98be" containerID="cb2fc4375997dc65d99afd69b2a4d5a096dd810e6cb139f30ba34862314af19e" exitCode=0 Feb 02 12:01:51 crc kubenswrapper[4909]: I0202 12:01:51.830701 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dd8f6b7-gp78r" event={"ID":"940bab09-bdad-49c2-b9f5-8326b62f98be","Type":"ContainerDied","Data":"cb2fc4375997dc65d99afd69b2a4d5a096dd810e6cb139f30ba34862314af19e"} Feb 02 12:01:51 crc kubenswrapper[4909]: I0202 12:01:51.830725 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dd8f6b7-gp78r" event={"ID":"940bab09-bdad-49c2-b9f5-8326b62f98be","Type":"ContainerStarted","Data":"6b5733fde5ef9ae73b58096ce605f0778868391c399c2df0135263f79a668d4f"} Feb 02 12:01:51 crc kubenswrapper[4909]: I0202 12:01:51.835505 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84d4644d64-2k8dn" event={"ID":"0b1b64ca-849b-4264-83ca-84af3d522297","Type":"ContainerStarted","Data":"db9db20a33794010e17f04d58bc370bbbda50f6ddb6a73417bb56a250ec23e3e"} Feb 02 12:01:51 crc kubenswrapper[4909]: I0202 12:01:51.835552 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84d4644d64-2k8dn" event={"ID":"0b1b64ca-849b-4264-83ca-84af3d522297","Type":"ContainerStarted","Data":"b88068f75c5f86ebd4b4d1631ab4a83d4821774e06246b85ef189d2b1aefbd74"} Feb 02 12:01:51 crc kubenswrapper[4909]: I0202 12:01:51.835563 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84d4644d64-2k8dn" event={"ID":"0b1b64ca-849b-4264-83ca-84af3d522297","Type":"ContainerStarted","Data":"c4e6ebff9e147157833a8458661b0a891e341276ed25ebe77335c8927ff8682a"} Feb 02 12:01:51 crc kubenswrapper[4909]: I0202 12:01:51.836478 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-84d4644d64-2k8dn" Feb 02 12:01:51 crc kubenswrapper[4909]: I0202 12:01:51.836505 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-84d4644d64-2k8dn" Feb 02 12:01:51 crc kubenswrapper[4909]: I0202 12:01:51.845294 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6ccd96d447-czqpq" podStartSLOduration=2.845276206 podStartE2EDuration="2.845276206s" podCreationTimestamp="2026-02-02 12:01:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:01:51.845255585 +0000 UTC m=+5437.591356320" watchObservedRunningTime="2026-02-02 12:01:51.845276206 +0000 UTC m=+5437.591376941" Feb 02 12:01:51 crc kubenswrapper[4909]: I0202 12:01:51.853747 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-bcfc899fd-2xfv2" 
event={"ID":"b584395d-e8d0-4fab-8059-e9cc92550805","Type":"ContainerStarted","Data":"e4ccbf177135c590bf93f0841ca8553fabc81a31171536d4e2f76c91df811ce1"} Feb 02 12:01:51 crc kubenswrapper[4909]: I0202 12:01:51.853795 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-bcfc899fd-2xfv2" event={"ID":"b584395d-e8d0-4fab-8059-e9cc92550805","Type":"ContainerStarted","Data":"e52e0a40770e0f4efdb78873c0435a296e446c066ca4ad20e9e3fa557a303358"} Feb 02 12:01:51 crc kubenswrapper[4909]: I0202 12:01:51.853823 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-bcfc899fd-2xfv2" event={"ID":"b584395d-e8d0-4fab-8059-e9cc92550805","Type":"ContainerStarted","Data":"46d1b5d778d344598bed9f142501dd39550c4d2d99687ef6aed5b726b36163d1"} Feb 02 12:01:51 crc kubenswrapper[4909]: I0202 12:01:51.882761 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-84d4644d64-2k8dn" podStartSLOduration=1.882737629 podStartE2EDuration="1.882737629s" podCreationTimestamp="2026-02-02 12:01:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:01:51.874250003 +0000 UTC m=+5437.620350738" watchObservedRunningTime="2026-02-02 12:01:51.882737629 +0000 UTC m=+5437.628838374" Feb 02 12:01:51 crc kubenswrapper[4909]: I0202 12:01:51.940367 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-bcfc899fd-2xfv2" podStartSLOduration=2.940341533 podStartE2EDuration="2.940341533s" podCreationTimestamp="2026-02-02 12:01:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:01:51.930247532 +0000 UTC m=+5437.676348267" watchObservedRunningTime="2026-02-02 12:01:51.940341533 +0000 UTC m=+5437.686442268" Feb 02 12:01:52 crc 
kubenswrapper[4909]: I0202 12:01:52.399701 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-75f58b9976-z4nmm"] Feb 02 12:01:52 crc kubenswrapper[4909]: I0202 12:01:52.402683 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-75f58b9976-z4nmm" Feb 02 12:01:52 crc kubenswrapper[4909]: I0202 12:01:52.405498 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 02 12:01:52 crc kubenswrapper[4909]: I0202 12:01:52.405781 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 02 12:01:52 crc kubenswrapper[4909]: I0202 12:01:52.413245 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-75f58b9976-z4nmm"] Feb 02 12:01:52 crc kubenswrapper[4909]: I0202 12:01:52.463557 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b-logs\") pod \"barbican-api-75f58b9976-z4nmm\" (UID: \"c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b\") " pod="openstack/barbican-api-75f58b9976-z4nmm" Feb 02 12:01:52 crc kubenswrapper[4909]: I0202 12:01:52.463615 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b-combined-ca-bundle\") pod \"barbican-api-75f58b9976-z4nmm\" (UID: \"c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b\") " pod="openstack/barbican-api-75f58b9976-z4nmm" Feb 02 12:01:52 crc kubenswrapper[4909]: I0202 12:01:52.463648 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b-public-tls-certs\") pod \"barbican-api-75f58b9976-z4nmm\" (UID: \"c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b\") 
" pod="openstack/barbican-api-75f58b9976-z4nmm" Feb 02 12:01:52 crc kubenswrapper[4909]: I0202 12:01:52.463682 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b-config-data-custom\") pod \"barbican-api-75f58b9976-z4nmm\" (UID: \"c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b\") " pod="openstack/barbican-api-75f58b9976-z4nmm" Feb 02 12:01:52 crc kubenswrapper[4909]: I0202 12:01:52.463719 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b-internal-tls-certs\") pod \"barbican-api-75f58b9976-z4nmm\" (UID: \"c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b\") " pod="openstack/barbican-api-75f58b9976-z4nmm" Feb 02 12:01:52 crc kubenswrapper[4909]: I0202 12:01:52.463744 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jl9b\" (UniqueName: \"kubernetes.io/projected/c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b-kube-api-access-9jl9b\") pod \"barbican-api-75f58b9976-z4nmm\" (UID: \"c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b\") " pod="openstack/barbican-api-75f58b9976-z4nmm" Feb 02 12:01:52 crc kubenswrapper[4909]: I0202 12:01:52.463772 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b-config-data\") pod \"barbican-api-75f58b9976-z4nmm\" (UID: \"c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b\") " pod="openstack/barbican-api-75f58b9976-z4nmm" Feb 02 12:01:52 crc kubenswrapper[4909]: I0202 12:01:52.565712 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b-logs\") pod \"barbican-api-75f58b9976-z4nmm\" (UID: 
\"c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b\") " pod="openstack/barbican-api-75f58b9976-z4nmm" Feb 02 12:01:52 crc kubenswrapper[4909]: I0202 12:01:52.565768 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b-combined-ca-bundle\") pod \"barbican-api-75f58b9976-z4nmm\" (UID: \"c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b\") " pod="openstack/barbican-api-75f58b9976-z4nmm" Feb 02 12:01:52 crc kubenswrapper[4909]: I0202 12:01:52.565815 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b-public-tls-certs\") pod \"barbican-api-75f58b9976-z4nmm\" (UID: \"c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b\") " pod="openstack/barbican-api-75f58b9976-z4nmm" Feb 02 12:01:52 crc kubenswrapper[4909]: I0202 12:01:52.565850 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b-config-data-custom\") pod \"barbican-api-75f58b9976-z4nmm\" (UID: \"c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b\") " pod="openstack/barbican-api-75f58b9976-z4nmm" Feb 02 12:01:52 crc kubenswrapper[4909]: I0202 12:01:52.565890 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b-internal-tls-certs\") pod \"barbican-api-75f58b9976-z4nmm\" (UID: \"c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b\") " pod="openstack/barbican-api-75f58b9976-z4nmm" Feb 02 12:01:52 crc kubenswrapper[4909]: I0202 12:01:52.565912 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jl9b\" (UniqueName: \"kubernetes.io/projected/c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b-kube-api-access-9jl9b\") pod \"barbican-api-75f58b9976-z4nmm\" (UID: 
\"c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b\") " pod="openstack/barbican-api-75f58b9976-z4nmm" Feb 02 12:01:52 crc kubenswrapper[4909]: I0202 12:01:52.565939 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b-config-data\") pod \"barbican-api-75f58b9976-z4nmm\" (UID: \"c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b\") " pod="openstack/barbican-api-75f58b9976-z4nmm" Feb 02 12:01:52 crc kubenswrapper[4909]: I0202 12:01:52.566677 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b-logs\") pod \"barbican-api-75f58b9976-z4nmm\" (UID: \"c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b\") " pod="openstack/barbican-api-75f58b9976-z4nmm" Feb 02 12:01:52 crc kubenswrapper[4909]: I0202 12:01:52.571236 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b-config-data\") pod \"barbican-api-75f58b9976-z4nmm\" (UID: \"c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b\") " pod="openstack/barbican-api-75f58b9976-z4nmm" Feb 02 12:01:52 crc kubenswrapper[4909]: I0202 12:01:52.571332 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b-internal-tls-certs\") pod \"barbican-api-75f58b9976-z4nmm\" (UID: \"c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b\") " pod="openstack/barbican-api-75f58b9976-z4nmm" Feb 02 12:01:52 crc kubenswrapper[4909]: I0202 12:01:52.573460 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b-combined-ca-bundle\") pod \"barbican-api-75f58b9976-z4nmm\" (UID: \"c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b\") " pod="openstack/barbican-api-75f58b9976-z4nmm" Feb 02 
12:01:52 crc kubenswrapper[4909]: I0202 12:01:52.574705 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b-public-tls-certs\") pod \"barbican-api-75f58b9976-z4nmm\" (UID: \"c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b\") " pod="openstack/barbican-api-75f58b9976-z4nmm" Feb 02 12:01:52 crc kubenswrapper[4909]: I0202 12:01:52.575351 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b-config-data-custom\") pod \"barbican-api-75f58b9976-z4nmm\" (UID: \"c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b\") " pod="openstack/barbican-api-75f58b9976-z4nmm" Feb 02 12:01:52 crc kubenswrapper[4909]: I0202 12:01:52.587058 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jl9b\" (UniqueName: \"kubernetes.io/projected/c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b-kube-api-access-9jl9b\") pod \"barbican-api-75f58b9976-z4nmm\" (UID: \"c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b\") " pod="openstack/barbican-api-75f58b9976-z4nmm" Feb 02 12:01:52 crc kubenswrapper[4909]: I0202 12:01:52.784183 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-75f58b9976-z4nmm" Feb 02 12:01:52 crc kubenswrapper[4909]: I0202 12:01:52.869698 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dd8f6b7-gp78r" event={"ID":"940bab09-bdad-49c2-b9f5-8326b62f98be","Type":"ContainerStarted","Data":"2b67f6083332fd3a47409f2b2dab8b776cdf7cbc40c13f860eaf8930bec1051a"} Feb 02 12:01:52 crc kubenswrapper[4909]: I0202 12:01:52.871720 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75dd8f6b7-gp78r" Feb 02 12:01:52 crc kubenswrapper[4909]: I0202 12:01:52.896672 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75dd8f6b7-gp78r" podStartSLOduration=2.896647389 podStartE2EDuration="2.896647389s" podCreationTimestamp="2026-02-02 12:01:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:01:52.891580518 +0000 UTC m=+5438.637681273" watchObservedRunningTime="2026-02-02 12:01:52.896647389 +0000 UTC m=+5438.642748134" Feb 02 12:01:53 crc kubenswrapper[4909]: I0202 12:01:53.227308 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-75f58b9976-z4nmm"] Feb 02 12:01:53 crc kubenswrapper[4909]: W0202 12:01:53.232056 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4059ec4_44bb_46f0_b9ba_d2006cbf4b3b.slice/crio-402d73c0b51fcce08bcd44dab032575cc2075a854100fae7e0d92be8b8de03ef WatchSource:0}: Error finding container 402d73c0b51fcce08bcd44dab032575cc2075a854100fae7e0d92be8b8de03ef: Status 404 returned error can't find the container with id 402d73c0b51fcce08bcd44dab032575cc2075a854100fae7e0d92be8b8de03ef Feb 02 12:01:53 crc kubenswrapper[4909]: I0202 12:01:53.879650 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75f58b9976-z4nmm" 
event={"ID":"c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b","Type":"ContainerStarted","Data":"a1dbce36ebecf00288d6f8bae582af6ef83fe815fb88ad815c6c155fec418920"} Feb 02 12:01:53 crc kubenswrapper[4909]: I0202 12:01:53.880083 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75f58b9976-z4nmm" event={"ID":"c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b","Type":"ContainerStarted","Data":"fafcfc9b2e46881ccec6f29fd61cf4a664c5611591b5d4a145143424b7c6a8dd"} Feb 02 12:01:53 crc kubenswrapper[4909]: I0202 12:01:53.880101 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75f58b9976-z4nmm" event={"ID":"c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b","Type":"ContainerStarted","Data":"402d73c0b51fcce08bcd44dab032575cc2075a854100fae7e0d92be8b8de03ef"} Feb 02 12:01:53 crc kubenswrapper[4909]: I0202 12:01:53.914199 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-75f58b9976-z4nmm" podStartSLOduration=1.914181841 podStartE2EDuration="1.914181841s" podCreationTimestamp="2026-02-02 12:01:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:01:53.907058162 +0000 UTC m=+5439.653158897" watchObservedRunningTime="2026-02-02 12:01:53.914181841 +0000 UTC m=+5439.660282576" Feb 02 12:01:54 crc kubenswrapper[4909]: I0202 12:01:54.886092 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-75f58b9976-z4nmm" Feb 02 12:01:54 crc kubenswrapper[4909]: I0202 12:01:54.886430 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-75f58b9976-z4nmm" Feb 02 12:02:00 crc kubenswrapper[4909]: I0202 12:02:00.455992 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75dd8f6b7-gp78r" Feb 02 12:02:00 crc kubenswrapper[4909]: I0202 12:02:00.511897 4909 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-65fc77cdfc-77wl4"] Feb 02 12:02:00 crc kubenswrapper[4909]: I0202 12:02:00.512139 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-65fc77cdfc-77wl4" podUID="9155aa19-b965-4d0c-900d-053448dbe0af" containerName="dnsmasq-dns" containerID="cri-o://e50a5af3f8c4c38ccdf23f9f356c02ab53d37a0ec4decd79846bb16017738094" gracePeriod=10 Feb 02 12:02:00 crc kubenswrapper[4909]: I0202 12:02:00.951423 4909 generic.go:334] "Generic (PLEG): container finished" podID="9155aa19-b965-4d0c-900d-053448dbe0af" containerID="e50a5af3f8c4c38ccdf23f9f356c02ab53d37a0ec4decd79846bb16017738094" exitCode=0 Feb 02 12:02:00 crc kubenswrapper[4909]: I0202 12:02:00.951888 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65fc77cdfc-77wl4" event={"ID":"9155aa19-b965-4d0c-900d-053448dbe0af","Type":"ContainerDied","Data":"e50a5af3f8c4c38ccdf23f9f356c02ab53d37a0ec4decd79846bb16017738094"} Feb 02 12:02:01 crc kubenswrapper[4909]: I0202 12:02:01.144025 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65fc77cdfc-77wl4" Feb 02 12:02:01 crc kubenswrapper[4909]: I0202 12:02:01.330825 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9155aa19-b965-4d0c-900d-053448dbe0af-dns-svc\") pod \"9155aa19-b965-4d0c-900d-053448dbe0af\" (UID: \"9155aa19-b965-4d0c-900d-053448dbe0af\") " Feb 02 12:02:01 crc kubenswrapper[4909]: I0202 12:02:01.331457 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9155aa19-b965-4d0c-900d-053448dbe0af-ovsdbserver-nb\") pod \"9155aa19-b965-4d0c-900d-053448dbe0af\" (UID: \"9155aa19-b965-4d0c-900d-053448dbe0af\") " Feb 02 12:02:01 crc kubenswrapper[4909]: I0202 12:02:01.331709 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9155aa19-b965-4d0c-900d-053448dbe0af-ovsdbserver-sb\") pod \"9155aa19-b965-4d0c-900d-053448dbe0af\" (UID: \"9155aa19-b965-4d0c-900d-053448dbe0af\") " Feb 02 12:02:01 crc kubenswrapper[4909]: I0202 12:02:01.331989 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7g2nb\" (UniqueName: \"kubernetes.io/projected/9155aa19-b965-4d0c-900d-053448dbe0af-kube-api-access-7g2nb\") pod \"9155aa19-b965-4d0c-900d-053448dbe0af\" (UID: \"9155aa19-b965-4d0c-900d-053448dbe0af\") " Feb 02 12:02:01 crc kubenswrapper[4909]: I0202 12:02:01.332167 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9155aa19-b965-4d0c-900d-053448dbe0af-config\") pod \"9155aa19-b965-4d0c-900d-053448dbe0af\" (UID: \"9155aa19-b965-4d0c-900d-053448dbe0af\") " Feb 02 12:02:01 crc kubenswrapper[4909]: I0202 12:02:01.351099 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/9155aa19-b965-4d0c-900d-053448dbe0af-kube-api-access-7g2nb" (OuterVolumeSpecName: "kube-api-access-7g2nb") pod "9155aa19-b965-4d0c-900d-053448dbe0af" (UID: "9155aa19-b965-4d0c-900d-053448dbe0af"). InnerVolumeSpecName "kube-api-access-7g2nb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:02:01 crc kubenswrapper[4909]: I0202 12:02:01.377159 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9155aa19-b965-4d0c-900d-053448dbe0af-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9155aa19-b965-4d0c-900d-053448dbe0af" (UID: "9155aa19-b965-4d0c-900d-053448dbe0af"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:02:01 crc kubenswrapper[4909]: I0202 12:02:01.379515 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9155aa19-b965-4d0c-900d-053448dbe0af-config" (OuterVolumeSpecName: "config") pod "9155aa19-b965-4d0c-900d-053448dbe0af" (UID: "9155aa19-b965-4d0c-900d-053448dbe0af"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:02:01 crc kubenswrapper[4909]: I0202 12:02:01.379800 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9155aa19-b965-4d0c-900d-053448dbe0af-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9155aa19-b965-4d0c-900d-053448dbe0af" (UID: "9155aa19-b965-4d0c-900d-053448dbe0af"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:02:01 crc kubenswrapper[4909]: I0202 12:02:01.385649 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9155aa19-b965-4d0c-900d-053448dbe0af-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9155aa19-b965-4d0c-900d-053448dbe0af" (UID: "9155aa19-b965-4d0c-900d-053448dbe0af"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:02:01 crc kubenswrapper[4909]: I0202 12:02:01.438261 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9155aa19-b965-4d0c-900d-053448dbe0af-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 12:02:01 crc kubenswrapper[4909]: I0202 12:02:01.438316 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9155aa19-b965-4d0c-900d-053448dbe0af-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 12:02:01 crc kubenswrapper[4909]: I0202 12:02:01.438330 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7g2nb\" (UniqueName: \"kubernetes.io/projected/9155aa19-b965-4d0c-900d-053448dbe0af-kube-api-access-7g2nb\") on node \"crc\" DevicePath \"\"" Feb 02 12:02:01 crc kubenswrapper[4909]: I0202 12:02:01.438341 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9155aa19-b965-4d0c-900d-053448dbe0af-config\") on node \"crc\" DevicePath \"\"" Feb 02 12:02:01 crc kubenswrapper[4909]: I0202 12:02:01.438354 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9155aa19-b965-4d0c-900d-053448dbe0af-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 12:02:01 crc kubenswrapper[4909]: I0202 12:02:01.963533 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65fc77cdfc-77wl4" event={"ID":"9155aa19-b965-4d0c-900d-053448dbe0af","Type":"ContainerDied","Data":"cdf3001c5f2c70a9ded00d1aca6ec61a01bb3f335733170f6a592128ede2ed8e"} Feb 02 12:02:01 crc kubenswrapper[4909]: I0202 12:02:01.964026 4909 scope.go:117] "RemoveContainer" containerID="e50a5af3f8c4c38ccdf23f9f356c02ab53d37a0ec4decd79846bb16017738094" Feb 02 12:02:01 crc kubenswrapper[4909]: I0202 12:02:01.964249 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65fc77cdfc-77wl4" Feb 02 12:02:01 crc kubenswrapper[4909]: I0202 12:02:01.995825 4909 scope.go:117] "RemoveContainer" containerID="177d496d3499a8e2e3d0378841b96fb5a1a89415da4ee397d1d686e910cf3f89" Feb 02 12:02:02 crc kubenswrapper[4909]: I0202 12:02:02.011084 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65fc77cdfc-77wl4"] Feb 02 12:02:02 crc kubenswrapper[4909]: I0202 12:02:02.019970 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65fc77cdfc-77wl4"] Feb 02 12:02:02 crc kubenswrapper[4909]: I0202 12:02:02.358436 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84d4644d64-2k8dn" Feb 02 12:02:02 crc kubenswrapper[4909]: I0202 12:02:02.612973 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84d4644d64-2k8dn" Feb 02 12:02:03 crc kubenswrapper[4909]: I0202 12:02:03.027212 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9155aa19-b965-4d0c-900d-053448dbe0af" path="/var/lib/kubelet/pods/9155aa19-b965-4d0c-900d-053448dbe0af/volumes" Feb 02 12:02:04 crc kubenswrapper[4909]: I0202 12:02:04.222167 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-75f58b9976-z4nmm" Feb 02 12:02:04 crc kubenswrapper[4909]: I0202 12:02:04.269120 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-75f58b9976-z4nmm" Feb 02 12:02:04 crc kubenswrapper[4909]: I0202 12:02:04.317232 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-84d4644d64-2k8dn"] Feb 02 12:02:04 crc kubenswrapper[4909]: I0202 12:02:04.317701 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-84d4644d64-2k8dn" podUID="0b1b64ca-849b-4264-83ca-84af3d522297" containerName="barbican-api-log" 
containerID="cri-o://b88068f75c5f86ebd4b4d1631ab4a83d4821774e06246b85ef189d2b1aefbd74" gracePeriod=30 Feb 02 12:02:04 crc kubenswrapper[4909]: I0202 12:02:04.318165 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-84d4644d64-2k8dn" podUID="0b1b64ca-849b-4264-83ca-84af3d522297" containerName="barbican-api" containerID="cri-o://db9db20a33794010e17f04d58bc370bbbda50f6ddb6a73417bb56a250ec23e3e" gracePeriod=30 Feb 02 12:02:05 crc kubenswrapper[4909]: I0202 12:02:05.018904 4909 generic.go:334] "Generic (PLEG): container finished" podID="0b1b64ca-849b-4264-83ca-84af3d522297" containerID="b88068f75c5f86ebd4b4d1631ab4a83d4821774e06246b85ef189d2b1aefbd74" exitCode=143 Feb 02 12:02:05 crc kubenswrapper[4909]: I0202 12:02:05.027512 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84d4644d64-2k8dn" event={"ID":"0b1b64ca-849b-4264-83ca-84af3d522297","Type":"ContainerDied","Data":"b88068f75c5f86ebd4b4d1631ab4a83d4821774e06246b85ef189d2b1aefbd74"} Feb 02 12:02:06 crc kubenswrapper[4909]: I0202 12:02:06.110289 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-65fc77cdfc-77wl4" podUID="9155aa19-b965-4d0c-900d-053448dbe0af" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.17:5353: i/o timeout" Feb 02 12:02:07 crc kubenswrapper[4909]: I0202 12:02:07.470255 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-84d4644d64-2k8dn" podUID="0b1b64ca-849b-4264-83ca-84af3d522297" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.1.30:9311/healthcheck\": read tcp 10.217.0.2:53540->10.217.1.30:9311: read: connection reset by peer" Feb 02 12:02:07 crc kubenswrapper[4909]: I0202 12:02:07.470288 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-84d4644d64-2k8dn" podUID="0b1b64ca-849b-4264-83ca-84af3d522297" containerName="barbican-api-log" 
probeResult="failure" output="Get \"http://10.217.1.30:9311/healthcheck\": read tcp 10.217.0.2:53538->10.217.1.30:9311: read: connection reset by peer" Feb 02 12:02:07 crc kubenswrapper[4909]: I0202 12:02:07.838396 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-84d4644d64-2k8dn" Feb 02 12:02:07 crc kubenswrapper[4909]: I0202 12:02:07.973258 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b1b64ca-849b-4264-83ca-84af3d522297-logs\") pod \"0b1b64ca-849b-4264-83ca-84af3d522297\" (UID: \"0b1b64ca-849b-4264-83ca-84af3d522297\") " Feb 02 12:02:07 crc kubenswrapper[4909]: I0202 12:02:07.973625 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b1b64ca-849b-4264-83ca-84af3d522297-config-data-custom\") pod \"0b1b64ca-849b-4264-83ca-84af3d522297\" (UID: \"0b1b64ca-849b-4264-83ca-84af3d522297\") " Feb 02 12:02:07 crc kubenswrapper[4909]: I0202 12:02:07.973719 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b1b64ca-849b-4264-83ca-84af3d522297-combined-ca-bundle\") pod \"0b1b64ca-849b-4264-83ca-84af3d522297\" (UID: \"0b1b64ca-849b-4264-83ca-84af3d522297\") " Feb 02 12:02:07 crc kubenswrapper[4909]: I0202 12:02:07.973727 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b1b64ca-849b-4264-83ca-84af3d522297-logs" (OuterVolumeSpecName: "logs") pod "0b1b64ca-849b-4264-83ca-84af3d522297" (UID: "0b1b64ca-849b-4264-83ca-84af3d522297"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:02:07 crc kubenswrapper[4909]: I0202 12:02:07.973826 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls87r\" (UniqueName: \"kubernetes.io/projected/0b1b64ca-849b-4264-83ca-84af3d522297-kube-api-access-ls87r\") pod \"0b1b64ca-849b-4264-83ca-84af3d522297\" (UID: \"0b1b64ca-849b-4264-83ca-84af3d522297\") " Feb 02 12:02:07 crc kubenswrapper[4909]: I0202 12:02:07.973937 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b1b64ca-849b-4264-83ca-84af3d522297-config-data\") pod \"0b1b64ca-849b-4264-83ca-84af3d522297\" (UID: \"0b1b64ca-849b-4264-83ca-84af3d522297\") " Feb 02 12:02:07 crc kubenswrapper[4909]: I0202 12:02:07.974278 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b1b64ca-849b-4264-83ca-84af3d522297-logs\") on node \"crc\" DevicePath \"\"" Feb 02 12:02:07 crc kubenswrapper[4909]: I0202 12:02:07.979265 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b1b64ca-849b-4264-83ca-84af3d522297-kube-api-access-ls87r" (OuterVolumeSpecName: "kube-api-access-ls87r") pod "0b1b64ca-849b-4264-83ca-84af3d522297" (UID: "0b1b64ca-849b-4264-83ca-84af3d522297"). InnerVolumeSpecName "kube-api-access-ls87r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:02:07 crc kubenswrapper[4909]: I0202 12:02:07.982991 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b1b64ca-849b-4264-83ca-84af3d522297-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0b1b64ca-849b-4264-83ca-84af3d522297" (UID: "0b1b64ca-849b-4264-83ca-84af3d522297"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:02:07 crc kubenswrapper[4909]: I0202 12:02:07.997848 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b1b64ca-849b-4264-83ca-84af3d522297-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b1b64ca-849b-4264-83ca-84af3d522297" (UID: "0b1b64ca-849b-4264-83ca-84af3d522297"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:02:08 crc kubenswrapper[4909]: I0202 12:02:08.013674 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b1b64ca-849b-4264-83ca-84af3d522297-config-data" (OuterVolumeSpecName: "config-data") pod "0b1b64ca-849b-4264-83ca-84af3d522297" (UID: "0b1b64ca-849b-4264-83ca-84af3d522297"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:02:08 crc kubenswrapper[4909]: I0202 12:02:08.060389 4909 generic.go:334] "Generic (PLEG): container finished" podID="0b1b64ca-849b-4264-83ca-84af3d522297" containerID="db9db20a33794010e17f04d58bc370bbbda50f6ddb6a73417bb56a250ec23e3e" exitCode=0 Feb 02 12:02:08 crc kubenswrapper[4909]: I0202 12:02:08.060441 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84d4644d64-2k8dn" event={"ID":"0b1b64ca-849b-4264-83ca-84af3d522297","Type":"ContainerDied","Data":"db9db20a33794010e17f04d58bc370bbbda50f6ddb6a73417bb56a250ec23e3e"} Feb 02 12:02:08 crc kubenswrapper[4909]: I0202 12:02:08.060505 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-84d4644d64-2k8dn" Feb 02 12:02:08 crc kubenswrapper[4909]: I0202 12:02:08.060535 4909 scope.go:117] "RemoveContainer" containerID="db9db20a33794010e17f04d58bc370bbbda50f6ddb6a73417bb56a250ec23e3e" Feb 02 12:02:08 crc kubenswrapper[4909]: I0202 12:02:08.060518 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84d4644d64-2k8dn" event={"ID":"0b1b64ca-849b-4264-83ca-84af3d522297","Type":"ContainerDied","Data":"c4e6ebff9e147157833a8458661b0a891e341276ed25ebe77335c8927ff8682a"} Feb 02 12:02:08 crc kubenswrapper[4909]: I0202 12:02:08.075795 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls87r\" (UniqueName: \"kubernetes.io/projected/0b1b64ca-849b-4264-83ca-84af3d522297-kube-api-access-ls87r\") on node \"crc\" DevicePath \"\"" Feb 02 12:02:08 crc kubenswrapper[4909]: I0202 12:02:08.075843 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b1b64ca-849b-4264-83ca-84af3d522297-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:02:08 crc kubenswrapper[4909]: I0202 12:02:08.075854 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0b1b64ca-849b-4264-83ca-84af3d522297-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 12:02:08 crc kubenswrapper[4909]: I0202 12:02:08.075862 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b1b64ca-849b-4264-83ca-84af3d522297-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:02:08 crc kubenswrapper[4909]: I0202 12:02:08.097171 4909 scope.go:117] "RemoveContainer" containerID="b88068f75c5f86ebd4b4d1631ab4a83d4821774e06246b85ef189d2b1aefbd74" Feb 02 12:02:08 crc kubenswrapper[4909]: I0202 12:02:08.102377 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-84d4644d64-2k8dn"] Feb 
02 12:02:08 crc kubenswrapper[4909]: I0202 12:02:08.113208 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-84d4644d64-2k8dn"] Feb 02 12:02:08 crc kubenswrapper[4909]: I0202 12:02:08.114115 4909 scope.go:117] "RemoveContainer" containerID="db9db20a33794010e17f04d58bc370bbbda50f6ddb6a73417bb56a250ec23e3e" Feb 02 12:02:08 crc kubenswrapper[4909]: E0202 12:02:08.114562 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db9db20a33794010e17f04d58bc370bbbda50f6ddb6a73417bb56a250ec23e3e\": container with ID starting with db9db20a33794010e17f04d58bc370bbbda50f6ddb6a73417bb56a250ec23e3e not found: ID does not exist" containerID="db9db20a33794010e17f04d58bc370bbbda50f6ddb6a73417bb56a250ec23e3e" Feb 02 12:02:08 crc kubenswrapper[4909]: I0202 12:02:08.114604 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db9db20a33794010e17f04d58bc370bbbda50f6ddb6a73417bb56a250ec23e3e"} err="failed to get container status \"db9db20a33794010e17f04d58bc370bbbda50f6ddb6a73417bb56a250ec23e3e\": rpc error: code = NotFound desc = could not find container \"db9db20a33794010e17f04d58bc370bbbda50f6ddb6a73417bb56a250ec23e3e\": container with ID starting with db9db20a33794010e17f04d58bc370bbbda50f6ddb6a73417bb56a250ec23e3e not found: ID does not exist" Feb 02 12:02:08 crc kubenswrapper[4909]: I0202 12:02:08.114634 4909 scope.go:117] "RemoveContainer" containerID="b88068f75c5f86ebd4b4d1631ab4a83d4821774e06246b85ef189d2b1aefbd74" Feb 02 12:02:08 crc kubenswrapper[4909]: E0202 12:02:08.114897 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b88068f75c5f86ebd4b4d1631ab4a83d4821774e06246b85ef189d2b1aefbd74\": container with ID starting with b88068f75c5f86ebd4b4d1631ab4a83d4821774e06246b85ef189d2b1aefbd74 not found: ID does not exist" 
containerID="b88068f75c5f86ebd4b4d1631ab4a83d4821774e06246b85ef189d2b1aefbd74" Feb 02 12:02:08 crc kubenswrapper[4909]: I0202 12:02:08.114927 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b88068f75c5f86ebd4b4d1631ab4a83d4821774e06246b85ef189d2b1aefbd74"} err="failed to get container status \"b88068f75c5f86ebd4b4d1631ab4a83d4821774e06246b85ef189d2b1aefbd74\": rpc error: code = NotFound desc = could not find container \"b88068f75c5f86ebd4b4d1631ab4a83d4821774e06246b85ef189d2b1aefbd74\": container with ID starting with b88068f75c5f86ebd4b4d1631ab4a83d4821774e06246b85ef189d2b1aefbd74 not found: ID does not exist" Feb 02 12:02:09 crc kubenswrapper[4909]: I0202 12:02:09.026033 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b1b64ca-849b-4264-83ca-84af3d522297" path="/var/lib/kubelet/pods/0b1b64ca-849b-4264-83ca-84af3d522297/volumes" Feb 02 12:02:19 crc kubenswrapper[4909]: I0202 12:02:19.511027 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:02:19 crc kubenswrapper[4909]: I0202 12:02:19.511555 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:02:26 crc kubenswrapper[4909]: I0202 12:02:26.258882 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-wv6jm"] Feb 02 12:02:26 crc kubenswrapper[4909]: E0202 12:02:26.259867 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9155aa19-b965-4d0c-900d-053448dbe0af" 
containerName="dnsmasq-dns" Feb 02 12:02:26 crc kubenswrapper[4909]: I0202 12:02:26.259884 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9155aa19-b965-4d0c-900d-053448dbe0af" containerName="dnsmasq-dns" Feb 02 12:02:26 crc kubenswrapper[4909]: E0202 12:02:26.259900 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b1b64ca-849b-4264-83ca-84af3d522297" containerName="barbican-api" Feb 02 12:02:26 crc kubenswrapper[4909]: I0202 12:02:26.259908 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b1b64ca-849b-4264-83ca-84af3d522297" containerName="barbican-api" Feb 02 12:02:26 crc kubenswrapper[4909]: E0202 12:02:26.259920 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b1b64ca-849b-4264-83ca-84af3d522297" containerName="barbican-api-log" Feb 02 12:02:26 crc kubenswrapper[4909]: I0202 12:02:26.259928 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b1b64ca-849b-4264-83ca-84af3d522297" containerName="barbican-api-log" Feb 02 12:02:26 crc kubenswrapper[4909]: E0202 12:02:26.259960 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9155aa19-b965-4d0c-900d-053448dbe0af" containerName="init" Feb 02 12:02:26 crc kubenswrapper[4909]: I0202 12:02:26.259968 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9155aa19-b965-4d0c-900d-053448dbe0af" containerName="init" Feb 02 12:02:26 crc kubenswrapper[4909]: I0202 12:02:26.260151 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b1b64ca-849b-4264-83ca-84af3d522297" containerName="barbican-api-log" Feb 02 12:02:26 crc kubenswrapper[4909]: I0202 12:02:26.260180 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9155aa19-b965-4d0c-900d-053448dbe0af" containerName="dnsmasq-dns" Feb 02 12:02:26 crc kubenswrapper[4909]: I0202 12:02:26.260193 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b1b64ca-849b-4264-83ca-84af3d522297" containerName="barbican-api" Feb 02 12:02:26 
crc kubenswrapper[4909]: I0202 12:02:26.260857 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wv6jm" Feb 02 12:02:26 crc kubenswrapper[4909]: I0202 12:02:26.268612 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-ff19-account-create-update-r4s5q"] Feb 02 12:02:26 crc kubenswrapper[4909]: I0202 12:02:26.269955 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ff19-account-create-update-r4s5q" Feb 02 12:02:26 crc kubenswrapper[4909]: I0202 12:02:26.272228 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 02 12:02:26 crc kubenswrapper[4909]: I0202 12:02:26.276195 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-wv6jm"] Feb 02 12:02:26 crc kubenswrapper[4909]: I0202 12:02:26.288715 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ff19-account-create-update-r4s5q"] Feb 02 12:02:26 crc kubenswrapper[4909]: I0202 12:02:26.300105 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsvxd\" (UniqueName: \"kubernetes.io/projected/749a32f4-4c60-4f2c-bc33-eb38e8eaddd6-kube-api-access-zsvxd\") pod \"neutron-db-create-wv6jm\" (UID: \"749a32f4-4c60-4f2c-bc33-eb38e8eaddd6\") " pod="openstack/neutron-db-create-wv6jm" Feb 02 12:02:26 crc kubenswrapper[4909]: I0202 12:02:26.300149 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c842981-834f-414f-b66e-08d8f20a4b06-operator-scripts\") pod \"neutron-ff19-account-create-update-r4s5q\" (UID: \"4c842981-834f-414f-b66e-08d8f20a4b06\") " pod="openstack/neutron-ff19-account-create-update-r4s5q" Feb 02 12:02:26 crc kubenswrapper[4909]: I0202 12:02:26.300193 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmjqx\" (UniqueName: \"kubernetes.io/projected/4c842981-834f-414f-b66e-08d8f20a4b06-kube-api-access-cmjqx\") pod \"neutron-ff19-account-create-update-r4s5q\" (UID: \"4c842981-834f-414f-b66e-08d8f20a4b06\") " pod="openstack/neutron-ff19-account-create-update-r4s5q" Feb 02 12:02:26 crc kubenswrapper[4909]: I0202 12:02:26.300248 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/749a32f4-4c60-4f2c-bc33-eb38e8eaddd6-operator-scripts\") pod \"neutron-db-create-wv6jm\" (UID: \"749a32f4-4c60-4f2c-bc33-eb38e8eaddd6\") " pod="openstack/neutron-db-create-wv6jm" Feb 02 12:02:26 crc kubenswrapper[4909]: I0202 12:02:26.401445 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsvxd\" (UniqueName: \"kubernetes.io/projected/749a32f4-4c60-4f2c-bc33-eb38e8eaddd6-kube-api-access-zsvxd\") pod \"neutron-db-create-wv6jm\" (UID: \"749a32f4-4c60-4f2c-bc33-eb38e8eaddd6\") " pod="openstack/neutron-db-create-wv6jm" Feb 02 12:02:26 crc kubenswrapper[4909]: I0202 12:02:26.401487 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c842981-834f-414f-b66e-08d8f20a4b06-operator-scripts\") pod \"neutron-ff19-account-create-update-r4s5q\" (UID: \"4c842981-834f-414f-b66e-08d8f20a4b06\") " pod="openstack/neutron-ff19-account-create-update-r4s5q" Feb 02 12:02:26 crc kubenswrapper[4909]: I0202 12:02:26.401544 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmjqx\" (UniqueName: \"kubernetes.io/projected/4c842981-834f-414f-b66e-08d8f20a4b06-kube-api-access-cmjqx\") pod \"neutron-ff19-account-create-update-r4s5q\" (UID: \"4c842981-834f-414f-b66e-08d8f20a4b06\") " pod="openstack/neutron-ff19-account-create-update-r4s5q" Feb 02 
12:02:26 crc kubenswrapper[4909]: I0202 12:02:26.401589 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/749a32f4-4c60-4f2c-bc33-eb38e8eaddd6-operator-scripts\") pod \"neutron-db-create-wv6jm\" (UID: \"749a32f4-4c60-4f2c-bc33-eb38e8eaddd6\") " pod="openstack/neutron-db-create-wv6jm" Feb 02 12:02:26 crc kubenswrapper[4909]: I0202 12:02:26.402378 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/749a32f4-4c60-4f2c-bc33-eb38e8eaddd6-operator-scripts\") pod \"neutron-db-create-wv6jm\" (UID: \"749a32f4-4c60-4f2c-bc33-eb38e8eaddd6\") " pod="openstack/neutron-db-create-wv6jm" Feb 02 12:02:26 crc kubenswrapper[4909]: I0202 12:02:26.402386 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c842981-834f-414f-b66e-08d8f20a4b06-operator-scripts\") pod \"neutron-ff19-account-create-update-r4s5q\" (UID: \"4c842981-834f-414f-b66e-08d8f20a4b06\") " pod="openstack/neutron-ff19-account-create-update-r4s5q" Feb 02 12:02:26 crc kubenswrapper[4909]: I0202 12:02:26.428153 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsvxd\" (UniqueName: \"kubernetes.io/projected/749a32f4-4c60-4f2c-bc33-eb38e8eaddd6-kube-api-access-zsvxd\") pod \"neutron-db-create-wv6jm\" (UID: \"749a32f4-4c60-4f2c-bc33-eb38e8eaddd6\") " pod="openstack/neutron-db-create-wv6jm" Feb 02 12:02:26 crc kubenswrapper[4909]: I0202 12:02:26.428776 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmjqx\" (UniqueName: \"kubernetes.io/projected/4c842981-834f-414f-b66e-08d8f20a4b06-kube-api-access-cmjqx\") pod \"neutron-ff19-account-create-update-r4s5q\" (UID: \"4c842981-834f-414f-b66e-08d8f20a4b06\") " pod="openstack/neutron-ff19-account-create-update-r4s5q" Feb 02 12:02:26 crc 
kubenswrapper[4909]: I0202 12:02:26.583529 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wv6jm" Feb 02 12:02:26 crc kubenswrapper[4909]: I0202 12:02:26.595018 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ff19-account-create-update-r4s5q" Feb 02 12:02:27 crc kubenswrapper[4909]: I0202 12:02:27.012620 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ff19-account-create-update-r4s5q"] Feb 02 12:02:27 crc kubenswrapper[4909]: I0202 12:02:27.086384 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-wv6jm"] Feb 02 12:02:27 crc kubenswrapper[4909]: W0202 12:02:27.093373 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod749a32f4_4c60_4f2c_bc33_eb38e8eaddd6.slice/crio-066fa181f35ae36db53159aad60fb475dda243b60487bc6b666e1900a4109e36 WatchSource:0}: Error finding container 066fa181f35ae36db53159aad60fb475dda243b60487bc6b666e1900a4109e36: Status 404 returned error can't find the container with id 066fa181f35ae36db53159aad60fb475dda243b60487bc6b666e1900a4109e36 Feb 02 12:02:27 crc kubenswrapper[4909]: I0202 12:02:27.230059 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wv6jm" event={"ID":"749a32f4-4c60-4f2c-bc33-eb38e8eaddd6","Type":"ContainerStarted","Data":"066fa181f35ae36db53159aad60fb475dda243b60487bc6b666e1900a4109e36"} Feb 02 12:02:27 crc kubenswrapper[4909]: I0202 12:02:27.236252 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ff19-account-create-update-r4s5q" event={"ID":"4c842981-834f-414f-b66e-08d8f20a4b06","Type":"ContainerStarted","Data":"372ed309172d584b41df6708891739e8ebbfcf567d87604e8135b3774679eb16"} Feb 02 12:02:27 crc kubenswrapper[4909]: I0202 12:02:27.237270 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-ff19-account-create-update-r4s5q" event={"ID":"4c842981-834f-414f-b66e-08d8f20a4b06","Type":"ContainerStarted","Data":"074828dba0a7c6f8f6512c06be295129e11dc8c9a2d971e107fe9bfc4496ef06"} Feb 02 12:02:27 crc kubenswrapper[4909]: I0202 12:02:27.252476 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-wv6jm" podStartSLOduration=1.252458043 podStartE2EDuration="1.252458043s" podCreationTimestamp="2026-02-02 12:02:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:02:27.243547925 +0000 UTC m=+5472.989648660" watchObservedRunningTime="2026-02-02 12:02:27.252458043 +0000 UTC m=+5472.998558778" Feb 02 12:02:27 crc kubenswrapper[4909]: I0202 12:02:27.264356 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-ff19-account-create-update-r4s5q" podStartSLOduration=1.264335704 podStartE2EDuration="1.264335704s" podCreationTimestamp="2026-02-02 12:02:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:02:27.259216951 +0000 UTC m=+5473.005317686" watchObservedRunningTime="2026-02-02 12:02:27.264335704 +0000 UTC m=+5473.010436429" Feb 02 12:02:28 crc kubenswrapper[4909]: I0202 12:02:28.259139 4909 generic.go:334] "Generic (PLEG): container finished" podID="749a32f4-4c60-4f2c-bc33-eb38e8eaddd6" containerID="61427af6e27e432f4da4bd3a05d4ac6b158193e75d7477d20ddea56e4baee4dd" exitCode=0 Feb 02 12:02:28 crc kubenswrapper[4909]: I0202 12:02:28.259202 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wv6jm" event={"ID":"749a32f4-4c60-4f2c-bc33-eb38e8eaddd6","Type":"ContainerDied","Data":"61427af6e27e432f4da4bd3a05d4ac6b158193e75d7477d20ddea56e4baee4dd"} Feb 02 12:02:28 crc kubenswrapper[4909]: I0202 12:02:28.261359 4909 
generic.go:334] "Generic (PLEG): container finished" podID="4c842981-834f-414f-b66e-08d8f20a4b06" containerID="372ed309172d584b41df6708891739e8ebbfcf567d87604e8135b3774679eb16" exitCode=0 Feb 02 12:02:28 crc kubenswrapper[4909]: I0202 12:02:28.261395 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ff19-account-create-update-r4s5q" event={"ID":"4c842981-834f-414f-b66e-08d8f20a4b06","Type":"ContainerDied","Data":"372ed309172d584b41df6708891739e8ebbfcf567d87604e8135b3774679eb16"} Feb 02 12:02:29 crc kubenswrapper[4909]: I0202 12:02:29.681329 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ff19-account-create-update-r4s5q" Feb 02 12:02:29 crc kubenswrapper[4909]: I0202 12:02:29.688783 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wv6jm" Feb 02 12:02:29 crc kubenswrapper[4909]: I0202 12:02:29.762158 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsvxd\" (UniqueName: \"kubernetes.io/projected/749a32f4-4c60-4f2c-bc33-eb38e8eaddd6-kube-api-access-zsvxd\") pod \"749a32f4-4c60-4f2c-bc33-eb38e8eaddd6\" (UID: \"749a32f4-4c60-4f2c-bc33-eb38e8eaddd6\") " Feb 02 12:02:29 crc kubenswrapper[4909]: I0202 12:02:29.762256 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/749a32f4-4c60-4f2c-bc33-eb38e8eaddd6-operator-scripts\") pod \"749a32f4-4c60-4f2c-bc33-eb38e8eaddd6\" (UID: \"749a32f4-4c60-4f2c-bc33-eb38e8eaddd6\") " Feb 02 12:02:29 crc kubenswrapper[4909]: I0202 12:02:29.762305 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmjqx\" (UniqueName: \"kubernetes.io/projected/4c842981-834f-414f-b66e-08d8f20a4b06-kube-api-access-cmjqx\") pod \"4c842981-834f-414f-b66e-08d8f20a4b06\" (UID: \"4c842981-834f-414f-b66e-08d8f20a4b06\") " Feb 02 
12:02:29 crc kubenswrapper[4909]: I0202 12:02:29.762421 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c842981-834f-414f-b66e-08d8f20a4b06-operator-scripts\") pod \"4c842981-834f-414f-b66e-08d8f20a4b06\" (UID: \"4c842981-834f-414f-b66e-08d8f20a4b06\") " Feb 02 12:02:29 crc kubenswrapper[4909]: I0202 12:02:29.762860 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/749a32f4-4c60-4f2c-bc33-eb38e8eaddd6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "749a32f4-4c60-4f2c-bc33-eb38e8eaddd6" (UID: "749a32f4-4c60-4f2c-bc33-eb38e8eaddd6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:02:29 crc kubenswrapper[4909]: I0202 12:02:29.763149 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c842981-834f-414f-b66e-08d8f20a4b06-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4c842981-834f-414f-b66e-08d8f20a4b06" (UID: "4c842981-834f-414f-b66e-08d8f20a4b06"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:02:29 crc kubenswrapper[4909]: I0202 12:02:29.768414 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c842981-834f-414f-b66e-08d8f20a4b06-kube-api-access-cmjqx" (OuterVolumeSpecName: "kube-api-access-cmjqx") pod "4c842981-834f-414f-b66e-08d8f20a4b06" (UID: "4c842981-834f-414f-b66e-08d8f20a4b06"). InnerVolumeSpecName "kube-api-access-cmjqx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:02:29 crc kubenswrapper[4909]: I0202 12:02:29.769031 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/749a32f4-4c60-4f2c-bc33-eb38e8eaddd6-kube-api-access-zsvxd" (OuterVolumeSpecName: "kube-api-access-zsvxd") pod "749a32f4-4c60-4f2c-bc33-eb38e8eaddd6" (UID: "749a32f4-4c60-4f2c-bc33-eb38e8eaddd6"). InnerVolumeSpecName "kube-api-access-zsvxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:02:29 crc kubenswrapper[4909]: I0202 12:02:29.864601 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsvxd\" (UniqueName: \"kubernetes.io/projected/749a32f4-4c60-4f2c-bc33-eb38e8eaddd6-kube-api-access-zsvxd\") on node \"crc\" DevicePath \"\"" Feb 02 12:02:29 crc kubenswrapper[4909]: I0202 12:02:29.864645 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/749a32f4-4c60-4f2c-bc33-eb38e8eaddd6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:02:29 crc kubenswrapper[4909]: I0202 12:02:29.864657 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmjqx\" (UniqueName: \"kubernetes.io/projected/4c842981-834f-414f-b66e-08d8f20a4b06-kube-api-access-cmjqx\") on node \"crc\" DevicePath \"\"" Feb 02 12:02:29 crc kubenswrapper[4909]: I0202 12:02:29.864665 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c842981-834f-414f-b66e-08d8f20a4b06-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:02:30 crc kubenswrapper[4909]: I0202 12:02:30.281082 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wv6jm" event={"ID":"749a32f4-4c60-4f2c-bc33-eb38e8eaddd6","Type":"ContainerDied","Data":"066fa181f35ae36db53159aad60fb475dda243b60487bc6b666e1900a4109e36"} Feb 02 12:02:30 crc kubenswrapper[4909]: I0202 12:02:30.281455 
4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="066fa181f35ae36db53159aad60fb475dda243b60487bc6b666e1900a4109e36" Feb 02 12:02:30 crc kubenswrapper[4909]: I0202 12:02:30.281529 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wv6jm" Feb 02 12:02:30 crc kubenswrapper[4909]: I0202 12:02:30.287161 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ff19-account-create-update-r4s5q" event={"ID":"4c842981-834f-414f-b66e-08d8f20a4b06","Type":"ContainerDied","Data":"074828dba0a7c6f8f6512c06be295129e11dc8c9a2d971e107fe9bfc4496ef06"} Feb 02 12:02:30 crc kubenswrapper[4909]: I0202 12:02:30.287249 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="074828dba0a7c6f8f6512c06be295129e11dc8c9a2d971e107fe9bfc4496ef06" Feb 02 12:02:30 crc kubenswrapper[4909]: I0202 12:02:30.287260 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ff19-account-create-update-r4s5q" Feb 02 12:02:31 crc kubenswrapper[4909]: I0202 12:02:31.528520 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-n5npn"] Feb 02 12:02:31 crc kubenswrapper[4909]: E0202 12:02:31.529277 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c842981-834f-414f-b66e-08d8f20a4b06" containerName="mariadb-account-create-update" Feb 02 12:02:31 crc kubenswrapper[4909]: I0202 12:02:31.529297 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c842981-834f-414f-b66e-08d8f20a4b06" containerName="mariadb-account-create-update" Feb 02 12:02:31 crc kubenswrapper[4909]: E0202 12:02:31.529311 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="749a32f4-4c60-4f2c-bc33-eb38e8eaddd6" containerName="mariadb-database-create" Feb 02 12:02:31 crc kubenswrapper[4909]: I0202 12:02:31.529319 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="749a32f4-4c60-4f2c-bc33-eb38e8eaddd6" containerName="mariadb-database-create" Feb 02 12:02:31 crc kubenswrapper[4909]: I0202 12:02:31.529508 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c842981-834f-414f-b66e-08d8f20a4b06" containerName="mariadb-account-create-update" Feb 02 12:02:31 crc kubenswrapper[4909]: I0202 12:02:31.529539 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="749a32f4-4c60-4f2c-bc33-eb38e8eaddd6" containerName="mariadb-database-create" Feb 02 12:02:31 crc kubenswrapper[4909]: I0202 12:02:31.530238 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-n5npn" Feb 02 12:02:31 crc kubenswrapper[4909]: I0202 12:02:31.533388 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-f949b" Feb 02 12:02:31 crc kubenswrapper[4909]: I0202 12:02:31.534604 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 02 12:02:31 crc kubenswrapper[4909]: I0202 12:02:31.534763 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 02 12:02:31 crc kubenswrapper[4909]: I0202 12:02:31.539172 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-n5npn"] Feb 02 12:02:31 crc kubenswrapper[4909]: I0202 12:02:31.590918 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c5164255-168d-4b6f-85de-d544b0b0642e-config\") pod \"neutron-db-sync-n5npn\" (UID: \"c5164255-168d-4b6f-85de-d544b0b0642e\") " pod="openstack/neutron-db-sync-n5npn" Feb 02 12:02:31 crc kubenswrapper[4909]: I0202 12:02:31.590972 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw4wk\" (UniqueName: \"kubernetes.io/projected/c5164255-168d-4b6f-85de-d544b0b0642e-kube-api-access-gw4wk\") pod \"neutron-db-sync-n5npn\" (UID: \"c5164255-168d-4b6f-85de-d544b0b0642e\") " pod="openstack/neutron-db-sync-n5npn" Feb 02 12:02:31 crc kubenswrapper[4909]: I0202 12:02:31.590994 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5164255-168d-4b6f-85de-d544b0b0642e-combined-ca-bundle\") pod \"neutron-db-sync-n5npn\" (UID: \"c5164255-168d-4b6f-85de-d544b0b0642e\") " pod="openstack/neutron-db-sync-n5npn" Feb 02 12:02:31 crc kubenswrapper[4909]: I0202 12:02:31.692419 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c5164255-168d-4b6f-85de-d544b0b0642e-config\") pod \"neutron-db-sync-n5npn\" (UID: \"c5164255-168d-4b6f-85de-d544b0b0642e\") " pod="openstack/neutron-db-sync-n5npn" Feb 02 12:02:31 crc kubenswrapper[4909]: I0202 12:02:31.692715 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw4wk\" (UniqueName: \"kubernetes.io/projected/c5164255-168d-4b6f-85de-d544b0b0642e-kube-api-access-gw4wk\") pod \"neutron-db-sync-n5npn\" (UID: \"c5164255-168d-4b6f-85de-d544b0b0642e\") " pod="openstack/neutron-db-sync-n5npn" Feb 02 12:02:31 crc kubenswrapper[4909]: I0202 12:02:31.693145 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5164255-168d-4b6f-85de-d544b0b0642e-combined-ca-bundle\") pod \"neutron-db-sync-n5npn\" (UID: \"c5164255-168d-4b6f-85de-d544b0b0642e\") " pod="openstack/neutron-db-sync-n5npn" Feb 02 12:02:31 crc kubenswrapper[4909]: I0202 12:02:31.698914 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5164255-168d-4b6f-85de-d544b0b0642e-combined-ca-bundle\") pod \"neutron-db-sync-n5npn\" (UID: \"c5164255-168d-4b6f-85de-d544b0b0642e\") " pod="openstack/neutron-db-sync-n5npn" Feb 02 12:02:31 crc kubenswrapper[4909]: I0202 12:02:31.698982 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c5164255-168d-4b6f-85de-d544b0b0642e-config\") pod \"neutron-db-sync-n5npn\" (UID: \"c5164255-168d-4b6f-85de-d544b0b0642e\") " pod="openstack/neutron-db-sync-n5npn" Feb 02 12:02:31 crc kubenswrapper[4909]: I0202 12:02:31.712038 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw4wk\" (UniqueName: 
\"kubernetes.io/projected/c5164255-168d-4b6f-85de-d544b0b0642e-kube-api-access-gw4wk\") pod \"neutron-db-sync-n5npn\" (UID: \"c5164255-168d-4b6f-85de-d544b0b0642e\") " pod="openstack/neutron-db-sync-n5npn" Feb 02 12:02:31 crc kubenswrapper[4909]: I0202 12:02:31.852390 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-n5npn" Feb 02 12:02:32 crc kubenswrapper[4909]: I0202 12:02:32.278979 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-n5npn"] Feb 02 12:02:32 crc kubenswrapper[4909]: I0202 12:02:32.305036 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-n5npn" event={"ID":"c5164255-168d-4b6f-85de-d544b0b0642e","Type":"ContainerStarted","Data":"3c9fd7cc4f00a77bda60211708e7793d5d0563cbe3812e6a7c2f118d4a4da1fb"} Feb 02 12:02:33 crc kubenswrapper[4909]: I0202 12:02:33.315470 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-n5npn" event={"ID":"c5164255-168d-4b6f-85de-d544b0b0642e","Type":"ContainerStarted","Data":"48580c5d41b1035b23a3ff31fb0e276385d4a59af11fd04d587c3fc8e18931b7"} Feb 02 12:02:33 crc kubenswrapper[4909]: I0202 12:02:33.333596 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-n5npn" podStartSLOduration=2.333571536 podStartE2EDuration="2.333571536s" podCreationTimestamp="2026-02-02 12:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:02:33.332080854 +0000 UTC m=+5479.078181599" watchObservedRunningTime="2026-02-02 12:02:33.333571536 +0000 UTC m=+5479.079672271" Feb 02 12:02:36 crc kubenswrapper[4909]: I0202 12:02:36.344845 4909 generic.go:334] "Generic (PLEG): container finished" podID="c5164255-168d-4b6f-85de-d544b0b0642e" containerID="48580c5d41b1035b23a3ff31fb0e276385d4a59af11fd04d587c3fc8e18931b7" exitCode=0 Feb 02 12:02:36 crc 
kubenswrapper[4909]: I0202 12:02:36.344940 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-n5npn" event={"ID":"c5164255-168d-4b6f-85de-d544b0b0642e","Type":"ContainerDied","Data":"48580c5d41b1035b23a3ff31fb0e276385d4a59af11fd04d587c3fc8e18931b7"} Feb 02 12:02:37 crc kubenswrapper[4909]: I0202 12:02:37.714642 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-n5npn" Feb 02 12:02:37 crc kubenswrapper[4909]: I0202 12:02:37.815593 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gw4wk\" (UniqueName: \"kubernetes.io/projected/c5164255-168d-4b6f-85de-d544b0b0642e-kube-api-access-gw4wk\") pod \"c5164255-168d-4b6f-85de-d544b0b0642e\" (UID: \"c5164255-168d-4b6f-85de-d544b0b0642e\") " Feb 02 12:02:37 crc kubenswrapper[4909]: I0202 12:02:37.815681 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c5164255-168d-4b6f-85de-d544b0b0642e-config\") pod \"c5164255-168d-4b6f-85de-d544b0b0642e\" (UID: \"c5164255-168d-4b6f-85de-d544b0b0642e\") " Feb 02 12:02:37 crc kubenswrapper[4909]: I0202 12:02:37.816661 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5164255-168d-4b6f-85de-d544b0b0642e-combined-ca-bundle\") pod \"c5164255-168d-4b6f-85de-d544b0b0642e\" (UID: \"c5164255-168d-4b6f-85de-d544b0b0642e\") " Feb 02 12:02:37 crc kubenswrapper[4909]: I0202 12:02:37.822648 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5164255-168d-4b6f-85de-d544b0b0642e-kube-api-access-gw4wk" (OuterVolumeSpecName: "kube-api-access-gw4wk") pod "c5164255-168d-4b6f-85de-d544b0b0642e" (UID: "c5164255-168d-4b6f-85de-d544b0b0642e"). InnerVolumeSpecName "kube-api-access-gw4wk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:02:37 crc kubenswrapper[4909]: I0202 12:02:37.845075 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5164255-168d-4b6f-85de-d544b0b0642e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5164255-168d-4b6f-85de-d544b0b0642e" (UID: "c5164255-168d-4b6f-85de-d544b0b0642e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:02:37 crc kubenswrapper[4909]: I0202 12:02:37.851701 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5164255-168d-4b6f-85de-d544b0b0642e-config" (OuterVolumeSpecName: "config") pod "c5164255-168d-4b6f-85de-d544b0b0642e" (UID: "c5164255-168d-4b6f-85de-d544b0b0642e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:02:37 crc kubenswrapper[4909]: I0202 12:02:37.919328 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gw4wk\" (UniqueName: \"kubernetes.io/projected/c5164255-168d-4b6f-85de-d544b0b0642e-kube-api-access-gw4wk\") on node \"crc\" DevicePath \"\"" Feb 02 12:02:37 crc kubenswrapper[4909]: I0202 12:02:37.919373 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c5164255-168d-4b6f-85de-d544b0b0642e-config\") on node \"crc\" DevicePath \"\"" Feb 02 12:02:37 crc kubenswrapper[4909]: I0202 12:02:37.919384 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5164255-168d-4b6f-85de-d544b0b0642e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.359524 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-n5npn" 
event={"ID":"c5164255-168d-4b6f-85de-d544b0b0642e","Type":"ContainerDied","Data":"3c9fd7cc4f00a77bda60211708e7793d5d0563cbe3812e6a7c2f118d4a4da1fb"} Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.359994 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c9fd7cc4f00a77bda60211708e7793d5d0563cbe3812e6a7c2f118d4a4da1fb" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.359962 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-n5npn" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.524949 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-df8fdb97c-xcthz"] Feb 02 12:02:38 crc kubenswrapper[4909]: E0202 12:02:38.525629 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5164255-168d-4b6f-85de-d544b0b0642e" containerName="neutron-db-sync" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.525719 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5164255-168d-4b6f-85de-d544b0b0642e" containerName="neutron-db-sync" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.526035 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5164255-168d-4b6f-85de-d544b0b0642e" containerName="neutron-db-sync" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.527319 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-df8fdb97c-xcthz" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.552296 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-df8fdb97c-xcthz"] Feb 02 12:02:38 crc kubenswrapper[4909]: E0202 12:02:38.573025 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5164255_168d_4b6f_85de_d544b0b0642e.slice/crio-3c9fd7cc4f00a77bda60211708e7793d5d0563cbe3812e6a7c2f118d4a4da1fb\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5164255_168d_4b6f_85de_d544b0b0642e.slice\": RecentStats: unable to find data in memory cache]" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.642231 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1eec544-d8ae-4672-b520-57b7fc00a655-dns-svc\") pod \"dnsmasq-dns-df8fdb97c-xcthz\" (UID: \"b1eec544-d8ae-4672-b520-57b7fc00a655\") " pod="openstack/dnsmasq-dns-df8fdb97c-xcthz" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.642271 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1eec544-d8ae-4672-b520-57b7fc00a655-config\") pod \"dnsmasq-dns-df8fdb97c-xcthz\" (UID: \"b1eec544-d8ae-4672-b520-57b7fc00a655\") " pod="openstack/dnsmasq-dns-df8fdb97c-xcthz" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.642361 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1eec544-d8ae-4672-b520-57b7fc00a655-ovsdbserver-nb\") pod \"dnsmasq-dns-df8fdb97c-xcthz\" (UID: \"b1eec544-d8ae-4672-b520-57b7fc00a655\") " pod="openstack/dnsmasq-dns-df8fdb97c-xcthz" Feb 
02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.642392 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvs4k\" (UniqueName: \"kubernetes.io/projected/b1eec544-d8ae-4672-b520-57b7fc00a655-kube-api-access-qvs4k\") pod \"dnsmasq-dns-df8fdb97c-xcthz\" (UID: \"b1eec544-d8ae-4672-b520-57b7fc00a655\") " pod="openstack/dnsmasq-dns-df8fdb97c-xcthz" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.642426 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1eec544-d8ae-4672-b520-57b7fc00a655-ovsdbserver-sb\") pod \"dnsmasq-dns-df8fdb97c-xcthz\" (UID: \"b1eec544-d8ae-4672-b520-57b7fc00a655\") " pod="openstack/dnsmasq-dns-df8fdb97c-xcthz" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.670883 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-755b685876-hnwhh"] Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.672241 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-755b685876-hnwhh" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.681421 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.681428 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-f949b" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.681927 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.682011 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.684191 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-755b685876-hnwhh"] Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.743801 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1eec544-d8ae-4672-b520-57b7fc00a655-dns-svc\") pod \"dnsmasq-dns-df8fdb97c-xcthz\" (UID: \"b1eec544-d8ae-4672-b520-57b7fc00a655\") " pod="openstack/dnsmasq-dns-df8fdb97c-xcthz" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.743861 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1eec544-d8ae-4672-b520-57b7fc00a655-config\") pod \"dnsmasq-dns-df8fdb97c-xcthz\" (UID: \"b1eec544-d8ae-4672-b520-57b7fc00a655\") " pod="openstack/dnsmasq-dns-df8fdb97c-xcthz" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.743977 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1eec544-d8ae-4672-b520-57b7fc00a655-ovsdbserver-nb\") pod \"dnsmasq-dns-df8fdb97c-xcthz\" (UID: \"b1eec544-d8ae-4672-b520-57b7fc00a655\") " 
pod="openstack/dnsmasq-dns-df8fdb97c-xcthz" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.744023 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvs4k\" (UniqueName: \"kubernetes.io/projected/b1eec544-d8ae-4672-b520-57b7fc00a655-kube-api-access-qvs4k\") pod \"dnsmasq-dns-df8fdb97c-xcthz\" (UID: \"b1eec544-d8ae-4672-b520-57b7fc00a655\") " pod="openstack/dnsmasq-dns-df8fdb97c-xcthz" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.744064 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1eec544-d8ae-4672-b520-57b7fc00a655-ovsdbserver-sb\") pod \"dnsmasq-dns-df8fdb97c-xcthz\" (UID: \"b1eec544-d8ae-4672-b520-57b7fc00a655\") " pod="openstack/dnsmasq-dns-df8fdb97c-xcthz" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.745101 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1eec544-d8ae-4672-b520-57b7fc00a655-ovsdbserver-sb\") pod \"dnsmasq-dns-df8fdb97c-xcthz\" (UID: \"b1eec544-d8ae-4672-b520-57b7fc00a655\") " pod="openstack/dnsmasq-dns-df8fdb97c-xcthz" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.745711 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1eec544-d8ae-4672-b520-57b7fc00a655-config\") pod \"dnsmasq-dns-df8fdb97c-xcthz\" (UID: \"b1eec544-d8ae-4672-b520-57b7fc00a655\") " pod="openstack/dnsmasq-dns-df8fdb97c-xcthz" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.748876 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1eec544-d8ae-4672-b520-57b7fc00a655-dns-svc\") pod \"dnsmasq-dns-df8fdb97c-xcthz\" (UID: \"b1eec544-d8ae-4672-b520-57b7fc00a655\") " pod="openstack/dnsmasq-dns-df8fdb97c-xcthz" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 
12:02:38.748951 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1eec544-d8ae-4672-b520-57b7fc00a655-ovsdbserver-nb\") pod \"dnsmasq-dns-df8fdb97c-xcthz\" (UID: \"b1eec544-d8ae-4672-b520-57b7fc00a655\") " pod="openstack/dnsmasq-dns-df8fdb97c-xcthz" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.777278 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvs4k\" (UniqueName: \"kubernetes.io/projected/b1eec544-d8ae-4672-b520-57b7fc00a655-kube-api-access-qvs4k\") pod \"dnsmasq-dns-df8fdb97c-xcthz\" (UID: \"b1eec544-d8ae-4672-b520-57b7fc00a655\") " pod="openstack/dnsmasq-dns-df8fdb97c-xcthz" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.845270 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54804e1f-3371-4774-b7b8-02a96c1d86fb-combined-ca-bundle\") pod \"neutron-755b685876-hnwhh\" (UID: \"54804e1f-3371-4774-b7b8-02a96c1d86fb\") " pod="openstack/neutron-755b685876-hnwhh" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.845332 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnc6k\" (UniqueName: \"kubernetes.io/projected/54804e1f-3371-4774-b7b8-02a96c1d86fb-kube-api-access-qnc6k\") pod \"neutron-755b685876-hnwhh\" (UID: \"54804e1f-3371-4774-b7b8-02a96c1d86fb\") " pod="openstack/neutron-755b685876-hnwhh" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.845365 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/54804e1f-3371-4774-b7b8-02a96c1d86fb-httpd-config\") pod \"neutron-755b685876-hnwhh\" (UID: \"54804e1f-3371-4774-b7b8-02a96c1d86fb\") " pod="openstack/neutron-755b685876-hnwhh" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.845426 
4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/54804e1f-3371-4774-b7b8-02a96c1d86fb-config\") pod \"neutron-755b685876-hnwhh\" (UID: \"54804e1f-3371-4774-b7b8-02a96c1d86fb\") " pod="openstack/neutron-755b685876-hnwhh" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.845495 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/54804e1f-3371-4774-b7b8-02a96c1d86fb-ovndb-tls-certs\") pod \"neutron-755b685876-hnwhh\" (UID: \"54804e1f-3371-4774-b7b8-02a96c1d86fb\") " pod="openstack/neutron-755b685876-hnwhh" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.849986 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-df8fdb97c-xcthz" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.946475 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnc6k\" (UniqueName: \"kubernetes.io/projected/54804e1f-3371-4774-b7b8-02a96c1d86fb-kube-api-access-qnc6k\") pod \"neutron-755b685876-hnwhh\" (UID: \"54804e1f-3371-4774-b7b8-02a96c1d86fb\") " pod="openstack/neutron-755b685876-hnwhh" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.946914 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/54804e1f-3371-4774-b7b8-02a96c1d86fb-httpd-config\") pod \"neutron-755b685876-hnwhh\" (UID: \"54804e1f-3371-4774-b7b8-02a96c1d86fb\") " pod="openstack/neutron-755b685876-hnwhh" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.946956 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/54804e1f-3371-4774-b7b8-02a96c1d86fb-config\") pod \"neutron-755b685876-hnwhh\" (UID: \"54804e1f-3371-4774-b7b8-02a96c1d86fb\") " 
pod="openstack/neutron-755b685876-hnwhh" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.946999 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/54804e1f-3371-4774-b7b8-02a96c1d86fb-ovndb-tls-certs\") pod \"neutron-755b685876-hnwhh\" (UID: \"54804e1f-3371-4774-b7b8-02a96c1d86fb\") " pod="openstack/neutron-755b685876-hnwhh" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.947059 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54804e1f-3371-4774-b7b8-02a96c1d86fb-combined-ca-bundle\") pod \"neutron-755b685876-hnwhh\" (UID: \"54804e1f-3371-4774-b7b8-02a96c1d86fb\") " pod="openstack/neutron-755b685876-hnwhh" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.951669 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54804e1f-3371-4774-b7b8-02a96c1d86fb-combined-ca-bundle\") pod \"neutron-755b685876-hnwhh\" (UID: \"54804e1f-3371-4774-b7b8-02a96c1d86fb\") " pod="openstack/neutron-755b685876-hnwhh" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.952863 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/54804e1f-3371-4774-b7b8-02a96c1d86fb-ovndb-tls-certs\") pod \"neutron-755b685876-hnwhh\" (UID: \"54804e1f-3371-4774-b7b8-02a96c1d86fb\") " pod="openstack/neutron-755b685876-hnwhh" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.953836 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/54804e1f-3371-4774-b7b8-02a96c1d86fb-httpd-config\") pod \"neutron-755b685876-hnwhh\" (UID: \"54804e1f-3371-4774-b7b8-02a96c1d86fb\") " pod="openstack/neutron-755b685876-hnwhh" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.955453 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/54804e1f-3371-4774-b7b8-02a96c1d86fb-config\") pod \"neutron-755b685876-hnwhh\" (UID: \"54804e1f-3371-4774-b7b8-02a96c1d86fb\") " pod="openstack/neutron-755b685876-hnwhh" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.979305 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnc6k\" (UniqueName: \"kubernetes.io/projected/54804e1f-3371-4774-b7b8-02a96c1d86fb-kube-api-access-qnc6k\") pod \"neutron-755b685876-hnwhh\" (UID: \"54804e1f-3371-4774-b7b8-02a96c1d86fb\") " pod="openstack/neutron-755b685876-hnwhh" Feb 02 12:02:38 crc kubenswrapper[4909]: I0202 12:02:38.992218 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-755b685876-hnwhh" Feb 02 12:02:39 crc kubenswrapper[4909]: I0202 12:02:39.359903 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-df8fdb97c-xcthz"] Feb 02 12:02:39 crc kubenswrapper[4909]: I0202 12:02:39.611129 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-755b685876-hnwhh"] Feb 02 12:02:40 crc kubenswrapper[4909]: I0202 12:02:40.373764 4909 generic.go:334] "Generic (PLEG): container finished" podID="b1eec544-d8ae-4672-b520-57b7fc00a655" containerID="0c8903278f74ee0dba8f303255973f079f028baa694f87b4b846b146910b5c9e" exitCode=0 Feb 02 12:02:40 crc kubenswrapper[4909]: I0202 12:02:40.373908 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df8fdb97c-xcthz" event={"ID":"b1eec544-d8ae-4672-b520-57b7fc00a655","Type":"ContainerDied","Data":"0c8903278f74ee0dba8f303255973f079f028baa694f87b4b846b146910b5c9e"} Feb 02 12:02:40 crc kubenswrapper[4909]: I0202 12:02:40.374293 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df8fdb97c-xcthz" 
event={"ID":"b1eec544-d8ae-4672-b520-57b7fc00a655","Type":"ContainerStarted","Data":"61fa65b8d6c7ff0bf30b42f300881e6b9fe4b0114974f609f010d6b89436287d"} Feb 02 12:02:40 crc kubenswrapper[4909]: I0202 12:02:40.377709 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-755b685876-hnwhh" event={"ID":"54804e1f-3371-4774-b7b8-02a96c1d86fb","Type":"ContainerStarted","Data":"4a7170c68b93d21a79b447cbd718d5686ec9ef3eda8f5b03d3e1984ffc372de2"} Feb 02 12:02:40 crc kubenswrapper[4909]: I0202 12:02:40.377739 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-755b685876-hnwhh" event={"ID":"54804e1f-3371-4774-b7b8-02a96c1d86fb","Type":"ContainerStarted","Data":"528006a1353af2132222d52dde565bf2fe7a70fc14c9a4f5cc67224d3b57eccb"} Feb 02 12:02:40 crc kubenswrapper[4909]: I0202 12:02:40.377751 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-755b685876-hnwhh" event={"ID":"54804e1f-3371-4774-b7b8-02a96c1d86fb","Type":"ContainerStarted","Data":"abeb214c0ea56bf5c9a7e217e79f0202bd4cad8e89596b11520e9246f3701f17"} Feb 02 12:02:40 crc kubenswrapper[4909]: I0202 12:02:40.377847 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-755b685876-hnwhh" Feb 02 12:02:40 crc kubenswrapper[4909]: I0202 12:02:40.474999 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-755b685876-hnwhh" podStartSLOduration=2.474980864 podStartE2EDuration="2.474980864s" podCreationTimestamp="2026-02-02 12:02:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:02:40.472061631 +0000 UTC m=+5486.218162376" watchObservedRunningTime="2026-02-02 12:02:40.474980864 +0000 UTC m=+5486.221081599" Feb 02 12:02:41 crc kubenswrapper[4909]: I0202 12:02:41.393459 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df8fdb97c-xcthz" 
event={"ID":"b1eec544-d8ae-4672-b520-57b7fc00a655","Type":"ContainerStarted","Data":"4bb589ba24d04c090fb5338a1b56a18f57e2588c37f1978323264995fc5d478b"} Feb 02 12:02:41 crc kubenswrapper[4909]: I0202 12:02:41.393821 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-df8fdb97c-xcthz" Feb 02 12:02:41 crc kubenswrapper[4909]: I0202 12:02:41.420640 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-df8fdb97c-xcthz" podStartSLOduration=3.420620844 podStartE2EDuration="3.420620844s" podCreationTimestamp="2026-02-02 12:02:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:02:41.413725658 +0000 UTC m=+5487.159826393" watchObservedRunningTime="2026-02-02 12:02:41.420620844 +0000 UTC m=+5487.166721579" Feb 02 12:02:41 crc kubenswrapper[4909]: I0202 12:02:41.438429 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-864bddcb8f-jdfcj"] Feb 02 12:02:41 crc kubenswrapper[4909]: I0202 12:02:41.439751 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-864bddcb8f-jdfcj" Feb 02 12:02:41 crc kubenswrapper[4909]: I0202 12:02:41.442129 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 02 12:02:41 crc kubenswrapper[4909]: I0202 12:02:41.442188 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 02 12:02:41 crc kubenswrapper[4909]: I0202 12:02:41.452348 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-864bddcb8f-jdfcj"] Feb 02 12:02:41 crc kubenswrapper[4909]: I0202 12:02:41.596510 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/544c4791-365d-409b-9d1c-7c6408c865ca-internal-tls-certs\") pod \"neutron-864bddcb8f-jdfcj\" (UID: \"544c4791-365d-409b-9d1c-7c6408c865ca\") " pod="openstack/neutron-864bddcb8f-jdfcj" Feb 02 12:02:41 crc kubenswrapper[4909]: I0202 12:02:41.596560 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/544c4791-365d-409b-9d1c-7c6408c865ca-httpd-config\") pod \"neutron-864bddcb8f-jdfcj\" (UID: \"544c4791-365d-409b-9d1c-7c6408c865ca\") " pod="openstack/neutron-864bddcb8f-jdfcj" Feb 02 12:02:41 crc kubenswrapper[4909]: I0202 12:02:41.596613 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/544c4791-365d-409b-9d1c-7c6408c865ca-ovndb-tls-certs\") pod \"neutron-864bddcb8f-jdfcj\" (UID: \"544c4791-365d-409b-9d1c-7c6408c865ca\") " pod="openstack/neutron-864bddcb8f-jdfcj" Feb 02 12:02:41 crc kubenswrapper[4909]: I0202 12:02:41.596638 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/544c4791-365d-409b-9d1c-7c6408c865ca-public-tls-certs\") pod \"neutron-864bddcb8f-jdfcj\" (UID: \"544c4791-365d-409b-9d1c-7c6408c865ca\") " pod="openstack/neutron-864bddcb8f-jdfcj" Feb 02 12:02:41 crc kubenswrapper[4909]: I0202 12:02:41.596665 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/544c4791-365d-409b-9d1c-7c6408c865ca-combined-ca-bundle\") pod \"neutron-864bddcb8f-jdfcj\" (UID: \"544c4791-365d-409b-9d1c-7c6408c865ca\") " pod="openstack/neutron-864bddcb8f-jdfcj" Feb 02 12:02:41 crc kubenswrapper[4909]: I0202 12:02:41.596685 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/544c4791-365d-409b-9d1c-7c6408c865ca-config\") pod \"neutron-864bddcb8f-jdfcj\" (UID: \"544c4791-365d-409b-9d1c-7c6408c865ca\") " pod="openstack/neutron-864bddcb8f-jdfcj" Feb 02 12:02:41 crc kubenswrapper[4909]: I0202 12:02:41.596721 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwzbc\" (UniqueName: \"kubernetes.io/projected/544c4791-365d-409b-9d1c-7c6408c865ca-kube-api-access-nwzbc\") pod \"neutron-864bddcb8f-jdfcj\" (UID: \"544c4791-365d-409b-9d1c-7c6408c865ca\") " pod="openstack/neutron-864bddcb8f-jdfcj" Feb 02 12:02:41 crc kubenswrapper[4909]: I0202 12:02:41.698769 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/544c4791-365d-409b-9d1c-7c6408c865ca-config\") pod \"neutron-864bddcb8f-jdfcj\" (UID: \"544c4791-365d-409b-9d1c-7c6408c865ca\") " pod="openstack/neutron-864bddcb8f-jdfcj" Feb 02 12:02:41 crc kubenswrapper[4909]: I0202 12:02:41.699269 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwzbc\" (UniqueName: 
\"kubernetes.io/projected/544c4791-365d-409b-9d1c-7c6408c865ca-kube-api-access-nwzbc\") pod \"neutron-864bddcb8f-jdfcj\" (UID: \"544c4791-365d-409b-9d1c-7c6408c865ca\") " pod="openstack/neutron-864bddcb8f-jdfcj" Feb 02 12:02:41 crc kubenswrapper[4909]: I0202 12:02:41.699405 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/544c4791-365d-409b-9d1c-7c6408c865ca-internal-tls-certs\") pod \"neutron-864bddcb8f-jdfcj\" (UID: \"544c4791-365d-409b-9d1c-7c6408c865ca\") " pod="openstack/neutron-864bddcb8f-jdfcj" Feb 02 12:02:41 crc kubenswrapper[4909]: I0202 12:02:41.699494 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/544c4791-365d-409b-9d1c-7c6408c865ca-httpd-config\") pod \"neutron-864bddcb8f-jdfcj\" (UID: \"544c4791-365d-409b-9d1c-7c6408c865ca\") " pod="openstack/neutron-864bddcb8f-jdfcj" Feb 02 12:02:41 crc kubenswrapper[4909]: I0202 12:02:41.699622 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/544c4791-365d-409b-9d1c-7c6408c865ca-ovndb-tls-certs\") pod \"neutron-864bddcb8f-jdfcj\" (UID: \"544c4791-365d-409b-9d1c-7c6408c865ca\") " pod="openstack/neutron-864bddcb8f-jdfcj" Feb 02 12:02:41 crc kubenswrapper[4909]: I0202 12:02:41.699689 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/544c4791-365d-409b-9d1c-7c6408c865ca-public-tls-certs\") pod \"neutron-864bddcb8f-jdfcj\" (UID: \"544c4791-365d-409b-9d1c-7c6408c865ca\") " pod="openstack/neutron-864bddcb8f-jdfcj" Feb 02 12:02:41 crc kubenswrapper[4909]: I0202 12:02:41.699752 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/544c4791-365d-409b-9d1c-7c6408c865ca-combined-ca-bundle\") pod 
\"neutron-864bddcb8f-jdfcj\" (UID: \"544c4791-365d-409b-9d1c-7c6408c865ca\") " pod="openstack/neutron-864bddcb8f-jdfcj" Feb 02 12:02:41 crc kubenswrapper[4909]: I0202 12:02:41.705518 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/544c4791-365d-409b-9d1c-7c6408c865ca-ovndb-tls-certs\") pod \"neutron-864bddcb8f-jdfcj\" (UID: \"544c4791-365d-409b-9d1c-7c6408c865ca\") " pod="openstack/neutron-864bddcb8f-jdfcj" Feb 02 12:02:41 crc kubenswrapper[4909]: I0202 12:02:41.706255 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/544c4791-365d-409b-9d1c-7c6408c865ca-public-tls-certs\") pod \"neutron-864bddcb8f-jdfcj\" (UID: \"544c4791-365d-409b-9d1c-7c6408c865ca\") " pod="openstack/neutron-864bddcb8f-jdfcj" Feb 02 12:02:41 crc kubenswrapper[4909]: I0202 12:02:41.712405 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/544c4791-365d-409b-9d1c-7c6408c865ca-httpd-config\") pod \"neutron-864bddcb8f-jdfcj\" (UID: \"544c4791-365d-409b-9d1c-7c6408c865ca\") " pod="openstack/neutron-864bddcb8f-jdfcj" Feb 02 12:02:41 crc kubenswrapper[4909]: I0202 12:02:41.715673 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/544c4791-365d-409b-9d1c-7c6408c865ca-internal-tls-certs\") pod \"neutron-864bddcb8f-jdfcj\" (UID: \"544c4791-365d-409b-9d1c-7c6408c865ca\") " pod="openstack/neutron-864bddcb8f-jdfcj" Feb 02 12:02:41 crc kubenswrapper[4909]: I0202 12:02:41.716929 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/544c4791-365d-409b-9d1c-7c6408c865ca-config\") pod \"neutron-864bddcb8f-jdfcj\" (UID: \"544c4791-365d-409b-9d1c-7c6408c865ca\") " pod="openstack/neutron-864bddcb8f-jdfcj" Feb 02 12:02:41 crc 
kubenswrapper[4909]: I0202 12:02:41.717999 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/544c4791-365d-409b-9d1c-7c6408c865ca-combined-ca-bundle\") pod \"neutron-864bddcb8f-jdfcj\" (UID: \"544c4791-365d-409b-9d1c-7c6408c865ca\") " pod="openstack/neutron-864bddcb8f-jdfcj" Feb 02 12:02:41 crc kubenswrapper[4909]: I0202 12:02:41.724322 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwzbc\" (UniqueName: \"kubernetes.io/projected/544c4791-365d-409b-9d1c-7c6408c865ca-kube-api-access-nwzbc\") pod \"neutron-864bddcb8f-jdfcj\" (UID: \"544c4791-365d-409b-9d1c-7c6408c865ca\") " pod="openstack/neutron-864bddcb8f-jdfcj" Feb 02 12:02:41 crc kubenswrapper[4909]: I0202 12:02:41.761345 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-864bddcb8f-jdfcj" Feb 02 12:02:42 crc kubenswrapper[4909]: I0202 12:02:42.333131 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-864bddcb8f-jdfcj"] Feb 02 12:02:42 crc kubenswrapper[4909]: W0202 12:02:42.338357 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod544c4791_365d_409b_9d1c_7c6408c865ca.slice/crio-51233372b39137e471c233d1b614a2ef1903870dbdbcace5fd9de78c4302bb5f WatchSource:0}: Error finding container 51233372b39137e471c233d1b614a2ef1903870dbdbcace5fd9de78c4302bb5f: Status 404 returned error can't find the container with id 51233372b39137e471c233d1b614a2ef1903870dbdbcace5fd9de78c4302bb5f Feb 02 12:02:42 crc kubenswrapper[4909]: I0202 12:02:42.402207 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-864bddcb8f-jdfcj" event={"ID":"544c4791-365d-409b-9d1c-7c6408c865ca","Type":"ContainerStarted","Data":"51233372b39137e471c233d1b614a2ef1903870dbdbcace5fd9de78c4302bb5f"} Feb 02 12:02:43 crc kubenswrapper[4909]: I0202 12:02:43.066020 
4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-s4czs"] Feb 02 12:02:43 crc kubenswrapper[4909]: I0202 12:02:43.078254 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-s4czs"] Feb 02 12:02:43 crc kubenswrapper[4909]: I0202 12:02:43.412954 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-864bddcb8f-jdfcj" event={"ID":"544c4791-365d-409b-9d1c-7c6408c865ca","Type":"ContainerStarted","Data":"4239101cb9511c6b858ae314376af17e629a48d0aec92fee2cce880331bd4541"} Feb 02 12:02:43 crc kubenswrapper[4909]: I0202 12:02:43.413281 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-864bddcb8f-jdfcj" event={"ID":"544c4791-365d-409b-9d1c-7c6408c865ca","Type":"ContainerStarted","Data":"73936005a3a94fcefa8649fdb87e179c23ab8c3fcec33ea0a1037ca67fff8760"} Feb 02 12:02:43 crc kubenswrapper[4909]: I0202 12:02:43.413616 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-864bddcb8f-jdfcj" Feb 02 12:02:43 crc kubenswrapper[4909]: I0202 12:02:43.432873 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-864bddcb8f-jdfcj" podStartSLOduration=2.432853148 podStartE2EDuration="2.432853148s" podCreationTimestamp="2026-02-02 12:02:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:02:43.428409992 +0000 UTC m=+5489.174510727" watchObservedRunningTime="2026-02-02 12:02:43.432853148 +0000 UTC m=+5489.178953883" Feb 02 12:02:45 crc kubenswrapper[4909]: I0202 12:02:45.029741 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7faf3248-fd71-4f4e-960f-a89402aa822c" path="/var/lib/kubelet/pods/7faf3248-fd71-4f4e-960f-a89402aa822c/volumes" Feb 02 12:02:48 crc kubenswrapper[4909]: I0202 12:02:48.852018 4909 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/dnsmasq-dns-df8fdb97c-xcthz" Feb 02 12:02:48 crc kubenswrapper[4909]: I0202 12:02:48.912627 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75dd8f6b7-gp78r"] Feb 02 12:02:48 crc kubenswrapper[4909]: I0202 12:02:48.912962 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75dd8f6b7-gp78r" podUID="940bab09-bdad-49c2-b9f5-8326b62f98be" containerName="dnsmasq-dns" containerID="cri-o://2b67f6083332fd3a47409f2b2dab8b776cdf7cbc40c13f860eaf8930bec1051a" gracePeriod=10 Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.371204 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75dd8f6b7-gp78r" Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.473726 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49tzp\" (UniqueName: \"kubernetes.io/projected/940bab09-bdad-49c2-b9f5-8326b62f98be-kube-api-access-49tzp\") pod \"940bab09-bdad-49c2-b9f5-8326b62f98be\" (UID: \"940bab09-bdad-49c2-b9f5-8326b62f98be\") " Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.473873 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/940bab09-bdad-49c2-b9f5-8326b62f98be-ovsdbserver-nb\") pod \"940bab09-bdad-49c2-b9f5-8326b62f98be\" (UID: \"940bab09-bdad-49c2-b9f5-8326b62f98be\") " Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.473924 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/940bab09-bdad-49c2-b9f5-8326b62f98be-ovsdbserver-sb\") pod \"940bab09-bdad-49c2-b9f5-8326b62f98be\" (UID: \"940bab09-bdad-49c2-b9f5-8326b62f98be\") " Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.473963 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/940bab09-bdad-49c2-b9f5-8326b62f98be-dns-svc\") pod \"940bab09-bdad-49c2-b9f5-8326b62f98be\" (UID: \"940bab09-bdad-49c2-b9f5-8326b62f98be\") " Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.474013 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/940bab09-bdad-49c2-b9f5-8326b62f98be-config\") pod \"940bab09-bdad-49c2-b9f5-8326b62f98be\" (UID: \"940bab09-bdad-49c2-b9f5-8326b62f98be\") " Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.481701 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/940bab09-bdad-49c2-b9f5-8326b62f98be-kube-api-access-49tzp" (OuterVolumeSpecName: "kube-api-access-49tzp") pod "940bab09-bdad-49c2-b9f5-8326b62f98be" (UID: "940bab09-bdad-49c2-b9f5-8326b62f98be"). InnerVolumeSpecName "kube-api-access-49tzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.485974 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dd8f6b7-gp78r" event={"ID":"940bab09-bdad-49c2-b9f5-8326b62f98be","Type":"ContainerDied","Data":"2b67f6083332fd3a47409f2b2dab8b776cdf7cbc40c13f860eaf8930bec1051a"} Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.486188 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75dd8f6b7-gp78r" Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.485758 4909 generic.go:334] "Generic (PLEG): container finished" podID="940bab09-bdad-49c2-b9f5-8326b62f98be" containerID="2b67f6083332fd3a47409f2b2dab8b776cdf7cbc40c13f860eaf8930bec1051a" exitCode=0 Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.486364 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75dd8f6b7-gp78r" event={"ID":"940bab09-bdad-49c2-b9f5-8326b62f98be","Type":"ContainerDied","Data":"6b5733fde5ef9ae73b58096ce605f0778868391c399c2df0135263f79a668d4f"} Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.486047 4909 scope.go:117] "RemoveContainer" containerID="2b67f6083332fd3a47409f2b2dab8b776cdf7cbc40c13f860eaf8930bec1051a" Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.510798 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.510905 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.510961 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.511598 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"c8a4532cb0e21b24c65ea2a45893f4d1a368d3a1bac20d18498ef08f25007082"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.511658 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://c8a4532cb0e21b24c65ea2a45893f4d1a368d3a1bac20d18498ef08f25007082" gracePeriod=600 Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.526774 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/940bab09-bdad-49c2-b9f5-8326b62f98be-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "940bab09-bdad-49c2-b9f5-8326b62f98be" (UID: "940bab09-bdad-49c2-b9f5-8326b62f98be"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.527583 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/940bab09-bdad-49c2-b9f5-8326b62f98be-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "940bab09-bdad-49c2-b9f5-8326b62f98be" (UID: "940bab09-bdad-49c2-b9f5-8326b62f98be"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.531645 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/940bab09-bdad-49c2-b9f5-8326b62f98be-config" (OuterVolumeSpecName: "config") pod "940bab09-bdad-49c2-b9f5-8326b62f98be" (UID: "940bab09-bdad-49c2-b9f5-8326b62f98be"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.540125 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/940bab09-bdad-49c2-b9f5-8326b62f98be-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "940bab09-bdad-49c2-b9f5-8326b62f98be" (UID: "940bab09-bdad-49c2-b9f5-8326b62f98be"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.576976 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/940bab09-bdad-49c2-b9f5-8326b62f98be-config\") on node \"crc\" DevicePath \"\"" Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.577018 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49tzp\" (UniqueName: \"kubernetes.io/projected/940bab09-bdad-49c2-b9f5-8326b62f98be-kube-api-access-49tzp\") on node \"crc\" DevicePath \"\"" Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.577035 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/940bab09-bdad-49c2-b9f5-8326b62f98be-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.577047 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/940bab09-bdad-49c2-b9f5-8326b62f98be-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.577057 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/940bab09-bdad-49c2-b9f5-8326b62f98be-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.639531 4909 scope.go:117] "RemoveContainer" 
containerID="cb2fc4375997dc65d99afd69b2a4d5a096dd810e6cb139f30ba34862314af19e" Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.671661 4909 scope.go:117] "RemoveContainer" containerID="2b67f6083332fd3a47409f2b2dab8b776cdf7cbc40c13f860eaf8930bec1051a" Feb 02 12:02:49 crc kubenswrapper[4909]: E0202 12:02:49.672311 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b67f6083332fd3a47409f2b2dab8b776cdf7cbc40c13f860eaf8930bec1051a\": container with ID starting with 2b67f6083332fd3a47409f2b2dab8b776cdf7cbc40c13f860eaf8930bec1051a not found: ID does not exist" containerID="2b67f6083332fd3a47409f2b2dab8b776cdf7cbc40c13f860eaf8930bec1051a" Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.672362 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b67f6083332fd3a47409f2b2dab8b776cdf7cbc40c13f860eaf8930bec1051a"} err="failed to get container status \"2b67f6083332fd3a47409f2b2dab8b776cdf7cbc40c13f860eaf8930bec1051a\": rpc error: code = NotFound desc = could not find container \"2b67f6083332fd3a47409f2b2dab8b776cdf7cbc40c13f860eaf8930bec1051a\": container with ID starting with 2b67f6083332fd3a47409f2b2dab8b776cdf7cbc40c13f860eaf8930bec1051a not found: ID does not exist" Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.672389 4909 scope.go:117] "RemoveContainer" containerID="cb2fc4375997dc65d99afd69b2a4d5a096dd810e6cb139f30ba34862314af19e" Feb 02 12:02:49 crc kubenswrapper[4909]: E0202 12:02:49.672730 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb2fc4375997dc65d99afd69b2a4d5a096dd810e6cb139f30ba34862314af19e\": container with ID starting with cb2fc4375997dc65d99afd69b2a4d5a096dd810e6cb139f30ba34862314af19e not found: ID does not exist" containerID="cb2fc4375997dc65d99afd69b2a4d5a096dd810e6cb139f30ba34862314af19e" Feb 02 12:02:49 crc 
kubenswrapper[4909]: I0202 12:02:49.672762 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb2fc4375997dc65d99afd69b2a4d5a096dd810e6cb139f30ba34862314af19e"} err="failed to get container status \"cb2fc4375997dc65d99afd69b2a4d5a096dd810e6cb139f30ba34862314af19e\": rpc error: code = NotFound desc = could not find container \"cb2fc4375997dc65d99afd69b2a4d5a096dd810e6cb139f30ba34862314af19e\": container with ID starting with cb2fc4375997dc65d99afd69b2a4d5a096dd810e6cb139f30ba34862314af19e not found: ID does not exist" Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.835038 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75dd8f6b7-gp78r"] Feb 02 12:02:49 crc kubenswrapper[4909]: I0202 12:02:49.844722 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75dd8f6b7-gp78r"] Feb 02 12:02:50 crc kubenswrapper[4909]: I0202 12:02:50.497466 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="c8a4532cb0e21b24c65ea2a45893f4d1a368d3a1bac20d18498ef08f25007082" exitCode=0 Feb 02 12:02:50 crc kubenswrapper[4909]: I0202 12:02:50.497530 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"c8a4532cb0e21b24c65ea2a45893f4d1a368d3a1bac20d18498ef08f25007082"} Feb 02 12:02:50 crc kubenswrapper[4909]: I0202 12:02:50.497853 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783"} Feb 02 12:02:50 crc kubenswrapper[4909]: I0202 12:02:50.497877 4909 scope.go:117] "RemoveContainer" 
containerID="38b303aff690761206d22158501a772c23d430c7b7485ebf1a03a5b35dab6d83" Feb 02 12:02:51 crc kubenswrapper[4909]: I0202 12:02:51.025503 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="940bab09-bdad-49c2-b9f5-8326b62f98be" path="/var/lib/kubelet/pods/940bab09-bdad-49c2-b9f5-8326b62f98be/volumes" Feb 02 12:02:57 crc kubenswrapper[4909]: I0202 12:02:57.458395 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rvllg"] Feb 02 12:02:57 crc kubenswrapper[4909]: E0202 12:02:57.459292 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="940bab09-bdad-49c2-b9f5-8326b62f98be" containerName="init" Feb 02 12:02:57 crc kubenswrapper[4909]: I0202 12:02:57.459304 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="940bab09-bdad-49c2-b9f5-8326b62f98be" containerName="init" Feb 02 12:02:57 crc kubenswrapper[4909]: E0202 12:02:57.459329 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="940bab09-bdad-49c2-b9f5-8326b62f98be" containerName="dnsmasq-dns" Feb 02 12:02:57 crc kubenswrapper[4909]: I0202 12:02:57.459335 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="940bab09-bdad-49c2-b9f5-8326b62f98be" containerName="dnsmasq-dns" Feb 02 12:02:57 crc kubenswrapper[4909]: I0202 12:02:57.459494 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="940bab09-bdad-49c2-b9f5-8326b62f98be" containerName="dnsmasq-dns" Feb 02 12:02:57 crc kubenswrapper[4909]: I0202 12:02:57.460823 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rvllg" Feb 02 12:02:57 crc kubenswrapper[4909]: I0202 12:02:57.472264 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rvllg"] Feb 02 12:02:57 crc kubenswrapper[4909]: I0202 12:02:57.607214 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlgcg\" (UniqueName: \"kubernetes.io/projected/c2f4263f-37e2-46f0-be7e-d3da6196e96e-kube-api-access-qlgcg\") pod \"certified-operators-rvllg\" (UID: \"c2f4263f-37e2-46f0-be7e-d3da6196e96e\") " pod="openshift-marketplace/certified-operators-rvllg" Feb 02 12:02:57 crc kubenswrapper[4909]: I0202 12:02:57.607305 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2f4263f-37e2-46f0-be7e-d3da6196e96e-catalog-content\") pod \"certified-operators-rvllg\" (UID: \"c2f4263f-37e2-46f0-be7e-d3da6196e96e\") " pod="openshift-marketplace/certified-operators-rvllg" Feb 02 12:02:57 crc kubenswrapper[4909]: I0202 12:02:57.607484 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2f4263f-37e2-46f0-be7e-d3da6196e96e-utilities\") pod \"certified-operators-rvllg\" (UID: \"c2f4263f-37e2-46f0-be7e-d3da6196e96e\") " pod="openshift-marketplace/certified-operators-rvllg" Feb 02 12:02:57 crc kubenswrapper[4909]: I0202 12:02:57.709598 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlgcg\" (UniqueName: \"kubernetes.io/projected/c2f4263f-37e2-46f0-be7e-d3da6196e96e-kube-api-access-qlgcg\") pod \"certified-operators-rvllg\" (UID: \"c2f4263f-37e2-46f0-be7e-d3da6196e96e\") " pod="openshift-marketplace/certified-operators-rvllg" Feb 02 12:02:57 crc kubenswrapper[4909]: I0202 12:02:57.709678 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2f4263f-37e2-46f0-be7e-d3da6196e96e-catalog-content\") pod \"certified-operators-rvllg\" (UID: \"c2f4263f-37e2-46f0-be7e-d3da6196e96e\") " pod="openshift-marketplace/certified-operators-rvllg" Feb 02 12:02:57 crc kubenswrapper[4909]: I0202 12:02:57.709847 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2f4263f-37e2-46f0-be7e-d3da6196e96e-utilities\") pod \"certified-operators-rvllg\" (UID: \"c2f4263f-37e2-46f0-be7e-d3da6196e96e\") " pod="openshift-marketplace/certified-operators-rvllg" Feb 02 12:02:57 crc kubenswrapper[4909]: I0202 12:02:57.710526 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2f4263f-37e2-46f0-be7e-d3da6196e96e-catalog-content\") pod \"certified-operators-rvllg\" (UID: \"c2f4263f-37e2-46f0-be7e-d3da6196e96e\") " pod="openshift-marketplace/certified-operators-rvllg" Feb 02 12:02:57 crc kubenswrapper[4909]: I0202 12:02:57.710600 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2f4263f-37e2-46f0-be7e-d3da6196e96e-utilities\") pod \"certified-operators-rvllg\" (UID: \"c2f4263f-37e2-46f0-be7e-d3da6196e96e\") " pod="openshift-marketplace/certified-operators-rvllg" Feb 02 12:02:57 crc kubenswrapper[4909]: I0202 12:02:57.731313 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlgcg\" (UniqueName: \"kubernetes.io/projected/c2f4263f-37e2-46f0-be7e-d3da6196e96e-kube-api-access-qlgcg\") pod \"certified-operators-rvllg\" (UID: \"c2f4263f-37e2-46f0-be7e-d3da6196e96e\") " pod="openshift-marketplace/certified-operators-rvllg" Feb 02 12:02:57 crc kubenswrapper[4909]: I0202 12:02:57.793070 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rvllg" Feb 02 12:02:58 crc kubenswrapper[4909]: I0202 12:02:58.348120 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rvllg"] Feb 02 12:02:58 crc kubenswrapper[4909]: I0202 12:02:58.574107 4909 generic.go:334] "Generic (PLEG): container finished" podID="c2f4263f-37e2-46f0-be7e-d3da6196e96e" containerID="96a14360fc96dae0911bde6cf7edf988beb53ab1030eb3abc0d26d8a714707b2" exitCode=0 Feb 02 12:02:58 crc kubenswrapper[4909]: I0202 12:02:58.574165 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvllg" event={"ID":"c2f4263f-37e2-46f0-be7e-d3da6196e96e","Type":"ContainerDied","Data":"96a14360fc96dae0911bde6cf7edf988beb53ab1030eb3abc0d26d8a714707b2"} Feb 02 12:02:58 crc kubenswrapper[4909]: I0202 12:02:58.574197 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvllg" event={"ID":"c2f4263f-37e2-46f0-be7e-d3da6196e96e","Type":"ContainerStarted","Data":"79f0869f423e49c5c88464190cc1a18ed04b06a46c07baca6238b22a49d290b6"} Feb 02 12:02:59 crc kubenswrapper[4909]: I0202 12:02:59.584185 4909 generic.go:334] "Generic (PLEG): container finished" podID="c2f4263f-37e2-46f0-be7e-d3da6196e96e" containerID="89b7e03208d804df3f421552cb3c575e24bc14ab4cb52f466321d8120145fc29" exitCode=0 Feb 02 12:02:59 crc kubenswrapper[4909]: I0202 12:02:59.584377 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvllg" event={"ID":"c2f4263f-37e2-46f0-be7e-d3da6196e96e","Type":"ContainerDied","Data":"89b7e03208d804df3f421552cb3c575e24bc14ab4cb52f466321d8120145fc29"} Feb 02 12:03:00 crc kubenswrapper[4909]: I0202 12:03:00.595473 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvllg" 
event={"ID":"c2f4263f-37e2-46f0-be7e-d3da6196e96e","Type":"ContainerStarted","Data":"0035719f3c041842fbc054828f6746914768713ef34d9cc837d71f973dcaf85e"} Feb 02 12:03:07 crc kubenswrapper[4909]: I0202 12:03:07.794103 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rvllg" Feb 02 12:03:07 crc kubenswrapper[4909]: I0202 12:03:07.794909 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rvllg" Feb 02 12:03:07 crc kubenswrapper[4909]: I0202 12:03:07.844311 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rvllg" Feb 02 12:03:07 crc kubenswrapper[4909]: I0202 12:03:07.868782 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rvllg" podStartSLOduration=9.442597395 podStartE2EDuration="10.868753024s" podCreationTimestamp="2026-02-02 12:02:57 +0000 UTC" firstStartedPulling="2026-02-02 12:02:58.576315202 +0000 UTC m=+5504.322415937" lastFinishedPulling="2026-02-02 12:03:00.002470831 +0000 UTC m=+5505.748571566" observedRunningTime="2026-02-02 12:03:00.613078363 +0000 UTC m=+5506.359179108" watchObservedRunningTime="2026-02-02 12:03:07.868753024 +0000 UTC m=+5513.614853789" Feb 02 12:03:08 crc kubenswrapper[4909]: I0202 12:03:08.729340 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rvllg" Feb 02 12:03:08 crc kubenswrapper[4909]: I0202 12:03:08.808518 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rvllg"] Feb 02 12:03:09 crc kubenswrapper[4909]: I0202 12:03:09.002551 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-755b685876-hnwhh" Feb 02 12:03:10 crc kubenswrapper[4909]: I0202 12:03:10.687173 4909 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/certified-operators-rvllg" podUID="c2f4263f-37e2-46f0-be7e-d3da6196e96e" containerName="registry-server" containerID="cri-o://0035719f3c041842fbc054828f6746914768713ef34d9cc837d71f973dcaf85e" gracePeriod=2 Feb 02 12:03:11 crc kubenswrapper[4909]: I0202 12:03:11.152040 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rvllg" Feb 02 12:03:11 crc kubenswrapper[4909]: I0202 12:03:11.262177 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2f4263f-37e2-46f0-be7e-d3da6196e96e-utilities\") pod \"c2f4263f-37e2-46f0-be7e-d3da6196e96e\" (UID: \"c2f4263f-37e2-46f0-be7e-d3da6196e96e\") " Feb 02 12:03:11 crc kubenswrapper[4909]: I0202 12:03:11.262271 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2f4263f-37e2-46f0-be7e-d3da6196e96e-catalog-content\") pod \"c2f4263f-37e2-46f0-be7e-d3da6196e96e\" (UID: \"c2f4263f-37e2-46f0-be7e-d3da6196e96e\") " Feb 02 12:03:11 crc kubenswrapper[4909]: I0202 12:03:11.262366 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlgcg\" (UniqueName: \"kubernetes.io/projected/c2f4263f-37e2-46f0-be7e-d3da6196e96e-kube-api-access-qlgcg\") pod \"c2f4263f-37e2-46f0-be7e-d3da6196e96e\" (UID: \"c2f4263f-37e2-46f0-be7e-d3da6196e96e\") " Feb 02 12:03:11 crc kubenswrapper[4909]: I0202 12:03:11.263143 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2f4263f-37e2-46f0-be7e-d3da6196e96e-utilities" (OuterVolumeSpecName: "utilities") pod "c2f4263f-37e2-46f0-be7e-d3da6196e96e" (UID: "c2f4263f-37e2-46f0-be7e-d3da6196e96e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:03:11 crc kubenswrapper[4909]: I0202 12:03:11.269103 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2f4263f-37e2-46f0-be7e-d3da6196e96e-kube-api-access-qlgcg" (OuterVolumeSpecName: "kube-api-access-qlgcg") pod "c2f4263f-37e2-46f0-be7e-d3da6196e96e" (UID: "c2f4263f-37e2-46f0-be7e-d3da6196e96e"). InnerVolumeSpecName "kube-api-access-qlgcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:03:11 crc kubenswrapper[4909]: I0202 12:03:11.307827 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2f4263f-37e2-46f0-be7e-d3da6196e96e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2f4263f-37e2-46f0-be7e-d3da6196e96e" (UID: "c2f4263f-37e2-46f0-be7e-d3da6196e96e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:03:11 crc kubenswrapper[4909]: I0202 12:03:11.364544 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2f4263f-37e2-46f0-be7e-d3da6196e96e-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:11 crc kubenswrapper[4909]: I0202 12:03:11.364892 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2f4263f-37e2-46f0-be7e-d3da6196e96e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:11 crc kubenswrapper[4909]: I0202 12:03:11.364970 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlgcg\" (UniqueName: \"kubernetes.io/projected/c2f4263f-37e2-46f0-be7e-d3da6196e96e-kube-api-access-qlgcg\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:11 crc kubenswrapper[4909]: I0202 12:03:11.698261 4909 generic.go:334] "Generic (PLEG): container finished" podID="c2f4263f-37e2-46f0-be7e-d3da6196e96e" 
containerID="0035719f3c041842fbc054828f6746914768713ef34d9cc837d71f973dcaf85e" exitCode=0 Feb 02 12:03:11 crc kubenswrapper[4909]: I0202 12:03:11.698312 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvllg" event={"ID":"c2f4263f-37e2-46f0-be7e-d3da6196e96e","Type":"ContainerDied","Data":"0035719f3c041842fbc054828f6746914768713ef34d9cc837d71f973dcaf85e"} Feb 02 12:03:11 crc kubenswrapper[4909]: I0202 12:03:11.698344 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvllg" event={"ID":"c2f4263f-37e2-46f0-be7e-d3da6196e96e","Type":"ContainerDied","Data":"79f0869f423e49c5c88464190cc1a18ed04b06a46c07baca6238b22a49d290b6"} Feb 02 12:03:11 crc kubenswrapper[4909]: I0202 12:03:11.698365 4909 scope.go:117] "RemoveContainer" containerID="0035719f3c041842fbc054828f6746914768713ef34d9cc837d71f973dcaf85e" Feb 02 12:03:11 crc kubenswrapper[4909]: I0202 12:03:11.698359 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rvllg" Feb 02 12:03:11 crc kubenswrapper[4909]: I0202 12:03:11.737224 4909 scope.go:117] "RemoveContainer" containerID="89b7e03208d804df3f421552cb3c575e24bc14ab4cb52f466321d8120145fc29" Feb 02 12:03:11 crc kubenswrapper[4909]: I0202 12:03:11.743719 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rvllg"] Feb 02 12:03:11 crc kubenswrapper[4909]: I0202 12:03:11.751201 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rvllg"] Feb 02 12:03:11 crc kubenswrapper[4909]: I0202 12:03:11.770635 4909 scope.go:117] "RemoveContainer" containerID="96a14360fc96dae0911bde6cf7edf988beb53ab1030eb3abc0d26d8a714707b2" Feb 02 12:03:11 crc kubenswrapper[4909]: I0202 12:03:11.782865 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-864bddcb8f-jdfcj" Feb 02 12:03:11 crc kubenswrapper[4909]: I0202 12:03:11.817059 4909 scope.go:117] "RemoveContainer" containerID="0035719f3c041842fbc054828f6746914768713ef34d9cc837d71f973dcaf85e" Feb 02 12:03:11 crc kubenswrapper[4909]: E0202 12:03:11.817645 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0035719f3c041842fbc054828f6746914768713ef34d9cc837d71f973dcaf85e\": container with ID starting with 0035719f3c041842fbc054828f6746914768713ef34d9cc837d71f973dcaf85e not found: ID does not exist" containerID="0035719f3c041842fbc054828f6746914768713ef34d9cc837d71f973dcaf85e" Feb 02 12:03:11 crc kubenswrapper[4909]: I0202 12:03:11.817695 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0035719f3c041842fbc054828f6746914768713ef34d9cc837d71f973dcaf85e"} err="failed to get container status \"0035719f3c041842fbc054828f6746914768713ef34d9cc837d71f973dcaf85e\": rpc error: code = NotFound desc = could not find 
container \"0035719f3c041842fbc054828f6746914768713ef34d9cc837d71f973dcaf85e\": container with ID starting with 0035719f3c041842fbc054828f6746914768713ef34d9cc837d71f973dcaf85e not found: ID does not exist" Feb 02 12:03:11 crc kubenswrapper[4909]: I0202 12:03:11.817730 4909 scope.go:117] "RemoveContainer" containerID="89b7e03208d804df3f421552cb3c575e24bc14ab4cb52f466321d8120145fc29" Feb 02 12:03:11 crc kubenswrapper[4909]: E0202 12:03:11.818155 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89b7e03208d804df3f421552cb3c575e24bc14ab4cb52f466321d8120145fc29\": container with ID starting with 89b7e03208d804df3f421552cb3c575e24bc14ab4cb52f466321d8120145fc29 not found: ID does not exist" containerID="89b7e03208d804df3f421552cb3c575e24bc14ab4cb52f466321d8120145fc29" Feb 02 12:03:11 crc kubenswrapper[4909]: I0202 12:03:11.818178 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89b7e03208d804df3f421552cb3c575e24bc14ab4cb52f466321d8120145fc29"} err="failed to get container status \"89b7e03208d804df3f421552cb3c575e24bc14ab4cb52f466321d8120145fc29\": rpc error: code = NotFound desc = could not find container \"89b7e03208d804df3f421552cb3c575e24bc14ab4cb52f466321d8120145fc29\": container with ID starting with 89b7e03208d804df3f421552cb3c575e24bc14ab4cb52f466321d8120145fc29 not found: ID does not exist" Feb 02 12:03:11 crc kubenswrapper[4909]: I0202 12:03:11.818197 4909 scope.go:117] "RemoveContainer" containerID="96a14360fc96dae0911bde6cf7edf988beb53ab1030eb3abc0d26d8a714707b2" Feb 02 12:03:11 crc kubenswrapper[4909]: E0202 12:03:11.820257 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96a14360fc96dae0911bde6cf7edf988beb53ab1030eb3abc0d26d8a714707b2\": container with ID starting with 96a14360fc96dae0911bde6cf7edf988beb53ab1030eb3abc0d26d8a714707b2 not found: ID does 
not exist" containerID="96a14360fc96dae0911bde6cf7edf988beb53ab1030eb3abc0d26d8a714707b2" Feb 02 12:03:11 crc kubenswrapper[4909]: I0202 12:03:11.820376 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96a14360fc96dae0911bde6cf7edf988beb53ab1030eb3abc0d26d8a714707b2"} err="failed to get container status \"96a14360fc96dae0911bde6cf7edf988beb53ab1030eb3abc0d26d8a714707b2\": rpc error: code = NotFound desc = could not find container \"96a14360fc96dae0911bde6cf7edf988beb53ab1030eb3abc0d26d8a714707b2\": container with ID starting with 96a14360fc96dae0911bde6cf7edf988beb53ab1030eb3abc0d26d8a714707b2 not found: ID does not exist" Feb 02 12:03:11 crc kubenswrapper[4909]: I0202 12:03:11.862645 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-755b685876-hnwhh"] Feb 02 12:03:11 crc kubenswrapper[4909]: I0202 12:03:11.862929 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-755b685876-hnwhh" podUID="54804e1f-3371-4774-b7b8-02a96c1d86fb" containerName="neutron-api" containerID="cri-o://528006a1353af2132222d52dde565bf2fe7a70fc14c9a4f5cc67224d3b57eccb" gracePeriod=30 Feb 02 12:03:11 crc kubenswrapper[4909]: I0202 12:03:11.863445 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-755b685876-hnwhh" podUID="54804e1f-3371-4774-b7b8-02a96c1d86fb" containerName="neutron-httpd" containerID="cri-o://4a7170c68b93d21a79b447cbd718d5686ec9ef3eda8f5b03d3e1984ffc372de2" gracePeriod=30 Feb 02 12:03:12 crc kubenswrapper[4909]: I0202 12:03:12.707939 4909 generic.go:334] "Generic (PLEG): container finished" podID="54804e1f-3371-4774-b7b8-02a96c1d86fb" containerID="4a7170c68b93d21a79b447cbd718d5686ec9ef3eda8f5b03d3e1984ffc372de2" exitCode=0 Feb 02 12:03:12 crc kubenswrapper[4909]: I0202 12:03:12.708020 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-755b685876-hnwhh" 
event={"ID":"54804e1f-3371-4774-b7b8-02a96c1d86fb","Type":"ContainerDied","Data":"4a7170c68b93d21a79b447cbd718d5686ec9ef3eda8f5b03d3e1984ffc372de2"} Feb 02 12:03:13 crc kubenswrapper[4909]: I0202 12:03:13.029583 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2f4263f-37e2-46f0-be7e-d3da6196e96e" path="/var/lib/kubelet/pods/c2f4263f-37e2-46f0-be7e-d3da6196e96e/volumes" Feb 02 12:03:14 crc kubenswrapper[4909]: I0202 12:03:14.583902 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-755b685876-hnwhh" Feb 02 12:03:14 crc kubenswrapper[4909]: I0202 12:03:14.724837 4909 generic.go:334] "Generic (PLEG): container finished" podID="54804e1f-3371-4774-b7b8-02a96c1d86fb" containerID="528006a1353af2132222d52dde565bf2fe7a70fc14c9a4f5cc67224d3b57eccb" exitCode=0 Feb 02 12:03:14 crc kubenswrapper[4909]: I0202 12:03:14.724882 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-755b685876-hnwhh" event={"ID":"54804e1f-3371-4774-b7b8-02a96c1d86fb","Type":"ContainerDied","Data":"528006a1353af2132222d52dde565bf2fe7a70fc14c9a4f5cc67224d3b57eccb"} Feb 02 12:03:14 crc kubenswrapper[4909]: I0202 12:03:14.724913 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-755b685876-hnwhh" event={"ID":"54804e1f-3371-4774-b7b8-02a96c1d86fb","Type":"ContainerDied","Data":"abeb214c0ea56bf5c9a7e217e79f0202bd4cad8e89596b11520e9246f3701f17"} Feb 02 12:03:14 crc kubenswrapper[4909]: I0202 12:03:14.724931 4909 scope.go:117] "RemoveContainer" containerID="4a7170c68b93d21a79b447cbd718d5686ec9ef3eda8f5b03d3e1984ffc372de2" Feb 02 12:03:14 crc kubenswrapper[4909]: I0202 12:03:14.724924 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-755b685876-hnwhh" Feb 02 12:03:14 crc kubenswrapper[4909]: I0202 12:03:14.728020 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54804e1f-3371-4774-b7b8-02a96c1d86fb-combined-ca-bundle\") pod \"54804e1f-3371-4774-b7b8-02a96c1d86fb\" (UID: \"54804e1f-3371-4774-b7b8-02a96c1d86fb\") " Feb 02 12:03:14 crc kubenswrapper[4909]: I0202 12:03:14.728114 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnc6k\" (UniqueName: \"kubernetes.io/projected/54804e1f-3371-4774-b7b8-02a96c1d86fb-kube-api-access-qnc6k\") pod \"54804e1f-3371-4774-b7b8-02a96c1d86fb\" (UID: \"54804e1f-3371-4774-b7b8-02a96c1d86fb\") " Feb 02 12:03:14 crc kubenswrapper[4909]: I0202 12:03:14.728331 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/54804e1f-3371-4774-b7b8-02a96c1d86fb-ovndb-tls-certs\") pod \"54804e1f-3371-4774-b7b8-02a96c1d86fb\" (UID: \"54804e1f-3371-4774-b7b8-02a96c1d86fb\") " Feb 02 12:03:14 crc kubenswrapper[4909]: I0202 12:03:14.728409 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/54804e1f-3371-4774-b7b8-02a96c1d86fb-httpd-config\") pod \"54804e1f-3371-4774-b7b8-02a96c1d86fb\" (UID: \"54804e1f-3371-4774-b7b8-02a96c1d86fb\") " Feb 02 12:03:14 crc kubenswrapper[4909]: I0202 12:03:14.728447 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/54804e1f-3371-4774-b7b8-02a96c1d86fb-config\") pod \"54804e1f-3371-4774-b7b8-02a96c1d86fb\" (UID: \"54804e1f-3371-4774-b7b8-02a96c1d86fb\") " Feb 02 12:03:14 crc kubenswrapper[4909]: I0202 12:03:14.740989 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/54804e1f-3371-4774-b7b8-02a96c1d86fb-kube-api-access-qnc6k" (OuterVolumeSpecName: "kube-api-access-qnc6k") pod "54804e1f-3371-4774-b7b8-02a96c1d86fb" (UID: "54804e1f-3371-4774-b7b8-02a96c1d86fb"). InnerVolumeSpecName "kube-api-access-qnc6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:03:14 crc kubenswrapper[4909]: I0202 12:03:14.742650 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54804e1f-3371-4774-b7b8-02a96c1d86fb-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "54804e1f-3371-4774-b7b8-02a96c1d86fb" (UID: "54804e1f-3371-4774-b7b8-02a96c1d86fb"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:03:14 crc kubenswrapper[4909]: I0202 12:03:14.747953 4909 scope.go:117] "RemoveContainer" containerID="528006a1353af2132222d52dde565bf2fe7a70fc14c9a4f5cc67224d3b57eccb" Feb 02 12:03:14 crc kubenswrapper[4909]: I0202 12:03:14.769490 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54804e1f-3371-4774-b7b8-02a96c1d86fb-config" (OuterVolumeSpecName: "config") pod "54804e1f-3371-4774-b7b8-02a96c1d86fb" (UID: "54804e1f-3371-4774-b7b8-02a96c1d86fb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:03:14 crc kubenswrapper[4909]: I0202 12:03:14.781739 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54804e1f-3371-4774-b7b8-02a96c1d86fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54804e1f-3371-4774-b7b8-02a96c1d86fb" (UID: "54804e1f-3371-4774-b7b8-02a96c1d86fb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:03:14 crc kubenswrapper[4909]: I0202 12:03:14.795993 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54804e1f-3371-4774-b7b8-02a96c1d86fb-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "54804e1f-3371-4774-b7b8-02a96c1d86fb" (UID: "54804e1f-3371-4774-b7b8-02a96c1d86fb"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:03:14 crc kubenswrapper[4909]: I0202 12:03:14.824947 4909 scope.go:117] "RemoveContainer" containerID="4a7170c68b93d21a79b447cbd718d5686ec9ef3eda8f5b03d3e1984ffc372de2" Feb 02 12:03:14 crc kubenswrapper[4909]: E0202 12:03:14.826047 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a7170c68b93d21a79b447cbd718d5686ec9ef3eda8f5b03d3e1984ffc372de2\": container with ID starting with 4a7170c68b93d21a79b447cbd718d5686ec9ef3eda8f5b03d3e1984ffc372de2 not found: ID does not exist" containerID="4a7170c68b93d21a79b447cbd718d5686ec9ef3eda8f5b03d3e1984ffc372de2" Feb 02 12:03:14 crc kubenswrapper[4909]: I0202 12:03:14.826098 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a7170c68b93d21a79b447cbd718d5686ec9ef3eda8f5b03d3e1984ffc372de2"} err="failed to get container status \"4a7170c68b93d21a79b447cbd718d5686ec9ef3eda8f5b03d3e1984ffc372de2\": rpc error: code = NotFound desc = could not find container \"4a7170c68b93d21a79b447cbd718d5686ec9ef3eda8f5b03d3e1984ffc372de2\": container with ID starting with 4a7170c68b93d21a79b447cbd718d5686ec9ef3eda8f5b03d3e1984ffc372de2 not found: ID does not exist" Feb 02 12:03:14 crc kubenswrapper[4909]: I0202 12:03:14.826137 4909 scope.go:117] "RemoveContainer" containerID="528006a1353af2132222d52dde565bf2fe7a70fc14c9a4f5cc67224d3b57eccb" Feb 02 12:03:14 crc kubenswrapper[4909]: E0202 12:03:14.826508 4909 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"528006a1353af2132222d52dde565bf2fe7a70fc14c9a4f5cc67224d3b57eccb\": container with ID starting with 528006a1353af2132222d52dde565bf2fe7a70fc14c9a4f5cc67224d3b57eccb not found: ID does not exist" containerID="528006a1353af2132222d52dde565bf2fe7a70fc14c9a4f5cc67224d3b57eccb" Feb 02 12:03:14 crc kubenswrapper[4909]: I0202 12:03:14.826549 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"528006a1353af2132222d52dde565bf2fe7a70fc14c9a4f5cc67224d3b57eccb"} err="failed to get container status \"528006a1353af2132222d52dde565bf2fe7a70fc14c9a4f5cc67224d3b57eccb\": rpc error: code = NotFound desc = could not find container \"528006a1353af2132222d52dde565bf2fe7a70fc14c9a4f5cc67224d3b57eccb\": container with ID starting with 528006a1353af2132222d52dde565bf2fe7a70fc14c9a4f5cc67224d3b57eccb not found: ID does not exist" Feb 02 12:03:14 crc kubenswrapper[4909]: I0202 12:03:14.830642 4909 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/54804e1f-3371-4774-b7b8-02a96c1d86fb-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:14 crc kubenswrapper[4909]: I0202 12:03:14.830666 4909 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/54804e1f-3371-4774-b7b8-02a96c1d86fb-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:14 crc kubenswrapper[4909]: I0202 12:03:14.830676 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/54804e1f-3371-4774-b7b8-02a96c1d86fb-config\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:14 crc kubenswrapper[4909]: I0202 12:03:14.830685 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54804e1f-3371-4774-b7b8-02a96c1d86fb-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Feb 02 12:03:14 crc kubenswrapper[4909]: I0202 12:03:14.830695 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnc6k\" (UniqueName: \"kubernetes.io/projected/54804e1f-3371-4774-b7b8-02a96c1d86fb-kube-api-access-qnc6k\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:15 crc kubenswrapper[4909]: I0202 12:03:15.076112 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-755b685876-hnwhh"] Feb 02 12:03:15 crc kubenswrapper[4909]: I0202 12:03:15.083791 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-755b685876-hnwhh"] Feb 02 12:03:17 crc kubenswrapper[4909]: I0202 12:03:17.028020 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54804e1f-3371-4774-b7b8-02a96c1d86fb" path="/var/lib/kubelet/pods/54804e1f-3371-4774-b7b8-02a96c1d86fb/volumes" Feb 02 12:03:19 crc kubenswrapper[4909]: E0202 12:03:19.425945 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54804e1f_3371_4774_b7b8_02a96c1d86fb.slice/crio-abeb214c0ea56bf5c9a7e217e79f0202bd4cad8e89596b11520e9246f3701f17\": RecentStats: unable to find data in memory cache]" Feb 02 12:03:20 crc kubenswrapper[4909]: I0202 12:03:20.947438 4909 scope.go:117] "RemoveContainer" containerID="27a6db43f1898ac9e2ebe7807adbde6f140ea3ebfe5b9a9f7253a3946a7a11a9" Feb 02 12:03:29 crc kubenswrapper[4909]: E0202 12:03:29.618010 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54804e1f_3371_4774_b7b8_02a96c1d86fb.slice/crio-abeb214c0ea56bf5c9a7e217e79f0202bd4cad8e89596b11520e9246f3701f17\": RecentStats: unable to find data in memory cache]" Feb 02 12:03:32 crc kubenswrapper[4909]: I0202 12:03:32.302809 4909 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-zbq6d"] Feb 02 12:03:32 crc kubenswrapper[4909]: E0202 12:03:32.306508 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2f4263f-37e2-46f0-be7e-d3da6196e96e" containerName="registry-server" Feb 02 12:03:32 crc kubenswrapper[4909]: I0202 12:03:32.306523 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f4263f-37e2-46f0-be7e-d3da6196e96e" containerName="registry-server" Feb 02 12:03:32 crc kubenswrapper[4909]: E0202 12:03:32.306537 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54804e1f-3371-4774-b7b8-02a96c1d86fb" containerName="neutron-api" Feb 02 12:03:32 crc kubenswrapper[4909]: I0202 12:03:32.306543 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="54804e1f-3371-4774-b7b8-02a96c1d86fb" containerName="neutron-api" Feb 02 12:03:32 crc kubenswrapper[4909]: E0202 12:03:32.306564 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2f4263f-37e2-46f0-be7e-d3da6196e96e" containerName="extract-content" Feb 02 12:03:32 crc kubenswrapper[4909]: I0202 12:03:32.306572 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f4263f-37e2-46f0-be7e-d3da6196e96e" containerName="extract-content" Feb 02 12:03:32 crc kubenswrapper[4909]: E0202 12:03:32.306594 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54804e1f-3371-4774-b7b8-02a96c1d86fb" containerName="neutron-httpd" Feb 02 12:03:32 crc kubenswrapper[4909]: I0202 12:03:32.306600 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="54804e1f-3371-4774-b7b8-02a96c1d86fb" containerName="neutron-httpd" Feb 02 12:03:32 crc kubenswrapper[4909]: E0202 12:03:32.306615 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2f4263f-37e2-46f0-be7e-d3da6196e96e" containerName="extract-utilities" Feb 02 12:03:32 crc kubenswrapper[4909]: I0202 12:03:32.306622 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f4263f-37e2-46f0-be7e-d3da6196e96e" 
containerName="extract-utilities" Feb 02 12:03:32 crc kubenswrapper[4909]: I0202 12:03:32.306784 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2f4263f-37e2-46f0-be7e-d3da6196e96e" containerName="registry-server" Feb 02 12:03:32 crc kubenswrapper[4909]: I0202 12:03:32.306794 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="54804e1f-3371-4774-b7b8-02a96c1d86fb" containerName="neutron-httpd" Feb 02 12:03:32 crc kubenswrapper[4909]: I0202 12:03:32.306811 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="54804e1f-3371-4774-b7b8-02a96c1d86fb" containerName="neutron-api" Feb 02 12:03:32 crc kubenswrapper[4909]: I0202 12:03:32.308162 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zbq6d" Feb 02 12:03:32 crc kubenswrapper[4909]: I0202 12:03:32.331275 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zbq6d"] Feb 02 12:03:32 crc kubenswrapper[4909]: I0202 12:03:32.407168 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fznpv\" (UniqueName: \"kubernetes.io/projected/e7eb513f-1e95-42ad-9aff-22856a8d6f49-kube-api-access-fznpv\") pod \"community-operators-zbq6d\" (UID: \"e7eb513f-1e95-42ad-9aff-22856a8d6f49\") " pod="openshift-marketplace/community-operators-zbq6d" Feb 02 12:03:32 crc kubenswrapper[4909]: I0202 12:03:32.407248 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7eb513f-1e95-42ad-9aff-22856a8d6f49-catalog-content\") pod \"community-operators-zbq6d\" (UID: \"e7eb513f-1e95-42ad-9aff-22856a8d6f49\") " pod="openshift-marketplace/community-operators-zbq6d" Feb 02 12:03:32 crc kubenswrapper[4909]: I0202 12:03:32.407314 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7eb513f-1e95-42ad-9aff-22856a8d6f49-utilities\") pod \"community-operators-zbq6d\" (UID: \"e7eb513f-1e95-42ad-9aff-22856a8d6f49\") " pod="openshift-marketplace/community-operators-zbq6d" Feb 02 12:03:32 crc kubenswrapper[4909]: I0202 12:03:32.509013 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fznpv\" (UniqueName: \"kubernetes.io/projected/e7eb513f-1e95-42ad-9aff-22856a8d6f49-kube-api-access-fznpv\") pod \"community-operators-zbq6d\" (UID: \"e7eb513f-1e95-42ad-9aff-22856a8d6f49\") " pod="openshift-marketplace/community-operators-zbq6d" Feb 02 12:03:32 crc kubenswrapper[4909]: I0202 12:03:32.509067 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7eb513f-1e95-42ad-9aff-22856a8d6f49-catalog-content\") pod \"community-operators-zbq6d\" (UID: \"e7eb513f-1e95-42ad-9aff-22856a8d6f49\") " pod="openshift-marketplace/community-operators-zbq6d" Feb 02 12:03:32 crc kubenswrapper[4909]: I0202 12:03:32.509113 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7eb513f-1e95-42ad-9aff-22856a8d6f49-utilities\") pod \"community-operators-zbq6d\" (UID: \"e7eb513f-1e95-42ad-9aff-22856a8d6f49\") " pod="openshift-marketplace/community-operators-zbq6d" Feb 02 12:03:32 crc kubenswrapper[4909]: I0202 12:03:32.509741 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7eb513f-1e95-42ad-9aff-22856a8d6f49-catalog-content\") pod \"community-operators-zbq6d\" (UID: \"e7eb513f-1e95-42ad-9aff-22856a8d6f49\") " pod="openshift-marketplace/community-operators-zbq6d" Feb 02 12:03:32 crc kubenswrapper[4909]: I0202 12:03:32.509801 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e7eb513f-1e95-42ad-9aff-22856a8d6f49-utilities\") pod \"community-operators-zbq6d\" (UID: \"e7eb513f-1e95-42ad-9aff-22856a8d6f49\") " pod="openshift-marketplace/community-operators-zbq6d" Feb 02 12:03:32 crc kubenswrapper[4909]: I0202 12:03:32.542811 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fznpv\" (UniqueName: \"kubernetes.io/projected/e7eb513f-1e95-42ad-9aff-22856a8d6f49-kube-api-access-fznpv\") pod \"community-operators-zbq6d\" (UID: \"e7eb513f-1e95-42ad-9aff-22856a8d6f49\") " pod="openshift-marketplace/community-operators-zbq6d" Feb 02 12:03:32 crc kubenswrapper[4909]: I0202 12:03:32.637778 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zbq6d" Feb 02 12:03:33 crc kubenswrapper[4909]: I0202 12:03:33.168286 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zbq6d"] Feb 02 12:03:33 crc kubenswrapper[4909]: W0202 12:03:33.174499 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7eb513f_1e95_42ad_9aff_22856a8d6f49.slice/crio-12390a39f6931146ab2dc346c7d56089a076e0127e12456fd13ff53ee1f4e7af WatchSource:0}: Error finding container 12390a39f6931146ab2dc346c7d56089a076e0127e12456fd13ff53ee1f4e7af: Status 404 returned error can't find the container with id 12390a39f6931146ab2dc346c7d56089a076e0127e12456fd13ff53ee1f4e7af Feb 02 12:03:33 crc kubenswrapper[4909]: I0202 12:03:33.901009 4909 generic.go:334] "Generic (PLEG): container finished" podID="e7eb513f-1e95-42ad-9aff-22856a8d6f49" containerID="b46b7c540e031db811b85a556fbad0fbc275af2dffa8801140ce16b3b495b7a4" exitCode=0 Feb 02 12:03:33 crc kubenswrapper[4909]: I0202 12:03:33.901066 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zbq6d" 
event={"ID":"e7eb513f-1e95-42ad-9aff-22856a8d6f49","Type":"ContainerDied","Data":"b46b7c540e031db811b85a556fbad0fbc275af2dffa8801140ce16b3b495b7a4"} Feb 02 12:03:33 crc kubenswrapper[4909]: I0202 12:03:33.901319 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zbq6d" event={"ID":"e7eb513f-1e95-42ad-9aff-22856a8d6f49","Type":"ContainerStarted","Data":"12390a39f6931146ab2dc346c7d56089a076e0127e12456fd13ff53ee1f4e7af"} Feb 02 12:03:34 crc kubenswrapper[4909]: I0202 12:03:34.910762 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zbq6d" event={"ID":"e7eb513f-1e95-42ad-9aff-22856a8d6f49","Type":"ContainerStarted","Data":"e4b385a20c2f39b23821da4581f1806594f524770de365aff111e553d559a15e"} Feb 02 12:03:35 crc kubenswrapper[4909]: I0202 12:03:35.919251 4909 generic.go:334] "Generic (PLEG): container finished" podID="e7eb513f-1e95-42ad-9aff-22856a8d6f49" containerID="e4b385a20c2f39b23821da4581f1806594f524770de365aff111e553d559a15e" exitCode=0 Feb 02 12:03:35 crc kubenswrapper[4909]: I0202 12:03:35.919290 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zbq6d" event={"ID":"e7eb513f-1e95-42ad-9aff-22856a8d6f49","Type":"ContainerDied","Data":"e4b385a20c2f39b23821da4581f1806594f524770de365aff111e553d559a15e"} Feb 02 12:03:36 crc kubenswrapper[4909]: I0202 12:03:36.930077 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zbq6d" event={"ID":"e7eb513f-1e95-42ad-9aff-22856a8d6f49","Type":"ContainerStarted","Data":"3fe8ac84dfea4bac8ebf948c81bf198c6c7331e82889c452e0957998c10bb946"} Feb 02 12:03:36 crc kubenswrapper[4909]: I0202 12:03:36.949509 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zbq6d" podStartSLOduration=2.479481109 podStartE2EDuration="4.949481147s" podCreationTimestamp="2026-02-02 12:03:32 
+0000 UTC" firstStartedPulling="2026-02-02 12:03:33.90348757 +0000 UTC m=+5539.649588305" lastFinishedPulling="2026-02-02 12:03:36.373487608 +0000 UTC m=+5542.119588343" observedRunningTime="2026-02-02 12:03:36.944994409 +0000 UTC m=+5542.691095144" watchObservedRunningTime="2026-02-02 12:03:36.949481147 +0000 UTC m=+5542.695581882" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.495794 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-p2pcl"] Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.497352 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-p2pcl" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.505276 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.507380 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.507420 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.511102 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-zxr5l" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.511381 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.546466 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-p2pcl"] Feb 02 12:03:39 crc kubenswrapper[4909]: E0202 12:03:39.547116 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-tvvpz ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[]: context canceled" 
pod="openstack/swift-ring-rebalance-p2pcl" podUID="9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.549904 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-ring-data-devices\") pod \"swift-ring-rebalance-p2pcl\" (UID: \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\") " pod="openstack/swift-ring-rebalance-p2pcl" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.549982 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvvpz\" (UniqueName: \"kubernetes.io/projected/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-kube-api-access-tvvpz\") pod \"swift-ring-rebalance-p2pcl\" (UID: \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\") " pod="openstack/swift-ring-rebalance-p2pcl" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.550005 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-scripts\") pod \"swift-ring-rebalance-p2pcl\" (UID: \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\") " pod="openstack/swift-ring-rebalance-p2pcl" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.550033 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-combined-ca-bundle\") pod \"swift-ring-rebalance-p2pcl\" (UID: \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\") " pod="openstack/swift-ring-rebalance-p2pcl" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.550077 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-etc-swift\") pod 
\"swift-ring-rebalance-p2pcl\" (UID: \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\") " pod="openstack/swift-ring-rebalance-p2pcl" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.550103 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-swiftconf\") pod \"swift-ring-rebalance-p2pcl\" (UID: \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\") " pod="openstack/swift-ring-rebalance-p2pcl" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.550157 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-dispersionconf\") pod \"swift-ring-rebalance-p2pcl\" (UID: \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\") " pod="openstack/swift-ring-rebalance-p2pcl" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.570895 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-9ps2d"] Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.572463 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-9ps2d" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.632874 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-9ps2d"] Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.651089 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-swiftconf\") pod \"swift-ring-rebalance-p2pcl\" (UID: \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\") " pod="openstack/swift-ring-rebalance-p2pcl" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.651168 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-combined-ca-bundle\") pod \"swift-ring-rebalance-9ps2d\" (UID: \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\") " pod="openstack/swift-ring-rebalance-9ps2d" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.651194 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-dispersionconf\") pod \"swift-ring-rebalance-p2pcl\" (UID: \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\") " pod="openstack/swift-ring-rebalance-p2pcl" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.651213 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-etc-swift\") pod \"swift-ring-rebalance-9ps2d\" (UID: \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\") " pod="openstack/swift-ring-rebalance-9ps2d" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.651234 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-dispersionconf\") pod \"swift-ring-rebalance-9ps2d\" (UID: \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\") " pod="openstack/swift-ring-rebalance-9ps2d" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.651255 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-ring-data-devices\") pod \"swift-ring-rebalance-p2pcl\" (UID: \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\") " pod="openstack/swift-ring-rebalance-p2pcl" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.651275 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dzx2\" (UniqueName: \"kubernetes.io/projected/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-kube-api-access-6dzx2\") pod \"swift-ring-rebalance-9ps2d\" (UID: \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\") " pod="openstack/swift-ring-rebalance-9ps2d" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.651321 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvvpz\" (UniqueName: \"kubernetes.io/projected/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-kube-api-access-tvvpz\") pod \"swift-ring-rebalance-p2pcl\" (UID: \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\") " pod="openstack/swift-ring-rebalance-p2pcl" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.651339 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-ring-data-devices\") pod \"swift-ring-rebalance-9ps2d\" (UID: \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\") " pod="openstack/swift-ring-rebalance-9ps2d" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.651355 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-scripts\") pod \"swift-ring-rebalance-p2pcl\" (UID: \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\") " pod="openstack/swift-ring-rebalance-p2pcl" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.651377 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-combined-ca-bundle\") pod \"swift-ring-rebalance-p2pcl\" (UID: \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\") " pod="openstack/swift-ring-rebalance-p2pcl" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.651397 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-swiftconf\") pod \"swift-ring-rebalance-9ps2d\" (UID: \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\") " pod="openstack/swift-ring-rebalance-9ps2d" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.651416 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-scripts\") pod \"swift-ring-rebalance-9ps2d\" (UID: \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\") " pod="openstack/swift-ring-rebalance-9ps2d" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.651455 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-etc-swift\") pod \"swift-ring-rebalance-p2pcl\" (UID: \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\") " pod="openstack/swift-ring-rebalance-p2pcl" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.651863 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-etc-swift\") pod 
\"swift-ring-rebalance-p2pcl\" (UID: \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\") " pod="openstack/swift-ring-rebalance-p2pcl" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.652366 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-scripts\") pod \"swift-ring-rebalance-p2pcl\" (UID: \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\") " pod="openstack/swift-ring-rebalance-p2pcl" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.652426 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-ring-data-devices\") pod \"swift-ring-rebalance-p2pcl\" (UID: \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\") " pod="openstack/swift-ring-rebalance-p2pcl" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.662278 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-swiftconf\") pod \"swift-ring-rebalance-p2pcl\" (UID: \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\") " pod="openstack/swift-ring-rebalance-p2pcl" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.665557 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-dispersionconf\") pod \"swift-ring-rebalance-p2pcl\" (UID: \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\") " pod="openstack/swift-ring-rebalance-p2pcl" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.668881 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-p2pcl"] Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.670304 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-combined-ca-bundle\") pod \"swift-ring-rebalance-p2pcl\" (UID: \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\") " pod="openstack/swift-ring-rebalance-p2pcl" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.685867 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvvpz\" (UniqueName: \"kubernetes.io/projected/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-kube-api-access-tvvpz\") pod \"swift-ring-rebalance-p2pcl\" (UID: \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\") " pod="openstack/swift-ring-rebalance-p2pcl" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.688231 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b4bb8f457-t96qt"] Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.693584 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b4bb8f457-t96qt" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.712883 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b4bb8f457-t96qt"] Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.754200 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-combined-ca-bundle\") pod \"swift-ring-rebalance-9ps2d\" (UID: \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\") " pod="openstack/swift-ring-rebalance-9ps2d" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.754264 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-ovsdbserver-sb\") pod \"dnsmasq-dns-7b4bb8f457-t96qt\" (UID: \"cc55cc85-5418-4ef7-b8bb-dbedc64b0602\") " pod="openstack/dnsmasq-dns-7b4bb8f457-t96qt" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.754301 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-etc-swift\") pod \"swift-ring-rebalance-9ps2d\" (UID: \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\") " pod="openstack/swift-ring-rebalance-9ps2d" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.754325 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-dispersionconf\") pod \"swift-ring-rebalance-9ps2d\" (UID: \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\") " pod="openstack/swift-ring-rebalance-9ps2d" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.754346 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-dns-svc\") pod \"dnsmasq-dns-7b4bb8f457-t96qt\" (UID: \"cc55cc85-5418-4ef7-b8bb-dbedc64b0602\") " pod="openstack/dnsmasq-dns-7b4bb8f457-t96qt" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.754373 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dzx2\" (UniqueName: \"kubernetes.io/projected/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-kube-api-access-6dzx2\") pod \"swift-ring-rebalance-9ps2d\" (UID: \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\") " pod="openstack/swift-ring-rebalance-9ps2d" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.754434 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-ring-data-devices\") pod \"swift-ring-rebalance-9ps2d\" (UID: \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\") " pod="openstack/swift-ring-rebalance-9ps2d" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.754470 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-swiftconf\") pod \"swift-ring-rebalance-9ps2d\" (UID: \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\") " pod="openstack/swift-ring-rebalance-9ps2d" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.754488 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-scripts\") pod \"swift-ring-rebalance-9ps2d\" (UID: \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\") " pod="openstack/swift-ring-rebalance-9ps2d" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.754518 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whz4b\" (UniqueName: \"kubernetes.io/projected/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-kube-api-access-whz4b\") pod \"dnsmasq-dns-7b4bb8f457-t96qt\" (UID: \"cc55cc85-5418-4ef7-b8bb-dbedc64b0602\") " pod="openstack/dnsmasq-dns-7b4bb8f457-t96qt" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.754612 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-config\") pod \"dnsmasq-dns-7b4bb8f457-t96qt\" (UID: \"cc55cc85-5418-4ef7-b8bb-dbedc64b0602\") " pod="openstack/dnsmasq-dns-7b4bb8f457-t96qt" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.754634 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-ovsdbserver-nb\") pod \"dnsmasq-dns-7b4bb8f457-t96qt\" (UID: \"cc55cc85-5418-4ef7-b8bb-dbedc64b0602\") " pod="openstack/dnsmasq-dns-7b4bb8f457-t96qt" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.756801 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-etc-swift\") pod \"swift-ring-rebalance-9ps2d\" (UID: \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\") " pod="openstack/swift-ring-rebalance-9ps2d" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.757449 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-ring-data-devices\") pod \"swift-ring-rebalance-9ps2d\" (UID: \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\") " pod="openstack/swift-ring-rebalance-9ps2d" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.757718 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-scripts\") pod \"swift-ring-rebalance-9ps2d\" (UID: \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\") " pod="openstack/swift-ring-rebalance-9ps2d" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.763386 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-dispersionconf\") pod \"swift-ring-rebalance-9ps2d\" (UID: \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\") " pod="openstack/swift-ring-rebalance-9ps2d" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.771970 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-combined-ca-bundle\") pod \"swift-ring-rebalance-9ps2d\" (UID: \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\") " pod="openstack/swift-ring-rebalance-9ps2d" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.781330 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-swiftconf\") pod \"swift-ring-rebalance-9ps2d\" (UID: 
\"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\") " pod="openstack/swift-ring-rebalance-9ps2d" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.786064 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dzx2\" (UniqueName: \"kubernetes.io/projected/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-kube-api-access-6dzx2\") pod \"swift-ring-rebalance-9ps2d\" (UID: \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\") " pod="openstack/swift-ring-rebalance-9ps2d" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.858063 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-config\") pod \"dnsmasq-dns-7b4bb8f457-t96qt\" (UID: \"cc55cc85-5418-4ef7-b8bb-dbedc64b0602\") " pod="openstack/dnsmasq-dns-7b4bb8f457-t96qt" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.858216 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-ovsdbserver-nb\") pod \"dnsmasq-dns-7b4bb8f457-t96qt\" (UID: \"cc55cc85-5418-4ef7-b8bb-dbedc64b0602\") " pod="openstack/dnsmasq-dns-7b4bb8f457-t96qt" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.858371 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-ovsdbserver-sb\") pod \"dnsmasq-dns-7b4bb8f457-t96qt\" (UID: \"cc55cc85-5418-4ef7-b8bb-dbedc64b0602\") " pod="openstack/dnsmasq-dns-7b4bb8f457-t96qt" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.858482 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-dns-svc\") pod \"dnsmasq-dns-7b4bb8f457-t96qt\" (UID: \"cc55cc85-5418-4ef7-b8bb-dbedc64b0602\") " 
pod="openstack/dnsmasq-dns-7b4bb8f457-t96qt" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.858653 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whz4b\" (UniqueName: \"kubernetes.io/projected/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-kube-api-access-whz4b\") pod \"dnsmasq-dns-7b4bb8f457-t96qt\" (UID: \"cc55cc85-5418-4ef7-b8bb-dbedc64b0602\") " pod="openstack/dnsmasq-dns-7b4bb8f457-t96qt" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.859113 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-config\") pod \"dnsmasq-dns-7b4bb8f457-t96qt\" (UID: \"cc55cc85-5418-4ef7-b8bb-dbedc64b0602\") " pod="openstack/dnsmasq-dns-7b4bb8f457-t96qt" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.859178 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-ovsdbserver-nb\") pod \"dnsmasq-dns-7b4bb8f457-t96qt\" (UID: \"cc55cc85-5418-4ef7-b8bb-dbedc64b0602\") " pod="openstack/dnsmasq-dns-7b4bb8f457-t96qt" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.859758 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-dns-svc\") pod \"dnsmasq-dns-7b4bb8f457-t96qt\" (UID: \"cc55cc85-5418-4ef7-b8bb-dbedc64b0602\") " pod="openstack/dnsmasq-dns-7b4bb8f457-t96qt" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.859782 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-ovsdbserver-sb\") pod \"dnsmasq-dns-7b4bb8f457-t96qt\" (UID: \"cc55cc85-5418-4ef7-b8bb-dbedc64b0602\") " pod="openstack/dnsmasq-dns-7b4bb8f457-t96qt" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 
12:03:39.879672 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whz4b\" (UniqueName: \"kubernetes.io/projected/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-kube-api-access-whz4b\") pod \"dnsmasq-dns-7b4bb8f457-t96qt\" (UID: \"cc55cc85-5418-4ef7-b8bb-dbedc64b0602\") " pod="openstack/dnsmasq-dns-7b4bb8f457-t96qt" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.894333 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-9ps2d" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.962495 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-p2pcl" Feb 02 12:03:39 crc kubenswrapper[4909]: E0202 12:03:39.968178 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54804e1f_3371_4774_b7b8_02a96c1d86fb.slice/crio-abeb214c0ea56bf5c9a7e217e79f0202bd4cad8e89596b11520e9246f3701f17\": RecentStats: unable to find data in memory cache]" Feb 02 12:03:39 crc kubenswrapper[4909]: I0202 12:03:39.979248 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-p2pcl" Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.163628 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-combined-ca-bundle\") pod \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\" (UID: \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\") " Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.164063 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-ring-data-devices\") pod \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\" (UID: \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\") " Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.164111 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvvpz\" (UniqueName: \"kubernetes.io/projected/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-kube-api-access-tvvpz\") pod \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\" (UID: \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\") " Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.164153 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-swiftconf\") pod \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\" (UID: \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\") " Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.164233 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-etc-swift\") pod \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\" (UID: \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\") " Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.164270 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-scripts\") pod \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\" (UID: \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\") " Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.164289 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-dispersionconf\") pod \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\" (UID: \"9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3\") " Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.164496 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3" (UID: "9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.164916 4909 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.164936 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3" (UID: "9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.165606 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b4bb8f457-t96qt" Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.169285 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-scripts" (OuterVolumeSpecName: "scripts") pod "9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3" (UID: "9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.170835 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3" (UID: "9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.170930 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3" (UID: "9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.173044 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3" (UID: "9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.175330 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-kube-api-access-tvvpz" (OuterVolumeSpecName: "kube-api-access-tvvpz") pod "9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3" (UID: "9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3"). InnerVolumeSpecName "kube-api-access-tvvpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.266966 4909 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.267009 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.267018 4909 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.267026 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.267035 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvvpz\" (UniqueName: \"kubernetes.io/projected/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-kube-api-access-tvvpz\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.267045 4909 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.440195 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-9ps2d"] Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.636544 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b4bb8f457-t96qt"] Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.970353 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9ps2d" event={"ID":"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd","Type":"ContainerStarted","Data":"a414202ee1e2148b979c3bd6587c61d6bcee2fe231529d02dd7590a141e95567"} Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.970408 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9ps2d" event={"ID":"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd","Type":"ContainerStarted","Data":"2b38e35c99a589111c4e03f010d962fbbefdbce75b0b3d4790ae820a9b3726bd"} Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.972569 4909 generic.go:334] "Generic (PLEG): container finished" podID="cc55cc85-5418-4ef7-b8bb-dbedc64b0602" containerID="9491ad23d6e2cac7b69411b59eeed6d0c6c211d83f22231a95974a671ae336de" exitCode=0 Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.972625 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-p2pcl" Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.972659 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b4bb8f457-t96qt" event={"ID":"cc55cc85-5418-4ef7-b8bb-dbedc64b0602","Type":"ContainerDied","Data":"9491ad23d6e2cac7b69411b59eeed6d0c6c211d83f22231a95974a671ae336de"} Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.972699 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b4bb8f457-t96qt" event={"ID":"cc55cc85-5418-4ef7-b8bb-dbedc64b0602","Type":"ContainerStarted","Data":"907afc151385fe2792f28605691f5d65165f6c8f01984bfa66b6a3f750a39127"} Feb 02 12:03:40 crc kubenswrapper[4909]: I0202 12:03:40.999535 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-9ps2d" podStartSLOduration=1.9995148409999999 podStartE2EDuration="1.999514841s" podCreationTimestamp="2026-02-02 12:03:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:03:40.992252195 +0000 UTC m=+5546.738352930" watchObservedRunningTime="2026-02-02 12:03:40.999514841 +0000 UTC m=+5546.745615576" Feb 02 12:03:41 crc kubenswrapper[4909]: I0202 12:03:41.982598 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b4bb8f457-t96qt" event={"ID":"cc55cc85-5418-4ef7-b8bb-dbedc64b0602","Type":"ContainerStarted","Data":"1e6e73759c5ac0642cea740e6ee22f95884d46c2033397b037087f8a451dbedd"} Feb 02 12:03:41 crc kubenswrapper[4909]: I0202 12:03:41.983095 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b4bb8f457-t96qt" Feb 02 12:03:42 crc kubenswrapper[4909]: I0202 12:03:42.008405 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b4bb8f457-t96qt" podStartSLOduration=3.008383705 
podStartE2EDuration="3.008383705s" podCreationTimestamp="2026-02-02 12:03:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:03:42.006247315 +0000 UTC m=+5547.752348050" watchObservedRunningTime="2026-02-02 12:03:42.008383705 +0000 UTC m=+5547.754484440" Feb 02 12:03:42 crc kubenswrapper[4909]: I0202 12:03:42.378339 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7bcd6d9574-qhlss"] Feb 02 12:03:42 crc kubenswrapper[4909]: I0202 12:03:42.379894 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7bcd6d9574-qhlss" Feb 02 12:03:42 crc kubenswrapper[4909]: I0202 12:03:42.381668 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 02 12:03:42 crc kubenswrapper[4909]: I0202 12:03:42.387938 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7bcd6d9574-qhlss"] Feb 02 12:03:42 crc kubenswrapper[4909]: I0202 12:03:42.504276 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5459356-1157-4e95-bdde-846a175e8e84-run-httpd\") pod \"swift-proxy-7bcd6d9574-qhlss\" (UID: \"e5459356-1157-4e95-bdde-846a175e8e84\") " pod="openstack/swift-proxy-7bcd6d9574-qhlss" Feb 02 12:03:42 crc kubenswrapper[4909]: I0202 12:03:42.504320 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5459356-1157-4e95-bdde-846a175e8e84-combined-ca-bundle\") pod \"swift-proxy-7bcd6d9574-qhlss\" (UID: \"e5459356-1157-4e95-bdde-846a175e8e84\") " pod="openstack/swift-proxy-7bcd6d9574-qhlss" Feb 02 12:03:42 crc kubenswrapper[4909]: I0202 12:03:42.504349 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-556c9\" (UniqueName: \"kubernetes.io/projected/e5459356-1157-4e95-bdde-846a175e8e84-kube-api-access-556c9\") pod \"swift-proxy-7bcd6d9574-qhlss\" (UID: \"e5459356-1157-4e95-bdde-846a175e8e84\") " pod="openstack/swift-proxy-7bcd6d9574-qhlss" Feb 02 12:03:42 crc kubenswrapper[4909]: I0202 12:03:42.504569 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5459356-1157-4e95-bdde-846a175e8e84-log-httpd\") pod \"swift-proxy-7bcd6d9574-qhlss\" (UID: \"e5459356-1157-4e95-bdde-846a175e8e84\") " pod="openstack/swift-proxy-7bcd6d9574-qhlss" Feb 02 12:03:42 crc kubenswrapper[4909]: I0202 12:03:42.504708 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e5459356-1157-4e95-bdde-846a175e8e84-etc-swift\") pod \"swift-proxy-7bcd6d9574-qhlss\" (UID: \"e5459356-1157-4e95-bdde-846a175e8e84\") " pod="openstack/swift-proxy-7bcd6d9574-qhlss" Feb 02 12:03:42 crc kubenswrapper[4909]: I0202 12:03:42.504916 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5459356-1157-4e95-bdde-846a175e8e84-config-data\") pod \"swift-proxy-7bcd6d9574-qhlss\" (UID: \"e5459356-1157-4e95-bdde-846a175e8e84\") " pod="openstack/swift-proxy-7bcd6d9574-qhlss" Feb 02 12:03:42 crc kubenswrapper[4909]: I0202 12:03:42.606246 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-556c9\" (UniqueName: \"kubernetes.io/projected/e5459356-1157-4e95-bdde-846a175e8e84-kube-api-access-556c9\") pod \"swift-proxy-7bcd6d9574-qhlss\" (UID: \"e5459356-1157-4e95-bdde-846a175e8e84\") " pod="openstack/swift-proxy-7bcd6d9574-qhlss" Feb 02 12:03:42 crc kubenswrapper[4909]: I0202 12:03:42.606655 4909 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5459356-1157-4e95-bdde-846a175e8e84-log-httpd\") pod \"swift-proxy-7bcd6d9574-qhlss\" (UID: \"e5459356-1157-4e95-bdde-846a175e8e84\") " pod="openstack/swift-proxy-7bcd6d9574-qhlss" Feb 02 12:03:42 crc kubenswrapper[4909]: I0202 12:03:42.606709 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e5459356-1157-4e95-bdde-846a175e8e84-etc-swift\") pod \"swift-proxy-7bcd6d9574-qhlss\" (UID: \"e5459356-1157-4e95-bdde-846a175e8e84\") " pod="openstack/swift-proxy-7bcd6d9574-qhlss" Feb 02 12:03:42 crc kubenswrapper[4909]: I0202 12:03:42.606772 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5459356-1157-4e95-bdde-846a175e8e84-config-data\") pod \"swift-proxy-7bcd6d9574-qhlss\" (UID: \"e5459356-1157-4e95-bdde-846a175e8e84\") " pod="openstack/swift-proxy-7bcd6d9574-qhlss" Feb 02 12:03:42 crc kubenswrapper[4909]: I0202 12:03:42.606896 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5459356-1157-4e95-bdde-846a175e8e84-run-httpd\") pod \"swift-proxy-7bcd6d9574-qhlss\" (UID: \"e5459356-1157-4e95-bdde-846a175e8e84\") " pod="openstack/swift-proxy-7bcd6d9574-qhlss" Feb 02 12:03:42 crc kubenswrapper[4909]: I0202 12:03:42.606928 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5459356-1157-4e95-bdde-846a175e8e84-combined-ca-bundle\") pod \"swift-proxy-7bcd6d9574-qhlss\" (UID: \"e5459356-1157-4e95-bdde-846a175e8e84\") " pod="openstack/swift-proxy-7bcd6d9574-qhlss" Feb 02 12:03:42 crc kubenswrapper[4909]: I0202 12:03:42.607301 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e5459356-1157-4e95-bdde-846a175e8e84-log-httpd\") pod \"swift-proxy-7bcd6d9574-qhlss\" (UID: \"e5459356-1157-4e95-bdde-846a175e8e84\") " pod="openstack/swift-proxy-7bcd6d9574-qhlss" Feb 02 12:03:42 crc kubenswrapper[4909]: I0202 12:03:42.607978 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5459356-1157-4e95-bdde-846a175e8e84-run-httpd\") pod \"swift-proxy-7bcd6d9574-qhlss\" (UID: \"e5459356-1157-4e95-bdde-846a175e8e84\") " pod="openstack/swift-proxy-7bcd6d9574-qhlss" Feb 02 12:03:42 crc kubenswrapper[4909]: I0202 12:03:42.614176 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e5459356-1157-4e95-bdde-846a175e8e84-etc-swift\") pod \"swift-proxy-7bcd6d9574-qhlss\" (UID: \"e5459356-1157-4e95-bdde-846a175e8e84\") " pod="openstack/swift-proxy-7bcd6d9574-qhlss" Feb 02 12:03:42 crc kubenswrapper[4909]: I0202 12:03:42.614850 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5459356-1157-4e95-bdde-846a175e8e84-combined-ca-bundle\") pod \"swift-proxy-7bcd6d9574-qhlss\" (UID: \"e5459356-1157-4e95-bdde-846a175e8e84\") " pod="openstack/swift-proxy-7bcd6d9574-qhlss" Feb 02 12:03:42 crc kubenswrapper[4909]: I0202 12:03:42.619633 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5459356-1157-4e95-bdde-846a175e8e84-config-data\") pod \"swift-proxy-7bcd6d9574-qhlss\" (UID: \"e5459356-1157-4e95-bdde-846a175e8e84\") " pod="openstack/swift-proxy-7bcd6d9574-qhlss" Feb 02 12:03:42 crc kubenswrapper[4909]: I0202 12:03:42.628280 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-556c9\" (UniqueName: \"kubernetes.io/projected/e5459356-1157-4e95-bdde-846a175e8e84-kube-api-access-556c9\") pod 
\"swift-proxy-7bcd6d9574-qhlss\" (UID: \"e5459356-1157-4e95-bdde-846a175e8e84\") " pod="openstack/swift-proxy-7bcd6d9574-qhlss" Feb 02 12:03:42 crc kubenswrapper[4909]: I0202 12:03:42.638334 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zbq6d" Feb 02 12:03:42 crc kubenswrapper[4909]: I0202 12:03:42.640147 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zbq6d" Feb 02 12:03:42 crc kubenswrapper[4909]: I0202 12:03:42.700085 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zbq6d" Feb 02 12:03:42 crc kubenswrapper[4909]: I0202 12:03:42.702515 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7bcd6d9574-qhlss" Feb 02 12:03:43 crc kubenswrapper[4909]: I0202 12:03:43.048548 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zbq6d" Feb 02 12:03:43 crc kubenswrapper[4909]: I0202 12:03:43.103598 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zbq6d"] Feb 02 12:03:43 crc kubenswrapper[4909]: I0202 12:03:43.484138 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7bcd6d9574-qhlss"] Feb 02 12:03:43 crc kubenswrapper[4909]: W0202 12:03:43.484921 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5459356_1157_4e95_bdde_846a175e8e84.slice/crio-5a66fa3d757f27326d6078188aef95cf8b455113f96dc24e9f77d2e1c3606918 WatchSource:0}: Error finding container 5a66fa3d757f27326d6078188aef95cf8b455113f96dc24e9f77d2e1c3606918: Status 404 returned error can't find the container with id 5a66fa3d757f27326d6078188aef95cf8b455113f96dc24e9f77d2e1c3606918 Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 
12:03:44.008183 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7bcd6d9574-qhlss" event={"ID":"e5459356-1157-4e95-bdde-846a175e8e84","Type":"ContainerStarted","Data":"3baba70df0663be285b1037f2c0a6a0055a76308c489316dd5fde6466f7e699a"} Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.008996 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7bcd6d9574-qhlss" event={"ID":"e5459356-1157-4e95-bdde-846a175e8e84","Type":"ContainerStarted","Data":"7ec0d2e1ddfd27fd88f355e9e73bf6e7f0c7c6273e83d6dc7b71753bd279a0ae"} Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.009019 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7bcd6d9574-qhlss" event={"ID":"e5459356-1157-4e95-bdde-846a175e8e84","Type":"ContainerStarted","Data":"5a66fa3d757f27326d6078188aef95cf8b455113f96dc24e9f77d2e1c3606918"} Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.043567 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7bcd6d9574-qhlss" podStartSLOduration=2.04352329 podStartE2EDuration="2.04352329s" podCreationTimestamp="2026-02-02 12:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:03:44.027466054 +0000 UTC m=+5549.773566799" watchObservedRunningTime="2026-02-02 12:03:44.04352329 +0000 UTC m=+5549.789624025" Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.149099 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-74ddb64c4d-d2jbp"] Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.150505 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.154354 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.155609 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.171700 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-74ddb64c4d-d2jbp"] Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.253904 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc004db8-ba73-439f-b849-14910682e8c8-public-tls-certs\") pod \"swift-proxy-74ddb64c4d-d2jbp\" (UID: \"bc004db8-ba73-439f-b849-14910682e8c8\") " pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.254001 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqzl4\" (UniqueName: \"kubernetes.io/projected/bc004db8-ba73-439f-b849-14910682e8c8-kube-api-access-bqzl4\") pod \"swift-proxy-74ddb64c4d-d2jbp\" (UID: \"bc004db8-ba73-439f-b849-14910682e8c8\") " pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.254071 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc004db8-ba73-439f-b849-14910682e8c8-etc-swift\") pod \"swift-proxy-74ddb64c4d-d2jbp\" (UID: \"bc004db8-ba73-439f-b849-14910682e8c8\") " pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.254195 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/bc004db8-ba73-439f-b849-14910682e8c8-log-httpd\") pod \"swift-proxy-74ddb64c4d-d2jbp\" (UID: \"bc004db8-ba73-439f-b849-14910682e8c8\") " pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.254221 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc004db8-ba73-439f-b849-14910682e8c8-internal-tls-certs\") pod \"swift-proxy-74ddb64c4d-d2jbp\" (UID: \"bc004db8-ba73-439f-b849-14910682e8c8\") " pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.254249 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc004db8-ba73-439f-b849-14910682e8c8-combined-ca-bundle\") pod \"swift-proxy-74ddb64c4d-d2jbp\" (UID: \"bc004db8-ba73-439f-b849-14910682e8c8\") " pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.254300 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc004db8-ba73-439f-b849-14910682e8c8-run-httpd\") pod \"swift-proxy-74ddb64c4d-d2jbp\" (UID: \"bc004db8-ba73-439f-b849-14910682e8c8\") " pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.254348 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc004db8-ba73-439f-b849-14910682e8c8-config-data\") pod \"swift-proxy-74ddb64c4d-d2jbp\" (UID: \"bc004db8-ba73-439f-b849-14910682e8c8\") " pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.356112 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/bc004db8-ba73-439f-b849-14910682e8c8-log-httpd\") pod \"swift-proxy-74ddb64c4d-d2jbp\" (UID: \"bc004db8-ba73-439f-b849-14910682e8c8\") " pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.356177 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc004db8-ba73-439f-b849-14910682e8c8-internal-tls-certs\") pod \"swift-proxy-74ddb64c4d-d2jbp\" (UID: \"bc004db8-ba73-439f-b849-14910682e8c8\") " pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.356207 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc004db8-ba73-439f-b849-14910682e8c8-combined-ca-bundle\") pod \"swift-proxy-74ddb64c4d-d2jbp\" (UID: \"bc004db8-ba73-439f-b849-14910682e8c8\") " pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.356275 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc004db8-ba73-439f-b849-14910682e8c8-run-httpd\") pod \"swift-proxy-74ddb64c4d-d2jbp\" (UID: \"bc004db8-ba73-439f-b849-14910682e8c8\") " pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.356316 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc004db8-ba73-439f-b849-14910682e8c8-config-data\") pod \"swift-proxy-74ddb64c4d-d2jbp\" (UID: \"bc004db8-ba73-439f-b849-14910682e8c8\") " pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.356343 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bc004db8-ba73-439f-b849-14910682e8c8-public-tls-certs\") pod \"swift-proxy-74ddb64c4d-d2jbp\" (UID: \"bc004db8-ba73-439f-b849-14910682e8c8\") " pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.356650 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc004db8-ba73-439f-b849-14910682e8c8-log-httpd\") pod \"swift-proxy-74ddb64c4d-d2jbp\" (UID: \"bc004db8-ba73-439f-b849-14910682e8c8\") " pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.357110 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqzl4\" (UniqueName: \"kubernetes.io/projected/bc004db8-ba73-439f-b849-14910682e8c8-kube-api-access-bqzl4\") pod \"swift-proxy-74ddb64c4d-d2jbp\" (UID: \"bc004db8-ba73-439f-b849-14910682e8c8\") " pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.357212 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc004db8-ba73-439f-b849-14910682e8c8-etc-swift\") pod \"swift-proxy-74ddb64c4d-d2jbp\" (UID: \"bc004db8-ba73-439f-b849-14910682e8c8\") " pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.357098 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc004db8-ba73-439f-b849-14910682e8c8-run-httpd\") pod \"swift-proxy-74ddb64c4d-d2jbp\" (UID: \"bc004db8-ba73-439f-b849-14910682e8c8\") " pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.360842 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc004db8-ba73-439f-b849-14910682e8c8-config-data\") pod 
\"swift-proxy-74ddb64c4d-d2jbp\" (UID: \"bc004db8-ba73-439f-b849-14910682e8c8\") " pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.362747 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc004db8-ba73-439f-b849-14910682e8c8-internal-tls-certs\") pod \"swift-proxy-74ddb64c4d-d2jbp\" (UID: \"bc004db8-ba73-439f-b849-14910682e8c8\") " pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.363269 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bc004db8-ba73-439f-b849-14910682e8c8-public-tls-certs\") pod \"swift-proxy-74ddb64c4d-d2jbp\" (UID: \"bc004db8-ba73-439f-b849-14910682e8c8\") " pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.363329 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc004db8-ba73-439f-b849-14910682e8c8-combined-ca-bundle\") pod \"swift-proxy-74ddb64c4d-d2jbp\" (UID: \"bc004db8-ba73-439f-b849-14910682e8c8\") " pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.364895 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc004db8-ba73-439f-b849-14910682e8c8-etc-swift\") pod \"swift-proxy-74ddb64c4d-d2jbp\" (UID: \"bc004db8-ba73-439f-b849-14910682e8c8\") " pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.375847 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqzl4\" (UniqueName: \"kubernetes.io/projected/bc004db8-ba73-439f-b849-14910682e8c8-kube-api-access-bqzl4\") pod \"swift-proxy-74ddb64c4d-d2jbp\" (UID: 
\"bc004db8-ba73-439f-b849-14910682e8c8\") " pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:44 crc kubenswrapper[4909]: I0202 12:03:44.467661 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:45 crc kubenswrapper[4909]: I0202 12:03:45.016086 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zbq6d" podUID="e7eb513f-1e95-42ad-9aff-22856a8d6f49" containerName="registry-server" containerID="cri-o://3fe8ac84dfea4bac8ebf948c81bf198c6c7331e82889c452e0957998c10bb946" gracePeriod=2 Feb 02 12:03:45 crc kubenswrapper[4909]: I0202 12:03:45.036680 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7bcd6d9574-qhlss" Feb 02 12:03:45 crc kubenswrapper[4909]: I0202 12:03:45.036725 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7bcd6d9574-qhlss" Feb 02 12:03:45 crc kubenswrapper[4909]: I0202 12:03:45.217610 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-74ddb64c4d-d2jbp"] Feb 02 12:03:45 crc kubenswrapper[4909]: W0202 12:03:45.245169 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc004db8_ba73_439f_b849_14910682e8c8.slice/crio-b32cdf30701b930eef8096719478d1f66588f4c74b8c019aa42c9792c951e798 WatchSource:0}: Error finding container b32cdf30701b930eef8096719478d1f66588f4c74b8c019aa42c9792c951e798: Status 404 returned error can't find the container with id b32cdf30701b930eef8096719478d1f66588f4c74b8c019aa42c9792c951e798 Feb 02 12:03:45 crc kubenswrapper[4909]: I0202 12:03:45.990671 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zbq6d" Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.039518 4909 generic.go:334] "Generic (PLEG): container finished" podID="e7eb513f-1e95-42ad-9aff-22856a8d6f49" containerID="3fe8ac84dfea4bac8ebf948c81bf198c6c7331e82889c452e0957998c10bb946" exitCode=0 Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.039589 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zbq6d" Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.039590 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zbq6d" event={"ID":"e7eb513f-1e95-42ad-9aff-22856a8d6f49","Type":"ContainerDied","Data":"3fe8ac84dfea4bac8ebf948c81bf198c6c7331e82889c452e0957998c10bb946"} Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.039771 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zbq6d" event={"ID":"e7eb513f-1e95-42ad-9aff-22856a8d6f49","Type":"ContainerDied","Data":"12390a39f6931146ab2dc346c7d56089a076e0127e12456fd13ff53ee1f4e7af"} Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.039837 4909 scope.go:117] "RemoveContainer" containerID="3fe8ac84dfea4bac8ebf948c81bf198c6c7331e82889c452e0957998c10bb946" Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.046644 4909 generic.go:334] "Generic (PLEG): container finished" podID="bf9311a4-5cfc-4c64-ad6b-198cbe0509bd" containerID="a414202ee1e2148b979c3bd6587c61d6bcee2fe231529d02dd7590a141e95567" exitCode=0 Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.046723 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9ps2d" event={"ID":"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd","Type":"ContainerDied","Data":"a414202ee1e2148b979c3bd6587c61d6bcee2fe231529d02dd7590a141e95567"} Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.051851 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-74ddb64c4d-d2jbp" event={"ID":"bc004db8-ba73-439f-b849-14910682e8c8","Type":"ContainerStarted","Data":"ae3776f8c6203079bc0449f47a5e04b6752529cbe8738b62b1bc17a1f50cd0ba"} Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.051923 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-74ddb64c4d-d2jbp" event={"ID":"bc004db8-ba73-439f-b849-14910682e8c8","Type":"ContainerStarted","Data":"64605bcb60c94b91708acbea49991d5bd075519ea10c9f12c8ce2662a1b472de"} Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.051941 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-74ddb64c4d-d2jbp" event={"ID":"bc004db8-ba73-439f-b849-14910682e8c8","Type":"ContainerStarted","Data":"b32cdf30701b930eef8096719478d1f66588f4c74b8c019aa42c9792c951e798"} Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.052231 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.076148 4909 scope.go:117] "RemoveContainer" containerID="e4b385a20c2f39b23821da4581f1806594f524770de365aff111e553d559a15e" Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.093613 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7eb513f-1e95-42ad-9aff-22856a8d6f49-catalog-content\") pod \"e7eb513f-1e95-42ad-9aff-22856a8d6f49\" (UID: \"e7eb513f-1e95-42ad-9aff-22856a8d6f49\") " Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.096869 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7eb513f-1e95-42ad-9aff-22856a8d6f49-utilities\") pod \"e7eb513f-1e95-42ad-9aff-22856a8d6f49\" (UID: \"e7eb513f-1e95-42ad-9aff-22856a8d6f49\") " Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.096977 
4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fznpv\" (UniqueName: \"kubernetes.io/projected/e7eb513f-1e95-42ad-9aff-22856a8d6f49-kube-api-access-fznpv\") pod \"e7eb513f-1e95-42ad-9aff-22856a8d6f49\" (UID: \"e7eb513f-1e95-42ad-9aff-22856a8d6f49\") " Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.103893 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-74ddb64c4d-d2jbp" podStartSLOduration=2.103866209 podStartE2EDuration="2.103866209s" podCreationTimestamp="2026-02-02 12:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:03:46.096578132 +0000 UTC m=+5551.842678867" watchObservedRunningTime="2026-02-02 12:03:46.103866209 +0000 UTC m=+5551.849966944" Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.108859 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7eb513f-1e95-42ad-9aff-22856a8d6f49-utilities" (OuterVolumeSpecName: "utilities") pod "e7eb513f-1e95-42ad-9aff-22856a8d6f49" (UID: "e7eb513f-1e95-42ad-9aff-22856a8d6f49"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.116770 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7eb513f-1e95-42ad-9aff-22856a8d6f49-kube-api-access-fznpv" (OuterVolumeSpecName: "kube-api-access-fznpv") pod "e7eb513f-1e95-42ad-9aff-22856a8d6f49" (UID: "e7eb513f-1e95-42ad-9aff-22856a8d6f49"). InnerVolumeSpecName "kube-api-access-fznpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.125715 4909 scope.go:117] "RemoveContainer" containerID="b46b7c540e031db811b85a556fbad0fbc275af2dffa8801140ce16b3b495b7a4" Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.177945 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7eb513f-1e95-42ad-9aff-22856a8d6f49-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e7eb513f-1e95-42ad-9aff-22856a8d6f49" (UID: "e7eb513f-1e95-42ad-9aff-22856a8d6f49"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.178535 4909 scope.go:117] "RemoveContainer" containerID="3fe8ac84dfea4bac8ebf948c81bf198c6c7331e82889c452e0957998c10bb946" Feb 02 12:03:46 crc kubenswrapper[4909]: E0202 12:03:46.179222 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fe8ac84dfea4bac8ebf948c81bf198c6c7331e82889c452e0957998c10bb946\": container with ID starting with 3fe8ac84dfea4bac8ebf948c81bf198c6c7331e82889c452e0957998c10bb946 not found: ID does not exist" containerID="3fe8ac84dfea4bac8ebf948c81bf198c6c7331e82889c452e0957998c10bb946" Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.179297 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fe8ac84dfea4bac8ebf948c81bf198c6c7331e82889c452e0957998c10bb946"} err="failed to get container status \"3fe8ac84dfea4bac8ebf948c81bf198c6c7331e82889c452e0957998c10bb946\": rpc error: code = NotFound desc = could not find container \"3fe8ac84dfea4bac8ebf948c81bf198c6c7331e82889c452e0957998c10bb946\": container with ID starting with 3fe8ac84dfea4bac8ebf948c81bf198c6c7331e82889c452e0957998c10bb946 not found: ID does not exist" Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.179344 4909 scope.go:117] 
"RemoveContainer" containerID="e4b385a20c2f39b23821da4581f1806594f524770de365aff111e553d559a15e" Feb 02 12:03:46 crc kubenswrapper[4909]: E0202 12:03:46.179824 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4b385a20c2f39b23821da4581f1806594f524770de365aff111e553d559a15e\": container with ID starting with e4b385a20c2f39b23821da4581f1806594f524770de365aff111e553d559a15e not found: ID does not exist" containerID="e4b385a20c2f39b23821da4581f1806594f524770de365aff111e553d559a15e" Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.179889 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4b385a20c2f39b23821da4581f1806594f524770de365aff111e553d559a15e"} err="failed to get container status \"e4b385a20c2f39b23821da4581f1806594f524770de365aff111e553d559a15e\": rpc error: code = NotFound desc = could not find container \"e4b385a20c2f39b23821da4581f1806594f524770de365aff111e553d559a15e\": container with ID starting with e4b385a20c2f39b23821da4581f1806594f524770de365aff111e553d559a15e not found: ID does not exist" Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.179927 4909 scope.go:117] "RemoveContainer" containerID="b46b7c540e031db811b85a556fbad0fbc275af2dffa8801140ce16b3b495b7a4" Feb 02 12:03:46 crc kubenswrapper[4909]: E0202 12:03:46.180244 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b46b7c540e031db811b85a556fbad0fbc275af2dffa8801140ce16b3b495b7a4\": container with ID starting with b46b7c540e031db811b85a556fbad0fbc275af2dffa8801140ce16b3b495b7a4 not found: ID does not exist" containerID="b46b7c540e031db811b85a556fbad0fbc275af2dffa8801140ce16b3b495b7a4" Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.180279 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b46b7c540e031db811b85a556fbad0fbc275af2dffa8801140ce16b3b495b7a4"} err="failed to get container status \"b46b7c540e031db811b85a556fbad0fbc275af2dffa8801140ce16b3b495b7a4\": rpc error: code = NotFound desc = could not find container \"b46b7c540e031db811b85a556fbad0fbc275af2dffa8801140ce16b3b495b7a4\": container with ID starting with b46b7c540e031db811b85a556fbad0fbc275af2dffa8801140ce16b3b495b7a4 not found: ID does not exist" Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.200150 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e7eb513f-1e95-42ad-9aff-22856a8d6f49-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.200178 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e7eb513f-1e95-42ad-9aff-22856a8d6f49-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.200189 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fznpv\" (UniqueName: \"kubernetes.io/projected/e7eb513f-1e95-42ad-9aff-22856a8d6f49-kube-api-access-fznpv\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.382486 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zbq6d"] Feb 02 12:03:46 crc kubenswrapper[4909]: I0202 12:03:46.390027 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zbq6d"] Feb 02 12:03:47 crc kubenswrapper[4909]: I0202 12:03:47.027561 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7eb513f-1e95-42ad-9aff-22856a8d6f49" path="/var/lib/kubelet/pods/e7eb513f-1e95-42ad-9aff-22856a8d6f49/volumes" Feb 02 12:03:47 crc kubenswrapper[4909]: I0202 12:03:47.061127 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:47 crc kubenswrapper[4909]: I0202 12:03:47.393973 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-9ps2d" Feb 02 12:03:47 crc kubenswrapper[4909]: I0202 12:03:47.526202 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-scripts\") pod \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\" (UID: \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\") " Feb 02 12:03:47 crc kubenswrapper[4909]: I0202 12:03:47.526292 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-combined-ca-bundle\") pod \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\" (UID: \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\") " Feb 02 12:03:47 crc kubenswrapper[4909]: I0202 12:03:47.526322 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-dispersionconf\") pod \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\" (UID: \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\") " Feb 02 12:03:47 crc kubenswrapper[4909]: I0202 12:03:47.526421 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-swiftconf\") pod \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\" (UID: \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\") " Feb 02 12:03:47 crc kubenswrapper[4909]: I0202 12:03:47.526882 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-ring-data-devices\") pod \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\" (UID: \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\") " Feb 
02 12:03:47 crc kubenswrapper[4909]: I0202 12:03:47.526962 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dzx2\" (UniqueName: \"kubernetes.io/projected/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-kube-api-access-6dzx2\") pod \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\" (UID: \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\") " Feb 02 12:03:47 crc kubenswrapper[4909]: I0202 12:03:47.527026 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-etc-swift\") pod \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\" (UID: \"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd\") " Feb 02 12:03:47 crc kubenswrapper[4909]: I0202 12:03:47.527729 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "bf9311a4-5cfc-4c64-ad6b-198cbe0509bd" (UID: "bf9311a4-5cfc-4c64-ad6b-198cbe0509bd"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:03:47 crc kubenswrapper[4909]: I0202 12:03:47.528194 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "bf9311a4-5cfc-4c64-ad6b-198cbe0509bd" (UID: "bf9311a4-5cfc-4c64-ad6b-198cbe0509bd"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:03:47 crc kubenswrapper[4909]: I0202 12:03:47.531608 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-kube-api-access-6dzx2" (OuterVolumeSpecName: "kube-api-access-6dzx2") pod "bf9311a4-5cfc-4c64-ad6b-198cbe0509bd" (UID: "bf9311a4-5cfc-4c64-ad6b-198cbe0509bd"). InnerVolumeSpecName "kube-api-access-6dzx2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:03:47 crc kubenswrapper[4909]: I0202 12:03:47.534222 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "bf9311a4-5cfc-4c64-ad6b-198cbe0509bd" (UID: "bf9311a4-5cfc-4c64-ad6b-198cbe0509bd"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:03:47 crc kubenswrapper[4909]: I0202 12:03:47.548357 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-scripts" (OuterVolumeSpecName: "scripts") pod "bf9311a4-5cfc-4c64-ad6b-198cbe0509bd" (UID: "bf9311a4-5cfc-4c64-ad6b-198cbe0509bd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:03:47 crc kubenswrapper[4909]: I0202 12:03:47.550240 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "bf9311a4-5cfc-4c64-ad6b-198cbe0509bd" (UID: "bf9311a4-5cfc-4c64-ad6b-198cbe0509bd"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:03:47 crc kubenswrapper[4909]: I0202 12:03:47.555400 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf9311a4-5cfc-4c64-ad6b-198cbe0509bd" (UID: "bf9311a4-5cfc-4c64-ad6b-198cbe0509bd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:03:47 crc kubenswrapper[4909]: I0202 12:03:47.628986 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:47 crc kubenswrapper[4909]: I0202 12:03:47.629015 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:47 crc kubenswrapper[4909]: I0202 12:03:47.629027 4909 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:47 crc kubenswrapper[4909]: I0202 12:03:47.629036 4909 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:47 crc kubenswrapper[4909]: I0202 12:03:47.629043 4909 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:47 crc kubenswrapper[4909]: I0202 12:03:47.629052 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dzx2\" (UniqueName: \"kubernetes.io/projected/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-kube-api-access-6dzx2\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:47 crc kubenswrapper[4909]: I0202 12:03:47.629060 4909 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bf9311a4-5cfc-4c64-ad6b-198cbe0509bd-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:48 crc kubenswrapper[4909]: I0202 12:03:48.071510 4909 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9ps2d" event={"ID":"bf9311a4-5cfc-4c64-ad6b-198cbe0509bd","Type":"ContainerDied","Data":"2b38e35c99a589111c4e03f010d962fbbefdbce75b0b3d4790ae820a9b3726bd"} Feb 02 12:03:48 crc kubenswrapper[4909]: I0202 12:03:48.071903 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b38e35c99a589111c4e03f010d962fbbefdbce75b0b3d4790ae820a9b3726bd" Feb 02 12:03:48 crc kubenswrapper[4909]: I0202 12:03:48.071549 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-9ps2d" Feb 02 12:03:50 crc kubenswrapper[4909]: I0202 12:03:50.167076 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b4bb8f457-t96qt" Feb 02 12:03:50 crc kubenswrapper[4909]: E0202 12:03:50.191414 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54804e1f_3371_4774_b7b8_02a96c1d86fb.slice/crio-abeb214c0ea56bf5c9a7e217e79f0202bd4cad8e89596b11520e9246f3701f17\": RecentStats: unable to find data in memory cache]" Feb 02 12:03:50 crc kubenswrapper[4909]: I0202 12:03:50.256035 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-df8fdb97c-xcthz"] Feb 02 12:03:50 crc kubenswrapper[4909]: I0202 12:03:50.257611 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-df8fdb97c-xcthz" podUID="b1eec544-d8ae-4672-b520-57b7fc00a655" containerName="dnsmasq-dns" containerID="cri-o://4bb589ba24d04c090fb5338a1b56a18f57e2588c37f1978323264995fc5d478b" gracePeriod=10 Feb 02 12:03:50 crc kubenswrapper[4909]: I0202 12:03:50.745437 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-df8fdb97c-xcthz" Feb 02 12:03:50 crc kubenswrapper[4909]: I0202 12:03:50.888563 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1eec544-d8ae-4672-b520-57b7fc00a655-dns-svc\") pod \"b1eec544-d8ae-4672-b520-57b7fc00a655\" (UID: \"b1eec544-d8ae-4672-b520-57b7fc00a655\") " Feb 02 12:03:50 crc kubenswrapper[4909]: I0202 12:03:50.888645 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1eec544-d8ae-4672-b520-57b7fc00a655-ovsdbserver-sb\") pod \"b1eec544-d8ae-4672-b520-57b7fc00a655\" (UID: \"b1eec544-d8ae-4672-b520-57b7fc00a655\") " Feb 02 12:03:50 crc kubenswrapper[4909]: I0202 12:03:50.888919 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1eec544-d8ae-4672-b520-57b7fc00a655-ovsdbserver-nb\") pod \"b1eec544-d8ae-4672-b520-57b7fc00a655\" (UID: \"b1eec544-d8ae-4672-b520-57b7fc00a655\") " Feb 02 12:03:50 crc kubenswrapper[4909]: I0202 12:03:50.889296 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1eec544-d8ae-4672-b520-57b7fc00a655-config\") pod \"b1eec544-d8ae-4672-b520-57b7fc00a655\" (UID: \"b1eec544-d8ae-4672-b520-57b7fc00a655\") " Feb 02 12:03:50 crc kubenswrapper[4909]: I0202 12:03:50.889416 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvs4k\" (UniqueName: \"kubernetes.io/projected/b1eec544-d8ae-4672-b520-57b7fc00a655-kube-api-access-qvs4k\") pod \"b1eec544-d8ae-4672-b520-57b7fc00a655\" (UID: \"b1eec544-d8ae-4672-b520-57b7fc00a655\") " Feb 02 12:03:50 crc kubenswrapper[4909]: I0202 12:03:50.895182 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b1eec544-d8ae-4672-b520-57b7fc00a655-kube-api-access-qvs4k" (OuterVolumeSpecName: "kube-api-access-qvs4k") pod "b1eec544-d8ae-4672-b520-57b7fc00a655" (UID: "b1eec544-d8ae-4672-b520-57b7fc00a655"). InnerVolumeSpecName "kube-api-access-qvs4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:03:50 crc kubenswrapper[4909]: I0202 12:03:50.934342 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1eec544-d8ae-4672-b520-57b7fc00a655-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b1eec544-d8ae-4672-b520-57b7fc00a655" (UID: "b1eec544-d8ae-4672-b520-57b7fc00a655"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:03:50 crc kubenswrapper[4909]: I0202 12:03:50.939064 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1eec544-d8ae-4672-b520-57b7fc00a655-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b1eec544-d8ae-4672-b520-57b7fc00a655" (UID: "b1eec544-d8ae-4672-b520-57b7fc00a655"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:03:50 crc kubenswrapper[4909]: I0202 12:03:50.946422 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1eec544-d8ae-4672-b520-57b7fc00a655-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b1eec544-d8ae-4672-b520-57b7fc00a655" (UID: "b1eec544-d8ae-4672-b520-57b7fc00a655"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:03:50 crc kubenswrapper[4909]: I0202 12:03:50.951174 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1eec544-d8ae-4672-b520-57b7fc00a655-config" (OuterVolumeSpecName: "config") pod "b1eec544-d8ae-4672-b520-57b7fc00a655" (UID: "b1eec544-d8ae-4672-b520-57b7fc00a655"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:03:50 crc kubenswrapper[4909]: I0202 12:03:50.991616 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b1eec544-d8ae-4672-b520-57b7fc00a655-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:50 crc kubenswrapper[4909]: I0202 12:03:50.991648 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1eec544-d8ae-4672-b520-57b7fc00a655-config\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:50 crc kubenswrapper[4909]: I0202 12:03:50.991661 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvs4k\" (UniqueName: \"kubernetes.io/projected/b1eec544-d8ae-4672-b520-57b7fc00a655-kube-api-access-qvs4k\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:50 crc kubenswrapper[4909]: I0202 12:03:50.991674 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1eec544-d8ae-4672-b520-57b7fc00a655-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:50 crc kubenswrapper[4909]: I0202 12:03:50.991684 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b1eec544-d8ae-4672-b520-57b7fc00a655-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:51 crc kubenswrapper[4909]: I0202 12:03:51.095682 4909 generic.go:334] "Generic (PLEG): container finished" podID="b1eec544-d8ae-4672-b520-57b7fc00a655" containerID="4bb589ba24d04c090fb5338a1b56a18f57e2588c37f1978323264995fc5d478b" exitCode=0 Feb 02 12:03:51 crc kubenswrapper[4909]: I0202 12:03:51.096156 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df8fdb97c-xcthz" event={"ID":"b1eec544-d8ae-4672-b520-57b7fc00a655","Type":"ContainerDied","Data":"4bb589ba24d04c090fb5338a1b56a18f57e2588c37f1978323264995fc5d478b"} Feb 02 12:03:51 crc kubenswrapper[4909]: I0202 
12:03:51.096246 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df8fdb97c-xcthz" event={"ID":"b1eec544-d8ae-4672-b520-57b7fc00a655","Type":"ContainerDied","Data":"61fa65b8d6c7ff0bf30b42f300881e6b9fe4b0114974f609f010d6b89436287d"} Feb 02 12:03:51 crc kubenswrapper[4909]: I0202 12:03:51.096320 4909 scope.go:117] "RemoveContainer" containerID="4bb589ba24d04c090fb5338a1b56a18f57e2588c37f1978323264995fc5d478b" Feb 02 12:03:51 crc kubenswrapper[4909]: I0202 12:03:51.096335 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-df8fdb97c-xcthz" Feb 02 12:03:51 crc kubenswrapper[4909]: I0202 12:03:51.119688 4909 scope.go:117] "RemoveContainer" containerID="0c8903278f74ee0dba8f303255973f079f028baa694f87b4b846b146910b5c9e" Feb 02 12:03:51 crc kubenswrapper[4909]: I0202 12:03:51.125111 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-df8fdb97c-xcthz"] Feb 02 12:03:51 crc kubenswrapper[4909]: I0202 12:03:51.137176 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-df8fdb97c-xcthz"] Feb 02 12:03:51 crc kubenswrapper[4909]: I0202 12:03:51.139719 4909 scope.go:117] "RemoveContainer" containerID="4bb589ba24d04c090fb5338a1b56a18f57e2588c37f1978323264995fc5d478b" Feb 02 12:03:51 crc kubenswrapper[4909]: E0202 12:03:51.142619 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bb589ba24d04c090fb5338a1b56a18f57e2588c37f1978323264995fc5d478b\": container with ID starting with 4bb589ba24d04c090fb5338a1b56a18f57e2588c37f1978323264995fc5d478b not found: ID does not exist" containerID="4bb589ba24d04c090fb5338a1b56a18f57e2588c37f1978323264995fc5d478b" Feb 02 12:03:51 crc kubenswrapper[4909]: I0202 12:03:51.142655 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4bb589ba24d04c090fb5338a1b56a18f57e2588c37f1978323264995fc5d478b"} err="failed to get container status \"4bb589ba24d04c090fb5338a1b56a18f57e2588c37f1978323264995fc5d478b\": rpc error: code = NotFound desc = could not find container \"4bb589ba24d04c090fb5338a1b56a18f57e2588c37f1978323264995fc5d478b\": container with ID starting with 4bb589ba24d04c090fb5338a1b56a18f57e2588c37f1978323264995fc5d478b not found: ID does not exist" Feb 02 12:03:51 crc kubenswrapper[4909]: I0202 12:03:51.142681 4909 scope.go:117] "RemoveContainer" containerID="0c8903278f74ee0dba8f303255973f079f028baa694f87b4b846b146910b5c9e" Feb 02 12:03:51 crc kubenswrapper[4909]: E0202 12:03:51.142969 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c8903278f74ee0dba8f303255973f079f028baa694f87b4b846b146910b5c9e\": container with ID starting with 0c8903278f74ee0dba8f303255973f079f028baa694f87b4b846b146910b5c9e not found: ID does not exist" containerID="0c8903278f74ee0dba8f303255973f079f028baa694f87b4b846b146910b5c9e" Feb 02 12:03:51 crc kubenswrapper[4909]: I0202 12:03:51.142993 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c8903278f74ee0dba8f303255973f079f028baa694f87b4b846b146910b5c9e"} err="failed to get container status \"0c8903278f74ee0dba8f303255973f079f028baa694f87b4b846b146910b5c9e\": rpc error: code = NotFound desc = could not find container \"0c8903278f74ee0dba8f303255973f079f028baa694f87b4b846b146910b5c9e\": container with ID starting with 0c8903278f74ee0dba8f303255973f079f028baa694f87b4b846b146910b5c9e not found: ID does not exist" Feb 02 12:03:52 crc kubenswrapper[4909]: I0202 12:03:52.704468 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7bcd6d9574-qhlss" Feb 02 12:03:52 crc kubenswrapper[4909]: I0202 12:03:52.704743 4909 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/swift-proxy-7bcd6d9574-qhlss" Feb 02 12:03:53 crc kubenswrapper[4909]: I0202 12:03:53.025444 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1eec544-d8ae-4672-b520-57b7fc00a655" path="/var/lib/kubelet/pods/b1eec544-d8ae-4672-b520-57b7fc00a655/volumes" Feb 02 12:03:54 crc kubenswrapper[4909]: I0202 12:03:54.473412 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:54 crc kubenswrapper[4909]: I0202 12:03:54.475280 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-74ddb64c4d-d2jbp" Feb 02 12:03:54 crc kubenswrapper[4909]: I0202 12:03:54.586449 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-7bcd6d9574-qhlss"] Feb 02 12:03:54 crc kubenswrapper[4909]: I0202 12:03:54.586778 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7bcd6d9574-qhlss" podUID="e5459356-1157-4e95-bdde-846a175e8e84" containerName="proxy-httpd" containerID="cri-o://7ec0d2e1ddfd27fd88f355e9e73bf6e7f0c7c6273e83d6dc7b71753bd279a0ae" gracePeriod=30 Feb 02 12:03:54 crc kubenswrapper[4909]: I0202 12:03:54.587086 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7bcd6d9574-qhlss" podUID="e5459356-1157-4e95-bdde-846a175e8e84" containerName="proxy-server" containerID="cri-o://3baba70df0663be285b1037f2c0a6a0055a76308c489316dd5fde6466f7e699a" gracePeriod=30 Feb 02 12:03:55 crc kubenswrapper[4909]: I0202 12:03:55.131430 4909 generic.go:334] "Generic (PLEG): container finished" podID="e5459356-1157-4e95-bdde-846a175e8e84" containerID="7ec0d2e1ddfd27fd88f355e9e73bf6e7f0c7c6273e83d6dc7b71753bd279a0ae" exitCode=0 Feb 02 12:03:55 crc kubenswrapper[4909]: I0202 12:03:55.131509 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7bcd6d9574-qhlss" 
event={"ID":"e5459356-1157-4e95-bdde-846a175e8e84","Type":"ContainerDied","Data":"7ec0d2e1ddfd27fd88f355e9e73bf6e7f0c7c6273e83d6dc7b71753bd279a0ae"} Feb 02 12:03:55 crc kubenswrapper[4909]: I0202 12:03:55.967320 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7bcd6d9574-qhlss" Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 12:03:56.090640 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e5459356-1157-4e95-bdde-846a175e8e84-etc-swift\") pod \"e5459356-1157-4e95-bdde-846a175e8e84\" (UID: \"e5459356-1157-4e95-bdde-846a175e8e84\") " Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 12:03:56.090711 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5459356-1157-4e95-bdde-846a175e8e84-run-httpd\") pod \"e5459356-1157-4e95-bdde-846a175e8e84\" (UID: \"e5459356-1157-4e95-bdde-846a175e8e84\") " Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 12:03:56.090763 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5459356-1157-4e95-bdde-846a175e8e84-log-httpd\") pod \"e5459356-1157-4e95-bdde-846a175e8e84\" (UID: \"e5459356-1157-4e95-bdde-846a175e8e84\") " Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 12:03:56.090797 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-556c9\" (UniqueName: \"kubernetes.io/projected/e5459356-1157-4e95-bdde-846a175e8e84-kube-api-access-556c9\") pod \"e5459356-1157-4e95-bdde-846a175e8e84\" (UID: \"e5459356-1157-4e95-bdde-846a175e8e84\") " Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 12:03:56.091012 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5459356-1157-4e95-bdde-846a175e8e84-config-data\") pod 
\"e5459356-1157-4e95-bdde-846a175e8e84\" (UID: \"e5459356-1157-4e95-bdde-846a175e8e84\") " Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 12:03:56.091196 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5459356-1157-4e95-bdde-846a175e8e84-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e5459356-1157-4e95-bdde-846a175e8e84" (UID: "e5459356-1157-4e95-bdde-846a175e8e84"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 12:03:56.091300 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5459356-1157-4e95-bdde-846a175e8e84-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e5459356-1157-4e95-bdde-846a175e8e84" (UID: "e5459356-1157-4e95-bdde-846a175e8e84"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 12:03:56.091391 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5459356-1157-4e95-bdde-846a175e8e84-combined-ca-bundle\") pod \"e5459356-1157-4e95-bdde-846a175e8e84\" (UID: \"e5459356-1157-4e95-bdde-846a175e8e84\") " Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 12:03:56.091947 4909 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5459356-1157-4e95-bdde-846a175e8e84-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 12:03:56.091966 4909 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e5459356-1157-4e95-bdde-846a175e8e84-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 12:03:56.100159 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e5459356-1157-4e95-bdde-846a175e8e84-kube-api-access-556c9" (OuterVolumeSpecName: "kube-api-access-556c9") pod "e5459356-1157-4e95-bdde-846a175e8e84" (UID: "e5459356-1157-4e95-bdde-846a175e8e84"). InnerVolumeSpecName "kube-api-access-556c9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 12:03:56.107828 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5459356-1157-4e95-bdde-846a175e8e84-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e5459356-1157-4e95-bdde-846a175e8e84" (UID: "e5459356-1157-4e95-bdde-846a175e8e84"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 12:03:56.145615 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5459356-1157-4e95-bdde-846a175e8e84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5459356-1157-4e95-bdde-846a175e8e84" (UID: "e5459356-1157-4e95-bdde-846a175e8e84"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 12:03:56.151294 4909 generic.go:334] "Generic (PLEG): container finished" podID="e5459356-1157-4e95-bdde-846a175e8e84" containerID="3baba70df0663be285b1037f2c0a6a0055a76308c489316dd5fde6466f7e699a" exitCode=0 Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 12:03:56.151341 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7bcd6d9574-qhlss" event={"ID":"e5459356-1157-4e95-bdde-846a175e8e84","Type":"ContainerDied","Data":"3baba70df0663be285b1037f2c0a6a0055a76308c489316dd5fde6466f7e699a"} Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 12:03:56.151370 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7bcd6d9574-qhlss" event={"ID":"e5459356-1157-4e95-bdde-846a175e8e84","Type":"ContainerDied","Data":"5a66fa3d757f27326d6078188aef95cf8b455113f96dc24e9f77d2e1c3606918"} Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 12:03:56.151390 4909 scope.go:117] "RemoveContainer" containerID="3baba70df0663be285b1037f2c0a6a0055a76308c489316dd5fde6466f7e699a" Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 12:03:56.151411 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7bcd6d9574-qhlss" Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 12:03:56.163759 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5459356-1157-4e95-bdde-846a175e8e84-config-data" (OuterVolumeSpecName: "config-data") pod "e5459356-1157-4e95-bdde-846a175e8e84" (UID: "e5459356-1157-4e95-bdde-846a175e8e84"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 12:03:56.193225 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5459356-1157-4e95-bdde-846a175e8e84-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 12:03:56.193268 4909 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e5459356-1157-4e95-bdde-846a175e8e84-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 12:03:56.193278 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-556c9\" (UniqueName: \"kubernetes.io/projected/e5459356-1157-4e95-bdde-846a175e8e84-kube-api-access-556c9\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 12:03:56.193287 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5459356-1157-4e95-bdde-846a175e8e84-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 12:03:56.224678 4909 scope.go:117] "RemoveContainer" containerID="7ec0d2e1ddfd27fd88f355e9e73bf6e7f0c7c6273e83d6dc7b71753bd279a0ae" Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 12:03:56.240744 4909 scope.go:117] "RemoveContainer" containerID="3baba70df0663be285b1037f2c0a6a0055a76308c489316dd5fde6466f7e699a" Feb 02 12:03:56 crc kubenswrapper[4909]: E0202 12:03:56.241192 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3baba70df0663be285b1037f2c0a6a0055a76308c489316dd5fde6466f7e699a\": container with ID starting with 3baba70df0663be285b1037f2c0a6a0055a76308c489316dd5fde6466f7e699a not found: ID does not exist" containerID="3baba70df0663be285b1037f2c0a6a0055a76308c489316dd5fde6466f7e699a" Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 
12:03:56.241357 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3baba70df0663be285b1037f2c0a6a0055a76308c489316dd5fde6466f7e699a"} err="failed to get container status \"3baba70df0663be285b1037f2c0a6a0055a76308c489316dd5fde6466f7e699a\": rpc error: code = NotFound desc = could not find container \"3baba70df0663be285b1037f2c0a6a0055a76308c489316dd5fde6466f7e699a\": container with ID starting with 3baba70df0663be285b1037f2c0a6a0055a76308c489316dd5fde6466f7e699a not found: ID does not exist" Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 12:03:56.241455 4909 scope.go:117] "RemoveContainer" containerID="7ec0d2e1ddfd27fd88f355e9e73bf6e7f0c7c6273e83d6dc7b71753bd279a0ae" Feb 02 12:03:56 crc kubenswrapper[4909]: E0202 12:03:56.241762 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ec0d2e1ddfd27fd88f355e9e73bf6e7f0c7c6273e83d6dc7b71753bd279a0ae\": container with ID starting with 7ec0d2e1ddfd27fd88f355e9e73bf6e7f0c7c6273e83d6dc7b71753bd279a0ae not found: ID does not exist" containerID="7ec0d2e1ddfd27fd88f355e9e73bf6e7f0c7c6273e83d6dc7b71753bd279a0ae" Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 12:03:56.241793 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ec0d2e1ddfd27fd88f355e9e73bf6e7f0c7c6273e83d6dc7b71753bd279a0ae"} err="failed to get container status \"7ec0d2e1ddfd27fd88f355e9e73bf6e7f0c7c6273e83d6dc7b71753bd279a0ae\": rpc error: code = NotFound desc = could not find container \"7ec0d2e1ddfd27fd88f355e9e73bf6e7f0c7c6273e83d6dc7b71753bd279a0ae\": container with ID starting with 7ec0d2e1ddfd27fd88f355e9e73bf6e7f0c7c6273e83d6dc7b71753bd279a0ae not found: ID does not exist" Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 12:03:56.484416 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-7bcd6d9574-qhlss"] Feb 02 12:03:56 crc kubenswrapper[4909]: I0202 
12:03:56.494848 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-7bcd6d9574-qhlss"] Feb 02 12:03:57 crc kubenswrapper[4909]: I0202 12:03:57.025682 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5459356-1157-4e95-bdde-846a175e8e84" path="/var/lib/kubelet/pods/e5459356-1157-4e95-bdde-846a175e8e84/volumes" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.256932 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-l2vls"] Feb 02 12:04:00 crc kubenswrapper[4909]: E0202 12:04:00.257591 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1eec544-d8ae-4672-b520-57b7fc00a655" containerName="init" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.257607 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1eec544-d8ae-4672-b520-57b7fc00a655" containerName="init" Feb 02 12:04:00 crc kubenswrapper[4909]: E0202 12:04:00.257624 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7eb513f-1e95-42ad-9aff-22856a8d6f49" containerName="registry-server" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.257630 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7eb513f-1e95-42ad-9aff-22856a8d6f49" containerName="registry-server" Feb 02 12:04:00 crc kubenswrapper[4909]: E0202 12:04:00.257644 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5459356-1157-4e95-bdde-846a175e8e84" containerName="proxy-server" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.257651 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5459356-1157-4e95-bdde-846a175e8e84" containerName="proxy-server" Feb 02 12:04:00 crc kubenswrapper[4909]: E0202 12:04:00.257664 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7eb513f-1e95-42ad-9aff-22856a8d6f49" containerName="extract-utilities" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.257670 4909 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="e7eb513f-1e95-42ad-9aff-22856a8d6f49" containerName="extract-utilities" Feb 02 12:04:00 crc kubenswrapper[4909]: E0202 12:04:00.257680 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7eb513f-1e95-42ad-9aff-22856a8d6f49" containerName="extract-content" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.257686 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7eb513f-1e95-42ad-9aff-22856a8d6f49" containerName="extract-content" Feb 02 12:04:00 crc kubenswrapper[4909]: E0202 12:04:00.257701 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf9311a4-5cfc-4c64-ad6b-198cbe0509bd" containerName="swift-ring-rebalance" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.257707 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf9311a4-5cfc-4c64-ad6b-198cbe0509bd" containerName="swift-ring-rebalance" Feb 02 12:04:00 crc kubenswrapper[4909]: E0202 12:04:00.257720 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5459356-1157-4e95-bdde-846a175e8e84" containerName="proxy-httpd" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.257725 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5459356-1157-4e95-bdde-846a175e8e84" containerName="proxy-httpd" Feb 02 12:04:00 crc kubenswrapper[4909]: E0202 12:04:00.257737 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1eec544-d8ae-4672-b520-57b7fc00a655" containerName="dnsmasq-dns" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.257742 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1eec544-d8ae-4672-b520-57b7fc00a655" containerName="dnsmasq-dns" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.257901 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5459356-1157-4e95-bdde-846a175e8e84" containerName="proxy-server" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.257917 4909 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bf9311a4-5cfc-4c64-ad6b-198cbe0509bd" containerName="swift-ring-rebalance" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.257931 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1eec544-d8ae-4672-b520-57b7fc00a655" containerName="dnsmasq-dns" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.257937 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7eb513f-1e95-42ad-9aff-22856a8d6f49" containerName="registry-server" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.257950 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5459356-1157-4e95-bdde-846a175e8e84" containerName="proxy-httpd" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.258740 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-l2vls" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.269136 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-l2vls"] Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.363066 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2fd8\" (UniqueName: \"kubernetes.io/projected/5e1d7afa-1e6f-4271-b1ab-e4052d238647-kube-api-access-r2fd8\") pod \"cinder-db-create-l2vls\" (UID: \"5e1d7afa-1e6f-4271-b1ab-e4052d238647\") " pod="openstack/cinder-db-create-l2vls" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.363542 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e1d7afa-1e6f-4271-b1ab-e4052d238647-operator-scripts\") pod \"cinder-db-create-l2vls\" (UID: \"5e1d7afa-1e6f-4271-b1ab-e4052d238647\") " pod="openstack/cinder-db-create-l2vls" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.365982 4909 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/cinder-d45e-account-create-update-cpm9n"] Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.367101 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d45e-account-create-update-cpm9n" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.371801 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.377168 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d45e-account-create-update-cpm9n"] Feb 02 12:04:00 crc kubenswrapper[4909]: E0202 12:04:00.432155 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54804e1f_3371_4774_b7b8_02a96c1d86fb.slice/crio-abeb214c0ea56bf5c9a7e217e79f0202bd4cad8e89596b11520e9246f3701f17\": RecentStats: unable to find data in memory cache]" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.465887 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2fd8\" (UniqueName: \"kubernetes.io/projected/5e1d7afa-1e6f-4271-b1ab-e4052d238647-kube-api-access-r2fd8\") pod \"cinder-db-create-l2vls\" (UID: \"5e1d7afa-1e6f-4271-b1ab-e4052d238647\") " pod="openstack/cinder-db-create-l2vls" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.466130 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/faaef286-81e0-4931-863a-0df86ef982c2-operator-scripts\") pod \"cinder-d45e-account-create-update-cpm9n\" (UID: \"faaef286-81e0-4931-863a-0df86ef982c2\") " pod="openstack/cinder-d45e-account-create-update-cpm9n" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.466208 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-c7msd\" (UniqueName: \"kubernetes.io/projected/faaef286-81e0-4931-863a-0df86ef982c2-kube-api-access-c7msd\") pod \"cinder-d45e-account-create-update-cpm9n\" (UID: \"faaef286-81e0-4931-863a-0df86ef982c2\") " pod="openstack/cinder-d45e-account-create-update-cpm9n" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.466247 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e1d7afa-1e6f-4271-b1ab-e4052d238647-operator-scripts\") pod \"cinder-db-create-l2vls\" (UID: \"5e1d7afa-1e6f-4271-b1ab-e4052d238647\") " pod="openstack/cinder-db-create-l2vls" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.467092 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e1d7afa-1e6f-4271-b1ab-e4052d238647-operator-scripts\") pod \"cinder-db-create-l2vls\" (UID: \"5e1d7afa-1e6f-4271-b1ab-e4052d238647\") " pod="openstack/cinder-db-create-l2vls" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.486594 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2fd8\" (UniqueName: \"kubernetes.io/projected/5e1d7afa-1e6f-4271-b1ab-e4052d238647-kube-api-access-r2fd8\") pod \"cinder-db-create-l2vls\" (UID: \"5e1d7afa-1e6f-4271-b1ab-e4052d238647\") " pod="openstack/cinder-db-create-l2vls" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.567620 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7msd\" (UniqueName: \"kubernetes.io/projected/faaef286-81e0-4931-863a-0df86ef982c2-kube-api-access-c7msd\") pod \"cinder-d45e-account-create-update-cpm9n\" (UID: \"faaef286-81e0-4931-863a-0df86ef982c2\") " pod="openstack/cinder-d45e-account-create-update-cpm9n" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.567759 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/faaef286-81e0-4931-863a-0df86ef982c2-operator-scripts\") pod \"cinder-d45e-account-create-update-cpm9n\" (UID: \"faaef286-81e0-4931-863a-0df86ef982c2\") " pod="openstack/cinder-d45e-account-create-update-cpm9n" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.568502 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/faaef286-81e0-4931-863a-0df86ef982c2-operator-scripts\") pod \"cinder-d45e-account-create-update-cpm9n\" (UID: \"faaef286-81e0-4931-863a-0df86ef982c2\") " pod="openstack/cinder-d45e-account-create-update-cpm9n" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.583121 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7msd\" (UniqueName: \"kubernetes.io/projected/faaef286-81e0-4931-863a-0df86ef982c2-kube-api-access-c7msd\") pod \"cinder-d45e-account-create-update-cpm9n\" (UID: \"faaef286-81e0-4931-863a-0df86ef982c2\") " pod="openstack/cinder-d45e-account-create-update-cpm9n" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.591057 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-l2vls" Feb 02 12:04:00 crc kubenswrapper[4909]: I0202 12:04:00.690357 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d45e-account-create-update-cpm9n" Feb 02 12:04:01 crc kubenswrapper[4909]: I0202 12:04:01.015582 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-l2vls"] Feb 02 12:04:01 crc kubenswrapper[4909]: I0202 12:04:01.152138 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-d45e-account-create-update-cpm9n"] Feb 02 12:04:01 crc kubenswrapper[4909]: I0202 12:04:01.193699 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-l2vls" event={"ID":"5e1d7afa-1e6f-4271-b1ab-e4052d238647","Type":"ContainerStarted","Data":"b4e9a41594afcc210335a3299cd21dc3bdf8a2a49c7f8e50334867d1514052c7"} Feb 02 12:04:01 crc kubenswrapper[4909]: I0202 12:04:01.193744 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-l2vls" event={"ID":"5e1d7afa-1e6f-4271-b1ab-e4052d238647","Type":"ContainerStarted","Data":"4df8962dd6672df5a52b48b3c4f940f26be4fbb82e865391fd1795ffbb942478"} Feb 02 12:04:01 crc kubenswrapper[4909]: I0202 12:04:01.194875 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d45e-account-create-update-cpm9n" event={"ID":"faaef286-81e0-4931-863a-0df86ef982c2","Type":"ContainerStarted","Data":"be427ba21003bb1dfa63e5e2f5a43860ac561d09a854cacf6a69a048a9021cb0"} Feb 02 12:04:01 crc kubenswrapper[4909]: I0202 12:04:01.216616 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-l2vls" podStartSLOduration=1.216592281 podStartE2EDuration="1.216592281s" podCreationTimestamp="2026-02-02 12:04:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:04:01.208519162 +0000 UTC m=+5566.954619897" watchObservedRunningTime="2026-02-02 12:04:01.216592281 +0000 UTC m=+5566.962693016" Feb 02 12:04:02 crc kubenswrapper[4909]: I0202 12:04:02.205006 4909 
generic.go:334] "Generic (PLEG): container finished" podID="5e1d7afa-1e6f-4271-b1ab-e4052d238647" containerID="b4e9a41594afcc210335a3299cd21dc3bdf8a2a49c7f8e50334867d1514052c7" exitCode=0 Feb 02 12:04:02 crc kubenswrapper[4909]: I0202 12:04:02.205045 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-l2vls" event={"ID":"5e1d7afa-1e6f-4271-b1ab-e4052d238647","Type":"ContainerDied","Data":"b4e9a41594afcc210335a3299cd21dc3bdf8a2a49c7f8e50334867d1514052c7"} Feb 02 12:04:02 crc kubenswrapper[4909]: I0202 12:04:02.207165 4909 generic.go:334] "Generic (PLEG): container finished" podID="faaef286-81e0-4931-863a-0df86ef982c2" containerID="ba71b7556d483fd7259c209e27b2c8c194cfd84d2e0f6156107f56599d7b9f0b" exitCode=0 Feb 02 12:04:02 crc kubenswrapper[4909]: I0202 12:04:02.207200 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d45e-account-create-update-cpm9n" event={"ID":"faaef286-81e0-4931-863a-0df86ef982c2","Type":"ContainerDied","Data":"ba71b7556d483fd7259c209e27b2c8c194cfd84d2e0f6156107f56599d7b9f0b"} Feb 02 12:04:03 crc kubenswrapper[4909]: I0202 12:04:03.599680 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-d45e-account-create-update-cpm9n" Feb 02 12:04:03 crc kubenswrapper[4909]: I0202 12:04:03.605433 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-l2vls" Feb 02 12:04:03 crc kubenswrapper[4909]: I0202 12:04:03.730003 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2fd8\" (UniqueName: \"kubernetes.io/projected/5e1d7afa-1e6f-4271-b1ab-e4052d238647-kube-api-access-r2fd8\") pod \"5e1d7afa-1e6f-4271-b1ab-e4052d238647\" (UID: \"5e1d7afa-1e6f-4271-b1ab-e4052d238647\") " Feb 02 12:04:03 crc kubenswrapper[4909]: I0202 12:04:03.730069 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/faaef286-81e0-4931-863a-0df86ef982c2-operator-scripts\") pod \"faaef286-81e0-4931-863a-0df86ef982c2\" (UID: \"faaef286-81e0-4931-863a-0df86ef982c2\") " Feb 02 12:04:03 crc kubenswrapper[4909]: I0202 12:04:03.730281 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7msd\" (UniqueName: \"kubernetes.io/projected/faaef286-81e0-4931-863a-0df86ef982c2-kube-api-access-c7msd\") pod \"faaef286-81e0-4931-863a-0df86ef982c2\" (UID: \"faaef286-81e0-4931-863a-0df86ef982c2\") " Feb 02 12:04:03 crc kubenswrapper[4909]: I0202 12:04:03.730308 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e1d7afa-1e6f-4271-b1ab-e4052d238647-operator-scripts\") pod \"5e1d7afa-1e6f-4271-b1ab-e4052d238647\" (UID: \"5e1d7afa-1e6f-4271-b1ab-e4052d238647\") " Feb 02 12:04:03 crc kubenswrapper[4909]: I0202 12:04:03.730834 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/faaef286-81e0-4931-863a-0df86ef982c2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "faaef286-81e0-4931-863a-0df86ef982c2" (UID: "faaef286-81e0-4931-863a-0df86ef982c2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:04:03 crc kubenswrapper[4909]: I0202 12:04:03.730861 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e1d7afa-1e6f-4271-b1ab-e4052d238647-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5e1d7afa-1e6f-4271-b1ab-e4052d238647" (UID: "5e1d7afa-1e6f-4271-b1ab-e4052d238647"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:04:03 crc kubenswrapper[4909]: I0202 12:04:03.731625 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e1d7afa-1e6f-4271-b1ab-e4052d238647-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:03 crc kubenswrapper[4909]: I0202 12:04:03.731647 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/faaef286-81e0-4931-863a-0df86ef982c2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:03 crc kubenswrapper[4909]: I0202 12:04:03.735843 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e1d7afa-1e6f-4271-b1ab-e4052d238647-kube-api-access-r2fd8" (OuterVolumeSpecName: "kube-api-access-r2fd8") pod "5e1d7afa-1e6f-4271-b1ab-e4052d238647" (UID: "5e1d7afa-1e6f-4271-b1ab-e4052d238647"). InnerVolumeSpecName "kube-api-access-r2fd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:04:03 crc kubenswrapper[4909]: I0202 12:04:03.736002 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faaef286-81e0-4931-863a-0df86ef982c2-kube-api-access-c7msd" (OuterVolumeSpecName: "kube-api-access-c7msd") pod "faaef286-81e0-4931-863a-0df86ef982c2" (UID: "faaef286-81e0-4931-863a-0df86ef982c2"). InnerVolumeSpecName "kube-api-access-c7msd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:04:03 crc kubenswrapper[4909]: I0202 12:04:03.832909 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7msd\" (UniqueName: \"kubernetes.io/projected/faaef286-81e0-4931-863a-0df86ef982c2-kube-api-access-c7msd\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:03 crc kubenswrapper[4909]: I0202 12:04:03.832943 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2fd8\" (UniqueName: \"kubernetes.io/projected/5e1d7afa-1e6f-4271-b1ab-e4052d238647-kube-api-access-r2fd8\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:04 crc kubenswrapper[4909]: I0202 12:04:04.228190 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-l2vls" Feb 02 12:04:04 crc kubenswrapper[4909]: I0202 12:04:04.228192 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-l2vls" event={"ID":"5e1d7afa-1e6f-4271-b1ab-e4052d238647","Type":"ContainerDied","Data":"4df8962dd6672df5a52b48b3c4f940f26be4fbb82e865391fd1795ffbb942478"} Feb 02 12:04:04 crc kubenswrapper[4909]: I0202 12:04:04.228326 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4df8962dd6672df5a52b48b3c4f940f26be4fbb82e865391fd1795ffbb942478" Feb 02 12:04:04 crc kubenswrapper[4909]: I0202 12:04:04.229897 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-d45e-account-create-update-cpm9n" event={"ID":"faaef286-81e0-4931-863a-0df86ef982c2","Type":"ContainerDied","Data":"be427ba21003bb1dfa63e5e2f5a43860ac561d09a854cacf6a69a048a9021cb0"} Feb 02 12:04:04 crc kubenswrapper[4909]: I0202 12:04:04.229948 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be427ba21003bb1dfa63e5e2f5a43860ac561d09a854cacf6a69a048a9021cb0" Feb 02 12:04:04 crc kubenswrapper[4909]: I0202 12:04:04.229954 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-d45e-account-create-update-cpm9n" Feb 02 12:04:05 crc kubenswrapper[4909]: I0202 12:04:05.707204 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-h4tm9"] Feb 02 12:04:05 crc kubenswrapper[4909]: E0202 12:04:05.707832 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e1d7afa-1e6f-4271-b1ab-e4052d238647" containerName="mariadb-database-create" Feb 02 12:04:05 crc kubenswrapper[4909]: I0202 12:04:05.707843 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e1d7afa-1e6f-4271-b1ab-e4052d238647" containerName="mariadb-database-create" Feb 02 12:04:05 crc kubenswrapper[4909]: E0202 12:04:05.707861 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faaef286-81e0-4931-863a-0df86ef982c2" containerName="mariadb-account-create-update" Feb 02 12:04:05 crc kubenswrapper[4909]: I0202 12:04:05.707867 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="faaef286-81e0-4931-863a-0df86ef982c2" containerName="mariadb-account-create-update" Feb 02 12:04:05 crc kubenswrapper[4909]: I0202 12:04:05.708015 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="faaef286-81e0-4931-863a-0df86ef982c2" containerName="mariadb-account-create-update" Feb 02 12:04:05 crc kubenswrapper[4909]: I0202 12:04:05.708032 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e1d7afa-1e6f-4271-b1ab-e4052d238647" containerName="mariadb-database-create" Feb 02 12:04:05 crc kubenswrapper[4909]: I0202 12:04:05.708581 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-h4tm9" Feb 02 12:04:05 crc kubenswrapper[4909]: I0202 12:04:05.714104 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 02 12:04:05 crc kubenswrapper[4909]: I0202 12:04:05.714340 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 02 12:04:05 crc kubenswrapper[4909]: I0202 12:04:05.714498 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-qjzr8" Feb 02 12:04:05 crc kubenswrapper[4909]: I0202 12:04:05.719874 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-h4tm9"] Feb 02 12:04:05 crc kubenswrapper[4909]: I0202 12:04:05.868612 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd6bk\" (UniqueName: \"kubernetes.io/projected/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-kube-api-access-sd6bk\") pod \"cinder-db-sync-h4tm9\" (UID: \"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\") " pod="openstack/cinder-db-sync-h4tm9" Feb 02 12:04:05 crc kubenswrapper[4909]: I0202 12:04:05.868935 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-scripts\") pod \"cinder-db-sync-h4tm9\" (UID: \"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\") " pod="openstack/cinder-db-sync-h4tm9" Feb 02 12:04:05 crc kubenswrapper[4909]: I0202 12:04:05.869031 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-config-data\") pod \"cinder-db-sync-h4tm9\" (UID: \"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\") " pod="openstack/cinder-db-sync-h4tm9" Feb 02 12:04:05 crc kubenswrapper[4909]: I0202 12:04:05.869140 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-db-sync-config-data\") pod \"cinder-db-sync-h4tm9\" (UID: \"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\") " pod="openstack/cinder-db-sync-h4tm9" Feb 02 12:04:05 crc kubenswrapper[4909]: I0202 12:04:05.869293 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-etc-machine-id\") pod \"cinder-db-sync-h4tm9\" (UID: \"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\") " pod="openstack/cinder-db-sync-h4tm9" Feb 02 12:04:05 crc kubenswrapper[4909]: I0202 12:04:05.869400 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-combined-ca-bundle\") pod \"cinder-db-sync-h4tm9\" (UID: \"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\") " pod="openstack/cinder-db-sync-h4tm9" Feb 02 12:04:05 crc kubenswrapper[4909]: I0202 12:04:05.970743 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-etc-machine-id\") pod \"cinder-db-sync-h4tm9\" (UID: \"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\") " pod="openstack/cinder-db-sync-h4tm9" Feb 02 12:04:05 crc kubenswrapper[4909]: I0202 12:04:05.970791 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-combined-ca-bundle\") pod \"cinder-db-sync-h4tm9\" (UID: \"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\") " pod="openstack/cinder-db-sync-h4tm9" Feb 02 12:04:05 crc kubenswrapper[4909]: I0202 12:04:05.970847 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sd6bk\" (UniqueName: \"kubernetes.io/projected/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-kube-api-access-sd6bk\") pod \"cinder-db-sync-h4tm9\" (UID: \"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\") " pod="openstack/cinder-db-sync-h4tm9" Feb 02 12:04:05 crc kubenswrapper[4909]: I0202 12:04:05.970875 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-scripts\") pod \"cinder-db-sync-h4tm9\" (UID: \"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\") " pod="openstack/cinder-db-sync-h4tm9" Feb 02 12:04:05 crc kubenswrapper[4909]: I0202 12:04:05.970878 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-etc-machine-id\") pod \"cinder-db-sync-h4tm9\" (UID: \"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\") " pod="openstack/cinder-db-sync-h4tm9" Feb 02 12:04:05 crc kubenswrapper[4909]: I0202 12:04:05.970911 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-config-data\") pod \"cinder-db-sync-h4tm9\" (UID: \"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\") " pod="openstack/cinder-db-sync-h4tm9" Feb 02 12:04:05 crc kubenswrapper[4909]: I0202 12:04:05.970935 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-db-sync-config-data\") pod \"cinder-db-sync-h4tm9\" (UID: \"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\") " pod="openstack/cinder-db-sync-h4tm9" Feb 02 12:04:05 crc kubenswrapper[4909]: I0202 12:04:05.975077 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-combined-ca-bundle\") pod \"cinder-db-sync-h4tm9\" (UID: 
\"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\") " pod="openstack/cinder-db-sync-h4tm9" Feb 02 12:04:05 crc kubenswrapper[4909]: I0202 12:04:05.975470 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-scripts\") pod \"cinder-db-sync-h4tm9\" (UID: \"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\") " pod="openstack/cinder-db-sync-h4tm9" Feb 02 12:04:05 crc kubenswrapper[4909]: I0202 12:04:05.977629 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-config-data\") pod \"cinder-db-sync-h4tm9\" (UID: \"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\") " pod="openstack/cinder-db-sync-h4tm9" Feb 02 12:04:05 crc kubenswrapper[4909]: I0202 12:04:05.978257 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-db-sync-config-data\") pod \"cinder-db-sync-h4tm9\" (UID: \"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\") " pod="openstack/cinder-db-sync-h4tm9" Feb 02 12:04:05 crc kubenswrapper[4909]: I0202 12:04:05.987758 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd6bk\" (UniqueName: \"kubernetes.io/projected/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-kube-api-access-sd6bk\") pod \"cinder-db-sync-h4tm9\" (UID: \"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\") " pod="openstack/cinder-db-sync-h4tm9" Feb 02 12:04:06 crc kubenswrapper[4909]: I0202 12:04:06.025520 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-h4tm9" Feb 02 12:04:06 crc kubenswrapper[4909]: I0202 12:04:06.518541 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-h4tm9"] Feb 02 12:04:07 crc kubenswrapper[4909]: I0202 12:04:07.264312 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-h4tm9" event={"ID":"312db841-556a-4d31-b5c0-e2d4dc5cf3e4","Type":"ContainerStarted","Data":"6d71f70849dc4c93fe50630422c63883636886d2746a0cac0fec0b21b702f79c"} Feb 02 12:04:07 crc kubenswrapper[4909]: I0202 12:04:07.264726 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-h4tm9" event={"ID":"312db841-556a-4d31-b5c0-e2d4dc5cf3e4","Type":"ContainerStarted","Data":"b9bd2255a0f847f4cea69096c753c5c36d9fe3475026de40abf6a9e5972bef12"} Feb 02 12:04:07 crc kubenswrapper[4909]: I0202 12:04:07.279512 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-h4tm9" podStartSLOduration=2.279488106 podStartE2EDuration="2.279488106s" podCreationTimestamp="2026-02-02 12:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:04:07.278420056 +0000 UTC m=+5573.024520791" watchObservedRunningTime="2026-02-02 12:04:07.279488106 +0000 UTC m=+5573.025588851" Feb 02 12:04:10 crc kubenswrapper[4909]: I0202 12:04:10.287192 4909 generic.go:334] "Generic (PLEG): container finished" podID="312db841-556a-4d31-b5c0-e2d4dc5cf3e4" containerID="6d71f70849dc4c93fe50630422c63883636886d2746a0cac0fec0b21b702f79c" exitCode=0 Feb 02 12:04:10 crc kubenswrapper[4909]: I0202 12:04:10.287288 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-h4tm9" event={"ID":"312db841-556a-4d31-b5c0-e2d4dc5cf3e4","Type":"ContainerDied","Data":"6d71f70849dc4c93fe50630422c63883636886d2746a0cac0fec0b21b702f79c"} Feb 02 12:04:10 crc kubenswrapper[4909]: 
E0202 12:04:10.619565 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54804e1f_3371_4774_b7b8_02a96c1d86fb.slice/crio-abeb214c0ea56bf5c9a7e217e79f0202bd4cad8e89596b11520e9246f3701f17\": RecentStats: unable to find data in memory cache]" Feb 02 12:04:11 crc kubenswrapper[4909]: I0202 12:04:11.148694 4909 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3] : Timed out while waiting for systemd to remove kubepods-besteffort-pod9cf7d8fd_a9dc_4523_a82c_d1e4ccff67b3.slice" Feb 02 12:04:11 crc kubenswrapper[4909]: E0202 12:04:11.148761 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3] : Timed out while waiting for systemd to remove kubepods-besteffort-pod9cf7d8fd_a9dc_4523_a82c_d1e4ccff67b3.slice" pod="openstack/swift-ring-rebalance-p2pcl" podUID="9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3" Feb 02 12:04:11 crc kubenswrapper[4909]: I0202 12:04:11.294120 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-p2pcl" Feb 02 12:04:11 crc kubenswrapper[4909]: I0202 12:04:11.336898 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-p2pcl"] Feb 02 12:04:11 crc kubenswrapper[4909]: I0202 12:04:11.336954 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-p2pcl"] Feb 02 12:04:11 crc kubenswrapper[4909]: I0202 12:04:11.642838 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-h4tm9" Feb 02 12:04:11 crc kubenswrapper[4909]: I0202 12:04:11.783776 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-etc-machine-id\") pod \"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\" (UID: \"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\") " Feb 02 12:04:11 crc kubenswrapper[4909]: I0202 12:04:11.783864 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-db-sync-config-data\") pod \"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\" (UID: \"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\") " Feb 02 12:04:11 crc kubenswrapper[4909]: I0202 12:04:11.783923 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "312db841-556a-4d31-b5c0-e2d4dc5cf3e4" (UID: "312db841-556a-4d31-b5c0-e2d4dc5cf3e4"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 12:04:11 crc kubenswrapper[4909]: I0202 12:04:11.783969 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd6bk\" (UniqueName: \"kubernetes.io/projected/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-kube-api-access-sd6bk\") pod \"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\" (UID: \"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\") " Feb 02 12:04:11 crc kubenswrapper[4909]: I0202 12:04:11.784022 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-combined-ca-bundle\") pod \"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\" (UID: \"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\") " Feb 02 12:04:11 crc kubenswrapper[4909]: I0202 12:04:11.784053 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-scripts\") pod \"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\" (UID: \"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\") " Feb 02 12:04:11 crc kubenswrapper[4909]: I0202 12:04:11.784087 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-config-data\") pod \"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\" (UID: \"312db841-556a-4d31-b5c0-e2d4dc5cf3e4\") " Feb 02 12:04:11 crc kubenswrapper[4909]: I0202 12:04:11.784538 4909 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:11 crc kubenswrapper[4909]: I0202 12:04:11.789102 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") 
pod "312db841-556a-4d31-b5c0-e2d4dc5cf3e4" (UID: "312db841-556a-4d31-b5c0-e2d4dc5cf3e4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:04:11 crc kubenswrapper[4909]: I0202 12:04:11.789415 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-scripts" (OuterVolumeSpecName: "scripts") pod "312db841-556a-4d31-b5c0-e2d4dc5cf3e4" (UID: "312db841-556a-4d31-b5c0-e2d4dc5cf3e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:04:11 crc kubenswrapper[4909]: I0202 12:04:11.796621 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-kube-api-access-sd6bk" (OuterVolumeSpecName: "kube-api-access-sd6bk") pod "312db841-556a-4d31-b5c0-e2d4dc5cf3e4" (UID: "312db841-556a-4d31-b5c0-e2d4dc5cf3e4"). InnerVolumeSpecName "kube-api-access-sd6bk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:04:11 crc kubenswrapper[4909]: I0202 12:04:11.806313 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "312db841-556a-4d31-b5c0-e2d4dc5cf3e4" (UID: "312db841-556a-4d31-b5c0-e2d4dc5cf3e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:04:11 crc kubenswrapper[4909]: I0202 12:04:11.827459 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-config-data" (OuterVolumeSpecName: "config-data") pod "312db841-556a-4d31-b5c0-e2d4dc5cf3e4" (UID: "312db841-556a-4d31-b5c0-e2d4dc5cf3e4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:04:11 crc kubenswrapper[4909]: I0202 12:04:11.886078 4909 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:11 crc kubenswrapper[4909]: I0202 12:04:11.886119 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd6bk\" (UniqueName: \"kubernetes.io/projected/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-kube-api-access-sd6bk\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:11 crc kubenswrapper[4909]: I0202 12:04:11.886129 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:11 crc kubenswrapper[4909]: I0202 12:04:11.886137 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:11 crc kubenswrapper[4909]: I0202 12:04:11.886146 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/312db841-556a-4d31-b5c0-e2d4dc5cf3e4-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.303233 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-h4tm9" event={"ID":"312db841-556a-4d31-b5c0-e2d4dc5cf3e4","Type":"ContainerDied","Data":"b9bd2255a0f847f4cea69096c753c5c36d9fe3475026de40abf6a9e5972bef12"} Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.303505 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9bd2255a0f847f4cea69096c753c5c36d9fe3475026de40abf6a9e5972bef12" Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.303278 4909 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-h4tm9" Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.682936 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fb6955cd5-q5rkf"] Feb 02 12:04:12 crc kubenswrapper[4909]: E0202 12:04:12.683401 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="312db841-556a-4d31-b5c0-e2d4dc5cf3e4" containerName="cinder-db-sync" Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.683419 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="312db841-556a-4d31-b5c0-e2d4dc5cf3e4" containerName="cinder-db-sync" Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.683642 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="312db841-556a-4d31-b5c0-e2d4dc5cf3e4" containerName="cinder-db-sync" Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.684848 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fb6955cd5-q5rkf" Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.707668 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fb6955cd5-q5rkf"] Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.806262 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90545462-3397-4592-9bc2-0ff7471e7791-dns-svc\") pod \"dnsmasq-dns-fb6955cd5-q5rkf\" (UID: \"90545462-3397-4592-9bc2-0ff7471e7791\") " pod="openstack/dnsmasq-dns-fb6955cd5-q5rkf" Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.806339 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90545462-3397-4592-9bc2-0ff7471e7791-ovsdbserver-sb\") pod \"dnsmasq-dns-fb6955cd5-q5rkf\" (UID: \"90545462-3397-4592-9bc2-0ff7471e7791\") " pod="openstack/dnsmasq-dns-fb6955cd5-q5rkf" 
Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.806387 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll44r\" (UniqueName: \"kubernetes.io/projected/90545462-3397-4592-9bc2-0ff7471e7791-kube-api-access-ll44r\") pod \"dnsmasq-dns-fb6955cd5-q5rkf\" (UID: \"90545462-3397-4592-9bc2-0ff7471e7791\") " pod="openstack/dnsmasq-dns-fb6955cd5-q5rkf" Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.806445 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90545462-3397-4592-9bc2-0ff7471e7791-ovsdbserver-nb\") pod \"dnsmasq-dns-fb6955cd5-q5rkf\" (UID: \"90545462-3397-4592-9bc2-0ff7471e7791\") " pod="openstack/dnsmasq-dns-fb6955cd5-q5rkf" Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.806565 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90545462-3397-4592-9bc2-0ff7471e7791-config\") pod \"dnsmasq-dns-fb6955cd5-q5rkf\" (UID: \"90545462-3397-4592-9bc2-0ff7471e7791\") " pod="openstack/dnsmasq-dns-fb6955cd5-q5rkf" Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.823295 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.825041 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.830360 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.830615 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.834199 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-qjzr8" Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.834861 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.838293 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.907930 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90545462-3397-4592-9bc2-0ff7471e7791-ovsdbserver-sb\") pod \"dnsmasq-dns-fb6955cd5-q5rkf\" (UID: \"90545462-3397-4592-9bc2-0ff7471e7791\") " pod="openstack/dnsmasq-dns-fb6955cd5-q5rkf" Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.907974 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll44r\" (UniqueName: \"kubernetes.io/projected/90545462-3397-4592-9bc2-0ff7471e7791-kube-api-access-ll44r\") pod \"dnsmasq-dns-fb6955cd5-q5rkf\" (UID: \"90545462-3397-4592-9bc2-0ff7471e7791\") " pod="openstack/dnsmasq-dns-fb6955cd5-q5rkf" Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.908013 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90545462-3397-4592-9bc2-0ff7471e7791-ovsdbserver-nb\") pod \"dnsmasq-dns-fb6955cd5-q5rkf\" (UID: \"90545462-3397-4592-9bc2-0ff7471e7791\") " 
pod="openstack/dnsmasq-dns-fb6955cd5-q5rkf" Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.908097 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90545462-3397-4592-9bc2-0ff7471e7791-config\") pod \"dnsmasq-dns-fb6955cd5-q5rkf\" (UID: \"90545462-3397-4592-9bc2-0ff7471e7791\") " pod="openstack/dnsmasq-dns-fb6955cd5-q5rkf" Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.908146 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90545462-3397-4592-9bc2-0ff7471e7791-dns-svc\") pod \"dnsmasq-dns-fb6955cd5-q5rkf\" (UID: \"90545462-3397-4592-9bc2-0ff7471e7791\") " pod="openstack/dnsmasq-dns-fb6955cd5-q5rkf" Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.909058 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90545462-3397-4592-9bc2-0ff7471e7791-dns-svc\") pod \"dnsmasq-dns-fb6955cd5-q5rkf\" (UID: \"90545462-3397-4592-9bc2-0ff7471e7791\") " pod="openstack/dnsmasq-dns-fb6955cd5-q5rkf" Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.909600 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90545462-3397-4592-9bc2-0ff7471e7791-ovsdbserver-sb\") pod \"dnsmasq-dns-fb6955cd5-q5rkf\" (UID: \"90545462-3397-4592-9bc2-0ff7471e7791\") " pod="openstack/dnsmasq-dns-fb6955cd5-q5rkf" Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.910355 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90545462-3397-4592-9bc2-0ff7471e7791-ovsdbserver-nb\") pod \"dnsmasq-dns-fb6955cd5-q5rkf\" (UID: \"90545462-3397-4592-9bc2-0ff7471e7791\") " pod="openstack/dnsmasq-dns-fb6955cd5-q5rkf" Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.910484 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90545462-3397-4592-9bc2-0ff7471e7791-config\") pod \"dnsmasq-dns-fb6955cd5-q5rkf\" (UID: \"90545462-3397-4592-9bc2-0ff7471e7791\") " pod="openstack/dnsmasq-dns-fb6955cd5-q5rkf" Feb 02 12:04:12 crc kubenswrapper[4909]: I0202 12:04:12.928227 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll44r\" (UniqueName: \"kubernetes.io/projected/90545462-3397-4592-9bc2-0ff7471e7791-kube-api-access-ll44r\") pod \"dnsmasq-dns-fb6955cd5-q5rkf\" (UID: \"90545462-3397-4592-9bc2-0ff7471e7791\") " pod="openstack/dnsmasq-dns-fb6955cd5-q5rkf" Feb 02 12:04:13 crc kubenswrapper[4909]: I0202 12:04:13.009850 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9102716d-702e-468b-ab36-2fb5fc743121-scripts\") pod \"cinder-api-0\" (UID: \"9102716d-702e-468b-ab36-2fb5fc743121\") " pod="openstack/cinder-api-0" Feb 02 12:04:13 crc kubenswrapper[4909]: I0202 12:04:13.009900 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9102716d-702e-468b-ab36-2fb5fc743121-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9102716d-702e-468b-ab36-2fb5fc743121\") " pod="openstack/cinder-api-0" Feb 02 12:04:13 crc kubenswrapper[4909]: I0202 12:04:13.009928 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9102716d-702e-468b-ab36-2fb5fc743121-logs\") pod \"cinder-api-0\" (UID: \"9102716d-702e-468b-ab36-2fb5fc743121\") " pod="openstack/cinder-api-0" Feb 02 12:04:13 crc kubenswrapper[4909]: I0202 12:04:13.009956 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/9102716d-702e-468b-ab36-2fb5fc743121-config-data-custom\") pod \"cinder-api-0\" (UID: \"9102716d-702e-468b-ab36-2fb5fc743121\") " pod="openstack/cinder-api-0" Feb 02 12:04:13 crc kubenswrapper[4909]: I0202 12:04:13.010022 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvshf\" (UniqueName: \"kubernetes.io/projected/9102716d-702e-468b-ab36-2fb5fc743121-kube-api-access-vvshf\") pod \"cinder-api-0\" (UID: \"9102716d-702e-468b-ab36-2fb5fc743121\") " pod="openstack/cinder-api-0" Feb 02 12:04:13 crc kubenswrapper[4909]: I0202 12:04:13.010040 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9102716d-702e-468b-ab36-2fb5fc743121-config-data\") pod \"cinder-api-0\" (UID: \"9102716d-702e-468b-ab36-2fb5fc743121\") " pod="openstack/cinder-api-0" Feb 02 12:04:13 crc kubenswrapper[4909]: I0202 12:04:13.010096 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9102716d-702e-468b-ab36-2fb5fc743121-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9102716d-702e-468b-ab36-2fb5fc743121\") " pod="openstack/cinder-api-0" Feb 02 12:04:13 crc kubenswrapper[4909]: I0202 12:04:13.016515 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fb6955cd5-q5rkf" Feb 02 12:04:13 crc kubenswrapper[4909]: I0202 12:04:13.045122 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3" path="/var/lib/kubelet/pods/9cf7d8fd-a9dc-4523-a82c-d1e4ccff67b3/volumes" Feb 02 12:04:13 crc kubenswrapper[4909]: I0202 12:04:13.111217 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9102716d-702e-468b-ab36-2fb5fc743121-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9102716d-702e-468b-ab36-2fb5fc743121\") " pod="openstack/cinder-api-0" Feb 02 12:04:13 crc kubenswrapper[4909]: I0202 12:04:13.111317 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9102716d-702e-468b-ab36-2fb5fc743121-scripts\") pod \"cinder-api-0\" (UID: \"9102716d-702e-468b-ab36-2fb5fc743121\") " pod="openstack/cinder-api-0" Feb 02 12:04:13 crc kubenswrapper[4909]: I0202 12:04:13.111342 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9102716d-702e-468b-ab36-2fb5fc743121-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9102716d-702e-468b-ab36-2fb5fc743121\") " pod="openstack/cinder-api-0" Feb 02 12:04:13 crc kubenswrapper[4909]: I0202 12:04:13.111364 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9102716d-702e-468b-ab36-2fb5fc743121-logs\") pod \"cinder-api-0\" (UID: \"9102716d-702e-468b-ab36-2fb5fc743121\") " pod="openstack/cinder-api-0" Feb 02 12:04:13 crc kubenswrapper[4909]: I0202 12:04:13.111391 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9102716d-702e-468b-ab36-2fb5fc743121-config-data-custom\") pod \"cinder-api-0\" 
(UID: \"9102716d-702e-468b-ab36-2fb5fc743121\") " pod="openstack/cinder-api-0" Feb 02 12:04:13 crc kubenswrapper[4909]: I0202 12:04:13.111449 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvshf\" (UniqueName: \"kubernetes.io/projected/9102716d-702e-468b-ab36-2fb5fc743121-kube-api-access-vvshf\") pod \"cinder-api-0\" (UID: \"9102716d-702e-468b-ab36-2fb5fc743121\") " pod="openstack/cinder-api-0" Feb 02 12:04:13 crc kubenswrapper[4909]: I0202 12:04:13.111471 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9102716d-702e-468b-ab36-2fb5fc743121-config-data\") pod \"cinder-api-0\" (UID: \"9102716d-702e-468b-ab36-2fb5fc743121\") " pod="openstack/cinder-api-0" Feb 02 12:04:13 crc kubenswrapper[4909]: I0202 12:04:13.111491 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9102716d-702e-468b-ab36-2fb5fc743121-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9102716d-702e-468b-ab36-2fb5fc743121\") " pod="openstack/cinder-api-0" Feb 02 12:04:13 crc kubenswrapper[4909]: I0202 12:04:13.112279 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9102716d-702e-468b-ab36-2fb5fc743121-logs\") pod \"cinder-api-0\" (UID: \"9102716d-702e-468b-ab36-2fb5fc743121\") " pod="openstack/cinder-api-0" Feb 02 12:04:13 crc kubenswrapper[4909]: I0202 12:04:13.116192 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9102716d-702e-468b-ab36-2fb5fc743121-scripts\") pod \"cinder-api-0\" (UID: \"9102716d-702e-468b-ab36-2fb5fc743121\") " pod="openstack/cinder-api-0" Feb 02 12:04:13 crc kubenswrapper[4909]: I0202 12:04:13.117988 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/9102716d-702e-468b-ab36-2fb5fc743121-config-data-custom\") pod \"cinder-api-0\" (UID: \"9102716d-702e-468b-ab36-2fb5fc743121\") " pod="openstack/cinder-api-0" Feb 02 12:04:13 crc kubenswrapper[4909]: I0202 12:04:13.120016 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9102716d-702e-468b-ab36-2fb5fc743121-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9102716d-702e-468b-ab36-2fb5fc743121\") " pod="openstack/cinder-api-0" Feb 02 12:04:13 crc kubenswrapper[4909]: I0202 12:04:13.124319 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9102716d-702e-468b-ab36-2fb5fc743121-config-data\") pod \"cinder-api-0\" (UID: \"9102716d-702e-468b-ab36-2fb5fc743121\") " pod="openstack/cinder-api-0" Feb 02 12:04:13 crc kubenswrapper[4909]: I0202 12:04:13.134636 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvshf\" (UniqueName: \"kubernetes.io/projected/9102716d-702e-468b-ab36-2fb5fc743121-kube-api-access-vvshf\") pod \"cinder-api-0\" (UID: \"9102716d-702e-468b-ab36-2fb5fc743121\") " pod="openstack/cinder-api-0" Feb 02 12:04:13 crc kubenswrapper[4909]: I0202 12:04:13.172160 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 02 12:04:13 crc kubenswrapper[4909]: I0202 12:04:13.454881 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 02 12:04:13 crc kubenswrapper[4909]: I0202 12:04:13.563579 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fb6955cd5-q5rkf"] Feb 02 12:04:13 crc kubenswrapper[4909]: W0202 12:04:13.570282 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90545462_3397_4592_9bc2_0ff7471e7791.slice/crio-6c400f5836fecf500863bab1a64dc10879c04cd8a50c5160a64abcde42837d94 WatchSource:0}: Error finding container 6c400f5836fecf500863bab1a64dc10879c04cd8a50c5160a64abcde42837d94: Status 404 returned error can't find the container with id 6c400f5836fecf500863bab1a64dc10879c04cd8a50c5160a64abcde42837d94 Feb 02 12:04:14 crc kubenswrapper[4909]: I0202 12:04:14.320591 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9102716d-702e-468b-ab36-2fb5fc743121","Type":"ContainerStarted","Data":"79139881e9e2ab9b2e1e1afbb23a17c44793704893da32466ce4daad42af766f"} Feb 02 12:04:14 crc kubenswrapper[4909]: I0202 12:04:14.320661 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9102716d-702e-468b-ab36-2fb5fc743121","Type":"ContainerStarted","Data":"8e18fc58f8dd0d2a4e358e66c9db2f26125aa1614f93dcb87ef9baf632dad1ca"} Feb 02 12:04:14 crc kubenswrapper[4909]: I0202 12:04:14.323425 4909 generic.go:334] "Generic (PLEG): container finished" podID="90545462-3397-4592-9bc2-0ff7471e7791" containerID="a00b31552aa09e21cdc253d11272915be6a7a9d85d3b07efcb97c42332d3f81e" exitCode=0 Feb 02 12:04:14 crc kubenswrapper[4909]: I0202 12:04:14.323456 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb6955cd5-q5rkf" 
event={"ID":"90545462-3397-4592-9bc2-0ff7471e7791","Type":"ContainerDied","Data":"a00b31552aa09e21cdc253d11272915be6a7a9d85d3b07efcb97c42332d3f81e"} Feb 02 12:04:14 crc kubenswrapper[4909]: I0202 12:04:14.323477 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb6955cd5-q5rkf" event={"ID":"90545462-3397-4592-9bc2-0ff7471e7791","Type":"ContainerStarted","Data":"6c400f5836fecf500863bab1a64dc10879c04cd8a50c5160a64abcde42837d94"} Feb 02 12:04:14 crc kubenswrapper[4909]: I0202 12:04:14.900488 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 02 12:04:15 crc kubenswrapper[4909]: E0202 12:04:15.047686 4909 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/ae0b499b0c423f0a8ac91d24a58b8efa9e87a0cfc34a47233a2a39dfcad2806d/diff" to get inode usage: stat /var/lib/containers/storage/overlay/ae0b499b0c423f0a8ac91d24a58b8efa9e87a0cfc34a47233a2a39dfcad2806d/diff: no such file or directory, extraDiskErr: Feb 02 12:04:15 crc kubenswrapper[4909]: I0202 12:04:15.333562 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9102716d-702e-468b-ab36-2fb5fc743121","Type":"ContainerStarted","Data":"bf1325d971204dcc6c287447541b2b118103f64572e45126bd01b1d4c8d6beb0"} Feb 02 12:04:15 crc kubenswrapper[4909]: I0202 12:04:15.333671 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 02 12:04:15 crc kubenswrapper[4909]: I0202 12:04:15.335465 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb6955cd5-q5rkf" event={"ID":"90545462-3397-4592-9bc2-0ff7471e7791","Type":"ContainerStarted","Data":"89ea69836f5cb241e726c47c6d625ce695f445b41f6eb8b04cc62e0824a8213f"} Feb 02 12:04:15 crc kubenswrapper[4909]: I0202 12:04:15.335633 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fb6955cd5-q5rkf" Feb 02 
12:04:15 crc kubenswrapper[4909]: I0202 12:04:15.353622 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.353606398 podStartE2EDuration="3.353606398s" podCreationTimestamp="2026-02-02 12:04:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:04:15.348628067 +0000 UTC m=+5581.094728812" watchObservedRunningTime="2026-02-02 12:04:15.353606398 +0000 UTC m=+5581.099707133" Feb 02 12:04:15 crc kubenswrapper[4909]: I0202 12:04:15.363386 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fb6955cd5-q5rkf" podStartSLOduration=3.363369145 podStartE2EDuration="3.363369145s" podCreationTimestamp="2026-02-02 12:04:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:04:15.362232613 +0000 UTC m=+5581.108333348" watchObservedRunningTime="2026-02-02 12:04:15.363369145 +0000 UTC m=+5581.109469880" Feb 02 12:04:16 crc kubenswrapper[4909]: I0202 12:04:16.343964 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9102716d-702e-468b-ab36-2fb5fc743121" containerName="cinder-api-log" containerID="cri-o://79139881e9e2ab9b2e1e1afbb23a17c44793704893da32466ce4daad42af766f" gracePeriod=30 Feb 02 12:04:16 crc kubenswrapper[4909]: I0202 12:04:16.344049 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9102716d-702e-468b-ab36-2fb5fc743121" containerName="cinder-api" containerID="cri-o://bf1325d971204dcc6c287447541b2b118103f64572e45126bd01b1d4c8d6beb0" gracePeriod=30 Feb 02 12:04:16 crc kubenswrapper[4909]: I0202 12:04:16.920297 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.088903 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvshf\" (UniqueName: \"kubernetes.io/projected/9102716d-702e-468b-ab36-2fb5fc743121-kube-api-access-vvshf\") pod \"9102716d-702e-468b-ab36-2fb5fc743121\" (UID: \"9102716d-702e-468b-ab36-2fb5fc743121\") " Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.089012 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9102716d-702e-468b-ab36-2fb5fc743121-config-data\") pod \"9102716d-702e-468b-ab36-2fb5fc743121\" (UID: \"9102716d-702e-468b-ab36-2fb5fc743121\") " Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.089035 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9102716d-702e-468b-ab36-2fb5fc743121-etc-machine-id\") pod \"9102716d-702e-468b-ab36-2fb5fc743121\" (UID: \"9102716d-702e-468b-ab36-2fb5fc743121\") " Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.089103 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9102716d-702e-468b-ab36-2fb5fc743121-logs\") pod \"9102716d-702e-468b-ab36-2fb5fc743121\" (UID: \"9102716d-702e-468b-ab36-2fb5fc743121\") " Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.089174 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9102716d-702e-468b-ab36-2fb5fc743121-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9102716d-702e-468b-ab36-2fb5fc743121" (UID: "9102716d-702e-468b-ab36-2fb5fc743121"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.089220 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9102716d-702e-468b-ab36-2fb5fc743121-combined-ca-bundle\") pod \"9102716d-702e-468b-ab36-2fb5fc743121\" (UID: \"9102716d-702e-468b-ab36-2fb5fc743121\") " Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.089242 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9102716d-702e-468b-ab36-2fb5fc743121-scripts\") pod \"9102716d-702e-468b-ab36-2fb5fc743121\" (UID: \"9102716d-702e-468b-ab36-2fb5fc743121\") " Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.089265 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9102716d-702e-468b-ab36-2fb5fc743121-config-data-custom\") pod \"9102716d-702e-468b-ab36-2fb5fc743121\" (UID: \"9102716d-702e-468b-ab36-2fb5fc743121\") " Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.089487 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9102716d-702e-468b-ab36-2fb5fc743121-logs" (OuterVolumeSpecName: "logs") pod "9102716d-702e-468b-ab36-2fb5fc743121" (UID: "9102716d-702e-468b-ab36-2fb5fc743121"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.089983 4909 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9102716d-702e-468b-ab36-2fb5fc743121-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.090008 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9102716d-702e-468b-ab36-2fb5fc743121-logs\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.094984 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9102716d-702e-468b-ab36-2fb5fc743121-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9102716d-702e-468b-ab36-2fb5fc743121" (UID: "9102716d-702e-468b-ab36-2fb5fc743121"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.095367 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9102716d-702e-468b-ab36-2fb5fc743121-scripts" (OuterVolumeSpecName: "scripts") pod "9102716d-702e-468b-ab36-2fb5fc743121" (UID: "9102716d-702e-468b-ab36-2fb5fc743121"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.096501 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9102716d-702e-468b-ab36-2fb5fc743121-kube-api-access-vvshf" (OuterVolumeSpecName: "kube-api-access-vvshf") pod "9102716d-702e-468b-ab36-2fb5fc743121" (UID: "9102716d-702e-468b-ab36-2fb5fc743121"). InnerVolumeSpecName "kube-api-access-vvshf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.118197 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9102716d-702e-468b-ab36-2fb5fc743121-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9102716d-702e-468b-ab36-2fb5fc743121" (UID: "9102716d-702e-468b-ab36-2fb5fc743121"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.144687 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9102716d-702e-468b-ab36-2fb5fc743121-config-data" (OuterVolumeSpecName: "config-data") pod "9102716d-702e-468b-ab36-2fb5fc743121" (UID: "9102716d-702e-468b-ab36-2fb5fc743121"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.192003 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvshf\" (UniqueName: \"kubernetes.io/projected/9102716d-702e-468b-ab36-2fb5fc743121-kube-api-access-vvshf\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.192037 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9102716d-702e-468b-ab36-2fb5fc743121-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.192046 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9102716d-702e-468b-ab36-2fb5fc743121-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.192054 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9102716d-702e-468b-ab36-2fb5fc743121-scripts\") on node \"crc\" DevicePath \"\"" Feb 
02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.192063 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9102716d-702e-468b-ab36-2fb5fc743121-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.353585 4909 generic.go:334] "Generic (PLEG): container finished" podID="9102716d-702e-468b-ab36-2fb5fc743121" containerID="bf1325d971204dcc6c287447541b2b118103f64572e45126bd01b1d4c8d6beb0" exitCode=0 Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.353620 4909 generic.go:334] "Generic (PLEG): container finished" podID="9102716d-702e-468b-ab36-2fb5fc743121" containerID="79139881e9e2ab9b2e1e1afbb23a17c44793704893da32466ce4daad42af766f" exitCode=143 Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.353639 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.353645 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9102716d-702e-468b-ab36-2fb5fc743121","Type":"ContainerDied","Data":"bf1325d971204dcc6c287447541b2b118103f64572e45126bd01b1d4c8d6beb0"} Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.353693 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9102716d-702e-468b-ab36-2fb5fc743121","Type":"ContainerDied","Data":"79139881e9e2ab9b2e1e1afbb23a17c44793704893da32466ce4daad42af766f"} Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.353706 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9102716d-702e-468b-ab36-2fb5fc743121","Type":"ContainerDied","Data":"8e18fc58f8dd0d2a4e358e66c9db2f26125aa1614f93dcb87ef9baf632dad1ca"} Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.353716 4909 scope.go:117] "RemoveContainer" 
containerID="bf1325d971204dcc6c287447541b2b118103f64572e45126bd01b1d4c8d6beb0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.373213 4909 scope.go:117] "RemoveContainer" containerID="79139881e9e2ab9b2e1e1afbb23a17c44793704893da32466ce4daad42af766f" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.384157 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.394284 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.400066 4909 scope.go:117] "RemoveContainer" containerID="bf1325d971204dcc6c287447541b2b118103f64572e45126bd01b1d4c8d6beb0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.411327 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 02 12:04:17 crc kubenswrapper[4909]: E0202 12:04:17.412177 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9102716d-702e-468b-ab36-2fb5fc743121" containerName="cinder-api" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.412292 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9102716d-702e-468b-ab36-2fb5fc743121" containerName="cinder-api" Feb 02 12:04:17 crc kubenswrapper[4909]: E0202 12:04:17.421386 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9102716d-702e-468b-ab36-2fb5fc743121" containerName="cinder-api-log" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.421676 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9102716d-702e-468b-ab36-2fb5fc743121" containerName="cinder-api-log" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.422325 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9102716d-702e-468b-ab36-2fb5fc743121" containerName="cinder-api" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.425892 4909 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9102716d-702e-468b-ab36-2fb5fc743121" containerName="cinder-api-log" Feb 02 12:04:17 crc kubenswrapper[4909]: E0202 12:04:17.420033 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf1325d971204dcc6c287447541b2b118103f64572e45126bd01b1d4c8d6beb0\": container with ID starting with bf1325d971204dcc6c287447541b2b118103f64572e45126bd01b1d4c8d6beb0 not found: ID does not exist" containerID="bf1325d971204dcc6c287447541b2b118103f64572e45126bd01b1d4c8d6beb0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.426404 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf1325d971204dcc6c287447541b2b118103f64572e45126bd01b1d4c8d6beb0"} err="failed to get container status \"bf1325d971204dcc6c287447541b2b118103f64572e45126bd01b1d4c8d6beb0\": rpc error: code = NotFound desc = could not find container \"bf1325d971204dcc6c287447541b2b118103f64572e45126bd01b1d4c8d6beb0\": container with ID starting with bf1325d971204dcc6c287447541b2b118103f64572e45126bd01b1d4c8d6beb0 not found: ID does not exist" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.426457 4909 scope.go:117] "RemoveContainer" containerID="79139881e9e2ab9b2e1e1afbb23a17c44793704893da32466ce4daad42af766f" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.427401 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: E0202 12:04:17.431936 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79139881e9e2ab9b2e1e1afbb23a17c44793704893da32466ce4daad42af766f\": container with ID starting with 79139881e9e2ab9b2e1e1afbb23a17c44793704893da32466ce4daad42af766f not found: ID does not exist" containerID="79139881e9e2ab9b2e1e1afbb23a17c44793704893da32466ce4daad42af766f" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.432054 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79139881e9e2ab9b2e1e1afbb23a17c44793704893da32466ce4daad42af766f"} err="failed to get container status \"79139881e9e2ab9b2e1e1afbb23a17c44793704893da32466ce4daad42af766f\": rpc error: code = NotFound desc = could not find container \"79139881e9e2ab9b2e1e1afbb23a17c44793704893da32466ce4daad42af766f\": container with ID starting with 79139881e9e2ab9b2e1e1afbb23a17c44793704893da32466ce4daad42af766f not found: ID does not exist" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.432145 4909 scope.go:117] "RemoveContainer" containerID="bf1325d971204dcc6c287447541b2b118103f64572e45126bd01b1d4c8d6beb0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.432853 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.432983 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-qjzr8" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.433673 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf1325d971204dcc6c287447541b2b118103f64572e45126bd01b1d4c8d6beb0"} err="failed to get container status \"bf1325d971204dcc6c287447541b2b118103f64572e45126bd01b1d4c8d6beb0\": rpc error: code = NotFound 
desc = could not find container \"bf1325d971204dcc6c287447541b2b118103f64572e45126bd01b1d4c8d6beb0\": container with ID starting with bf1325d971204dcc6c287447541b2b118103f64572e45126bd01b1d4c8d6beb0 not found: ID does not exist" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.433901 4909 scope.go:117] "RemoveContainer" containerID="79139881e9e2ab9b2e1e1afbb23a17c44793704893da32466ce4daad42af766f" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.433032 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.434486 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79139881e9e2ab9b2e1e1afbb23a17c44793704893da32466ce4daad42af766f"} err="failed to get container status \"79139881e9e2ab9b2e1e1afbb23a17c44793704893da32466ce4daad42af766f\": rpc error: code = NotFound desc = could not find container \"79139881e9e2ab9b2e1e1afbb23a17c44793704893da32466ce4daad42af766f\": container with ID starting with 79139881e9e2ab9b2e1e1afbb23a17c44793704893da32466ce4daad42af766f not found: ID does not exist" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.433190 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.433255 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.433275 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.436995 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.599501 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/cca02bdf-3908-4c1c-ab6c-883bf36a7121-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.599571 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.599595 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.599609 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cca02bdf-3908-4c1c-ab6c-883bf36a7121-logs\") pod \"cinder-api-0\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.599632 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jc7t\" (UniqueName: \"kubernetes.io/projected/cca02bdf-3908-4c1c-ab6c-883bf36a7121-kube-api-access-4jc7t\") pod \"cinder-api-0\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.599646 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-scripts\") pod \"cinder-api-0\" (UID: 
\"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.599669 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-config-data\") pod \"cinder-api-0\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.599682 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-public-tls-certs\") pod \"cinder-api-0\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.599704 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-config-data-custom\") pod \"cinder-api-0\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.701318 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cca02bdf-3908-4c1c-ab6c-883bf36a7121-logs\") pod \"cinder-api-0\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.701388 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jc7t\" (UniqueName: \"kubernetes.io/projected/cca02bdf-3908-4c1c-ab6c-883bf36a7121-kube-api-access-4jc7t\") pod \"cinder-api-0\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.701418 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-scripts\") pod \"cinder-api-0\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.701450 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-config-data\") pod \"cinder-api-0\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.701467 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-public-tls-certs\") pod \"cinder-api-0\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.701492 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-config-data-custom\") pod \"cinder-api-0\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.701585 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cca02bdf-3908-4c1c-ab6c-883bf36a7121-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.701628 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-internal-tls-certs\") pod \"cinder-api-0\" (UID: 
\"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.701649 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.701876 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cca02bdf-3908-4c1c-ab6c-883bf36a7121-etc-machine-id\") pod \"cinder-api-0\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.701896 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cca02bdf-3908-4c1c-ab6c-883bf36a7121-logs\") pod \"cinder-api-0\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.706082 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-scripts\") pod \"cinder-api-0\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.706194 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-config-data-custom\") pod \"cinder-api-0\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.706384 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-public-tls-certs\") pod \"cinder-api-0\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.707043 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-config-data\") pod \"cinder-api-0\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.707920 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.709046 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.716439 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jc7t\" (UniqueName: \"kubernetes.io/projected/cca02bdf-3908-4c1c-ab6c-883bf36a7121-kube-api-access-4jc7t\") pod \"cinder-api-0\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " pod="openstack/cinder-api-0" Feb 02 12:04:17 crc kubenswrapper[4909]: I0202 12:04:17.801601 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 02 12:04:18 crc kubenswrapper[4909]: I0202 12:04:18.205291 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 02 12:04:18 crc kubenswrapper[4909]: I0202 12:04:18.363912 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cca02bdf-3908-4c1c-ab6c-883bf36a7121","Type":"ContainerStarted","Data":"13512373eb68d8a6a981a6fc69427f2b2f58839c4079990bfe3c30c3ae8aa169"} Feb 02 12:04:19 crc kubenswrapper[4909]: I0202 12:04:19.029745 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9102716d-702e-468b-ab36-2fb5fc743121" path="/var/lib/kubelet/pods/9102716d-702e-468b-ab36-2fb5fc743121/volumes" Feb 02 12:04:19 crc kubenswrapper[4909]: I0202 12:04:19.375371 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cca02bdf-3908-4c1c-ab6c-883bf36a7121","Type":"ContainerStarted","Data":"bf5f496992e0d7d90f998429ae0e22ff69cee3b3c4c24914eda175df4b09d77f"} Feb 02 12:04:19 crc kubenswrapper[4909]: I0202 12:04:19.375411 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cca02bdf-3908-4c1c-ab6c-883bf36a7121","Type":"ContainerStarted","Data":"ea5b28406fab0f2341189ee7c314f973f618e45bba5563c86afcf540969de9de"} Feb 02 12:04:19 crc kubenswrapper[4909]: I0202 12:04:19.375505 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 02 12:04:19 crc kubenswrapper[4909]: I0202 12:04:19.410553 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.410527967 podStartE2EDuration="2.410527967s" podCreationTimestamp="2026-02-02 12:04:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:04:19.404170977 +0000 UTC m=+5585.150271712" 
watchObservedRunningTime="2026-02-02 12:04:19.410527967 +0000 UTC m=+5585.156628702" Feb 02 12:04:21 crc kubenswrapper[4909]: I0202 12:04:21.030661 4909 scope.go:117] "RemoveContainer" containerID="48c29c1509e33cdd6317cb7dc255f44fde9a51dcbf54f58d7f1d92c391208e1b" Feb 02 12:04:23 crc kubenswrapper[4909]: I0202 12:04:23.029243 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fb6955cd5-q5rkf" Feb 02 12:04:23 crc kubenswrapper[4909]: I0202 12:04:23.096532 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b4bb8f457-t96qt"] Feb 02 12:04:23 crc kubenswrapper[4909]: I0202 12:04:23.102873 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b4bb8f457-t96qt" podUID="cc55cc85-5418-4ef7-b8bb-dbedc64b0602" containerName="dnsmasq-dns" containerID="cri-o://1e6e73759c5ac0642cea740e6ee22f95884d46c2033397b037087f8a451dbedd" gracePeriod=10 Feb 02 12:04:23 crc kubenswrapper[4909]: I0202 12:04:23.410240 4909 generic.go:334] "Generic (PLEG): container finished" podID="cc55cc85-5418-4ef7-b8bb-dbedc64b0602" containerID="1e6e73759c5ac0642cea740e6ee22f95884d46c2033397b037087f8a451dbedd" exitCode=0 Feb 02 12:04:23 crc kubenswrapper[4909]: I0202 12:04:23.410512 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b4bb8f457-t96qt" event={"ID":"cc55cc85-5418-4ef7-b8bb-dbedc64b0602","Type":"ContainerDied","Data":"1e6e73759c5ac0642cea740e6ee22f95884d46c2033397b037087f8a451dbedd"} Feb 02 12:04:23 crc kubenswrapper[4909]: I0202 12:04:23.586166 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b4bb8f457-t96qt" Feb 02 12:04:23 crc kubenswrapper[4909]: I0202 12:04:23.719128 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-dns-svc\") pod \"cc55cc85-5418-4ef7-b8bb-dbedc64b0602\" (UID: \"cc55cc85-5418-4ef7-b8bb-dbedc64b0602\") " Feb 02 12:04:23 crc kubenswrapper[4909]: I0202 12:04:23.719200 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-ovsdbserver-nb\") pod \"cc55cc85-5418-4ef7-b8bb-dbedc64b0602\" (UID: \"cc55cc85-5418-4ef7-b8bb-dbedc64b0602\") " Feb 02 12:04:23 crc kubenswrapper[4909]: I0202 12:04:23.719220 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-ovsdbserver-sb\") pod \"cc55cc85-5418-4ef7-b8bb-dbedc64b0602\" (UID: \"cc55cc85-5418-4ef7-b8bb-dbedc64b0602\") " Feb 02 12:04:23 crc kubenswrapper[4909]: I0202 12:04:23.719353 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-config\") pod \"cc55cc85-5418-4ef7-b8bb-dbedc64b0602\" (UID: \"cc55cc85-5418-4ef7-b8bb-dbedc64b0602\") " Feb 02 12:04:23 crc kubenswrapper[4909]: I0202 12:04:23.719414 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whz4b\" (UniqueName: \"kubernetes.io/projected/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-kube-api-access-whz4b\") pod \"cc55cc85-5418-4ef7-b8bb-dbedc64b0602\" (UID: \"cc55cc85-5418-4ef7-b8bb-dbedc64b0602\") " Feb 02 12:04:23 crc kubenswrapper[4909]: I0202 12:04:23.724230 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-kube-api-access-whz4b" (OuterVolumeSpecName: "kube-api-access-whz4b") pod "cc55cc85-5418-4ef7-b8bb-dbedc64b0602" (UID: "cc55cc85-5418-4ef7-b8bb-dbedc64b0602"). InnerVolumeSpecName "kube-api-access-whz4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:04:23 crc kubenswrapper[4909]: I0202 12:04:23.775621 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cc55cc85-5418-4ef7-b8bb-dbedc64b0602" (UID: "cc55cc85-5418-4ef7-b8bb-dbedc64b0602"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:04:23 crc kubenswrapper[4909]: I0202 12:04:23.781713 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cc55cc85-5418-4ef7-b8bb-dbedc64b0602" (UID: "cc55cc85-5418-4ef7-b8bb-dbedc64b0602"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:04:23 crc kubenswrapper[4909]: I0202 12:04:23.792492 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cc55cc85-5418-4ef7-b8bb-dbedc64b0602" (UID: "cc55cc85-5418-4ef7-b8bb-dbedc64b0602"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:04:23 crc kubenswrapper[4909]: I0202 12:04:23.809315 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-config" (OuterVolumeSpecName: "config") pod "cc55cc85-5418-4ef7-b8bb-dbedc64b0602" (UID: "cc55cc85-5418-4ef7-b8bb-dbedc64b0602"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:04:23 crc kubenswrapper[4909]: I0202 12:04:23.821425 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-config\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:23 crc kubenswrapper[4909]: I0202 12:04:23.821457 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whz4b\" (UniqueName: \"kubernetes.io/projected/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-kube-api-access-whz4b\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:23 crc kubenswrapper[4909]: I0202 12:04:23.821468 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:23 crc kubenswrapper[4909]: I0202 12:04:23.821478 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:23 crc kubenswrapper[4909]: I0202 12:04:23.821487 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc55cc85-5418-4ef7-b8bb-dbedc64b0602-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:24 crc kubenswrapper[4909]: I0202 12:04:24.419789 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b4bb8f457-t96qt" event={"ID":"cc55cc85-5418-4ef7-b8bb-dbedc64b0602","Type":"ContainerDied","Data":"907afc151385fe2792f28605691f5d65165f6c8f01984bfa66b6a3f750a39127"} Feb 02 12:04:24 crc kubenswrapper[4909]: I0202 12:04:24.420678 4909 scope.go:117] "RemoveContainer" containerID="1e6e73759c5ac0642cea740e6ee22f95884d46c2033397b037087f8a451dbedd" Feb 02 12:04:24 crc kubenswrapper[4909]: I0202 12:04:24.419868 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b4bb8f457-t96qt" Feb 02 12:04:24 crc kubenswrapper[4909]: I0202 12:04:24.442069 4909 scope.go:117] "RemoveContainer" containerID="9491ad23d6e2cac7b69411b59eeed6d0c6c211d83f22231a95974a671ae336de" Feb 02 12:04:24 crc kubenswrapper[4909]: I0202 12:04:24.458314 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b4bb8f457-t96qt"] Feb 02 12:04:24 crc kubenswrapper[4909]: I0202 12:04:24.466844 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b4bb8f457-t96qt"] Feb 02 12:04:25 crc kubenswrapper[4909]: I0202 12:04:25.041034 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc55cc85-5418-4ef7-b8bb-dbedc64b0602" path="/var/lib/kubelet/pods/cc55cc85-5418-4ef7-b8bb-dbedc64b0602/volumes" Feb 02 12:04:29 crc kubenswrapper[4909]: I0202 12:04:29.595926 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 02 12:04:44 crc kubenswrapper[4909]: I0202 12:04:44.998224 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 12:04:44 crc kubenswrapper[4909]: E0202 12:04:44.999122 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc55cc85-5418-4ef7-b8bb-dbedc64b0602" containerName="dnsmasq-dns" Feb 02 12:04:44 crc kubenswrapper[4909]: I0202 12:04:44.999138 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc55cc85-5418-4ef7-b8bb-dbedc64b0602" containerName="dnsmasq-dns" Feb 02 12:04:44 crc kubenswrapper[4909]: E0202 12:04:44.999168 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc55cc85-5418-4ef7-b8bb-dbedc64b0602" containerName="init" Feb 02 12:04:44 crc kubenswrapper[4909]: I0202 12:04:44.999174 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc55cc85-5418-4ef7-b8bb-dbedc64b0602" containerName="init" Feb 02 12:04:44 crc kubenswrapper[4909]: I0202 12:04:44.999327 4909 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="cc55cc85-5418-4ef7-b8bb-dbedc64b0602" containerName="dnsmasq-dns" Feb 02 12:04:45 crc kubenswrapper[4909]: I0202 12:04:45.000194 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 02 12:04:45 crc kubenswrapper[4909]: I0202 12:04:45.003482 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 02 12:04:45 crc kubenswrapper[4909]: I0202 12:04:45.012829 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 12:04:45 crc kubenswrapper[4909]: I0202 12:04:45.119525 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:45 crc kubenswrapper[4909]: I0202 12:04:45.119603 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftnt4\" (UniqueName: \"kubernetes.io/projected/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-kube-api-access-ftnt4\") pod \"cinder-scheduler-0\" (UID: \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:45 crc kubenswrapper[4909]: I0202 12:04:45.119630 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:45 crc kubenswrapper[4909]: I0202 12:04:45.119731 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:45 crc kubenswrapper[4909]: I0202 12:04:45.119909 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-scripts\") pod \"cinder-scheduler-0\" (UID: \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:45 crc kubenswrapper[4909]: I0202 12:04:45.119949 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-config-data\") pod \"cinder-scheduler-0\" (UID: \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:45 crc kubenswrapper[4909]: I0202 12:04:45.221686 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-scripts\") pod \"cinder-scheduler-0\" (UID: \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:45 crc kubenswrapper[4909]: I0202 12:04:45.221735 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-config-data\") pod \"cinder-scheduler-0\" (UID: \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:45 crc kubenswrapper[4909]: I0202 12:04:45.221773 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\") " 
pod="openstack/cinder-scheduler-0" Feb 02 12:04:45 crc kubenswrapper[4909]: I0202 12:04:45.221799 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftnt4\" (UniqueName: \"kubernetes.io/projected/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-kube-api-access-ftnt4\") pod \"cinder-scheduler-0\" (UID: \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:45 crc kubenswrapper[4909]: I0202 12:04:45.221831 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:45 crc kubenswrapper[4909]: I0202 12:04:45.222094 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:45 crc kubenswrapper[4909]: I0202 12:04:45.222155 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:45 crc kubenswrapper[4909]: I0202 12:04:45.228232 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-config-data\") pod \"cinder-scheduler-0\" (UID: \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:45 crc kubenswrapper[4909]: I0202 12:04:45.229021 4909 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:45 crc kubenswrapper[4909]: I0202 12:04:45.242585 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftnt4\" (UniqueName: \"kubernetes.io/projected/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-kube-api-access-ftnt4\") pod \"cinder-scheduler-0\" (UID: \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:45 crc kubenswrapper[4909]: I0202 12:04:45.243845 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:45 crc kubenswrapper[4909]: I0202 12:04:45.246521 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-scripts\") pod \"cinder-scheduler-0\" (UID: \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:45 crc kubenswrapper[4909]: I0202 12:04:45.324090 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 02 12:04:45 crc kubenswrapper[4909]: I0202 12:04:45.762645 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 12:04:46 crc kubenswrapper[4909]: I0202 12:04:46.447214 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 02 12:04:46 crc kubenswrapper[4909]: I0202 12:04:46.448089 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="cca02bdf-3908-4c1c-ab6c-883bf36a7121" containerName="cinder-api-log" containerID="cri-o://ea5b28406fab0f2341189ee7c314f973f618e45bba5563c86afcf540969de9de" gracePeriod=30 Feb 02 12:04:46 crc kubenswrapper[4909]: I0202 12:04:46.448229 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="cca02bdf-3908-4c1c-ab6c-883bf36a7121" containerName="cinder-api" containerID="cri-o://bf5f496992e0d7d90f998429ae0e22ff69cee3b3c4c24914eda175df4b09d77f" gracePeriod=30 Feb 02 12:04:46 crc kubenswrapper[4909]: I0202 12:04:46.616570 4909 generic.go:334] "Generic (PLEG): container finished" podID="cca02bdf-3908-4c1c-ab6c-883bf36a7121" containerID="ea5b28406fab0f2341189ee7c314f973f618e45bba5563c86afcf540969de9de" exitCode=143 Feb 02 12:04:46 crc kubenswrapper[4909]: I0202 12:04:46.616650 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cca02bdf-3908-4c1c-ab6c-883bf36a7121","Type":"ContainerDied","Data":"ea5b28406fab0f2341189ee7c314f973f618e45bba5563c86afcf540969de9de"} Feb 02 12:04:46 crc kubenswrapper[4909]: I0202 12:04:46.619552 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d","Type":"ContainerStarted","Data":"9d77cfa7d6f35adf8f0c67f63879d84ec6354947efb32f430f4bac6efa65c836"} Feb 02 12:04:46 crc kubenswrapper[4909]: I0202 12:04:46.619581 4909 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d","Type":"ContainerStarted","Data":"4d0ee3ca55b175f7519a517080056b839663534876e83ee99b27d06fd1df13f6"} Feb 02 12:04:47 crc kubenswrapper[4909]: I0202 12:04:47.629854 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d","Type":"ContainerStarted","Data":"bc7d89c8424d5b11026046962ac7ebfc21cb903b4cd4f61904c3380b8fe839a6"} Feb 02 12:04:47 crc kubenswrapper[4909]: I0202 12:04:47.648281 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.648261653 podStartE2EDuration="3.648261653s" podCreationTimestamp="2026-02-02 12:04:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:04:47.646273646 +0000 UTC m=+5613.392374391" watchObservedRunningTime="2026-02-02 12:04:47.648261653 +0000 UTC m=+5613.394362398" Feb 02 12:04:49 crc kubenswrapper[4909]: I0202 12:04:49.510877 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:04:49 crc kubenswrapper[4909]: I0202 12:04:49.511527 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:04:49 crc kubenswrapper[4909]: I0202 12:04:49.601346 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" 
podUID="cca02bdf-3908-4c1c-ab6c-883bf36a7121" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.1.50:8776/healthcheck\": read tcp 10.217.0.2:41694->10.217.1.50:8776: read: connection reset by peer" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.025121 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.118928 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jc7t\" (UniqueName: \"kubernetes.io/projected/cca02bdf-3908-4c1c-ab6c-883bf36a7121-kube-api-access-4jc7t\") pod \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.118979 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-config-data\") pod \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.119012 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-scripts\") pod \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.119029 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cca02bdf-3908-4c1c-ab6c-883bf36a7121-etc-machine-id\") pod \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.119051 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/cca02bdf-3908-4c1c-ab6c-883bf36a7121-logs\") pod \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.119099 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-public-tls-certs\") pod \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.119603 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-internal-tls-certs\") pod \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.119627 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-combined-ca-bundle\") pod \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.119674 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-config-data-custom\") pod \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\" (UID: \"cca02bdf-3908-4c1c-ab6c-883bf36a7121\") " Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.119894 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cca02bdf-3908-4c1c-ab6c-883bf36a7121-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "cca02bdf-3908-4c1c-ab6c-883bf36a7121" (UID: "cca02bdf-3908-4c1c-ab6c-883bf36a7121"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.121199 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cca02bdf-3908-4c1c-ab6c-883bf36a7121-logs" (OuterVolumeSpecName: "logs") pod "cca02bdf-3908-4c1c-ab6c-883bf36a7121" (UID: "cca02bdf-3908-4c1c-ab6c-883bf36a7121"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.125954 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cca02bdf-3908-4c1c-ab6c-883bf36a7121-kube-api-access-4jc7t" (OuterVolumeSpecName: "kube-api-access-4jc7t") pod "cca02bdf-3908-4c1c-ab6c-883bf36a7121" (UID: "cca02bdf-3908-4c1c-ab6c-883bf36a7121"). InnerVolumeSpecName "kube-api-access-4jc7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.139860 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-scripts" (OuterVolumeSpecName: "scripts") pod "cca02bdf-3908-4c1c-ab6c-883bf36a7121" (UID: "cca02bdf-3908-4c1c-ab6c-883bf36a7121"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.143946 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cca02bdf-3908-4c1c-ab6c-883bf36a7121" (UID: "cca02bdf-3908-4c1c-ab6c-883bf36a7121"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.165402 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cca02bdf-3908-4c1c-ab6c-883bf36a7121" (UID: "cca02bdf-3908-4c1c-ab6c-883bf36a7121"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.185279 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cca02bdf-3908-4c1c-ab6c-883bf36a7121" (UID: "cca02bdf-3908-4c1c-ab6c-883bf36a7121"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.203737 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-config-data" (OuterVolumeSpecName: "config-data") pod "cca02bdf-3908-4c1c-ab6c-883bf36a7121" (UID: "cca02bdf-3908-4c1c-ab6c-883bf36a7121"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.205621 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cca02bdf-3908-4c1c-ab6c-883bf36a7121" (UID: "cca02bdf-3908-4c1c-ab6c-883bf36a7121"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.223039 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.223075 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jc7t\" (UniqueName: \"kubernetes.io/projected/cca02bdf-3908-4c1c-ab6c-883bf36a7121-kube-api-access-4jc7t\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.223087 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.223094 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.223103 4909 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cca02bdf-3908-4c1c-ab6c-883bf36a7121-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.223111 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cca02bdf-3908-4c1c-ab6c-883bf36a7121-logs\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.223120 4909 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.223128 4909 reconciler_common.go:293] 
"Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.223137 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cca02bdf-3908-4c1c-ab6c-883bf36a7121-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.324176 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.656242 4909 generic.go:334] "Generic (PLEG): container finished" podID="cca02bdf-3908-4c1c-ab6c-883bf36a7121" containerID="bf5f496992e0d7d90f998429ae0e22ff69cee3b3c4c24914eda175df4b09d77f" exitCode=0 Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.656292 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.656294 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cca02bdf-3908-4c1c-ab6c-883bf36a7121","Type":"ContainerDied","Data":"bf5f496992e0d7d90f998429ae0e22ff69cee3b3c4c24914eda175df4b09d77f"} Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.656414 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"cca02bdf-3908-4c1c-ab6c-883bf36a7121","Type":"ContainerDied","Data":"13512373eb68d8a6a981a6fc69427f2b2f58839c4079990bfe3c30c3ae8aa169"} Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.656436 4909 scope.go:117] "RemoveContainer" containerID="bf5f496992e0d7d90f998429ae0e22ff69cee3b3c4c24914eda175df4b09d77f" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.679311 4909 scope.go:117] "RemoveContainer" containerID="ea5b28406fab0f2341189ee7c314f973f618e45bba5563c86afcf540969de9de" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.723861 4909 scope.go:117] "RemoveContainer" containerID="bf5f496992e0d7d90f998429ae0e22ff69cee3b3c4c24914eda175df4b09d77f" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.725635 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 02 12:04:50 crc kubenswrapper[4909]: E0202 12:04:50.727199 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf5f496992e0d7d90f998429ae0e22ff69cee3b3c4c24914eda175df4b09d77f\": container with ID starting with bf5f496992e0d7d90f998429ae0e22ff69cee3b3c4c24914eda175df4b09d77f not found: ID does not exist" containerID="bf5f496992e0d7d90f998429ae0e22ff69cee3b3c4c24914eda175df4b09d77f" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.727239 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bf5f496992e0d7d90f998429ae0e22ff69cee3b3c4c24914eda175df4b09d77f"} err="failed to get container status \"bf5f496992e0d7d90f998429ae0e22ff69cee3b3c4c24914eda175df4b09d77f\": rpc error: code = NotFound desc = could not find container \"bf5f496992e0d7d90f998429ae0e22ff69cee3b3c4c24914eda175df4b09d77f\": container with ID starting with bf5f496992e0d7d90f998429ae0e22ff69cee3b3c4c24914eda175df4b09d77f not found: ID does not exist" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.727265 4909 scope.go:117] "RemoveContainer" containerID="ea5b28406fab0f2341189ee7c314f973f618e45bba5563c86afcf540969de9de" Feb 02 12:04:50 crc kubenswrapper[4909]: E0202 12:04:50.727789 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea5b28406fab0f2341189ee7c314f973f618e45bba5563c86afcf540969de9de\": container with ID starting with ea5b28406fab0f2341189ee7c314f973f618e45bba5563c86afcf540969de9de not found: ID does not exist" containerID="ea5b28406fab0f2341189ee7c314f973f618e45bba5563c86afcf540969de9de" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.727855 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea5b28406fab0f2341189ee7c314f973f618e45bba5563c86afcf540969de9de"} err="failed to get container status \"ea5b28406fab0f2341189ee7c314f973f618e45bba5563c86afcf540969de9de\": rpc error: code = NotFound desc = could not find container \"ea5b28406fab0f2341189ee7c314f973f618e45bba5563c86afcf540969de9de\": container with ID starting with ea5b28406fab0f2341189ee7c314f973f618e45bba5563c86afcf540969de9de not found: ID does not exist" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.734773 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.744523 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 02 
12:04:50 crc kubenswrapper[4909]: E0202 12:04:50.744974 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca02bdf-3908-4c1c-ab6c-883bf36a7121" containerName="cinder-api-log" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.744998 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca02bdf-3908-4c1c-ab6c-883bf36a7121" containerName="cinder-api-log" Feb 02 12:04:50 crc kubenswrapper[4909]: E0202 12:04:50.745020 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca02bdf-3908-4c1c-ab6c-883bf36a7121" containerName="cinder-api" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.745028 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca02bdf-3908-4c1c-ab6c-883bf36a7121" containerName="cinder-api" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.745213 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="cca02bdf-3908-4c1c-ab6c-883bf36a7121" containerName="cinder-api" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.745239 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="cca02bdf-3908-4c1c-ab6c-883bf36a7121" containerName="cinder-api-log" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.746112 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.751550 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.752504 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.752785 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.762011 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.934143 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eea5f68d-4b9b-45cf-b584-075ad7298647-logs\") pod \"cinder-api-0\" (UID: \"eea5f68d-4b9b-45cf-b584-075ad7298647\") " pod="openstack/cinder-api-0" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.934205 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eea5f68d-4b9b-45cf-b584-075ad7298647-public-tls-certs\") pod \"cinder-api-0\" (UID: \"eea5f68d-4b9b-45cf-b584-075ad7298647\") " pod="openstack/cinder-api-0" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.934331 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eea5f68d-4b9b-45cf-b584-075ad7298647-config-data-custom\") pod \"cinder-api-0\" (UID: \"eea5f68d-4b9b-45cf-b584-075ad7298647\") " pod="openstack/cinder-api-0" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.934376 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eea5f68d-4b9b-45cf-b584-075ad7298647-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"eea5f68d-4b9b-45cf-b584-075ad7298647\") " pod="openstack/cinder-api-0" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.934425 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eea5f68d-4b9b-45cf-b584-075ad7298647-etc-machine-id\") pod \"cinder-api-0\" (UID: \"eea5f68d-4b9b-45cf-b584-075ad7298647\") " pod="openstack/cinder-api-0" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.934447 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lpr9\" (UniqueName: \"kubernetes.io/projected/eea5f68d-4b9b-45cf-b584-075ad7298647-kube-api-access-7lpr9\") pod \"cinder-api-0\" (UID: \"eea5f68d-4b9b-45cf-b584-075ad7298647\") " pod="openstack/cinder-api-0" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.934476 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eea5f68d-4b9b-45cf-b584-075ad7298647-scripts\") pod \"cinder-api-0\" (UID: \"eea5f68d-4b9b-45cf-b584-075ad7298647\") " pod="openstack/cinder-api-0" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.934493 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea5f68d-4b9b-45cf-b584-075ad7298647-config-data\") pod \"cinder-api-0\" (UID: \"eea5f68d-4b9b-45cf-b584-075ad7298647\") " pod="openstack/cinder-api-0" Feb 02 12:04:50 crc kubenswrapper[4909]: I0202 12:04:50.934523 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eea5f68d-4b9b-45cf-b584-075ad7298647-combined-ca-bundle\") pod 
\"cinder-api-0\" (UID: \"eea5f68d-4b9b-45cf-b584-075ad7298647\") " pod="openstack/cinder-api-0" Feb 02 12:04:51 crc kubenswrapper[4909]: I0202 12:04:51.034062 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cca02bdf-3908-4c1c-ab6c-883bf36a7121" path="/var/lib/kubelet/pods/cca02bdf-3908-4c1c-ab6c-883bf36a7121/volumes" Feb 02 12:04:51 crc kubenswrapper[4909]: I0202 12:04:51.035780 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eea5f68d-4b9b-45cf-b584-075ad7298647-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"eea5f68d-4b9b-45cf-b584-075ad7298647\") " pod="openstack/cinder-api-0" Feb 02 12:04:51 crc kubenswrapper[4909]: I0202 12:04:51.035856 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eea5f68d-4b9b-45cf-b584-075ad7298647-etc-machine-id\") pod \"cinder-api-0\" (UID: \"eea5f68d-4b9b-45cf-b584-075ad7298647\") " pod="openstack/cinder-api-0" Feb 02 12:04:51 crc kubenswrapper[4909]: I0202 12:04:51.035880 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lpr9\" (UniqueName: \"kubernetes.io/projected/eea5f68d-4b9b-45cf-b584-075ad7298647-kube-api-access-7lpr9\") pod \"cinder-api-0\" (UID: \"eea5f68d-4b9b-45cf-b584-075ad7298647\") " pod="openstack/cinder-api-0" Feb 02 12:04:51 crc kubenswrapper[4909]: I0202 12:04:51.035910 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eea5f68d-4b9b-45cf-b584-075ad7298647-scripts\") pod \"cinder-api-0\" (UID: \"eea5f68d-4b9b-45cf-b584-075ad7298647\") " pod="openstack/cinder-api-0" Feb 02 12:04:51 crc kubenswrapper[4909]: I0202 12:04:51.035927 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/eea5f68d-4b9b-45cf-b584-075ad7298647-config-data\") pod \"cinder-api-0\" (UID: \"eea5f68d-4b9b-45cf-b584-075ad7298647\") " pod="openstack/cinder-api-0" Feb 02 12:04:51 crc kubenswrapper[4909]: I0202 12:04:51.035948 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eea5f68d-4b9b-45cf-b584-075ad7298647-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"eea5f68d-4b9b-45cf-b584-075ad7298647\") " pod="openstack/cinder-api-0" Feb 02 12:04:51 crc kubenswrapper[4909]: I0202 12:04:51.035951 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eea5f68d-4b9b-45cf-b584-075ad7298647-etc-machine-id\") pod \"cinder-api-0\" (UID: \"eea5f68d-4b9b-45cf-b584-075ad7298647\") " pod="openstack/cinder-api-0" Feb 02 12:04:51 crc kubenswrapper[4909]: I0202 12:04:51.035981 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eea5f68d-4b9b-45cf-b584-075ad7298647-logs\") pod \"cinder-api-0\" (UID: \"eea5f68d-4b9b-45cf-b584-075ad7298647\") " pod="openstack/cinder-api-0" Feb 02 12:04:51 crc kubenswrapper[4909]: I0202 12:04:51.035997 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eea5f68d-4b9b-45cf-b584-075ad7298647-public-tls-certs\") pod \"cinder-api-0\" (UID: \"eea5f68d-4b9b-45cf-b584-075ad7298647\") " pod="openstack/cinder-api-0" Feb 02 12:04:51 crc kubenswrapper[4909]: I0202 12:04:51.036039 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eea5f68d-4b9b-45cf-b584-075ad7298647-config-data-custom\") pod \"cinder-api-0\" (UID: \"eea5f68d-4b9b-45cf-b584-075ad7298647\") " pod="openstack/cinder-api-0" Feb 02 12:04:51 crc kubenswrapper[4909]: I0202 
12:04:51.036671 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eea5f68d-4b9b-45cf-b584-075ad7298647-logs\") pod \"cinder-api-0\" (UID: \"eea5f68d-4b9b-45cf-b584-075ad7298647\") " pod="openstack/cinder-api-0" Feb 02 12:04:51 crc kubenswrapper[4909]: I0202 12:04:51.039904 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eea5f68d-4b9b-45cf-b584-075ad7298647-scripts\") pod \"cinder-api-0\" (UID: \"eea5f68d-4b9b-45cf-b584-075ad7298647\") " pod="openstack/cinder-api-0" Feb 02 12:04:51 crc kubenswrapper[4909]: I0202 12:04:51.042057 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eea5f68d-4b9b-45cf-b584-075ad7298647-config-data-custom\") pod \"cinder-api-0\" (UID: \"eea5f68d-4b9b-45cf-b584-075ad7298647\") " pod="openstack/cinder-api-0" Feb 02 12:04:51 crc kubenswrapper[4909]: I0202 12:04:51.042694 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eea5f68d-4b9b-45cf-b584-075ad7298647-config-data\") pod \"cinder-api-0\" (UID: \"eea5f68d-4b9b-45cf-b584-075ad7298647\") " pod="openstack/cinder-api-0" Feb 02 12:04:51 crc kubenswrapper[4909]: I0202 12:04:51.043020 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eea5f68d-4b9b-45cf-b584-075ad7298647-public-tls-certs\") pod \"cinder-api-0\" (UID: \"eea5f68d-4b9b-45cf-b584-075ad7298647\") " pod="openstack/cinder-api-0" Feb 02 12:04:51 crc kubenswrapper[4909]: I0202 12:04:51.044079 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eea5f68d-4b9b-45cf-b584-075ad7298647-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"eea5f68d-4b9b-45cf-b584-075ad7298647\") " 
pod="openstack/cinder-api-0" Feb 02 12:04:51 crc kubenswrapper[4909]: I0202 12:04:51.049849 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eea5f68d-4b9b-45cf-b584-075ad7298647-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"eea5f68d-4b9b-45cf-b584-075ad7298647\") " pod="openstack/cinder-api-0" Feb 02 12:04:51 crc kubenswrapper[4909]: I0202 12:04:51.054262 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lpr9\" (UniqueName: \"kubernetes.io/projected/eea5f68d-4b9b-45cf-b584-075ad7298647-kube-api-access-7lpr9\") pod \"cinder-api-0\" (UID: \"eea5f68d-4b9b-45cf-b584-075ad7298647\") " pod="openstack/cinder-api-0" Feb 02 12:04:51 crc kubenswrapper[4909]: I0202 12:04:51.073597 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 02 12:04:51 crc kubenswrapper[4909]: I0202 12:04:51.505672 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 02 12:04:51 crc kubenswrapper[4909]: I0202 12:04:51.667492 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eea5f68d-4b9b-45cf-b584-075ad7298647","Type":"ContainerStarted","Data":"6eece48b72adb5e63b9e07c866fb9d008295be68c436478d524f289721d321ed"} Feb 02 12:04:52 crc kubenswrapper[4909]: I0202 12:04:52.683080 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eea5f68d-4b9b-45cf-b584-075ad7298647","Type":"ContainerStarted","Data":"b9e03f04c8a5cf065309e1d8032c9f1ccc93f9aca2634443ed47c855521f76c1"} Feb 02 12:04:52 crc kubenswrapper[4909]: I0202 12:04:52.683716 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eea5f68d-4b9b-45cf-b584-075ad7298647","Type":"ContainerStarted","Data":"90ca3457853c45b6d690f505ba38f069034776ed01bf0fcc2c200643a9129e5f"} Feb 02 12:04:52 crc kubenswrapper[4909]: 
I0202 12:04:52.683769 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 02 12:04:55 crc kubenswrapper[4909]: I0202 12:04:55.549663 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 02 12:04:55 crc kubenswrapper[4909]: I0202 12:04:55.580816 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.580783755 podStartE2EDuration="5.580783755s" podCreationTimestamp="2026-02-02 12:04:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:04:52.710228589 +0000 UTC m=+5618.456329324" watchObservedRunningTime="2026-02-02 12:04:55.580783755 +0000 UTC m=+5621.326884490" Feb 02 12:04:55 crc kubenswrapper[4909]: I0202 12:04:55.617600 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 12:04:55 crc kubenswrapper[4909]: I0202 12:04:55.713004 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1afb9ab5-f2d0-42a3-84a4-34cd9631e96d" containerName="cinder-scheduler" containerID="cri-o://9d77cfa7d6f35adf8f0c67f63879d84ec6354947efb32f430f4bac6efa65c836" gracePeriod=30 Feb 02 12:04:55 crc kubenswrapper[4909]: I0202 12:04:55.713349 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="1afb9ab5-f2d0-42a3-84a4-34cd9631e96d" containerName="probe" containerID="cri-o://bc7d89c8424d5b11026046962ac7ebfc21cb903b4cd4f61904c3380b8fe839a6" gracePeriod=30 Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.501589 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.569825 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-config-data-custom\") pod \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\" (UID: \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\") " Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.570203 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-combined-ca-bundle\") pod \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\" (UID: \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\") " Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.570276 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftnt4\" (UniqueName: \"kubernetes.io/projected/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-kube-api-access-ftnt4\") pod \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\" (UID: \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\") " Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.570313 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-etc-machine-id\") pod \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\" (UID: \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\") " Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.570345 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-config-data\") pod \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\" (UID: \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\") " Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.570366 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-scripts\") pod \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\" (UID: \"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d\") " Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.570414 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1afb9ab5-f2d0-42a3-84a4-34cd9631e96d" (UID: "1afb9ab5-f2d0-42a3-84a4-34cd9631e96d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.570643 4909 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.591034 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-scripts" (OuterVolumeSpecName: "scripts") pod "1afb9ab5-f2d0-42a3-84a4-34cd9631e96d" (UID: "1afb9ab5-f2d0-42a3-84a4-34cd9631e96d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.595884 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-kube-api-access-ftnt4" (OuterVolumeSpecName: "kube-api-access-ftnt4") pod "1afb9ab5-f2d0-42a3-84a4-34cd9631e96d" (UID: "1afb9ab5-f2d0-42a3-84a4-34cd9631e96d"). InnerVolumeSpecName "kube-api-access-ftnt4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.597588 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1afb9ab5-f2d0-42a3-84a4-34cd9631e96d" (UID: "1afb9ab5-f2d0-42a3-84a4-34cd9631e96d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.630366 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1afb9ab5-f2d0-42a3-84a4-34cd9631e96d" (UID: "1afb9ab5-f2d0-42a3-84a4-34cd9631e96d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.674379 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.674491 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.674508 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftnt4\" (UniqueName: \"kubernetes.io/projected/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-kube-api-access-ftnt4\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.674522 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-scripts\") 
on node \"crc\" DevicePath \"\"" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.679530 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-config-data" (OuterVolumeSpecName: "config-data") pod "1afb9ab5-f2d0-42a3-84a4-34cd9631e96d" (UID: "1afb9ab5-f2d0-42a3-84a4-34cd9631e96d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.738070 4909 generic.go:334] "Generic (PLEG): container finished" podID="1afb9ab5-f2d0-42a3-84a4-34cd9631e96d" containerID="bc7d89c8424d5b11026046962ac7ebfc21cb903b4cd4f61904c3380b8fe839a6" exitCode=0 Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.738108 4909 generic.go:334] "Generic (PLEG): container finished" podID="1afb9ab5-f2d0-42a3-84a4-34cd9631e96d" containerID="9d77cfa7d6f35adf8f0c67f63879d84ec6354947efb32f430f4bac6efa65c836" exitCode=0 Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.738131 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d","Type":"ContainerDied","Data":"bc7d89c8424d5b11026046962ac7ebfc21cb903b4cd4f61904c3380b8fe839a6"} Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.738161 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d","Type":"ContainerDied","Data":"9d77cfa7d6f35adf8f0c67f63879d84ec6354947efb32f430f4bac6efa65c836"} Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.738174 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"1afb9ab5-f2d0-42a3-84a4-34cd9631e96d","Type":"ContainerDied","Data":"4d0ee3ca55b175f7519a517080056b839663534876e83ee99b27d06fd1df13f6"} Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.738190 4909 scope.go:117] "RemoveContainer" 
containerID="bc7d89c8424d5b11026046962ac7ebfc21cb903b4cd4f61904c3380b8fe839a6" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.738258 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.776263 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.801872 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.808533 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.818997 4909 scope.go:117] "RemoveContainer" containerID="9d77cfa7d6f35adf8f0c67f63879d84ec6354947efb32f430f4bac6efa65c836" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.821162 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 12:04:57 crc kubenswrapper[4909]: E0202 12:04:57.821643 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1afb9ab5-f2d0-42a3-84a4-34cd9631e96d" containerName="probe" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.821670 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1afb9ab5-f2d0-42a3-84a4-34cd9631e96d" containerName="probe" Feb 02 12:04:57 crc kubenswrapper[4909]: E0202 12:04:57.821695 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1afb9ab5-f2d0-42a3-84a4-34cd9631e96d" containerName="cinder-scheduler" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.821704 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1afb9ab5-f2d0-42a3-84a4-34cd9631e96d" containerName="cinder-scheduler" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.821920 4909 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1afb9ab5-f2d0-42a3-84a4-34cd9631e96d" containerName="probe" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.821951 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="1afb9ab5-f2d0-42a3-84a4-34cd9631e96d" containerName="cinder-scheduler" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.826884 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.834712 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.835967 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.858175 4909 scope.go:117] "RemoveContainer" containerID="bc7d89c8424d5b11026046962ac7ebfc21cb903b4cd4f61904c3380b8fe839a6" Feb 02 12:04:57 crc kubenswrapper[4909]: E0202 12:04:57.858690 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc7d89c8424d5b11026046962ac7ebfc21cb903b4cd4f61904c3380b8fe839a6\": container with ID starting with bc7d89c8424d5b11026046962ac7ebfc21cb903b4cd4f61904c3380b8fe839a6 not found: ID does not exist" containerID="bc7d89c8424d5b11026046962ac7ebfc21cb903b4cd4f61904c3380b8fe839a6" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.858732 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc7d89c8424d5b11026046962ac7ebfc21cb903b4cd4f61904c3380b8fe839a6"} err="failed to get container status \"bc7d89c8424d5b11026046962ac7ebfc21cb903b4cd4f61904c3380b8fe839a6\": rpc error: code = NotFound desc = could not find container \"bc7d89c8424d5b11026046962ac7ebfc21cb903b4cd4f61904c3380b8fe839a6\": container with ID starting with 
bc7d89c8424d5b11026046962ac7ebfc21cb903b4cd4f61904c3380b8fe839a6 not found: ID does not exist" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.858761 4909 scope.go:117] "RemoveContainer" containerID="9d77cfa7d6f35adf8f0c67f63879d84ec6354947efb32f430f4bac6efa65c836" Feb 02 12:04:57 crc kubenswrapper[4909]: E0202 12:04:57.859006 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d77cfa7d6f35adf8f0c67f63879d84ec6354947efb32f430f4bac6efa65c836\": container with ID starting with 9d77cfa7d6f35adf8f0c67f63879d84ec6354947efb32f430f4bac6efa65c836 not found: ID does not exist" containerID="9d77cfa7d6f35adf8f0c67f63879d84ec6354947efb32f430f4bac6efa65c836" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.859030 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d77cfa7d6f35adf8f0c67f63879d84ec6354947efb32f430f4bac6efa65c836"} err="failed to get container status \"9d77cfa7d6f35adf8f0c67f63879d84ec6354947efb32f430f4bac6efa65c836\": rpc error: code = NotFound desc = could not find container \"9d77cfa7d6f35adf8f0c67f63879d84ec6354947efb32f430f4bac6efa65c836\": container with ID starting with 9d77cfa7d6f35adf8f0c67f63879d84ec6354947efb32f430f4bac6efa65c836 not found: ID does not exist" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.859048 4909 scope.go:117] "RemoveContainer" containerID="bc7d89c8424d5b11026046962ac7ebfc21cb903b4cd4f61904c3380b8fe839a6" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.859246 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc7d89c8424d5b11026046962ac7ebfc21cb903b4cd4f61904c3380b8fe839a6"} err="failed to get container status \"bc7d89c8424d5b11026046962ac7ebfc21cb903b4cd4f61904c3380b8fe839a6\": rpc error: code = NotFound desc = could not find container \"bc7d89c8424d5b11026046962ac7ebfc21cb903b4cd4f61904c3380b8fe839a6\": container with ID 
starting with bc7d89c8424d5b11026046962ac7ebfc21cb903b4cd4f61904c3380b8fe839a6 not found: ID does not exist" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.859267 4909 scope.go:117] "RemoveContainer" containerID="9d77cfa7d6f35adf8f0c67f63879d84ec6354947efb32f430f4bac6efa65c836" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.859468 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d77cfa7d6f35adf8f0c67f63879d84ec6354947efb32f430f4bac6efa65c836"} err="failed to get container status \"9d77cfa7d6f35adf8f0c67f63879d84ec6354947efb32f430f4bac6efa65c836\": rpc error: code = NotFound desc = could not find container \"9d77cfa7d6f35adf8f0c67f63879d84ec6354947efb32f430f4bac6efa65c836\": container with ID starting with 9d77cfa7d6f35adf8f0c67f63879d84ec6354947efb32f430f4bac6efa65c836 not found: ID does not exist" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.877689 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc6jt\" (UniqueName: \"kubernetes.io/projected/695e2382-8687-40cf-be18-868d6746b9b9-kube-api-access-fc6jt\") pod \"cinder-scheduler-0\" (UID: \"695e2382-8687-40cf-be18-868d6746b9b9\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.877852 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/695e2382-8687-40cf-be18-868d6746b9b9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"695e2382-8687-40cf-be18-868d6746b9b9\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.877897 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/695e2382-8687-40cf-be18-868d6746b9b9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"695e2382-8687-40cf-be18-868d6746b9b9\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.877943 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/695e2382-8687-40cf-be18-868d6746b9b9-config-data\") pod \"cinder-scheduler-0\" (UID: \"695e2382-8687-40cf-be18-868d6746b9b9\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.877990 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/695e2382-8687-40cf-be18-868d6746b9b9-scripts\") pod \"cinder-scheduler-0\" (UID: \"695e2382-8687-40cf-be18-868d6746b9b9\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.878023 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/695e2382-8687-40cf-be18-868d6746b9b9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"695e2382-8687-40cf-be18-868d6746b9b9\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.980122 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/695e2382-8687-40cf-be18-868d6746b9b9-config-data\") pod \"cinder-scheduler-0\" (UID: \"695e2382-8687-40cf-be18-868d6746b9b9\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.980668 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/695e2382-8687-40cf-be18-868d6746b9b9-scripts\") pod \"cinder-scheduler-0\" (UID: \"695e2382-8687-40cf-be18-868d6746b9b9\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.980792 
4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/695e2382-8687-40cf-be18-868d6746b9b9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"695e2382-8687-40cf-be18-868d6746b9b9\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.980959 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc6jt\" (UniqueName: \"kubernetes.io/projected/695e2382-8687-40cf-be18-868d6746b9b9-kube-api-access-fc6jt\") pod \"cinder-scheduler-0\" (UID: \"695e2382-8687-40cf-be18-868d6746b9b9\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.981210 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/695e2382-8687-40cf-be18-868d6746b9b9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"695e2382-8687-40cf-be18-868d6746b9b9\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.981709 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/695e2382-8687-40cf-be18-868d6746b9b9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"695e2382-8687-40cf-be18-868d6746b9b9\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.981987 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/695e2382-8687-40cf-be18-868d6746b9b9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"695e2382-8687-40cf-be18-868d6746b9b9\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.984657 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/695e2382-8687-40cf-be18-868d6746b9b9-scripts\") pod \"cinder-scheduler-0\" (UID: \"695e2382-8687-40cf-be18-868d6746b9b9\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.984965 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/695e2382-8687-40cf-be18-868d6746b9b9-config-data\") pod \"cinder-scheduler-0\" (UID: \"695e2382-8687-40cf-be18-868d6746b9b9\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.990240 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/695e2382-8687-40cf-be18-868d6746b9b9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"695e2382-8687-40cf-be18-868d6746b9b9\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:57 crc kubenswrapper[4909]: I0202 12:04:57.990588 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/695e2382-8687-40cf-be18-868d6746b9b9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"695e2382-8687-40cf-be18-868d6746b9b9\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:58 crc kubenswrapper[4909]: I0202 12:04:58.003797 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc6jt\" (UniqueName: \"kubernetes.io/projected/695e2382-8687-40cf-be18-868d6746b9b9-kube-api-access-fc6jt\") pod \"cinder-scheduler-0\" (UID: \"695e2382-8687-40cf-be18-868d6746b9b9\") " pod="openstack/cinder-scheduler-0" Feb 02 12:04:58 crc kubenswrapper[4909]: I0202 12:04:58.150647 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 02 12:04:58 crc kubenswrapper[4909]: I0202 12:04:58.628669 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 02 12:04:58 crc kubenswrapper[4909]: I0202 12:04:58.754889 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"695e2382-8687-40cf-be18-868d6746b9b9","Type":"ContainerStarted","Data":"9e8261b98c1806844f0a7b79ce4ea14f3ba2159f3b6306758f8a8d778c3f8042"} Feb 02 12:04:59 crc kubenswrapper[4909]: I0202 12:04:59.028121 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1afb9ab5-f2d0-42a3-84a4-34cd9631e96d" path="/var/lib/kubelet/pods/1afb9ab5-f2d0-42a3-84a4-34cd9631e96d/volumes" Feb 02 12:04:59 crc kubenswrapper[4909]: I0202 12:04:59.769099 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"695e2382-8687-40cf-be18-868d6746b9b9","Type":"ContainerStarted","Data":"f7d703be50daf33a4c6a39f53915164d25bb26d240d5a2a8fb573efb8072aa6a"} Feb 02 12:04:59 crc kubenswrapper[4909]: I0202 12:04:59.771550 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"695e2382-8687-40cf-be18-868d6746b9b9","Type":"ContainerStarted","Data":"188d9a68782ac03bec906d20e1d18341b95b31e7dbbef4e50fc9aead4f876a2b"} Feb 02 12:04:59 crc kubenswrapper[4909]: I0202 12:04:59.796549 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.796531543 podStartE2EDuration="2.796531543s" podCreationTimestamp="2026-02-02 12:04:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:04:59.792348494 +0000 UTC m=+5625.538449249" watchObservedRunningTime="2026-02-02 12:04:59.796531543 +0000 UTC m=+5625.542632278" Feb 02 12:05:02 crc kubenswrapper[4909]: I0202 
12:05:02.946716 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 02 12:05:03 crc kubenswrapper[4909]: I0202 12:05:03.151073 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 02 12:05:09 crc kubenswrapper[4909]: I0202 12:05:08.349984 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 02 12:05:11 crc kubenswrapper[4909]: I0202 12:05:11.161248 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-86npr"] Feb 02 12:05:11 crc kubenswrapper[4909]: I0202 12:05:11.163541 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-86npr" Feb 02 12:05:11 crc kubenswrapper[4909]: I0202 12:05:11.174310 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-86npr"] Feb 02 12:05:11 crc kubenswrapper[4909]: I0202 12:05:11.256020 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-4633-account-create-update-h9zgz"] Feb 02 12:05:11 crc kubenswrapper[4909]: I0202 12:05:11.273938 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94b252cb-ab91-45b8-90e3-848e58d8e15e-operator-scripts\") pod \"glance-db-create-86npr\" (UID: \"94b252cb-ab91-45b8-90e3-848e58d8e15e\") " pod="openstack/glance-db-create-86npr" Feb 02 12:05:11 crc kubenswrapper[4909]: I0202 12:05:11.274130 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w72jw\" (UniqueName: \"kubernetes.io/projected/94b252cb-ab91-45b8-90e3-848e58d8e15e-kube-api-access-w72jw\") pod \"glance-db-create-86npr\" (UID: \"94b252cb-ab91-45b8-90e3-848e58d8e15e\") " pod="openstack/glance-db-create-86npr" Feb 02 12:05:11 crc kubenswrapper[4909]: I0202 
12:05:11.276868 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4633-account-create-update-h9zgz" Feb 02 12:05:11 crc kubenswrapper[4909]: I0202 12:05:11.279343 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4633-account-create-update-h9zgz"] Feb 02 12:05:11 crc kubenswrapper[4909]: I0202 12:05:11.318626 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 02 12:05:11 crc kubenswrapper[4909]: I0202 12:05:11.375835 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6g2l\" (UniqueName: \"kubernetes.io/projected/a73632c6-a76c-4c6f-bf8b-b08b9e3c6092-kube-api-access-d6g2l\") pod \"glance-4633-account-create-update-h9zgz\" (UID: \"a73632c6-a76c-4c6f-bf8b-b08b9e3c6092\") " pod="openstack/glance-4633-account-create-update-h9zgz" Feb 02 12:05:11 crc kubenswrapper[4909]: I0202 12:05:11.375905 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w72jw\" (UniqueName: \"kubernetes.io/projected/94b252cb-ab91-45b8-90e3-848e58d8e15e-kube-api-access-w72jw\") pod \"glance-db-create-86npr\" (UID: \"94b252cb-ab91-45b8-90e3-848e58d8e15e\") " pod="openstack/glance-db-create-86npr" Feb 02 12:05:11 crc kubenswrapper[4909]: I0202 12:05:11.375956 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94b252cb-ab91-45b8-90e3-848e58d8e15e-operator-scripts\") pod \"glance-db-create-86npr\" (UID: \"94b252cb-ab91-45b8-90e3-848e58d8e15e\") " pod="openstack/glance-db-create-86npr" Feb 02 12:05:11 crc kubenswrapper[4909]: I0202 12:05:11.375989 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a73632c6-a76c-4c6f-bf8b-b08b9e3c6092-operator-scripts\") pod 
\"glance-4633-account-create-update-h9zgz\" (UID: \"a73632c6-a76c-4c6f-bf8b-b08b9e3c6092\") " pod="openstack/glance-4633-account-create-update-h9zgz" Feb 02 12:05:11 crc kubenswrapper[4909]: I0202 12:05:11.377664 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94b252cb-ab91-45b8-90e3-848e58d8e15e-operator-scripts\") pod \"glance-db-create-86npr\" (UID: \"94b252cb-ab91-45b8-90e3-848e58d8e15e\") " pod="openstack/glance-db-create-86npr" Feb 02 12:05:11 crc kubenswrapper[4909]: I0202 12:05:11.395663 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w72jw\" (UniqueName: \"kubernetes.io/projected/94b252cb-ab91-45b8-90e3-848e58d8e15e-kube-api-access-w72jw\") pod \"glance-db-create-86npr\" (UID: \"94b252cb-ab91-45b8-90e3-848e58d8e15e\") " pod="openstack/glance-db-create-86npr" Feb 02 12:05:11 crc kubenswrapper[4909]: I0202 12:05:11.477243 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6g2l\" (UniqueName: \"kubernetes.io/projected/a73632c6-a76c-4c6f-bf8b-b08b9e3c6092-kube-api-access-d6g2l\") pod \"glance-4633-account-create-update-h9zgz\" (UID: \"a73632c6-a76c-4c6f-bf8b-b08b9e3c6092\") " pod="openstack/glance-4633-account-create-update-h9zgz" Feb 02 12:05:11 crc kubenswrapper[4909]: I0202 12:05:11.477380 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a73632c6-a76c-4c6f-bf8b-b08b9e3c6092-operator-scripts\") pod \"glance-4633-account-create-update-h9zgz\" (UID: \"a73632c6-a76c-4c6f-bf8b-b08b9e3c6092\") " pod="openstack/glance-4633-account-create-update-h9zgz" Feb 02 12:05:11 crc kubenswrapper[4909]: I0202 12:05:11.478199 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a73632c6-a76c-4c6f-bf8b-b08b9e3c6092-operator-scripts\") 
pod \"glance-4633-account-create-update-h9zgz\" (UID: \"a73632c6-a76c-4c6f-bf8b-b08b9e3c6092\") " pod="openstack/glance-4633-account-create-update-h9zgz" Feb 02 12:05:11 crc kubenswrapper[4909]: I0202 12:05:11.491018 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-86npr" Feb 02 12:05:11 crc kubenswrapper[4909]: I0202 12:05:11.493313 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6g2l\" (UniqueName: \"kubernetes.io/projected/a73632c6-a76c-4c6f-bf8b-b08b9e3c6092-kube-api-access-d6g2l\") pod \"glance-4633-account-create-update-h9zgz\" (UID: \"a73632c6-a76c-4c6f-bf8b-b08b9e3c6092\") " pod="openstack/glance-4633-account-create-update-h9zgz" Feb 02 12:05:11 crc kubenswrapper[4909]: I0202 12:05:11.643751 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4633-account-create-update-h9zgz" Feb 02 12:05:11 crc kubenswrapper[4909]: I0202 12:05:11.737880 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-86npr"] Feb 02 12:05:11 crc kubenswrapper[4909]: W0202 12:05:11.750009 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94b252cb_ab91_45b8_90e3_848e58d8e15e.slice/crio-b20dd13b61635b91d2cca6126ea1e7cc59044343bd1466677ee9939cd8e4a15f WatchSource:0}: Error finding container b20dd13b61635b91d2cca6126ea1e7cc59044343bd1466677ee9939cd8e4a15f: Status 404 returned error can't find the container with id b20dd13b61635b91d2cca6126ea1e7cc59044343bd1466677ee9939cd8e4a15f Feb 02 12:05:11 crc kubenswrapper[4909]: I0202 12:05:11.924118 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-86npr" event={"ID":"94b252cb-ab91-45b8-90e3-848e58d8e15e","Type":"ContainerStarted","Data":"b20dd13b61635b91d2cca6126ea1e7cc59044343bd1466677ee9939cd8e4a15f"} Feb 02 12:05:12 crc kubenswrapper[4909]: W0202 
12:05:12.106139 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda73632c6_a76c_4c6f_bf8b_b08b9e3c6092.slice/crio-907d00d9720f1a27f620d1acbc4c6b1ce2301830ec7c54fa78939abad633cef0 WatchSource:0}: Error finding container 907d00d9720f1a27f620d1acbc4c6b1ce2301830ec7c54fa78939abad633cef0: Status 404 returned error can't find the container with id 907d00d9720f1a27f620d1acbc4c6b1ce2301830ec7c54fa78939abad633cef0 Feb 02 12:05:12 crc kubenswrapper[4909]: I0202 12:05:12.106801 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4633-account-create-update-h9zgz"] Feb 02 12:05:12 crc kubenswrapper[4909]: I0202 12:05:12.934059 4909 generic.go:334] "Generic (PLEG): container finished" podID="a73632c6-a76c-4c6f-bf8b-b08b9e3c6092" containerID="44e184f7db4057268a2723de85c6bf922fa1b132fbffd98e4fef4350870a0b6f" exitCode=0 Feb 02 12:05:12 crc kubenswrapper[4909]: I0202 12:05:12.934133 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4633-account-create-update-h9zgz" event={"ID":"a73632c6-a76c-4c6f-bf8b-b08b9e3c6092","Type":"ContainerDied","Data":"44e184f7db4057268a2723de85c6bf922fa1b132fbffd98e4fef4350870a0b6f"} Feb 02 12:05:12 crc kubenswrapper[4909]: I0202 12:05:12.934409 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4633-account-create-update-h9zgz" event={"ID":"a73632c6-a76c-4c6f-bf8b-b08b9e3c6092","Type":"ContainerStarted","Data":"907d00d9720f1a27f620d1acbc4c6b1ce2301830ec7c54fa78939abad633cef0"} Feb 02 12:05:12 crc kubenswrapper[4909]: I0202 12:05:12.936138 4909 generic.go:334] "Generic (PLEG): container finished" podID="94b252cb-ab91-45b8-90e3-848e58d8e15e" containerID="24f82203c598179471b653e1ca099546b9a9e9872e0df83fbebcc69ed7005811" exitCode=0 Feb 02 12:05:12 crc kubenswrapper[4909]: I0202 12:05:12.936171 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-86npr" 
event={"ID":"94b252cb-ab91-45b8-90e3-848e58d8e15e","Type":"ContainerDied","Data":"24f82203c598179471b653e1ca099546b9a9e9872e0df83fbebcc69ed7005811"} Feb 02 12:05:14 crc kubenswrapper[4909]: I0202 12:05:14.336220 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-86npr" Feb 02 12:05:14 crc kubenswrapper[4909]: I0202 12:05:14.343529 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4633-account-create-update-h9zgz" Feb 02 12:05:14 crc kubenswrapper[4909]: I0202 12:05:14.442200 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94b252cb-ab91-45b8-90e3-848e58d8e15e-operator-scripts\") pod \"94b252cb-ab91-45b8-90e3-848e58d8e15e\" (UID: \"94b252cb-ab91-45b8-90e3-848e58d8e15e\") " Feb 02 12:05:14 crc kubenswrapper[4909]: I0202 12:05:14.442403 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w72jw\" (UniqueName: \"kubernetes.io/projected/94b252cb-ab91-45b8-90e3-848e58d8e15e-kube-api-access-w72jw\") pod \"94b252cb-ab91-45b8-90e3-848e58d8e15e\" (UID: \"94b252cb-ab91-45b8-90e3-848e58d8e15e\") " Feb 02 12:05:14 crc kubenswrapper[4909]: I0202 12:05:14.442426 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6g2l\" (UniqueName: \"kubernetes.io/projected/a73632c6-a76c-4c6f-bf8b-b08b9e3c6092-kube-api-access-d6g2l\") pod \"a73632c6-a76c-4c6f-bf8b-b08b9e3c6092\" (UID: \"a73632c6-a76c-4c6f-bf8b-b08b9e3c6092\") " Feb 02 12:05:14 crc kubenswrapper[4909]: I0202 12:05:14.442575 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a73632c6-a76c-4c6f-bf8b-b08b9e3c6092-operator-scripts\") pod \"a73632c6-a76c-4c6f-bf8b-b08b9e3c6092\" (UID: \"a73632c6-a76c-4c6f-bf8b-b08b9e3c6092\") " Feb 02 
12:05:14 crc kubenswrapper[4909]: I0202 12:05:14.442898 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94b252cb-ab91-45b8-90e3-848e58d8e15e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "94b252cb-ab91-45b8-90e3-848e58d8e15e" (UID: "94b252cb-ab91-45b8-90e3-848e58d8e15e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:05:14 crc kubenswrapper[4909]: I0202 12:05:14.443225 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a73632c6-a76c-4c6f-bf8b-b08b9e3c6092-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a73632c6-a76c-4c6f-bf8b-b08b9e3c6092" (UID: "a73632c6-a76c-4c6f-bf8b-b08b9e3c6092"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:05:14 crc kubenswrapper[4909]: I0202 12:05:14.443619 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94b252cb-ab91-45b8-90e3-848e58d8e15e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:05:14 crc kubenswrapper[4909]: I0202 12:05:14.443641 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a73632c6-a76c-4c6f-bf8b-b08b9e3c6092-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:05:14 crc kubenswrapper[4909]: I0202 12:05:14.449116 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94b252cb-ab91-45b8-90e3-848e58d8e15e-kube-api-access-w72jw" (OuterVolumeSpecName: "kube-api-access-w72jw") pod "94b252cb-ab91-45b8-90e3-848e58d8e15e" (UID: "94b252cb-ab91-45b8-90e3-848e58d8e15e"). InnerVolumeSpecName "kube-api-access-w72jw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:05:14 crc kubenswrapper[4909]: I0202 12:05:14.449154 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a73632c6-a76c-4c6f-bf8b-b08b9e3c6092-kube-api-access-d6g2l" (OuterVolumeSpecName: "kube-api-access-d6g2l") pod "a73632c6-a76c-4c6f-bf8b-b08b9e3c6092" (UID: "a73632c6-a76c-4c6f-bf8b-b08b9e3c6092"). InnerVolumeSpecName "kube-api-access-d6g2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:05:14 crc kubenswrapper[4909]: I0202 12:05:14.546439 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6g2l\" (UniqueName: \"kubernetes.io/projected/a73632c6-a76c-4c6f-bf8b-b08b9e3c6092-kube-api-access-d6g2l\") on node \"crc\" DevicePath \"\"" Feb 02 12:05:14 crc kubenswrapper[4909]: I0202 12:05:14.546488 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w72jw\" (UniqueName: \"kubernetes.io/projected/94b252cb-ab91-45b8-90e3-848e58d8e15e-kube-api-access-w72jw\") on node \"crc\" DevicePath \"\"" Feb 02 12:05:14 crc kubenswrapper[4909]: I0202 12:05:14.954167 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4633-account-create-update-h9zgz" event={"ID":"a73632c6-a76c-4c6f-bf8b-b08b9e3c6092","Type":"ContainerDied","Data":"907d00d9720f1a27f620d1acbc4c6b1ce2301830ec7c54fa78939abad633cef0"} Feb 02 12:05:14 crc kubenswrapper[4909]: I0202 12:05:14.954210 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="907d00d9720f1a27f620d1acbc4c6b1ce2301830ec7c54fa78939abad633cef0" Feb 02 12:05:14 crc kubenswrapper[4909]: I0202 12:05:14.954190 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-4633-account-create-update-h9zgz" Feb 02 12:05:14 crc kubenswrapper[4909]: I0202 12:05:14.955796 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-86npr" event={"ID":"94b252cb-ab91-45b8-90e3-848e58d8e15e","Type":"ContainerDied","Data":"b20dd13b61635b91d2cca6126ea1e7cc59044343bd1466677ee9939cd8e4a15f"} Feb 02 12:05:14 crc kubenswrapper[4909]: I0202 12:05:14.955840 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b20dd13b61635b91d2cca6126ea1e7cc59044343bd1466677ee9939cd8e4a15f" Feb 02 12:05:14 crc kubenswrapper[4909]: I0202 12:05:14.955926 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-86npr" Feb 02 12:05:16 crc kubenswrapper[4909]: I0202 12:05:16.493290 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-zp7nf"] Feb 02 12:05:16 crc kubenswrapper[4909]: E0202 12:05:16.493711 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94b252cb-ab91-45b8-90e3-848e58d8e15e" containerName="mariadb-database-create" Feb 02 12:05:16 crc kubenswrapper[4909]: I0202 12:05:16.493726 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="94b252cb-ab91-45b8-90e3-848e58d8e15e" containerName="mariadb-database-create" Feb 02 12:05:16 crc kubenswrapper[4909]: E0202 12:05:16.493740 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a73632c6-a76c-4c6f-bf8b-b08b9e3c6092" containerName="mariadb-account-create-update" Feb 02 12:05:16 crc kubenswrapper[4909]: I0202 12:05:16.493746 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a73632c6-a76c-4c6f-bf8b-b08b9e3c6092" containerName="mariadb-account-create-update" Feb 02 12:05:16 crc kubenswrapper[4909]: I0202 12:05:16.494060 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a73632c6-a76c-4c6f-bf8b-b08b9e3c6092" containerName="mariadb-account-create-update" 
Feb 02 12:05:16 crc kubenswrapper[4909]: I0202 12:05:16.494073 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="94b252cb-ab91-45b8-90e3-848e58d8e15e" containerName="mariadb-database-create" Feb 02 12:05:16 crc kubenswrapper[4909]: I0202 12:05:16.494633 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-zp7nf" Feb 02 12:05:16 crc kubenswrapper[4909]: I0202 12:05:16.496585 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 02 12:05:16 crc kubenswrapper[4909]: I0202 12:05:16.496741 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-n9t9t" Feb 02 12:05:16 crc kubenswrapper[4909]: I0202 12:05:16.509894 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-zp7nf"] Feb 02 12:05:16 crc kubenswrapper[4909]: I0202 12:05:16.581873 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxghr\" (UniqueName: \"kubernetes.io/projected/f879cfdb-2ad6-47ca-ac7c-cb892538ca36-kube-api-access-nxghr\") pod \"glance-db-sync-zp7nf\" (UID: \"f879cfdb-2ad6-47ca-ac7c-cb892538ca36\") " pod="openstack/glance-db-sync-zp7nf" Feb 02 12:05:16 crc kubenswrapper[4909]: I0202 12:05:16.582018 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f879cfdb-2ad6-47ca-ac7c-cb892538ca36-config-data\") pod \"glance-db-sync-zp7nf\" (UID: \"f879cfdb-2ad6-47ca-ac7c-cb892538ca36\") " pod="openstack/glance-db-sync-zp7nf" Feb 02 12:05:16 crc kubenswrapper[4909]: I0202 12:05:16.582089 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f879cfdb-2ad6-47ca-ac7c-cb892538ca36-db-sync-config-data\") pod \"glance-db-sync-zp7nf\" (UID: 
\"f879cfdb-2ad6-47ca-ac7c-cb892538ca36\") " pod="openstack/glance-db-sync-zp7nf" Feb 02 12:05:16 crc kubenswrapper[4909]: I0202 12:05:16.582105 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f879cfdb-2ad6-47ca-ac7c-cb892538ca36-combined-ca-bundle\") pod \"glance-db-sync-zp7nf\" (UID: \"f879cfdb-2ad6-47ca-ac7c-cb892538ca36\") " pod="openstack/glance-db-sync-zp7nf" Feb 02 12:05:16 crc kubenswrapper[4909]: I0202 12:05:16.683539 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxghr\" (UniqueName: \"kubernetes.io/projected/f879cfdb-2ad6-47ca-ac7c-cb892538ca36-kube-api-access-nxghr\") pod \"glance-db-sync-zp7nf\" (UID: \"f879cfdb-2ad6-47ca-ac7c-cb892538ca36\") " pod="openstack/glance-db-sync-zp7nf" Feb 02 12:05:16 crc kubenswrapper[4909]: I0202 12:05:16.684113 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f879cfdb-2ad6-47ca-ac7c-cb892538ca36-config-data\") pod \"glance-db-sync-zp7nf\" (UID: \"f879cfdb-2ad6-47ca-ac7c-cb892538ca36\") " pod="openstack/glance-db-sync-zp7nf" Feb 02 12:05:16 crc kubenswrapper[4909]: I0202 12:05:16.684335 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f879cfdb-2ad6-47ca-ac7c-cb892538ca36-db-sync-config-data\") pod \"glance-db-sync-zp7nf\" (UID: \"f879cfdb-2ad6-47ca-ac7c-cb892538ca36\") " pod="openstack/glance-db-sync-zp7nf" Feb 02 12:05:16 crc kubenswrapper[4909]: I0202 12:05:16.684442 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f879cfdb-2ad6-47ca-ac7c-cb892538ca36-combined-ca-bundle\") pod \"glance-db-sync-zp7nf\" (UID: \"f879cfdb-2ad6-47ca-ac7c-cb892538ca36\") " pod="openstack/glance-db-sync-zp7nf" Feb 02 
12:05:16 crc kubenswrapper[4909]: I0202 12:05:16.691689 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f879cfdb-2ad6-47ca-ac7c-cb892538ca36-combined-ca-bundle\") pod \"glance-db-sync-zp7nf\" (UID: \"f879cfdb-2ad6-47ca-ac7c-cb892538ca36\") " pod="openstack/glance-db-sync-zp7nf" Feb 02 12:05:16 crc kubenswrapper[4909]: I0202 12:05:16.697505 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f879cfdb-2ad6-47ca-ac7c-cb892538ca36-config-data\") pod \"glance-db-sync-zp7nf\" (UID: \"f879cfdb-2ad6-47ca-ac7c-cb892538ca36\") " pod="openstack/glance-db-sync-zp7nf" Feb 02 12:05:16 crc kubenswrapper[4909]: I0202 12:05:16.700831 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f879cfdb-2ad6-47ca-ac7c-cb892538ca36-db-sync-config-data\") pod \"glance-db-sync-zp7nf\" (UID: \"f879cfdb-2ad6-47ca-ac7c-cb892538ca36\") " pod="openstack/glance-db-sync-zp7nf" Feb 02 12:05:16 crc kubenswrapper[4909]: I0202 12:05:16.701359 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxghr\" (UniqueName: \"kubernetes.io/projected/f879cfdb-2ad6-47ca-ac7c-cb892538ca36-kube-api-access-nxghr\") pod \"glance-db-sync-zp7nf\" (UID: \"f879cfdb-2ad6-47ca-ac7c-cb892538ca36\") " pod="openstack/glance-db-sync-zp7nf" Feb 02 12:05:16 crc kubenswrapper[4909]: I0202 12:05:16.813353 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-zp7nf" Feb 02 12:05:17 crc kubenswrapper[4909]: I0202 12:05:17.361047 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-zp7nf"] Feb 02 12:05:17 crc kubenswrapper[4909]: W0202 12:05:17.361640 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf879cfdb_2ad6_47ca_ac7c_cb892538ca36.slice/crio-b99039077596f5a22134e6f820094d9fb979aa9255fe22b55dc2c9f5f35e654c WatchSource:0}: Error finding container b99039077596f5a22134e6f820094d9fb979aa9255fe22b55dc2c9f5f35e654c: Status 404 returned error can't find the container with id b99039077596f5a22134e6f820094d9fb979aa9255fe22b55dc2c9f5f35e654c Feb 02 12:05:17 crc kubenswrapper[4909]: I0202 12:05:17.981009 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zp7nf" event={"ID":"f879cfdb-2ad6-47ca-ac7c-cb892538ca36","Type":"ContainerStarted","Data":"d0c26779153e8bc2446d85ab7c2136b32ffa0643deeac78846eb0e073ad7a6ba"} Feb 02 12:05:17 crc kubenswrapper[4909]: I0202 12:05:17.981363 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zp7nf" event={"ID":"f879cfdb-2ad6-47ca-ac7c-cb892538ca36","Type":"ContainerStarted","Data":"b99039077596f5a22134e6f820094d9fb979aa9255fe22b55dc2c9f5f35e654c"} Feb 02 12:05:17 crc kubenswrapper[4909]: I0202 12:05:17.998125 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-zp7nf" podStartSLOduration=1.9981026160000002 podStartE2EDuration="1.998102616s" podCreationTimestamp="2026-02-02 12:05:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:05:17.996285735 +0000 UTC m=+5643.742386470" watchObservedRunningTime="2026-02-02 12:05:17.998102616 +0000 UTC m=+5643.744203351" Feb 02 12:05:19 crc kubenswrapper[4909]: I0202 
12:05:19.510782 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:05:19 crc kubenswrapper[4909]: I0202 12:05:19.511307 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:05:21 crc kubenswrapper[4909]: I0202 12:05:21.136851 4909 scope.go:117] "RemoveContainer" containerID="4b2762d3e81516838c026874fac5de21b6e4eb372af75a58811af530077b7c16" Feb 02 12:05:22 crc kubenswrapper[4909]: I0202 12:05:22.020101 4909 generic.go:334] "Generic (PLEG): container finished" podID="f879cfdb-2ad6-47ca-ac7c-cb892538ca36" containerID="d0c26779153e8bc2446d85ab7c2136b32ffa0643deeac78846eb0e073ad7a6ba" exitCode=0 Feb 02 12:05:22 crc kubenswrapper[4909]: I0202 12:05:22.020170 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zp7nf" event={"ID":"f879cfdb-2ad6-47ca-ac7c-cb892538ca36","Type":"ContainerDied","Data":"d0c26779153e8bc2446d85ab7c2136b32ffa0643deeac78846eb0e073ad7a6ba"} Feb 02 12:05:23 crc kubenswrapper[4909]: I0202 12:05:23.437931 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-zp7nf" Feb 02 12:05:23 crc kubenswrapper[4909]: I0202 12:05:23.531414 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f879cfdb-2ad6-47ca-ac7c-cb892538ca36-db-sync-config-data\") pod \"f879cfdb-2ad6-47ca-ac7c-cb892538ca36\" (UID: \"f879cfdb-2ad6-47ca-ac7c-cb892538ca36\") " Feb 02 12:05:23 crc kubenswrapper[4909]: I0202 12:05:23.531488 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f879cfdb-2ad6-47ca-ac7c-cb892538ca36-config-data\") pod \"f879cfdb-2ad6-47ca-ac7c-cb892538ca36\" (UID: \"f879cfdb-2ad6-47ca-ac7c-cb892538ca36\") " Feb 02 12:05:23 crc kubenswrapper[4909]: I0202 12:05:23.531541 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f879cfdb-2ad6-47ca-ac7c-cb892538ca36-combined-ca-bundle\") pod \"f879cfdb-2ad6-47ca-ac7c-cb892538ca36\" (UID: \"f879cfdb-2ad6-47ca-ac7c-cb892538ca36\") " Feb 02 12:05:23 crc kubenswrapper[4909]: I0202 12:05:23.531574 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxghr\" (UniqueName: \"kubernetes.io/projected/f879cfdb-2ad6-47ca-ac7c-cb892538ca36-kube-api-access-nxghr\") pod \"f879cfdb-2ad6-47ca-ac7c-cb892538ca36\" (UID: \"f879cfdb-2ad6-47ca-ac7c-cb892538ca36\") " Feb 02 12:05:23 crc kubenswrapper[4909]: I0202 12:05:23.537176 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f879cfdb-2ad6-47ca-ac7c-cb892538ca36-kube-api-access-nxghr" (OuterVolumeSpecName: "kube-api-access-nxghr") pod "f879cfdb-2ad6-47ca-ac7c-cb892538ca36" (UID: "f879cfdb-2ad6-47ca-ac7c-cb892538ca36"). InnerVolumeSpecName "kube-api-access-nxghr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:05:23 crc kubenswrapper[4909]: I0202 12:05:23.538324 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f879cfdb-2ad6-47ca-ac7c-cb892538ca36-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f879cfdb-2ad6-47ca-ac7c-cb892538ca36" (UID: "f879cfdb-2ad6-47ca-ac7c-cb892538ca36"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:05:23 crc kubenswrapper[4909]: I0202 12:05:23.558249 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f879cfdb-2ad6-47ca-ac7c-cb892538ca36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f879cfdb-2ad6-47ca-ac7c-cb892538ca36" (UID: "f879cfdb-2ad6-47ca-ac7c-cb892538ca36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:05:23 crc kubenswrapper[4909]: I0202 12:05:23.600345 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f879cfdb-2ad6-47ca-ac7c-cb892538ca36-config-data" (OuterVolumeSpecName: "config-data") pod "f879cfdb-2ad6-47ca-ac7c-cb892538ca36" (UID: "f879cfdb-2ad6-47ca-ac7c-cb892538ca36"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:05:23 crc kubenswrapper[4909]: I0202 12:05:23.634144 4909 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f879cfdb-2ad6-47ca-ac7c-cb892538ca36-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:05:23 crc kubenswrapper[4909]: I0202 12:05:23.634172 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f879cfdb-2ad6-47ca-ac7c-cb892538ca36-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:05:23 crc kubenswrapper[4909]: I0202 12:05:23.634180 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f879cfdb-2ad6-47ca-ac7c-cb892538ca36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:05:23 crc kubenswrapper[4909]: I0202 12:05:23.634189 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxghr\" (UniqueName: \"kubernetes.io/projected/f879cfdb-2ad6-47ca-ac7c-cb892538ca36-kube-api-access-nxghr\") on node \"crc\" DevicePath \"\"" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.040991 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zp7nf" event={"ID":"f879cfdb-2ad6-47ca-ac7c-cb892538ca36","Type":"ContainerDied","Data":"b99039077596f5a22134e6f820094d9fb979aa9255fe22b55dc2c9f5f35e654c"} Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.041043 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b99039077596f5a22134e6f820094d9fb979aa9255fe22b55dc2c9f5f35e654c" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.041106 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-zp7nf" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.482307 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 12:05:24 crc kubenswrapper[4909]: E0202 12:05:24.482747 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f879cfdb-2ad6-47ca-ac7c-cb892538ca36" containerName="glance-db-sync" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.482765 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f879cfdb-2ad6-47ca-ac7c-cb892538ca36" containerName="glance-db-sync" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.483016 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f879cfdb-2ad6-47ca-ac7c-cb892538ca36" containerName="glance-db-sync" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.489315 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.491372 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.491848 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-n9t9t" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.493098 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.499732 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.618893 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76c747cbd5-ntrd9"] Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.620744 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76c747cbd5-ntrd9" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.634985 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76c747cbd5-ntrd9"] Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.653846 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a74b5c77-830a-402a-a0de-e1981ad4c293-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a74b5c77-830a-402a-a0de-e1981ad4c293\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.653954 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a74b5c77-830a-402a-a0de-e1981ad4c293-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a74b5c77-830a-402a-a0de-e1981ad4c293\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.654180 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a74b5c77-830a-402a-a0de-e1981ad4c293-logs\") pod \"glance-default-external-api-0\" (UID: \"a74b5c77-830a-402a-a0de-e1981ad4c293\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.654238 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a74b5c77-830a-402a-a0de-e1981ad4c293-config-data\") pod \"glance-default-external-api-0\" (UID: \"a74b5c77-830a-402a-a0de-e1981ad4c293\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.654269 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a74b5c77-830a-402a-a0de-e1981ad4c293-scripts\") pod \"glance-default-external-api-0\" (UID: \"a74b5c77-830a-402a-a0de-e1981ad4c293\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.654314 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx22p\" (UniqueName: \"kubernetes.io/projected/a74b5c77-830a-402a-a0de-e1981ad4c293-kube-api-access-fx22p\") pod \"glance-default-external-api-0\" (UID: \"a74b5c77-830a-402a-a0de-e1981ad4c293\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.714388 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.716854 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.727022 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.747098 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.755997 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45d46867-5215-4ec7-a2a8-94c398de6a4f-config\") pod \"dnsmasq-dns-76c747cbd5-ntrd9\" (UID: \"45d46867-5215-4ec7-a2a8-94c398de6a4f\") " pod="openstack/dnsmasq-dns-76c747cbd5-ntrd9" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.756044 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a74b5c77-830a-402a-a0de-e1981ad4c293-logs\") pod 
\"glance-default-external-api-0\" (UID: \"a74b5c77-830a-402a-a0de-e1981ad4c293\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.756066 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a74b5c77-830a-402a-a0de-e1981ad4c293-config-data\") pod \"glance-default-external-api-0\" (UID: \"a74b5c77-830a-402a-a0de-e1981ad4c293\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.756089 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a74b5c77-830a-402a-a0de-e1981ad4c293-scripts\") pod \"glance-default-external-api-0\" (UID: \"a74b5c77-830a-402a-a0de-e1981ad4c293\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.756119 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx22p\" (UniqueName: \"kubernetes.io/projected/a74b5c77-830a-402a-a0de-e1981ad4c293-kube-api-access-fx22p\") pod \"glance-default-external-api-0\" (UID: \"a74b5c77-830a-402a-a0de-e1981ad4c293\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.756158 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45d46867-5215-4ec7-a2a8-94c398de6a4f-ovsdbserver-nb\") pod \"dnsmasq-dns-76c747cbd5-ntrd9\" (UID: \"45d46867-5215-4ec7-a2a8-94c398de6a4f\") " pod="openstack/dnsmasq-dns-76c747cbd5-ntrd9" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.756200 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a74b5c77-830a-402a-a0de-e1981ad4c293-combined-ca-bundle\") pod \"glance-default-external-api-0\" 
(UID: \"a74b5c77-830a-402a-a0de-e1981ad4c293\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.756231 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45d46867-5215-4ec7-a2a8-94c398de6a4f-ovsdbserver-sb\") pod \"dnsmasq-dns-76c747cbd5-ntrd9\" (UID: \"45d46867-5215-4ec7-a2a8-94c398de6a4f\") " pod="openstack/dnsmasq-dns-76c747cbd5-ntrd9" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.756249 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a74b5c77-830a-402a-a0de-e1981ad4c293-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a74b5c77-830a-402a-a0de-e1981ad4c293\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.756268 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45d46867-5215-4ec7-a2a8-94c398de6a4f-dns-svc\") pod \"dnsmasq-dns-76c747cbd5-ntrd9\" (UID: \"45d46867-5215-4ec7-a2a8-94c398de6a4f\") " pod="openstack/dnsmasq-dns-76c747cbd5-ntrd9" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.756319 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn6mn\" (UniqueName: \"kubernetes.io/projected/45d46867-5215-4ec7-a2a8-94c398de6a4f-kube-api-access-pn6mn\") pod \"dnsmasq-dns-76c747cbd5-ntrd9\" (UID: \"45d46867-5215-4ec7-a2a8-94c398de6a4f\") " pod="openstack/dnsmasq-dns-76c747cbd5-ntrd9" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.757276 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a74b5c77-830a-402a-a0de-e1981ad4c293-logs\") pod \"glance-default-external-api-0\" (UID: 
\"a74b5c77-830a-402a-a0de-e1981ad4c293\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.757503 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a74b5c77-830a-402a-a0de-e1981ad4c293-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a74b5c77-830a-402a-a0de-e1981ad4c293\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.764082 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a74b5c77-830a-402a-a0de-e1981ad4c293-scripts\") pod \"glance-default-external-api-0\" (UID: \"a74b5c77-830a-402a-a0de-e1981ad4c293\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.774248 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a74b5c77-830a-402a-a0de-e1981ad4c293-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a74b5c77-830a-402a-a0de-e1981ad4c293\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.783468 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a74b5c77-830a-402a-a0de-e1981ad4c293-config-data\") pod \"glance-default-external-api-0\" (UID: \"a74b5c77-830a-402a-a0de-e1981ad4c293\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.792769 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx22p\" (UniqueName: \"kubernetes.io/projected/a74b5c77-830a-402a-a0de-e1981ad4c293-kube-api-access-fx22p\") pod \"glance-default-external-api-0\" (UID: \"a74b5c77-830a-402a-a0de-e1981ad4c293\") " pod="openstack/glance-default-external-api-0" Feb 02 
12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.810697 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.858159 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljw86\" (UniqueName: \"kubernetes.io/projected/2298644a-6792-41dc-9bc5-994bdf1ff6cb-kube-api-access-ljw86\") pod \"glance-default-internal-api-0\" (UID: \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.858243 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn6mn\" (UniqueName: \"kubernetes.io/projected/45d46867-5215-4ec7-a2a8-94c398de6a4f-kube-api-access-pn6mn\") pod \"dnsmasq-dns-76c747cbd5-ntrd9\" (UID: \"45d46867-5215-4ec7-a2a8-94c398de6a4f\") " pod="openstack/dnsmasq-dns-76c747cbd5-ntrd9" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.858284 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2298644a-6792-41dc-9bc5-994bdf1ff6cb-logs\") pod \"glance-default-internal-api-0\" (UID: \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.858339 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2298644a-6792-41dc-9bc5-994bdf1ff6cb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.858398 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2298644a-6792-41dc-9bc5-994bdf1ff6cb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.858441 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45d46867-5215-4ec7-a2a8-94c398de6a4f-config\") pod \"dnsmasq-dns-76c747cbd5-ntrd9\" (UID: \"45d46867-5215-4ec7-a2a8-94c398de6a4f\") " pod="openstack/dnsmasq-dns-76c747cbd5-ntrd9" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.858511 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2298644a-6792-41dc-9bc5-994bdf1ff6cb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.858571 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45d46867-5215-4ec7-a2a8-94c398de6a4f-ovsdbserver-nb\") pod \"dnsmasq-dns-76c747cbd5-ntrd9\" (UID: \"45d46867-5215-4ec7-a2a8-94c398de6a4f\") " pod="openstack/dnsmasq-dns-76c747cbd5-ntrd9" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.858638 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2298644a-6792-41dc-9bc5-994bdf1ff6cb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.858699 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/45d46867-5215-4ec7-a2a8-94c398de6a4f-ovsdbserver-sb\") pod \"dnsmasq-dns-76c747cbd5-ntrd9\" (UID: \"45d46867-5215-4ec7-a2a8-94c398de6a4f\") " pod="openstack/dnsmasq-dns-76c747cbd5-ntrd9" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.858738 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45d46867-5215-4ec7-a2a8-94c398de6a4f-dns-svc\") pod \"dnsmasq-dns-76c747cbd5-ntrd9\" (UID: \"45d46867-5215-4ec7-a2a8-94c398de6a4f\") " pod="openstack/dnsmasq-dns-76c747cbd5-ntrd9" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.859845 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45d46867-5215-4ec7-a2a8-94c398de6a4f-dns-svc\") pod \"dnsmasq-dns-76c747cbd5-ntrd9\" (UID: \"45d46867-5215-4ec7-a2a8-94c398de6a4f\") " pod="openstack/dnsmasq-dns-76c747cbd5-ntrd9" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.860795 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45d46867-5215-4ec7-a2a8-94c398de6a4f-ovsdbserver-nb\") pod \"dnsmasq-dns-76c747cbd5-ntrd9\" (UID: \"45d46867-5215-4ec7-a2a8-94c398de6a4f\") " pod="openstack/dnsmasq-dns-76c747cbd5-ntrd9" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.861002 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45d46867-5215-4ec7-a2a8-94c398de6a4f-ovsdbserver-sb\") pod \"dnsmasq-dns-76c747cbd5-ntrd9\" (UID: \"45d46867-5215-4ec7-a2a8-94c398de6a4f\") " pod="openstack/dnsmasq-dns-76c747cbd5-ntrd9" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.861002 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45d46867-5215-4ec7-a2a8-94c398de6a4f-config\") pod \"dnsmasq-dns-76c747cbd5-ntrd9\" (UID: 
\"45d46867-5215-4ec7-a2a8-94c398de6a4f\") " pod="openstack/dnsmasq-dns-76c747cbd5-ntrd9" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.881659 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn6mn\" (UniqueName: \"kubernetes.io/projected/45d46867-5215-4ec7-a2a8-94c398de6a4f-kube-api-access-pn6mn\") pod \"dnsmasq-dns-76c747cbd5-ntrd9\" (UID: \"45d46867-5215-4ec7-a2a8-94c398de6a4f\") " pod="openstack/dnsmasq-dns-76c747cbd5-ntrd9" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.946596 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76c747cbd5-ntrd9" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.960609 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2298644a-6792-41dc-9bc5-994bdf1ff6cb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.960680 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2298644a-6792-41dc-9bc5-994bdf1ff6cb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.960771 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2298644a-6792-41dc-9bc5-994bdf1ff6cb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.960886 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljw86\" 
(UniqueName: \"kubernetes.io/projected/2298644a-6792-41dc-9bc5-994bdf1ff6cb-kube-api-access-ljw86\") pod \"glance-default-internal-api-0\" (UID: \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.960925 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2298644a-6792-41dc-9bc5-994bdf1ff6cb-logs\") pod \"glance-default-internal-api-0\" (UID: \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.960970 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2298644a-6792-41dc-9bc5-994bdf1ff6cb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.961482 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2298644a-6792-41dc-9bc5-994bdf1ff6cb-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.965642 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2298644a-6792-41dc-9bc5-994bdf1ff6cb-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.966565 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2298644a-6792-41dc-9bc5-994bdf1ff6cb-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.972979 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2298644a-6792-41dc-9bc5-994bdf1ff6cb-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:24 crc kubenswrapper[4909]: I0202 12:05:24.989825 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljw86\" (UniqueName: \"kubernetes.io/projected/2298644a-6792-41dc-9bc5-994bdf1ff6cb-kube-api-access-ljw86\") pod \"glance-default-internal-api-0\" (UID: \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:25 crc kubenswrapper[4909]: I0202 12:05:25.010601 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2298644a-6792-41dc-9bc5-994bdf1ff6cb-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:25 crc kubenswrapper[4909]: I0202 12:05:25.045446 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 12:05:25 crc kubenswrapper[4909]: I0202 12:05:25.449270 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 12:05:25 crc kubenswrapper[4909]: W0202 12:05:25.521598 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45d46867_5215_4ec7_a2a8_94c398de6a4f.slice/crio-3f90152365872ead39059b6ccf4c14a707b4499af781855043722deb3f491078 WatchSource:0}: Error finding container 3f90152365872ead39059b6ccf4c14a707b4499af781855043722deb3f491078: Status 404 returned error can't find the container with id 3f90152365872ead39059b6ccf4c14a707b4499af781855043722deb3f491078 Feb 02 12:05:25 crc kubenswrapper[4909]: I0202 12:05:25.522766 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76c747cbd5-ntrd9"] Feb 02 12:05:25 crc kubenswrapper[4909]: W0202 12:05:25.717114 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2298644a_6792_41dc_9bc5_994bdf1ff6cb.slice/crio-584d3ed97644833dae057037ccb0d212fba9976f7ae94607b25515ac24351889 WatchSource:0}: Error finding container 584d3ed97644833dae057037ccb0d212fba9976f7ae94607b25515ac24351889: Status 404 returned error can't find the container with id 584d3ed97644833dae057037ccb0d212fba9976f7ae94607b25515ac24351889 Feb 02 12:05:25 crc kubenswrapper[4909]: I0202 12:05:25.718147 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 12:05:25 crc kubenswrapper[4909]: I0202 12:05:25.924570 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 12:05:26 crc kubenswrapper[4909]: I0202 12:05:26.067052 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"a74b5c77-830a-402a-a0de-e1981ad4c293","Type":"ContainerStarted","Data":"66d8b782c84d9b707c4af864ff2cfb32d55bc98f116bc17f9175293317fc8eaa"} Feb 02 12:05:26 crc kubenswrapper[4909]: I0202 12:05:26.081058 4909 generic.go:334] "Generic (PLEG): container finished" podID="45d46867-5215-4ec7-a2a8-94c398de6a4f" containerID="8042a24e72598aec2239bbac4c00396a9f4536978eb8540a3b7e2dc17283efc9" exitCode=0 Feb 02 12:05:26 crc kubenswrapper[4909]: I0202 12:05:26.081144 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76c747cbd5-ntrd9" event={"ID":"45d46867-5215-4ec7-a2a8-94c398de6a4f","Type":"ContainerDied","Data":"8042a24e72598aec2239bbac4c00396a9f4536978eb8540a3b7e2dc17283efc9"} Feb 02 12:05:26 crc kubenswrapper[4909]: I0202 12:05:26.081170 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76c747cbd5-ntrd9" event={"ID":"45d46867-5215-4ec7-a2a8-94c398de6a4f","Type":"ContainerStarted","Data":"3f90152365872ead39059b6ccf4c14a707b4499af781855043722deb3f491078"} Feb 02 12:05:26 crc kubenswrapper[4909]: I0202 12:05:26.112678 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2298644a-6792-41dc-9bc5-994bdf1ff6cb","Type":"ContainerStarted","Data":"584d3ed97644833dae057037ccb0d212fba9976f7ae94607b25515ac24351889"} Feb 02 12:05:27 crc kubenswrapper[4909]: I0202 12:05:27.145525 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a74b5c77-830a-402a-a0de-e1981ad4c293","Type":"ContainerStarted","Data":"7ee871ec3af51a12f24f4165644d36c7ec74d776b28b7fbcc71ebffc27f7ec1d"} Feb 02 12:05:27 crc kubenswrapper[4909]: I0202 12:05:27.146056 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a74b5c77-830a-402a-a0de-e1981ad4c293","Type":"ContainerStarted","Data":"c3f3adfebdef21f4247659ddb5d058a360fbc57e1ceb0a42573021e9e0a4613d"} Feb 02 12:05:27 
crc kubenswrapper[4909]: I0202 12:05:27.145861 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a74b5c77-830a-402a-a0de-e1981ad4c293" containerName="glance-log" containerID="cri-o://c3f3adfebdef21f4247659ddb5d058a360fbc57e1ceb0a42573021e9e0a4613d" gracePeriod=30 Feb 02 12:05:27 crc kubenswrapper[4909]: I0202 12:05:27.146394 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="a74b5c77-830a-402a-a0de-e1981ad4c293" containerName="glance-httpd" containerID="cri-o://7ee871ec3af51a12f24f4165644d36c7ec74d776b28b7fbcc71ebffc27f7ec1d" gracePeriod=30 Feb 02 12:05:27 crc kubenswrapper[4909]: I0202 12:05:27.149152 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2298644a-6792-41dc-9bc5-994bdf1ff6cb","Type":"ContainerStarted","Data":"9e0d1cc6cf5e137bd59410476db9b2812fedf4d81da0c2e5d618f47a98a981cd"} Feb 02 12:05:27 crc kubenswrapper[4909]: I0202 12:05:27.150973 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76c747cbd5-ntrd9" event={"ID":"45d46867-5215-4ec7-a2a8-94c398de6a4f","Type":"ContainerStarted","Data":"b19fb7db4717b78c4d3b4a5128835e2eb236b26f7f191dffe4b397a9a5bd852f"} Feb 02 12:05:27 crc kubenswrapper[4909]: I0202 12:05:27.151961 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76c747cbd5-ntrd9" Feb 02 12:05:27 crc kubenswrapper[4909]: I0202 12:05:27.178755 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.1787334449999998 podStartE2EDuration="3.178733445s" podCreationTimestamp="2026-02-02 12:05:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:05:27.166942681 +0000 UTC 
m=+5652.913043436" watchObservedRunningTime="2026-02-02 12:05:27.178733445 +0000 UTC m=+5652.924834180" Feb 02 12:05:27 crc kubenswrapper[4909]: I0202 12:05:27.198339 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76c747cbd5-ntrd9" podStartSLOduration=3.198319131 podStartE2EDuration="3.198319131s" podCreationTimestamp="2026-02-02 12:05:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:05:27.191401095 +0000 UTC m=+5652.937501850" watchObservedRunningTime="2026-02-02 12:05:27.198319131 +0000 UTC m=+5652.944419866" Feb 02 12:05:27 crc kubenswrapper[4909]: I0202 12:05:27.386691 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 12:05:27 crc kubenswrapper[4909]: I0202 12:05:27.908255 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.026949 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a74b5c77-830a-402a-a0de-e1981ad4c293-combined-ca-bundle\") pod \"a74b5c77-830a-402a-a0de-e1981ad4c293\" (UID: \"a74b5c77-830a-402a-a0de-e1981ad4c293\") " Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.027182 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a74b5c77-830a-402a-a0de-e1981ad4c293-config-data\") pod \"a74b5c77-830a-402a-a0de-e1981ad4c293\" (UID: \"a74b5c77-830a-402a-a0de-e1981ad4c293\") " Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.027289 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a74b5c77-830a-402a-a0de-e1981ad4c293-httpd-run\") pod 
\"a74b5c77-830a-402a-a0de-e1981ad4c293\" (UID: \"a74b5c77-830a-402a-a0de-e1981ad4c293\") " Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.027332 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a74b5c77-830a-402a-a0de-e1981ad4c293-logs\") pod \"a74b5c77-830a-402a-a0de-e1981ad4c293\" (UID: \"a74b5c77-830a-402a-a0de-e1981ad4c293\") " Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.027391 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fx22p\" (UniqueName: \"kubernetes.io/projected/a74b5c77-830a-402a-a0de-e1981ad4c293-kube-api-access-fx22p\") pod \"a74b5c77-830a-402a-a0de-e1981ad4c293\" (UID: \"a74b5c77-830a-402a-a0de-e1981ad4c293\") " Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.027464 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a74b5c77-830a-402a-a0de-e1981ad4c293-scripts\") pod \"a74b5c77-830a-402a-a0de-e1981ad4c293\" (UID: \"a74b5c77-830a-402a-a0de-e1981ad4c293\") " Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.028972 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a74b5c77-830a-402a-a0de-e1981ad4c293-logs" (OuterVolumeSpecName: "logs") pod "a74b5c77-830a-402a-a0de-e1981ad4c293" (UID: "a74b5c77-830a-402a-a0de-e1981ad4c293"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.028995 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a74b5c77-830a-402a-a0de-e1981ad4c293-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a74b5c77-830a-402a-a0de-e1981ad4c293" (UID: "a74b5c77-830a-402a-a0de-e1981ad4c293"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.035984 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a74b5c77-830a-402a-a0de-e1981ad4c293-scripts" (OuterVolumeSpecName: "scripts") pod "a74b5c77-830a-402a-a0de-e1981ad4c293" (UID: "a74b5c77-830a-402a-a0de-e1981ad4c293"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.040864 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a74b5c77-830a-402a-a0de-e1981ad4c293-kube-api-access-fx22p" (OuterVolumeSpecName: "kube-api-access-fx22p") pod "a74b5c77-830a-402a-a0de-e1981ad4c293" (UID: "a74b5c77-830a-402a-a0de-e1981ad4c293"). InnerVolumeSpecName "kube-api-access-fx22p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.060568 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a74b5c77-830a-402a-a0de-e1981ad4c293-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a74b5c77-830a-402a-a0de-e1981ad4c293" (UID: "a74b5c77-830a-402a-a0de-e1981ad4c293"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.099146 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a74b5c77-830a-402a-a0de-e1981ad4c293-config-data" (OuterVolumeSpecName: "config-data") pod "a74b5c77-830a-402a-a0de-e1981ad4c293" (UID: "a74b5c77-830a-402a-a0de-e1981ad4c293"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.132140 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a74b5c77-830a-402a-a0de-e1981ad4c293-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.132174 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a74b5c77-830a-402a-a0de-e1981ad4c293-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.132183 4909 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a74b5c77-830a-402a-a0de-e1981ad4c293-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.132193 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a74b5c77-830a-402a-a0de-e1981ad4c293-logs\") on node \"crc\" DevicePath \"\"" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.132202 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fx22p\" (UniqueName: \"kubernetes.io/projected/a74b5c77-830a-402a-a0de-e1981ad4c293-kube-api-access-fx22p\") on node \"crc\" DevicePath \"\"" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.132211 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a74b5c77-830a-402a-a0de-e1981ad4c293-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.162484 4909 generic.go:334] "Generic (PLEG): container finished" podID="a74b5c77-830a-402a-a0de-e1981ad4c293" containerID="7ee871ec3af51a12f24f4165644d36c7ec74d776b28b7fbcc71ebffc27f7ec1d" exitCode=143 Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.162529 4909 generic.go:334] "Generic (PLEG): container 
finished" podID="a74b5c77-830a-402a-a0de-e1981ad4c293" containerID="c3f3adfebdef21f4247659ddb5d058a360fbc57e1ceb0a42573021e9e0a4613d" exitCode=143 Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.162535 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a74b5c77-830a-402a-a0de-e1981ad4c293","Type":"ContainerDied","Data":"7ee871ec3af51a12f24f4165644d36c7ec74d776b28b7fbcc71ebffc27f7ec1d"} Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.162591 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a74b5c77-830a-402a-a0de-e1981ad4c293","Type":"ContainerDied","Data":"c3f3adfebdef21f4247659ddb5d058a360fbc57e1ceb0a42573021e9e0a4613d"} Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.162603 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a74b5c77-830a-402a-a0de-e1981ad4c293","Type":"ContainerDied","Data":"66d8b782c84d9b707c4af864ff2cfb32d55bc98f116bc17f9175293317fc8eaa"} Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.162620 4909 scope.go:117] "RemoveContainer" containerID="7ee871ec3af51a12f24f4165644d36c7ec74d776b28b7fbcc71ebffc27f7ec1d" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.162561 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.166662 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2298644a-6792-41dc-9bc5-994bdf1ff6cb","Type":"ContainerStarted","Data":"da1e1bef129bd3d3cb676ddaa897a665280853742f5af02dbe29a94ad296533f"} Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.194629 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.194602459 podStartE2EDuration="4.194602459s" podCreationTimestamp="2026-02-02 12:05:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:05:28.192978533 +0000 UTC m=+5653.939079258" watchObservedRunningTime="2026-02-02 12:05:28.194602459 +0000 UTC m=+5653.940703184" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.212087 4909 scope.go:117] "RemoveContainer" containerID="c3f3adfebdef21f4247659ddb5d058a360fbc57e1ceb0a42573021e9e0a4613d" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.218251 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.232202 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.241696 4909 scope.go:117] "RemoveContainer" containerID="7ee871ec3af51a12f24f4165644d36c7ec74d776b28b7fbcc71ebffc27f7ec1d" Feb 02 12:05:28 crc kubenswrapper[4909]: E0202 12:05:28.242250 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ee871ec3af51a12f24f4165644d36c7ec74d776b28b7fbcc71ebffc27f7ec1d\": container with ID starting with 
7ee871ec3af51a12f24f4165644d36c7ec74d776b28b7fbcc71ebffc27f7ec1d not found: ID does not exist" containerID="7ee871ec3af51a12f24f4165644d36c7ec74d776b28b7fbcc71ebffc27f7ec1d" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.242340 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ee871ec3af51a12f24f4165644d36c7ec74d776b28b7fbcc71ebffc27f7ec1d"} err="failed to get container status \"7ee871ec3af51a12f24f4165644d36c7ec74d776b28b7fbcc71ebffc27f7ec1d\": rpc error: code = NotFound desc = could not find container \"7ee871ec3af51a12f24f4165644d36c7ec74d776b28b7fbcc71ebffc27f7ec1d\": container with ID starting with 7ee871ec3af51a12f24f4165644d36c7ec74d776b28b7fbcc71ebffc27f7ec1d not found: ID does not exist" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.242405 4909 scope.go:117] "RemoveContainer" containerID="c3f3adfebdef21f4247659ddb5d058a360fbc57e1ceb0a42573021e9e0a4613d" Feb 02 12:05:28 crc kubenswrapper[4909]: E0202 12:05:28.243092 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3f3adfebdef21f4247659ddb5d058a360fbc57e1ceb0a42573021e9e0a4613d\": container with ID starting with c3f3adfebdef21f4247659ddb5d058a360fbc57e1ceb0a42573021e9e0a4613d not found: ID does not exist" containerID="c3f3adfebdef21f4247659ddb5d058a360fbc57e1ceb0a42573021e9e0a4613d" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.243173 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3f3adfebdef21f4247659ddb5d058a360fbc57e1ceb0a42573021e9e0a4613d"} err="failed to get container status \"c3f3adfebdef21f4247659ddb5d058a360fbc57e1ceb0a42573021e9e0a4613d\": rpc error: code = NotFound desc = could not find container \"c3f3adfebdef21f4247659ddb5d058a360fbc57e1ceb0a42573021e9e0a4613d\": container with ID starting with c3f3adfebdef21f4247659ddb5d058a360fbc57e1ceb0a42573021e9e0a4613d not found: ID does not 
exist" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.243205 4909 scope.go:117] "RemoveContainer" containerID="7ee871ec3af51a12f24f4165644d36c7ec74d776b28b7fbcc71ebffc27f7ec1d" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.243511 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ee871ec3af51a12f24f4165644d36c7ec74d776b28b7fbcc71ebffc27f7ec1d"} err="failed to get container status \"7ee871ec3af51a12f24f4165644d36c7ec74d776b28b7fbcc71ebffc27f7ec1d\": rpc error: code = NotFound desc = could not find container \"7ee871ec3af51a12f24f4165644d36c7ec74d776b28b7fbcc71ebffc27f7ec1d\": container with ID starting with 7ee871ec3af51a12f24f4165644d36c7ec74d776b28b7fbcc71ebffc27f7ec1d not found: ID does not exist" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.243574 4909 scope.go:117] "RemoveContainer" containerID="c3f3adfebdef21f4247659ddb5d058a360fbc57e1ceb0a42573021e9e0a4613d" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.244169 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3f3adfebdef21f4247659ddb5d058a360fbc57e1ceb0a42573021e9e0a4613d"} err="failed to get container status \"c3f3adfebdef21f4247659ddb5d058a360fbc57e1ceb0a42573021e9e0a4613d\": rpc error: code = NotFound desc = could not find container \"c3f3adfebdef21f4247659ddb5d058a360fbc57e1ceb0a42573021e9e0a4613d\": container with ID starting with c3f3adfebdef21f4247659ddb5d058a360fbc57e1ceb0a42573021e9e0a4613d not found: ID does not exist" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.247251 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 12:05:28 crc kubenswrapper[4909]: E0202 12:05:28.247615 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a74b5c77-830a-402a-a0de-e1981ad4c293" containerName="glance-log" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.247632 4909 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="a74b5c77-830a-402a-a0de-e1981ad4c293" containerName="glance-log" Feb 02 12:05:28 crc kubenswrapper[4909]: E0202 12:05:28.247644 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a74b5c77-830a-402a-a0de-e1981ad4c293" containerName="glance-httpd" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.247650 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a74b5c77-830a-402a-a0de-e1981ad4c293" containerName="glance-httpd" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.247873 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a74b5c77-830a-402a-a0de-e1981ad4c293" containerName="glance-httpd" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.247900 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a74b5c77-830a-402a-a0de-e1981ad4c293" containerName="glance-log" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.248780 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.250826 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.250993 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.310948 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.334751 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpcfq\" (UniqueName: \"kubernetes.io/projected/5dc8b554-88ed-470a-b46a-3c0f611f75a8-kube-api-access-hpcfq\") pod \"glance-default-external-api-0\" (UID: \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.334902 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dc8b554-88ed-470a-b46a-3c0f611f75a8-logs\") pod \"glance-default-external-api-0\" (UID: \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.334969 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dc8b554-88ed-470a-b46a-3c0f611f75a8-scripts\") pod \"glance-default-external-api-0\" (UID: \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.334994 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5dc8b554-88ed-470a-b46a-3c0f611f75a8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.335057 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dc8b554-88ed-470a-b46a-3c0f611f75a8-config-data\") pod \"glance-default-external-api-0\" (UID: \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.335098 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dc8b554-88ed-470a-b46a-3c0f611f75a8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.335141 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5dc8b554-88ed-470a-b46a-3c0f611f75a8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.439780 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5dc8b554-88ed-470a-b46a-3c0f611f75a8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.440282 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpcfq\" (UniqueName: 
\"kubernetes.io/projected/5dc8b554-88ed-470a-b46a-3c0f611f75a8-kube-api-access-hpcfq\") pod \"glance-default-external-api-0\" (UID: \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.440424 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5dc8b554-88ed-470a-b46a-3c0f611f75a8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.443016 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dc8b554-88ed-470a-b46a-3c0f611f75a8-logs\") pod \"glance-default-external-api-0\" (UID: \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.443160 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dc8b554-88ed-470a-b46a-3c0f611f75a8-scripts\") pod \"glance-default-external-api-0\" (UID: \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.443208 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dc8b554-88ed-470a-b46a-3c0f611f75a8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.443324 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dc8b554-88ed-470a-b46a-3c0f611f75a8-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.443396 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dc8b554-88ed-470a-b46a-3c0f611f75a8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.445657 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dc8b554-88ed-470a-b46a-3c0f611f75a8-logs\") pod \"glance-default-external-api-0\" (UID: \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.448251 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dc8b554-88ed-470a-b46a-3c0f611f75a8-scripts\") pod \"glance-default-external-api-0\" (UID: \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.448366 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dc8b554-88ed-470a-b46a-3c0f611f75a8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.448683 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dc8b554-88ed-470a-b46a-3c0f611f75a8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\") " 
pod="openstack/glance-default-external-api-0" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.450019 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dc8b554-88ed-470a-b46a-3c0f611f75a8-config-data\") pod \"glance-default-external-api-0\" (UID: \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.459730 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpcfq\" (UniqueName: \"kubernetes.io/projected/5dc8b554-88ed-470a-b46a-3c0f611f75a8-kube-api-access-hpcfq\") pod \"glance-default-external-api-0\" (UID: \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\") " pod="openstack/glance-default-external-api-0" Feb 02 12:05:28 crc kubenswrapper[4909]: I0202 12:05:28.640002 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 12:05:29 crc kubenswrapper[4909]: I0202 12:05:29.031336 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a74b5c77-830a-402a-a0de-e1981ad4c293" path="/var/lib/kubelet/pods/a74b5c77-830a-402a-a0de-e1981ad4c293/volumes" Feb 02 12:05:29 crc kubenswrapper[4909]: I0202 12:05:29.119068 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-48hqz"] Feb 02 12:05:29 crc kubenswrapper[4909]: I0202 12:05:29.122716 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-48hqz" Feb 02 12:05:29 crc kubenswrapper[4909]: I0202 12:05:29.137264 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-48hqz"] Feb 02 12:05:29 crc kubenswrapper[4909]: I0202 12:05:29.193997 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2298644a-6792-41dc-9bc5-994bdf1ff6cb" containerName="glance-log" containerID="cri-o://9e0d1cc6cf5e137bd59410476db9b2812fedf4d81da0c2e5d618f47a98a981cd" gracePeriod=30 Feb 02 12:05:29 crc kubenswrapper[4909]: I0202 12:05:29.194047 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2298644a-6792-41dc-9bc5-994bdf1ff6cb" containerName="glance-httpd" containerID="cri-o://da1e1bef129bd3d3cb676ddaa897a665280853742f5af02dbe29a94ad296533f" gracePeriod=30 Feb 02 12:05:29 crc kubenswrapper[4909]: I0202 12:05:29.203772 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 12:05:29 crc kubenswrapper[4909]: W0202 12:05:29.208639 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5dc8b554_88ed_470a_b46a_3c0f611f75a8.slice/crio-eba133e8c2a6f7a0ff6d9a6faaf3cc08378656e4600394dfc8bf8b8ae440c122 WatchSource:0}: Error finding container eba133e8c2a6f7a0ff6d9a6faaf3cc08378656e4600394dfc8bf8b8ae440c122: Status 404 returned error can't find the container with id eba133e8c2a6f7a0ff6d9a6faaf3cc08378656e4600394dfc8bf8b8ae440c122 Feb 02 12:05:29 crc kubenswrapper[4909]: I0202 12:05:29.261246 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c8c1618-4fb8-4368-ab48-d57e0c23b55f-catalog-content\") pod \"redhat-marketplace-48hqz\" (UID: 
\"5c8c1618-4fb8-4368-ab48-d57e0c23b55f\") " pod="openshift-marketplace/redhat-marketplace-48hqz" Feb 02 12:05:29 crc kubenswrapper[4909]: I0202 12:05:29.261371 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c8c1618-4fb8-4368-ab48-d57e0c23b55f-utilities\") pod \"redhat-marketplace-48hqz\" (UID: \"5c8c1618-4fb8-4368-ab48-d57e0c23b55f\") " pod="openshift-marketplace/redhat-marketplace-48hqz" Feb 02 12:05:29 crc kubenswrapper[4909]: I0202 12:05:29.261517 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phlt5\" (UniqueName: \"kubernetes.io/projected/5c8c1618-4fb8-4368-ab48-d57e0c23b55f-kube-api-access-phlt5\") pod \"redhat-marketplace-48hqz\" (UID: \"5c8c1618-4fb8-4368-ab48-d57e0c23b55f\") " pod="openshift-marketplace/redhat-marketplace-48hqz" Feb 02 12:05:29 crc kubenswrapper[4909]: I0202 12:05:29.362800 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phlt5\" (UniqueName: \"kubernetes.io/projected/5c8c1618-4fb8-4368-ab48-d57e0c23b55f-kube-api-access-phlt5\") pod \"redhat-marketplace-48hqz\" (UID: \"5c8c1618-4fb8-4368-ab48-d57e0c23b55f\") " pod="openshift-marketplace/redhat-marketplace-48hqz" Feb 02 12:05:29 crc kubenswrapper[4909]: I0202 12:05:29.363005 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c8c1618-4fb8-4368-ab48-d57e0c23b55f-catalog-content\") pod \"redhat-marketplace-48hqz\" (UID: \"5c8c1618-4fb8-4368-ab48-d57e0c23b55f\") " pod="openshift-marketplace/redhat-marketplace-48hqz" Feb 02 12:05:29 crc kubenswrapper[4909]: I0202 12:05:29.363133 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c8c1618-4fb8-4368-ab48-d57e0c23b55f-utilities\") pod 
\"redhat-marketplace-48hqz\" (UID: \"5c8c1618-4fb8-4368-ab48-d57e0c23b55f\") " pod="openshift-marketplace/redhat-marketplace-48hqz" Feb 02 12:05:29 crc kubenswrapper[4909]: I0202 12:05:29.363578 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c8c1618-4fb8-4368-ab48-d57e0c23b55f-catalog-content\") pod \"redhat-marketplace-48hqz\" (UID: \"5c8c1618-4fb8-4368-ab48-d57e0c23b55f\") " pod="openshift-marketplace/redhat-marketplace-48hqz" Feb 02 12:05:29 crc kubenswrapper[4909]: I0202 12:05:29.363833 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c8c1618-4fb8-4368-ab48-d57e0c23b55f-utilities\") pod \"redhat-marketplace-48hqz\" (UID: \"5c8c1618-4fb8-4368-ab48-d57e0c23b55f\") " pod="openshift-marketplace/redhat-marketplace-48hqz" Feb 02 12:05:29 crc kubenswrapper[4909]: I0202 12:05:29.383056 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phlt5\" (UniqueName: \"kubernetes.io/projected/5c8c1618-4fb8-4368-ab48-d57e0c23b55f-kube-api-access-phlt5\") pod \"redhat-marketplace-48hqz\" (UID: \"5c8c1618-4fb8-4368-ab48-d57e0c23b55f\") " pod="openshift-marketplace/redhat-marketplace-48hqz" Feb 02 12:05:29 crc kubenswrapper[4909]: I0202 12:05:29.464255 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-48hqz" Feb 02 12:05:29 crc kubenswrapper[4909]: I0202 12:05:29.998829 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.089170 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2298644a-6792-41dc-9bc5-994bdf1ff6cb-logs\") pod \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\" (UID: \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\") " Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.089263 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2298644a-6792-41dc-9bc5-994bdf1ff6cb-combined-ca-bundle\") pod \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\" (UID: \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\") " Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.089406 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2298644a-6792-41dc-9bc5-994bdf1ff6cb-httpd-run\") pod \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\" (UID: \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\") " Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.089459 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2298644a-6792-41dc-9bc5-994bdf1ff6cb-scripts\") pod \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\" (UID: \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\") " Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.089511 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2298644a-6792-41dc-9bc5-994bdf1ff6cb-config-data\") pod \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\" (UID: \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\") " Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.089540 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljw86\" (UniqueName: 
\"kubernetes.io/projected/2298644a-6792-41dc-9bc5-994bdf1ff6cb-kube-api-access-ljw86\") pod \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\" (UID: \"2298644a-6792-41dc-9bc5-994bdf1ff6cb\") " Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.090664 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2298644a-6792-41dc-9bc5-994bdf1ff6cb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2298644a-6792-41dc-9bc5-994bdf1ff6cb" (UID: "2298644a-6792-41dc-9bc5-994bdf1ff6cb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.090835 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2298644a-6792-41dc-9bc5-994bdf1ff6cb-logs" (OuterVolumeSpecName: "logs") pod "2298644a-6792-41dc-9bc5-994bdf1ff6cb" (UID: "2298644a-6792-41dc-9bc5-994bdf1ff6cb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.094108 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2298644a-6792-41dc-9bc5-994bdf1ff6cb-kube-api-access-ljw86" (OuterVolumeSpecName: "kube-api-access-ljw86") pod "2298644a-6792-41dc-9bc5-994bdf1ff6cb" (UID: "2298644a-6792-41dc-9bc5-994bdf1ff6cb"). InnerVolumeSpecName "kube-api-access-ljw86". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.094622 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2298644a-6792-41dc-9bc5-994bdf1ff6cb-scripts" (OuterVolumeSpecName: "scripts") pod "2298644a-6792-41dc-9bc5-994bdf1ff6cb" (UID: "2298644a-6792-41dc-9bc5-994bdf1ff6cb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.125896 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2298644a-6792-41dc-9bc5-994bdf1ff6cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2298644a-6792-41dc-9bc5-994bdf1ff6cb" (UID: "2298644a-6792-41dc-9bc5-994bdf1ff6cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.132628 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-48hqz"] Feb 02 12:05:30 crc kubenswrapper[4909]: W0202 12:05:30.137348 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c8c1618_4fb8_4368_ab48_d57e0c23b55f.slice/crio-2199cdff3c608c25718d64e4ee4208b16cae9e586c5ba486cc75acb66903365a WatchSource:0}: Error finding container 2199cdff3c608c25718d64e4ee4208b16cae9e586c5ba486cc75acb66903365a: Status 404 returned error can't find the container with id 2199cdff3c608c25718d64e4ee4208b16cae9e586c5ba486cc75acb66903365a Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.146549 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2298644a-6792-41dc-9bc5-994bdf1ff6cb-config-data" (OuterVolumeSpecName: "config-data") pod "2298644a-6792-41dc-9bc5-994bdf1ff6cb" (UID: "2298644a-6792-41dc-9bc5-994bdf1ff6cb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.193697 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljw86\" (UniqueName: \"kubernetes.io/projected/2298644a-6792-41dc-9bc5-994bdf1ff6cb-kube-api-access-ljw86\") on node \"crc\" DevicePath \"\"" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.193882 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2298644a-6792-41dc-9bc5-994bdf1ff6cb-logs\") on node \"crc\" DevicePath \"\"" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.193968 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2298644a-6792-41dc-9bc5-994bdf1ff6cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.194050 4909 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2298644a-6792-41dc-9bc5-994bdf1ff6cb-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.194127 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2298644a-6792-41dc-9bc5-994bdf1ff6cb-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.194199 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2298644a-6792-41dc-9bc5-994bdf1ff6cb-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.213860 4909 generic.go:334] "Generic (PLEG): container finished" podID="2298644a-6792-41dc-9bc5-994bdf1ff6cb" containerID="da1e1bef129bd3d3cb676ddaa897a665280853742f5af02dbe29a94ad296533f" exitCode=0 Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.214642 4909 generic.go:334] "Generic (PLEG): container 
finished" podID="2298644a-6792-41dc-9bc5-994bdf1ff6cb" containerID="9e0d1cc6cf5e137bd59410476db9b2812fedf4d81da0c2e5d618f47a98a981cd" exitCode=143 Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.213979 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.213936 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2298644a-6792-41dc-9bc5-994bdf1ff6cb","Type":"ContainerDied","Data":"da1e1bef129bd3d3cb676ddaa897a665280853742f5af02dbe29a94ad296533f"} Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.215109 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2298644a-6792-41dc-9bc5-994bdf1ff6cb","Type":"ContainerDied","Data":"9e0d1cc6cf5e137bd59410476db9b2812fedf4d81da0c2e5d618f47a98a981cd"} Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.215168 4909 scope.go:117] "RemoveContainer" containerID="da1e1bef129bd3d3cb676ddaa897a665280853742f5af02dbe29a94ad296533f" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.215134 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2298644a-6792-41dc-9bc5-994bdf1ff6cb","Type":"ContainerDied","Data":"584d3ed97644833dae057037ccb0d212fba9976f7ae94607b25515ac24351889"} Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.220202 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5dc8b554-88ed-470a-b46a-3c0f611f75a8","Type":"ContainerStarted","Data":"96f4ccc6f4ef62d04ab2978fcd1557cbd0b70ab96e442bdeae7648989d511a40"} Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.220266 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"5dc8b554-88ed-470a-b46a-3c0f611f75a8","Type":"ContainerStarted","Data":"eba133e8c2a6f7a0ff6d9a6faaf3cc08378656e4600394dfc8bf8b8ae440c122"} Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.221476 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48hqz" event={"ID":"5c8c1618-4fb8-4368-ab48-d57e0c23b55f","Type":"ContainerStarted","Data":"2199cdff3c608c25718d64e4ee4208b16cae9e586c5ba486cc75acb66903365a"} Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.247643 4909 scope.go:117] "RemoveContainer" containerID="9e0d1cc6cf5e137bd59410476db9b2812fedf4d81da0c2e5d618f47a98a981cd" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.271768 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.288653 4909 scope.go:117] "RemoveContainer" containerID="da1e1bef129bd3d3cb676ddaa897a665280853742f5af02dbe29a94ad296533f" Feb 02 12:05:30 crc kubenswrapper[4909]: E0202 12:05:30.292282 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da1e1bef129bd3d3cb676ddaa897a665280853742f5af02dbe29a94ad296533f\": container with ID starting with da1e1bef129bd3d3cb676ddaa897a665280853742f5af02dbe29a94ad296533f not found: ID does not exist" containerID="da1e1bef129bd3d3cb676ddaa897a665280853742f5af02dbe29a94ad296533f" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.292329 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da1e1bef129bd3d3cb676ddaa897a665280853742f5af02dbe29a94ad296533f"} err="failed to get container status \"da1e1bef129bd3d3cb676ddaa897a665280853742f5af02dbe29a94ad296533f\": rpc error: code = NotFound desc = could not find container \"da1e1bef129bd3d3cb676ddaa897a665280853742f5af02dbe29a94ad296533f\": container with ID starting with 
da1e1bef129bd3d3cb676ddaa897a665280853742f5af02dbe29a94ad296533f not found: ID does not exist" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.292356 4909 scope.go:117] "RemoveContainer" containerID="9e0d1cc6cf5e137bd59410476db9b2812fedf4d81da0c2e5d618f47a98a981cd" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.294548 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 12:05:30 crc kubenswrapper[4909]: E0202 12:05:30.296009 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e0d1cc6cf5e137bd59410476db9b2812fedf4d81da0c2e5d618f47a98a981cd\": container with ID starting with 9e0d1cc6cf5e137bd59410476db9b2812fedf4d81da0c2e5d618f47a98a981cd not found: ID does not exist" containerID="9e0d1cc6cf5e137bd59410476db9b2812fedf4d81da0c2e5d618f47a98a981cd" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.296058 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e0d1cc6cf5e137bd59410476db9b2812fedf4d81da0c2e5d618f47a98a981cd"} err="failed to get container status \"9e0d1cc6cf5e137bd59410476db9b2812fedf4d81da0c2e5d618f47a98a981cd\": rpc error: code = NotFound desc = could not find container \"9e0d1cc6cf5e137bd59410476db9b2812fedf4d81da0c2e5d618f47a98a981cd\": container with ID starting with 9e0d1cc6cf5e137bd59410476db9b2812fedf4d81da0c2e5d618f47a98a981cd not found: ID does not exist" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.296090 4909 scope.go:117] "RemoveContainer" containerID="da1e1bef129bd3d3cb676ddaa897a665280853742f5af02dbe29a94ad296533f" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.296423 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da1e1bef129bd3d3cb676ddaa897a665280853742f5af02dbe29a94ad296533f"} err="failed to get container status 
\"da1e1bef129bd3d3cb676ddaa897a665280853742f5af02dbe29a94ad296533f\": rpc error: code = NotFound desc = could not find container \"da1e1bef129bd3d3cb676ddaa897a665280853742f5af02dbe29a94ad296533f\": container with ID starting with da1e1bef129bd3d3cb676ddaa897a665280853742f5af02dbe29a94ad296533f not found: ID does not exist" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.296467 4909 scope.go:117] "RemoveContainer" containerID="9e0d1cc6cf5e137bd59410476db9b2812fedf4d81da0c2e5d618f47a98a981cd" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.296887 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e0d1cc6cf5e137bd59410476db9b2812fedf4d81da0c2e5d618f47a98a981cd"} err="failed to get container status \"9e0d1cc6cf5e137bd59410476db9b2812fedf4d81da0c2e5d618f47a98a981cd\": rpc error: code = NotFound desc = could not find container \"9e0d1cc6cf5e137bd59410476db9b2812fedf4d81da0c2e5d618f47a98a981cd\": container with ID starting with 9e0d1cc6cf5e137bd59410476db9b2812fedf4d81da0c2e5d618f47a98a981cd not found: ID does not exist" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.312018 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 12:05:30 crc kubenswrapper[4909]: E0202 12:05:30.312362 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2298644a-6792-41dc-9bc5-994bdf1ff6cb" containerName="glance-httpd" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.312381 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2298644a-6792-41dc-9bc5-994bdf1ff6cb" containerName="glance-httpd" Feb 02 12:05:30 crc kubenswrapper[4909]: E0202 12:05:30.312405 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2298644a-6792-41dc-9bc5-994bdf1ff6cb" containerName="glance-log" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.312412 4909 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2298644a-6792-41dc-9bc5-994bdf1ff6cb" containerName="glance-log" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.312552 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2298644a-6792-41dc-9bc5-994bdf1ff6cb" containerName="glance-httpd" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.312577 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2298644a-6792-41dc-9bc5-994bdf1ff6cb" containerName="glance-log" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.313617 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.321745 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.322089 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.322143 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.498021 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e06882f-c657-49fc-a5a1-0657121844f9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3e06882f-c657-49fc-a5a1-0657121844f9\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.498319 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e06882f-c657-49fc-a5a1-0657121844f9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3e06882f-c657-49fc-a5a1-0657121844f9\") " pod="openstack/glance-default-internal-api-0" Feb 02 
12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.498390 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e06882f-c657-49fc-a5a1-0657121844f9-logs\") pod \"glance-default-internal-api-0\" (UID: \"3e06882f-c657-49fc-a5a1-0657121844f9\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.498416 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e06882f-c657-49fc-a5a1-0657121844f9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3e06882f-c657-49fc-a5a1-0657121844f9\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.498432 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e06882f-c657-49fc-a5a1-0657121844f9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3e06882f-c657-49fc-a5a1-0657121844f9\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.498458 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfckc\" (UniqueName: \"kubernetes.io/projected/3e06882f-c657-49fc-a5a1-0657121844f9-kube-api-access-vfckc\") pod \"glance-default-internal-api-0\" (UID: \"3e06882f-c657-49fc-a5a1-0657121844f9\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.498739 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e06882f-c657-49fc-a5a1-0657121844f9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3e06882f-c657-49fc-a5a1-0657121844f9\") " 
pod="openstack/glance-default-internal-api-0" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.600786 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e06882f-c657-49fc-a5a1-0657121844f9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3e06882f-c657-49fc-a5a1-0657121844f9\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.600958 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e06882f-c657-49fc-a5a1-0657121844f9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3e06882f-c657-49fc-a5a1-0657121844f9\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.601021 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e06882f-c657-49fc-a5a1-0657121844f9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3e06882f-c657-49fc-a5a1-0657121844f9\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.601164 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e06882f-c657-49fc-a5a1-0657121844f9-logs\") pod \"glance-default-internal-api-0\" (UID: \"3e06882f-c657-49fc-a5a1-0657121844f9\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.601205 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e06882f-c657-49fc-a5a1-0657121844f9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3e06882f-c657-49fc-a5a1-0657121844f9\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:30 crc 
kubenswrapper[4909]: I0202 12:05:30.601224 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e06882f-c657-49fc-a5a1-0657121844f9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3e06882f-c657-49fc-a5a1-0657121844f9\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.601260 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfckc\" (UniqueName: \"kubernetes.io/projected/3e06882f-c657-49fc-a5a1-0657121844f9-kube-api-access-vfckc\") pod \"glance-default-internal-api-0\" (UID: \"3e06882f-c657-49fc-a5a1-0657121844f9\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.602413 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e06882f-c657-49fc-a5a1-0657121844f9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3e06882f-c657-49fc-a5a1-0657121844f9\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.602433 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e06882f-c657-49fc-a5a1-0657121844f9-logs\") pod \"glance-default-internal-api-0\" (UID: \"3e06882f-c657-49fc-a5a1-0657121844f9\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.605858 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e06882f-c657-49fc-a5a1-0657121844f9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3e06882f-c657-49fc-a5a1-0657121844f9\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.606919 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e06882f-c657-49fc-a5a1-0657121844f9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3e06882f-c657-49fc-a5a1-0657121844f9\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.607396 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e06882f-c657-49fc-a5a1-0657121844f9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3e06882f-c657-49fc-a5a1-0657121844f9\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.610285 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e06882f-c657-49fc-a5a1-0657121844f9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3e06882f-c657-49fc-a5a1-0657121844f9\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.618612 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfckc\" (UniqueName: \"kubernetes.io/projected/3e06882f-c657-49fc-a5a1-0657121844f9-kube-api-access-vfckc\") pod \"glance-default-internal-api-0\" (UID: \"3e06882f-c657-49fc-a5a1-0657121844f9\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:05:30 crc kubenswrapper[4909]: I0202 12:05:30.634110 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 12:05:32 crc kubenswrapper[4909]: I0202 12:05:31.026443 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2298644a-6792-41dc-9bc5-994bdf1ff6cb" path="/var/lib/kubelet/pods/2298644a-6792-41dc-9bc5-994bdf1ff6cb/volumes" Feb 02 12:05:32 crc kubenswrapper[4909]: I0202 12:05:31.232008 4909 generic.go:334] "Generic (PLEG): container finished" podID="5c8c1618-4fb8-4368-ab48-d57e0c23b55f" containerID="c3b9e512432981cec4bd914da4776c45148c84603f304325d6c664bad9fee04f" exitCode=0 Feb 02 12:05:32 crc kubenswrapper[4909]: I0202 12:05:31.232056 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48hqz" event={"ID":"5c8c1618-4fb8-4368-ab48-d57e0c23b55f","Type":"ContainerDied","Data":"c3b9e512432981cec4bd914da4776c45148c84603f304325d6c664bad9fee04f"} Feb 02 12:05:32 crc kubenswrapper[4909]: I0202 12:05:31.234098 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 12:05:32 crc kubenswrapper[4909]: I0202 12:05:31.237112 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5dc8b554-88ed-470a-b46a-3c0f611f75a8","Type":"ContainerStarted","Data":"9fde8894f21d4504d06d52c3811b3597f98f90e7e5fd7842792938797aaa82fa"} Feb 02 12:05:32 crc kubenswrapper[4909]: I0202 12:05:31.285246 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.285223832 podStartE2EDuration="3.285223832s" podCreationTimestamp="2026-02-02 12:05:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:05:31.277148342 +0000 UTC m=+5657.023249087" watchObservedRunningTime="2026-02-02 12:05:31.285223832 +0000 UTC m=+5657.031324587" Feb 02 12:05:32 crc kubenswrapper[4909]: I0202 
12:05:32.246767 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48hqz" event={"ID":"5c8c1618-4fb8-4368-ab48-d57e0c23b55f","Type":"ContainerStarted","Data":"531041ee90724a687a125c32125e6673bc461283892efe90aa011da6456629ff"} Feb 02 12:05:32 crc kubenswrapper[4909]: E0202 12:05:32.398784 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c8c1618_4fb8_4368_ab48_d57e0c23b55f.slice/crio-531041ee90724a687a125c32125e6673bc461283892efe90aa011da6456629ff.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c8c1618_4fb8_4368_ab48_d57e0c23b55f.slice/crio-conmon-531041ee90724a687a125c32125e6673bc461283892efe90aa011da6456629ff.scope\": RecentStats: unable to find data in memory cache]" Feb 02 12:05:32 crc kubenswrapper[4909]: I0202 12:05:32.518262 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 12:05:32 crc kubenswrapper[4909]: W0202 12:05:32.522072 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e06882f_c657_49fc_a5a1_0657121844f9.slice/crio-bc4d2dc893be6343f1e743af17426fa3406b5ac22bf25d7be5183ac9267b615a WatchSource:0}: Error finding container bc4d2dc893be6343f1e743af17426fa3406b5ac22bf25d7be5183ac9267b615a: Status 404 returned error can't find the container with id bc4d2dc893be6343f1e743af17426fa3406b5ac22bf25d7be5183ac9267b615a Feb 02 12:05:33 crc kubenswrapper[4909]: I0202 12:05:33.259763 4909 generic.go:334] "Generic (PLEG): container finished" podID="5c8c1618-4fb8-4368-ab48-d57e0c23b55f" containerID="531041ee90724a687a125c32125e6673bc461283892efe90aa011da6456629ff" exitCode=0 Feb 02 12:05:33 crc kubenswrapper[4909]: I0202 12:05:33.259930 4909 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-48hqz" event={"ID":"5c8c1618-4fb8-4368-ab48-d57e0c23b55f","Type":"ContainerDied","Data":"531041ee90724a687a125c32125e6673bc461283892efe90aa011da6456629ff"} Feb 02 12:05:33 crc kubenswrapper[4909]: I0202 12:05:33.263695 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3e06882f-c657-49fc-a5a1-0657121844f9","Type":"ContainerStarted","Data":"7cc5359348c3520e4ea084e1f1374a0dfb12683f12fec55c1aafc56234509af1"} Feb 02 12:05:33 crc kubenswrapper[4909]: I0202 12:05:33.263777 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3e06882f-c657-49fc-a5a1-0657121844f9","Type":"ContainerStarted","Data":"bc4d2dc893be6343f1e743af17426fa3406b5ac22bf25d7be5183ac9267b615a"} Feb 02 12:05:34 crc kubenswrapper[4909]: I0202 12:05:34.279088 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3e06882f-c657-49fc-a5a1-0657121844f9","Type":"ContainerStarted","Data":"d04ce1909f7f75c4a6b91d6cba96d86f768ab4138ad910634c3ddafeb3c56643"} Feb 02 12:05:34 crc kubenswrapper[4909]: I0202 12:05:34.309007 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.308984516 podStartE2EDuration="4.308984516s" podCreationTimestamp="2026-02-02 12:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:05:34.297398657 +0000 UTC m=+5660.043499402" watchObservedRunningTime="2026-02-02 12:05:34.308984516 +0000 UTC m=+5660.055085251" Feb 02 12:05:34 crc kubenswrapper[4909]: I0202 12:05:34.948007 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76c747cbd5-ntrd9" Feb 02 12:05:35 crc kubenswrapper[4909]: I0202 12:05:35.003860 4909 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fb6955cd5-q5rkf"] Feb 02 12:05:35 crc kubenswrapper[4909]: I0202 12:05:35.004081 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fb6955cd5-q5rkf" podUID="90545462-3397-4592-9bc2-0ff7471e7791" containerName="dnsmasq-dns" containerID="cri-o://89ea69836f5cb241e726c47c6d625ce695f445b41f6eb8b04cc62e0824a8213f" gracePeriod=10 Feb 02 12:05:35 crc kubenswrapper[4909]: I0202 12:05:35.291308 4909 generic.go:334] "Generic (PLEG): container finished" podID="90545462-3397-4592-9bc2-0ff7471e7791" containerID="89ea69836f5cb241e726c47c6d625ce695f445b41f6eb8b04cc62e0824a8213f" exitCode=0 Feb 02 12:05:35 crc kubenswrapper[4909]: I0202 12:05:35.291400 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb6955cd5-q5rkf" event={"ID":"90545462-3397-4592-9bc2-0ff7471e7791","Type":"ContainerDied","Data":"89ea69836f5cb241e726c47c6d625ce695f445b41f6eb8b04cc62e0824a8213f"} Feb 02 12:05:35 crc kubenswrapper[4909]: I0202 12:05:35.293763 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48hqz" event={"ID":"5c8c1618-4fb8-4368-ab48-d57e0c23b55f","Type":"ContainerStarted","Data":"a9eef1c1da5d8440a2dbdec197a5148ee2baf66e29aafecf11db339264067861"} Feb 02 12:05:35 crc kubenswrapper[4909]: I0202 12:05:35.323314 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-48hqz" podStartSLOduration=2.901283207 podStartE2EDuration="6.323297836s" podCreationTimestamp="2026-02-02 12:05:29 +0000 UTC" firstStartedPulling="2026-02-02 12:05:31.233851403 +0000 UTC m=+5656.979952158" lastFinishedPulling="2026-02-02 12:05:34.655866052 +0000 UTC m=+5660.401966787" observedRunningTime="2026-02-02 12:05:35.32308265 +0000 UTC m=+5661.069183385" watchObservedRunningTime="2026-02-02 12:05:35.323297836 +0000 UTC m=+5661.069398561" Feb 02 12:05:36 crc 
kubenswrapper[4909]: I0202 12:05:36.042080 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fb6955cd5-q5rkf" Feb 02 12:05:36 crc kubenswrapper[4909]: I0202 12:05:36.209763 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll44r\" (UniqueName: \"kubernetes.io/projected/90545462-3397-4592-9bc2-0ff7471e7791-kube-api-access-ll44r\") pod \"90545462-3397-4592-9bc2-0ff7471e7791\" (UID: \"90545462-3397-4592-9bc2-0ff7471e7791\") " Feb 02 12:05:36 crc kubenswrapper[4909]: I0202 12:05:36.209864 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90545462-3397-4592-9bc2-0ff7471e7791-ovsdbserver-nb\") pod \"90545462-3397-4592-9bc2-0ff7471e7791\" (UID: \"90545462-3397-4592-9bc2-0ff7471e7791\") " Feb 02 12:05:36 crc kubenswrapper[4909]: I0202 12:05:36.209898 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90545462-3397-4592-9bc2-0ff7471e7791-config\") pod \"90545462-3397-4592-9bc2-0ff7471e7791\" (UID: \"90545462-3397-4592-9bc2-0ff7471e7791\") " Feb 02 12:05:36 crc kubenswrapper[4909]: I0202 12:05:36.210038 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90545462-3397-4592-9bc2-0ff7471e7791-dns-svc\") pod \"90545462-3397-4592-9bc2-0ff7471e7791\" (UID: \"90545462-3397-4592-9bc2-0ff7471e7791\") " Feb 02 12:05:36 crc kubenswrapper[4909]: I0202 12:05:36.210059 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90545462-3397-4592-9bc2-0ff7471e7791-ovsdbserver-sb\") pod \"90545462-3397-4592-9bc2-0ff7471e7791\" (UID: \"90545462-3397-4592-9bc2-0ff7471e7791\") " Feb 02 12:05:36 crc kubenswrapper[4909]: I0202 12:05:36.215771 4909 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90545462-3397-4592-9bc2-0ff7471e7791-kube-api-access-ll44r" (OuterVolumeSpecName: "kube-api-access-ll44r") pod "90545462-3397-4592-9bc2-0ff7471e7791" (UID: "90545462-3397-4592-9bc2-0ff7471e7791"). InnerVolumeSpecName "kube-api-access-ll44r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:05:36 crc kubenswrapper[4909]: I0202 12:05:36.257940 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90545462-3397-4592-9bc2-0ff7471e7791-config" (OuterVolumeSpecName: "config") pod "90545462-3397-4592-9bc2-0ff7471e7791" (UID: "90545462-3397-4592-9bc2-0ff7471e7791"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:05:36 crc kubenswrapper[4909]: I0202 12:05:36.257954 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90545462-3397-4592-9bc2-0ff7471e7791-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "90545462-3397-4592-9bc2-0ff7471e7791" (UID: "90545462-3397-4592-9bc2-0ff7471e7791"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:05:36 crc kubenswrapper[4909]: I0202 12:05:36.259008 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90545462-3397-4592-9bc2-0ff7471e7791-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "90545462-3397-4592-9bc2-0ff7471e7791" (UID: "90545462-3397-4592-9bc2-0ff7471e7791"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:05:36 crc kubenswrapper[4909]: I0202 12:05:36.264048 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90545462-3397-4592-9bc2-0ff7471e7791-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "90545462-3397-4592-9bc2-0ff7471e7791" (UID: "90545462-3397-4592-9bc2-0ff7471e7791"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:05:36 crc kubenswrapper[4909]: I0202 12:05:36.303711 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fb6955cd5-q5rkf" event={"ID":"90545462-3397-4592-9bc2-0ff7471e7791","Type":"ContainerDied","Data":"6c400f5836fecf500863bab1a64dc10879c04cd8a50c5160a64abcde42837d94"} Feb 02 12:05:36 crc kubenswrapper[4909]: I0202 12:05:36.304506 4909 scope.go:117] "RemoveContainer" containerID="89ea69836f5cb241e726c47c6d625ce695f445b41f6eb8b04cc62e0824a8213f" Feb 02 12:05:36 crc kubenswrapper[4909]: I0202 12:05:36.303764 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fb6955cd5-q5rkf" Feb 02 12:05:36 crc kubenswrapper[4909]: I0202 12:05:36.312754 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/90545462-3397-4592-9bc2-0ff7471e7791-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 12:05:36 crc kubenswrapper[4909]: I0202 12:05:36.312829 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/90545462-3397-4592-9bc2-0ff7471e7791-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 12:05:36 crc kubenswrapper[4909]: I0202 12:05:36.312847 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll44r\" (UniqueName: \"kubernetes.io/projected/90545462-3397-4592-9bc2-0ff7471e7791-kube-api-access-ll44r\") on node \"crc\" DevicePath \"\"" Feb 02 12:05:36 crc kubenswrapper[4909]: I0202 12:05:36.312860 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/90545462-3397-4592-9bc2-0ff7471e7791-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 12:05:36 crc kubenswrapper[4909]: I0202 12:05:36.312870 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90545462-3397-4592-9bc2-0ff7471e7791-config\") on node \"crc\" DevicePath \"\"" Feb 02 12:05:36 crc kubenswrapper[4909]: I0202 12:05:36.377577 4909 scope.go:117] "RemoveContainer" containerID="a00b31552aa09e21cdc253d11272915be6a7a9d85d3b07efcb97c42332d3f81e" Feb 02 12:05:36 crc kubenswrapper[4909]: I0202 12:05:36.385294 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fb6955cd5-q5rkf"] Feb 02 12:05:36 crc kubenswrapper[4909]: I0202 12:05:36.397703 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fb6955cd5-q5rkf"] Feb 02 12:05:37 crc kubenswrapper[4909]: I0202 12:05:37.026707 4909 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="90545462-3397-4592-9bc2-0ff7471e7791" path="/var/lib/kubelet/pods/90545462-3397-4592-9bc2-0ff7471e7791/volumes" Feb 02 12:05:38 crc kubenswrapper[4909]: I0202 12:05:38.640754 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 02 12:05:38 crc kubenswrapper[4909]: I0202 12:05:38.641450 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 02 12:05:38 crc kubenswrapper[4909]: I0202 12:05:38.684048 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 02 12:05:38 crc kubenswrapper[4909]: I0202 12:05:38.686969 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 02 12:05:39 crc kubenswrapper[4909]: I0202 12:05:39.331161 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 02 12:05:39 crc kubenswrapper[4909]: I0202 12:05:39.331432 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 02 12:05:39 crc kubenswrapper[4909]: I0202 12:05:39.465092 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-48hqz" Feb 02 12:05:39 crc kubenswrapper[4909]: I0202 12:05:39.465173 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-48hqz" Feb 02 12:05:39 crc kubenswrapper[4909]: I0202 12:05:39.513872 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-48hqz" Feb 02 12:05:40 crc kubenswrapper[4909]: I0202 12:05:40.416907 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-48hqz" 
Feb 02 12:05:40 crc kubenswrapper[4909]: I0202 12:05:40.478383 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-48hqz"] Feb 02 12:05:40 crc kubenswrapper[4909]: I0202 12:05:40.635489 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 02 12:05:40 crc kubenswrapper[4909]: I0202 12:05:40.635539 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 02 12:05:40 crc kubenswrapper[4909]: I0202 12:05:40.701604 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 02 12:05:40 crc kubenswrapper[4909]: I0202 12:05:40.781365 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 02 12:05:41 crc kubenswrapper[4909]: I0202 12:05:41.357451 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 02 12:05:41 crc kubenswrapper[4909]: I0202 12:05:41.357833 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 02 12:05:41 crc kubenswrapper[4909]: I0202 12:05:41.623022 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 02 12:05:41 crc kubenswrapper[4909]: I0202 12:05:41.623098 4909 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 12:05:41 crc kubenswrapper[4909]: I0202 12:05:41.714283 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 02 12:05:42 crc kubenswrapper[4909]: I0202 12:05:42.365070 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-48hqz" 
podUID="5c8c1618-4fb8-4368-ab48-d57e0c23b55f" containerName="registry-server" containerID="cri-o://a9eef1c1da5d8440a2dbdec197a5148ee2baf66e29aafecf11db339264067861" gracePeriod=2 Feb 02 12:05:42 crc kubenswrapper[4909]: E0202 12:05:42.642406 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c8c1618_4fb8_4368_ab48_d57e0c23b55f.slice/crio-a9eef1c1da5d8440a2dbdec197a5148ee2baf66e29aafecf11db339264067861.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c8c1618_4fb8_4368_ab48_d57e0c23b55f.slice/crio-conmon-a9eef1c1da5d8440a2dbdec197a5148ee2baf66e29aafecf11db339264067861.scope\": RecentStats: unable to find data in memory cache]" Feb 02 12:05:43 crc kubenswrapper[4909]: I0202 12:05:43.375504 4909 generic.go:334] "Generic (PLEG): container finished" podID="5c8c1618-4fb8-4368-ab48-d57e0c23b55f" containerID="a9eef1c1da5d8440a2dbdec197a5148ee2baf66e29aafecf11db339264067861" exitCode=0 Feb 02 12:05:43 crc kubenswrapper[4909]: I0202 12:05:43.375591 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48hqz" event={"ID":"5c8c1618-4fb8-4368-ab48-d57e0c23b55f","Type":"ContainerDied","Data":"a9eef1c1da5d8440a2dbdec197a5148ee2baf66e29aafecf11db339264067861"} Feb 02 12:05:43 crc kubenswrapper[4909]: I0202 12:05:43.375901 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-48hqz" event={"ID":"5c8c1618-4fb8-4368-ab48-d57e0c23b55f","Type":"ContainerDied","Data":"2199cdff3c608c25718d64e4ee4208b16cae9e586c5ba486cc75acb66903365a"} Feb 02 12:05:43 crc kubenswrapper[4909]: I0202 12:05:43.375914 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2199cdff3c608c25718d64e4ee4208b16cae9e586c5ba486cc75acb66903365a" Feb 02 12:05:43 crc kubenswrapper[4909]: I0202 
12:05:43.394106 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-48hqz" Feb 02 12:05:43 crc kubenswrapper[4909]: I0202 12:05:43.402458 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 02 12:05:43 crc kubenswrapper[4909]: I0202 12:05:43.402558 4909 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 12:05:43 crc kubenswrapper[4909]: I0202 12:05:43.451516 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 02 12:05:43 crc kubenswrapper[4909]: I0202 12:05:43.555378 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c8c1618-4fb8-4368-ab48-d57e0c23b55f-catalog-content\") pod \"5c8c1618-4fb8-4368-ab48-d57e0c23b55f\" (UID: \"5c8c1618-4fb8-4368-ab48-d57e0c23b55f\") " Feb 02 12:05:43 crc kubenswrapper[4909]: I0202 12:05:43.555478 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c8c1618-4fb8-4368-ab48-d57e0c23b55f-utilities\") pod \"5c8c1618-4fb8-4368-ab48-d57e0c23b55f\" (UID: \"5c8c1618-4fb8-4368-ab48-d57e0c23b55f\") " Feb 02 12:05:43 crc kubenswrapper[4909]: I0202 12:05:43.555638 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phlt5\" (UniqueName: \"kubernetes.io/projected/5c8c1618-4fb8-4368-ab48-d57e0c23b55f-kube-api-access-phlt5\") pod \"5c8c1618-4fb8-4368-ab48-d57e0c23b55f\" (UID: \"5c8c1618-4fb8-4368-ab48-d57e0c23b55f\") " Feb 02 12:05:43 crc kubenswrapper[4909]: I0202 12:05:43.564497 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c8c1618-4fb8-4368-ab48-d57e0c23b55f-utilities" (OuterVolumeSpecName: "utilities") pod 
"5c8c1618-4fb8-4368-ab48-d57e0c23b55f" (UID: "5c8c1618-4fb8-4368-ab48-d57e0c23b55f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:05:43 crc kubenswrapper[4909]: I0202 12:05:43.571176 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c8c1618-4fb8-4368-ab48-d57e0c23b55f-kube-api-access-phlt5" (OuterVolumeSpecName: "kube-api-access-phlt5") pod "5c8c1618-4fb8-4368-ab48-d57e0c23b55f" (UID: "5c8c1618-4fb8-4368-ab48-d57e0c23b55f"). InnerVolumeSpecName "kube-api-access-phlt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:05:43 crc kubenswrapper[4909]: I0202 12:05:43.658462 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c8c1618-4fb8-4368-ab48-d57e0c23b55f-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 12:05:43 crc kubenswrapper[4909]: I0202 12:05:43.658499 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phlt5\" (UniqueName: \"kubernetes.io/projected/5c8c1618-4fb8-4368-ab48-d57e0c23b55f-kube-api-access-phlt5\") on node \"crc\" DevicePath \"\"" Feb 02 12:05:43 crc kubenswrapper[4909]: I0202 12:05:43.781152 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c8c1618-4fb8-4368-ab48-d57e0c23b55f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c8c1618-4fb8-4368-ab48-d57e0c23b55f" (UID: "5c8c1618-4fb8-4368-ab48-d57e0c23b55f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:05:43 crc kubenswrapper[4909]: I0202 12:05:43.861981 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c8c1618-4fb8-4368-ab48-d57e0c23b55f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 12:05:44 crc kubenswrapper[4909]: I0202 12:05:44.384215 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-48hqz" Feb 02 12:05:44 crc kubenswrapper[4909]: I0202 12:05:44.424256 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-48hqz"] Feb 02 12:05:44 crc kubenswrapper[4909]: I0202 12:05:44.430560 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-48hqz"] Feb 02 12:05:45 crc kubenswrapper[4909]: I0202 12:05:45.034664 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c8c1618-4fb8-4368-ab48-d57e0c23b55f" path="/var/lib/kubelet/pods/5c8c1618-4fb8-4368-ab48-d57e0c23b55f/volumes" Feb 02 12:05:49 crc kubenswrapper[4909]: I0202 12:05:49.510867 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:05:49 crc kubenswrapper[4909]: I0202 12:05:49.511560 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:05:49 crc kubenswrapper[4909]: I0202 12:05:49.511622 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 12:05:49 crc kubenswrapper[4909]: I0202 12:05:49.512444 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 12:05:49 crc kubenswrapper[4909]: I0202 12:05:49.512523 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783" gracePeriod=600 Feb 02 12:05:49 crc kubenswrapper[4909]: E0202 12:05:49.702550 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:05:50 crc kubenswrapper[4909]: I0202 12:05:50.431306 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783" exitCode=0 Feb 02 12:05:50 crc kubenswrapper[4909]: I0202 12:05:50.431348 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783"} 
Feb 02 12:05:50 crc kubenswrapper[4909]: I0202 12:05:50.431383 4909 scope.go:117] "RemoveContainer" containerID="c8a4532cb0e21b24c65ea2a45893f4d1a368d3a1bac20d18498ef08f25007082" Feb 02 12:05:50 crc kubenswrapper[4909]: I0202 12:05:50.432162 4909 scope.go:117] "RemoveContainer" containerID="2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783" Feb 02 12:05:50 crc kubenswrapper[4909]: E0202 12:05:50.432646 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:05:51 crc kubenswrapper[4909]: I0202 12:05:51.681750 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-f2d4-account-create-update-wsl5j"] Feb 02 12:05:51 crc kubenswrapper[4909]: E0202 12:05:51.682564 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c8c1618-4fb8-4368-ab48-d57e0c23b55f" containerName="registry-server" Feb 02 12:05:51 crc kubenswrapper[4909]: I0202 12:05:51.682582 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c8c1618-4fb8-4368-ab48-d57e0c23b55f" containerName="registry-server" Feb 02 12:05:51 crc kubenswrapper[4909]: E0202 12:05:51.682600 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90545462-3397-4592-9bc2-0ff7471e7791" containerName="dnsmasq-dns" Feb 02 12:05:51 crc kubenswrapper[4909]: I0202 12:05:51.682609 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="90545462-3397-4592-9bc2-0ff7471e7791" containerName="dnsmasq-dns" Feb 02 12:05:51 crc kubenswrapper[4909]: E0202 12:05:51.682621 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c8c1618-4fb8-4368-ab48-d57e0c23b55f" 
containerName="extract-content" Feb 02 12:05:51 crc kubenswrapper[4909]: I0202 12:05:51.682631 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c8c1618-4fb8-4368-ab48-d57e0c23b55f" containerName="extract-content" Feb 02 12:05:51 crc kubenswrapper[4909]: E0202 12:05:51.682651 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90545462-3397-4592-9bc2-0ff7471e7791" containerName="init" Feb 02 12:05:51 crc kubenswrapper[4909]: I0202 12:05:51.682656 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="90545462-3397-4592-9bc2-0ff7471e7791" containerName="init" Feb 02 12:05:51 crc kubenswrapper[4909]: E0202 12:05:51.682679 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c8c1618-4fb8-4368-ab48-d57e0c23b55f" containerName="extract-utilities" Feb 02 12:05:51 crc kubenswrapper[4909]: I0202 12:05:51.682687 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c8c1618-4fb8-4368-ab48-d57e0c23b55f" containerName="extract-utilities" Feb 02 12:05:51 crc kubenswrapper[4909]: I0202 12:05:51.682904 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="90545462-3397-4592-9bc2-0ff7471e7791" containerName="dnsmasq-dns" Feb 02 12:05:51 crc kubenswrapper[4909]: I0202 12:05:51.682926 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c8c1618-4fb8-4368-ab48-d57e0c23b55f" containerName="registry-server" Feb 02 12:05:51 crc kubenswrapper[4909]: I0202 12:05:51.683577 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-f2d4-account-create-update-wsl5j"
Feb 02 12:05:51 crc kubenswrapper[4909]: I0202 12:05:51.685741 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Feb 02 12:05:51 crc kubenswrapper[4909]: I0202 12:05:51.691302 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-tk8f2"]
Feb 02 12:05:51 crc kubenswrapper[4909]: I0202 12:05:51.697865 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-tk8f2"
Feb 02 12:05:51 crc kubenswrapper[4909]: I0202 12:05:51.744405 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-tk8f2"]
Feb 02 12:05:51 crc kubenswrapper[4909]: I0202 12:05:51.759122 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f2d4-account-create-update-wsl5j"]
Feb 02 12:05:51 crc kubenswrapper[4909]: I0202 12:05:51.823911 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f467\" (UniqueName: \"kubernetes.io/projected/1428ae65-f2dd-4d55-8e4b-5da119761240-kube-api-access-8f467\") pod \"placement-f2d4-account-create-update-wsl5j\" (UID: \"1428ae65-f2dd-4d55-8e4b-5da119761240\") " pod="openstack/placement-f2d4-account-create-update-wsl5j"
Feb 02 12:05:51 crc kubenswrapper[4909]: I0202 12:05:51.824153 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1428ae65-f2dd-4d55-8e4b-5da119761240-operator-scripts\") pod \"placement-f2d4-account-create-update-wsl5j\" (UID: \"1428ae65-f2dd-4d55-8e4b-5da119761240\") " pod="openstack/placement-f2d4-account-create-update-wsl5j"
Feb 02 12:05:51 crc kubenswrapper[4909]: I0202 12:05:51.824689 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpjcg\" (UniqueName: \"kubernetes.io/projected/19ceb788-98ed-4b63-86c6-c872b71e4a4c-kube-api-access-zpjcg\") pod \"placement-db-create-tk8f2\" (UID: \"19ceb788-98ed-4b63-86c6-c872b71e4a4c\") " pod="openstack/placement-db-create-tk8f2"
Feb 02 12:05:51 crc kubenswrapper[4909]: I0202 12:05:51.824956 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19ceb788-98ed-4b63-86c6-c872b71e4a4c-operator-scripts\") pod \"placement-db-create-tk8f2\" (UID: \"19ceb788-98ed-4b63-86c6-c872b71e4a4c\") " pod="openstack/placement-db-create-tk8f2"
Feb 02 12:05:51 crc kubenswrapper[4909]: I0202 12:05:51.926991 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19ceb788-98ed-4b63-86c6-c872b71e4a4c-operator-scripts\") pod \"placement-db-create-tk8f2\" (UID: \"19ceb788-98ed-4b63-86c6-c872b71e4a4c\") " pod="openstack/placement-db-create-tk8f2"
Feb 02 12:05:51 crc kubenswrapper[4909]: I0202 12:05:51.927064 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f467\" (UniqueName: \"kubernetes.io/projected/1428ae65-f2dd-4d55-8e4b-5da119761240-kube-api-access-8f467\") pod \"placement-f2d4-account-create-update-wsl5j\" (UID: \"1428ae65-f2dd-4d55-8e4b-5da119761240\") " pod="openstack/placement-f2d4-account-create-update-wsl5j"
Feb 02 12:05:51 crc kubenswrapper[4909]: I0202 12:05:51.927100 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1428ae65-f2dd-4d55-8e4b-5da119761240-operator-scripts\") pod \"placement-f2d4-account-create-update-wsl5j\" (UID: \"1428ae65-f2dd-4d55-8e4b-5da119761240\") " pod="openstack/placement-f2d4-account-create-update-wsl5j"
Feb 02 12:05:51 crc kubenswrapper[4909]: I0202 12:05:51.927210 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpjcg\" (UniqueName: \"kubernetes.io/projected/19ceb788-98ed-4b63-86c6-c872b71e4a4c-kube-api-access-zpjcg\") pod \"placement-db-create-tk8f2\" (UID: \"19ceb788-98ed-4b63-86c6-c872b71e4a4c\") " pod="openstack/placement-db-create-tk8f2"
Feb 02 12:05:51 crc kubenswrapper[4909]: I0202 12:05:51.928302 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1428ae65-f2dd-4d55-8e4b-5da119761240-operator-scripts\") pod \"placement-f2d4-account-create-update-wsl5j\" (UID: \"1428ae65-f2dd-4d55-8e4b-5da119761240\") " pod="openstack/placement-f2d4-account-create-update-wsl5j"
Feb 02 12:05:51 crc kubenswrapper[4909]: I0202 12:05:51.928871 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19ceb788-98ed-4b63-86c6-c872b71e4a4c-operator-scripts\") pod \"placement-db-create-tk8f2\" (UID: \"19ceb788-98ed-4b63-86c6-c872b71e4a4c\") " pod="openstack/placement-db-create-tk8f2"
Feb 02 12:05:51 crc kubenswrapper[4909]: I0202 12:05:51.945722 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f467\" (UniqueName: \"kubernetes.io/projected/1428ae65-f2dd-4d55-8e4b-5da119761240-kube-api-access-8f467\") pod \"placement-f2d4-account-create-update-wsl5j\" (UID: \"1428ae65-f2dd-4d55-8e4b-5da119761240\") " pod="openstack/placement-f2d4-account-create-update-wsl5j"
Feb 02 12:05:51 crc kubenswrapper[4909]: I0202 12:05:51.950225 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpjcg\" (UniqueName: \"kubernetes.io/projected/19ceb788-98ed-4b63-86c6-c872b71e4a4c-kube-api-access-zpjcg\") pod \"placement-db-create-tk8f2\" (UID: \"19ceb788-98ed-4b63-86c6-c872b71e4a4c\") " pod="openstack/placement-db-create-tk8f2"
Feb 02 12:05:52 crc kubenswrapper[4909]: I0202 12:05:52.015907 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f2d4-account-create-update-wsl5j"
Feb 02 12:05:52 crc kubenswrapper[4909]: I0202 12:05:52.027492 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-tk8f2"
Feb 02 12:05:52 crc kubenswrapper[4909]: I0202 12:05:52.482250 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f2d4-account-create-update-wsl5j"]
Feb 02 12:05:52 crc kubenswrapper[4909]: W0202 12:05:52.484914 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1428ae65_f2dd_4d55_8e4b_5da119761240.slice/crio-04f1a933c8426c10f74a9d04d4e39cabe1084ee48a21737f5ac4c823045af4ed WatchSource:0}: Error finding container 04f1a933c8426c10f74a9d04d4e39cabe1084ee48a21737f5ac4c823045af4ed: Status 404 returned error can't find the container with id 04f1a933c8426c10f74a9d04d4e39cabe1084ee48a21737f5ac4c823045af4ed
Feb 02 12:05:52 crc kubenswrapper[4909]: I0202 12:05:52.564764 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-tk8f2"]
Feb 02 12:05:53 crc kubenswrapper[4909]: I0202 12:05:53.461691 4909 generic.go:334] "Generic (PLEG): container finished" podID="1428ae65-f2dd-4d55-8e4b-5da119761240" containerID="98ba386a8f3d4542e0c86b3588a8190f5dfb775ddca30d03a1168434e01530dc" exitCode=0
Feb 02 12:05:53 crc kubenswrapper[4909]: I0202 12:05:53.461929 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f2d4-account-create-update-wsl5j" event={"ID":"1428ae65-f2dd-4d55-8e4b-5da119761240","Type":"ContainerDied","Data":"98ba386a8f3d4542e0c86b3588a8190f5dfb775ddca30d03a1168434e01530dc"}
Feb 02 12:05:53 crc kubenswrapper[4909]: I0202 12:05:53.462352 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f2d4-account-create-update-wsl5j" event={"ID":"1428ae65-f2dd-4d55-8e4b-5da119761240","Type":"ContainerStarted","Data":"04f1a933c8426c10f74a9d04d4e39cabe1084ee48a21737f5ac4c823045af4ed"}
Feb 02 12:05:53 crc kubenswrapper[4909]: I0202 12:05:53.465469 4909 generic.go:334] "Generic (PLEG): container finished" podID="19ceb788-98ed-4b63-86c6-c872b71e4a4c" containerID="1051899be5519f40d1171dc941739379b8ad062d29734ff26b8910234cac96f4" exitCode=0
Feb 02 12:05:53 crc kubenswrapper[4909]: I0202 12:05:53.465541 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-tk8f2" event={"ID":"19ceb788-98ed-4b63-86c6-c872b71e4a4c","Type":"ContainerDied","Data":"1051899be5519f40d1171dc941739379b8ad062d29734ff26b8910234cac96f4"}
Feb 02 12:05:53 crc kubenswrapper[4909]: I0202 12:05:53.465614 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-tk8f2" event={"ID":"19ceb788-98ed-4b63-86c6-c872b71e4a4c","Type":"ContainerStarted","Data":"9a5e886cbfd5a74ccbd34f0f9e65c11d97c6fc866ce0376569149850a60f8d17"}
Feb 02 12:05:54 crc kubenswrapper[4909]: I0202 12:05:54.930644 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f2d4-account-create-update-wsl5j"
Feb 02 12:05:54 crc kubenswrapper[4909]: I0202 12:05:54.954108 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-tk8f2"
Feb 02 12:05:54 crc kubenswrapper[4909]: I0202 12:05:54.984708 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpjcg\" (UniqueName: \"kubernetes.io/projected/19ceb788-98ed-4b63-86c6-c872b71e4a4c-kube-api-access-zpjcg\") pod \"19ceb788-98ed-4b63-86c6-c872b71e4a4c\" (UID: \"19ceb788-98ed-4b63-86c6-c872b71e4a4c\") "
Feb 02 12:05:54 crc kubenswrapper[4909]: I0202 12:05:54.985168 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19ceb788-98ed-4b63-86c6-c872b71e4a4c-operator-scripts\") pod \"19ceb788-98ed-4b63-86c6-c872b71e4a4c\" (UID: \"19ceb788-98ed-4b63-86c6-c872b71e4a4c\") "
Feb 02 12:05:54 crc kubenswrapper[4909]: I0202 12:05:54.985356 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1428ae65-f2dd-4d55-8e4b-5da119761240-operator-scripts\") pod \"1428ae65-f2dd-4d55-8e4b-5da119761240\" (UID: \"1428ae65-f2dd-4d55-8e4b-5da119761240\") "
Feb 02 12:05:54 crc kubenswrapper[4909]: I0202 12:05:54.985533 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f467\" (UniqueName: \"kubernetes.io/projected/1428ae65-f2dd-4d55-8e4b-5da119761240-kube-api-access-8f467\") pod \"1428ae65-f2dd-4d55-8e4b-5da119761240\" (UID: \"1428ae65-f2dd-4d55-8e4b-5da119761240\") "
Feb 02 12:05:54 crc kubenswrapper[4909]: I0202 12:05:54.985656 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19ceb788-98ed-4b63-86c6-c872b71e4a4c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "19ceb788-98ed-4b63-86c6-c872b71e4a4c" (UID: "19ceb788-98ed-4b63-86c6-c872b71e4a4c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 12:05:54 crc kubenswrapper[4909]: I0202 12:05:54.985940 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1428ae65-f2dd-4d55-8e4b-5da119761240-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1428ae65-f2dd-4d55-8e4b-5da119761240" (UID: "1428ae65-f2dd-4d55-8e4b-5da119761240"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 12:05:54 crc kubenswrapper[4909]: I0202 12:05:54.986798 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19ceb788-98ed-4b63-86c6-c872b71e4a4c-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 12:05:54 crc kubenswrapper[4909]: I0202 12:05:54.986977 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1428ae65-f2dd-4d55-8e4b-5da119761240-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 12:05:54 crc kubenswrapper[4909]: I0202 12:05:54.992463 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1428ae65-f2dd-4d55-8e4b-5da119761240-kube-api-access-8f467" (OuterVolumeSpecName: "kube-api-access-8f467") pod "1428ae65-f2dd-4d55-8e4b-5da119761240" (UID: "1428ae65-f2dd-4d55-8e4b-5da119761240"). InnerVolumeSpecName "kube-api-access-8f467". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 12:05:55 crc kubenswrapper[4909]: I0202 12:05:55.053621 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19ceb788-98ed-4b63-86c6-c872b71e4a4c-kube-api-access-zpjcg" (OuterVolumeSpecName: "kube-api-access-zpjcg") pod "19ceb788-98ed-4b63-86c6-c872b71e4a4c" (UID: "19ceb788-98ed-4b63-86c6-c872b71e4a4c"). InnerVolumeSpecName "kube-api-access-zpjcg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 12:05:55 crc kubenswrapper[4909]: I0202 12:05:55.093430 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpjcg\" (UniqueName: \"kubernetes.io/projected/19ceb788-98ed-4b63-86c6-c872b71e4a4c-kube-api-access-zpjcg\") on node \"crc\" DevicePath \"\""
Feb 02 12:05:55 crc kubenswrapper[4909]: I0202 12:05:55.093479 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f467\" (UniqueName: \"kubernetes.io/projected/1428ae65-f2dd-4d55-8e4b-5da119761240-kube-api-access-8f467\") on node \"crc\" DevicePath \"\""
Feb 02 12:05:55 crc kubenswrapper[4909]: I0202 12:05:55.482037 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-tk8f2" event={"ID":"19ceb788-98ed-4b63-86c6-c872b71e4a4c","Type":"ContainerDied","Data":"9a5e886cbfd5a74ccbd34f0f9e65c11d97c6fc866ce0376569149850a60f8d17"}
Feb 02 12:05:55 crc kubenswrapper[4909]: I0202 12:05:55.482742 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a5e886cbfd5a74ccbd34f0f9e65c11d97c6fc866ce0376569149850a60f8d17"
Feb 02 12:05:55 crc kubenswrapper[4909]: I0202 12:05:55.482257 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-tk8f2"
Feb 02 12:05:55 crc kubenswrapper[4909]: I0202 12:05:55.483968 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f2d4-account-create-update-wsl5j" event={"ID":"1428ae65-f2dd-4d55-8e4b-5da119761240","Type":"ContainerDied","Data":"04f1a933c8426c10f74a9d04d4e39cabe1084ee48a21737f5ac4c823045af4ed"}
Feb 02 12:05:55 crc kubenswrapper[4909]: I0202 12:05:55.484005 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04f1a933c8426c10f74a9d04d4e39cabe1084ee48a21737f5ac4c823045af4ed"
Feb 02 12:05:55 crc kubenswrapper[4909]: I0202 12:05:55.484053 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f2d4-account-create-update-wsl5j"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.039391 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84bfdf865-29c4x"]
Feb 02 12:05:57 crc kubenswrapper[4909]: E0202 12:05:57.040697 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19ceb788-98ed-4b63-86c6-c872b71e4a4c" containerName="mariadb-database-create"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.040728 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="19ceb788-98ed-4b63-86c6-c872b71e4a4c" containerName="mariadb-database-create"
Feb 02 12:05:57 crc kubenswrapper[4909]: E0202 12:05:57.040769 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1428ae65-f2dd-4d55-8e4b-5da119761240" containerName="mariadb-account-create-update"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.040779 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1428ae65-f2dd-4d55-8e4b-5da119761240" containerName="mariadb-account-create-update"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.041026 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="19ceb788-98ed-4b63-86c6-c872b71e4a4c" containerName="mariadb-database-create"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.041061 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="1428ae65-f2dd-4d55-8e4b-5da119761240" containerName="mariadb-account-create-update"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.042275 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bfdf865-29c4x"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.060913 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bfdf865-29c4x"]
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.101626 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-mj84r"]
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.103315 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mj84r"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.106672 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.107069 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-c6l5q"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.113065 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.130212 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-mj84r"]
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.130682 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpn8f\" (UniqueName: \"kubernetes.io/projected/907a52c1-86b1-4e4f-a18e-5089411f1704-kube-api-access-bpn8f\") pod \"placement-db-sync-mj84r\" (UID: \"907a52c1-86b1-4e4f-a18e-5089411f1704\") " pod="openstack/placement-db-sync-mj84r"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.130748 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/907a52c1-86b1-4e4f-a18e-5089411f1704-config-data\") pod \"placement-db-sync-mj84r\" (UID: \"907a52c1-86b1-4e4f-a18e-5089411f1704\") " pod="openstack/placement-db-sync-mj84r"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.130940 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q9qt\" (UniqueName: \"kubernetes.io/projected/b7a3e3af-65cc-4582-82b3-a48ec96caf36-kube-api-access-7q9qt\") pod \"dnsmasq-dns-84bfdf865-29c4x\" (UID: \"b7a3e3af-65cc-4582-82b3-a48ec96caf36\") " pod="openstack/dnsmasq-dns-84bfdf865-29c4x"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.131031 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7a3e3af-65cc-4582-82b3-a48ec96caf36-ovsdbserver-nb\") pod \"dnsmasq-dns-84bfdf865-29c4x\" (UID: \"b7a3e3af-65cc-4582-82b3-a48ec96caf36\") " pod="openstack/dnsmasq-dns-84bfdf865-29c4x"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.131072 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7a3e3af-65cc-4582-82b3-a48ec96caf36-ovsdbserver-sb\") pod \"dnsmasq-dns-84bfdf865-29c4x\" (UID: \"b7a3e3af-65cc-4582-82b3-a48ec96caf36\") " pod="openstack/dnsmasq-dns-84bfdf865-29c4x"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.131196 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/907a52c1-86b1-4e4f-a18e-5089411f1704-combined-ca-bundle\") pod \"placement-db-sync-mj84r\" (UID: \"907a52c1-86b1-4e4f-a18e-5089411f1704\") " pod="openstack/placement-db-sync-mj84r"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.131261 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7a3e3af-65cc-4582-82b3-a48ec96caf36-dns-svc\") pod \"dnsmasq-dns-84bfdf865-29c4x\" (UID: \"b7a3e3af-65cc-4582-82b3-a48ec96caf36\") " pod="openstack/dnsmasq-dns-84bfdf865-29c4x"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.131416 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7a3e3af-65cc-4582-82b3-a48ec96caf36-config\") pod \"dnsmasq-dns-84bfdf865-29c4x\" (UID: \"b7a3e3af-65cc-4582-82b3-a48ec96caf36\") " pod="openstack/dnsmasq-dns-84bfdf865-29c4x"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.131507 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/907a52c1-86b1-4e4f-a18e-5089411f1704-scripts\") pod \"placement-db-sync-mj84r\" (UID: \"907a52c1-86b1-4e4f-a18e-5089411f1704\") " pod="openstack/placement-db-sync-mj84r"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.131563 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/907a52c1-86b1-4e4f-a18e-5089411f1704-logs\") pod \"placement-db-sync-mj84r\" (UID: \"907a52c1-86b1-4e4f-a18e-5089411f1704\") " pod="openstack/placement-db-sync-mj84r"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.234595 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7a3e3af-65cc-4582-82b3-a48ec96caf36-ovsdbserver-nb\") pod \"dnsmasq-dns-84bfdf865-29c4x\" (UID: \"b7a3e3af-65cc-4582-82b3-a48ec96caf36\") " pod="openstack/dnsmasq-dns-84bfdf865-29c4x"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.234680 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7a3e3af-65cc-4582-82b3-a48ec96caf36-ovsdbserver-sb\") pod \"dnsmasq-dns-84bfdf865-29c4x\" (UID: \"b7a3e3af-65cc-4582-82b3-a48ec96caf36\") " pod="openstack/dnsmasq-dns-84bfdf865-29c4x"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.234774 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/907a52c1-86b1-4e4f-a18e-5089411f1704-combined-ca-bundle\") pod \"placement-db-sync-mj84r\" (UID: \"907a52c1-86b1-4e4f-a18e-5089411f1704\") " pod="openstack/placement-db-sync-mj84r"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.234846 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7a3e3af-65cc-4582-82b3-a48ec96caf36-dns-svc\") pod \"dnsmasq-dns-84bfdf865-29c4x\" (UID: \"b7a3e3af-65cc-4582-82b3-a48ec96caf36\") " pod="openstack/dnsmasq-dns-84bfdf865-29c4x"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.234963 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7a3e3af-65cc-4582-82b3-a48ec96caf36-config\") pod \"dnsmasq-dns-84bfdf865-29c4x\" (UID: \"b7a3e3af-65cc-4582-82b3-a48ec96caf36\") " pod="openstack/dnsmasq-dns-84bfdf865-29c4x"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.235678 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7a3e3af-65cc-4582-82b3-a48ec96caf36-ovsdbserver-nb\") pod \"dnsmasq-dns-84bfdf865-29c4x\" (UID: \"b7a3e3af-65cc-4582-82b3-a48ec96caf36\") " pod="openstack/dnsmasq-dns-84bfdf865-29c4x"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.235934 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7a3e3af-65cc-4582-82b3-a48ec96caf36-dns-svc\") pod \"dnsmasq-dns-84bfdf865-29c4x\" (UID: \"b7a3e3af-65cc-4582-82b3-a48ec96caf36\") " pod="openstack/dnsmasq-dns-84bfdf865-29c4x"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.236294 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7a3e3af-65cc-4582-82b3-a48ec96caf36-ovsdbserver-sb\") pod \"dnsmasq-dns-84bfdf865-29c4x\" (UID: \"b7a3e3af-65cc-4582-82b3-a48ec96caf36\") " pod="openstack/dnsmasq-dns-84bfdf865-29c4x"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.236357 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7a3e3af-65cc-4582-82b3-a48ec96caf36-config\") pod \"dnsmasq-dns-84bfdf865-29c4x\" (UID: \"b7a3e3af-65cc-4582-82b3-a48ec96caf36\") " pod="openstack/dnsmasq-dns-84bfdf865-29c4x"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.236415 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/907a52c1-86b1-4e4f-a18e-5089411f1704-scripts\") pod \"placement-db-sync-mj84r\" (UID: \"907a52c1-86b1-4e4f-a18e-5089411f1704\") " pod="openstack/placement-db-sync-mj84r"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.236491 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/907a52c1-86b1-4e4f-a18e-5089411f1704-logs\") pod \"placement-db-sync-mj84r\" (UID: \"907a52c1-86b1-4e4f-a18e-5089411f1704\") " pod="openstack/placement-db-sync-mj84r"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.236948 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/907a52c1-86b1-4e4f-a18e-5089411f1704-logs\") pod \"placement-db-sync-mj84r\" (UID: \"907a52c1-86b1-4e4f-a18e-5089411f1704\") " pod="openstack/placement-db-sync-mj84r"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.237148 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpn8f\" (UniqueName: \"kubernetes.io/projected/907a52c1-86b1-4e4f-a18e-5089411f1704-kube-api-access-bpn8f\") pod \"placement-db-sync-mj84r\" (UID: \"907a52c1-86b1-4e4f-a18e-5089411f1704\") " pod="openstack/placement-db-sync-mj84r"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.237270 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/907a52c1-86b1-4e4f-a18e-5089411f1704-config-data\") pod \"placement-db-sync-mj84r\" (UID: \"907a52c1-86b1-4e4f-a18e-5089411f1704\") " pod="openstack/placement-db-sync-mj84r"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.237361 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q9qt\" (UniqueName: \"kubernetes.io/projected/b7a3e3af-65cc-4582-82b3-a48ec96caf36-kube-api-access-7q9qt\") pod \"dnsmasq-dns-84bfdf865-29c4x\" (UID: \"b7a3e3af-65cc-4582-82b3-a48ec96caf36\") " pod="openstack/dnsmasq-dns-84bfdf865-29c4x"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.240651 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/907a52c1-86b1-4e4f-a18e-5089411f1704-combined-ca-bundle\") pod \"placement-db-sync-mj84r\" (UID: \"907a52c1-86b1-4e4f-a18e-5089411f1704\") " pod="openstack/placement-db-sync-mj84r"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.243348 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/907a52c1-86b1-4e4f-a18e-5089411f1704-scripts\") pod \"placement-db-sync-mj84r\" (UID: \"907a52c1-86b1-4e4f-a18e-5089411f1704\") " pod="openstack/placement-db-sync-mj84r"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.255331 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/907a52c1-86b1-4e4f-a18e-5089411f1704-config-data\") pod \"placement-db-sync-mj84r\" (UID: \"907a52c1-86b1-4e4f-a18e-5089411f1704\") " pod="openstack/placement-db-sync-mj84r"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.257480 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpn8f\" (UniqueName: \"kubernetes.io/projected/907a52c1-86b1-4e4f-a18e-5089411f1704-kube-api-access-bpn8f\") pod \"placement-db-sync-mj84r\" (UID: \"907a52c1-86b1-4e4f-a18e-5089411f1704\") " pod="openstack/placement-db-sync-mj84r"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.261610 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q9qt\" (UniqueName: \"kubernetes.io/projected/b7a3e3af-65cc-4582-82b3-a48ec96caf36-kube-api-access-7q9qt\") pod \"dnsmasq-dns-84bfdf865-29c4x\" (UID: \"b7a3e3af-65cc-4582-82b3-a48ec96caf36\") " pod="openstack/dnsmasq-dns-84bfdf865-29c4x"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.367339 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bfdf865-29c4x"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.433770 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mj84r"
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.882211 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bfdf865-29c4x"]
Feb 02 12:05:57 crc kubenswrapper[4909]: I0202 12:05:57.988674 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-mj84r"]
Feb 02 12:05:58 crc kubenswrapper[4909]: I0202 12:05:58.518422 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mj84r" event={"ID":"907a52c1-86b1-4e4f-a18e-5089411f1704","Type":"ContainerStarted","Data":"a965a9b2e3adc2da89a159d42f257f60e3f094b62ee5418c4c0e324cbaa8294f"}
Feb 02 12:05:58 crc kubenswrapper[4909]: I0202 12:05:58.518474 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mj84r" event={"ID":"907a52c1-86b1-4e4f-a18e-5089411f1704","Type":"ContainerStarted","Data":"1266905ac79bd384972e655baab020341113bb1b1b81eb11543f77f108e730cb"}
Feb 02 12:05:58 crc kubenswrapper[4909]: I0202 12:05:58.520538 4909 generic.go:334] "Generic (PLEG): container finished" podID="b7a3e3af-65cc-4582-82b3-a48ec96caf36" containerID="f7019e24e5004306825e47e31e8ab15c341d411bc394732c14250645eb0a5350" exitCode=0
Feb 02 12:05:58 crc kubenswrapper[4909]: I0202 12:05:58.520843 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bfdf865-29c4x" event={"ID":"b7a3e3af-65cc-4582-82b3-a48ec96caf36","Type":"ContainerDied","Data":"f7019e24e5004306825e47e31e8ab15c341d411bc394732c14250645eb0a5350"}
Feb 02 12:05:58 crc kubenswrapper[4909]: I0202 12:05:58.520874 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bfdf865-29c4x" event={"ID":"b7a3e3af-65cc-4582-82b3-a48ec96caf36","Type":"ContainerStarted","Data":"abebce6a2e59f3275884626cc68b775d8e3c993764602958622204d8530fa16d"}
Feb 02 12:05:58 crc kubenswrapper[4909]: I0202 12:05:58.574088 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-mj84r" podStartSLOduration=1.574062292 podStartE2EDuration="1.574062292s" podCreationTimestamp="2026-02-02 12:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:05:58.571312014 +0000 UTC m=+5684.317412749" watchObservedRunningTime="2026-02-02 12:05:58.574062292 +0000 UTC m=+5684.320163027"
Feb 02 12:05:59 crc kubenswrapper[4909]: I0202 12:05:59.532048 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bfdf865-29c4x" event={"ID":"b7a3e3af-65cc-4582-82b3-a48ec96caf36","Type":"ContainerStarted","Data":"d598b01f30982e22dbbf27f2a47faadbadde1e5b10d2b729aff272d027f88110"}
Feb 02 12:05:59 crc kubenswrapper[4909]: I0202 12:05:59.532414 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84bfdf865-29c4x"
Feb 02 12:05:59 crc kubenswrapper[4909]: I0202 12:05:59.533912 4909 generic.go:334] "Generic (PLEG): container finished" podID="907a52c1-86b1-4e4f-a18e-5089411f1704" containerID="a965a9b2e3adc2da89a159d42f257f60e3f094b62ee5418c4c0e324cbaa8294f" exitCode=0
Feb 02 12:05:59 crc kubenswrapper[4909]: I0202 12:05:59.533967 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mj84r" event={"ID":"907a52c1-86b1-4e4f-a18e-5089411f1704","Type":"ContainerDied","Data":"a965a9b2e3adc2da89a159d42f257f60e3f094b62ee5418c4c0e324cbaa8294f"}
Feb 02 12:05:59 crc kubenswrapper[4909]: I0202 12:05:59.555649 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84bfdf865-29c4x" podStartSLOduration=3.555630753 podStartE2EDuration="3.555630753s" podCreationTimestamp="2026-02-02 12:05:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:05:59.552980088 +0000 UTC m=+5685.299080833" watchObservedRunningTime="2026-02-02 12:05:59.555630753 +0000 UTC m=+5685.301731488"
Feb 02 12:06:00 crc kubenswrapper[4909]: I0202 12:06:00.884129 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mj84r"
Feb 02 12:06:00 crc kubenswrapper[4909]: I0202 12:06:00.917270 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/907a52c1-86b1-4e4f-a18e-5089411f1704-combined-ca-bundle\") pod \"907a52c1-86b1-4e4f-a18e-5089411f1704\" (UID: \"907a52c1-86b1-4e4f-a18e-5089411f1704\") "
Feb 02 12:06:00 crc kubenswrapper[4909]: I0202 12:06:00.917353 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/907a52c1-86b1-4e4f-a18e-5089411f1704-config-data\") pod \"907a52c1-86b1-4e4f-a18e-5089411f1704\" (UID: \"907a52c1-86b1-4e4f-a18e-5089411f1704\") "
Feb 02 12:06:00 crc kubenswrapper[4909]: I0202 12:06:00.917439 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpn8f\" (UniqueName: \"kubernetes.io/projected/907a52c1-86b1-4e4f-a18e-5089411f1704-kube-api-access-bpn8f\") pod \"907a52c1-86b1-4e4f-a18e-5089411f1704\" (UID: \"907a52c1-86b1-4e4f-a18e-5089411f1704\") "
Feb 02 12:06:00 crc kubenswrapper[4909]: I0202 12:06:00.917471 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/907a52c1-86b1-4e4f-a18e-5089411f1704-scripts\") pod \"907a52c1-86b1-4e4f-a18e-5089411f1704\" (UID: \"907a52c1-86b1-4e4f-a18e-5089411f1704\") "
Feb 02 12:06:00 crc kubenswrapper[4909]: I0202 12:06:00.917497 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/907a52c1-86b1-4e4f-a18e-5089411f1704-logs\") pod \"907a52c1-86b1-4e4f-a18e-5089411f1704\" (UID: \"907a52c1-86b1-4e4f-a18e-5089411f1704\") "
Feb 02 12:06:00 crc kubenswrapper[4909]: I0202 12:06:00.918085 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/907a52c1-86b1-4e4f-a18e-5089411f1704-logs" (OuterVolumeSpecName: "logs") pod "907a52c1-86b1-4e4f-a18e-5089411f1704" (UID: "907a52c1-86b1-4e4f-a18e-5089411f1704"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 12:06:00 crc kubenswrapper[4909]: I0202 12:06:00.923180 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/907a52c1-86b1-4e4f-a18e-5089411f1704-kube-api-access-bpn8f" (OuterVolumeSpecName: "kube-api-access-bpn8f") pod "907a52c1-86b1-4e4f-a18e-5089411f1704" (UID: "907a52c1-86b1-4e4f-a18e-5089411f1704"). InnerVolumeSpecName "kube-api-access-bpn8f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 12:06:00 crc kubenswrapper[4909]: I0202 12:06:00.923623 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/907a52c1-86b1-4e4f-a18e-5089411f1704-scripts" (OuterVolumeSpecName: "scripts") pod "907a52c1-86b1-4e4f-a18e-5089411f1704" (UID: "907a52c1-86b1-4e4f-a18e-5089411f1704"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 12:06:00 crc kubenswrapper[4909]: I0202 12:06:00.944473 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/907a52c1-86b1-4e4f-a18e-5089411f1704-config-data" (OuterVolumeSpecName: "config-data") pod "907a52c1-86b1-4e4f-a18e-5089411f1704" (UID: "907a52c1-86b1-4e4f-a18e-5089411f1704"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 12:06:00 crc kubenswrapper[4909]: I0202 12:06:00.946622 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/907a52c1-86b1-4e4f-a18e-5089411f1704-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "907a52c1-86b1-4e4f-a18e-5089411f1704" (UID: "907a52c1-86b1-4e4f-a18e-5089411f1704"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 12:06:01 crc kubenswrapper[4909]: I0202 12:06:01.019729 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/907a52c1-86b1-4e4f-a18e-5089411f1704-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 12:06:01 crc kubenswrapper[4909]: I0202 12:06:01.019776 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/907a52c1-86b1-4e4f-a18e-5089411f1704-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 12:06:01 crc kubenswrapper[4909]: I0202 12:06:01.019789 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpn8f\" (UniqueName: \"kubernetes.io/projected/907a52c1-86b1-4e4f-a18e-5089411f1704-kube-api-access-bpn8f\") on node \"crc\" DevicePath \"\""
Feb 02 12:06:01 crc kubenswrapper[4909]: I0202 12:06:01.019801 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/907a52c1-86b1-4e4f-a18e-5089411f1704-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 12:06:01 crc kubenswrapper[4909]: I0202 12:06:01.019840 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/907a52c1-86b1-4e4f-a18e-5089411f1704-logs\") on node \"crc\" DevicePath \"\""
Feb 02 12:06:01 crc kubenswrapper[4909]: I0202 12:06:01.550493 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mj84r" event={"ID":"907a52c1-86b1-4e4f-a18e-5089411f1704","Type":"ContainerDied","Data":"1266905ac79bd384972e655baab020341113bb1b1b81eb11543f77f108e730cb"}
Feb 02 12:06:01 crc kubenswrapper[4909]: I0202 12:06:01.550837 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1266905ac79bd384972e655baab020341113bb1b1b81eb11543f77f108e730cb"
Feb 02 12:06:01 crc kubenswrapper[4909]: I0202 12:06:01.550565 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mj84r"
Feb 02 12:06:01 crc kubenswrapper[4909]: I0202 12:06:01.977020 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6bc5c4cc7d-29h64"]
Feb 02 12:06:01 crc kubenswrapper[4909]: E0202 12:06:01.977449 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="907a52c1-86b1-4e4f-a18e-5089411f1704" containerName="placement-db-sync"
Feb 02 12:06:01 crc kubenswrapper[4909]: I0202 12:06:01.977469 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="907a52c1-86b1-4e4f-a18e-5089411f1704" containerName="placement-db-sync"
Feb 02 12:06:01 crc kubenswrapper[4909]: I0202 12:06:01.977670 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="907a52c1-86b1-4e4f-a18e-5089411f1704" containerName="placement-db-sync"
Feb 02 12:06:01 crc kubenswrapper[4909]: I0202 12:06:01.978677 4909 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/placement-6bc5c4cc7d-29h64" Feb 02 12:06:01 crc kubenswrapper[4909]: I0202 12:06:01.981935 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 02 12:06:01 crc kubenswrapper[4909]: I0202 12:06:01.982230 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 02 12:06:01 crc kubenswrapper[4909]: I0202 12:06:01.982409 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-c6l5q" Feb 02 12:06:01 crc kubenswrapper[4909]: I0202 12:06:01.982590 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 02 12:06:01 crc kubenswrapper[4909]: I0202 12:06:01.982756 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 02 12:06:01 crc kubenswrapper[4909]: I0202 12:06:01.997801 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6bc5c4cc7d-29h64"] Feb 02 12:06:02 crc kubenswrapper[4909]: I0202 12:06:02.016518 4909 scope.go:117] "RemoveContainer" containerID="2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783" Feb 02 12:06:02 crc kubenswrapper[4909]: E0202 12:06:02.016719 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:06:02 crc kubenswrapper[4909]: I0202 12:06:02.139123 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/74164b48-4c93-4c9c-98c1-6997dc72ec87-internal-tls-certs\") pod \"placement-6bc5c4cc7d-29h64\" (UID: \"74164b48-4c93-4c9c-98c1-6997dc72ec87\") " pod="openstack/placement-6bc5c4cc7d-29h64" Feb 02 12:06:02 crc kubenswrapper[4909]: I0202 12:06:02.139273 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74164b48-4c93-4c9c-98c1-6997dc72ec87-combined-ca-bundle\") pod \"placement-6bc5c4cc7d-29h64\" (UID: \"74164b48-4c93-4c9c-98c1-6997dc72ec87\") " pod="openstack/placement-6bc5c4cc7d-29h64" Feb 02 12:06:02 crc kubenswrapper[4909]: I0202 12:06:02.139753 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74164b48-4c93-4c9c-98c1-6997dc72ec87-public-tls-certs\") pod \"placement-6bc5c4cc7d-29h64\" (UID: \"74164b48-4c93-4c9c-98c1-6997dc72ec87\") " pod="openstack/placement-6bc5c4cc7d-29h64" Feb 02 12:06:02 crc kubenswrapper[4909]: I0202 12:06:02.139876 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74164b48-4c93-4c9c-98c1-6997dc72ec87-scripts\") pod \"placement-6bc5c4cc7d-29h64\" (UID: \"74164b48-4c93-4c9c-98c1-6997dc72ec87\") " pod="openstack/placement-6bc5c4cc7d-29h64" Feb 02 12:06:02 crc kubenswrapper[4909]: I0202 12:06:02.139935 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnpbq\" (UniqueName: \"kubernetes.io/projected/74164b48-4c93-4c9c-98c1-6997dc72ec87-kube-api-access-fnpbq\") pod \"placement-6bc5c4cc7d-29h64\" (UID: \"74164b48-4c93-4c9c-98c1-6997dc72ec87\") " pod="openstack/placement-6bc5c4cc7d-29h64" Feb 02 12:06:02 crc kubenswrapper[4909]: I0202 12:06:02.139982 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/74164b48-4c93-4c9c-98c1-6997dc72ec87-logs\") pod \"placement-6bc5c4cc7d-29h64\" (UID: \"74164b48-4c93-4c9c-98c1-6997dc72ec87\") " pod="openstack/placement-6bc5c4cc7d-29h64" Feb 02 12:06:02 crc kubenswrapper[4909]: I0202 12:06:02.140067 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74164b48-4c93-4c9c-98c1-6997dc72ec87-config-data\") pod \"placement-6bc5c4cc7d-29h64\" (UID: \"74164b48-4c93-4c9c-98c1-6997dc72ec87\") " pod="openstack/placement-6bc5c4cc7d-29h64" Feb 02 12:06:02 crc kubenswrapper[4909]: I0202 12:06:02.242180 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74164b48-4c93-4c9c-98c1-6997dc72ec87-combined-ca-bundle\") pod \"placement-6bc5c4cc7d-29h64\" (UID: \"74164b48-4c93-4c9c-98c1-6997dc72ec87\") " pod="openstack/placement-6bc5c4cc7d-29h64" Feb 02 12:06:02 crc kubenswrapper[4909]: I0202 12:06:02.242263 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74164b48-4c93-4c9c-98c1-6997dc72ec87-public-tls-certs\") pod \"placement-6bc5c4cc7d-29h64\" (UID: \"74164b48-4c93-4c9c-98c1-6997dc72ec87\") " pod="openstack/placement-6bc5c4cc7d-29h64" Feb 02 12:06:02 crc kubenswrapper[4909]: I0202 12:06:02.242317 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74164b48-4c93-4c9c-98c1-6997dc72ec87-scripts\") pod \"placement-6bc5c4cc7d-29h64\" (UID: \"74164b48-4c93-4c9c-98c1-6997dc72ec87\") " pod="openstack/placement-6bc5c4cc7d-29h64" Feb 02 12:06:02 crc kubenswrapper[4909]: I0202 12:06:02.242351 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnpbq\" (UniqueName: 
\"kubernetes.io/projected/74164b48-4c93-4c9c-98c1-6997dc72ec87-kube-api-access-fnpbq\") pod \"placement-6bc5c4cc7d-29h64\" (UID: \"74164b48-4c93-4c9c-98c1-6997dc72ec87\") " pod="openstack/placement-6bc5c4cc7d-29h64" Feb 02 12:06:02 crc kubenswrapper[4909]: I0202 12:06:02.242386 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74164b48-4c93-4c9c-98c1-6997dc72ec87-logs\") pod \"placement-6bc5c4cc7d-29h64\" (UID: \"74164b48-4c93-4c9c-98c1-6997dc72ec87\") " pod="openstack/placement-6bc5c4cc7d-29h64" Feb 02 12:06:02 crc kubenswrapper[4909]: I0202 12:06:02.242433 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74164b48-4c93-4c9c-98c1-6997dc72ec87-config-data\") pod \"placement-6bc5c4cc7d-29h64\" (UID: \"74164b48-4c93-4c9c-98c1-6997dc72ec87\") " pod="openstack/placement-6bc5c4cc7d-29h64" Feb 02 12:06:02 crc kubenswrapper[4909]: I0202 12:06:02.242513 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74164b48-4c93-4c9c-98c1-6997dc72ec87-internal-tls-certs\") pod \"placement-6bc5c4cc7d-29h64\" (UID: \"74164b48-4c93-4c9c-98c1-6997dc72ec87\") " pod="openstack/placement-6bc5c4cc7d-29h64" Feb 02 12:06:02 crc kubenswrapper[4909]: I0202 12:06:02.243182 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74164b48-4c93-4c9c-98c1-6997dc72ec87-logs\") pod \"placement-6bc5c4cc7d-29h64\" (UID: \"74164b48-4c93-4c9c-98c1-6997dc72ec87\") " pod="openstack/placement-6bc5c4cc7d-29h64" Feb 02 12:06:02 crc kubenswrapper[4909]: I0202 12:06:02.247796 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74164b48-4c93-4c9c-98c1-6997dc72ec87-config-data\") pod \"placement-6bc5c4cc7d-29h64\" (UID: 
\"74164b48-4c93-4c9c-98c1-6997dc72ec87\") " pod="openstack/placement-6bc5c4cc7d-29h64" Feb 02 12:06:02 crc kubenswrapper[4909]: I0202 12:06:02.250484 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74164b48-4c93-4c9c-98c1-6997dc72ec87-scripts\") pod \"placement-6bc5c4cc7d-29h64\" (UID: \"74164b48-4c93-4c9c-98c1-6997dc72ec87\") " pod="openstack/placement-6bc5c4cc7d-29h64" Feb 02 12:06:02 crc kubenswrapper[4909]: I0202 12:06:02.252324 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/74164b48-4c93-4c9c-98c1-6997dc72ec87-public-tls-certs\") pod \"placement-6bc5c4cc7d-29h64\" (UID: \"74164b48-4c93-4c9c-98c1-6997dc72ec87\") " pod="openstack/placement-6bc5c4cc7d-29h64" Feb 02 12:06:02 crc kubenswrapper[4909]: I0202 12:06:02.261483 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74164b48-4c93-4c9c-98c1-6997dc72ec87-internal-tls-certs\") pod \"placement-6bc5c4cc7d-29h64\" (UID: \"74164b48-4c93-4c9c-98c1-6997dc72ec87\") " pod="openstack/placement-6bc5c4cc7d-29h64" Feb 02 12:06:02 crc kubenswrapper[4909]: I0202 12:06:02.263900 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74164b48-4c93-4c9c-98c1-6997dc72ec87-combined-ca-bundle\") pod \"placement-6bc5c4cc7d-29h64\" (UID: \"74164b48-4c93-4c9c-98c1-6997dc72ec87\") " pod="openstack/placement-6bc5c4cc7d-29h64" Feb 02 12:06:02 crc kubenswrapper[4909]: I0202 12:06:02.279593 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnpbq\" (UniqueName: \"kubernetes.io/projected/74164b48-4c93-4c9c-98c1-6997dc72ec87-kube-api-access-fnpbq\") pod \"placement-6bc5c4cc7d-29h64\" (UID: \"74164b48-4c93-4c9c-98c1-6997dc72ec87\") " pod="openstack/placement-6bc5c4cc7d-29h64" Feb 02 12:06:02 crc 
kubenswrapper[4909]: I0202 12:06:02.297242 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6bc5c4cc7d-29h64" Feb 02 12:06:02 crc kubenswrapper[4909]: I0202 12:06:02.845903 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6bc5c4cc7d-29h64"] Feb 02 12:06:02 crc kubenswrapper[4909]: W0202 12:06:02.853143 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74164b48_4c93_4c9c_98c1_6997dc72ec87.slice/crio-66e51106ed20515cdea57afc78cc16eedb64fb8a47c2d2a16687dbe1f60e3f6c WatchSource:0}: Error finding container 66e51106ed20515cdea57afc78cc16eedb64fb8a47c2d2a16687dbe1f60e3f6c: Status 404 returned error can't find the container with id 66e51106ed20515cdea57afc78cc16eedb64fb8a47c2d2a16687dbe1f60e3f6c Feb 02 12:06:03 crc kubenswrapper[4909]: I0202 12:06:03.568092 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6bc5c4cc7d-29h64" event={"ID":"74164b48-4c93-4c9c-98c1-6997dc72ec87","Type":"ContainerStarted","Data":"a0e52a929a815df9c0908862c1e2dc71abb4060c86c5e2f70cebfa132fa9401f"} Feb 02 12:06:03 crc kubenswrapper[4909]: I0202 12:06:03.568665 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6bc5c4cc7d-29h64" Feb 02 12:06:03 crc kubenswrapper[4909]: I0202 12:06:03.568679 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6bc5c4cc7d-29h64" Feb 02 12:06:03 crc kubenswrapper[4909]: I0202 12:06:03.568688 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6bc5c4cc7d-29h64" event={"ID":"74164b48-4c93-4c9c-98c1-6997dc72ec87","Type":"ContainerStarted","Data":"7c145d1d67c2181672f500aee987cfd8647e1fc30cf16c3c21d34fe179f78cf3"} Feb 02 12:06:03 crc kubenswrapper[4909]: I0202 12:06:03.568698 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-6bc5c4cc7d-29h64" event={"ID":"74164b48-4c93-4c9c-98c1-6997dc72ec87","Type":"ContainerStarted","Data":"66e51106ed20515cdea57afc78cc16eedb64fb8a47c2d2a16687dbe1f60e3f6c"} Feb 02 12:06:03 crc kubenswrapper[4909]: I0202 12:06:03.596364 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6bc5c4cc7d-29h64" podStartSLOduration=2.596337013 podStartE2EDuration="2.596337013s" podCreationTimestamp="2026-02-02 12:06:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:06:03.585087303 +0000 UTC m=+5689.331188048" watchObservedRunningTime="2026-02-02 12:06:03.596337013 +0000 UTC m=+5689.342437748" Feb 02 12:06:07 crc kubenswrapper[4909]: I0202 12:06:07.369557 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84bfdf865-29c4x" Feb 02 12:06:07 crc kubenswrapper[4909]: I0202 12:06:07.423504 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76c747cbd5-ntrd9"] Feb 02 12:06:07 crc kubenswrapper[4909]: I0202 12:06:07.423732 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76c747cbd5-ntrd9" podUID="45d46867-5215-4ec7-a2a8-94c398de6a4f" containerName="dnsmasq-dns" containerID="cri-o://b19fb7db4717b78c4d3b4a5128835e2eb236b26f7f191dffe4b397a9a5bd852f" gracePeriod=10 Feb 02 12:06:07 crc kubenswrapper[4909]: I0202 12:06:07.614003 4909 generic.go:334] "Generic (PLEG): container finished" podID="45d46867-5215-4ec7-a2a8-94c398de6a4f" containerID="b19fb7db4717b78c4d3b4a5128835e2eb236b26f7f191dffe4b397a9a5bd852f" exitCode=0 Feb 02 12:06:07 crc kubenswrapper[4909]: I0202 12:06:07.614041 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76c747cbd5-ntrd9" 
event={"ID":"45d46867-5215-4ec7-a2a8-94c398de6a4f","Type":"ContainerDied","Data":"b19fb7db4717b78c4d3b4a5128835e2eb236b26f7f191dffe4b397a9a5bd852f"} Feb 02 12:06:07 crc kubenswrapper[4909]: I0202 12:06:07.919108 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76c747cbd5-ntrd9" Feb 02 12:06:08 crc kubenswrapper[4909]: I0202 12:06:08.053375 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45d46867-5215-4ec7-a2a8-94c398de6a4f-dns-svc\") pod \"45d46867-5215-4ec7-a2a8-94c398de6a4f\" (UID: \"45d46867-5215-4ec7-a2a8-94c398de6a4f\") " Feb 02 12:06:08 crc kubenswrapper[4909]: I0202 12:06:08.053487 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45d46867-5215-4ec7-a2a8-94c398de6a4f-ovsdbserver-sb\") pod \"45d46867-5215-4ec7-a2a8-94c398de6a4f\" (UID: \"45d46867-5215-4ec7-a2a8-94c398de6a4f\") " Feb 02 12:06:08 crc kubenswrapper[4909]: I0202 12:06:08.053527 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn6mn\" (UniqueName: \"kubernetes.io/projected/45d46867-5215-4ec7-a2a8-94c398de6a4f-kube-api-access-pn6mn\") pod \"45d46867-5215-4ec7-a2a8-94c398de6a4f\" (UID: \"45d46867-5215-4ec7-a2a8-94c398de6a4f\") " Feb 02 12:06:08 crc kubenswrapper[4909]: I0202 12:06:08.053589 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45d46867-5215-4ec7-a2a8-94c398de6a4f-ovsdbserver-nb\") pod \"45d46867-5215-4ec7-a2a8-94c398de6a4f\" (UID: \"45d46867-5215-4ec7-a2a8-94c398de6a4f\") " Feb 02 12:06:08 crc kubenswrapper[4909]: I0202 12:06:08.053648 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45d46867-5215-4ec7-a2a8-94c398de6a4f-config\") 
pod \"45d46867-5215-4ec7-a2a8-94c398de6a4f\" (UID: \"45d46867-5215-4ec7-a2a8-94c398de6a4f\") " Feb 02 12:06:08 crc kubenswrapper[4909]: I0202 12:06:08.059164 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45d46867-5215-4ec7-a2a8-94c398de6a4f-kube-api-access-pn6mn" (OuterVolumeSpecName: "kube-api-access-pn6mn") pod "45d46867-5215-4ec7-a2a8-94c398de6a4f" (UID: "45d46867-5215-4ec7-a2a8-94c398de6a4f"). InnerVolumeSpecName "kube-api-access-pn6mn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:06:08 crc kubenswrapper[4909]: I0202 12:06:08.114640 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45d46867-5215-4ec7-a2a8-94c398de6a4f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "45d46867-5215-4ec7-a2a8-94c398de6a4f" (UID: "45d46867-5215-4ec7-a2a8-94c398de6a4f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:06:08 crc kubenswrapper[4909]: I0202 12:06:08.157943 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45d46867-5215-4ec7-a2a8-94c398de6a4f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 12:06:08 crc kubenswrapper[4909]: I0202 12:06:08.157978 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pn6mn\" (UniqueName: \"kubernetes.io/projected/45d46867-5215-4ec7-a2a8-94c398de6a4f-kube-api-access-pn6mn\") on node \"crc\" DevicePath \"\"" Feb 02 12:06:08 crc kubenswrapper[4909]: I0202 12:06:08.621878 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76c747cbd5-ntrd9" event={"ID":"45d46867-5215-4ec7-a2a8-94c398de6a4f","Type":"ContainerDied","Data":"3f90152365872ead39059b6ccf4c14a707b4499af781855043722deb3f491078"} Feb 02 12:06:08 crc kubenswrapper[4909]: I0202 12:06:08.621930 4909 scope.go:117] "RemoveContainer" 
containerID="b19fb7db4717b78c4d3b4a5128835e2eb236b26f7f191dffe4b397a9a5bd852f" Feb 02 12:06:08 crc kubenswrapper[4909]: I0202 12:06:08.622230 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76c747cbd5-ntrd9" Feb 02 12:06:08 crc kubenswrapper[4909]: I0202 12:06:08.691209 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45d46867-5215-4ec7-a2a8-94c398de6a4f-config" (OuterVolumeSpecName: "config") pod "45d46867-5215-4ec7-a2a8-94c398de6a4f" (UID: "45d46867-5215-4ec7-a2a8-94c398de6a4f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:06:08 crc kubenswrapper[4909]: I0202 12:06:08.693293 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45d46867-5215-4ec7-a2a8-94c398de6a4f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "45d46867-5215-4ec7-a2a8-94c398de6a4f" (UID: "45d46867-5215-4ec7-a2a8-94c398de6a4f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:06:08 crc kubenswrapper[4909]: I0202 12:06:08.693972 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45d46867-5215-4ec7-a2a8-94c398de6a4f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "45d46867-5215-4ec7-a2a8-94c398de6a4f" (UID: "45d46867-5215-4ec7-a2a8-94c398de6a4f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:06:08 crc kubenswrapper[4909]: I0202 12:06:08.701601 4909 scope.go:117] "RemoveContainer" containerID="8042a24e72598aec2239bbac4c00396a9f4536978eb8540a3b7e2dc17283efc9" Feb 02 12:06:08 crc kubenswrapper[4909]: I0202 12:06:08.792197 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45d46867-5215-4ec7-a2a8-94c398de6a4f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 12:06:08 crc kubenswrapper[4909]: I0202 12:06:08.792236 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45d46867-5215-4ec7-a2a8-94c398de6a4f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 12:06:08 crc kubenswrapper[4909]: I0202 12:06:08.792246 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45d46867-5215-4ec7-a2a8-94c398de6a4f-config\") on node \"crc\" DevicePath \"\"" Feb 02 12:06:08 crc kubenswrapper[4909]: I0202 12:06:08.964261 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76c747cbd5-ntrd9"] Feb 02 12:06:08 crc kubenswrapper[4909]: I0202 12:06:08.972253 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76c747cbd5-ntrd9"] Feb 02 12:06:09 crc kubenswrapper[4909]: I0202 12:06:09.028547 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45d46867-5215-4ec7-a2a8-94c398de6a4f" path="/var/lib/kubelet/pods/45d46867-5215-4ec7-a2a8-94c398de6a4f/volumes" Feb 02 12:06:14 crc kubenswrapper[4909]: I0202 12:06:14.018066 4909 scope.go:117] "RemoveContainer" containerID="2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783" Feb 02 12:06:14 crc kubenswrapper[4909]: E0202 12:06:14.020242 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:06:26 crc kubenswrapper[4909]: I0202 12:06:26.016408 4909 scope.go:117] "RemoveContainer" containerID="2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783" Feb 02 12:06:26 crc kubenswrapper[4909]: E0202 12:06:26.018222 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:06:33 crc kubenswrapper[4909]: I0202 12:06:33.381946 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6bc5c4cc7d-29h64" Feb 02 12:06:34 crc kubenswrapper[4909]: I0202 12:06:34.392711 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6bc5c4cc7d-29h64" Feb 02 12:06:41 crc kubenswrapper[4909]: I0202 12:06:41.017660 4909 scope.go:117] "RemoveContainer" containerID="2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783" Feb 02 12:06:41 crc kubenswrapper[4909]: E0202 12:06:41.018771 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:06:54 crc 
kubenswrapper[4909]: I0202 12:06:54.059634 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-hrrp5"] Feb 02 12:06:54 crc kubenswrapper[4909]: E0202 12:06:54.060558 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d46867-5215-4ec7-a2a8-94c398de6a4f" containerName="dnsmasq-dns" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.060575 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d46867-5215-4ec7-a2a8-94c398de6a4f" containerName="dnsmasq-dns" Feb 02 12:06:54 crc kubenswrapper[4909]: E0202 12:06:54.060598 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d46867-5215-4ec7-a2a8-94c398de6a4f" containerName="init" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.060605 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d46867-5215-4ec7-a2a8-94c398de6a4f" containerName="init" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.060778 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="45d46867-5215-4ec7-a2a8-94c398de6a4f" containerName="dnsmasq-dns" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.061397 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-hrrp5" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.079884 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hrrp5"] Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.128252 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg6fk\" (UniqueName: \"kubernetes.io/projected/44f1c177-7f74-49e3-a737-1cf825d08c5d-kube-api-access-fg6fk\") pod \"nova-api-db-create-hrrp5\" (UID: \"44f1c177-7f74-49e3-a737-1cf825d08c5d\") " pod="openstack/nova-api-db-create-hrrp5" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.128380 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44f1c177-7f74-49e3-a737-1cf825d08c5d-operator-scripts\") pod \"nova-api-db-create-hrrp5\" (UID: \"44f1c177-7f74-49e3-a737-1cf825d08c5d\") " pod="openstack/nova-api-db-create-hrrp5" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.158744 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-5399-account-create-update-rlcdb"] Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.160045 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5399-account-create-update-rlcdb" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.162249 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.168127 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-m8g58"] Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.170053 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-m8g58" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.182426 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-m8g58"] Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.209495 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5399-account-create-update-rlcdb"] Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.229597 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg6fk\" (UniqueName: \"kubernetes.io/projected/44f1c177-7f74-49e3-a737-1cf825d08c5d-kube-api-access-fg6fk\") pod \"nova-api-db-create-hrrp5\" (UID: \"44f1c177-7f74-49e3-a737-1cf825d08c5d\") " pod="openstack/nova-api-db-create-hrrp5" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.229727 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96q5s\" (UniqueName: \"kubernetes.io/projected/4848315d-b355-43b8-961b-440dd5a94e2b-kube-api-access-96q5s\") pod \"nova-cell0-db-create-m8g58\" (UID: \"4848315d-b355-43b8-961b-440dd5a94e2b\") " pod="openstack/nova-cell0-db-create-m8g58" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.229779 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44f1c177-7f74-49e3-a737-1cf825d08c5d-operator-scripts\") pod \"nova-api-db-create-hrrp5\" (UID: \"44f1c177-7f74-49e3-a737-1cf825d08c5d\") " pod="openstack/nova-api-db-create-hrrp5" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.229828 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4848315d-b355-43b8-961b-440dd5a94e2b-operator-scripts\") pod \"nova-cell0-db-create-m8g58\" (UID: \"4848315d-b355-43b8-961b-440dd5a94e2b\") " 
pod="openstack/nova-cell0-db-create-m8g58" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.229887 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42e427d5-c936-4052-958c-18d89b07527c-operator-scripts\") pod \"nova-api-5399-account-create-update-rlcdb\" (UID: \"42e427d5-c936-4052-958c-18d89b07527c\") " pod="openstack/nova-api-5399-account-create-update-rlcdb" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.229964 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c72wh\" (UniqueName: \"kubernetes.io/projected/42e427d5-c936-4052-958c-18d89b07527c-kube-api-access-c72wh\") pod \"nova-api-5399-account-create-update-rlcdb\" (UID: \"42e427d5-c936-4052-958c-18d89b07527c\") " pod="openstack/nova-api-5399-account-create-update-rlcdb" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.231424 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44f1c177-7f74-49e3-a737-1cf825d08c5d-operator-scripts\") pod \"nova-api-db-create-hrrp5\" (UID: \"44f1c177-7f74-49e3-a737-1cf825d08c5d\") " pod="openstack/nova-api-db-create-hrrp5" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.252898 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg6fk\" (UniqueName: \"kubernetes.io/projected/44f1c177-7f74-49e3-a737-1cf825d08c5d-kube-api-access-fg6fk\") pod \"nova-api-db-create-hrrp5\" (UID: \"44f1c177-7f74-49e3-a737-1cf825d08c5d\") " pod="openstack/nova-api-db-create-hrrp5" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.331927 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c72wh\" (UniqueName: \"kubernetes.io/projected/42e427d5-c936-4052-958c-18d89b07527c-kube-api-access-c72wh\") pod 
\"nova-api-5399-account-create-update-rlcdb\" (UID: \"42e427d5-c936-4052-958c-18d89b07527c\") " pod="openstack/nova-api-5399-account-create-update-rlcdb" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.332086 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96q5s\" (UniqueName: \"kubernetes.io/projected/4848315d-b355-43b8-961b-440dd5a94e2b-kube-api-access-96q5s\") pod \"nova-cell0-db-create-m8g58\" (UID: \"4848315d-b355-43b8-961b-440dd5a94e2b\") " pod="openstack/nova-cell0-db-create-m8g58" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.332158 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4848315d-b355-43b8-961b-440dd5a94e2b-operator-scripts\") pod \"nova-cell0-db-create-m8g58\" (UID: \"4848315d-b355-43b8-961b-440dd5a94e2b\") " pod="openstack/nova-cell0-db-create-m8g58" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.332219 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42e427d5-c936-4052-958c-18d89b07527c-operator-scripts\") pod \"nova-api-5399-account-create-update-rlcdb\" (UID: \"42e427d5-c936-4052-958c-18d89b07527c\") " pod="openstack/nova-api-5399-account-create-update-rlcdb" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.333266 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42e427d5-c936-4052-958c-18d89b07527c-operator-scripts\") pod \"nova-api-5399-account-create-update-rlcdb\" (UID: \"42e427d5-c936-4052-958c-18d89b07527c\") " pod="openstack/nova-api-5399-account-create-update-rlcdb" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.333278 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4848315d-b355-43b8-961b-440dd5a94e2b-operator-scripts\") pod \"nova-cell0-db-create-m8g58\" (UID: \"4848315d-b355-43b8-961b-440dd5a94e2b\") " pod="openstack/nova-cell0-db-create-m8g58" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.353082 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-9rcwj"] Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.354287 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-9rcwj" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.359975 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96q5s\" (UniqueName: \"kubernetes.io/projected/4848315d-b355-43b8-961b-440dd5a94e2b-kube-api-access-96q5s\") pod \"nova-cell0-db-create-m8g58\" (UID: \"4848315d-b355-43b8-961b-440dd5a94e2b\") " pod="openstack/nova-cell0-db-create-m8g58" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.363409 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c72wh\" (UniqueName: \"kubernetes.io/projected/42e427d5-c936-4052-958c-18d89b07527c-kube-api-access-c72wh\") pod \"nova-api-5399-account-create-update-rlcdb\" (UID: \"42e427d5-c936-4052-958c-18d89b07527c\") " pod="openstack/nova-api-5399-account-create-update-rlcdb" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.372267 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-5743-account-create-update-v2jqx"] Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.373428 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-5743-account-create-update-v2jqx" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.375333 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.394565 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hrrp5" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.431852 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-9rcwj"] Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.436581 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-987l8\" (UniqueName: \"kubernetes.io/projected/8714421b-a562-42dc-8b61-262ddf02239f-kube-api-access-987l8\") pod \"nova-cell0-5743-account-create-update-v2jqx\" (UID: \"8714421b-a562-42dc-8b61-262ddf02239f\") " pod="openstack/nova-cell0-5743-account-create-update-v2jqx" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.438014 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98jh7\" (UniqueName: \"kubernetes.io/projected/c5d11b84-2bb6-432f-9a25-04877030be31-kube-api-access-98jh7\") pod \"nova-cell1-db-create-9rcwj\" (UID: \"c5d11b84-2bb6-432f-9a25-04877030be31\") " pod="openstack/nova-cell1-db-create-9rcwj" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.438127 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5d11b84-2bb6-432f-9a25-04877030be31-operator-scripts\") pod \"nova-cell1-db-create-9rcwj\" (UID: \"c5d11b84-2bb6-432f-9a25-04877030be31\") " pod="openstack/nova-cell1-db-create-9rcwj" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.438304 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8714421b-a562-42dc-8b61-262ddf02239f-operator-scripts\") pod \"nova-cell0-5743-account-create-update-v2jqx\" (UID: \"8714421b-a562-42dc-8b61-262ddf02239f\") " pod="openstack/nova-cell0-5743-account-create-update-v2jqx" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.448536 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-5743-account-create-update-v2jqx"] Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.481712 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5399-account-create-update-rlcdb" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.501413 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-m8g58" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.540286 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8714421b-a562-42dc-8b61-262ddf02239f-operator-scripts\") pod \"nova-cell0-5743-account-create-update-v2jqx\" (UID: \"8714421b-a562-42dc-8b61-262ddf02239f\") " pod="openstack/nova-cell0-5743-account-create-update-v2jqx" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.540349 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-987l8\" (UniqueName: \"kubernetes.io/projected/8714421b-a562-42dc-8b61-262ddf02239f-kube-api-access-987l8\") pod \"nova-cell0-5743-account-create-update-v2jqx\" (UID: \"8714421b-a562-42dc-8b61-262ddf02239f\") " pod="openstack/nova-cell0-5743-account-create-update-v2jqx" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.540442 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98jh7\" (UniqueName: 
\"kubernetes.io/projected/c5d11b84-2bb6-432f-9a25-04877030be31-kube-api-access-98jh7\") pod \"nova-cell1-db-create-9rcwj\" (UID: \"c5d11b84-2bb6-432f-9a25-04877030be31\") " pod="openstack/nova-cell1-db-create-9rcwj" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.540480 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5d11b84-2bb6-432f-9a25-04877030be31-operator-scripts\") pod \"nova-cell1-db-create-9rcwj\" (UID: \"c5d11b84-2bb6-432f-9a25-04877030be31\") " pod="openstack/nova-cell1-db-create-9rcwj" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.541325 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5d11b84-2bb6-432f-9a25-04877030be31-operator-scripts\") pod \"nova-cell1-db-create-9rcwj\" (UID: \"c5d11b84-2bb6-432f-9a25-04877030be31\") " pod="openstack/nova-cell1-db-create-9rcwj" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.541325 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8714421b-a562-42dc-8b61-262ddf02239f-operator-scripts\") pod \"nova-cell0-5743-account-create-update-v2jqx\" (UID: \"8714421b-a562-42dc-8b61-262ddf02239f\") " pod="openstack/nova-cell0-5743-account-create-update-v2jqx" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.569410 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-987l8\" (UniqueName: \"kubernetes.io/projected/8714421b-a562-42dc-8b61-262ddf02239f-kube-api-access-987l8\") pod \"nova-cell0-5743-account-create-update-v2jqx\" (UID: \"8714421b-a562-42dc-8b61-262ddf02239f\") " pod="openstack/nova-cell0-5743-account-create-update-v2jqx" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.574910 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98jh7\" 
(UniqueName: \"kubernetes.io/projected/c5d11b84-2bb6-432f-9a25-04877030be31-kube-api-access-98jh7\") pod \"nova-cell1-db-create-9rcwj\" (UID: \"c5d11b84-2bb6-432f-9a25-04877030be31\") " pod="openstack/nova-cell1-db-create-9rcwj" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.578847 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-9rcwj" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.588498 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-b701-account-create-update-vpmq8"] Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.590945 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b701-account-create-update-vpmq8" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.594233 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.598391 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-b701-account-create-update-vpmq8"] Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.598448 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-5743-account-create-update-v2jqx" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.641713 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2986b45-6e46-4553-999c-6a89b4565b88-operator-scripts\") pod \"nova-cell1-b701-account-create-update-vpmq8\" (UID: \"d2986b45-6e46-4553-999c-6a89b4565b88\") " pod="openstack/nova-cell1-b701-account-create-update-vpmq8" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.641841 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqkk2\" (UniqueName: \"kubernetes.io/projected/d2986b45-6e46-4553-999c-6a89b4565b88-kube-api-access-cqkk2\") pod \"nova-cell1-b701-account-create-update-vpmq8\" (UID: \"d2986b45-6e46-4553-999c-6a89b4565b88\") " pod="openstack/nova-cell1-b701-account-create-update-vpmq8" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.745495 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqkk2\" (UniqueName: \"kubernetes.io/projected/d2986b45-6e46-4553-999c-6a89b4565b88-kube-api-access-cqkk2\") pod \"nova-cell1-b701-account-create-update-vpmq8\" (UID: \"d2986b45-6e46-4553-999c-6a89b4565b88\") " pod="openstack/nova-cell1-b701-account-create-update-vpmq8" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.745830 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2986b45-6e46-4553-999c-6a89b4565b88-operator-scripts\") pod \"nova-cell1-b701-account-create-update-vpmq8\" (UID: \"d2986b45-6e46-4553-999c-6a89b4565b88\") " pod="openstack/nova-cell1-b701-account-create-update-vpmq8" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.748739 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/d2986b45-6e46-4553-999c-6a89b4565b88-operator-scripts\") pod \"nova-cell1-b701-account-create-update-vpmq8\" (UID: \"d2986b45-6e46-4553-999c-6a89b4565b88\") " pod="openstack/nova-cell1-b701-account-create-update-vpmq8" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.782682 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqkk2\" (UniqueName: \"kubernetes.io/projected/d2986b45-6e46-4553-999c-6a89b4565b88-kube-api-access-cqkk2\") pod \"nova-cell1-b701-account-create-update-vpmq8\" (UID: \"d2986b45-6e46-4553-999c-6a89b4565b88\") " pod="openstack/nova-cell1-b701-account-create-update-vpmq8" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.916424 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b701-account-create-update-vpmq8" Feb 02 12:06:54 crc kubenswrapper[4909]: I0202 12:06:54.978461 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hrrp5"] Feb 02 12:06:55 crc kubenswrapper[4909]: I0202 12:06:55.056968 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hrrp5" event={"ID":"44f1c177-7f74-49e3-a737-1cf825d08c5d","Type":"ContainerStarted","Data":"0cbdf9bfa57310360b9c0b08ec4b140fbb8e053ad3c5fdf3fd1b6eed91a6eb66"} Feb 02 12:06:55 crc kubenswrapper[4909]: I0202 12:06:55.156822 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5399-account-create-update-rlcdb"] Feb 02 12:06:55 crc kubenswrapper[4909]: W0202 12:06:55.162441 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42e427d5_c936_4052_958c_18d89b07527c.slice/crio-89ec57e63eae935470570bc988b2a59e010e41450fe3f5a8e75fed3bf4dc9d26 WatchSource:0}: Error finding container 89ec57e63eae935470570bc988b2a59e010e41450fe3f5a8e75fed3bf4dc9d26: Status 404 returned error can't find the container 
with id 89ec57e63eae935470570bc988b2a59e010e41450fe3f5a8e75fed3bf4dc9d26 Feb 02 12:06:55 crc kubenswrapper[4909]: I0202 12:06:55.249920 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-5743-account-create-update-v2jqx"] Feb 02 12:06:55 crc kubenswrapper[4909]: I0202 12:06:55.277013 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-9rcwj"] Feb 02 12:06:55 crc kubenswrapper[4909]: I0202 12:06:55.289753 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-m8g58"] Feb 02 12:06:55 crc kubenswrapper[4909]: W0202 12:06:55.296707 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4848315d_b355_43b8_961b_440dd5a94e2b.slice/crio-55bc5835572a67748a015e0bba83fcefd80ead3fb7c6bb8d2ffc709cbc496253 WatchSource:0}: Error finding container 55bc5835572a67748a015e0bba83fcefd80ead3fb7c6bb8d2ffc709cbc496253: Status 404 returned error can't find the container with id 55bc5835572a67748a015e0bba83fcefd80ead3fb7c6bb8d2ffc709cbc496253 Feb 02 12:06:55 crc kubenswrapper[4909]: I0202 12:06:55.501983 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-b701-account-create-update-vpmq8"] Feb 02 12:06:55 crc kubenswrapper[4909]: W0202 12:06:55.514330 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2986b45_6e46_4553_999c_6a89b4565b88.slice/crio-f6567cbaab231f0a81691301c487be313868dce316065b55d4e08ed054944d59 WatchSource:0}: Error finding container f6567cbaab231f0a81691301c487be313868dce316065b55d4e08ed054944d59: Status 404 returned error can't find the container with id f6567cbaab231f0a81691301c487be313868dce316065b55d4e08ed054944d59 Feb 02 12:06:56 crc kubenswrapper[4909]: I0202 12:06:56.016243 4909 scope.go:117] "RemoveContainer" 
containerID="2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783" Feb 02 12:06:56 crc kubenswrapper[4909]: E0202 12:06:56.016848 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:06:56 crc kubenswrapper[4909]: I0202 12:06:56.067051 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5743-account-create-update-v2jqx" event={"ID":"8714421b-a562-42dc-8b61-262ddf02239f","Type":"ContainerStarted","Data":"48081bf6362bf6146ca90dc6fa377e02b8965d15aca0a3b9cd6b4eee0776927f"} Feb 02 12:06:56 crc kubenswrapper[4909]: I0202 12:06:56.067093 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5743-account-create-update-v2jqx" event={"ID":"8714421b-a562-42dc-8b61-262ddf02239f","Type":"ContainerStarted","Data":"27c9b56dd3e2db8067c81763af57874e37ec1818ad83e83e3cf7f54bb05b007b"} Feb 02 12:06:56 crc kubenswrapper[4909]: I0202 12:06:56.069135 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9rcwj" event={"ID":"c5d11b84-2bb6-432f-9a25-04877030be31","Type":"ContainerStarted","Data":"24c9a991d3fd21a9ed5d3b49b487ede40551fd41bdb80efea4b32e9ff8af9291"} Feb 02 12:06:56 crc kubenswrapper[4909]: I0202 12:06:56.069189 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9rcwj" event={"ID":"c5d11b84-2bb6-432f-9a25-04877030be31","Type":"ContainerStarted","Data":"0c488002e98b2833998eec3e8ad8673e332bdd6014b57898d388d8dc0e13c3f6"} Feb 02 12:06:56 crc kubenswrapper[4909]: I0202 12:06:56.072245 4909 generic.go:334] "Generic (PLEG): container finished" 
podID="44f1c177-7f74-49e3-a737-1cf825d08c5d" containerID="bb88c2966cceec49ea9bc8516b2c47570c6c0c018f1d7519bb9180de67e69a78" exitCode=0 Feb 02 12:06:56 crc kubenswrapper[4909]: I0202 12:06:56.072299 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hrrp5" event={"ID":"44f1c177-7f74-49e3-a737-1cf825d08c5d","Type":"ContainerDied","Data":"bb88c2966cceec49ea9bc8516b2c47570c6c0c018f1d7519bb9180de67e69a78"} Feb 02 12:06:56 crc kubenswrapper[4909]: I0202 12:06:56.075355 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-m8g58" event={"ID":"4848315d-b355-43b8-961b-440dd5a94e2b","Type":"ContainerStarted","Data":"769460b0ab5109d1895e253677ff79de6e322d836ca318ae6308a6fbbf4e02b4"} Feb 02 12:06:56 crc kubenswrapper[4909]: I0202 12:06:56.075426 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-m8g58" event={"ID":"4848315d-b355-43b8-961b-440dd5a94e2b","Type":"ContainerStarted","Data":"55bc5835572a67748a015e0bba83fcefd80ead3fb7c6bb8d2ffc709cbc496253"} Feb 02 12:06:56 crc kubenswrapper[4909]: I0202 12:06:56.077343 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5399-account-create-update-rlcdb" event={"ID":"42e427d5-c936-4052-958c-18d89b07527c","Type":"ContainerStarted","Data":"35dd6db6f8142a6fa8503586aa43ded6a3fc9638519c7a152f8c877b372bdc13"} Feb 02 12:06:56 crc kubenswrapper[4909]: I0202 12:06:56.077398 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5399-account-create-update-rlcdb" event={"ID":"42e427d5-c936-4052-958c-18d89b07527c","Type":"ContainerStarted","Data":"89ec57e63eae935470570bc988b2a59e010e41450fe3f5a8e75fed3bf4dc9d26"} Feb 02 12:06:56 crc kubenswrapper[4909]: I0202 12:06:56.079652 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b701-account-create-update-vpmq8" 
event={"ID":"d2986b45-6e46-4553-999c-6a89b4565b88","Type":"ContainerStarted","Data":"2e6a2733feb147774be39298787e3b3e5b27d1fe88d1ac9eba9603544ababe3d"} Feb 02 12:06:56 crc kubenswrapper[4909]: I0202 12:06:56.079687 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b701-account-create-update-vpmq8" event={"ID":"d2986b45-6e46-4553-999c-6a89b4565b88","Type":"ContainerStarted","Data":"f6567cbaab231f0a81691301c487be313868dce316065b55d4e08ed054944d59"} Feb 02 12:06:56 crc kubenswrapper[4909]: I0202 12:06:56.094804 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-5743-account-create-update-v2jqx" podStartSLOduration=2.094783139 podStartE2EDuration="2.094783139s" podCreationTimestamp="2026-02-02 12:06:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:06:56.082652105 +0000 UTC m=+5741.828752840" watchObservedRunningTime="2026-02-02 12:06:56.094783139 +0000 UTC m=+5741.840883874" Feb 02 12:06:56 crc kubenswrapper[4909]: I0202 12:06:56.106651 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-9rcwj" podStartSLOduration=2.106631765 podStartE2EDuration="2.106631765s" podCreationTimestamp="2026-02-02 12:06:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:06:56.098399512 +0000 UTC m=+5741.844500247" watchObservedRunningTime="2026-02-02 12:06:56.106631765 +0000 UTC m=+5741.852732500" Feb 02 12:06:56 crc kubenswrapper[4909]: I0202 12:06:56.119246 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-5399-account-create-update-rlcdb" podStartSLOduration=2.119226693 podStartE2EDuration="2.119226693s" podCreationTimestamp="2026-02-02 12:06:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:06:56.117339609 +0000 UTC m=+5741.863440364" watchObservedRunningTime="2026-02-02 12:06:56.119226693 +0000 UTC m=+5741.865327428" Feb 02 12:06:56 crc kubenswrapper[4909]: I0202 12:06:56.156797 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-m8g58" podStartSLOduration=2.156766668 podStartE2EDuration="2.156766668s" podCreationTimestamp="2026-02-02 12:06:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:06:56.142416931 +0000 UTC m=+5741.888517666" watchObservedRunningTime="2026-02-02 12:06:56.156766668 +0000 UTC m=+5741.902867393" Feb 02 12:06:56 crc kubenswrapper[4909]: I0202 12:06:56.165280 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-b701-account-create-update-vpmq8" podStartSLOduration=2.165251519 podStartE2EDuration="2.165251519s" podCreationTimestamp="2026-02-02 12:06:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:06:56.1585932 +0000 UTC m=+5741.904693935" watchObservedRunningTime="2026-02-02 12:06:56.165251519 +0000 UTC m=+5741.911352274" Feb 02 12:06:57 crc kubenswrapper[4909]: I0202 12:06:57.095854 4909 generic.go:334] "Generic (PLEG): container finished" podID="c5d11b84-2bb6-432f-9a25-04877030be31" containerID="24c9a991d3fd21a9ed5d3b49b487ede40551fd41bdb80efea4b32e9ff8af9291" exitCode=0 Feb 02 12:06:57 crc kubenswrapper[4909]: I0202 12:06:57.096242 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9rcwj" event={"ID":"c5d11b84-2bb6-432f-9a25-04877030be31","Type":"ContainerDied","Data":"24c9a991d3fd21a9ed5d3b49b487ede40551fd41bdb80efea4b32e9ff8af9291"} Feb 02 12:06:57 crc kubenswrapper[4909]: I0202 
12:06:57.100518 4909 generic.go:334] "Generic (PLEG): container finished" podID="4848315d-b355-43b8-961b-440dd5a94e2b" containerID="769460b0ab5109d1895e253677ff79de6e322d836ca318ae6308a6fbbf4e02b4" exitCode=0 Feb 02 12:06:57 crc kubenswrapper[4909]: I0202 12:06:57.100583 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-m8g58" event={"ID":"4848315d-b355-43b8-961b-440dd5a94e2b","Type":"ContainerDied","Data":"769460b0ab5109d1895e253677ff79de6e322d836ca318ae6308a6fbbf4e02b4"} Feb 02 12:06:57 crc kubenswrapper[4909]: I0202 12:06:57.103043 4909 generic.go:334] "Generic (PLEG): container finished" podID="42e427d5-c936-4052-958c-18d89b07527c" containerID="35dd6db6f8142a6fa8503586aa43ded6a3fc9638519c7a152f8c877b372bdc13" exitCode=0 Feb 02 12:06:57 crc kubenswrapper[4909]: I0202 12:06:57.103105 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5399-account-create-update-rlcdb" event={"ID":"42e427d5-c936-4052-958c-18d89b07527c","Type":"ContainerDied","Data":"35dd6db6f8142a6fa8503586aa43ded6a3fc9638519c7a152f8c877b372bdc13"} Feb 02 12:06:57 crc kubenswrapper[4909]: I0202 12:06:57.105631 4909 generic.go:334] "Generic (PLEG): container finished" podID="d2986b45-6e46-4553-999c-6a89b4565b88" containerID="2e6a2733feb147774be39298787e3b3e5b27d1fe88d1ac9eba9603544ababe3d" exitCode=0 Feb 02 12:06:57 crc kubenswrapper[4909]: I0202 12:06:57.105679 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b701-account-create-update-vpmq8" event={"ID":"d2986b45-6e46-4553-999c-6a89b4565b88","Type":"ContainerDied","Data":"2e6a2733feb147774be39298787e3b3e5b27d1fe88d1ac9eba9603544ababe3d"} Feb 02 12:06:57 crc kubenswrapper[4909]: I0202 12:06:57.109857 4909 generic.go:334] "Generic (PLEG): container finished" podID="8714421b-a562-42dc-8b61-262ddf02239f" containerID="48081bf6362bf6146ca90dc6fa377e02b8965d15aca0a3b9cd6b4eee0776927f" exitCode=0 Feb 02 12:06:57 crc kubenswrapper[4909]: I0202 
12:06:57.110222 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5743-account-create-update-v2jqx" event={"ID":"8714421b-a562-42dc-8b61-262ddf02239f","Type":"ContainerDied","Data":"48081bf6362bf6146ca90dc6fa377e02b8965d15aca0a3b9cd6b4eee0776927f"} Feb 02 12:06:57 crc kubenswrapper[4909]: I0202 12:06:57.554242 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hrrp5" Feb 02 12:06:57 crc kubenswrapper[4909]: I0202 12:06:57.604696 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg6fk\" (UniqueName: \"kubernetes.io/projected/44f1c177-7f74-49e3-a737-1cf825d08c5d-kube-api-access-fg6fk\") pod \"44f1c177-7f74-49e3-a737-1cf825d08c5d\" (UID: \"44f1c177-7f74-49e3-a737-1cf825d08c5d\") " Feb 02 12:06:57 crc kubenswrapper[4909]: I0202 12:06:57.605022 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44f1c177-7f74-49e3-a737-1cf825d08c5d-operator-scripts\") pod \"44f1c177-7f74-49e3-a737-1cf825d08c5d\" (UID: \"44f1c177-7f74-49e3-a737-1cf825d08c5d\") " Feb 02 12:06:57 crc kubenswrapper[4909]: I0202 12:06:57.605630 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44f1c177-7f74-49e3-a737-1cf825d08c5d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "44f1c177-7f74-49e3-a737-1cf825d08c5d" (UID: "44f1c177-7f74-49e3-a737-1cf825d08c5d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:06:57 crc kubenswrapper[4909]: I0202 12:06:57.611240 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44f1c177-7f74-49e3-a737-1cf825d08c5d-kube-api-access-fg6fk" (OuterVolumeSpecName: "kube-api-access-fg6fk") pod "44f1c177-7f74-49e3-a737-1cf825d08c5d" (UID: "44f1c177-7f74-49e3-a737-1cf825d08c5d"). InnerVolumeSpecName "kube-api-access-fg6fk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:06:57 crc kubenswrapper[4909]: I0202 12:06:57.707322 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44f1c177-7f74-49e3-a737-1cf825d08c5d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:06:57 crc kubenswrapper[4909]: I0202 12:06:57.707369 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg6fk\" (UniqueName: \"kubernetes.io/projected/44f1c177-7f74-49e3-a737-1cf825d08c5d-kube-api-access-fg6fk\") on node \"crc\" DevicePath \"\"" Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.123141 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hrrp5" Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.123157 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hrrp5" event={"ID":"44f1c177-7f74-49e3-a737-1cf825d08c5d","Type":"ContainerDied","Data":"0cbdf9bfa57310360b9c0b08ec4b140fbb8e053ad3c5fdf3fd1b6eed91a6eb66"} Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.123436 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cbdf9bfa57310360b9c0b08ec4b140fbb8e053ad3c5fdf3fd1b6eed91a6eb66" Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.469217 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-9rcwj" Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.519623 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98jh7\" (UniqueName: \"kubernetes.io/projected/c5d11b84-2bb6-432f-9a25-04877030be31-kube-api-access-98jh7\") pod \"c5d11b84-2bb6-432f-9a25-04877030be31\" (UID: \"c5d11b84-2bb6-432f-9a25-04877030be31\") " Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.519803 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5d11b84-2bb6-432f-9a25-04877030be31-operator-scripts\") pod \"c5d11b84-2bb6-432f-9a25-04877030be31\" (UID: \"c5d11b84-2bb6-432f-9a25-04877030be31\") " Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.521129 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5d11b84-2bb6-432f-9a25-04877030be31-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c5d11b84-2bb6-432f-9a25-04877030be31" (UID: "c5d11b84-2bb6-432f-9a25-04877030be31"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.529855 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5d11b84-2bb6-432f-9a25-04877030be31-kube-api-access-98jh7" (OuterVolumeSpecName: "kube-api-access-98jh7") pod "c5d11b84-2bb6-432f-9a25-04877030be31" (UID: "c5d11b84-2bb6-432f-9a25-04877030be31"). InnerVolumeSpecName "kube-api-access-98jh7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.622465 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5d11b84-2bb6-432f-9a25-04877030be31-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.622493 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98jh7\" (UniqueName: \"kubernetes.io/projected/c5d11b84-2bb6-432f-9a25-04877030be31-kube-api-access-98jh7\") on node \"crc\" DevicePath \"\"" Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.826690 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-5743-account-create-update-v2jqx" Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.834967 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-b701-account-create-update-vpmq8" Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.845462 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5399-account-create-update-rlcdb" Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.852237 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-m8g58" Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.927340 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4848315d-b355-43b8-961b-440dd5a94e2b-operator-scripts\") pod \"4848315d-b355-43b8-961b-440dd5a94e2b\" (UID: \"4848315d-b355-43b8-961b-440dd5a94e2b\") " Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.927388 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96q5s\" (UniqueName: \"kubernetes.io/projected/4848315d-b355-43b8-961b-440dd5a94e2b-kube-api-access-96q5s\") pod \"4848315d-b355-43b8-961b-440dd5a94e2b\" (UID: \"4848315d-b355-43b8-961b-440dd5a94e2b\") " Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.927411 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2986b45-6e46-4553-999c-6a89b4565b88-operator-scripts\") pod \"d2986b45-6e46-4553-999c-6a89b4565b88\" (UID: \"d2986b45-6e46-4553-999c-6a89b4565b88\") " Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.927450 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-987l8\" (UniqueName: \"kubernetes.io/projected/8714421b-a562-42dc-8b61-262ddf02239f-kube-api-access-987l8\") pod \"8714421b-a562-42dc-8b61-262ddf02239f\" (UID: \"8714421b-a562-42dc-8b61-262ddf02239f\") " Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.927509 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42e427d5-c936-4052-958c-18d89b07527c-operator-scripts\") pod \"42e427d5-c936-4052-958c-18d89b07527c\" (UID: \"42e427d5-c936-4052-958c-18d89b07527c\") " Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.927581 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-cqkk2\" (UniqueName: \"kubernetes.io/projected/d2986b45-6e46-4553-999c-6a89b4565b88-kube-api-access-cqkk2\") pod \"d2986b45-6e46-4553-999c-6a89b4565b88\" (UID: \"d2986b45-6e46-4553-999c-6a89b4565b88\") " Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.927622 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c72wh\" (UniqueName: \"kubernetes.io/projected/42e427d5-c936-4052-958c-18d89b07527c-kube-api-access-c72wh\") pod \"42e427d5-c936-4052-958c-18d89b07527c\" (UID: \"42e427d5-c936-4052-958c-18d89b07527c\") " Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.927649 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8714421b-a562-42dc-8b61-262ddf02239f-operator-scripts\") pod \"8714421b-a562-42dc-8b61-262ddf02239f\" (UID: \"8714421b-a562-42dc-8b61-262ddf02239f\") " Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.928119 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8714421b-a562-42dc-8b61-262ddf02239f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8714421b-a562-42dc-8b61-262ddf02239f" (UID: "8714421b-a562-42dc-8b61-262ddf02239f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.928176 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42e427d5-c936-4052-958c-18d89b07527c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "42e427d5-c936-4052-958c-18d89b07527c" (UID: "42e427d5-c936-4052-958c-18d89b07527c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.928437 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4848315d-b355-43b8-961b-440dd5a94e2b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4848315d-b355-43b8-961b-440dd5a94e2b" (UID: "4848315d-b355-43b8-961b-440dd5a94e2b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.929185 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2986b45-6e46-4553-999c-6a89b4565b88-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d2986b45-6e46-4553-999c-6a89b4565b88" (UID: "d2986b45-6e46-4553-999c-6a89b4565b88"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.931184 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2986b45-6e46-4553-999c-6a89b4565b88-kube-api-access-cqkk2" (OuterVolumeSpecName: "kube-api-access-cqkk2") pod "d2986b45-6e46-4553-999c-6a89b4565b88" (UID: "d2986b45-6e46-4553-999c-6a89b4565b88"). InnerVolumeSpecName "kube-api-access-cqkk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.931217 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4848315d-b355-43b8-961b-440dd5a94e2b-kube-api-access-96q5s" (OuterVolumeSpecName: "kube-api-access-96q5s") pod "4848315d-b355-43b8-961b-440dd5a94e2b" (UID: "4848315d-b355-43b8-961b-440dd5a94e2b"). InnerVolumeSpecName "kube-api-access-96q5s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.931239 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42e427d5-c936-4052-958c-18d89b07527c-kube-api-access-c72wh" (OuterVolumeSpecName: "kube-api-access-c72wh") pod "42e427d5-c936-4052-958c-18d89b07527c" (UID: "42e427d5-c936-4052-958c-18d89b07527c"). InnerVolumeSpecName "kube-api-access-c72wh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:06:58 crc kubenswrapper[4909]: I0202 12:06:58.936163 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8714421b-a562-42dc-8b61-262ddf02239f-kube-api-access-987l8" (OuterVolumeSpecName: "kube-api-access-987l8") pod "8714421b-a562-42dc-8b61-262ddf02239f" (UID: "8714421b-a562-42dc-8b61-262ddf02239f"). InnerVolumeSpecName "kube-api-access-987l8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:06:59 crc kubenswrapper[4909]: I0202 12:06:59.029468 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4848315d-b355-43b8-961b-440dd5a94e2b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:06:59 crc kubenswrapper[4909]: I0202 12:06:59.029710 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96q5s\" (UniqueName: \"kubernetes.io/projected/4848315d-b355-43b8-961b-440dd5a94e2b-kube-api-access-96q5s\") on node \"crc\" DevicePath \"\"" Feb 02 12:06:59 crc kubenswrapper[4909]: I0202 12:06:59.029746 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2986b45-6e46-4553-999c-6a89b4565b88-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:06:59 crc kubenswrapper[4909]: I0202 12:06:59.029756 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-987l8\" (UniqueName: 
\"kubernetes.io/projected/8714421b-a562-42dc-8b61-262ddf02239f-kube-api-access-987l8\") on node \"crc\" DevicePath \"\"" Feb 02 12:06:59 crc kubenswrapper[4909]: I0202 12:06:59.029765 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/42e427d5-c936-4052-958c-18d89b07527c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:06:59 crc kubenswrapper[4909]: I0202 12:06:59.029774 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqkk2\" (UniqueName: \"kubernetes.io/projected/d2986b45-6e46-4553-999c-6a89b4565b88-kube-api-access-cqkk2\") on node \"crc\" DevicePath \"\"" Feb 02 12:06:59 crc kubenswrapper[4909]: I0202 12:06:59.029784 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c72wh\" (UniqueName: \"kubernetes.io/projected/42e427d5-c936-4052-958c-18d89b07527c-kube-api-access-c72wh\") on node \"crc\" DevicePath \"\"" Feb 02 12:06:59 crc kubenswrapper[4909]: I0202 12:06:59.029793 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8714421b-a562-42dc-8b61-262ddf02239f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:06:59 crc kubenswrapper[4909]: I0202 12:06:59.140391 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9rcwj" event={"ID":"c5d11b84-2bb6-432f-9a25-04877030be31","Type":"ContainerDied","Data":"0c488002e98b2833998eec3e8ad8673e332bdd6014b57898d388d8dc0e13c3f6"} Feb 02 12:06:59 crc kubenswrapper[4909]: I0202 12:06:59.140431 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c488002e98b2833998eec3e8ad8673e332bdd6014b57898d388d8dc0e13c3f6" Feb 02 12:06:59 crc kubenswrapper[4909]: I0202 12:06:59.140488 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-9rcwj" Feb 02 12:06:59 crc kubenswrapper[4909]: I0202 12:06:59.142728 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-m8g58" event={"ID":"4848315d-b355-43b8-961b-440dd5a94e2b","Type":"ContainerDied","Data":"55bc5835572a67748a015e0bba83fcefd80ead3fb7c6bb8d2ffc709cbc496253"} Feb 02 12:06:59 crc kubenswrapper[4909]: I0202 12:06:59.142773 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55bc5835572a67748a015e0bba83fcefd80ead3fb7c6bb8d2ffc709cbc496253" Feb 02 12:06:59 crc kubenswrapper[4909]: I0202 12:06:59.142744 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-m8g58" Feb 02 12:06:59 crc kubenswrapper[4909]: I0202 12:06:59.147575 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5399-account-create-update-rlcdb" Feb 02 12:06:59 crc kubenswrapper[4909]: I0202 12:06:59.147580 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5399-account-create-update-rlcdb" event={"ID":"42e427d5-c936-4052-958c-18d89b07527c","Type":"ContainerDied","Data":"89ec57e63eae935470570bc988b2a59e010e41450fe3f5a8e75fed3bf4dc9d26"} Feb 02 12:06:59 crc kubenswrapper[4909]: I0202 12:06:59.147832 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89ec57e63eae935470570bc988b2a59e010e41450fe3f5a8e75fed3bf4dc9d26" Feb 02 12:06:59 crc kubenswrapper[4909]: I0202 12:06:59.155880 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-b701-account-create-update-vpmq8" Feb 02 12:06:59 crc kubenswrapper[4909]: I0202 12:06:59.156908 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-b701-account-create-update-vpmq8" event={"ID":"d2986b45-6e46-4553-999c-6a89b4565b88","Type":"ContainerDied","Data":"f6567cbaab231f0a81691301c487be313868dce316065b55d4e08ed054944d59"} Feb 02 12:06:59 crc kubenswrapper[4909]: I0202 12:06:59.156969 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6567cbaab231f0a81691301c487be313868dce316065b55d4e08ed054944d59" Feb 02 12:06:59 crc kubenswrapper[4909]: I0202 12:06:59.159254 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-5743-account-create-update-v2jqx" event={"ID":"8714421b-a562-42dc-8b61-262ddf02239f","Type":"ContainerDied","Data":"27c9b56dd3e2db8067c81763af57874e37ec1818ad83e83e3cf7f54bb05b007b"} Feb 02 12:06:59 crc kubenswrapper[4909]: I0202 12:06:59.159284 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27c9b56dd3e2db8067c81763af57874e37ec1818ad83e83e3cf7f54bb05b007b" Feb 02 12:06:59 crc kubenswrapper[4909]: I0202 12:06:59.159344 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-5743-account-create-update-v2jqx" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.558515 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lljl6"] Feb 02 12:07:04 crc kubenswrapper[4909]: E0202 12:07:04.559678 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4848315d-b355-43b8-961b-440dd5a94e2b" containerName="mariadb-database-create" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.559696 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4848315d-b355-43b8-961b-440dd5a94e2b" containerName="mariadb-database-create" Feb 02 12:07:04 crc kubenswrapper[4909]: E0202 12:07:04.559713 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8714421b-a562-42dc-8b61-262ddf02239f" containerName="mariadb-account-create-update" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.559720 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8714421b-a562-42dc-8b61-262ddf02239f" containerName="mariadb-account-create-update" Feb 02 12:07:04 crc kubenswrapper[4909]: E0202 12:07:04.559750 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5d11b84-2bb6-432f-9a25-04877030be31" containerName="mariadb-database-create" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.559760 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5d11b84-2bb6-432f-9a25-04877030be31" containerName="mariadb-database-create" Feb 02 12:07:04 crc kubenswrapper[4909]: E0202 12:07:04.559793 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44f1c177-7f74-49e3-a737-1cf825d08c5d" containerName="mariadb-database-create" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.559800 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="44f1c177-7f74-49e3-a737-1cf825d08c5d" containerName="mariadb-database-create" Feb 02 12:07:04 crc kubenswrapper[4909]: E0202 12:07:04.559829 4909 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2986b45-6e46-4553-999c-6a89b4565b88" containerName="mariadb-account-create-update" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.559837 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2986b45-6e46-4553-999c-6a89b4565b88" containerName="mariadb-account-create-update" Feb 02 12:07:04 crc kubenswrapper[4909]: E0202 12:07:04.559847 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42e427d5-c936-4052-958c-18d89b07527c" containerName="mariadb-account-create-update" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.559854 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e427d5-c936-4052-958c-18d89b07527c" containerName="mariadb-account-create-update" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.560093 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2986b45-6e46-4553-999c-6a89b4565b88" containerName="mariadb-account-create-update" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.560114 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5d11b84-2bb6-432f-9a25-04877030be31" containerName="mariadb-database-create" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.560127 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4848315d-b355-43b8-961b-440dd5a94e2b" containerName="mariadb-database-create" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.560140 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="42e427d5-c936-4052-958c-18d89b07527c" containerName="mariadb-account-create-update" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.560154 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="44f1c177-7f74-49e3-a737-1cf825d08c5d" containerName="mariadb-database-create" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.560165 4909 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8714421b-a562-42dc-8b61-262ddf02239f" containerName="mariadb-account-create-update" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.561004 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lljl6" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.565319 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.565794 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.565848 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-2rr8j" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.570688 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lljl6"] Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.730584 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c8ec03f-2941-41a2-b43f-10c7041993d0-config-data\") pod \"nova-cell0-conductor-db-sync-lljl6\" (UID: \"5c8ec03f-2941-41a2-b43f-10c7041993d0\") " pod="openstack/nova-cell0-conductor-db-sync-lljl6" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.730696 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2qj6\" (UniqueName: \"kubernetes.io/projected/5c8ec03f-2941-41a2-b43f-10c7041993d0-kube-api-access-k2qj6\") pod \"nova-cell0-conductor-db-sync-lljl6\" (UID: \"5c8ec03f-2941-41a2-b43f-10c7041993d0\") " pod="openstack/nova-cell0-conductor-db-sync-lljl6" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.730760 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c8ec03f-2941-41a2-b43f-10c7041993d0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lljl6\" (UID: \"5c8ec03f-2941-41a2-b43f-10c7041993d0\") " pod="openstack/nova-cell0-conductor-db-sync-lljl6" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.730919 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c8ec03f-2941-41a2-b43f-10c7041993d0-scripts\") pod \"nova-cell0-conductor-db-sync-lljl6\" (UID: \"5c8ec03f-2941-41a2-b43f-10c7041993d0\") " pod="openstack/nova-cell0-conductor-db-sync-lljl6" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.833183 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c8ec03f-2941-41a2-b43f-10c7041993d0-config-data\") pod \"nova-cell0-conductor-db-sync-lljl6\" (UID: \"5c8ec03f-2941-41a2-b43f-10c7041993d0\") " pod="openstack/nova-cell0-conductor-db-sync-lljl6" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.833235 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2qj6\" (UniqueName: \"kubernetes.io/projected/5c8ec03f-2941-41a2-b43f-10c7041993d0-kube-api-access-k2qj6\") pod \"nova-cell0-conductor-db-sync-lljl6\" (UID: \"5c8ec03f-2941-41a2-b43f-10c7041993d0\") " pod="openstack/nova-cell0-conductor-db-sync-lljl6" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.833262 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c8ec03f-2941-41a2-b43f-10c7041993d0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lljl6\" (UID: \"5c8ec03f-2941-41a2-b43f-10c7041993d0\") " pod="openstack/nova-cell0-conductor-db-sync-lljl6" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.833295 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c8ec03f-2941-41a2-b43f-10c7041993d0-scripts\") pod \"nova-cell0-conductor-db-sync-lljl6\" (UID: \"5c8ec03f-2941-41a2-b43f-10c7041993d0\") " pod="openstack/nova-cell0-conductor-db-sync-lljl6" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.839214 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c8ec03f-2941-41a2-b43f-10c7041993d0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-lljl6\" (UID: \"5c8ec03f-2941-41a2-b43f-10c7041993d0\") " pod="openstack/nova-cell0-conductor-db-sync-lljl6" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.839267 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c8ec03f-2941-41a2-b43f-10c7041993d0-scripts\") pod \"nova-cell0-conductor-db-sync-lljl6\" (UID: \"5c8ec03f-2941-41a2-b43f-10c7041993d0\") " pod="openstack/nova-cell0-conductor-db-sync-lljl6" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.839678 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c8ec03f-2941-41a2-b43f-10c7041993d0-config-data\") pod \"nova-cell0-conductor-db-sync-lljl6\" (UID: \"5c8ec03f-2941-41a2-b43f-10c7041993d0\") " pod="openstack/nova-cell0-conductor-db-sync-lljl6" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.855116 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2qj6\" (UniqueName: \"kubernetes.io/projected/5c8ec03f-2941-41a2-b43f-10c7041993d0-kube-api-access-k2qj6\") pod \"nova-cell0-conductor-db-sync-lljl6\" (UID: \"5c8ec03f-2941-41a2-b43f-10c7041993d0\") " pod="openstack/nova-cell0-conductor-db-sync-lljl6" Feb 02 12:07:04 crc kubenswrapper[4909]: I0202 12:07:04.885120 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lljl6" Feb 02 12:07:05 crc kubenswrapper[4909]: I0202 12:07:05.370416 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lljl6"] Feb 02 12:07:06 crc kubenswrapper[4909]: I0202 12:07:06.211915 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lljl6" event={"ID":"5c8ec03f-2941-41a2-b43f-10c7041993d0","Type":"ContainerStarted","Data":"f509d4206fb48841bfc489248aad52ac9240136f0d669359c7867bdb92d643bc"} Feb 02 12:07:06 crc kubenswrapper[4909]: I0202 12:07:06.213177 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lljl6" event={"ID":"5c8ec03f-2941-41a2-b43f-10c7041993d0","Type":"ContainerStarted","Data":"d3b1ccf2b17cc063713a1de087db5fbee0393e60443fc7ce5789b6c58cd15688"} Feb 02 12:07:06 crc kubenswrapper[4909]: I0202 12:07:06.231125 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-lljl6" podStartSLOduration=2.231105183 podStartE2EDuration="2.231105183s" podCreationTimestamp="2026-02-02 12:07:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:07:06.226737229 +0000 UTC m=+5751.972837964" watchObservedRunningTime="2026-02-02 12:07:06.231105183 +0000 UTC m=+5751.977205918" Feb 02 12:07:07 crc kubenswrapper[4909]: I0202 12:07:07.016549 4909 scope.go:117] "RemoveContainer" containerID="2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783" Feb 02 12:07:07 crc kubenswrapper[4909]: E0202 12:07:07.016957 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:07:11 crc kubenswrapper[4909]: I0202 12:07:11.250786 4909 generic.go:334] "Generic (PLEG): container finished" podID="5c8ec03f-2941-41a2-b43f-10c7041993d0" containerID="f509d4206fb48841bfc489248aad52ac9240136f0d669359c7867bdb92d643bc" exitCode=0 Feb 02 12:07:11 crc kubenswrapper[4909]: I0202 12:07:11.250879 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lljl6" event={"ID":"5c8ec03f-2941-41a2-b43f-10c7041993d0","Type":"ContainerDied","Data":"f509d4206fb48841bfc489248aad52ac9240136f0d669359c7867bdb92d643bc"} Feb 02 12:07:12 crc kubenswrapper[4909]: I0202 12:07:12.526216 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lljl6" Feb 02 12:07:12 crc kubenswrapper[4909]: I0202 12:07:12.682474 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c8ec03f-2941-41a2-b43f-10c7041993d0-combined-ca-bundle\") pod \"5c8ec03f-2941-41a2-b43f-10c7041993d0\" (UID: \"5c8ec03f-2941-41a2-b43f-10c7041993d0\") " Feb 02 12:07:12 crc kubenswrapper[4909]: I0202 12:07:12.682782 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2qj6\" (UniqueName: \"kubernetes.io/projected/5c8ec03f-2941-41a2-b43f-10c7041993d0-kube-api-access-k2qj6\") pod \"5c8ec03f-2941-41a2-b43f-10c7041993d0\" (UID: \"5c8ec03f-2941-41a2-b43f-10c7041993d0\") " Feb 02 12:07:12 crc kubenswrapper[4909]: I0202 12:07:12.682941 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c8ec03f-2941-41a2-b43f-10c7041993d0-scripts\") pod 
\"5c8ec03f-2941-41a2-b43f-10c7041993d0\" (UID: \"5c8ec03f-2941-41a2-b43f-10c7041993d0\") " Feb 02 12:07:12 crc kubenswrapper[4909]: I0202 12:07:12.683767 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c8ec03f-2941-41a2-b43f-10c7041993d0-config-data\") pod \"5c8ec03f-2941-41a2-b43f-10c7041993d0\" (UID: \"5c8ec03f-2941-41a2-b43f-10c7041993d0\") " Feb 02 12:07:12 crc kubenswrapper[4909]: I0202 12:07:12.687853 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c8ec03f-2941-41a2-b43f-10c7041993d0-scripts" (OuterVolumeSpecName: "scripts") pod "5c8ec03f-2941-41a2-b43f-10c7041993d0" (UID: "5c8ec03f-2941-41a2-b43f-10c7041993d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:07:12 crc kubenswrapper[4909]: I0202 12:07:12.688502 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c8ec03f-2941-41a2-b43f-10c7041993d0-kube-api-access-k2qj6" (OuterVolumeSpecName: "kube-api-access-k2qj6") pod "5c8ec03f-2941-41a2-b43f-10c7041993d0" (UID: "5c8ec03f-2941-41a2-b43f-10c7041993d0"). InnerVolumeSpecName "kube-api-access-k2qj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:07:12 crc kubenswrapper[4909]: I0202 12:07:12.707046 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c8ec03f-2941-41a2-b43f-10c7041993d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c8ec03f-2941-41a2-b43f-10c7041993d0" (UID: "5c8ec03f-2941-41a2-b43f-10c7041993d0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:07:12 crc kubenswrapper[4909]: I0202 12:07:12.708185 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c8ec03f-2941-41a2-b43f-10c7041993d0-config-data" (OuterVolumeSpecName: "config-data") pod "5c8ec03f-2941-41a2-b43f-10c7041993d0" (UID: "5c8ec03f-2941-41a2-b43f-10c7041993d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:07:12 crc kubenswrapper[4909]: I0202 12:07:12.788132 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2qj6\" (UniqueName: \"kubernetes.io/projected/5c8ec03f-2941-41a2-b43f-10c7041993d0-kube-api-access-k2qj6\") on node \"crc\" DevicePath \"\"" Feb 02 12:07:12 crc kubenswrapper[4909]: I0202 12:07:12.788172 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c8ec03f-2941-41a2-b43f-10c7041993d0-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:07:12 crc kubenswrapper[4909]: I0202 12:07:12.788189 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c8ec03f-2941-41a2-b43f-10c7041993d0-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:07:12 crc kubenswrapper[4909]: I0202 12:07:12.788201 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c8ec03f-2941-41a2-b43f-10c7041993d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:07:13 crc kubenswrapper[4909]: I0202 12:07:13.271149 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-lljl6" event={"ID":"5c8ec03f-2941-41a2-b43f-10c7041993d0","Type":"ContainerDied","Data":"d3b1ccf2b17cc063713a1de087db5fbee0393e60443fc7ce5789b6c58cd15688"} Feb 02 12:07:13 crc kubenswrapper[4909]: I0202 12:07:13.271392 4909 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="d3b1ccf2b17cc063713a1de087db5fbee0393e60443fc7ce5789b6c58cd15688" Feb 02 12:07:13 crc kubenswrapper[4909]: I0202 12:07:13.271221 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-lljl6" Feb 02 12:07:13 crc kubenswrapper[4909]: I0202 12:07:13.331026 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 02 12:07:13 crc kubenswrapper[4909]: E0202 12:07:13.331423 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c8ec03f-2941-41a2-b43f-10c7041993d0" containerName="nova-cell0-conductor-db-sync" Feb 02 12:07:13 crc kubenswrapper[4909]: I0202 12:07:13.331436 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c8ec03f-2941-41a2-b43f-10c7041993d0" containerName="nova-cell0-conductor-db-sync" Feb 02 12:07:13 crc kubenswrapper[4909]: I0202 12:07:13.331622 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c8ec03f-2941-41a2-b43f-10c7041993d0" containerName="nova-cell0-conductor-db-sync" Feb 02 12:07:13 crc kubenswrapper[4909]: I0202 12:07:13.332276 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 02 12:07:13 crc kubenswrapper[4909]: I0202 12:07:13.334165 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 02 12:07:13 crc kubenswrapper[4909]: I0202 12:07:13.334551 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-2rr8j" Feb 02 12:07:13 crc kubenswrapper[4909]: I0202 12:07:13.355022 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 02 12:07:13 crc kubenswrapper[4909]: I0202 12:07:13.499960 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d686e2d0-26e4-43fc-9e98-36226276b450-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d686e2d0-26e4-43fc-9e98-36226276b450\") " pod="openstack/nova-cell0-conductor-0" Feb 02 12:07:13 crc kubenswrapper[4909]: I0202 12:07:13.500098 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dljv9\" (UniqueName: \"kubernetes.io/projected/d686e2d0-26e4-43fc-9e98-36226276b450-kube-api-access-dljv9\") pod \"nova-cell0-conductor-0\" (UID: \"d686e2d0-26e4-43fc-9e98-36226276b450\") " pod="openstack/nova-cell0-conductor-0" Feb 02 12:07:13 crc kubenswrapper[4909]: I0202 12:07:13.500149 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d686e2d0-26e4-43fc-9e98-36226276b450-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d686e2d0-26e4-43fc-9e98-36226276b450\") " pod="openstack/nova-cell0-conductor-0" Feb 02 12:07:13 crc kubenswrapper[4909]: I0202 12:07:13.601705 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d686e2d0-26e4-43fc-9e98-36226276b450-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d686e2d0-26e4-43fc-9e98-36226276b450\") " pod="openstack/nova-cell0-conductor-0" Feb 02 12:07:13 crc kubenswrapper[4909]: I0202 12:07:13.601886 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d686e2d0-26e4-43fc-9e98-36226276b450-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d686e2d0-26e4-43fc-9e98-36226276b450\") " pod="openstack/nova-cell0-conductor-0" Feb 02 12:07:13 crc kubenswrapper[4909]: I0202 12:07:13.601977 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dljv9\" (UniqueName: \"kubernetes.io/projected/d686e2d0-26e4-43fc-9e98-36226276b450-kube-api-access-dljv9\") pod \"nova-cell0-conductor-0\" (UID: \"d686e2d0-26e4-43fc-9e98-36226276b450\") " pod="openstack/nova-cell0-conductor-0" Feb 02 12:07:13 crc kubenswrapper[4909]: I0202 12:07:13.606719 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d686e2d0-26e4-43fc-9e98-36226276b450-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d686e2d0-26e4-43fc-9e98-36226276b450\") " pod="openstack/nova-cell0-conductor-0" Feb 02 12:07:13 crc kubenswrapper[4909]: I0202 12:07:13.607475 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d686e2d0-26e4-43fc-9e98-36226276b450-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d686e2d0-26e4-43fc-9e98-36226276b450\") " pod="openstack/nova-cell0-conductor-0" Feb 02 12:07:13 crc kubenswrapper[4909]: I0202 12:07:13.618418 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dljv9\" (UniqueName: \"kubernetes.io/projected/d686e2d0-26e4-43fc-9e98-36226276b450-kube-api-access-dljv9\") pod \"nova-cell0-conductor-0\" 
(UID: \"d686e2d0-26e4-43fc-9e98-36226276b450\") " pod="openstack/nova-cell0-conductor-0" Feb 02 12:07:13 crc kubenswrapper[4909]: I0202 12:07:13.648207 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 02 12:07:14 crc kubenswrapper[4909]: I0202 12:07:14.125532 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 02 12:07:14 crc kubenswrapper[4909]: I0202 12:07:14.282472 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d686e2d0-26e4-43fc-9e98-36226276b450","Type":"ContainerStarted","Data":"1f0b7de235ecef1b5a04ec9d79b3e40d7da93417b36fff7fd0b7c9fde3c57bf7"} Feb 02 12:07:15 crc kubenswrapper[4909]: I0202 12:07:15.292467 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d686e2d0-26e4-43fc-9e98-36226276b450","Type":"ContainerStarted","Data":"5d6a983ac213af33ce5f76d9939df082c1493a1a5e3016bfcd43a001501e8bff"} Feb 02 12:07:15 crc kubenswrapper[4909]: I0202 12:07:15.293823 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 02 12:07:15 crc kubenswrapper[4909]: I0202 12:07:15.319608 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.319589926 podStartE2EDuration="2.319589926s" podCreationTimestamp="2026-02-02 12:07:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:07:15.314664636 +0000 UTC m=+5761.060765371" watchObservedRunningTime="2026-02-02 12:07:15.319589926 +0000 UTC m=+5761.065690661" Feb 02 12:07:21 crc kubenswrapper[4909]: I0202 12:07:21.016573 4909 scope.go:117] "RemoveContainer" containerID="2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783" Feb 02 12:07:21 crc kubenswrapper[4909]: 
E0202 12:07:21.017360 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:07:23 crc kubenswrapper[4909]: I0202 12:07:23.682942 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.136310 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-gwh2v"] Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.137642 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gwh2v" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.139912 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.140022 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.148006 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gwh2v"] Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.279426 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.280520 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.289286 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.297286 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.298255 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv458\" (UniqueName: \"kubernetes.io/projected/2bf1731a-52df-489c-a7f3-8f07774153a1-kube-api-access-wv458\") pod \"nova-cell0-cell-mapping-gwh2v\" (UID: \"2bf1731a-52df-489c-a7f3-8f07774153a1\") " pod="openstack/nova-cell0-cell-mapping-gwh2v" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.298405 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf1731a-52df-489c-a7f3-8f07774153a1-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gwh2v\" (UID: \"2bf1731a-52df-489c-a7f3-8f07774153a1\") " pod="openstack/nova-cell0-cell-mapping-gwh2v" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.298430 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bf1731a-52df-489c-a7f3-8f07774153a1-config-data\") pod \"nova-cell0-cell-mapping-gwh2v\" (UID: \"2bf1731a-52df-489c-a7f3-8f07774153a1\") " pod="openstack/nova-cell0-cell-mapping-gwh2v" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.298481 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bf1731a-52df-489c-a7f3-8f07774153a1-scripts\") pod \"nova-cell0-cell-mapping-gwh2v\" (UID: \"2bf1731a-52df-489c-a7f3-8f07774153a1\") " 
pod="openstack/nova-cell0-cell-mapping-gwh2v" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.350803 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.352744 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.358674 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.369887 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.402936 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dbfc015-241c-45df-baf1-789bf92b3642-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1dbfc015-241c-45df-baf1-789bf92b3642\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.403089 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bf1731a-52df-489c-a7f3-8f07774153a1-config-data\") pod \"nova-cell0-cell-mapping-gwh2v\" (UID: \"2bf1731a-52df-489c-a7f3-8f07774153a1\") " pod="openstack/nova-cell0-cell-mapping-gwh2v" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.403116 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf1731a-52df-489c-a7f3-8f07774153a1-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gwh2v\" (UID: \"2bf1731a-52df-489c-a7f3-8f07774153a1\") " pod="openstack/nova-cell0-cell-mapping-gwh2v" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.403145 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dbfc015-241c-45df-baf1-789bf92b3642-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1dbfc015-241c-45df-baf1-789bf92b3642\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.403169 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l85tt\" (UniqueName: \"kubernetes.io/projected/1dbfc015-241c-45df-baf1-789bf92b3642-kube-api-access-l85tt\") pod \"nova-cell1-novncproxy-0\" (UID: \"1dbfc015-241c-45df-baf1-789bf92b3642\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.403209 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bf1731a-52df-489c-a7f3-8f07774153a1-scripts\") pod \"nova-cell0-cell-mapping-gwh2v\" (UID: \"2bf1731a-52df-489c-a7f3-8f07774153a1\") " pod="openstack/nova-cell0-cell-mapping-gwh2v" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.403249 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv458\" (UniqueName: \"kubernetes.io/projected/2bf1731a-52df-489c-a7f3-8f07774153a1-kube-api-access-wv458\") pod \"nova-cell0-cell-mapping-gwh2v\" (UID: \"2bf1731a-52df-489c-a7f3-8f07774153a1\") " pod="openstack/nova-cell0-cell-mapping-gwh2v" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.411526 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bf1731a-52df-489c-a7f3-8f07774153a1-config-data\") pod \"nova-cell0-cell-mapping-gwh2v\" (UID: \"2bf1731a-52df-489c-a7f3-8f07774153a1\") " pod="openstack/nova-cell0-cell-mapping-gwh2v" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.411846 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf1731a-52df-489c-a7f3-8f07774153a1-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gwh2v\" (UID: \"2bf1731a-52df-489c-a7f3-8f07774153a1\") " pod="openstack/nova-cell0-cell-mapping-gwh2v" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.412160 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bf1731a-52df-489c-a7f3-8f07774153a1-scripts\") pod \"nova-cell0-cell-mapping-gwh2v\" (UID: \"2bf1731a-52df-489c-a7f3-8f07774153a1\") " pod="openstack/nova-cell0-cell-mapping-gwh2v" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.436974 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv458\" (UniqueName: \"kubernetes.io/projected/2bf1731a-52df-489c-a7f3-8f07774153a1-kube-api-access-wv458\") pod \"nova-cell0-cell-mapping-gwh2v\" (UID: \"2bf1731a-52df-489c-a7f3-8f07774153a1\") " pod="openstack/nova-cell0-cell-mapping-gwh2v" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.466971 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gwh2v" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.491412 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.493300 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.505944 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77pkp\" (UniqueName: \"kubernetes.io/projected/c7e1382a-3ffc-4779-a8ea-03261453225a-kube-api-access-77pkp\") pod \"nova-metadata-0\" (UID: \"c7e1382a-3ffc-4779-a8ea-03261453225a\") " pod="openstack/nova-metadata-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.505995 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dbfc015-241c-45df-baf1-789bf92b3642-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1dbfc015-241c-45df-baf1-789bf92b3642\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.506015 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l85tt\" (UniqueName: \"kubernetes.io/projected/1dbfc015-241c-45df-baf1-789bf92b3642-kube-api-access-l85tt\") pod \"nova-cell1-novncproxy-0\" (UID: \"1dbfc015-241c-45df-baf1-789bf92b3642\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.506101 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1382a-3ffc-4779-a8ea-03261453225a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c7e1382a-3ffc-4779-a8ea-03261453225a\") " pod="openstack/nova-metadata-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.506126 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e1382a-3ffc-4779-a8ea-03261453225a-logs\") pod \"nova-metadata-0\" (UID: \"c7e1382a-3ffc-4779-a8ea-03261453225a\") " pod="openstack/nova-metadata-0" Feb 02 12:07:24 crc 
kubenswrapper[4909]: I0202 12:07:24.506156 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dbfc015-241c-45df-baf1-789bf92b3642-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1dbfc015-241c-45df-baf1-789bf92b3642\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.506195 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e1382a-3ffc-4779-a8ea-03261453225a-config-data\") pod \"nova-metadata-0\" (UID: \"c7e1382a-3ffc-4779-a8ea-03261453225a\") " pod="openstack/nova-metadata-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.513661 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.517586 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dbfc015-241c-45df-baf1-789bf92b3642-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1dbfc015-241c-45df-baf1-789bf92b3642\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.532834 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dbfc015-241c-45df-baf1-789bf92b3642-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1dbfc015-241c-45df-baf1-789bf92b3642\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.556909 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b98ff868f-ppx5k"] Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.558734 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b98ff868f-ppx5k" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.586010 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.588059 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.588442 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l85tt\" (UniqueName: \"kubernetes.io/projected/1dbfc015-241c-45df-baf1-789bf92b3642-kube-api-access-l85tt\") pod \"nova-cell1-novncproxy-0\" (UID: \"1dbfc015-241c-45df-baf1-789bf92b3642\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.596151 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.608003 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccf67e66-a29f-46d3-8f82-1b1b21211189-logs\") pod \"nova-api-0\" (UID: \"ccf67e66-a29f-46d3-8f82-1b1b21211189\") " pod="openstack/nova-api-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.608059 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77pkp\" (UniqueName: \"kubernetes.io/projected/c7e1382a-3ffc-4779-a8ea-03261453225a-kube-api-access-77pkp\") pod \"nova-metadata-0\" (UID: \"c7e1382a-3ffc-4779-a8ea-03261453225a\") " pod="openstack/nova-metadata-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.608125 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccf67e66-a29f-46d3-8f82-1b1b21211189-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"ccf67e66-a29f-46d3-8f82-1b1b21211189\") " pod="openstack/nova-api-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.608143 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nwd9\" (UniqueName: \"kubernetes.io/projected/ccf67e66-a29f-46d3-8f82-1b1b21211189-kube-api-access-6nwd9\") pod \"nova-api-0\" (UID: \"ccf67e66-a29f-46d3-8f82-1b1b21211189\") " pod="openstack/nova-api-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.608180 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1382a-3ffc-4779-a8ea-03261453225a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c7e1382a-3ffc-4779-a8ea-03261453225a\") " pod="openstack/nova-metadata-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.608206 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e1382a-3ffc-4779-a8ea-03261453225a-logs\") pod \"nova-metadata-0\" (UID: \"c7e1382a-3ffc-4779-a8ea-03261453225a\") " pod="openstack/nova-metadata-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.608244 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccf67e66-a29f-46d3-8f82-1b1b21211189-config-data\") pod \"nova-api-0\" (UID: \"ccf67e66-a29f-46d3-8f82-1b1b21211189\") " pod="openstack/nova-api-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.608300 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e1382a-3ffc-4779-a8ea-03261453225a-config-data\") pod \"nova-metadata-0\" (UID: \"c7e1382a-3ffc-4779-a8ea-03261453225a\") " pod="openstack/nova-metadata-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.610267 4909 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e1382a-3ffc-4779-a8ea-03261453225a-logs\") pod \"nova-metadata-0\" (UID: \"c7e1382a-3ffc-4779-a8ea-03261453225a\") " pod="openstack/nova-metadata-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.612921 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.613724 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1382a-3ffc-4779-a8ea-03261453225a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c7e1382a-3ffc-4779-a8ea-03261453225a\") " pod="openstack/nova-metadata-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.619157 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e1382a-3ffc-4779-a8ea-03261453225a-config-data\") pod \"nova-metadata-0\" (UID: \"c7e1382a-3ffc-4779-a8ea-03261453225a\") " pod="openstack/nova-metadata-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.625935 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.635875 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77pkp\" (UniqueName: \"kubernetes.io/projected/c7e1382a-3ffc-4779-a8ea-03261453225a-kube-api-access-77pkp\") pod \"nova-metadata-0\" (UID: \"c7e1382a-3ffc-4779-a8ea-03261453225a\") " pod="openstack/nova-metadata-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.658886 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.674439 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b98ff868f-ppx5k"] Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.681360 
4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.716018 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr6tf\" (UniqueName: \"kubernetes.io/projected/45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7-kube-api-access-mr6tf\") pod \"nova-scheduler-0\" (UID: \"45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7\") " pod="openstack/nova-scheduler-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.716479 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s829q\" (UniqueName: \"kubernetes.io/projected/846cb0f7-da5f-4088-bcca-05c67d940e83-kube-api-access-s829q\") pod \"dnsmasq-dns-b98ff868f-ppx5k\" (UID: \"846cb0f7-da5f-4088-bcca-05c67d940e83\") " pod="openstack/dnsmasq-dns-b98ff868f-ppx5k" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.716540 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccf67e66-a29f-46d3-8f82-1b1b21211189-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ccf67e66-a29f-46d3-8f82-1b1b21211189\") " pod="openstack/nova-api-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.716575 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7\") " pod="openstack/nova-scheduler-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.716682 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nwd9\" (UniqueName: \"kubernetes.io/projected/ccf67e66-a29f-46d3-8f82-1b1b21211189-kube-api-access-6nwd9\") pod \"nova-api-0\" (UID: 
\"ccf67e66-a29f-46d3-8f82-1b1b21211189\") " pod="openstack/nova-api-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.716832 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/846cb0f7-da5f-4088-bcca-05c67d940e83-dns-svc\") pod \"dnsmasq-dns-b98ff868f-ppx5k\" (UID: \"846cb0f7-da5f-4088-bcca-05c67d940e83\") " pod="openstack/dnsmasq-dns-b98ff868f-ppx5k" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.716945 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/846cb0f7-da5f-4088-bcca-05c67d940e83-config\") pod \"dnsmasq-dns-b98ff868f-ppx5k\" (UID: \"846cb0f7-da5f-4088-bcca-05c67d940e83\") " pod="openstack/dnsmasq-dns-b98ff868f-ppx5k" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.717051 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/846cb0f7-da5f-4088-bcca-05c67d940e83-ovsdbserver-sb\") pod \"dnsmasq-dns-b98ff868f-ppx5k\" (UID: \"846cb0f7-da5f-4088-bcca-05c67d940e83\") " pod="openstack/dnsmasq-dns-b98ff868f-ppx5k" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.717130 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/846cb0f7-da5f-4088-bcca-05c67d940e83-ovsdbserver-nb\") pod \"dnsmasq-dns-b98ff868f-ppx5k\" (UID: \"846cb0f7-da5f-4088-bcca-05c67d940e83\") " pod="openstack/dnsmasq-dns-b98ff868f-ppx5k" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.717269 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccf67e66-a29f-46d3-8f82-1b1b21211189-config-data\") pod \"nova-api-0\" (UID: \"ccf67e66-a29f-46d3-8f82-1b1b21211189\") " 
pod="openstack/nova-api-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.717538 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccf67e66-a29f-46d3-8f82-1b1b21211189-logs\") pod \"nova-api-0\" (UID: \"ccf67e66-a29f-46d3-8f82-1b1b21211189\") " pod="openstack/nova-api-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.717561 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7-config-data\") pod \"nova-scheduler-0\" (UID: \"45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7\") " pod="openstack/nova-scheduler-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.726166 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccf67e66-a29f-46d3-8f82-1b1b21211189-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ccf67e66-a29f-46d3-8f82-1b1b21211189\") " pod="openstack/nova-api-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.727162 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccf67e66-a29f-46d3-8f82-1b1b21211189-logs\") pod \"nova-api-0\" (UID: \"ccf67e66-a29f-46d3-8f82-1b1b21211189\") " pod="openstack/nova-api-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.753216 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccf67e66-a29f-46d3-8f82-1b1b21211189-config-data\") pod \"nova-api-0\" (UID: \"ccf67e66-a29f-46d3-8f82-1b1b21211189\") " pod="openstack/nova-api-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.776769 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nwd9\" (UniqueName: 
\"kubernetes.io/projected/ccf67e66-a29f-46d3-8f82-1b1b21211189-kube-api-access-6nwd9\") pod \"nova-api-0\" (UID: \"ccf67e66-a29f-46d3-8f82-1b1b21211189\") " pod="openstack/nova-api-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.823118 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7-config-data\") pod \"nova-scheduler-0\" (UID: \"45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7\") " pod="openstack/nova-scheduler-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.823233 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr6tf\" (UniqueName: \"kubernetes.io/projected/45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7-kube-api-access-mr6tf\") pod \"nova-scheduler-0\" (UID: \"45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7\") " pod="openstack/nova-scheduler-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.823282 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s829q\" (UniqueName: \"kubernetes.io/projected/846cb0f7-da5f-4088-bcca-05c67d940e83-kube-api-access-s829q\") pod \"dnsmasq-dns-b98ff868f-ppx5k\" (UID: \"846cb0f7-da5f-4088-bcca-05c67d940e83\") " pod="openstack/dnsmasq-dns-b98ff868f-ppx5k" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.823334 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7\") " pod="openstack/nova-scheduler-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.823362 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/846cb0f7-da5f-4088-bcca-05c67d940e83-dns-svc\") pod \"dnsmasq-dns-b98ff868f-ppx5k\" (UID: 
\"846cb0f7-da5f-4088-bcca-05c67d940e83\") " pod="openstack/dnsmasq-dns-b98ff868f-ppx5k" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.823386 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/846cb0f7-da5f-4088-bcca-05c67d940e83-config\") pod \"dnsmasq-dns-b98ff868f-ppx5k\" (UID: \"846cb0f7-da5f-4088-bcca-05c67d940e83\") " pod="openstack/dnsmasq-dns-b98ff868f-ppx5k" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.823422 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/846cb0f7-da5f-4088-bcca-05c67d940e83-ovsdbserver-sb\") pod \"dnsmasq-dns-b98ff868f-ppx5k\" (UID: \"846cb0f7-da5f-4088-bcca-05c67d940e83\") " pod="openstack/dnsmasq-dns-b98ff868f-ppx5k" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.823460 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/846cb0f7-da5f-4088-bcca-05c67d940e83-ovsdbserver-nb\") pod \"dnsmasq-dns-b98ff868f-ppx5k\" (UID: \"846cb0f7-da5f-4088-bcca-05c67d940e83\") " pod="openstack/dnsmasq-dns-b98ff868f-ppx5k" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.824592 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/846cb0f7-da5f-4088-bcca-05c67d940e83-ovsdbserver-nb\") pod \"dnsmasq-dns-b98ff868f-ppx5k\" (UID: \"846cb0f7-da5f-4088-bcca-05c67d940e83\") " pod="openstack/dnsmasq-dns-b98ff868f-ppx5k" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.827498 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/846cb0f7-da5f-4088-bcca-05c67d940e83-ovsdbserver-sb\") pod \"dnsmasq-dns-b98ff868f-ppx5k\" (UID: \"846cb0f7-da5f-4088-bcca-05c67d940e83\") " pod="openstack/dnsmasq-dns-b98ff868f-ppx5k" Feb 02 
12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.828403 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/846cb0f7-da5f-4088-bcca-05c67d940e83-config\") pod \"dnsmasq-dns-b98ff868f-ppx5k\" (UID: \"846cb0f7-da5f-4088-bcca-05c67d940e83\") " pod="openstack/dnsmasq-dns-b98ff868f-ppx5k" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.829883 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/846cb0f7-da5f-4088-bcca-05c67d940e83-dns-svc\") pod \"dnsmasq-dns-b98ff868f-ppx5k\" (UID: \"846cb0f7-da5f-4088-bcca-05c67d940e83\") " pod="openstack/dnsmasq-dns-b98ff868f-ppx5k" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.834239 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7-config-data\") pod \"nova-scheduler-0\" (UID: \"45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7\") " pod="openstack/nova-scheduler-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.846152 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7\") " pod="openstack/nova-scheduler-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.888456 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr6tf\" (UniqueName: \"kubernetes.io/projected/45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7-kube-api-access-mr6tf\") pod \"nova-scheduler-0\" (UID: \"45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7\") " pod="openstack/nova-scheduler-0" Feb 02 12:07:24 crc kubenswrapper[4909]: I0202 12:07:24.889361 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s829q\" (UniqueName: 
\"kubernetes.io/projected/846cb0f7-da5f-4088-bcca-05c67d940e83-kube-api-access-s829q\") pod \"dnsmasq-dns-b98ff868f-ppx5k\" (UID: \"846cb0f7-da5f-4088-bcca-05c67d940e83\") " pod="openstack/dnsmasq-dns-b98ff868f-ppx5k" Feb 02 12:07:25 crc kubenswrapper[4909]: I0202 12:07:25.023048 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 12:07:25 crc kubenswrapper[4909]: I0202 12:07:25.064410 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b98ff868f-ppx5k" Feb 02 12:07:25 crc kubenswrapper[4909]: I0202 12:07:25.091085 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 12:07:25 crc kubenswrapper[4909]: I0202 12:07:25.240432 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gwh2v"] Feb 02 12:07:25 crc kubenswrapper[4909]: I0202 12:07:25.426447 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gwh2v" event={"ID":"2bf1731a-52df-489c-a7f3-8f07774153a1","Type":"ContainerStarted","Data":"56ebf8933d509159848434579c0e500adce1bb4e2fd28487f42b06538b92c427"} Feb 02 12:07:25 crc kubenswrapper[4909]: I0202 12:07:25.488382 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 12:07:25 crc kubenswrapper[4909]: I0202 12:07:25.609214 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nv42m"] Feb 02 12:07:25 crc kubenswrapper[4909]: I0202 12:07:25.610731 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-nv42m" Feb 02 12:07:25 crc kubenswrapper[4909]: I0202 12:07:25.615327 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 02 12:07:25 crc kubenswrapper[4909]: I0202 12:07:25.615500 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 02 12:07:25 crc kubenswrapper[4909]: I0202 12:07:25.622357 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nv42m"] Feb 02 12:07:25 crc kubenswrapper[4909]: I0202 12:07:25.669255 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 12:07:25 crc kubenswrapper[4909]: W0202 12:07:25.742140 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1dbfc015_241c_45df_baf1_789bf92b3642.slice/crio-d60b08c23b42e8757e3c418fdeb53373e12b52b840bf97aafec9145150ad205f WatchSource:0}: Error finding container d60b08c23b42e8757e3c418fdeb53373e12b52b840bf97aafec9145150ad205f: Status 404 returned error can't find the container with id d60b08c23b42e8757e3c418fdeb53373e12b52b840bf97aafec9145150ad205f Feb 02 12:07:25 crc kubenswrapper[4909]: I0202 12:07:25.745290 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61a9bf80-53fb-48a1-b6ce-92cb6c3b0736-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-nv42m\" (UID: \"61a9bf80-53fb-48a1-b6ce-92cb6c3b0736\") " pod="openstack/nova-cell1-conductor-db-sync-nv42m" Feb 02 12:07:25 crc kubenswrapper[4909]: I0202 12:07:25.745335 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61a9bf80-53fb-48a1-b6ce-92cb6c3b0736-config-data\") pod 
\"nova-cell1-conductor-db-sync-nv42m\" (UID: \"61a9bf80-53fb-48a1-b6ce-92cb6c3b0736\") " pod="openstack/nova-cell1-conductor-db-sync-nv42m" Feb 02 12:07:25 crc kubenswrapper[4909]: I0202 12:07:25.745385 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4mqm\" (UniqueName: \"kubernetes.io/projected/61a9bf80-53fb-48a1-b6ce-92cb6c3b0736-kube-api-access-m4mqm\") pod \"nova-cell1-conductor-db-sync-nv42m\" (UID: \"61a9bf80-53fb-48a1-b6ce-92cb6c3b0736\") " pod="openstack/nova-cell1-conductor-db-sync-nv42m" Feb 02 12:07:25 crc kubenswrapper[4909]: I0202 12:07:25.745415 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61a9bf80-53fb-48a1-b6ce-92cb6c3b0736-scripts\") pod \"nova-cell1-conductor-db-sync-nv42m\" (UID: \"61a9bf80-53fb-48a1-b6ce-92cb6c3b0736\") " pod="openstack/nova-cell1-conductor-db-sync-nv42m" Feb 02 12:07:25 crc kubenswrapper[4909]: I0202 12:07:25.849164 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4mqm\" (UniqueName: \"kubernetes.io/projected/61a9bf80-53fb-48a1-b6ce-92cb6c3b0736-kube-api-access-m4mqm\") pod \"nova-cell1-conductor-db-sync-nv42m\" (UID: \"61a9bf80-53fb-48a1-b6ce-92cb6c3b0736\") " pod="openstack/nova-cell1-conductor-db-sync-nv42m" Feb 02 12:07:25 crc kubenswrapper[4909]: I0202 12:07:25.849647 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61a9bf80-53fb-48a1-b6ce-92cb6c3b0736-scripts\") pod \"nova-cell1-conductor-db-sync-nv42m\" (UID: \"61a9bf80-53fb-48a1-b6ce-92cb6c3b0736\") " pod="openstack/nova-cell1-conductor-db-sync-nv42m" Feb 02 12:07:25 crc kubenswrapper[4909]: I0202 12:07:25.850041 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/61a9bf80-53fb-48a1-b6ce-92cb6c3b0736-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-nv42m\" (UID: \"61a9bf80-53fb-48a1-b6ce-92cb6c3b0736\") " pod="openstack/nova-cell1-conductor-db-sync-nv42m" Feb 02 12:07:25 crc kubenswrapper[4909]: I0202 12:07:25.850117 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61a9bf80-53fb-48a1-b6ce-92cb6c3b0736-config-data\") pod \"nova-cell1-conductor-db-sync-nv42m\" (UID: \"61a9bf80-53fb-48a1-b6ce-92cb6c3b0736\") " pod="openstack/nova-cell1-conductor-db-sync-nv42m" Feb 02 12:07:25 crc kubenswrapper[4909]: I0202 12:07:25.853580 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 12:07:25 crc kubenswrapper[4909]: I0202 12:07:25.854503 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61a9bf80-53fb-48a1-b6ce-92cb6c3b0736-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-nv42m\" (UID: \"61a9bf80-53fb-48a1-b6ce-92cb6c3b0736\") " pod="openstack/nova-cell1-conductor-db-sync-nv42m" Feb 02 12:07:25 crc kubenswrapper[4909]: I0202 12:07:25.856695 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61a9bf80-53fb-48a1-b6ce-92cb6c3b0736-scripts\") pod \"nova-cell1-conductor-db-sync-nv42m\" (UID: \"61a9bf80-53fb-48a1-b6ce-92cb6c3b0736\") " pod="openstack/nova-cell1-conductor-db-sync-nv42m" Feb 02 12:07:25 crc kubenswrapper[4909]: I0202 12:07:25.867028 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61a9bf80-53fb-48a1-b6ce-92cb6c3b0736-config-data\") pod \"nova-cell1-conductor-db-sync-nv42m\" (UID: \"61a9bf80-53fb-48a1-b6ce-92cb6c3b0736\") " pod="openstack/nova-cell1-conductor-db-sync-nv42m" Feb 02 12:07:25 crc kubenswrapper[4909]: I0202 12:07:25.877917 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4mqm\" (UniqueName: \"kubernetes.io/projected/61a9bf80-53fb-48a1-b6ce-92cb6c3b0736-kube-api-access-m4mqm\") pod \"nova-cell1-conductor-db-sync-nv42m\" (UID: \"61a9bf80-53fb-48a1-b6ce-92cb6c3b0736\") " pod="openstack/nova-cell1-conductor-db-sync-nv42m" Feb 02 12:07:25 crc kubenswrapper[4909]: I0202 12:07:25.954749 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b98ff868f-ppx5k"] Feb 02 12:07:26 crc kubenswrapper[4909]: I0202 12:07:26.001471 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 12:07:26 crc kubenswrapper[4909]: I0202 12:07:26.126337 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-nv42m" Feb 02 12:07:26 crc kubenswrapper[4909]: I0202 12:07:26.447526 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7","Type":"ContainerStarted","Data":"b6a65f1080c0ab6c72a4db50014fe622195670290139bd499d2d6d826b74a6dd"} Feb 02 12:07:26 crc kubenswrapper[4909]: I0202 12:07:26.448537 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7","Type":"ContainerStarted","Data":"18915acb9ba6e2cb3557edbc0b73a2cba72ded8260d4fc54e15feb2375a641d6"} Feb 02 12:07:26 crc kubenswrapper[4909]: I0202 12:07:26.465003 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ccf67e66-a29f-46d3-8f82-1b1b21211189","Type":"ContainerStarted","Data":"64c09969500bb7b421cdfcbd7ed9710827601fcd712ecc9c1501c68919dc1a4b"} Feb 02 12:07:26 crc kubenswrapper[4909]: I0202 12:07:26.465241 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"ccf67e66-a29f-46d3-8f82-1b1b21211189","Type":"ContainerStarted","Data":"44c9928a76f06252169ec8da41f7fe6c50bc1d1848ed5fe6dd867101383f49fc"} Feb 02 12:07:26 crc kubenswrapper[4909]: I0202 12:07:26.465303 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ccf67e66-a29f-46d3-8f82-1b1b21211189","Type":"ContainerStarted","Data":"20d39084384efab7ebead9847aa1c22e6f7d9bc994967cb043c697540a8bbf62"} Feb 02 12:07:26 crc kubenswrapper[4909]: I0202 12:07:26.486791 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.4867694289999998 podStartE2EDuration="2.486769429s" podCreationTimestamp="2026-02-02 12:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:07:26.47727249 +0000 UTC m=+5772.223373225" watchObservedRunningTime="2026-02-02 12:07:26.486769429 +0000 UTC m=+5772.232870164" Feb 02 12:07:26 crc kubenswrapper[4909]: I0202 12:07:26.490336 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1dbfc015-241c-45df-baf1-789bf92b3642","Type":"ContainerStarted","Data":"d1f6fdb0d837275886d8798fe1233c7d167e4d3b69749311ee9b548bafbfe7a1"} Feb 02 12:07:26 crc kubenswrapper[4909]: I0202 12:07:26.490387 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1dbfc015-241c-45df-baf1-789bf92b3642","Type":"ContainerStarted","Data":"d60b08c23b42e8757e3c418fdeb53373e12b52b840bf97aafec9145150ad205f"} Feb 02 12:07:26 crc kubenswrapper[4909]: I0202 12:07:26.508567 4909 generic.go:334] "Generic (PLEG): container finished" podID="846cb0f7-da5f-4088-bcca-05c67d940e83" containerID="f9907d6fba5061a45123d1fbecc75ca822de08d7ef12ae69af5f51716153c2ce" exitCode=0 Feb 02 12:07:26 crc kubenswrapper[4909]: I0202 12:07:26.508624 4909 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-b98ff868f-ppx5k" event={"ID":"846cb0f7-da5f-4088-bcca-05c67d940e83","Type":"ContainerDied","Data":"f9907d6fba5061a45123d1fbecc75ca822de08d7ef12ae69af5f51716153c2ce"} Feb 02 12:07:26 crc kubenswrapper[4909]: I0202 12:07:26.508688 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b98ff868f-ppx5k" event={"ID":"846cb0f7-da5f-4088-bcca-05c67d940e83","Type":"ContainerStarted","Data":"c3b3ddd00568a433b3c9b62b2c87e9ebbdd846205cb577475659d20d4ac73846"} Feb 02 12:07:26 crc kubenswrapper[4909]: I0202 12:07:26.517782 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gwh2v" event={"ID":"2bf1731a-52df-489c-a7f3-8f07774153a1","Type":"ContainerStarted","Data":"f8e95fbdf18d135b654dd615e0aa94162c15ef0a4cc4e312b5c15941e23926bc"} Feb 02 12:07:26 crc kubenswrapper[4909]: I0202 12:07:26.552289 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c7e1382a-3ffc-4779-a8ea-03261453225a","Type":"ContainerStarted","Data":"bd64739b2440984cbb4b168db2c259434809904a4de7df6d8b5d5fa773dc2b8a"} Feb 02 12:07:26 crc kubenswrapper[4909]: I0202 12:07:26.552643 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c7e1382a-3ffc-4779-a8ea-03261453225a","Type":"ContainerStarted","Data":"affa4329f70d92173c21853cca9d9d0833cf6071e27b3dd38ad052542a118e31"} Feb 02 12:07:26 crc kubenswrapper[4909]: I0202 12:07:26.552662 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c7e1382a-3ffc-4779-a8ea-03261453225a","Type":"ContainerStarted","Data":"ac7f78ef6d3e3bb744dca80998ef0e4ed2a840b8b0d194d187cf5a8fefe1d6e8"} Feb 02 12:07:26 crc kubenswrapper[4909]: I0202 12:07:26.593618 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.593589601 podStartE2EDuration="2.593589601s" 
podCreationTimestamp="2026-02-02 12:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:07:26.525365565 +0000 UTC m=+5772.271466300" watchObservedRunningTime="2026-02-02 12:07:26.593589601 +0000 UTC m=+5772.339690336" Feb 02 12:07:26 crc kubenswrapper[4909]: I0202 12:07:26.626004 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.625985071 podStartE2EDuration="2.625985071s" podCreationTimestamp="2026-02-02 12:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:07:26.507157078 +0000 UTC m=+5772.253257833" watchObservedRunningTime="2026-02-02 12:07:26.625985071 +0000 UTC m=+5772.372085806" Feb 02 12:07:26 crc kubenswrapper[4909]: I0202 12:07:26.651990 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-gwh2v" podStartSLOduration=2.651970688 podStartE2EDuration="2.651970688s" podCreationTimestamp="2026-02-02 12:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:07:26.60695015 +0000 UTC m=+5772.353050895" watchObservedRunningTime="2026-02-02 12:07:26.651970688 +0000 UTC m=+5772.398071423" Feb 02 12:07:26 crc kubenswrapper[4909]: I0202 12:07:26.737063 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.737034873 podStartE2EDuration="2.737034873s" podCreationTimestamp="2026-02-02 12:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:07:26.632916947 +0000 UTC m=+5772.379017692" watchObservedRunningTime="2026-02-02 12:07:26.737034873 +0000 UTC 
m=+5772.483135608" Feb 02 12:07:26 crc kubenswrapper[4909]: I0202 12:07:26.784965 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nv42m"] Feb 02 12:07:27 crc kubenswrapper[4909]: I0202 12:07:27.563046 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b98ff868f-ppx5k" event={"ID":"846cb0f7-da5f-4088-bcca-05c67d940e83","Type":"ContainerStarted","Data":"1e22a58adfb790c4dcc149d293cb56c9c93cf1e035e4bc68748662266b1baa0d"} Feb 02 12:07:27 crc kubenswrapper[4909]: I0202 12:07:27.563595 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b98ff868f-ppx5k" Feb 02 12:07:27 crc kubenswrapper[4909]: I0202 12:07:27.565023 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-nv42m" event={"ID":"61a9bf80-53fb-48a1-b6ce-92cb6c3b0736","Type":"ContainerStarted","Data":"be1161d05bde089eede341d85c3b89f4a7814c7514ec7035c999b85395228059"} Feb 02 12:07:27 crc kubenswrapper[4909]: I0202 12:07:27.565069 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-nv42m" event={"ID":"61a9bf80-53fb-48a1-b6ce-92cb6c3b0736","Type":"ContainerStarted","Data":"598c856a553124b3feb36c7fe9be0b5b5e6d509becbd3ef25010065fa18e73f8"} Feb 02 12:07:27 crc kubenswrapper[4909]: I0202 12:07:27.584200 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b98ff868f-ppx5k" podStartSLOduration=3.584179957 podStartE2EDuration="3.584179957s" podCreationTimestamp="2026-02-02 12:07:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:07:27.57970054 +0000 UTC m=+5773.325801265" watchObservedRunningTime="2026-02-02 12:07:27.584179957 +0000 UTC m=+5773.330280692" Feb 02 12:07:27 crc kubenswrapper[4909]: I0202 12:07:27.602288 4909 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-nv42m" podStartSLOduration=2.60226796 podStartE2EDuration="2.60226796s" podCreationTimestamp="2026-02-02 12:07:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:07:27.599054779 +0000 UTC m=+5773.345155514" watchObservedRunningTime="2026-02-02 12:07:27.60226796 +0000 UTC m=+5773.348368695" Feb 02 12:07:29 crc kubenswrapper[4909]: I0202 12:07:29.223935 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 12:07:29 crc kubenswrapper[4909]: I0202 12:07:29.224192 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="1dbfc015-241c-45df-baf1-789bf92b3642" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://d1f6fdb0d837275886d8798fe1233c7d167e4d3b69749311ee9b548bafbfe7a1" gracePeriod=30 Feb 02 12:07:29 crc kubenswrapper[4909]: I0202 12:07:29.252596 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 12:07:29 crc kubenswrapper[4909]: I0202 12:07:29.252867 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c7e1382a-3ffc-4779-a8ea-03261453225a" containerName="nova-metadata-log" containerID="cri-o://affa4329f70d92173c21853cca9d9d0833cf6071e27b3dd38ad052542a118e31" gracePeriod=30 Feb 02 12:07:29 crc kubenswrapper[4909]: I0202 12:07:29.252979 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c7e1382a-3ffc-4779-a8ea-03261453225a" containerName="nova-metadata-metadata" containerID="cri-o://bd64739b2440984cbb4b168db2c259434809904a4de7df6d8b5d5fa773dc2b8a" gracePeriod=30 Feb 02 12:07:29 crc kubenswrapper[4909]: I0202 12:07:29.582773 4909 generic.go:334] "Generic (PLEG): container finished" 
podID="c7e1382a-3ffc-4779-a8ea-03261453225a" containerID="bd64739b2440984cbb4b168db2c259434809904a4de7df6d8b5d5fa773dc2b8a" exitCode=0 Feb 02 12:07:29 crc kubenswrapper[4909]: I0202 12:07:29.582823 4909 generic.go:334] "Generic (PLEG): container finished" podID="c7e1382a-3ffc-4779-a8ea-03261453225a" containerID="affa4329f70d92173c21853cca9d9d0833cf6071e27b3dd38ad052542a118e31" exitCode=143 Feb 02 12:07:29 crc kubenswrapper[4909]: I0202 12:07:29.582851 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c7e1382a-3ffc-4779-a8ea-03261453225a","Type":"ContainerDied","Data":"bd64739b2440984cbb4b168db2c259434809904a4de7df6d8b5d5fa773dc2b8a"} Feb 02 12:07:29 crc kubenswrapper[4909]: I0202 12:07:29.582880 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c7e1382a-3ffc-4779-a8ea-03261453225a","Type":"ContainerDied","Data":"affa4329f70d92173c21853cca9d9d0833cf6071e27b3dd38ad052542a118e31"} Feb 02 12:07:29 crc kubenswrapper[4909]: I0202 12:07:29.613613 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:29 crc kubenswrapper[4909]: I0202 12:07:29.682672 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 02 12:07:29 crc kubenswrapper[4909]: I0202 12:07:29.682736 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.091949 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.330067 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.361501 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.474676 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dbfc015-241c-45df-baf1-789bf92b3642-combined-ca-bundle\") pod \"1dbfc015-241c-45df-baf1-789bf92b3642\" (UID: \"1dbfc015-241c-45df-baf1-789bf92b3642\") " Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.474734 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77pkp\" (UniqueName: \"kubernetes.io/projected/c7e1382a-3ffc-4779-a8ea-03261453225a-kube-api-access-77pkp\") pod \"c7e1382a-3ffc-4779-a8ea-03261453225a\" (UID: \"c7e1382a-3ffc-4779-a8ea-03261453225a\") " Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.474863 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l85tt\" (UniqueName: \"kubernetes.io/projected/1dbfc015-241c-45df-baf1-789bf92b3642-kube-api-access-l85tt\") pod \"1dbfc015-241c-45df-baf1-789bf92b3642\" (UID: \"1dbfc015-241c-45df-baf1-789bf92b3642\") " Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.475073 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e1382a-3ffc-4779-a8ea-03261453225a-logs\") pod \"c7e1382a-3ffc-4779-a8ea-03261453225a\" (UID: \"c7e1382a-3ffc-4779-a8ea-03261453225a\") " Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.475098 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1382a-3ffc-4779-a8ea-03261453225a-combined-ca-bundle\") pod \"c7e1382a-3ffc-4779-a8ea-03261453225a\" (UID: \"c7e1382a-3ffc-4779-a8ea-03261453225a\") " Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.475134 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/1dbfc015-241c-45df-baf1-789bf92b3642-config-data\") pod \"1dbfc015-241c-45df-baf1-789bf92b3642\" (UID: \"1dbfc015-241c-45df-baf1-789bf92b3642\") " Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.475193 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e1382a-3ffc-4779-a8ea-03261453225a-config-data\") pod \"c7e1382a-3ffc-4779-a8ea-03261453225a\" (UID: \"c7e1382a-3ffc-4779-a8ea-03261453225a\") " Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.475735 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7e1382a-3ffc-4779-a8ea-03261453225a-logs" (OuterVolumeSpecName: "logs") pod "c7e1382a-3ffc-4779-a8ea-03261453225a" (UID: "c7e1382a-3ffc-4779-a8ea-03261453225a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.480191 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dbfc015-241c-45df-baf1-789bf92b3642-kube-api-access-l85tt" (OuterVolumeSpecName: "kube-api-access-l85tt") pod "1dbfc015-241c-45df-baf1-789bf92b3642" (UID: "1dbfc015-241c-45df-baf1-789bf92b3642"). InnerVolumeSpecName "kube-api-access-l85tt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.480323 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7e1382a-3ffc-4779-a8ea-03261453225a-kube-api-access-77pkp" (OuterVolumeSpecName: "kube-api-access-77pkp") pod "c7e1382a-3ffc-4779-a8ea-03261453225a" (UID: "c7e1382a-3ffc-4779-a8ea-03261453225a"). InnerVolumeSpecName "kube-api-access-77pkp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.510280 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dbfc015-241c-45df-baf1-789bf92b3642-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1dbfc015-241c-45df-baf1-789bf92b3642" (UID: "1dbfc015-241c-45df-baf1-789bf92b3642"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.510295 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e1382a-3ffc-4779-a8ea-03261453225a-config-data" (OuterVolumeSpecName: "config-data") pod "c7e1382a-3ffc-4779-a8ea-03261453225a" (UID: "c7e1382a-3ffc-4779-a8ea-03261453225a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.513786 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dbfc015-241c-45df-baf1-789bf92b3642-config-data" (OuterVolumeSpecName: "config-data") pod "1dbfc015-241c-45df-baf1-789bf92b3642" (UID: "1dbfc015-241c-45df-baf1-789bf92b3642"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.520984 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e1382a-3ffc-4779-a8ea-03261453225a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7e1382a-3ffc-4779-a8ea-03261453225a" (UID: "c7e1382a-3ffc-4779-a8ea-03261453225a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.578742 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dbfc015-241c-45df-baf1-789bf92b3642-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.578774 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77pkp\" (UniqueName: \"kubernetes.io/projected/c7e1382a-3ffc-4779-a8ea-03261453225a-kube-api-access-77pkp\") on node \"crc\" DevicePath \"\"" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.578786 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l85tt\" (UniqueName: \"kubernetes.io/projected/1dbfc015-241c-45df-baf1-789bf92b3642-kube-api-access-l85tt\") on node \"crc\" DevicePath \"\"" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.578795 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e1382a-3ffc-4779-a8ea-03261453225a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.578820 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e1382a-3ffc-4779-a8ea-03261453225a-logs\") on node \"crc\" DevicePath \"\"" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.578831 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dbfc015-241c-45df-baf1-789bf92b3642-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.578839 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e1382a-3ffc-4779-a8ea-03261453225a-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.591764 4909 
generic.go:334] "Generic (PLEG): container finished" podID="1dbfc015-241c-45df-baf1-789bf92b3642" containerID="d1f6fdb0d837275886d8798fe1233c7d167e4d3b69749311ee9b548bafbfe7a1" exitCode=0 Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.591833 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1dbfc015-241c-45df-baf1-789bf92b3642","Type":"ContainerDied","Data":"d1f6fdb0d837275886d8798fe1233c7d167e4d3b69749311ee9b548bafbfe7a1"} Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.591859 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1dbfc015-241c-45df-baf1-789bf92b3642","Type":"ContainerDied","Data":"d60b08c23b42e8757e3c418fdeb53373e12b52b840bf97aafec9145150ad205f"} Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.591875 4909 scope.go:117] "RemoveContainer" containerID="d1f6fdb0d837275886d8798fe1233c7d167e4d3b69749311ee9b548bafbfe7a1" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.591993 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.595975 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c7e1382a-3ffc-4779-a8ea-03261453225a","Type":"ContainerDied","Data":"ac7f78ef6d3e3bb744dca80998ef0e4ed2a840b8b0d194d187cf5a8fefe1d6e8"} Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.596117 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.597648 4909 generic.go:334] "Generic (PLEG): container finished" podID="61a9bf80-53fb-48a1-b6ce-92cb6c3b0736" containerID="be1161d05bde089eede341d85c3b89f4a7814c7514ec7035c999b85395228059" exitCode=0 Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.597776 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-nv42m" event={"ID":"61a9bf80-53fb-48a1-b6ce-92cb6c3b0736","Type":"ContainerDied","Data":"be1161d05bde089eede341d85c3b89f4a7814c7514ec7035c999b85395228059"} Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.630107 4909 scope.go:117] "RemoveContainer" containerID="d1f6fdb0d837275886d8798fe1233c7d167e4d3b69749311ee9b548bafbfe7a1" Feb 02 12:07:30 crc kubenswrapper[4909]: E0202 12:07:30.635976 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1f6fdb0d837275886d8798fe1233c7d167e4d3b69749311ee9b548bafbfe7a1\": container with ID starting with d1f6fdb0d837275886d8798fe1233c7d167e4d3b69749311ee9b548bafbfe7a1 not found: ID does not exist" containerID="d1f6fdb0d837275886d8798fe1233c7d167e4d3b69749311ee9b548bafbfe7a1" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.636045 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1f6fdb0d837275886d8798fe1233c7d167e4d3b69749311ee9b548bafbfe7a1"} err="failed to get container status \"d1f6fdb0d837275886d8798fe1233c7d167e4d3b69749311ee9b548bafbfe7a1\": rpc error: code = NotFound desc = could not find container \"d1f6fdb0d837275886d8798fe1233c7d167e4d3b69749311ee9b548bafbfe7a1\": container with ID starting with d1f6fdb0d837275886d8798fe1233c7d167e4d3b69749311ee9b548bafbfe7a1 not found: ID does not exist" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.636074 4909 scope.go:117] "RemoveContainer" 
containerID="bd64739b2440984cbb4b168db2c259434809904a4de7df6d8b5d5fa773dc2b8a" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.647703 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.664491 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.678277 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.688636 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.697783 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 12:07:30 crc kubenswrapper[4909]: E0202 12:07:30.698209 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7e1382a-3ffc-4779-a8ea-03261453225a" containerName="nova-metadata-log" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.698226 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7e1382a-3ffc-4779-a8ea-03261453225a" containerName="nova-metadata-log" Feb 02 12:07:30 crc kubenswrapper[4909]: E0202 12:07:30.698239 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dbfc015-241c-45df-baf1-789bf92b3642" containerName="nova-cell1-novncproxy-novncproxy" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.698246 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dbfc015-241c-45df-baf1-789bf92b3642" containerName="nova-cell1-novncproxy-novncproxy" Feb 02 12:07:30 crc kubenswrapper[4909]: E0202 12:07:30.698264 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7e1382a-3ffc-4779-a8ea-03261453225a" containerName="nova-metadata-metadata" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.698270 4909 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="c7e1382a-3ffc-4779-a8ea-03261453225a" containerName="nova-metadata-metadata" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.698428 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7e1382a-3ffc-4779-a8ea-03261453225a" containerName="nova-metadata-log" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.698443 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dbfc015-241c-45df-baf1-789bf92b3642" containerName="nova-cell1-novncproxy-novncproxy" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.698459 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7e1382a-3ffc-4779-a8ea-03261453225a" containerName="nova-metadata-metadata" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.699081 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.700701 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.701090 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.702274 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.702414 4909 scope.go:117] "RemoveContainer" containerID="affa4329f70d92173c21853cca9d9d0833cf6071e27b3dd38ad052542a118e31" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.714952 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.730913 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 
12:07:30.733885 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.737583 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.741949 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.741986 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.782801 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e34df5c5-803e-42a9-9cba-6562cc33f0d1-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e34df5c5-803e-42a9-9cba-6562cc33f0d1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.782917 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqnxk\" (UniqueName: \"kubernetes.io/projected/e34df5c5-803e-42a9-9cba-6562cc33f0d1-kube-api-access-pqnxk\") pod \"nova-cell1-novncproxy-0\" (UID: \"e34df5c5-803e-42a9-9cba-6562cc33f0d1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.782951 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e34df5c5-803e-42a9-9cba-6562cc33f0d1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e34df5c5-803e-42a9-9cba-6562cc33f0d1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.783001 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e34df5c5-803e-42a9-9cba-6562cc33f0d1-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e34df5c5-803e-42a9-9cba-6562cc33f0d1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.783111 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e34df5c5-803e-42a9-9cba-6562cc33f0d1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e34df5c5-803e-42a9-9cba-6562cc33f0d1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.884775 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e34df5c5-803e-42a9-9cba-6562cc33f0d1-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e34df5c5-803e-42a9-9cba-6562cc33f0d1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.884894 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/70897894-4676-41ac-96a3-ef5c97b052e6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"70897894-4676-41ac-96a3-ef5c97b052e6\") " pod="openstack/nova-metadata-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.884941 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70897894-4676-41ac-96a3-ef5c97b052e6-config-data\") pod \"nova-metadata-0\" (UID: \"70897894-4676-41ac-96a3-ef5c97b052e6\") " pod="openstack/nova-metadata-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.884965 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70897894-4676-41ac-96a3-ef5c97b052e6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"70897894-4676-41ac-96a3-ef5c97b052e6\") " pod="openstack/nova-metadata-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.885009 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e34df5c5-803e-42a9-9cba-6562cc33f0d1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e34df5c5-803e-42a9-9cba-6562cc33f0d1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.885086 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llft7\" (UniqueName: \"kubernetes.io/projected/70897894-4676-41ac-96a3-ef5c97b052e6-kube-api-access-llft7\") pod \"nova-metadata-0\" (UID: \"70897894-4676-41ac-96a3-ef5c97b052e6\") " pod="openstack/nova-metadata-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.885147 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e34df5c5-803e-42a9-9cba-6562cc33f0d1-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e34df5c5-803e-42a9-9cba-6562cc33f0d1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.885170 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqnxk\" (UniqueName: \"kubernetes.io/projected/e34df5c5-803e-42a9-9cba-6562cc33f0d1-kube-api-access-pqnxk\") pod \"nova-cell1-novncproxy-0\" (UID: \"e34df5c5-803e-42a9-9cba-6562cc33f0d1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.885199 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e34df5c5-803e-42a9-9cba-6562cc33f0d1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e34df5c5-803e-42a9-9cba-6562cc33f0d1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.885434 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70897894-4676-41ac-96a3-ef5c97b052e6-logs\") pod \"nova-metadata-0\" (UID: \"70897894-4676-41ac-96a3-ef5c97b052e6\") " pod="openstack/nova-metadata-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.889873 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e34df5c5-803e-42a9-9cba-6562cc33f0d1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e34df5c5-803e-42a9-9cba-6562cc33f0d1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.889873 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e34df5c5-803e-42a9-9cba-6562cc33f0d1-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e34df5c5-803e-42a9-9cba-6562cc33f0d1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.890143 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e34df5c5-803e-42a9-9cba-6562cc33f0d1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e34df5c5-803e-42a9-9cba-6562cc33f0d1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.903524 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e34df5c5-803e-42a9-9cba-6562cc33f0d1-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"e34df5c5-803e-42a9-9cba-6562cc33f0d1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.903838 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqnxk\" (UniqueName: \"kubernetes.io/projected/e34df5c5-803e-42a9-9cba-6562cc33f0d1-kube-api-access-pqnxk\") pod \"nova-cell1-novncproxy-0\" (UID: \"e34df5c5-803e-42a9-9cba-6562cc33f0d1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.987437 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/70897894-4676-41ac-96a3-ef5c97b052e6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"70897894-4676-41ac-96a3-ef5c97b052e6\") " pod="openstack/nova-metadata-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.987499 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70897894-4676-41ac-96a3-ef5c97b052e6-config-data\") pod \"nova-metadata-0\" (UID: \"70897894-4676-41ac-96a3-ef5c97b052e6\") " pod="openstack/nova-metadata-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.987520 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70897894-4676-41ac-96a3-ef5c97b052e6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"70897894-4676-41ac-96a3-ef5c97b052e6\") " pod="openstack/nova-metadata-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.987597 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llft7\" (UniqueName: \"kubernetes.io/projected/70897894-4676-41ac-96a3-ef5c97b052e6-kube-api-access-llft7\") pod \"nova-metadata-0\" (UID: \"70897894-4676-41ac-96a3-ef5c97b052e6\") " pod="openstack/nova-metadata-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 
12:07:30.987657 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70897894-4676-41ac-96a3-ef5c97b052e6-logs\") pod \"nova-metadata-0\" (UID: \"70897894-4676-41ac-96a3-ef5c97b052e6\") " pod="openstack/nova-metadata-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.988103 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70897894-4676-41ac-96a3-ef5c97b052e6-logs\") pod \"nova-metadata-0\" (UID: \"70897894-4676-41ac-96a3-ef5c97b052e6\") " pod="openstack/nova-metadata-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.991219 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/70897894-4676-41ac-96a3-ef5c97b052e6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"70897894-4676-41ac-96a3-ef5c97b052e6\") " pod="openstack/nova-metadata-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.991272 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70897894-4676-41ac-96a3-ef5c97b052e6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"70897894-4676-41ac-96a3-ef5c97b052e6\") " pod="openstack/nova-metadata-0" Feb 02 12:07:30 crc kubenswrapper[4909]: I0202 12:07:30.991512 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70897894-4676-41ac-96a3-ef5c97b052e6-config-data\") pod \"nova-metadata-0\" (UID: \"70897894-4676-41ac-96a3-ef5c97b052e6\") " pod="openstack/nova-metadata-0" Feb 02 12:07:31 crc kubenswrapper[4909]: I0202 12:07:31.005698 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llft7\" (UniqueName: \"kubernetes.io/projected/70897894-4676-41ac-96a3-ef5c97b052e6-kube-api-access-llft7\") pod \"nova-metadata-0\" (UID: 
\"70897894-4676-41ac-96a3-ef5c97b052e6\") " pod="openstack/nova-metadata-0" Feb 02 12:07:31 crc kubenswrapper[4909]: I0202 12:07:31.027506 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dbfc015-241c-45df-baf1-789bf92b3642" path="/var/lib/kubelet/pods/1dbfc015-241c-45df-baf1-789bf92b3642/volumes" Feb 02 12:07:31 crc kubenswrapper[4909]: I0202 12:07:31.028280 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7e1382a-3ffc-4779-a8ea-03261453225a" path="/var/lib/kubelet/pods/c7e1382a-3ffc-4779-a8ea-03261453225a/volumes" Feb 02 12:07:31 crc kubenswrapper[4909]: I0202 12:07:31.035647 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:31 crc kubenswrapper[4909]: I0202 12:07:31.062929 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 12:07:31 crc kubenswrapper[4909]: I0202 12:07:31.504688 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 02 12:07:31 crc kubenswrapper[4909]: I0202 12:07:31.584052 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 12:07:31 crc kubenswrapper[4909]: W0202 12:07:31.595934 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70897894_4676_41ac_96a3_ef5c97b052e6.slice/crio-3bc7db43901878125a8ee721eed6c44ebec1bd983a33d07e3a45446cb1673464 WatchSource:0}: Error finding container 3bc7db43901878125a8ee721eed6c44ebec1bd983a33d07e3a45446cb1673464: Status 404 returned error can't find the container with id 3bc7db43901878125a8ee721eed6c44ebec1bd983a33d07e3a45446cb1673464 Feb 02 12:07:31 crc kubenswrapper[4909]: I0202 12:07:31.622589 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"e34df5c5-803e-42a9-9cba-6562cc33f0d1","Type":"ContainerStarted","Data":"08a628388edbef9539afd01cdd1f3940022249c349bd1787048dada9fb032a13"} Feb 02 12:07:31 crc kubenswrapper[4909]: I0202 12:07:31.625174 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70897894-4676-41ac-96a3-ef5c97b052e6","Type":"ContainerStarted","Data":"3bc7db43901878125a8ee721eed6c44ebec1bd983a33d07e3a45446cb1673464"} Feb 02 12:07:31 crc kubenswrapper[4909]: I0202 12:07:31.896185 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-nv42m" Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.023830 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61a9bf80-53fb-48a1-b6ce-92cb6c3b0736-scripts\") pod \"61a9bf80-53fb-48a1-b6ce-92cb6c3b0736\" (UID: \"61a9bf80-53fb-48a1-b6ce-92cb6c3b0736\") " Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.024005 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61a9bf80-53fb-48a1-b6ce-92cb6c3b0736-combined-ca-bundle\") pod \"61a9bf80-53fb-48a1-b6ce-92cb6c3b0736\" (UID: \"61a9bf80-53fb-48a1-b6ce-92cb6c3b0736\") " Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.024033 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4mqm\" (UniqueName: \"kubernetes.io/projected/61a9bf80-53fb-48a1-b6ce-92cb6c3b0736-kube-api-access-m4mqm\") pod \"61a9bf80-53fb-48a1-b6ce-92cb6c3b0736\" (UID: \"61a9bf80-53fb-48a1-b6ce-92cb6c3b0736\") " Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.024088 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61a9bf80-53fb-48a1-b6ce-92cb6c3b0736-config-data\") pod 
\"61a9bf80-53fb-48a1-b6ce-92cb6c3b0736\" (UID: \"61a9bf80-53fb-48a1-b6ce-92cb6c3b0736\") " Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.027965 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61a9bf80-53fb-48a1-b6ce-92cb6c3b0736-scripts" (OuterVolumeSpecName: "scripts") pod "61a9bf80-53fb-48a1-b6ce-92cb6c3b0736" (UID: "61a9bf80-53fb-48a1-b6ce-92cb6c3b0736"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.028347 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61a9bf80-53fb-48a1-b6ce-92cb6c3b0736-kube-api-access-m4mqm" (OuterVolumeSpecName: "kube-api-access-m4mqm") pod "61a9bf80-53fb-48a1-b6ce-92cb6c3b0736" (UID: "61a9bf80-53fb-48a1-b6ce-92cb6c3b0736"). InnerVolumeSpecName "kube-api-access-m4mqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.048705 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61a9bf80-53fb-48a1-b6ce-92cb6c3b0736-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61a9bf80-53fb-48a1-b6ce-92cb6c3b0736" (UID: "61a9bf80-53fb-48a1-b6ce-92cb6c3b0736"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.056735 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61a9bf80-53fb-48a1-b6ce-92cb6c3b0736-config-data" (OuterVolumeSpecName: "config-data") pod "61a9bf80-53fb-48a1-b6ce-92cb6c3b0736" (UID: "61a9bf80-53fb-48a1-b6ce-92cb6c3b0736"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.127174 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61a9bf80-53fb-48a1-b6ce-92cb6c3b0736-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.128212 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4mqm\" (UniqueName: \"kubernetes.io/projected/61a9bf80-53fb-48a1-b6ce-92cb6c3b0736-kube-api-access-m4mqm\") on node \"crc\" DevicePath \"\"" Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.128242 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61a9bf80-53fb-48a1-b6ce-92cb6c3b0736-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.128250 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61a9bf80-53fb-48a1-b6ce-92cb6c3b0736-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.639310 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e34df5c5-803e-42a9-9cba-6562cc33f0d1","Type":"ContainerStarted","Data":"771ec02e314eae59a8090cd0cc30ea5bc01c979914c050970d57e20755e3fec9"} Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.642639 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70897894-4676-41ac-96a3-ef5c97b052e6","Type":"ContainerStarted","Data":"082c6bbb00cb0a505a9d04763b16e320bfd7b585edaaa078881e9127a177fe27"} Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.642685 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"70897894-4676-41ac-96a3-ef5c97b052e6","Type":"ContainerStarted","Data":"52168aae2079b0f9e0aebaa6dcc352f265b13b95f0ea7b9877a772ac6141fd22"} Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.645330 4909 generic.go:334] "Generic (PLEG): container finished" podID="2bf1731a-52df-489c-a7f3-8f07774153a1" containerID="f8e95fbdf18d135b654dd615e0aa94162c15ef0a4cc4e312b5c15941e23926bc" exitCode=0 Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.645391 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gwh2v" event={"ID":"2bf1731a-52df-489c-a7f3-8f07774153a1","Type":"ContainerDied","Data":"f8e95fbdf18d135b654dd615e0aa94162c15ef0a4cc4e312b5c15941e23926bc"} Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.647146 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-nv42m" event={"ID":"61a9bf80-53fb-48a1-b6ce-92cb6c3b0736","Type":"ContainerDied","Data":"598c856a553124b3feb36c7fe9be0b5b5e6d509becbd3ef25010065fa18e73f8"} Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.647178 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="598c856a553124b3feb36c7fe9be0b5b5e6d509becbd3ef25010065fa18e73f8" Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.647232 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-nv42m" Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.671256 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.671235335 podStartE2EDuration="2.671235335s" podCreationTimestamp="2026-02-02 12:07:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:07:32.659177093 +0000 UTC m=+5778.405277828" watchObservedRunningTime="2026-02-02 12:07:32.671235335 +0000 UTC m=+5778.417336070" Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.682228 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.682213227 podStartE2EDuration="2.682213227s" podCreationTimestamp="2026-02-02 12:07:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:07:32.681112266 +0000 UTC m=+5778.427213001" watchObservedRunningTime="2026-02-02 12:07:32.682213227 +0000 UTC m=+5778.428313962" Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.746242 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 02 12:07:32 crc kubenswrapper[4909]: E0202 12:07:32.746668 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a9bf80-53fb-48a1-b6ce-92cb6c3b0736" containerName="nova-cell1-conductor-db-sync" Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.746690 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a9bf80-53fb-48a1-b6ce-92cb6c3b0736" containerName="nova-cell1-conductor-db-sync" Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.746884 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="61a9bf80-53fb-48a1-b6ce-92cb6c3b0736" containerName="nova-cell1-conductor-db-sync" Feb 02 
12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.747486 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.756059 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.757762 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.839701 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41dd6b28-b792-4d80-9cd2-3ab8f738be53-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"41dd6b28-b792-4d80-9cd2-3ab8f738be53\") " pod="openstack/nova-cell1-conductor-0"
Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.839846 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41dd6b28-b792-4d80-9cd2-3ab8f738be53-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"41dd6b28-b792-4d80-9cd2-3ab8f738be53\") " pod="openstack/nova-cell1-conductor-0"
Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.839920 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wv8d\" (UniqueName: \"kubernetes.io/projected/41dd6b28-b792-4d80-9cd2-3ab8f738be53-kube-api-access-7wv8d\") pod \"nova-cell1-conductor-0\" (UID: \"41dd6b28-b792-4d80-9cd2-3ab8f738be53\") " pod="openstack/nova-cell1-conductor-0"
Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.941524 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41dd6b28-b792-4d80-9cd2-3ab8f738be53-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"41dd6b28-b792-4d80-9cd2-3ab8f738be53\") " pod="openstack/nova-cell1-conductor-0"
Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.941655 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41dd6b28-b792-4d80-9cd2-3ab8f738be53-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"41dd6b28-b792-4d80-9cd2-3ab8f738be53\") " pod="openstack/nova-cell1-conductor-0"
Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.941715 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wv8d\" (UniqueName: \"kubernetes.io/projected/41dd6b28-b792-4d80-9cd2-3ab8f738be53-kube-api-access-7wv8d\") pod \"nova-cell1-conductor-0\" (UID: \"41dd6b28-b792-4d80-9cd2-3ab8f738be53\") " pod="openstack/nova-cell1-conductor-0"
Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.947315 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41dd6b28-b792-4d80-9cd2-3ab8f738be53-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"41dd6b28-b792-4d80-9cd2-3ab8f738be53\") " pod="openstack/nova-cell1-conductor-0"
Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.959615 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41dd6b28-b792-4d80-9cd2-3ab8f738be53-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"41dd6b28-b792-4d80-9cd2-3ab8f738be53\") " pod="openstack/nova-cell1-conductor-0"
Feb 02 12:07:32 crc kubenswrapper[4909]: I0202 12:07:32.960156 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wv8d\" (UniqueName: \"kubernetes.io/projected/41dd6b28-b792-4d80-9cd2-3ab8f738be53-kube-api-access-7wv8d\") pod \"nova-cell1-conductor-0\" (UID: \"41dd6b28-b792-4d80-9cd2-3ab8f738be53\") " pod="openstack/nova-cell1-conductor-0"
Feb 02 12:07:33 crc kubenswrapper[4909]: I0202 12:07:33.073638 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 02 12:07:33 crc kubenswrapper[4909]: W0202 12:07:33.576780 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41dd6b28_b792_4d80_9cd2_3ab8f738be53.slice/crio-5326420884cca1e66181c396a42c94bb9e82bc331939135d827e46fc78875aae WatchSource:0}: Error finding container 5326420884cca1e66181c396a42c94bb9e82bc331939135d827e46fc78875aae: Status 404 returned error can't find the container with id 5326420884cca1e66181c396a42c94bb9e82bc331939135d827e46fc78875aae
Feb 02 12:07:33 crc kubenswrapper[4909]: I0202 12:07:33.582999 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 02 12:07:33 crc kubenswrapper[4909]: I0202 12:07:33.657459 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"41dd6b28-b792-4d80-9cd2-3ab8f738be53","Type":"ContainerStarted","Data":"5326420884cca1e66181c396a42c94bb9e82bc331939135d827e46fc78875aae"}
Feb 02 12:07:34 crc kubenswrapper[4909]: I0202 12:07:34.013962 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gwh2v"
Feb 02 12:07:34 crc kubenswrapper[4909]: I0202 12:07:34.016352 4909 scope.go:117] "RemoveContainer" containerID="2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783"
Feb 02 12:07:34 crc kubenswrapper[4909]: E0202 12:07:34.016629 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 12:07:34 crc kubenswrapper[4909]: I0202 12:07:34.166476 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf1731a-52df-489c-a7f3-8f07774153a1-combined-ca-bundle\") pod \"2bf1731a-52df-489c-a7f3-8f07774153a1\" (UID: \"2bf1731a-52df-489c-a7f3-8f07774153a1\") "
Feb 02 12:07:34 crc kubenswrapper[4909]: I0202 12:07:34.166560 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wv458\" (UniqueName: \"kubernetes.io/projected/2bf1731a-52df-489c-a7f3-8f07774153a1-kube-api-access-wv458\") pod \"2bf1731a-52df-489c-a7f3-8f07774153a1\" (UID: \"2bf1731a-52df-489c-a7f3-8f07774153a1\") "
Feb 02 12:07:34 crc kubenswrapper[4909]: I0202 12:07:34.166602 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bf1731a-52df-489c-a7f3-8f07774153a1-config-data\") pod \"2bf1731a-52df-489c-a7f3-8f07774153a1\" (UID: \"2bf1731a-52df-489c-a7f3-8f07774153a1\") "
Feb 02 12:07:34 crc kubenswrapper[4909]: I0202 12:07:34.166633 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bf1731a-52df-489c-a7f3-8f07774153a1-scripts\") pod \"2bf1731a-52df-489c-a7f3-8f07774153a1\" (UID: \"2bf1731a-52df-489c-a7f3-8f07774153a1\") "
Feb 02 12:07:34 crc kubenswrapper[4909]: I0202 12:07:34.172085 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bf1731a-52df-489c-a7f3-8f07774153a1-scripts" (OuterVolumeSpecName: "scripts") pod "2bf1731a-52df-489c-a7f3-8f07774153a1" (UID: "2bf1731a-52df-489c-a7f3-8f07774153a1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 12:07:34 crc kubenswrapper[4909]: I0202 12:07:34.172239 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bf1731a-52df-489c-a7f3-8f07774153a1-kube-api-access-wv458" (OuterVolumeSpecName: "kube-api-access-wv458") pod "2bf1731a-52df-489c-a7f3-8f07774153a1" (UID: "2bf1731a-52df-489c-a7f3-8f07774153a1"). InnerVolumeSpecName "kube-api-access-wv458". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 12:07:34 crc kubenswrapper[4909]: I0202 12:07:34.195598 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bf1731a-52df-489c-a7f3-8f07774153a1-config-data" (OuterVolumeSpecName: "config-data") pod "2bf1731a-52df-489c-a7f3-8f07774153a1" (UID: "2bf1731a-52df-489c-a7f3-8f07774153a1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 12:07:34 crc kubenswrapper[4909]: I0202 12:07:34.201942 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bf1731a-52df-489c-a7f3-8f07774153a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2bf1731a-52df-489c-a7f3-8f07774153a1" (UID: "2bf1731a-52df-489c-a7f3-8f07774153a1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 12:07:34 crc kubenswrapper[4909]: I0202 12:07:34.270176 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bf1731a-52df-489c-a7f3-8f07774153a1-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 12:07:34 crc kubenswrapper[4909]: I0202 12:07:34.270209 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bf1731a-52df-489c-a7f3-8f07774153a1-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 12:07:34 crc kubenswrapper[4909]: I0202 12:07:34.270218 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bf1731a-52df-489c-a7f3-8f07774153a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 12:07:34 crc kubenswrapper[4909]: I0202 12:07:34.270227 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wv458\" (UniqueName: \"kubernetes.io/projected/2bf1731a-52df-489c-a7f3-8f07774153a1-kube-api-access-wv458\") on node \"crc\" DevicePath \"\""
Feb 02 12:07:34 crc kubenswrapper[4909]: I0202 12:07:34.667051 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gwh2v" event={"ID":"2bf1731a-52df-489c-a7f3-8f07774153a1","Type":"ContainerDied","Data":"56ebf8933d509159848434579c0e500adce1bb4e2fd28487f42b06538b92c427"}
Feb 02 12:07:34 crc kubenswrapper[4909]: I0202 12:07:34.667094 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56ebf8933d509159848434579c0e500adce1bb4e2fd28487f42b06538b92c427"
Feb 02 12:07:34 crc kubenswrapper[4909]: I0202 12:07:34.667124 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gwh2v"
Feb 02 12:07:34 crc kubenswrapper[4909]: I0202 12:07:34.668708 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"41dd6b28-b792-4d80-9cd2-3ab8f738be53","Type":"ContainerStarted","Data":"0d1ec12c1908b0801c016824cd56b0a335d90c6309de5877b5eb35631b10f47f"}
Feb 02 12:07:34 crc kubenswrapper[4909]: I0202 12:07:34.669214 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Feb 02 12:07:34 crc kubenswrapper[4909]: I0202 12:07:34.694649 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.694632856 podStartE2EDuration="2.694632856s" podCreationTimestamp="2026-02-02 12:07:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:07:34.686074073 +0000 UTC m=+5780.432174808" watchObservedRunningTime="2026-02-02 12:07:34.694632856 +0000 UTC m=+5780.440733591"
Feb 02 12:07:34 crc kubenswrapper[4909]: I0202 12:07:34.937486 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 02 12:07:34 crc kubenswrapper[4909]: I0202 12:07:34.937733 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7" containerName="nova-scheduler-scheduler" containerID="cri-o://b6a65f1080c0ab6c72a4db50014fe622195670290139bd499d2d6d826b74a6dd" gracePeriod=30
Feb 02 12:07:34 crc kubenswrapper[4909]: I0202 12:07:34.947670 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 02 12:07:34 crc kubenswrapper[4909]: I0202 12:07:34.948050 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ccf67e66-a29f-46d3-8f82-1b1b21211189" containerName="nova-api-log" containerID="cri-o://44c9928a76f06252169ec8da41f7fe6c50bc1d1848ed5fe6dd867101383f49fc" gracePeriod=30
Feb 02 12:07:34 crc kubenswrapper[4909]: I0202 12:07:34.948176 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ccf67e66-a29f-46d3-8f82-1b1b21211189" containerName="nova-api-api" containerID="cri-o://64c09969500bb7b421cdfcbd7ed9710827601fcd712ecc9c1501c68919dc1a4b" gracePeriod=30
Feb 02 12:07:34 crc kubenswrapper[4909]: I0202 12:07:34.968757 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 02 12:07:34 crc kubenswrapper[4909]: I0202 12:07:34.969019 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="70897894-4676-41ac-96a3-ef5c97b052e6" containerName="nova-metadata-log" containerID="cri-o://52168aae2079b0f9e0aebaa6dcc352f265b13b95f0ea7b9877a772ac6141fd22" gracePeriod=30
Feb 02 12:07:34 crc kubenswrapper[4909]: I0202 12:07:34.969181 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="70897894-4676-41ac-96a3-ef5c97b052e6" containerName="nova-metadata-metadata" containerID="cri-o://082c6bbb00cb0a505a9d04763b16e320bfd7b585edaaa078881e9127a177fe27" gracePeriod=30
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.067639 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b98ff868f-ppx5k"
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.140772 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bfdf865-29c4x"]
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.141971 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84bfdf865-29c4x" podUID="b7a3e3af-65cc-4582-82b3-a48ec96caf36" containerName="dnsmasq-dns" containerID="cri-o://d598b01f30982e22dbbf27f2a47faadbadde1e5b10d2b729aff272d027f88110" gracePeriod=10
Feb 02 12:07:35 crc kubenswrapper[4909]: E0202 12:07:35.566376 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7a3e3af_65cc_4582_82b3_a48ec96caf36.slice/crio-conmon-d598b01f30982e22dbbf27f2a47faadbadde1e5b10d2b729aff272d027f88110.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7a3e3af_65cc_4582_82b3_a48ec96caf36.slice/crio-d598b01f30982e22dbbf27f2a47faadbadde1e5b10d2b729aff272d027f88110.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podccf67e66_a29f_46d3_8f82_1b1b21211189.slice/crio-conmon-64c09969500bb7b421cdfcbd7ed9710827601fcd712ecc9c1501c68919dc1a4b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70897894_4676_41ac_96a3_ef5c97b052e6.slice/crio-082c6bbb00cb0a505a9d04763b16e320bfd7b585edaaa078881e9127a177fe27.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podccf67e66_a29f_46d3_8f82_1b1b21211189.slice/crio-64c09969500bb7b421cdfcbd7ed9710827601fcd712ecc9c1501c68919dc1a4b.scope\": RecentStats: unable to find data in memory cache]"
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.714056 4909 generic.go:334] "Generic (PLEG): container finished" podID="70897894-4676-41ac-96a3-ef5c97b052e6" containerID="082c6bbb00cb0a505a9d04763b16e320bfd7b585edaaa078881e9127a177fe27" exitCode=0
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.714097 4909 generic.go:334] "Generic (PLEG): container finished" podID="70897894-4676-41ac-96a3-ef5c97b052e6" containerID="52168aae2079b0f9e0aebaa6dcc352f265b13b95f0ea7b9877a772ac6141fd22" exitCode=143
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.714172 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70897894-4676-41ac-96a3-ef5c97b052e6","Type":"ContainerDied","Data":"082c6bbb00cb0a505a9d04763b16e320bfd7b585edaaa078881e9127a177fe27"}
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.714196 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70897894-4676-41ac-96a3-ef5c97b052e6","Type":"ContainerDied","Data":"52168aae2079b0f9e0aebaa6dcc352f265b13b95f0ea7b9877a772ac6141fd22"}
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.717167 4909 generic.go:334] "Generic (PLEG): container finished" podID="b7a3e3af-65cc-4582-82b3-a48ec96caf36" containerID="d598b01f30982e22dbbf27f2a47faadbadde1e5b10d2b729aff272d027f88110" exitCode=0
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.717292 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bfdf865-29c4x" event={"ID":"b7a3e3af-65cc-4582-82b3-a48ec96caf36","Type":"ContainerDied","Data":"d598b01f30982e22dbbf27f2a47faadbadde1e5b10d2b729aff272d027f88110"}
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.719290 4909 generic.go:334] "Generic (PLEG): container finished" podID="ccf67e66-a29f-46d3-8f82-1b1b21211189" containerID="64c09969500bb7b421cdfcbd7ed9710827601fcd712ecc9c1501c68919dc1a4b" exitCode=0
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.719314 4909 generic.go:334] "Generic (PLEG): container finished" podID="ccf67e66-a29f-46d3-8f82-1b1b21211189" containerID="44c9928a76f06252169ec8da41f7fe6c50bc1d1848ed5fe6dd867101383f49fc" exitCode=143
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.719958 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ccf67e66-a29f-46d3-8f82-1b1b21211189","Type":"ContainerDied","Data":"64c09969500bb7b421cdfcbd7ed9710827601fcd712ecc9c1501c68919dc1a4b"}
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.720027 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ccf67e66-a29f-46d3-8f82-1b1b21211189","Type":"ContainerDied","Data":"44c9928a76f06252169ec8da41f7fe6c50bc1d1848ed5fe6dd867101383f49fc"}
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.720043 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ccf67e66-a29f-46d3-8f82-1b1b21211189","Type":"ContainerDied","Data":"20d39084384efab7ebead9847aa1c22e6f7d9bc994967cb043c697540a8bbf62"}
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.720059 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20d39084384efab7ebead9847aa1c22e6f7d9bc994967cb043c697540a8bbf62"
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.769023 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.867230 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.875336 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bfdf865-29c4x"
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.914612 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccf67e66-a29f-46d3-8f82-1b1b21211189-logs\") pod \"ccf67e66-a29f-46d3-8f82-1b1b21211189\" (UID: \"ccf67e66-a29f-46d3-8f82-1b1b21211189\") "
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.914703 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccf67e66-a29f-46d3-8f82-1b1b21211189-combined-ca-bundle\") pod \"ccf67e66-a29f-46d3-8f82-1b1b21211189\" (UID: \"ccf67e66-a29f-46d3-8f82-1b1b21211189\") "
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.914779 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70897894-4676-41ac-96a3-ef5c97b052e6-combined-ca-bundle\") pod \"70897894-4676-41ac-96a3-ef5c97b052e6\" (UID: \"70897894-4676-41ac-96a3-ef5c97b052e6\") "
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.914994 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/70897894-4676-41ac-96a3-ef5c97b052e6-nova-metadata-tls-certs\") pod \"70897894-4676-41ac-96a3-ef5c97b052e6\" (UID: \"70897894-4676-41ac-96a3-ef5c97b052e6\") "
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.915029 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70897894-4676-41ac-96a3-ef5c97b052e6-config-data\") pod \"70897894-4676-41ac-96a3-ef5c97b052e6\" (UID: \"70897894-4676-41ac-96a3-ef5c97b052e6\") "
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.915215 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nwd9\" (UniqueName: \"kubernetes.io/projected/ccf67e66-a29f-46d3-8f82-1b1b21211189-kube-api-access-6nwd9\") pod \"ccf67e66-a29f-46d3-8f82-1b1b21211189\" (UID: \"ccf67e66-a29f-46d3-8f82-1b1b21211189\") "
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.915260 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccf67e66-a29f-46d3-8f82-1b1b21211189-config-data\") pod \"ccf67e66-a29f-46d3-8f82-1b1b21211189\" (UID: \"ccf67e66-a29f-46d3-8f82-1b1b21211189\") "
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.923076 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccf67e66-a29f-46d3-8f82-1b1b21211189-logs" (OuterVolumeSpecName: "logs") pod "ccf67e66-a29f-46d3-8f82-1b1b21211189" (UID: "ccf67e66-a29f-46d3-8f82-1b1b21211189"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.932747 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccf67e66-a29f-46d3-8f82-1b1b21211189-kube-api-access-6nwd9" (OuterVolumeSpecName: "kube-api-access-6nwd9") pod "ccf67e66-a29f-46d3-8f82-1b1b21211189" (UID: "ccf67e66-a29f-46d3-8f82-1b1b21211189"). InnerVolumeSpecName "kube-api-access-6nwd9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.950817 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70897894-4676-41ac-96a3-ef5c97b052e6-config-data" (OuterVolumeSpecName: "config-data") pod "70897894-4676-41ac-96a3-ef5c97b052e6" (UID: "70897894-4676-41ac-96a3-ef5c97b052e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.962539 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70897894-4676-41ac-96a3-ef5c97b052e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70897894-4676-41ac-96a3-ef5c97b052e6" (UID: "70897894-4676-41ac-96a3-ef5c97b052e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.962910 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccf67e66-a29f-46d3-8f82-1b1b21211189-config-data" (OuterVolumeSpecName: "config-data") pod "ccf67e66-a29f-46d3-8f82-1b1b21211189" (UID: "ccf67e66-a29f-46d3-8f82-1b1b21211189"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.971669 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccf67e66-a29f-46d3-8f82-1b1b21211189-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ccf67e66-a29f-46d3-8f82-1b1b21211189" (UID: "ccf67e66-a29f-46d3-8f82-1b1b21211189"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 12:07:35 crc kubenswrapper[4909]: I0202 12:07:35.996709 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70897894-4676-41ac-96a3-ef5c97b052e6-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "70897894-4676-41ac-96a3-ef5c97b052e6" (UID: "70897894-4676-41ac-96a3-ef5c97b052e6"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.017175 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7a3e3af-65cc-4582-82b3-a48ec96caf36-ovsdbserver-sb\") pod \"b7a3e3af-65cc-4582-82b3-a48ec96caf36\" (UID: \"b7a3e3af-65cc-4582-82b3-a48ec96caf36\") "
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.017256 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7q9qt\" (UniqueName: \"kubernetes.io/projected/b7a3e3af-65cc-4582-82b3-a48ec96caf36-kube-api-access-7q9qt\") pod \"b7a3e3af-65cc-4582-82b3-a48ec96caf36\" (UID: \"b7a3e3af-65cc-4582-82b3-a48ec96caf36\") "
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.017301 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7a3e3af-65cc-4582-82b3-a48ec96caf36-dns-svc\") pod \"b7a3e3af-65cc-4582-82b3-a48ec96caf36\" (UID: \"b7a3e3af-65cc-4582-82b3-a48ec96caf36\") "
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.017327 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7a3e3af-65cc-4582-82b3-a48ec96caf36-config\") pod \"b7a3e3af-65cc-4582-82b3-a48ec96caf36\" (UID: \"b7a3e3af-65cc-4582-82b3-a48ec96caf36\") "
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.017472 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llft7\" (UniqueName: \"kubernetes.io/projected/70897894-4676-41ac-96a3-ef5c97b052e6-kube-api-access-llft7\") pod \"70897894-4676-41ac-96a3-ef5c97b052e6\" (UID: \"70897894-4676-41ac-96a3-ef5c97b052e6\") "
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.017508 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7a3e3af-65cc-4582-82b3-a48ec96caf36-ovsdbserver-nb\") pod \"b7a3e3af-65cc-4582-82b3-a48ec96caf36\" (UID: \"b7a3e3af-65cc-4582-82b3-a48ec96caf36\") "
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.017566 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70897894-4676-41ac-96a3-ef5c97b052e6-logs\") pod \"70897894-4676-41ac-96a3-ef5c97b052e6\" (UID: \"70897894-4676-41ac-96a3-ef5c97b052e6\") "
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.018184 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccf67e66-a29f-46d3-8f82-1b1b21211189-logs\") on node \"crc\" DevicePath \"\""
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.018211 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccf67e66-a29f-46d3-8f82-1b1b21211189-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.018225 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70897894-4676-41ac-96a3-ef5c97b052e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.018239 4909 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/70897894-4676-41ac-96a3-ef5c97b052e6-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.018252 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70897894-4676-41ac-96a3-ef5c97b052e6-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.018262 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nwd9\" (UniqueName: \"kubernetes.io/projected/ccf67e66-a29f-46d3-8f82-1b1b21211189-kube-api-access-6nwd9\") on node \"crc\" DevicePath \"\""
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.018272 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccf67e66-a29f-46d3-8f82-1b1b21211189-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.019125 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70897894-4676-41ac-96a3-ef5c97b052e6-logs" (OuterVolumeSpecName: "logs") pod "70897894-4676-41ac-96a3-ef5c97b052e6" (UID: "70897894-4676-41ac-96a3-ef5c97b052e6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.021950 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70897894-4676-41ac-96a3-ef5c97b052e6-kube-api-access-llft7" (OuterVolumeSpecName: "kube-api-access-llft7") pod "70897894-4676-41ac-96a3-ef5c97b052e6" (UID: "70897894-4676-41ac-96a3-ef5c97b052e6"). InnerVolumeSpecName "kube-api-access-llft7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.027566 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7a3e3af-65cc-4582-82b3-a48ec96caf36-kube-api-access-7q9qt" (OuterVolumeSpecName: "kube-api-access-7q9qt") pod "b7a3e3af-65cc-4582-82b3-a48ec96caf36" (UID: "b7a3e3af-65cc-4582-82b3-a48ec96caf36"). InnerVolumeSpecName "kube-api-access-7q9qt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.036607 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.069659 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7a3e3af-65cc-4582-82b3-a48ec96caf36-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b7a3e3af-65cc-4582-82b3-a48ec96caf36" (UID: "b7a3e3af-65cc-4582-82b3-a48ec96caf36"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.073233 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7a3e3af-65cc-4582-82b3-a48ec96caf36-config" (OuterVolumeSpecName: "config") pod "b7a3e3af-65cc-4582-82b3-a48ec96caf36" (UID: "b7a3e3af-65cc-4582-82b3-a48ec96caf36"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.073687 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7a3e3af-65cc-4582-82b3-a48ec96caf36-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b7a3e3af-65cc-4582-82b3-a48ec96caf36" (UID: "b7a3e3af-65cc-4582-82b3-a48ec96caf36"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.084591 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7a3e3af-65cc-4582-82b3-a48ec96caf36-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b7a3e3af-65cc-4582-82b3-a48ec96caf36" (UID: "b7a3e3af-65cc-4582-82b3-a48ec96caf36"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.120603 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7q9qt\" (UniqueName: \"kubernetes.io/projected/b7a3e3af-65cc-4582-82b3-a48ec96caf36-kube-api-access-7q9qt\") on node \"crc\" DevicePath \"\""
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.120634 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7a3e3af-65cc-4582-82b3-a48ec96caf36-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.120644 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7a3e3af-65cc-4582-82b3-a48ec96caf36-config\") on node \"crc\" DevicePath \"\""
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.120653 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llft7\" (UniqueName: \"kubernetes.io/projected/70897894-4676-41ac-96a3-ef5c97b052e6-kube-api-access-llft7\") on node \"crc\" DevicePath \"\""
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.120665 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b7a3e3af-65cc-4582-82b3-a48ec96caf36-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.120674 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/70897894-4676-41ac-96a3-ef5c97b052e6-logs\") on node \"crc\" DevicePath \"\""
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.120682 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b7a3e3af-65cc-4582-82b3-a48ec96caf36-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.730340 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bfdf865-29c4x" event={"ID":"b7a3e3af-65cc-4582-82b3-a48ec96caf36","Type":"ContainerDied","Data":"abebce6a2e59f3275884626cc68b775d8e3c993764602958622204d8530fa16d"}
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.730395 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bfdf865-29c4x"
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.730533 4909 scope.go:117] "RemoveContainer" containerID="d598b01f30982e22dbbf27f2a47faadbadde1e5b10d2b729aff272d027f88110"
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.732625 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"70897894-4676-41ac-96a3-ef5c97b052e6","Type":"ContainerDied","Data":"3bc7db43901878125a8ee721eed6c44ebec1bd983a33d07e3a45446cb1673464"}
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.732653 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.732669 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.753374 4909 scope.go:117] "RemoveContainer" containerID="f7019e24e5004306825e47e31e8ab15c341d411bc394732c14250645eb0a5350"
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.778436 4909 scope.go:117] "RemoveContainer" containerID="082c6bbb00cb0a505a9d04763b16e320bfd7b585edaaa078881e9127a177fe27"
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.817065 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bfdf865-29c4x"]
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.846334 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84bfdf865-29c4x"]
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.846475 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.855893 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.863312 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.871649 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 02 12:07:36 crc kubenswrapper[4909]: E0202 12:07:36.872069 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a3e3af-65cc-4582-82b3-a48ec96caf36" containerName="init"
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.872094 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a3e3af-65cc-4582-82b3-a48ec96caf36" containerName="init"
Feb 02 12:07:36 crc kubenswrapper[4909]: E0202 12:07:36.872111 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccf67e66-a29f-46d3-8f82-1b1b21211189" containerName="nova-api-api"
Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.872124 4909 state_mem.go:107] "Deleted
CPUSet assignment" podUID="ccf67e66-a29f-46d3-8f82-1b1b21211189" containerName="nova-api-api" Feb 02 12:07:36 crc kubenswrapper[4909]: E0202 12:07:36.872144 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70897894-4676-41ac-96a3-ef5c97b052e6" containerName="nova-metadata-metadata" Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.872153 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="70897894-4676-41ac-96a3-ef5c97b052e6" containerName="nova-metadata-metadata" Feb 02 12:07:36 crc kubenswrapper[4909]: E0202 12:07:36.872173 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70897894-4676-41ac-96a3-ef5c97b052e6" containerName="nova-metadata-log" Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.872181 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="70897894-4676-41ac-96a3-ef5c97b052e6" containerName="nova-metadata-log" Feb 02 12:07:36 crc kubenswrapper[4909]: E0202 12:07:36.872196 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bf1731a-52df-489c-a7f3-8f07774153a1" containerName="nova-manage" Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.872203 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bf1731a-52df-489c-a7f3-8f07774153a1" containerName="nova-manage" Feb 02 12:07:36 crc kubenswrapper[4909]: E0202 12:07:36.872212 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccf67e66-a29f-46d3-8f82-1b1b21211189" containerName="nova-api-log" Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.872219 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccf67e66-a29f-46d3-8f82-1b1b21211189" containerName="nova-api-log" Feb 02 12:07:36 crc kubenswrapper[4909]: E0202 12:07:36.872231 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a3e3af-65cc-4582-82b3-a48ec96caf36" containerName="dnsmasq-dns" Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.872239 4909 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="b7a3e3af-65cc-4582-82b3-a48ec96caf36" containerName="dnsmasq-dns" Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.872463 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7a3e3af-65cc-4582-82b3-a48ec96caf36" containerName="dnsmasq-dns" Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.872486 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bf1731a-52df-489c-a7f3-8f07774153a1" containerName="nova-manage" Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.872497 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccf67e66-a29f-46d3-8f82-1b1b21211189" containerName="nova-api-api" Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.872508 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="70897894-4676-41ac-96a3-ef5c97b052e6" containerName="nova-metadata-metadata" Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.872519 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="70897894-4676-41ac-96a3-ef5c97b052e6" containerName="nova-metadata-log" Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.872532 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccf67e66-a29f-46d3-8f82-1b1b21211189" containerName="nova-api-log" Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.873750 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.875087 4909 scope.go:117] "RemoveContainer" containerID="52168aae2079b0f9e0aebaa6dcc352f265b13b95f0ea7b9877a772ac6141fd22" Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.886164 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.907674 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.922799 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.931940 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.934457 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.935330 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqx76\" (UniqueName: \"kubernetes.io/projected/9f539212-959e-4d6a-98b7-c4783ed1b735-kube-api-access-mqx76\") pod \"nova-api-0\" (UID: \"9f539212-959e-4d6a-98b7-c4783ed1b735\") " pod="openstack/nova-api-0" Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.935513 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f539212-959e-4d6a-98b7-c4783ed1b735-config-data\") pod \"nova-api-0\" (UID: \"9f539212-959e-4d6a-98b7-c4783ed1b735\") " pod="openstack/nova-api-0" Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.935655 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9f539212-959e-4d6a-98b7-c4783ed1b735-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9f539212-959e-4d6a-98b7-c4783ed1b735\") " pod="openstack/nova-api-0" Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.935729 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f539212-959e-4d6a-98b7-c4783ed1b735-logs\") pod \"nova-api-0\" (UID: \"9f539212-959e-4d6a-98b7-c4783ed1b735\") " pod="openstack/nova-api-0" Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.937254 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.937587 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 02 12:07:36 crc kubenswrapper[4909]: I0202 12:07:36.944748 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.028568 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70897894-4676-41ac-96a3-ef5c97b052e6" path="/var/lib/kubelet/pods/70897894-4676-41ac-96a3-ef5c97b052e6/volumes" Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.029519 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7a3e3af-65cc-4582-82b3-a48ec96caf36" path="/var/lib/kubelet/pods/b7a3e3af-65cc-4582-82b3-a48ec96caf36/volumes" Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.031235 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccf67e66-a29f-46d3-8f82-1b1b21211189" path="/var/lib/kubelet/pods/ccf67e66-a29f-46d3-8f82-1b1b21211189/volumes" Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.036894 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-config-data\") pod \"nova-metadata-0\" (UID: \"c98e76b9-03c2-42e5-a71f-bdcc385c68c8\") " pod="openstack/nova-metadata-0" Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.037078 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-logs\") pod \"nova-metadata-0\" (UID: \"c98e76b9-03c2-42e5-a71f-bdcc385c68c8\") " pod="openstack/nova-metadata-0" Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.037214 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqx76\" (UniqueName: \"kubernetes.io/projected/9f539212-959e-4d6a-98b7-c4783ed1b735-kube-api-access-mqx76\") pod \"nova-api-0\" (UID: \"9f539212-959e-4d6a-98b7-c4783ed1b735\") " pod="openstack/nova-api-0" Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.037367 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk7wl\" (UniqueName: \"kubernetes.io/projected/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-kube-api-access-fk7wl\") pod \"nova-metadata-0\" (UID: \"c98e76b9-03c2-42e5-a71f-bdcc385c68c8\") " pod="openstack/nova-metadata-0" Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.037483 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f539212-959e-4d6a-98b7-c4783ed1b735-config-data\") pod \"nova-api-0\" (UID: \"9f539212-959e-4d6a-98b7-c4783ed1b735\") " pod="openstack/nova-api-0" Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.037581 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"c98e76b9-03c2-42e5-a71f-bdcc385c68c8\") " pod="openstack/nova-metadata-0" Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.037728 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f539212-959e-4d6a-98b7-c4783ed1b735-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9f539212-959e-4d6a-98b7-c4783ed1b735\") " pod="openstack/nova-api-0" Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.037841 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f539212-959e-4d6a-98b7-c4783ed1b735-logs\") pod \"nova-api-0\" (UID: \"9f539212-959e-4d6a-98b7-c4783ed1b735\") " pod="openstack/nova-api-0" Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.037981 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c98e76b9-03c2-42e5-a71f-bdcc385c68c8\") " pod="openstack/nova-metadata-0" Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.038273 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f539212-959e-4d6a-98b7-c4783ed1b735-logs\") pod \"nova-api-0\" (UID: \"9f539212-959e-4d6a-98b7-c4783ed1b735\") " pod="openstack/nova-api-0" Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.052888 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f539212-959e-4d6a-98b7-c4783ed1b735-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9f539212-959e-4d6a-98b7-c4783ed1b735\") " pod="openstack/nova-api-0" Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.055460 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqx76\" 
(UniqueName: \"kubernetes.io/projected/9f539212-959e-4d6a-98b7-c4783ed1b735-kube-api-access-mqx76\") pod \"nova-api-0\" (UID: \"9f539212-959e-4d6a-98b7-c4783ed1b735\") " pod="openstack/nova-api-0" Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.056351 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f539212-959e-4d6a-98b7-c4783ed1b735-config-data\") pod \"nova-api-0\" (UID: \"9f539212-959e-4d6a-98b7-c4783ed1b735\") " pod="openstack/nova-api-0" Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.140025 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-config-data\") pod \"nova-metadata-0\" (UID: \"c98e76b9-03c2-42e5-a71f-bdcc385c68c8\") " pod="openstack/nova-metadata-0" Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.140132 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-logs\") pod \"nova-metadata-0\" (UID: \"c98e76b9-03c2-42e5-a71f-bdcc385c68c8\") " pod="openstack/nova-metadata-0" Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.140227 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk7wl\" (UniqueName: \"kubernetes.io/projected/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-kube-api-access-fk7wl\") pod \"nova-metadata-0\" (UID: \"c98e76b9-03c2-42e5-a71f-bdcc385c68c8\") " pod="openstack/nova-metadata-0" Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.140245 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c98e76b9-03c2-42e5-a71f-bdcc385c68c8\") " pod="openstack/nova-metadata-0" Feb 02 
12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.140349 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c98e76b9-03c2-42e5-a71f-bdcc385c68c8\") " pod="openstack/nova-metadata-0" Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.141326 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-logs\") pod \"nova-metadata-0\" (UID: \"c98e76b9-03c2-42e5-a71f-bdcc385c68c8\") " pod="openstack/nova-metadata-0" Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.143409 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-config-data\") pod \"nova-metadata-0\" (UID: \"c98e76b9-03c2-42e5-a71f-bdcc385c68c8\") " pod="openstack/nova-metadata-0" Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.144487 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c98e76b9-03c2-42e5-a71f-bdcc385c68c8\") " pod="openstack/nova-metadata-0" Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.145429 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c98e76b9-03c2-42e5-a71f-bdcc385c68c8\") " pod="openstack/nova-metadata-0" Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.157099 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk7wl\" (UniqueName: 
\"kubernetes.io/projected/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-kube-api-access-fk7wl\") pod \"nova-metadata-0\" (UID: \"c98e76b9-03c2-42e5-a71f-bdcc385c68c8\") " pod="openstack/nova-metadata-0" Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.199990 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.257906 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.472905 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 12:07:37 crc kubenswrapper[4909]: W0202 12:07:37.474519 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f539212_959e_4d6a_98b7_c4783ed1b735.slice/crio-1f4fe377ed98b494a4311dad52bccffd8bceda34eebfcd72b0dedbea32c29f31 WatchSource:0}: Error finding container 1f4fe377ed98b494a4311dad52bccffd8bceda34eebfcd72b0dedbea32c29f31: Status 404 returned error can't find the container with id 1f4fe377ed98b494a4311dad52bccffd8bceda34eebfcd72b0dedbea32c29f31 Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.749197 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9f539212-959e-4d6a-98b7-c4783ed1b735","Type":"ContainerStarted","Data":"1f4fe377ed98b494a4311dad52bccffd8bceda34eebfcd72b0dedbea32c29f31"} Feb 02 12:07:37 crc kubenswrapper[4909]: I0202 12:07:37.816515 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 12:07:38 crc kubenswrapper[4909]: I0202 12:07:38.106977 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 02 12:07:38 crc kubenswrapper[4909]: I0202 12:07:38.761854 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"9f539212-959e-4d6a-98b7-c4783ed1b735","Type":"ContainerStarted","Data":"a8078d608066325839dc973a63fc06c8076fd9ec0c11dd517db371ccc21d81fa"} Feb 02 12:07:38 crc kubenswrapper[4909]: I0202 12:07:38.761923 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9f539212-959e-4d6a-98b7-c4783ed1b735","Type":"ContainerStarted","Data":"2378f1fcf9f9b92c1fc2480529688d3985aa3da5b166fdffd563e3128212af98"} Feb 02 12:07:38 crc kubenswrapper[4909]: I0202 12:07:38.765379 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c98e76b9-03c2-42e5-a71f-bdcc385c68c8","Type":"ContainerStarted","Data":"f9a4a9fbab78ab7164303aa1b42bfd914ccaedd1e12f285a006a5e11cc97dea8"} Feb 02 12:07:38 crc kubenswrapper[4909]: I0202 12:07:38.765438 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c98e76b9-03c2-42e5-a71f-bdcc385c68c8","Type":"ContainerStarted","Data":"de4e19cb34c32422e90659334a05c6d24934264f25d54f77fb6b867a9d52dd62"} Feb 02 12:07:38 crc kubenswrapper[4909]: I0202 12:07:38.765450 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c98e76b9-03c2-42e5-a71f-bdcc385c68c8","Type":"ContainerStarted","Data":"373a260647c749d2524dcd430e91ce3062dad16c542e16c23252f3bb4d3abe16"} Feb 02 12:07:38 crc kubenswrapper[4909]: I0202 12:07:38.788761 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.788741621 podStartE2EDuration="2.788741621s" podCreationTimestamp="2026-02-02 12:07:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:07:38.780344402 +0000 UTC m=+5784.526445127" watchObservedRunningTime="2026-02-02 12:07:38.788741621 +0000 UTC m=+5784.534842356" Feb 02 12:07:38 crc kubenswrapper[4909]: I0202 12:07:38.806566 4909 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.806545286 podStartE2EDuration="2.806545286s" podCreationTimestamp="2026-02-02 12:07:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:07:38.802211953 +0000 UTC m=+5784.548312698" watchObservedRunningTime="2026-02-02 12:07:38.806545286 +0000 UTC m=+5784.552646021" Feb 02 12:07:41 crc kubenswrapper[4909]: I0202 12:07:41.036430 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:41 crc kubenswrapper[4909]: I0202 12:07:41.055455 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:41 crc kubenswrapper[4909]: I0202 12:07:41.803643 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 02 12:07:41 crc kubenswrapper[4909]: I0202 12:07:41.964650 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-p47c9"] Feb 02 12:07:41 crc kubenswrapper[4909]: I0202 12:07:41.966067 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-p47c9" Feb 02 12:07:41 crc kubenswrapper[4909]: I0202 12:07:41.968657 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 02 12:07:41 crc kubenswrapper[4909]: I0202 12:07:41.968694 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 02 12:07:41 crc kubenswrapper[4909]: I0202 12:07:41.979095 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-p47c9"] Feb 02 12:07:42 crc kubenswrapper[4909]: I0202 12:07:42.027100 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c373ff3-6de7-4d89-be4c-b79c826a5d05-config-data\") pod \"nova-cell1-cell-mapping-p47c9\" (UID: \"1c373ff3-6de7-4d89-be4c-b79c826a5d05\") " pod="openstack/nova-cell1-cell-mapping-p47c9" Feb 02 12:07:42 crc kubenswrapper[4909]: I0202 12:07:42.027179 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c373ff3-6de7-4d89-be4c-b79c826a5d05-scripts\") pod \"nova-cell1-cell-mapping-p47c9\" (UID: \"1c373ff3-6de7-4d89-be4c-b79c826a5d05\") " pod="openstack/nova-cell1-cell-mapping-p47c9" Feb 02 12:07:42 crc kubenswrapper[4909]: I0202 12:07:42.027203 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c373ff3-6de7-4d89-be4c-b79c826a5d05-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-p47c9\" (UID: \"1c373ff3-6de7-4d89-be4c-b79c826a5d05\") " pod="openstack/nova-cell1-cell-mapping-p47c9" Feb 02 12:07:42 crc kubenswrapper[4909]: I0202 12:07:42.027241 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tgfq\" (UniqueName: 
\"kubernetes.io/projected/1c373ff3-6de7-4d89-be4c-b79c826a5d05-kube-api-access-2tgfq\") pod \"nova-cell1-cell-mapping-p47c9\" (UID: \"1c373ff3-6de7-4d89-be4c-b79c826a5d05\") " pod="openstack/nova-cell1-cell-mapping-p47c9" Feb 02 12:07:42 crc kubenswrapper[4909]: I0202 12:07:42.129556 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c373ff3-6de7-4d89-be4c-b79c826a5d05-scripts\") pod \"nova-cell1-cell-mapping-p47c9\" (UID: \"1c373ff3-6de7-4d89-be4c-b79c826a5d05\") " pod="openstack/nova-cell1-cell-mapping-p47c9" Feb 02 12:07:42 crc kubenswrapper[4909]: I0202 12:07:42.129613 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c373ff3-6de7-4d89-be4c-b79c826a5d05-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-p47c9\" (UID: \"1c373ff3-6de7-4d89-be4c-b79c826a5d05\") " pod="openstack/nova-cell1-cell-mapping-p47c9" Feb 02 12:07:42 crc kubenswrapper[4909]: I0202 12:07:42.129671 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tgfq\" (UniqueName: \"kubernetes.io/projected/1c373ff3-6de7-4d89-be4c-b79c826a5d05-kube-api-access-2tgfq\") pod \"nova-cell1-cell-mapping-p47c9\" (UID: \"1c373ff3-6de7-4d89-be4c-b79c826a5d05\") " pod="openstack/nova-cell1-cell-mapping-p47c9" Feb 02 12:07:42 crc kubenswrapper[4909]: I0202 12:07:42.129926 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c373ff3-6de7-4d89-be4c-b79c826a5d05-config-data\") pod \"nova-cell1-cell-mapping-p47c9\" (UID: \"1c373ff3-6de7-4d89-be4c-b79c826a5d05\") " pod="openstack/nova-cell1-cell-mapping-p47c9" Feb 02 12:07:42 crc kubenswrapper[4909]: I0202 12:07:42.137953 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1c373ff3-6de7-4d89-be4c-b79c826a5d05-scripts\") pod \"nova-cell1-cell-mapping-p47c9\" (UID: \"1c373ff3-6de7-4d89-be4c-b79c826a5d05\") " pod="openstack/nova-cell1-cell-mapping-p47c9" Feb 02 12:07:42 crc kubenswrapper[4909]: I0202 12:07:42.138161 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c373ff3-6de7-4d89-be4c-b79c826a5d05-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-p47c9\" (UID: \"1c373ff3-6de7-4d89-be4c-b79c826a5d05\") " pod="openstack/nova-cell1-cell-mapping-p47c9" Feb 02 12:07:42 crc kubenswrapper[4909]: I0202 12:07:42.138271 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c373ff3-6de7-4d89-be4c-b79c826a5d05-config-data\") pod \"nova-cell1-cell-mapping-p47c9\" (UID: \"1c373ff3-6de7-4d89-be4c-b79c826a5d05\") " pod="openstack/nova-cell1-cell-mapping-p47c9" Feb 02 12:07:42 crc kubenswrapper[4909]: I0202 12:07:42.150070 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tgfq\" (UniqueName: \"kubernetes.io/projected/1c373ff3-6de7-4d89-be4c-b79c826a5d05-kube-api-access-2tgfq\") pod \"nova-cell1-cell-mapping-p47c9\" (UID: \"1c373ff3-6de7-4d89-be4c-b79c826a5d05\") " pod="openstack/nova-cell1-cell-mapping-p47c9" Feb 02 12:07:42 crc kubenswrapper[4909]: I0202 12:07:42.258360 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 02 12:07:42 crc kubenswrapper[4909]: I0202 12:07:42.259574 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 02 12:07:42 crc kubenswrapper[4909]: I0202 12:07:42.294543 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-p47c9" Feb 02 12:07:42 crc kubenswrapper[4909]: I0202 12:07:42.758619 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-p47c9"] Feb 02 12:07:42 crc kubenswrapper[4909]: W0202 12:07:42.762000 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c373ff3_6de7_4d89_be4c_b79c826a5d05.slice/crio-b12d5728c02b532c9fd379f81033ed365143d9045459940ae4d5c286a2f7f4da WatchSource:0}: Error finding container b12d5728c02b532c9fd379f81033ed365143d9045459940ae4d5c286a2f7f4da: Status 404 returned error can't find the container with id b12d5728c02b532c9fd379f81033ed365143d9045459940ae4d5c286a2f7f4da Feb 02 12:07:42 crc kubenswrapper[4909]: I0202 12:07:42.798338 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-p47c9" event={"ID":"1c373ff3-6de7-4d89-be4c-b79c826a5d05","Type":"ContainerStarted","Data":"b12d5728c02b532c9fd379f81033ed365143d9045459940ae4d5c286a2f7f4da"} Feb 02 12:07:43 crc kubenswrapper[4909]: I0202 12:07:43.808195 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-p47c9" event={"ID":"1c373ff3-6de7-4d89-be4c-b79c826a5d05","Type":"ContainerStarted","Data":"bf69fd00c80753a26faf7b00cc5e277be0917380cad15a3769e68b8edc6c98dd"} Feb 02 12:07:43 crc kubenswrapper[4909]: I0202 12:07:43.830245 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-p47c9" podStartSLOduration=2.830227556 podStartE2EDuration="2.830227556s" podCreationTimestamp="2026-02-02 12:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:07:43.821697394 +0000 UTC m=+5789.567798129" watchObservedRunningTime="2026-02-02 12:07:43.830227556 +0000 UTC m=+5789.576328291" Feb 02 12:07:46 crc 
kubenswrapper[4909]: I0202 12:07:46.016724 4909 scope.go:117] "RemoveContainer" containerID="2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783" Feb 02 12:07:46 crc kubenswrapper[4909]: E0202 12:07:46.017471 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:07:47 crc kubenswrapper[4909]: I0202 12:07:47.200379 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 12:07:47 crc kubenswrapper[4909]: I0202 12:07:47.200461 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 12:07:47 crc kubenswrapper[4909]: I0202 12:07:47.258679 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 02 12:07:47 crc kubenswrapper[4909]: I0202 12:07:47.259127 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 02 12:07:47 crc kubenswrapper[4909]: I0202 12:07:47.864087 4909 generic.go:334] "Generic (PLEG): container finished" podID="1c373ff3-6de7-4d89-be4c-b79c826a5d05" containerID="bf69fd00c80753a26faf7b00cc5e277be0917380cad15a3769e68b8edc6c98dd" exitCode=0 Feb 02 12:07:47 crc kubenswrapper[4909]: I0202 12:07:47.864207 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-p47c9" event={"ID":"1c373ff3-6de7-4d89-be4c-b79c826a5d05","Type":"ContainerDied","Data":"bf69fd00c80753a26faf7b00cc5e277be0917380cad15a3769e68b8edc6c98dd"} Feb 02 12:07:48 crc kubenswrapper[4909]: I0202 12:07:48.283098 4909 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-api-0" podUID="9f539212-959e-4d6a-98b7-c4783ed1b735" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.86:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 02 12:07:48 crc kubenswrapper[4909]: I0202 12:07:48.296040 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9f539212-959e-4d6a-98b7-c4783ed1b735" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.86:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 02 12:07:48 crc kubenswrapper[4909]: I0202 12:07:48.296094 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c98e76b9-03c2-42e5-a71f-bdcc385c68c8" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.87:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 02 12:07:48 crc kubenswrapper[4909]: I0202 12:07:48.296074 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c98e76b9-03c2-42e5-a71f-bdcc385c68c8" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.87:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 02 12:07:49 crc kubenswrapper[4909]: I0202 12:07:49.342013 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-p47c9" Feb 02 12:07:49 crc kubenswrapper[4909]: I0202 12:07:49.476722 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c373ff3-6de7-4d89-be4c-b79c826a5d05-config-data\") pod \"1c373ff3-6de7-4d89-be4c-b79c826a5d05\" (UID: \"1c373ff3-6de7-4d89-be4c-b79c826a5d05\") " Feb 02 12:07:49 crc kubenswrapper[4909]: I0202 12:07:49.477109 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tgfq\" (UniqueName: \"kubernetes.io/projected/1c373ff3-6de7-4d89-be4c-b79c826a5d05-kube-api-access-2tgfq\") pod \"1c373ff3-6de7-4d89-be4c-b79c826a5d05\" (UID: \"1c373ff3-6de7-4d89-be4c-b79c826a5d05\") " Feb 02 12:07:49 crc kubenswrapper[4909]: I0202 12:07:49.477246 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c373ff3-6de7-4d89-be4c-b79c826a5d05-combined-ca-bundle\") pod \"1c373ff3-6de7-4d89-be4c-b79c826a5d05\" (UID: \"1c373ff3-6de7-4d89-be4c-b79c826a5d05\") " Feb 02 12:07:49 crc kubenswrapper[4909]: I0202 12:07:49.477387 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c373ff3-6de7-4d89-be4c-b79c826a5d05-scripts\") pod \"1c373ff3-6de7-4d89-be4c-b79c826a5d05\" (UID: \"1c373ff3-6de7-4d89-be4c-b79c826a5d05\") " Feb 02 12:07:49 crc kubenswrapper[4909]: I0202 12:07:49.483059 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c373ff3-6de7-4d89-be4c-b79c826a5d05-scripts" (OuterVolumeSpecName: "scripts") pod "1c373ff3-6de7-4d89-be4c-b79c826a5d05" (UID: "1c373ff3-6de7-4d89-be4c-b79c826a5d05"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:07:49 crc kubenswrapper[4909]: I0202 12:07:49.483275 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c373ff3-6de7-4d89-be4c-b79c826a5d05-kube-api-access-2tgfq" (OuterVolumeSpecName: "kube-api-access-2tgfq") pod "1c373ff3-6de7-4d89-be4c-b79c826a5d05" (UID: "1c373ff3-6de7-4d89-be4c-b79c826a5d05"). InnerVolumeSpecName "kube-api-access-2tgfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:07:49 crc kubenswrapper[4909]: I0202 12:07:49.505398 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c373ff3-6de7-4d89-be4c-b79c826a5d05-config-data" (OuterVolumeSpecName: "config-data") pod "1c373ff3-6de7-4d89-be4c-b79c826a5d05" (UID: "1c373ff3-6de7-4d89-be4c-b79c826a5d05"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:07:49 crc kubenswrapper[4909]: I0202 12:07:49.516920 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c373ff3-6de7-4d89-be4c-b79c826a5d05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c373ff3-6de7-4d89-be4c-b79c826a5d05" (UID: "1c373ff3-6de7-4d89-be4c-b79c826a5d05"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:07:49 crc kubenswrapper[4909]: I0202 12:07:49.579914 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c373ff3-6de7-4d89-be4c-b79c826a5d05-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:07:49 crc kubenswrapper[4909]: I0202 12:07:49.579952 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tgfq\" (UniqueName: \"kubernetes.io/projected/1c373ff3-6de7-4d89-be4c-b79c826a5d05-kube-api-access-2tgfq\") on node \"crc\" DevicePath \"\"" Feb 02 12:07:49 crc kubenswrapper[4909]: I0202 12:07:49.579963 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c373ff3-6de7-4d89-be4c-b79c826a5d05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:07:49 crc kubenswrapper[4909]: I0202 12:07:49.579973 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c373ff3-6de7-4d89-be4c-b79c826a5d05-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:07:49 crc kubenswrapper[4909]: I0202 12:07:49.894981 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-p47c9" event={"ID":"1c373ff3-6de7-4d89-be4c-b79c826a5d05","Type":"ContainerDied","Data":"b12d5728c02b532c9fd379f81033ed365143d9045459940ae4d5c286a2f7f4da"} Feb 02 12:07:49 crc kubenswrapper[4909]: I0202 12:07:49.895030 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b12d5728c02b532c9fd379f81033ed365143d9045459940ae4d5c286a2f7f4da" Feb 02 12:07:49 crc kubenswrapper[4909]: I0202 12:07:49.895101 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-p47c9" Feb 02 12:07:50 crc kubenswrapper[4909]: I0202 12:07:50.065368 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 12:07:50 crc kubenswrapper[4909]: I0202 12:07:50.065619 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9f539212-959e-4d6a-98b7-c4783ed1b735" containerName="nova-api-log" containerID="cri-o://2378f1fcf9f9b92c1fc2480529688d3985aa3da5b166fdffd563e3128212af98" gracePeriod=30 Feb 02 12:07:50 crc kubenswrapper[4909]: I0202 12:07:50.065738 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9f539212-959e-4d6a-98b7-c4783ed1b735" containerName="nova-api-api" containerID="cri-o://a8078d608066325839dc973a63fc06c8076fd9ec0c11dd517db371ccc21d81fa" gracePeriod=30 Feb 02 12:07:50 crc kubenswrapper[4909]: I0202 12:07:50.151222 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 12:07:50 crc kubenswrapper[4909]: I0202 12:07:50.151481 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c98e76b9-03c2-42e5-a71f-bdcc385c68c8" containerName="nova-metadata-log" containerID="cri-o://de4e19cb34c32422e90659334a05c6d24934264f25d54f77fb6b867a9d52dd62" gracePeriod=30 Feb 02 12:07:50 crc kubenswrapper[4909]: I0202 12:07:50.151664 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c98e76b9-03c2-42e5-a71f-bdcc385c68c8" containerName="nova-metadata-metadata" containerID="cri-o://f9a4a9fbab78ab7164303aa1b42bfd914ccaedd1e12f285a006a5e11cc97dea8" gracePeriod=30 Feb 02 12:07:50 crc kubenswrapper[4909]: I0202 12:07:50.909145 4909 generic.go:334] "Generic (PLEG): container finished" podID="c98e76b9-03c2-42e5-a71f-bdcc385c68c8" 
containerID="de4e19cb34c32422e90659334a05c6d24934264f25d54f77fb6b867a9d52dd62" exitCode=143 Feb 02 12:07:50 crc kubenswrapper[4909]: I0202 12:07:50.909253 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c98e76b9-03c2-42e5-a71f-bdcc385c68c8","Type":"ContainerDied","Data":"de4e19cb34c32422e90659334a05c6d24934264f25d54f77fb6b867a9d52dd62"} Feb 02 12:07:50 crc kubenswrapper[4909]: I0202 12:07:50.912594 4909 generic.go:334] "Generic (PLEG): container finished" podID="9f539212-959e-4d6a-98b7-c4783ed1b735" containerID="2378f1fcf9f9b92c1fc2480529688d3985aa3da5b166fdffd563e3128212af98" exitCode=143 Feb 02 12:07:50 crc kubenswrapper[4909]: I0202 12:07:50.912634 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9f539212-959e-4d6a-98b7-c4783ed1b735","Type":"ContainerDied","Data":"2378f1fcf9f9b92c1fc2480529688d3985aa3da5b166fdffd563e3128212af98"} Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.696865 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.704185 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.778270 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqx76\" (UniqueName: \"kubernetes.io/projected/9f539212-959e-4d6a-98b7-c4783ed1b735-kube-api-access-mqx76\") pod \"9f539212-959e-4d6a-98b7-c4783ed1b735\" (UID: \"9f539212-959e-4d6a-98b7-c4783ed1b735\") " Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.778414 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-logs\") pod \"c98e76b9-03c2-42e5-a71f-bdcc385c68c8\" (UID: \"c98e76b9-03c2-42e5-a71f-bdcc385c68c8\") " Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.778442 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-config-data\") pod \"c98e76b9-03c2-42e5-a71f-bdcc385c68c8\" (UID: \"c98e76b9-03c2-42e5-a71f-bdcc385c68c8\") " Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.778507 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f539212-959e-4d6a-98b7-c4783ed1b735-combined-ca-bundle\") pod \"9f539212-959e-4d6a-98b7-c4783ed1b735\" (UID: \"9f539212-959e-4d6a-98b7-c4783ed1b735\") " Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.778537 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fk7wl\" (UniqueName: \"kubernetes.io/projected/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-kube-api-access-fk7wl\") pod \"c98e76b9-03c2-42e5-a71f-bdcc385c68c8\" (UID: \"c98e76b9-03c2-42e5-a71f-bdcc385c68c8\") " Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.778566 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-combined-ca-bundle\") pod \"c98e76b9-03c2-42e5-a71f-bdcc385c68c8\" (UID: \"c98e76b9-03c2-42e5-a71f-bdcc385c68c8\") " Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.778611 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f539212-959e-4d6a-98b7-c4783ed1b735-config-data\") pod \"9f539212-959e-4d6a-98b7-c4783ed1b735\" (UID: \"9f539212-959e-4d6a-98b7-c4783ed1b735\") " Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.778653 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-nova-metadata-tls-certs\") pod \"c98e76b9-03c2-42e5-a71f-bdcc385c68c8\" (UID: \"c98e76b9-03c2-42e5-a71f-bdcc385c68c8\") " Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.778687 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f539212-959e-4d6a-98b7-c4783ed1b735-logs\") pod \"9f539212-959e-4d6a-98b7-c4783ed1b735\" (UID: \"9f539212-959e-4d6a-98b7-c4783ed1b735\") " Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.779778 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-logs" (OuterVolumeSpecName: "logs") pod "c98e76b9-03c2-42e5-a71f-bdcc385c68c8" (UID: "c98e76b9-03c2-42e5-a71f-bdcc385c68c8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.779789 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f539212-959e-4d6a-98b7-c4783ed1b735-logs" (OuterVolumeSpecName: "logs") pod "9f539212-959e-4d6a-98b7-c4783ed1b735" (UID: "9f539212-959e-4d6a-98b7-c4783ed1b735"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.783905 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-kube-api-access-fk7wl" (OuterVolumeSpecName: "kube-api-access-fk7wl") pod "c98e76b9-03c2-42e5-a71f-bdcc385c68c8" (UID: "c98e76b9-03c2-42e5-a71f-bdcc385c68c8"). InnerVolumeSpecName "kube-api-access-fk7wl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.784579 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f539212-959e-4d6a-98b7-c4783ed1b735-kube-api-access-mqx76" (OuterVolumeSpecName: "kube-api-access-mqx76") pod "9f539212-959e-4d6a-98b7-c4783ed1b735" (UID: "9f539212-959e-4d6a-98b7-c4783ed1b735"). InnerVolumeSpecName "kube-api-access-mqx76". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.803614 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f539212-959e-4d6a-98b7-c4783ed1b735-config-data" (OuterVolumeSpecName: "config-data") pod "9f539212-959e-4d6a-98b7-c4783ed1b735" (UID: "9f539212-959e-4d6a-98b7-c4783ed1b735"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.806153 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f539212-959e-4d6a-98b7-c4783ed1b735-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f539212-959e-4d6a-98b7-c4783ed1b735" (UID: "9f539212-959e-4d6a-98b7-c4783ed1b735"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.808263 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-config-data" (OuterVolumeSpecName: "config-data") pod "c98e76b9-03c2-42e5-a71f-bdcc385c68c8" (UID: "c98e76b9-03c2-42e5-a71f-bdcc385c68c8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.808970 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c98e76b9-03c2-42e5-a71f-bdcc385c68c8" (UID: "c98e76b9-03c2-42e5-a71f-bdcc385c68c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.825962 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c98e76b9-03c2-42e5-a71f-bdcc385c68c8" (UID: "c98e76b9-03c2-42e5-a71f-bdcc385c68c8"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.880963 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f539212-959e-4d6a-98b7-c4783ed1b735-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.881016 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fk7wl\" (UniqueName: \"kubernetes.io/projected/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-kube-api-access-fk7wl\") on node \"crc\" DevicePath \"\"" Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.881035 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.881050 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f539212-959e-4d6a-98b7-c4783ed1b735-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.881062 4909 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.881073 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f539212-959e-4d6a-98b7-c4783ed1b735-logs\") on node \"crc\" DevicePath \"\"" Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.881083 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqx76\" (UniqueName: \"kubernetes.io/projected/9f539212-959e-4d6a-98b7-c4783ed1b735-kube-api-access-mqx76\") on node \"crc\" DevicePath \"\"" Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 
12:07:53.881093 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-logs\") on node \"crc\" DevicePath \"\"" Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.881103 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c98e76b9-03c2-42e5-a71f-bdcc385c68c8-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.937383 4909 generic.go:334] "Generic (PLEG): container finished" podID="c98e76b9-03c2-42e5-a71f-bdcc385c68c8" containerID="f9a4a9fbab78ab7164303aa1b42bfd914ccaedd1e12f285a006a5e11cc97dea8" exitCode=0 Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.937450 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c98e76b9-03c2-42e5-a71f-bdcc385c68c8","Type":"ContainerDied","Data":"f9a4a9fbab78ab7164303aa1b42bfd914ccaedd1e12f285a006a5e11cc97dea8"} Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.937462 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.937478 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c98e76b9-03c2-42e5-a71f-bdcc385c68c8","Type":"ContainerDied","Data":"373a260647c749d2524dcd430e91ce3062dad16c542e16c23252f3bb4d3abe16"} Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.937495 4909 scope.go:117] "RemoveContainer" containerID="f9a4a9fbab78ab7164303aa1b42bfd914ccaedd1e12f285a006a5e11cc97dea8" Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.940139 4909 generic.go:334] "Generic (PLEG): container finished" podID="9f539212-959e-4d6a-98b7-c4783ed1b735" containerID="a8078d608066325839dc973a63fc06c8076fd9ec0c11dd517db371ccc21d81fa" exitCode=0 Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.940205 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9f539212-959e-4d6a-98b7-c4783ed1b735","Type":"ContainerDied","Data":"a8078d608066325839dc973a63fc06c8076fd9ec0c11dd517db371ccc21d81fa"} Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.940252 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9f539212-959e-4d6a-98b7-c4783ed1b735","Type":"ContainerDied","Data":"1f4fe377ed98b494a4311dad52bccffd8bceda34eebfcd72b0dedbea32c29f31"} Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.940337 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 12:07:53 crc kubenswrapper[4909]: I0202 12:07:53.987418 4909 scope.go:117] "RemoveContainer" containerID="de4e19cb34c32422e90659334a05c6d24934264f25d54f77fb6b867a9d52dd62" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.001018 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.012462 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.021330 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.033547 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 02 12:07:54 crc kubenswrapper[4909]: E0202 12:07:54.034096 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c373ff3-6de7-4d89-be4c-b79c826a5d05" containerName="nova-manage" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.034114 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c373ff3-6de7-4d89-be4c-b79c826a5d05" containerName="nova-manage" Feb 02 12:07:54 crc kubenswrapper[4909]: E0202 12:07:54.034131 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c98e76b9-03c2-42e5-a71f-bdcc385c68c8" containerName="nova-metadata-metadata" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.034138 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c98e76b9-03c2-42e5-a71f-bdcc385c68c8" containerName="nova-metadata-metadata" Feb 02 12:07:54 crc kubenswrapper[4909]: E0202 12:07:54.034150 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c98e76b9-03c2-42e5-a71f-bdcc385c68c8" containerName="nova-metadata-log" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.034158 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c98e76b9-03c2-42e5-a71f-bdcc385c68c8" 
containerName="nova-metadata-log" Feb 02 12:07:54 crc kubenswrapper[4909]: E0202 12:07:54.034166 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f539212-959e-4d6a-98b7-c4783ed1b735" containerName="nova-api-log" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.034172 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f539212-959e-4d6a-98b7-c4783ed1b735" containerName="nova-api-log" Feb 02 12:07:54 crc kubenswrapper[4909]: E0202 12:07:54.034191 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f539212-959e-4d6a-98b7-c4783ed1b735" containerName="nova-api-api" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.034196 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f539212-959e-4d6a-98b7-c4783ed1b735" containerName="nova-api-api" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.034863 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f539212-959e-4d6a-98b7-c4783ed1b735" containerName="nova-api-log" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.034885 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c98e76b9-03c2-42e5-a71f-bdcc385c68c8" containerName="nova-metadata-log" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.034905 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c98e76b9-03c2-42e5-a71f-bdcc385c68c8" containerName="nova-metadata-metadata" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.034923 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f539212-959e-4d6a-98b7-c4783ed1b735" containerName="nova-api-api" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.034933 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c373ff3-6de7-4d89-be4c-b79c826a5d05" containerName="nova-manage" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.036057 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.055555 4909 scope.go:117] "RemoveContainer" containerID="f9a4a9fbab78ab7164303aa1b42bfd914ccaedd1e12f285a006a5e11cc97dea8" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.055638 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.055960 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.056168 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 02 12:07:54 crc kubenswrapper[4909]: E0202 12:07:54.056565 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9a4a9fbab78ab7164303aa1b42bfd914ccaedd1e12f285a006a5e11cc97dea8\": container with ID starting with f9a4a9fbab78ab7164303aa1b42bfd914ccaedd1e12f285a006a5e11cc97dea8 not found: ID does not exist" containerID="f9a4a9fbab78ab7164303aa1b42bfd914ccaedd1e12f285a006a5e11cc97dea8" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.056605 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9a4a9fbab78ab7164303aa1b42bfd914ccaedd1e12f285a006a5e11cc97dea8"} err="failed to get container status \"f9a4a9fbab78ab7164303aa1b42bfd914ccaedd1e12f285a006a5e11cc97dea8\": rpc error: code = NotFound desc = could not find container \"f9a4a9fbab78ab7164303aa1b42bfd914ccaedd1e12f285a006a5e11cc97dea8\": container with ID starting with f9a4a9fbab78ab7164303aa1b42bfd914ccaedd1e12f285a006a5e11cc97dea8 not found: ID does not exist" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.056630 4909 scope.go:117] "RemoveContainer" containerID="de4e19cb34c32422e90659334a05c6d24934264f25d54f77fb6b867a9d52dd62" Feb 02 12:07:54 crc 
kubenswrapper[4909]: E0202 12:07:54.057316 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de4e19cb34c32422e90659334a05c6d24934264f25d54f77fb6b867a9d52dd62\": container with ID starting with de4e19cb34c32422e90659334a05c6d24934264f25d54f77fb6b867a9d52dd62 not found: ID does not exist" containerID="de4e19cb34c32422e90659334a05c6d24934264f25d54f77fb6b867a9d52dd62" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.057343 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de4e19cb34c32422e90659334a05c6d24934264f25d54f77fb6b867a9d52dd62"} err="failed to get container status \"de4e19cb34c32422e90659334a05c6d24934264f25d54f77fb6b867a9d52dd62\": rpc error: code = NotFound desc = could not find container \"de4e19cb34c32422e90659334a05c6d24934264f25d54f77fb6b867a9d52dd62\": container with ID starting with de4e19cb34c32422e90659334a05c6d24934264f25d54f77fb6b867a9d52dd62 not found: ID does not exist" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.057358 4909 scope.go:117] "RemoveContainer" containerID="a8078d608066325839dc973a63fc06c8076fd9ec0c11dd517db371ccc21d81fa" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.060951 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.076672 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.081351 4909 scope.go:117] "RemoveContainer" containerID="2378f1fcf9f9b92c1fc2480529688d3985aa3da5b166fdffd563e3128212af98" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.081728 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.086043 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.089909 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.090937 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-config-data\") pod \"nova-metadata-0\" (UID: \"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64\") " pod="openstack/nova-metadata-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.091106 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-logs\") pod \"nova-metadata-0\" (UID: \"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64\") " pod="openstack/nova-metadata-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.091133 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64\") " pod="openstack/nova-metadata-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.091213 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64\") " pod="openstack/nova-metadata-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.091450 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbhfs\" (UniqueName: \"kubernetes.io/projected/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-kube-api-access-pbhfs\") pod \"nova-metadata-0\" (UID: \"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64\") " pod="openstack/nova-metadata-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.123901 4909 scope.go:117] "RemoveContainer" containerID="a8078d608066325839dc973a63fc06c8076fd9ec0c11dd517db371ccc21d81fa" Feb 02 12:07:54 crc kubenswrapper[4909]: E0202 12:07:54.127371 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8078d608066325839dc973a63fc06c8076fd9ec0c11dd517db371ccc21d81fa\": container with ID starting with a8078d608066325839dc973a63fc06c8076fd9ec0c11dd517db371ccc21d81fa not found: ID does not exist" containerID="a8078d608066325839dc973a63fc06c8076fd9ec0c11dd517db371ccc21d81fa" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.127405 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8078d608066325839dc973a63fc06c8076fd9ec0c11dd517db371ccc21d81fa"} err="failed to get container status \"a8078d608066325839dc973a63fc06c8076fd9ec0c11dd517db371ccc21d81fa\": rpc error: code = NotFound desc = could not find container \"a8078d608066325839dc973a63fc06c8076fd9ec0c11dd517db371ccc21d81fa\": container with ID starting with a8078d608066325839dc973a63fc06c8076fd9ec0c11dd517db371ccc21d81fa not found: ID does not exist" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.127428 4909 scope.go:117] "RemoveContainer" containerID="2378f1fcf9f9b92c1fc2480529688d3985aa3da5b166fdffd563e3128212af98" Feb 02 12:07:54 crc kubenswrapper[4909]: E0202 12:07:54.127848 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2378f1fcf9f9b92c1fc2480529688d3985aa3da5b166fdffd563e3128212af98\": container with ID 
starting with 2378f1fcf9f9b92c1fc2480529688d3985aa3da5b166fdffd563e3128212af98 not found: ID does not exist" containerID="2378f1fcf9f9b92c1fc2480529688d3985aa3da5b166fdffd563e3128212af98" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.127869 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2378f1fcf9f9b92c1fc2480529688d3985aa3da5b166fdffd563e3128212af98"} err="failed to get container status \"2378f1fcf9f9b92c1fc2480529688d3985aa3da5b166fdffd563e3128212af98\": rpc error: code = NotFound desc = could not find container \"2378f1fcf9f9b92c1fc2480529688d3985aa3da5b166fdffd563e3128212af98\": container with ID starting with 2378f1fcf9f9b92c1fc2480529688d3985aa3da5b166fdffd563e3128212af98 not found: ID does not exist" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.192580 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43d94ff7-5cc5-4f62-93fb-f7e9c1315847-config-data\") pod \"nova-api-0\" (UID: \"43d94ff7-5cc5-4f62-93fb-f7e9c1315847\") " pod="openstack/nova-api-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.192863 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-logs\") pod \"nova-metadata-0\" (UID: \"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64\") " pod="openstack/nova-metadata-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.192956 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64\") " pod="openstack/nova-metadata-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.193075 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43d94ff7-5cc5-4f62-93fb-f7e9c1315847-logs\") pod \"nova-api-0\" (UID: \"43d94ff7-5cc5-4f62-93fb-f7e9c1315847\") " pod="openstack/nova-api-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.193161 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64\") " pod="openstack/nova-metadata-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.193338 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbhfs\" (UniqueName: \"kubernetes.io/projected/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-kube-api-access-pbhfs\") pod \"nova-metadata-0\" (UID: \"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64\") " pod="openstack/nova-metadata-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.193412 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43d94ff7-5cc5-4f62-93fb-f7e9c1315847-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"43d94ff7-5cc5-4f62-93fb-f7e9c1315847\") " pod="openstack/nova-api-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.193500 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-config-data\") pod \"nova-metadata-0\" (UID: \"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64\") " pod="openstack/nova-metadata-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.193615 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5dsm\" (UniqueName: \"kubernetes.io/projected/43d94ff7-5cc5-4f62-93fb-f7e9c1315847-kube-api-access-p5dsm\") pod 
\"nova-api-0\" (UID: \"43d94ff7-5cc5-4f62-93fb-f7e9c1315847\") " pod="openstack/nova-api-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.193412 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-logs\") pod \"nova-metadata-0\" (UID: \"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64\") " pod="openstack/nova-metadata-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.196825 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64\") " pod="openstack/nova-metadata-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.196992 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64\") " pod="openstack/nova-metadata-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.197365 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-config-data\") pod \"nova-metadata-0\" (UID: \"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64\") " pod="openstack/nova-metadata-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.209170 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbhfs\" (UniqueName: \"kubernetes.io/projected/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-kube-api-access-pbhfs\") pod \"nova-metadata-0\" (UID: \"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64\") " pod="openstack/nova-metadata-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.295408 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43d94ff7-5cc5-4f62-93fb-f7e9c1315847-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"43d94ff7-5cc5-4f62-93fb-f7e9c1315847\") " pod="openstack/nova-api-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.295495 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5dsm\" (UniqueName: \"kubernetes.io/projected/43d94ff7-5cc5-4f62-93fb-f7e9c1315847-kube-api-access-p5dsm\") pod \"nova-api-0\" (UID: \"43d94ff7-5cc5-4f62-93fb-f7e9c1315847\") " pod="openstack/nova-api-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.295543 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43d94ff7-5cc5-4f62-93fb-f7e9c1315847-config-data\") pod \"nova-api-0\" (UID: \"43d94ff7-5cc5-4f62-93fb-f7e9c1315847\") " pod="openstack/nova-api-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.295586 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43d94ff7-5cc5-4f62-93fb-f7e9c1315847-logs\") pod \"nova-api-0\" (UID: \"43d94ff7-5cc5-4f62-93fb-f7e9c1315847\") " pod="openstack/nova-api-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.296129 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43d94ff7-5cc5-4f62-93fb-f7e9c1315847-logs\") pod \"nova-api-0\" (UID: \"43d94ff7-5cc5-4f62-93fb-f7e9c1315847\") " pod="openstack/nova-api-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.299563 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43d94ff7-5cc5-4f62-93fb-f7e9c1315847-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"43d94ff7-5cc5-4f62-93fb-f7e9c1315847\") " pod="openstack/nova-api-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 
12:07:54.300016 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43d94ff7-5cc5-4f62-93fb-f7e9c1315847-config-data\") pod \"nova-api-0\" (UID: \"43d94ff7-5cc5-4f62-93fb-f7e9c1315847\") " pod="openstack/nova-api-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.312468 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5dsm\" (UniqueName: \"kubernetes.io/projected/43d94ff7-5cc5-4f62-93fb-f7e9c1315847-kube-api-access-p5dsm\") pod \"nova-api-0\" (UID: \"43d94ff7-5cc5-4f62-93fb-f7e9c1315847\") " pod="openstack/nova-api-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.380323 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.405264 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.840278 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.892350 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 12:07:54 crc kubenswrapper[4909]: W0202 12:07:54.893106 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43d94ff7_5cc5_4f62_93fb_f7e9c1315847.slice/crio-294068f028bfc6f5fd81a210a7a51805a5e7d9488283eff42e56da9d42ff8cf6 WatchSource:0}: Error finding container 294068f028bfc6f5fd81a210a7a51805a5e7d9488283eff42e56da9d42ff8cf6: Status 404 returned error can't find the container with id 294068f028bfc6f5fd81a210a7a51805a5e7d9488283eff42e56da9d42ff8cf6 Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.951112 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"43d94ff7-5cc5-4f62-93fb-f7e9c1315847","Type":"ContainerStarted","Data":"294068f028bfc6f5fd81a210a7a51805a5e7d9488283eff42e56da9d42ff8cf6"} Feb 02 12:07:54 crc kubenswrapper[4909]: I0202 12:07:54.955006 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64","Type":"ContainerStarted","Data":"266c8e5a1e11e509a08531cb9cb8ac85e7c4083d16ce6c004b1a327bcbf5fddb"} Feb 02 12:07:55 crc kubenswrapper[4909]: I0202 12:07:55.028833 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f539212-959e-4d6a-98b7-c4783ed1b735" path="/var/lib/kubelet/pods/9f539212-959e-4d6a-98b7-c4783ed1b735/volumes" Feb 02 12:07:55 crc kubenswrapper[4909]: I0202 12:07:55.029720 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c98e76b9-03c2-42e5-a71f-bdcc385c68c8" path="/var/lib/kubelet/pods/c98e76b9-03c2-42e5-a71f-bdcc385c68c8/volumes" Feb 02 12:07:55 crc kubenswrapper[4909]: I0202 12:07:55.967043 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"43d94ff7-5cc5-4f62-93fb-f7e9c1315847","Type":"ContainerStarted","Data":"1412daa98da71cf6ba008933a60fbdb47f5bbffef23ba414f41991ea55d17659"} Feb 02 12:07:55 crc kubenswrapper[4909]: I0202 12:07:55.967384 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"43d94ff7-5cc5-4f62-93fb-f7e9c1315847","Type":"ContainerStarted","Data":"60c17e6fbbb93ef6b2d9c1b8fb0647e9c3dae092820bf72a473a5c0855a7e532"} Feb 02 12:07:55 crc kubenswrapper[4909]: I0202 12:07:55.971397 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64","Type":"ContainerStarted","Data":"65783f58d87d0a0e4ff9166650bb11a84b934388a454d3a42bc6b55f6bd48f51"} Feb 02 12:07:55 crc kubenswrapper[4909]: I0202 12:07:55.971452 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64","Type":"ContainerStarted","Data":"367124b20b45d240fd18ce0f053f4b7619daca5fc2fde669f570bda3d8950cad"} Feb 02 12:07:56 crc kubenswrapper[4909]: I0202 12:07:56.002212 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.00219234 podStartE2EDuration="3.00219234s" podCreationTimestamp="2026-02-02 12:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:07:55.988342117 +0000 UTC m=+5801.734442862" watchObservedRunningTime="2026-02-02 12:07:56.00219234 +0000 UTC m=+5801.748293085" Feb 02 12:07:56 crc kubenswrapper[4909]: I0202 12:07:56.026679 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.026647564 podStartE2EDuration="3.026647564s" podCreationTimestamp="2026-02-02 12:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:07:56.011758632 +0000 UTC m=+5801.757859377" watchObservedRunningTime="2026-02-02 12:07:56.026647564 +0000 UTC m=+5801.772748299" Feb 02 12:07:57 crc kubenswrapper[4909]: I0202 12:07:57.016620 4909 scope.go:117] "RemoveContainer" containerID="2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783" Feb 02 12:07:57 crc kubenswrapper[4909]: E0202 12:07:57.017214 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:07:59 crc kubenswrapper[4909]: I0202 12:07:59.381283 4909 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 02 12:07:59 crc kubenswrapper[4909]: I0202 12:07:59.381948 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 02 12:08:04 crc kubenswrapper[4909]: I0202 12:08:04.381623 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 02 12:08:04 crc kubenswrapper[4909]: I0202 12:08:04.382149 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 02 12:08:04 crc kubenswrapper[4909]: I0202 12:08:04.405880 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 12:08:04 crc kubenswrapper[4909]: I0202 12:08:04.405973 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 12:08:05 crc kubenswrapper[4909]: I0202 12:08:05.081315 4909 generic.go:334] "Generic (PLEG): container finished" podID="45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7" containerID="b6a65f1080c0ab6c72a4db50014fe622195670290139bd499d2d6d826b74a6dd" exitCode=137 Feb 02 12:08:05 crc kubenswrapper[4909]: I0202 12:08:05.081757 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7","Type":"ContainerDied","Data":"b6a65f1080c0ab6c72a4db50014fe622195670290139bd499d2d6d826b74a6dd"} Feb 02 12:08:05 crc kubenswrapper[4909]: I0202 12:08:05.376225 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 12:08:05 crc kubenswrapper[4909]: I0202 12:08:05.399047 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.89:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 02 12:08:05 crc kubenswrapper[4909]: I0202 12:08:05.399137 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.89:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 02 12:08:05 crc kubenswrapper[4909]: I0202 12:08:05.436348 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7-combined-ca-bundle\") pod \"45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7\" (UID: \"45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7\") " Feb 02 12:08:05 crc kubenswrapper[4909]: I0202 12:08:05.436579 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7-config-data\") pod \"45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7\" (UID: \"45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7\") " Feb 02 12:08:05 crc kubenswrapper[4909]: I0202 12:08:05.436649 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mr6tf\" (UniqueName: \"kubernetes.io/projected/45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7-kube-api-access-mr6tf\") pod \"45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7\" (UID: \"45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7\") " Feb 02 12:08:05 crc kubenswrapper[4909]: I0202 12:08:05.456209 4909 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7-kube-api-access-mr6tf" (OuterVolumeSpecName: "kube-api-access-mr6tf") pod "45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7" (UID: "45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7"). InnerVolumeSpecName "kube-api-access-mr6tf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:08:05 crc kubenswrapper[4909]: I0202 12:08:05.484177 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7" (UID: "45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:08:05 crc kubenswrapper[4909]: I0202 12:08:05.487443 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7-config-data" (OuterVolumeSpecName: "config-data") pod "45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7" (UID: "45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:08:05 crc kubenswrapper[4909]: I0202 12:08:05.489987 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="43d94ff7-5cc5-4f62-93fb-f7e9c1315847" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.90:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 02 12:08:05 crc kubenswrapper[4909]: I0202 12:08:05.490038 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="43d94ff7-5cc5-4f62-93fb-f7e9c1315847" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.90:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 02 12:08:05 crc kubenswrapper[4909]: I0202 12:08:05.539738 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:08:05 crc kubenswrapper[4909]: I0202 12:08:05.539788 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mr6tf\" (UniqueName: \"kubernetes.io/projected/45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7-kube-api-access-mr6tf\") on node \"crc\" DevicePath \"\"" Feb 02 12:08:05 crc kubenswrapper[4909]: I0202 12:08:05.539800 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:08:06 crc kubenswrapper[4909]: I0202 12:08:06.092653 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7","Type":"ContainerDied","Data":"18915acb9ba6e2cb3557edbc0b73a2cba72ded8260d4fc54e15feb2375a641d6"} Feb 02 12:08:06 crc kubenswrapper[4909]: I0202 12:08:06.092702 4909 scope.go:117] "RemoveContainer" 
containerID="b6a65f1080c0ab6c72a4db50014fe622195670290139bd499d2d6d826b74a6dd" Feb 02 12:08:06 crc kubenswrapper[4909]: I0202 12:08:06.092937 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 12:08:06 crc kubenswrapper[4909]: I0202 12:08:06.132473 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 12:08:06 crc kubenswrapper[4909]: I0202 12:08:06.143995 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 12:08:06 crc kubenswrapper[4909]: I0202 12:08:06.156170 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 12:08:06 crc kubenswrapper[4909]: E0202 12:08:06.156625 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7" containerName="nova-scheduler-scheduler" Feb 02 12:08:06 crc kubenswrapper[4909]: I0202 12:08:06.156638 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7" containerName="nova-scheduler-scheduler" Feb 02 12:08:06 crc kubenswrapper[4909]: I0202 12:08:06.156892 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7" containerName="nova-scheduler-scheduler" Feb 02 12:08:06 crc kubenswrapper[4909]: I0202 12:08:06.157600 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 12:08:06 crc kubenswrapper[4909]: I0202 12:08:06.160027 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 02 12:08:06 crc kubenswrapper[4909]: I0202 12:08:06.168412 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 12:08:06 crc kubenswrapper[4909]: I0202 12:08:06.251865 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d1231b-15a9-4876-9a03-1c8963164da0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"01d1231b-15a9-4876-9a03-1c8963164da0\") " pod="openstack/nova-scheduler-0" Feb 02 12:08:06 crc kubenswrapper[4909]: I0202 12:08:06.251928 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d1231b-15a9-4876-9a03-1c8963164da0-config-data\") pod \"nova-scheduler-0\" (UID: \"01d1231b-15a9-4876-9a03-1c8963164da0\") " pod="openstack/nova-scheduler-0" Feb 02 12:08:06 crc kubenswrapper[4909]: I0202 12:08:06.251992 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gfv2\" (UniqueName: \"kubernetes.io/projected/01d1231b-15a9-4876-9a03-1c8963164da0-kube-api-access-6gfv2\") pod \"nova-scheduler-0\" (UID: \"01d1231b-15a9-4876-9a03-1c8963164da0\") " pod="openstack/nova-scheduler-0" Feb 02 12:08:06 crc kubenswrapper[4909]: E0202 12:08:06.312354 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45d0fd77_6083_4f4a_b2cf_dbbb7656f3b7.slice/crio-18915acb9ba6e2cb3557edbc0b73a2cba72ded8260d4fc54e15feb2375a641d6\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45d0fd77_6083_4f4a_b2cf_dbbb7656f3b7.slice\": RecentStats: unable to find data in memory cache]" Feb 02 12:08:06 crc kubenswrapper[4909]: I0202 12:08:06.354286 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d1231b-15a9-4876-9a03-1c8963164da0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"01d1231b-15a9-4876-9a03-1c8963164da0\") " pod="openstack/nova-scheduler-0" Feb 02 12:08:06 crc kubenswrapper[4909]: I0202 12:08:06.354362 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d1231b-15a9-4876-9a03-1c8963164da0-config-data\") pod \"nova-scheduler-0\" (UID: \"01d1231b-15a9-4876-9a03-1c8963164da0\") " pod="openstack/nova-scheduler-0" Feb 02 12:08:06 crc kubenswrapper[4909]: I0202 12:08:06.354405 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gfv2\" (UniqueName: \"kubernetes.io/projected/01d1231b-15a9-4876-9a03-1c8963164da0-kube-api-access-6gfv2\") pod \"nova-scheduler-0\" (UID: \"01d1231b-15a9-4876-9a03-1c8963164da0\") " pod="openstack/nova-scheduler-0" Feb 02 12:08:06 crc kubenswrapper[4909]: I0202 12:08:06.359692 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d1231b-15a9-4876-9a03-1c8963164da0-config-data\") pod \"nova-scheduler-0\" (UID: \"01d1231b-15a9-4876-9a03-1c8963164da0\") " pod="openstack/nova-scheduler-0" Feb 02 12:08:06 crc kubenswrapper[4909]: I0202 12:08:06.371570 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gfv2\" (UniqueName: \"kubernetes.io/projected/01d1231b-15a9-4876-9a03-1c8963164da0-kube-api-access-6gfv2\") pod \"nova-scheduler-0\" (UID: \"01d1231b-15a9-4876-9a03-1c8963164da0\") " pod="openstack/nova-scheduler-0" Feb 02 
12:08:06 crc kubenswrapper[4909]: I0202 12:08:06.379496 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d1231b-15a9-4876-9a03-1c8963164da0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"01d1231b-15a9-4876-9a03-1c8963164da0\") " pod="openstack/nova-scheduler-0" Feb 02 12:08:06 crc kubenswrapper[4909]: I0202 12:08:06.489310 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 12:08:06 crc kubenswrapper[4909]: I0202 12:08:06.964376 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 12:08:07 crc kubenswrapper[4909]: I0202 12:08:07.028255 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7" path="/var/lib/kubelet/pods/45d0fd77-6083-4f4a-b2cf-dbbb7656f3b7/volumes" Feb 02 12:08:07 crc kubenswrapper[4909]: I0202 12:08:07.101469 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"01d1231b-15a9-4876-9a03-1c8963164da0","Type":"ContainerStarted","Data":"67e2a2f3f7098dfc760ca19e70e801c10b93ded7314b7d53f46e0b4dabe229d8"} Feb 02 12:08:08 crc kubenswrapper[4909]: I0202 12:08:08.131150 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"01d1231b-15a9-4876-9a03-1c8963164da0","Type":"ContainerStarted","Data":"a931152f73f5539610a817207753eee16dbb6c9eccc24e6035cbf87272d91bb3"} Feb 02 12:08:08 crc kubenswrapper[4909]: I0202 12:08:08.541635 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.5416142710000003 podStartE2EDuration="2.541614271s" podCreationTimestamp="2026-02-02 12:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:08:08.53452593 +0000 UTC 
m=+5814.280626665" watchObservedRunningTime="2026-02-02 12:08:08.541614271 +0000 UTC m=+5814.287715016" Feb 02 12:08:09 crc kubenswrapper[4909]: I0202 12:08:09.017370 4909 scope.go:117] "RemoveContainer" containerID="2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783" Feb 02 12:08:09 crc kubenswrapper[4909]: E0202 12:08:09.017669 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:08:11 crc kubenswrapper[4909]: I0202 12:08:11.490906 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 02 12:08:14 crc kubenswrapper[4909]: I0202 12:08:14.386141 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 02 12:08:14 crc kubenswrapper[4909]: I0202 12:08:14.387893 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 02 12:08:14 crc kubenswrapper[4909]: I0202 12:08:14.393006 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 02 12:08:14 crc kubenswrapper[4909]: I0202 12:08:14.415293 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 02 12:08:14 crc kubenswrapper[4909]: I0202 12:08:14.415633 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 02 12:08:14 crc kubenswrapper[4909]: I0202 12:08:14.416702 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 02 12:08:14 crc kubenswrapper[4909]: I0202 12:08:14.418295 
4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 02 12:08:15 crc kubenswrapper[4909]: I0202 12:08:15.194033 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 02 12:08:15 crc kubenswrapper[4909]: I0202 12:08:15.198573 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 02 12:08:15 crc kubenswrapper[4909]: I0202 12:08:15.201926 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 02 12:08:15 crc kubenswrapper[4909]: I0202 12:08:15.418926 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86d4bc6c6f-wd2wq"] Feb 02 12:08:15 crc kubenswrapper[4909]: I0202 12:08:15.422506 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86d4bc6c6f-wd2wq" Feb 02 12:08:15 crc kubenswrapper[4909]: I0202 12:08:15.458847 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86d4bc6c6f-wd2wq"] Feb 02 12:08:15 crc kubenswrapper[4909]: I0202 12:08:15.539440 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02dee8a0-2a8f-471d-b345-56620697930f-config\") pod \"dnsmasq-dns-86d4bc6c6f-wd2wq\" (UID: \"02dee8a0-2a8f-471d-b345-56620697930f\") " pod="openstack/dnsmasq-dns-86d4bc6c6f-wd2wq" Feb 02 12:08:15 crc kubenswrapper[4909]: I0202 12:08:15.539521 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02dee8a0-2a8f-471d-b345-56620697930f-ovsdbserver-sb\") pod \"dnsmasq-dns-86d4bc6c6f-wd2wq\" (UID: \"02dee8a0-2a8f-471d-b345-56620697930f\") " pod="openstack/dnsmasq-dns-86d4bc6c6f-wd2wq" Feb 02 12:08:15 crc kubenswrapper[4909]: I0202 12:08:15.539694 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02dee8a0-2a8f-471d-b345-56620697930f-dns-svc\") pod \"dnsmasq-dns-86d4bc6c6f-wd2wq\" (UID: \"02dee8a0-2a8f-471d-b345-56620697930f\") " pod="openstack/dnsmasq-dns-86d4bc6c6f-wd2wq" Feb 02 12:08:15 crc kubenswrapper[4909]: I0202 12:08:15.539721 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjcmp\" (UniqueName: \"kubernetes.io/projected/02dee8a0-2a8f-471d-b345-56620697930f-kube-api-access-mjcmp\") pod \"dnsmasq-dns-86d4bc6c6f-wd2wq\" (UID: \"02dee8a0-2a8f-471d-b345-56620697930f\") " pod="openstack/dnsmasq-dns-86d4bc6c6f-wd2wq" Feb 02 12:08:15 crc kubenswrapper[4909]: I0202 12:08:15.539763 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02dee8a0-2a8f-471d-b345-56620697930f-ovsdbserver-nb\") pod \"dnsmasq-dns-86d4bc6c6f-wd2wq\" (UID: \"02dee8a0-2a8f-471d-b345-56620697930f\") " pod="openstack/dnsmasq-dns-86d4bc6c6f-wd2wq" Feb 02 12:08:15 crc kubenswrapper[4909]: I0202 12:08:15.642109 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02dee8a0-2a8f-471d-b345-56620697930f-ovsdbserver-nb\") pod \"dnsmasq-dns-86d4bc6c6f-wd2wq\" (UID: \"02dee8a0-2a8f-471d-b345-56620697930f\") " pod="openstack/dnsmasq-dns-86d4bc6c6f-wd2wq" Feb 02 12:08:15 crc kubenswrapper[4909]: I0202 12:08:15.642195 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02dee8a0-2a8f-471d-b345-56620697930f-config\") pod \"dnsmasq-dns-86d4bc6c6f-wd2wq\" (UID: \"02dee8a0-2a8f-471d-b345-56620697930f\") " pod="openstack/dnsmasq-dns-86d4bc6c6f-wd2wq" Feb 02 12:08:15 crc kubenswrapper[4909]: I0202 12:08:15.642238 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02dee8a0-2a8f-471d-b345-56620697930f-ovsdbserver-sb\") pod \"dnsmasq-dns-86d4bc6c6f-wd2wq\" (UID: \"02dee8a0-2a8f-471d-b345-56620697930f\") " pod="openstack/dnsmasq-dns-86d4bc6c6f-wd2wq" Feb 02 12:08:15 crc kubenswrapper[4909]: I0202 12:08:15.642354 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02dee8a0-2a8f-471d-b345-56620697930f-dns-svc\") pod \"dnsmasq-dns-86d4bc6c6f-wd2wq\" (UID: \"02dee8a0-2a8f-471d-b345-56620697930f\") " pod="openstack/dnsmasq-dns-86d4bc6c6f-wd2wq" Feb 02 12:08:15 crc kubenswrapper[4909]: I0202 12:08:15.642375 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjcmp\" (UniqueName: \"kubernetes.io/projected/02dee8a0-2a8f-471d-b345-56620697930f-kube-api-access-mjcmp\") pod \"dnsmasq-dns-86d4bc6c6f-wd2wq\" (UID: \"02dee8a0-2a8f-471d-b345-56620697930f\") " pod="openstack/dnsmasq-dns-86d4bc6c6f-wd2wq" Feb 02 12:08:15 crc kubenswrapper[4909]: I0202 12:08:15.643349 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02dee8a0-2a8f-471d-b345-56620697930f-config\") pod \"dnsmasq-dns-86d4bc6c6f-wd2wq\" (UID: \"02dee8a0-2a8f-471d-b345-56620697930f\") " pod="openstack/dnsmasq-dns-86d4bc6c6f-wd2wq" Feb 02 12:08:15 crc kubenswrapper[4909]: I0202 12:08:15.643450 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02dee8a0-2a8f-471d-b345-56620697930f-ovsdbserver-sb\") pod \"dnsmasq-dns-86d4bc6c6f-wd2wq\" (UID: \"02dee8a0-2a8f-471d-b345-56620697930f\") " pod="openstack/dnsmasq-dns-86d4bc6c6f-wd2wq" Feb 02 12:08:15 crc kubenswrapper[4909]: I0202 12:08:15.644205 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/02dee8a0-2a8f-471d-b345-56620697930f-dns-svc\") pod \"dnsmasq-dns-86d4bc6c6f-wd2wq\" (UID: \"02dee8a0-2a8f-471d-b345-56620697930f\") " pod="openstack/dnsmasq-dns-86d4bc6c6f-wd2wq" Feb 02 12:08:15 crc kubenswrapper[4909]: I0202 12:08:15.644583 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02dee8a0-2a8f-471d-b345-56620697930f-ovsdbserver-nb\") pod \"dnsmasq-dns-86d4bc6c6f-wd2wq\" (UID: \"02dee8a0-2a8f-471d-b345-56620697930f\") " pod="openstack/dnsmasq-dns-86d4bc6c6f-wd2wq" Feb 02 12:08:15 crc kubenswrapper[4909]: I0202 12:08:15.678580 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjcmp\" (UniqueName: \"kubernetes.io/projected/02dee8a0-2a8f-471d-b345-56620697930f-kube-api-access-mjcmp\") pod \"dnsmasq-dns-86d4bc6c6f-wd2wq\" (UID: \"02dee8a0-2a8f-471d-b345-56620697930f\") " pod="openstack/dnsmasq-dns-86d4bc6c6f-wd2wq" Feb 02 12:08:15 crc kubenswrapper[4909]: I0202 12:08:15.766339 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86d4bc6c6f-wd2wq" Feb 02 12:08:16 crc kubenswrapper[4909]: W0202 12:08:16.291833 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02dee8a0_2a8f_471d_b345_56620697930f.slice/crio-fed58ec3a28cc1a88d382a3438d42e6a2f7733d1d1c32eca752ff00627517fb5 WatchSource:0}: Error finding container fed58ec3a28cc1a88d382a3438d42e6a2f7733d1d1c32eca752ff00627517fb5: Status 404 returned error can't find the container with id fed58ec3a28cc1a88d382a3438d42e6a2f7733d1d1c32eca752ff00627517fb5 Feb 02 12:08:16 crc kubenswrapper[4909]: I0202 12:08:16.295611 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86d4bc6c6f-wd2wq"] Feb 02 12:08:16 crc kubenswrapper[4909]: I0202 12:08:16.490240 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 02 12:08:16 crc kubenswrapper[4909]: I0202 12:08:16.528512 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 02 12:08:17 crc kubenswrapper[4909]: I0202 12:08:17.217241 4909 generic.go:334] "Generic (PLEG): container finished" podID="02dee8a0-2a8f-471d-b345-56620697930f" containerID="a036980d6ac355b6fff60685a4b354a3bc947bead551eff24929b8c2ec2143e8" exitCode=0 Feb 02 12:08:17 crc kubenswrapper[4909]: I0202 12:08:17.219353 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d4bc6c6f-wd2wq" event={"ID":"02dee8a0-2a8f-471d-b345-56620697930f","Type":"ContainerDied","Data":"a036980d6ac355b6fff60685a4b354a3bc947bead551eff24929b8c2ec2143e8"} Feb 02 12:08:17 crc kubenswrapper[4909]: I0202 12:08:17.219403 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d4bc6c6f-wd2wq" event={"ID":"02dee8a0-2a8f-471d-b345-56620697930f","Type":"ContainerStarted","Data":"fed58ec3a28cc1a88d382a3438d42e6a2f7733d1d1c32eca752ff00627517fb5"} 
Feb 02 12:08:17 crc kubenswrapper[4909]: I0202 12:08:17.277090 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 02 12:08:18 crc kubenswrapper[4909]: I0202 12:08:18.228333 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d4bc6c6f-wd2wq" event={"ID":"02dee8a0-2a8f-471d-b345-56620697930f","Type":"ContainerStarted","Data":"866cf9644aadcf2c661c4b7660023c223d1995ead8613324dc731173f33f610a"} Feb 02 12:08:18 crc kubenswrapper[4909]: I0202 12:08:18.228853 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86d4bc6c6f-wd2wq" Feb 02 12:08:18 crc kubenswrapper[4909]: I0202 12:08:18.269625 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86d4bc6c6f-wd2wq" podStartSLOduration=3.2695980049999998 podStartE2EDuration="3.269598005s" podCreationTimestamp="2026-02-02 12:08:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:08:18.262094742 +0000 UTC m=+5824.008195467" watchObservedRunningTime="2026-02-02 12:08:18.269598005 +0000 UTC m=+5824.015698750" Feb 02 12:08:18 crc kubenswrapper[4909]: I0202 12:08:18.483347 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 12:08:18 crc kubenswrapper[4909]: I0202 12:08:18.483562 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="43d94ff7-5cc5-4f62-93fb-f7e9c1315847" containerName="nova-api-log" containerID="cri-o://60c17e6fbbb93ef6b2d9c1b8fb0647e9c3dae092820bf72a473a5c0855a7e532" gracePeriod=30 Feb 02 12:08:18 crc kubenswrapper[4909]: I0202 12:08:18.483877 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="43d94ff7-5cc5-4f62-93fb-f7e9c1315847" containerName="nova-api-api" 
containerID="cri-o://1412daa98da71cf6ba008933a60fbdb47f5bbffef23ba414f41991ea55d17659" gracePeriod=30 Feb 02 12:08:19 crc kubenswrapper[4909]: I0202 12:08:19.238767 4909 generic.go:334] "Generic (PLEG): container finished" podID="43d94ff7-5cc5-4f62-93fb-f7e9c1315847" containerID="60c17e6fbbb93ef6b2d9c1b8fb0647e9c3dae092820bf72a473a5c0855a7e532" exitCode=143 Feb 02 12:08:19 crc kubenswrapper[4909]: I0202 12:08:19.238894 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"43d94ff7-5cc5-4f62-93fb-f7e9c1315847","Type":"ContainerDied","Data":"60c17e6fbbb93ef6b2d9c1b8fb0647e9c3dae092820bf72a473a5c0855a7e532"} Feb 02 12:08:22 crc kubenswrapper[4909]: I0202 12:08:22.276692 4909 generic.go:334] "Generic (PLEG): container finished" podID="43d94ff7-5cc5-4f62-93fb-f7e9c1315847" containerID="1412daa98da71cf6ba008933a60fbdb47f5bbffef23ba414f41991ea55d17659" exitCode=0 Feb 02 12:08:22 crc kubenswrapper[4909]: I0202 12:08:22.276753 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"43d94ff7-5cc5-4f62-93fb-f7e9c1315847","Type":"ContainerDied","Data":"1412daa98da71cf6ba008933a60fbdb47f5bbffef23ba414f41991ea55d17659"} Feb 02 12:08:22 crc kubenswrapper[4909]: I0202 12:08:22.277369 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"43d94ff7-5cc5-4f62-93fb-f7e9c1315847","Type":"ContainerDied","Data":"294068f028bfc6f5fd81a210a7a51805a5e7d9488283eff42e56da9d42ff8cf6"} Feb 02 12:08:22 crc kubenswrapper[4909]: I0202 12:08:22.277391 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="294068f028bfc6f5fd81a210a7a51805a5e7d9488283eff42e56da9d42ff8cf6" Feb 02 12:08:22 crc kubenswrapper[4909]: I0202 12:08:22.345071 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 12:08:22 crc kubenswrapper[4909]: I0202 12:08:22.395066 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43d94ff7-5cc5-4f62-93fb-f7e9c1315847-combined-ca-bundle\") pod \"43d94ff7-5cc5-4f62-93fb-f7e9c1315847\" (UID: \"43d94ff7-5cc5-4f62-93fb-f7e9c1315847\") " Feb 02 12:08:22 crc kubenswrapper[4909]: I0202 12:08:22.395241 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43d94ff7-5cc5-4f62-93fb-f7e9c1315847-config-data\") pod \"43d94ff7-5cc5-4f62-93fb-f7e9c1315847\" (UID: \"43d94ff7-5cc5-4f62-93fb-f7e9c1315847\") " Feb 02 12:08:22 crc kubenswrapper[4909]: I0202 12:08:22.395280 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5dsm\" (UniqueName: \"kubernetes.io/projected/43d94ff7-5cc5-4f62-93fb-f7e9c1315847-kube-api-access-p5dsm\") pod \"43d94ff7-5cc5-4f62-93fb-f7e9c1315847\" (UID: \"43d94ff7-5cc5-4f62-93fb-f7e9c1315847\") " Feb 02 12:08:22 crc kubenswrapper[4909]: I0202 12:08:22.395323 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43d94ff7-5cc5-4f62-93fb-f7e9c1315847-logs\") pod \"43d94ff7-5cc5-4f62-93fb-f7e9c1315847\" (UID: \"43d94ff7-5cc5-4f62-93fb-f7e9c1315847\") " Feb 02 12:08:22 crc kubenswrapper[4909]: I0202 12:08:22.395986 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43d94ff7-5cc5-4f62-93fb-f7e9c1315847-logs" (OuterVolumeSpecName: "logs") pod "43d94ff7-5cc5-4f62-93fb-f7e9c1315847" (UID: "43d94ff7-5cc5-4f62-93fb-f7e9c1315847"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:08:22 crc kubenswrapper[4909]: I0202 12:08:22.400742 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43d94ff7-5cc5-4f62-93fb-f7e9c1315847-kube-api-access-p5dsm" (OuterVolumeSpecName: "kube-api-access-p5dsm") pod "43d94ff7-5cc5-4f62-93fb-f7e9c1315847" (UID: "43d94ff7-5cc5-4f62-93fb-f7e9c1315847"). InnerVolumeSpecName "kube-api-access-p5dsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:08:22 crc kubenswrapper[4909]: I0202 12:08:22.423890 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43d94ff7-5cc5-4f62-93fb-f7e9c1315847-config-data" (OuterVolumeSpecName: "config-data") pod "43d94ff7-5cc5-4f62-93fb-f7e9c1315847" (UID: "43d94ff7-5cc5-4f62-93fb-f7e9c1315847"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:08:22 crc kubenswrapper[4909]: I0202 12:08:22.428059 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43d94ff7-5cc5-4f62-93fb-f7e9c1315847-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43d94ff7-5cc5-4f62-93fb-f7e9c1315847" (UID: "43d94ff7-5cc5-4f62-93fb-f7e9c1315847"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:08:22 crc kubenswrapper[4909]: I0202 12:08:22.498033 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43d94ff7-5cc5-4f62-93fb-f7e9c1315847-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:08:22 crc kubenswrapper[4909]: I0202 12:08:22.498068 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5dsm\" (UniqueName: \"kubernetes.io/projected/43d94ff7-5cc5-4f62-93fb-f7e9c1315847-kube-api-access-p5dsm\") on node \"crc\" DevicePath \"\"" Feb 02 12:08:22 crc kubenswrapper[4909]: I0202 12:08:22.498082 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43d94ff7-5cc5-4f62-93fb-f7e9c1315847-logs\") on node \"crc\" DevicePath \"\"" Feb 02 12:08:22 crc kubenswrapper[4909]: I0202 12:08:22.498279 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43d94ff7-5cc5-4f62-93fb-f7e9c1315847-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.017155 4909 scope.go:117] "RemoveContainer" containerID="2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783" Feb 02 12:08:23 crc kubenswrapper[4909]: E0202 12:08:23.017761 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.290629 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.319289 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.333530 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.353559 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 02 12:08:23 crc kubenswrapper[4909]: E0202 12:08:23.354036 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43d94ff7-5cc5-4f62-93fb-f7e9c1315847" containerName="nova-api-api" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.354060 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="43d94ff7-5cc5-4f62-93fb-f7e9c1315847" containerName="nova-api-api" Feb 02 12:08:23 crc kubenswrapper[4909]: E0202 12:08:23.354080 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43d94ff7-5cc5-4f62-93fb-f7e9c1315847" containerName="nova-api-log" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.354088 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="43d94ff7-5cc5-4f62-93fb-f7e9c1315847" containerName="nova-api-log" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.354337 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="43d94ff7-5cc5-4f62-93fb-f7e9c1315847" containerName="nova-api-log" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.354373 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="43d94ff7-5cc5-4f62-93fb-f7e9c1315847" containerName="nova-api-api" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.355561 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.357731 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.363307 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.365683 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.367041 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.426389 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-logs\") pod \"nova-api-0\" (UID: \"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\") " pod="openstack/nova-api-0" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.426460 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-config-data\") pod \"nova-api-0\" (UID: \"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\") " pod="openstack/nova-api-0" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.426498 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-public-tls-certs\") pod \"nova-api-0\" (UID: \"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\") " pod="openstack/nova-api-0" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.426520 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\") " pod="openstack/nova-api-0" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.426732 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\") " pod="openstack/nova-api-0" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.426902 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvq2w\" (UniqueName: \"kubernetes.io/projected/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-kube-api-access-tvq2w\") pod \"nova-api-0\" (UID: \"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\") " pod="openstack/nova-api-0" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.528783 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\") " pod="openstack/nova-api-0" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.528920 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\") " pod="openstack/nova-api-0" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.528963 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvq2w\" (UniqueName: \"kubernetes.io/projected/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-kube-api-access-tvq2w\") pod \"nova-api-0\" (UID: \"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\") " 
pod="openstack/nova-api-0" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.529092 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-logs\") pod \"nova-api-0\" (UID: \"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\") " pod="openstack/nova-api-0" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.529129 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-config-data\") pod \"nova-api-0\" (UID: \"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\") " pod="openstack/nova-api-0" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.529167 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-public-tls-certs\") pod \"nova-api-0\" (UID: \"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\") " pod="openstack/nova-api-0" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.529501 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-logs\") pod \"nova-api-0\" (UID: \"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\") " pod="openstack/nova-api-0" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.533537 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-public-tls-certs\") pod \"nova-api-0\" (UID: \"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\") " pod="openstack/nova-api-0" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.534165 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-config-data\") pod \"nova-api-0\" (UID: 
\"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\") " pod="openstack/nova-api-0" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.534904 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\") " pod="openstack/nova-api-0" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.534910 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\") " pod="openstack/nova-api-0" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.546746 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvq2w\" (UniqueName: \"kubernetes.io/projected/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-kube-api-access-tvq2w\") pod \"nova-api-0\" (UID: \"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\") " pod="openstack/nova-api-0" Feb 02 12:08:23 crc kubenswrapper[4909]: I0202 12:08:23.683084 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 12:08:24 crc kubenswrapper[4909]: I0202 12:08:24.123264 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 12:08:24 crc kubenswrapper[4909]: I0202 12:08:24.301053 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244","Type":"ContainerStarted","Data":"62cbb60193b939a1be8159f1f192aad858644094fda7698c64377c032d79a9b5"} Feb 02 12:08:24 crc kubenswrapper[4909]: I0202 12:08:24.301101 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244","Type":"ContainerStarted","Data":"2d2294367c378b4b1e1c258ac4dfa2fd669a4b3ae4096190e3135975076c8fbb"} Feb 02 12:08:25 crc kubenswrapper[4909]: I0202 12:08:25.030525 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43d94ff7-5cc5-4f62-93fb-f7e9c1315847" path="/var/lib/kubelet/pods/43d94ff7-5cc5-4f62-93fb-f7e9c1315847/volumes" Feb 02 12:08:25 crc kubenswrapper[4909]: I0202 12:08:25.315316 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244","Type":"ContainerStarted","Data":"f40794129432336f08a314a7f97963bb5b13e8594e67bc11cc45f18e1bb9f82e"} Feb 02 12:08:25 crc kubenswrapper[4909]: I0202 12:08:25.346520 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.346500222 podStartE2EDuration="2.346500222s" podCreationTimestamp="2026-02-02 12:08:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:08:25.33657757 +0000 UTC m=+5831.082678305" watchObservedRunningTime="2026-02-02 12:08:25.346500222 +0000 UTC m=+5831.092600967" Feb 02 12:08:25 crc kubenswrapper[4909]: I0202 12:08:25.768837 4909 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86d4bc6c6f-wd2wq" Feb 02 12:08:25 crc kubenswrapper[4909]: I0202 12:08:25.832671 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b98ff868f-ppx5k"] Feb 02 12:08:25 crc kubenswrapper[4909]: I0202 12:08:25.832946 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b98ff868f-ppx5k" podUID="846cb0f7-da5f-4088-bcca-05c67d940e83" containerName="dnsmasq-dns" containerID="cri-o://1e22a58adfb790c4dcc149d293cb56c9c93cf1e035e4bc68748662266b1baa0d" gracePeriod=10 Feb 02 12:08:26 crc kubenswrapper[4909]: I0202 12:08:26.325677 4909 generic.go:334] "Generic (PLEG): container finished" podID="846cb0f7-da5f-4088-bcca-05c67d940e83" containerID="1e22a58adfb790c4dcc149d293cb56c9c93cf1e035e4bc68748662266b1baa0d" exitCode=0 Feb 02 12:08:26 crc kubenswrapper[4909]: I0202 12:08:26.325765 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b98ff868f-ppx5k" event={"ID":"846cb0f7-da5f-4088-bcca-05c67d940e83","Type":"ContainerDied","Data":"1e22a58adfb790c4dcc149d293cb56c9c93cf1e035e4bc68748662266b1baa0d"} Feb 02 12:08:26 crc kubenswrapper[4909]: I0202 12:08:26.326063 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b98ff868f-ppx5k" event={"ID":"846cb0f7-da5f-4088-bcca-05c67d940e83","Type":"ContainerDied","Data":"c3b3ddd00568a433b3c9b62b2c87e9ebbdd846205cb577475659d20d4ac73846"} Feb 02 12:08:26 crc kubenswrapper[4909]: I0202 12:08:26.326077 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3b3ddd00568a433b3c9b62b2c87e9ebbdd846205cb577475659d20d4ac73846" Feb 02 12:08:26 crc kubenswrapper[4909]: I0202 12:08:26.354530 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b98ff868f-ppx5k" Feb 02 12:08:26 crc kubenswrapper[4909]: I0202 12:08:26.386697 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/846cb0f7-da5f-4088-bcca-05c67d940e83-ovsdbserver-nb\") pod \"846cb0f7-da5f-4088-bcca-05c67d940e83\" (UID: \"846cb0f7-da5f-4088-bcca-05c67d940e83\") " Feb 02 12:08:26 crc kubenswrapper[4909]: I0202 12:08:26.386893 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s829q\" (UniqueName: \"kubernetes.io/projected/846cb0f7-da5f-4088-bcca-05c67d940e83-kube-api-access-s829q\") pod \"846cb0f7-da5f-4088-bcca-05c67d940e83\" (UID: \"846cb0f7-da5f-4088-bcca-05c67d940e83\") " Feb 02 12:08:26 crc kubenswrapper[4909]: I0202 12:08:26.386922 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/846cb0f7-da5f-4088-bcca-05c67d940e83-ovsdbserver-sb\") pod \"846cb0f7-da5f-4088-bcca-05c67d940e83\" (UID: \"846cb0f7-da5f-4088-bcca-05c67d940e83\") " Feb 02 12:08:26 crc kubenswrapper[4909]: I0202 12:08:26.386982 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/846cb0f7-da5f-4088-bcca-05c67d940e83-config\") pod \"846cb0f7-da5f-4088-bcca-05c67d940e83\" (UID: \"846cb0f7-da5f-4088-bcca-05c67d940e83\") " Feb 02 12:08:26 crc kubenswrapper[4909]: I0202 12:08:26.387007 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/846cb0f7-da5f-4088-bcca-05c67d940e83-dns-svc\") pod \"846cb0f7-da5f-4088-bcca-05c67d940e83\" (UID: \"846cb0f7-da5f-4088-bcca-05c67d940e83\") " Feb 02 12:08:26 crc kubenswrapper[4909]: I0202 12:08:26.401006 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/846cb0f7-da5f-4088-bcca-05c67d940e83-kube-api-access-s829q" (OuterVolumeSpecName: "kube-api-access-s829q") pod "846cb0f7-da5f-4088-bcca-05c67d940e83" (UID: "846cb0f7-da5f-4088-bcca-05c67d940e83"). InnerVolumeSpecName "kube-api-access-s829q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:08:26 crc kubenswrapper[4909]: I0202 12:08:26.436977 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/846cb0f7-da5f-4088-bcca-05c67d940e83-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "846cb0f7-da5f-4088-bcca-05c67d940e83" (UID: "846cb0f7-da5f-4088-bcca-05c67d940e83"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:08:26 crc kubenswrapper[4909]: I0202 12:08:26.447205 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/846cb0f7-da5f-4088-bcca-05c67d940e83-config" (OuterVolumeSpecName: "config") pod "846cb0f7-da5f-4088-bcca-05c67d940e83" (UID: "846cb0f7-da5f-4088-bcca-05c67d940e83"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:08:26 crc kubenswrapper[4909]: I0202 12:08:26.469328 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/846cb0f7-da5f-4088-bcca-05c67d940e83-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "846cb0f7-da5f-4088-bcca-05c67d940e83" (UID: "846cb0f7-da5f-4088-bcca-05c67d940e83"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:08:26 crc kubenswrapper[4909]: I0202 12:08:26.486513 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/846cb0f7-da5f-4088-bcca-05c67d940e83-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "846cb0f7-da5f-4088-bcca-05c67d940e83" (UID: "846cb0f7-da5f-4088-bcca-05c67d940e83"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:08:26 crc kubenswrapper[4909]: I0202 12:08:26.492763 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s829q\" (UniqueName: \"kubernetes.io/projected/846cb0f7-da5f-4088-bcca-05c67d940e83-kube-api-access-s829q\") on node \"crc\" DevicePath \"\"" Feb 02 12:08:26 crc kubenswrapper[4909]: I0202 12:08:26.492801 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/846cb0f7-da5f-4088-bcca-05c67d940e83-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 12:08:26 crc kubenswrapper[4909]: I0202 12:08:26.492872 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/846cb0f7-da5f-4088-bcca-05c67d940e83-config\") on node \"crc\" DevicePath \"\"" Feb 02 12:08:26 crc kubenswrapper[4909]: I0202 12:08:26.492881 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/846cb0f7-da5f-4088-bcca-05c67d940e83-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 12:08:26 crc kubenswrapper[4909]: I0202 12:08:26.492889 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/846cb0f7-da5f-4088-bcca-05c67d940e83-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 12:08:27 crc kubenswrapper[4909]: I0202 12:08:27.333274 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b98ff868f-ppx5k" Feb 02 12:08:27 crc kubenswrapper[4909]: I0202 12:08:27.360927 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b98ff868f-ppx5k"] Feb 02 12:08:27 crc kubenswrapper[4909]: I0202 12:08:27.369125 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b98ff868f-ppx5k"] Feb 02 12:08:29 crc kubenswrapper[4909]: I0202 12:08:29.026374 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="846cb0f7-da5f-4088-bcca-05c67d940e83" path="/var/lib/kubelet/pods/846cb0f7-da5f-4088-bcca-05c67d940e83/volumes" Feb 02 12:08:33 crc kubenswrapper[4909]: I0202 12:08:33.684121 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 12:08:33 crc kubenswrapper[4909]: I0202 12:08:33.684749 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 12:08:34 crc kubenswrapper[4909]: I0202 12:08:34.696035 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.93:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 02 12:08:34 crc kubenswrapper[4909]: I0202 12:08:34.696035 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.93:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 02 12:08:36 crc kubenswrapper[4909]: I0202 12:08:36.017367 4909 scope.go:117] "RemoveContainer" containerID="2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783" Feb 02 12:08:36 crc kubenswrapper[4909]: E0202 12:08:36.018016 4909 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:08:43 crc kubenswrapper[4909]: I0202 12:08:43.692428 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 02 12:08:43 crc kubenswrapper[4909]: I0202 12:08:43.693261 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 02 12:08:43 crc kubenswrapper[4909]: I0202 12:08:43.694441 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 02 12:08:43 crc kubenswrapper[4909]: I0202 12:08:43.699878 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 02 12:08:44 crc kubenswrapper[4909]: I0202 12:08:44.475249 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 02 12:08:44 crc kubenswrapper[4909]: I0202 12:08:44.481568 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 02 12:08:47 crc kubenswrapper[4909]: I0202 12:08:47.016248 4909 scope.go:117] "RemoveContainer" containerID="2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783" Feb 02 12:08:47 crc kubenswrapper[4909]: E0202 12:08:47.017064 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" 
podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:08:57 crc kubenswrapper[4909]: I0202 12:08:57.728689 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ffx4m"] Feb 02 12:08:57 crc kubenswrapper[4909]: E0202 12:08:57.729779 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="846cb0f7-da5f-4088-bcca-05c67d940e83" containerName="init" Feb 02 12:08:57 crc kubenswrapper[4909]: I0202 12:08:57.729799 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="846cb0f7-da5f-4088-bcca-05c67d940e83" containerName="init" Feb 02 12:08:57 crc kubenswrapper[4909]: E0202 12:08:57.729851 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="846cb0f7-da5f-4088-bcca-05c67d940e83" containerName="dnsmasq-dns" Feb 02 12:08:57 crc kubenswrapper[4909]: I0202 12:08:57.729860 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="846cb0f7-da5f-4088-bcca-05c67d940e83" containerName="dnsmasq-dns" Feb 02 12:08:57 crc kubenswrapper[4909]: I0202 12:08:57.730070 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="846cb0f7-da5f-4088-bcca-05c67d940e83" containerName="dnsmasq-dns" Feb 02 12:08:57 crc kubenswrapper[4909]: I0202 12:08:57.731457 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ffx4m" Feb 02 12:08:57 crc kubenswrapper[4909]: I0202 12:08:57.780773 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ffx4m"] Feb 02 12:08:57 crc kubenswrapper[4909]: I0202 12:08:57.801952 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph5m5\" (UniqueName: \"kubernetes.io/projected/32bf17e5-c734-4f5d-99cc-3e934dc0be5d-kube-api-access-ph5m5\") pod \"redhat-operators-ffx4m\" (UID: \"32bf17e5-c734-4f5d-99cc-3e934dc0be5d\") " pod="openshift-marketplace/redhat-operators-ffx4m" Feb 02 12:08:57 crc kubenswrapper[4909]: I0202 12:08:57.802022 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32bf17e5-c734-4f5d-99cc-3e934dc0be5d-utilities\") pod \"redhat-operators-ffx4m\" (UID: \"32bf17e5-c734-4f5d-99cc-3e934dc0be5d\") " pod="openshift-marketplace/redhat-operators-ffx4m" Feb 02 12:08:57 crc kubenswrapper[4909]: I0202 12:08:57.802141 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32bf17e5-c734-4f5d-99cc-3e934dc0be5d-catalog-content\") pod \"redhat-operators-ffx4m\" (UID: \"32bf17e5-c734-4f5d-99cc-3e934dc0be5d\") " pod="openshift-marketplace/redhat-operators-ffx4m" Feb 02 12:08:57 crc kubenswrapper[4909]: I0202 12:08:57.903666 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32bf17e5-c734-4f5d-99cc-3e934dc0be5d-utilities\") pod \"redhat-operators-ffx4m\" (UID: \"32bf17e5-c734-4f5d-99cc-3e934dc0be5d\") " pod="openshift-marketplace/redhat-operators-ffx4m" Feb 02 12:08:57 crc kubenswrapper[4909]: I0202 12:08:57.904170 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32bf17e5-c734-4f5d-99cc-3e934dc0be5d-catalog-content\") pod \"redhat-operators-ffx4m\" (UID: \"32bf17e5-c734-4f5d-99cc-3e934dc0be5d\") " pod="openshift-marketplace/redhat-operators-ffx4m" Feb 02 12:08:57 crc kubenswrapper[4909]: I0202 12:08:57.904312 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph5m5\" (UniqueName: \"kubernetes.io/projected/32bf17e5-c734-4f5d-99cc-3e934dc0be5d-kube-api-access-ph5m5\") pod \"redhat-operators-ffx4m\" (UID: \"32bf17e5-c734-4f5d-99cc-3e934dc0be5d\") " pod="openshift-marketplace/redhat-operators-ffx4m" Feb 02 12:08:57 crc kubenswrapper[4909]: I0202 12:08:57.905055 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32bf17e5-c734-4f5d-99cc-3e934dc0be5d-utilities\") pod \"redhat-operators-ffx4m\" (UID: \"32bf17e5-c734-4f5d-99cc-3e934dc0be5d\") " pod="openshift-marketplace/redhat-operators-ffx4m" Feb 02 12:08:57 crc kubenswrapper[4909]: I0202 12:08:57.904833 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32bf17e5-c734-4f5d-99cc-3e934dc0be5d-catalog-content\") pod \"redhat-operators-ffx4m\" (UID: \"32bf17e5-c734-4f5d-99cc-3e934dc0be5d\") " pod="openshift-marketplace/redhat-operators-ffx4m" Feb 02 12:08:57 crc kubenswrapper[4909]: I0202 12:08:57.933239 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph5m5\" (UniqueName: \"kubernetes.io/projected/32bf17e5-c734-4f5d-99cc-3e934dc0be5d-kube-api-access-ph5m5\") pod \"redhat-operators-ffx4m\" (UID: \"32bf17e5-c734-4f5d-99cc-3e934dc0be5d\") " pod="openshift-marketplace/redhat-operators-ffx4m" Feb 02 12:08:58 crc kubenswrapper[4909]: I0202 12:08:58.054527 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ffx4m" Feb 02 12:08:58 crc kubenswrapper[4909]: I0202 12:08:58.529149 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ffx4m"] Feb 02 12:08:58 crc kubenswrapper[4909]: I0202 12:08:58.601683 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ffx4m" event={"ID":"32bf17e5-c734-4f5d-99cc-3e934dc0be5d","Type":"ContainerStarted","Data":"db5513e8e45856f8b1143da043ee46352de3916dfd7f56dc5563c6548f35378e"} Feb 02 12:08:59 crc kubenswrapper[4909]: I0202 12:08:59.018727 4909 scope.go:117] "RemoveContainer" containerID="2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783" Feb 02 12:08:59 crc kubenswrapper[4909]: E0202 12:08:59.019434 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:08:59 crc kubenswrapper[4909]: I0202 12:08:59.614086 4909 generic.go:334] "Generic (PLEG): container finished" podID="32bf17e5-c734-4f5d-99cc-3e934dc0be5d" containerID="5b56ce7206c25b6ab070255e268e1b054529ca33bed65ea5cbc5d278d37d14a6" exitCode=0 Feb 02 12:08:59 crc kubenswrapper[4909]: I0202 12:08:59.614163 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ffx4m" event={"ID":"32bf17e5-c734-4f5d-99cc-3e934dc0be5d","Type":"ContainerDied","Data":"5b56ce7206c25b6ab070255e268e1b054529ca33bed65ea5cbc5d278d37d14a6"} Feb 02 12:09:01 crc kubenswrapper[4909]: I0202 12:09:01.634648 4909 generic.go:334] "Generic (PLEG): container finished" podID="32bf17e5-c734-4f5d-99cc-3e934dc0be5d" 
containerID="82c2913e5c32d9f18bb21425fb042e8293b9560b0458b751a31d601b8b09bbe1" exitCode=0 Feb 02 12:09:01 crc kubenswrapper[4909]: I0202 12:09:01.634718 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ffx4m" event={"ID":"32bf17e5-c734-4f5d-99cc-3e934dc0be5d","Type":"ContainerDied","Data":"82c2913e5c32d9f18bb21425fb042e8293b9560b0458b751a31d601b8b09bbe1"} Feb 02 12:09:02 crc kubenswrapper[4909]: I0202 12:09:02.645304 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ffx4m" event={"ID":"32bf17e5-c734-4f5d-99cc-3e934dc0be5d","Type":"ContainerStarted","Data":"76ef30c7f3b838d27f4ed4a54f6fc9dba9158077b2788fae7e50523432e48ace"} Feb 02 12:09:02 crc kubenswrapper[4909]: I0202 12:09:02.671511 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ffx4m" podStartSLOduration=3.167260106 podStartE2EDuration="5.671476904s" podCreationTimestamp="2026-02-02 12:08:57 +0000 UTC" firstStartedPulling="2026-02-02 12:08:59.616141423 +0000 UTC m=+5865.362242158" lastFinishedPulling="2026-02-02 12:09:02.120358221 +0000 UTC m=+5867.866458956" observedRunningTime="2026-02-02 12:09:02.66677077 +0000 UTC m=+5868.412871505" watchObservedRunningTime="2026-02-02 12:09:02.671476904 +0000 UTC m=+5868.417577639" Feb 02 12:09:03 crc kubenswrapper[4909]: I0202 12:09:03.056791 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-6sn6c"] Feb 02 12:09:03 crc kubenswrapper[4909]: I0202 12:09:03.065938 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-f700-account-create-update-cczfc"] Feb 02 12:09:03 crc kubenswrapper[4909]: I0202 12:09:03.075884 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-6sn6c"] Feb 02 12:09:03 crc kubenswrapper[4909]: I0202 12:09:03.085008 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-f700-account-create-update-cczfc"] Feb 02 12:09:05 crc kubenswrapper[4909]: I0202 12:09:05.028461 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="411171cf-4974-4e9a-bcb5-683f05db89cd" path="/var/lib/kubelet/pods/411171cf-4974-4e9a-bcb5-683f05db89cd/volumes" Feb 02 12:09:05 crc kubenswrapper[4909]: I0202 12:09:05.029518 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0" path="/var/lib/kubelet/pods/5bd9f078-cbb9-42a0-8b6d-ed651f00dbf0/volumes" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.316295 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-2vh4d"] Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.317911 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2vh4d" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.321426 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.321528 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-8pxg4" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.321651 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.332274 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-jgdxf"] Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.334558 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-jgdxf" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.341692 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2vh4d"] Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.387142 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-jgdxf"] Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.469932 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcmkf\" (UniqueName: \"kubernetes.io/projected/0f30be1f-7fe1-40f2-89d3-23cfca972041-kube-api-access-rcmkf\") pod \"ovn-controller-2vh4d\" (UID: \"0f30be1f-7fe1-40f2-89d3-23cfca972041\") " pod="openstack/ovn-controller-2vh4d" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.470029 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f30be1f-7fe1-40f2-89d3-23cfca972041-scripts\") pod \"ovn-controller-2vh4d\" (UID: \"0f30be1f-7fe1-40f2-89d3-23cfca972041\") " pod="openstack/ovn-controller-2vh4d" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.470094 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/bcbf23b5-3226-43c6-b8d3-6ba72b955eda-var-lib\") pod \"ovn-controller-ovs-jgdxf\" (UID: \"bcbf23b5-3226-43c6-b8d3-6ba72b955eda\") " pod="openstack/ovn-controller-ovs-jgdxf" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.470114 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f30be1f-7fe1-40f2-89d3-23cfca972041-ovn-controller-tls-certs\") pod \"ovn-controller-2vh4d\" (UID: \"0f30be1f-7fe1-40f2-89d3-23cfca972041\") " pod="openstack/ovn-controller-2vh4d" Feb 02 12:09:06 crc 
kubenswrapper[4909]: I0202 12:09:06.470131 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f30be1f-7fe1-40f2-89d3-23cfca972041-combined-ca-bundle\") pod \"ovn-controller-2vh4d\" (UID: \"0f30be1f-7fe1-40f2-89d3-23cfca972041\") " pod="openstack/ovn-controller-2vh4d" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.470270 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bcbf23b5-3226-43c6-b8d3-6ba72b955eda-scripts\") pod \"ovn-controller-ovs-jgdxf\" (UID: \"bcbf23b5-3226-43c6-b8d3-6ba72b955eda\") " pod="openstack/ovn-controller-ovs-jgdxf" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.470315 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/bcbf23b5-3226-43c6-b8d3-6ba72b955eda-etc-ovs\") pod \"ovn-controller-ovs-jgdxf\" (UID: \"bcbf23b5-3226-43c6-b8d3-6ba72b955eda\") " pod="openstack/ovn-controller-ovs-jgdxf" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.470341 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0f30be1f-7fe1-40f2-89d3-23cfca972041-var-log-ovn\") pod \"ovn-controller-2vh4d\" (UID: \"0f30be1f-7fe1-40f2-89d3-23cfca972041\") " pod="openstack/ovn-controller-2vh4d" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.470366 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/bcbf23b5-3226-43c6-b8d3-6ba72b955eda-var-log\") pod \"ovn-controller-ovs-jgdxf\" (UID: \"bcbf23b5-3226-43c6-b8d3-6ba72b955eda\") " pod="openstack/ovn-controller-ovs-jgdxf" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.470413 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bcbf23b5-3226-43c6-b8d3-6ba72b955eda-var-run\") pod \"ovn-controller-ovs-jgdxf\" (UID: \"bcbf23b5-3226-43c6-b8d3-6ba72b955eda\") " pod="openstack/ovn-controller-ovs-jgdxf" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.470458 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0f30be1f-7fe1-40f2-89d3-23cfca972041-var-run-ovn\") pod \"ovn-controller-2vh4d\" (UID: \"0f30be1f-7fe1-40f2-89d3-23cfca972041\") " pod="openstack/ovn-controller-2vh4d" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.470537 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0f30be1f-7fe1-40f2-89d3-23cfca972041-var-run\") pod \"ovn-controller-2vh4d\" (UID: \"0f30be1f-7fe1-40f2-89d3-23cfca972041\") " pod="openstack/ovn-controller-2vh4d" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.470568 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8qlf\" (UniqueName: \"kubernetes.io/projected/bcbf23b5-3226-43c6-b8d3-6ba72b955eda-kube-api-access-p8qlf\") pod \"ovn-controller-ovs-jgdxf\" (UID: \"bcbf23b5-3226-43c6-b8d3-6ba72b955eda\") " pod="openstack/ovn-controller-ovs-jgdxf" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.572471 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f30be1f-7fe1-40f2-89d3-23cfca972041-scripts\") pod \"ovn-controller-2vh4d\" (UID: \"0f30be1f-7fe1-40f2-89d3-23cfca972041\") " pod="openstack/ovn-controller-2vh4d" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.572580 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-lib\" (UniqueName: \"kubernetes.io/host-path/bcbf23b5-3226-43c6-b8d3-6ba72b955eda-var-lib\") pod \"ovn-controller-ovs-jgdxf\" (UID: \"bcbf23b5-3226-43c6-b8d3-6ba72b955eda\") " pod="openstack/ovn-controller-ovs-jgdxf" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.572608 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f30be1f-7fe1-40f2-89d3-23cfca972041-ovn-controller-tls-certs\") pod \"ovn-controller-2vh4d\" (UID: \"0f30be1f-7fe1-40f2-89d3-23cfca972041\") " pod="openstack/ovn-controller-2vh4d" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.572634 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f30be1f-7fe1-40f2-89d3-23cfca972041-combined-ca-bundle\") pod \"ovn-controller-2vh4d\" (UID: \"0f30be1f-7fe1-40f2-89d3-23cfca972041\") " pod="openstack/ovn-controller-2vh4d" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.572675 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bcbf23b5-3226-43c6-b8d3-6ba72b955eda-scripts\") pod \"ovn-controller-ovs-jgdxf\" (UID: \"bcbf23b5-3226-43c6-b8d3-6ba72b955eda\") " pod="openstack/ovn-controller-ovs-jgdxf" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.572699 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/bcbf23b5-3226-43c6-b8d3-6ba72b955eda-etc-ovs\") pod \"ovn-controller-ovs-jgdxf\" (UID: \"bcbf23b5-3226-43c6-b8d3-6ba72b955eda\") " pod="openstack/ovn-controller-ovs-jgdxf" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.572717 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0f30be1f-7fe1-40f2-89d3-23cfca972041-var-log-ovn\") pod 
\"ovn-controller-2vh4d\" (UID: \"0f30be1f-7fe1-40f2-89d3-23cfca972041\") " pod="openstack/ovn-controller-2vh4d" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.572739 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/bcbf23b5-3226-43c6-b8d3-6ba72b955eda-var-log\") pod \"ovn-controller-ovs-jgdxf\" (UID: \"bcbf23b5-3226-43c6-b8d3-6ba72b955eda\") " pod="openstack/ovn-controller-ovs-jgdxf" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.572768 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bcbf23b5-3226-43c6-b8d3-6ba72b955eda-var-run\") pod \"ovn-controller-ovs-jgdxf\" (UID: \"bcbf23b5-3226-43c6-b8d3-6ba72b955eda\") " pod="openstack/ovn-controller-ovs-jgdxf" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.573166 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/bcbf23b5-3226-43c6-b8d3-6ba72b955eda-etc-ovs\") pod \"ovn-controller-ovs-jgdxf\" (UID: \"bcbf23b5-3226-43c6-b8d3-6ba72b955eda\") " pod="openstack/ovn-controller-ovs-jgdxf" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.573193 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bcbf23b5-3226-43c6-b8d3-6ba72b955eda-var-run\") pod \"ovn-controller-ovs-jgdxf\" (UID: \"bcbf23b5-3226-43c6-b8d3-6ba72b955eda\") " pod="openstack/ovn-controller-ovs-jgdxf" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.573169 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/bcbf23b5-3226-43c6-b8d3-6ba72b955eda-var-lib\") pod \"ovn-controller-ovs-jgdxf\" (UID: \"bcbf23b5-3226-43c6-b8d3-6ba72b955eda\") " pod="openstack/ovn-controller-ovs-jgdxf" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.573268 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0f30be1f-7fe1-40f2-89d3-23cfca972041-var-log-ovn\") pod \"ovn-controller-2vh4d\" (UID: \"0f30be1f-7fe1-40f2-89d3-23cfca972041\") " pod="openstack/ovn-controller-2vh4d" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.573278 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/bcbf23b5-3226-43c6-b8d3-6ba72b955eda-var-log\") pod \"ovn-controller-ovs-jgdxf\" (UID: \"bcbf23b5-3226-43c6-b8d3-6ba72b955eda\") " pod="openstack/ovn-controller-ovs-jgdxf" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.573317 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0f30be1f-7fe1-40f2-89d3-23cfca972041-var-run-ovn\") pod \"ovn-controller-2vh4d\" (UID: \"0f30be1f-7fe1-40f2-89d3-23cfca972041\") " pod="openstack/ovn-controller-2vh4d" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.573384 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0f30be1f-7fe1-40f2-89d3-23cfca972041-var-run-ovn\") pod \"ovn-controller-2vh4d\" (UID: \"0f30be1f-7fe1-40f2-89d3-23cfca972041\") " pod="openstack/ovn-controller-2vh4d" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.573418 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0f30be1f-7fe1-40f2-89d3-23cfca972041-var-run\") pod \"ovn-controller-2vh4d\" (UID: \"0f30be1f-7fe1-40f2-89d3-23cfca972041\") " pod="openstack/ovn-controller-2vh4d" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.573508 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0f30be1f-7fe1-40f2-89d3-23cfca972041-var-run\") pod \"ovn-controller-2vh4d\" 
(UID: \"0f30be1f-7fe1-40f2-89d3-23cfca972041\") " pod="openstack/ovn-controller-2vh4d" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.573558 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8qlf\" (UniqueName: \"kubernetes.io/projected/bcbf23b5-3226-43c6-b8d3-6ba72b955eda-kube-api-access-p8qlf\") pod \"ovn-controller-ovs-jgdxf\" (UID: \"bcbf23b5-3226-43c6-b8d3-6ba72b955eda\") " pod="openstack/ovn-controller-ovs-jgdxf" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.573589 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcmkf\" (UniqueName: \"kubernetes.io/projected/0f30be1f-7fe1-40f2-89d3-23cfca972041-kube-api-access-rcmkf\") pod \"ovn-controller-2vh4d\" (UID: \"0f30be1f-7fe1-40f2-89d3-23cfca972041\") " pod="openstack/ovn-controller-2vh4d" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.575054 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f30be1f-7fe1-40f2-89d3-23cfca972041-scripts\") pod \"ovn-controller-2vh4d\" (UID: \"0f30be1f-7fe1-40f2-89d3-23cfca972041\") " pod="openstack/ovn-controller-2vh4d" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.575763 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bcbf23b5-3226-43c6-b8d3-6ba72b955eda-scripts\") pod \"ovn-controller-ovs-jgdxf\" (UID: \"bcbf23b5-3226-43c6-b8d3-6ba72b955eda\") " pod="openstack/ovn-controller-ovs-jgdxf" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.581379 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f30be1f-7fe1-40f2-89d3-23cfca972041-ovn-controller-tls-certs\") pod \"ovn-controller-2vh4d\" (UID: \"0f30be1f-7fe1-40f2-89d3-23cfca972041\") " pod="openstack/ovn-controller-2vh4d" Feb 02 12:09:06 crc 
kubenswrapper[4909]: I0202 12:09:06.581582 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f30be1f-7fe1-40f2-89d3-23cfca972041-combined-ca-bundle\") pod \"ovn-controller-2vh4d\" (UID: \"0f30be1f-7fe1-40f2-89d3-23cfca972041\") " pod="openstack/ovn-controller-2vh4d" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.589550 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8qlf\" (UniqueName: \"kubernetes.io/projected/bcbf23b5-3226-43c6-b8d3-6ba72b955eda-kube-api-access-p8qlf\") pod \"ovn-controller-ovs-jgdxf\" (UID: \"bcbf23b5-3226-43c6-b8d3-6ba72b955eda\") " pod="openstack/ovn-controller-ovs-jgdxf" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.600360 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcmkf\" (UniqueName: \"kubernetes.io/projected/0f30be1f-7fe1-40f2-89d3-23cfca972041-kube-api-access-rcmkf\") pod \"ovn-controller-2vh4d\" (UID: \"0f30be1f-7fe1-40f2-89d3-23cfca972041\") " pod="openstack/ovn-controller-2vh4d" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.688389 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2vh4d" Feb 02 12:09:06 crc kubenswrapper[4909]: I0202 12:09:06.706893 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-jgdxf" Feb 02 12:09:07 crc kubenswrapper[4909]: I0202 12:09:07.217017 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2vh4d"] Feb 02 12:09:07 crc kubenswrapper[4909]: W0202 12:09:07.533289 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcbf23b5_3226_43c6_b8d3_6ba72b955eda.slice/crio-2db10bebe8c863b3a34cc98ef2222061345a6861d20b1a11723419825c9ac25a WatchSource:0}: Error finding container 2db10bebe8c863b3a34cc98ef2222061345a6861d20b1a11723419825c9ac25a: Status 404 returned error can't find the container with id 2db10bebe8c863b3a34cc98ef2222061345a6861d20b1a11723419825c9ac25a Feb 02 12:09:07 crc kubenswrapper[4909]: I0202 12:09:07.541430 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-jgdxf"] Feb 02 12:09:07 crc kubenswrapper[4909]: I0202 12:09:07.699710 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jgdxf" event={"ID":"bcbf23b5-3226-43c6-b8d3-6ba72b955eda","Type":"ContainerStarted","Data":"2db10bebe8c863b3a34cc98ef2222061345a6861d20b1a11723419825c9ac25a"} Feb 02 12:09:07 crc kubenswrapper[4909]: I0202 12:09:07.703287 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2vh4d" event={"ID":"0f30be1f-7fe1-40f2-89d3-23cfca972041","Type":"ContainerStarted","Data":"e7f4b9d9e87002ac1af50f06938bf2620d6358761288a24d676e51f437331412"} Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.055201 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ffx4m" Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.055326 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ffx4m" Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.140948 4909 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-zkzwg"] Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.142697 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-zkzwg" Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.147374 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.159477 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-zkzwg"] Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.327401 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ae3729-32df-4aac-bc6a-e401e0cb9aa2-config\") pod \"ovn-controller-metrics-zkzwg\" (UID: \"01ae3729-32df-4aac-bc6a-e401e0cb9aa2\") " pod="openstack/ovn-controller-metrics-zkzwg" Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.327449 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/01ae3729-32df-4aac-bc6a-e401e0cb9aa2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-zkzwg\" (UID: \"01ae3729-32df-4aac-bc6a-e401e0cb9aa2\") " pod="openstack/ovn-controller-metrics-zkzwg" Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.327473 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01ae3729-32df-4aac-bc6a-e401e0cb9aa2-combined-ca-bundle\") pod \"ovn-controller-metrics-zkzwg\" (UID: \"01ae3729-32df-4aac-bc6a-e401e0cb9aa2\") " pod="openstack/ovn-controller-metrics-zkzwg" Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.327500 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/01ae3729-32df-4aac-bc6a-e401e0cb9aa2-ovn-rundir\") pod \"ovn-controller-metrics-zkzwg\" (UID: \"01ae3729-32df-4aac-bc6a-e401e0cb9aa2\") " pod="openstack/ovn-controller-metrics-zkzwg" Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.327548 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/01ae3729-32df-4aac-bc6a-e401e0cb9aa2-ovs-rundir\") pod \"ovn-controller-metrics-zkzwg\" (UID: \"01ae3729-32df-4aac-bc6a-e401e0cb9aa2\") " pod="openstack/ovn-controller-metrics-zkzwg" Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.327594 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lvpk\" (UniqueName: \"kubernetes.io/projected/01ae3729-32df-4aac-bc6a-e401e0cb9aa2-kube-api-access-9lvpk\") pod \"ovn-controller-metrics-zkzwg\" (UID: \"01ae3729-32df-4aac-bc6a-e401e0cb9aa2\") " pod="openstack/ovn-controller-metrics-zkzwg" Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.429906 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ae3729-32df-4aac-bc6a-e401e0cb9aa2-config\") pod \"ovn-controller-metrics-zkzwg\" (UID: \"01ae3729-32df-4aac-bc6a-e401e0cb9aa2\") " pod="openstack/ovn-controller-metrics-zkzwg" Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.429979 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/01ae3729-32df-4aac-bc6a-e401e0cb9aa2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-zkzwg\" (UID: \"01ae3729-32df-4aac-bc6a-e401e0cb9aa2\") " pod="openstack/ovn-controller-metrics-zkzwg" Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.430007 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01ae3729-32df-4aac-bc6a-e401e0cb9aa2-combined-ca-bundle\") pod \"ovn-controller-metrics-zkzwg\" (UID: \"01ae3729-32df-4aac-bc6a-e401e0cb9aa2\") " pod="openstack/ovn-controller-metrics-zkzwg" Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.430041 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/01ae3729-32df-4aac-bc6a-e401e0cb9aa2-ovn-rundir\") pod \"ovn-controller-metrics-zkzwg\" (UID: \"01ae3729-32df-4aac-bc6a-e401e0cb9aa2\") " pod="openstack/ovn-controller-metrics-zkzwg" Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.430102 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/01ae3729-32df-4aac-bc6a-e401e0cb9aa2-ovs-rundir\") pod \"ovn-controller-metrics-zkzwg\" (UID: \"01ae3729-32df-4aac-bc6a-e401e0cb9aa2\") " pod="openstack/ovn-controller-metrics-zkzwg" Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.430161 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lvpk\" (UniqueName: \"kubernetes.io/projected/01ae3729-32df-4aac-bc6a-e401e0cb9aa2-kube-api-access-9lvpk\") pod \"ovn-controller-metrics-zkzwg\" (UID: \"01ae3729-32df-4aac-bc6a-e401e0cb9aa2\") " pod="openstack/ovn-controller-metrics-zkzwg" Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.431447 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ae3729-32df-4aac-bc6a-e401e0cb9aa2-config\") pod \"ovn-controller-metrics-zkzwg\" (UID: \"01ae3729-32df-4aac-bc6a-e401e0cb9aa2\") " pod="openstack/ovn-controller-metrics-zkzwg" Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.432075 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/host-path/01ae3729-32df-4aac-bc6a-e401e0cb9aa2-ovn-rundir\") pod \"ovn-controller-metrics-zkzwg\" (UID: \"01ae3729-32df-4aac-bc6a-e401e0cb9aa2\") " pod="openstack/ovn-controller-metrics-zkzwg" Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.432278 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/01ae3729-32df-4aac-bc6a-e401e0cb9aa2-ovs-rundir\") pod \"ovn-controller-metrics-zkzwg\" (UID: \"01ae3729-32df-4aac-bc6a-e401e0cb9aa2\") " pod="openstack/ovn-controller-metrics-zkzwg" Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.440537 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01ae3729-32df-4aac-bc6a-e401e0cb9aa2-combined-ca-bundle\") pod \"ovn-controller-metrics-zkzwg\" (UID: \"01ae3729-32df-4aac-bc6a-e401e0cb9aa2\") " pod="openstack/ovn-controller-metrics-zkzwg" Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.458866 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/01ae3729-32df-4aac-bc6a-e401e0cb9aa2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-zkzwg\" (UID: \"01ae3729-32df-4aac-bc6a-e401e0cb9aa2\") " pod="openstack/ovn-controller-metrics-zkzwg" Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.460275 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lvpk\" (UniqueName: \"kubernetes.io/projected/01ae3729-32df-4aac-bc6a-e401e0cb9aa2-kube-api-access-9lvpk\") pod \"ovn-controller-metrics-zkzwg\" (UID: \"01ae3729-32df-4aac-bc6a-e401e0cb9aa2\") " pod="openstack/ovn-controller-metrics-zkzwg" Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.491200 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-zkzwg" Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.717141 4909 generic.go:334] "Generic (PLEG): container finished" podID="bcbf23b5-3226-43c6-b8d3-6ba72b955eda" containerID="c59cc89cd0887d4fe8ec809f0b22ca42a68f09dc805a253493319792b632b129" exitCode=0 Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.717427 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jgdxf" event={"ID":"bcbf23b5-3226-43c6-b8d3-6ba72b955eda","Type":"ContainerDied","Data":"c59cc89cd0887d4fe8ec809f0b22ca42a68f09dc805a253493319792b632b129"} Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.732116 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2vh4d" event={"ID":"0f30be1f-7fe1-40f2-89d3-23cfca972041","Type":"ContainerStarted","Data":"15aab2237b53d98b72c06ca8fa59be9512aa844bfa6ed632347ef37a8699b110"} Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.732189 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-2vh4d" Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.765445 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-2vh4d" podStartSLOduration=2.76542159 podStartE2EDuration="2.76542159s" podCreationTimestamp="2026-02-02 12:09:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:09:08.76539247 +0000 UTC m=+5874.511493225" watchObservedRunningTime="2026-02-02 12:09:08.76542159 +0000 UTC m=+5874.511522325" Feb 02 12:09:08 crc kubenswrapper[4909]: I0202 12:09:08.976930 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-zkzwg"] Feb 02 12:09:08 crc kubenswrapper[4909]: W0202 12:09:08.993191 4909 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01ae3729_32df_4aac_bc6a_e401e0cb9aa2.slice/crio-199d6f523e6c26033d4d053dc67d869f6c481a2cbae1a57b5e2f0f722ca8bb08 WatchSource:0}: Error finding container 199d6f523e6c26033d4d053dc67d869f6c481a2cbae1a57b5e2f0f722ca8bb08: Status 404 returned error can't find the container with id 199d6f523e6c26033d4d053dc67d869f6c481a2cbae1a57b5e2f0f722ca8bb08 Feb 02 12:09:09 crc kubenswrapper[4909]: I0202 12:09:09.112566 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ffx4m" podUID="32bf17e5-c734-4f5d-99cc-3e934dc0be5d" containerName="registry-server" probeResult="failure" output=< Feb 02 12:09:09 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Feb 02 12:09:09 crc kubenswrapper[4909]: > Feb 02 12:09:09 crc kubenswrapper[4909]: I0202 12:09:09.736681 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jgdxf" event={"ID":"bcbf23b5-3226-43c6-b8d3-6ba72b955eda","Type":"ContainerStarted","Data":"546ced832eb64e2b76872143a29ef5e4efce1e2d185839d2b4267649c16a7a48"} Feb 02 12:09:09 crc kubenswrapper[4909]: I0202 12:09:09.737134 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-jgdxf" Feb 02 12:09:09 crc kubenswrapper[4909]: I0202 12:09:09.737150 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jgdxf" event={"ID":"bcbf23b5-3226-43c6-b8d3-6ba72b955eda","Type":"ContainerStarted","Data":"b6cda042b44a704f1d5d9ed8af6fd41cea34a12a71cdd4aa824dba3f2fabe78e"} Feb 02 12:09:09 crc kubenswrapper[4909]: I0202 12:09:09.737167 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-jgdxf" Feb 02 12:09:09 crc kubenswrapper[4909]: I0202 12:09:09.738782 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-zkzwg" 
event={"ID":"01ae3729-32df-4aac-bc6a-e401e0cb9aa2","Type":"ContainerStarted","Data":"1fe2f41f9d02dbee6eb852a05b0fc87b8b1dd11c27fe20adcb700c8919e759f5"} Feb 02 12:09:09 crc kubenswrapper[4909]: I0202 12:09:09.738872 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-zkzwg" event={"ID":"01ae3729-32df-4aac-bc6a-e401e0cb9aa2","Type":"ContainerStarted","Data":"199d6f523e6c26033d4d053dc67d869f6c481a2cbae1a57b5e2f0f722ca8bb08"} Feb 02 12:09:09 crc kubenswrapper[4909]: I0202 12:09:09.764631 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-jgdxf" podStartSLOduration=3.764610201 podStartE2EDuration="3.764610201s" podCreationTimestamp="2026-02-02 12:09:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:09:09.760883925 +0000 UTC m=+5875.506984770" watchObservedRunningTime="2026-02-02 12:09:09.764610201 +0000 UTC m=+5875.510710936" Feb 02 12:09:09 crc kubenswrapper[4909]: I0202 12:09:09.779517 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-zkzwg" podStartSLOduration=1.779494334 podStartE2EDuration="1.779494334s" podCreationTimestamp="2026-02-02 12:09:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:09:09.778334481 +0000 UTC m=+5875.524435226" watchObservedRunningTime="2026-02-02 12:09:09.779494334 +0000 UTC m=+5875.525595069" Feb 02 12:09:10 crc kubenswrapper[4909]: I0202 12:09:10.030137 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-2tcsj"] Feb 02 12:09:10 crc kubenswrapper[4909]: I0202 12:09:10.041094 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-2tcsj"] Feb 02 12:09:11 crc kubenswrapper[4909]: I0202 12:09:11.027171 4909 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce660c78-0b8c-4d99-aa8f-f03338f8d887" path="/var/lib/kubelet/pods/ce660c78-0b8c-4d99-aa8f-f03338f8d887/volumes" Feb 02 12:09:13 crc kubenswrapper[4909]: I0202 12:09:13.017787 4909 scope.go:117] "RemoveContainer" containerID="2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783" Feb 02 12:09:13 crc kubenswrapper[4909]: E0202 12:09:13.019656 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:09:19 crc kubenswrapper[4909]: I0202 12:09:19.112663 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ffx4m" podUID="32bf17e5-c734-4f5d-99cc-3e934dc0be5d" containerName="registry-server" probeResult="failure" output=< Feb 02 12:09:19 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Feb 02 12:09:19 crc kubenswrapper[4909]: > Feb 02 12:09:21 crc kubenswrapper[4909]: I0202 12:09:21.518512 4909 scope.go:117] "RemoveContainer" containerID="964b1fd9c5945004adb5e8710440af70b154a112941bf9a64a64b0441cb9f2c9" Feb 02 12:09:21 crc kubenswrapper[4909]: I0202 12:09:21.551286 4909 scope.go:117] "RemoveContainer" containerID="1456a2c0e7924fdc4e90027876166a4c836dbc12a2bb519dba885c39a8a6c9af" Feb 02 12:09:21 crc kubenswrapper[4909]: I0202 12:09:21.590558 4909 scope.go:117] "RemoveContainer" containerID="b992f4fd6affa56df4ee86e6e874d8b121694670e52820573223a18390617a11" Feb 02 12:09:23 crc kubenswrapper[4909]: I0202 12:09:23.060943 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-js4dl"] Feb 02 12:09:23 crc 
kubenswrapper[4909]: I0202 12:09:23.073131 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-js4dl"] Feb 02 12:09:25 crc kubenswrapper[4909]: I0202 12:09:25.024358 4909 scope.go:117] "RemoveContainer" containerID="2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783" Feb 02 12:09:25 crc kubenswrapper[4909]: E0202 12:09:25.024893 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:09:25 crc kubenswrapper[4909]: I0202 12:09:25.029638 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="822011f9-e6a3-4a9e-9efd-57a0615fbe69" path="/var/lib/kubelet/pods/822011f9-e6a3-4a9e-9efd-57a0615fbe69/volumes" Feb 02 12:09:28 crc kubenswrapper[4909]: I0202 12:09:28.099598 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ffx4m" Feb 02 12:09:28 crc kubenswrapper[4909]: I0202 12:09:28.161999 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ffx4m" Feb 02 12:09:28 crc kubenswrapper[4909]: I0202 12:09:28.926506 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ffx4m"] Feb 02 12:09:29 crc kubenswrapper[4909]: I0202 12:09:29.913796 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-8brcl"] Feb 02 12:09:29 crc kubenswrapper[4909]: I0202 12:09:29.915675 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-8brcl" Feb 02 12:09:29 crc kubenswrapper[4909]: I0202 12:09:29.924060 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-8brcl"] Feb 02 12:09:29 crc kubenswrapper[4909]: I0202 12:09:29.953794 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ffx4m" podUID="32bf17e5-c734-4f5d-99cc-3e934dc0be5d" containerName="registry-server" containerID="cri-o://76ef30c7f3b838d27f4ed4a54f6fc9dba9158077b2788fae7e50523432e48ace" gracePeriod=2 Feb 02 12:09:30 crc kubenswrapper[4909]: I0202 12:09:30.055332 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/138ed7b8-dd8c-43f5-a928-092f3d8dd670-operator-scripts\") pod \"octavia-db-create-8brcl\" (UID: \"138ed7b8-dd8c-43f5-a928-092f3d8dd670\") " pod="openstack/octavia-db-create-8brcl" Feb 02 12:09:30 crc kubenswrapper[4909]: I0202 12:09:30.055665 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6pc7\" (UniqueName: \"kubernetes.io/projected/138ed7b8-dd8c-43f5-a928-092f3d8dd670-kube-api-access-q6pc7\") pod \"octavia-db-create-8brcl\" (UID: \"138ed7b8-dd8c-43f5-a928-092f3d8dd670\") " pod="openstack/octavia-db-create-8brcl" Feb 02 12:09:30 crc kubenswrapper[4909]: I0202 12:09:30.158933 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/138ed7b8-dd8c-43f5-a928-092f3d8dd670-operator-scripts\") pod \"octavia-db-create-8brcl\" (UID: \"138ed7b8-dd8c-43f5-a928-092f3d8dd670\") " pod="openstack/octavia-db-create-8brcl" Feb 02 12:09:30 crc kubenswrapper[4909]: I0202 12:09:30.159561 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6pc7\" (UniqueName: 
\"kubernetes.io/projected/138ed7b8-dd8c-43f5-a928-092f3d8dd670-kube-api-access-q6pc7\") pod \"octavia-db-create-8brcl\" (UID: \"138ed7b8-dd8c-43f5-a928-092f3d8dd670\") " pod="openstack/octavia-db-create-8brcl" Feb 02 12:09:30 crc kubenswrapper[4909]: I0202 12:09:30.159762 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/138ed7b8-dd8c-43f5-a928-092f3d8dd670-operator-scripts\") pod \"octavia-db-create-8brcl\" (UID: \"138ed7b8-dd8c-43f5-a928-092f3d8dd670\") " pod="openstack/octavia-db-create-8brcl" Feb 02 12:09:30 crc kubenswrapper[4909]: I0202 12:09:30.202264 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6pc7\" (UniqueName: \"kubernetes.io/projected/138ed7b8-dd8c-43f5-a928-092f3d8dd670-kube-api-access-q6pc7\") pod \"octavia-db-create-8brcl\" (UID: \"138ed7b8-dd8c-43f5-a928-092f3d8dd670\") " pod="openstack/octavia-db-create-8brcl" Feb 02 12:09:30 crc kubenswrapper[4909]: I0202 12:09:30.249226 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-8brcl" Feb 02 12:09:30 crc kubenswrapper[4909]: I0202 12:09:30.523288 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ffx4m" Feb 02 12:09:30 crc kubenswrapper[4909]: I0202 12:09:30.671630 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32bf17e5-c734-4f5d-99cc-3e934dc0be5d-utilities\") pod \"32bf17e5-c734-4f5d-99cc-3e934dc0be5d\" (UID: \"32bf17e5-c734-4f5d-99cc-3e934dc0be5d\") " Feb 02 12:09:30 crc kubenswrapper[4909]: I0202 12:09:30.672022 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph5m5\" (UniqueName: \"kubernetes.io/projected/32bf17e5-c734-4f5d-99cc-3e934dc0be5d-kube-api-access-ph5m5\") pod \"32bf17e5-c734-4f5d-99cc-3e934dc0be5d\" (UID: \"32bf17e5-c734-4f5d-99cc-3e934dc0be5d\") " Feb 02 12:09:30 crc kubenswrapper[4909]: I0202 12:09:30.672047 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32bf17e5-c734-4f5d-99cc-3e934dc0be5d-catalog-content\") pod \"32bf17e5-c734-4f5d-99cc-3e934dc0be5d\" (UID: \"32bf17e5-c734-4f5d-99cc-3e934dc0be5d\") " Feb 02 12:09:30 crc kubenswrapper[4909]: I0202 12:09:30.672766 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32bf17e5-c734-4f5d-99cc-3e934dc0be5d-utilities" (OuterVolumeSpecName: "utilities") pod "32bf17e5-c734-4f5d-99cc-3e934dc0be5d" (UID: "32bf17e5-c734-4f5d-99cc-3e934dc0be5d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:09:30 crc kubenswrapper[4909]: I0202 12:09:30.679109 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32bf17e5-c734-4f5d-99cc-3e934dc0be5d-kube-api-access-ph5m5" (OuterVolumeSpecName: "kube-api-access-ph5m5") pod "32bf17e5-c734-4f5d-99cc-3e934dc0be5d" (UID: "32bf17e5-c734-4f5d-99cc-3e934dc0be5d"). InnerVolumeSpecName "kube-api-access-ph5m5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:09:30 crc kubenswrapper[4909]: I0202 12:09:30.773861 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32bf17e5-c734-4f5d-99cc-3e934dc0be5d-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 12:09:30 crc kubenswrapper[4909]: I0202 12:09:30.773893 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ph5m5\" (UniqueName: \"kubernetes.io/projected/32bf17e5-c734-4f5d-99cc-3e934dc0be5d-kube-api-access-ph5m5\") on node \"crc\" DevicePath \"\"" Feb 02 12:09:30 crc kubenswrapper[4909]: I0202 12:09:30.806013 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32bf17e5-c734-4f5d-99cc-3e934dc0be5d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "32bf17e5-c734-4f5d-99cc-3e934dc0be5d" (UID: "32bf17e5-c734-4f5d-99cc-3e934dc0be5d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:09:30 crc kubenswrapper[4909]: I0202 12:09:30.875934 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32bf17e5-c734-4f5d-99cc-3e934dc0be5d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 12:09:30 crc kubenswrapper[4909]: W0202 12:09:30.906384 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod138ed7b8_dd8c_43f5_a928_092f3d8dd670.slice/crio-4aa7858fc63c01f06c57d2e412d732f66b62a45e77ee7ced1d305e1dd2aba384 WatchSource:0}: Error finding container 4aa7858fc63c01f06c57d2e412d732f66b62a45e77ee7ced1d305e1dd2aba384: Status 404 returned error can't find the container with id 4aa7858fc63c01f06c57d2e412d732f66b62a45e77ee7ced1d305e1dd2aba384 Feb 02 12:09:30 crc kubenswrapper[4909]: I0202 12:09:30.907272 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/octavia-db-create-8brcl"] Feb 02 12:09:30 crc kubenswrapper[4909]: I0202 12:09:30.966999 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-8brcl" event={"ID":"138ed7b8-dd8c-43f5-a928-092f3d8dd670","Type":"ContainerStarted","Data":"4aa7858fc63c01f06c57d2e412d732f66b62a45e77ee7ced1d305e1dd2aba384"} Feb 02 12:09:30 crc kubenswrapper[4909]: I0202 12:09:30.969886 4909 generic.go:334] "Generic (PLEG): container finished" podID="32bf17e5-c734-4f5d-99cc-3e934dc0be5d" containerID="76ef30c7f3b838d27f4ed4a54f6fc9dba9158077b2788fae7e50523432e48ace" exitCode=0 Feb 02 12:09:30 crc kubenswrapper[4909]: I0202 12:09:30.970074 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ffx4m" event={"ID":"32bf17e5-c734-4f5d-99cc-3e934dc0be5d","Type":"ContainerDied","Data":"76ef30c7f3b838d27f4ed4a54f6fc9dba9158077b2788fae7e50523432e48ace"} Feb 02 12:09:30 crc kubenswrapper[4909]: I0202 12:09:30.970279 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ffx4m" event={"ID":"32bf17e5-c734-4f5d-99cc-3e934dc0be5d","Type":"ContainerDied","Data":"db5513e8e45856f8b1143da043ee46352de3916dfd7f56dc5563c6548f35378e"} Feb 02 12:09:30 crc kubenswrapper[4909]: I0202 12:09:30.970356 4909 scope.go:117] "RemoveContainer" containerID="76ef30c7f3b838d27f4ed4a54f6fc9dba9158077b2788fae7e50523432e48ace" Feb 02 12:09:30 crc kubenswrapper[4909]: I0202 12:09:30.971073 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ffx4m" Feb 02 12:09:30 crc kubenswrapper[4909]: I0202 12:09:30.995079 4909 scope.go:117] "RemoveContainer" containerID="82c2913e5c32d9f18bb21425fb042e8293b9560b0458b751a31d601b8b09bbe1" Feb 02 12:09:31 crc kubenswrapper[4909]: I0202 12:09:31.002267 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-f673-account-create-update-hn76l"] Feb 02 12:09:31 crc kubenswrapper[4909]: E0202 12:09:31.002698 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32bf17e5-c734-4f5d-99cc-3e934dc0be5d" containerName="extract-content" Feb 02 12:09:31 crc kubenswrapper[4909]: I0202 12:09:31.002714 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="32bf17e5-c734-4f5d-99cc-3e934dc0be5d" containerName="extract-content" Feb 02 12:09:31 crc kubenswrapper[4909]: E0202 12:09:31.002730 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32bf17e5-c734-4f5d-99cc-3e934dc0be5d" containerName="extract-utilities" Feb 02 12:09:31 crc kubenswrapper[4909]: I0202 12:09:31.002737 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="32bf17e5-c734-4f5d-99cc-3e934dc0be5d" containerName="extract-utilities" Feb 02 12:09:31 crc kubenswrapper[4909]: E0202 12:09:31.002749 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32bf17e5-c734-4f5d-99cc-3e934dc0be5d" containerName="registry-server" Feb 02 12:09:31 crc kubenswrapper[4909]: I0202 12:09:31.002756 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="32bf17e5-c734-4f5d-99cc-3e934dc0be5d" containerName="registry-server" Feb 02 12:09:31 crc kubenswrapper[4909]: I0202 12:09:31.002951 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="32bf17e5-c734-4f5d-99cc-3e934dc0be5d" containerName="registry-server" Feb 02 12:09:31 crc kubenswrapper[4909]: I0202 12:09:31.003550 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-f673-account-create-update-hn76l" Feb 02 12:09:31 crc kubenswrapper[4909]: I0202 12:09:31.006034 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Feb 02 12:09:31 crc kubenswrapper[4909]: I0202 12:09:31.036116 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-f673-account-create-update-hn76l"] Feb 02 12:09:31 crc kubenswrapper[4909]: I0202 12:09:31.039920 4909 scope.go:117] "RemoveContainer" containerID="5b56ce7206c25b6ab070255e268e1b054529ca33bed65ea5cbc5d278d37d14a6" Feb 02 12:09:31 crc kubenswrapper[4909]: I0202 12:09:31.062886 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ffx4m"] Feb 02 12:09:31 crc kubenswrapper[4909]: I0202 12:09:31.071958 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ffx4m"] Feb 02 12:09:31 crc kubenswrapper[4909]: I0202 12:09:31.072950 4909 scope.go:117] "RemoveContainer" containerID="76ef30c7f3b838d27f4ed4a54f6fc9dba9158077b2788fae7e50523432e48ace" Feb 02 12:09:31 crc kubenswrapper[4909]: E0202 12:09:31.076973 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76ef30c7f3b838d27f4ed4a54f6fc9dba9158077b2788fae7e50523432e48ace\": container with ID starting with 76ef30c7f3b838d27f4ed4a54f6fc9dba9158077b2788fae7e50523432e48ace not found: ID does not exist" containerID="76ef30c7f3b838d27f4ed4a54f6fc9dba9158077b2788fae7e50523432e48ace" Feb 02 12:09:31 crc kubenswrapper[4909]: I0202 12:09:31.077037 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76ef30c7f3b838d27f4ed4a54f6fc9dba9158077b2788fae7e50523432e48ace"} err="failed to get container status \"76ef30c7f3b838d27f4ed4a54f6fc9dba9158077b2788fae7e50523432e48ace\": rpc error: code = NotFound desc = could not find container 
\"76ef30c7f3b838d27f4ed4a54f6fc9dba9158077b2788fae7e50523432e48ace\": container with ID starting with 76ef30c7f3b838d27f4ed4a54f6fc9dba9158077b2788fae7e50523432e48ace not found: ID does not exist" Feb 02 12:09:31 crc kubenswrapper[4909]: I0202 12:09:31.077091 4909 scope.go:117] "RemoveContainer" containerID="82c2913e5c32d9f18bb21425fb042e8293b9560b0458b751a31d601b8b09bbe1" Feb 02 12:09:31 crc kubenswrapper[4909]: E0202 12:09:31.081116 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82c2913e5c32d9f18bb21425fb042e8293b9560b0458b751a31d601b8b09bbe1\": container with ID starting with 82c2913e5c32d9f18bb21425fb042e8293b9560b0458b751a31d601b8b09bbe1 not found: ID does not exist" containerID="82c2913e5c32d9f18bb21425fb042e8293b9560b0458b751a31d601b8b09bbe1" Feb 02 12:09:31 crc kubenswrapper[4909]: I0202 12:09:31.081159 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82c2913e5c32d9f18bb21425fb042e8293b9560b0458b751a31d601b8b09bbe1"} err="failed to get container status \"82c2913e5c32d9f18bb21425fb042e8293b9560b0458b751a31d601b8b09bbe1\": rpc error: code = NotFound desc = could not find container \"82c2913e5c32d9f18bb21425fb042e8293b9560b0458b751a31d601b8b09bbe1\": container with ID starting with 82c2913e5c32d9f18bb21425fb042e8293b9560b0458b751a31d601b8b09bbe1 not found: ID does not exist" Feb 02 12:09:31 crc kubenswrapper[4909]: I0202 12:09:31.081186 4909 scope.go:117] "RemoveContainer" containerID="5b56ce7206c25b6ab070255e268e1b054529ca33bed65ea5cbc5d278d37d14a6" Feb 02 12:09:31 crc kubenswrapper[4909]: E0202 12:09:31.088490 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b56ce7206c25b6ab070255e268e1b054529ca33bed65ea5cbc5d278d37d14a6\": container with ID starting with 5b56ce7206c25b6ab070255e268e1b054529ca33bed65ea5cbc5d278d37d14a6 not found: ID does not exist" 
containerID="5b56ce7206c25b6ab070255e268e1b054529ca33bed65ea5cbc5d278d37d14a6" Feb 02 12:09:31 crc kubenswrapper[4909]: I0202 12:09:31.088552 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b56ce7206c25b6ab070255e268e1b054529ca33bed65ea5cbc5d278d37d14a6"} err="failed to get container status \"5b56ce7206c25b6ab070255e268e1b054529ca33bed65ea5cbc5d278d37d14a6\": rpc error: code = NotFound desc = could not find container \"5b56ce7206c25b6ab070255e268e1b054529ca33bed65ea5cbc5d278d37d14a6\": container with ID starting with 5b56ce7206c25b6ab070255e268e1b054529ca33bed65ea5cbc5d278d37d14a6 not found: ID does not exist" Feb 02 12:09:31 crc kubenswrapper[4909]: I0202 12:09:31.189654 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdpgj\" (UniqueName: \"kubernetes.io/projected/e5683dbc-e166-419a-97a4-6ae32908deff-kube-api-access-kdpgj\") pod \"octavia-f673-account-create-update-hn76l\" (UID: \"e5683dbc-e166-419a-97a4-6ae32908deff\") " pod="openstack/octavia-f673-account-create-update-hn76l" Feb 02 12:09:31 crc kubenswrapper[4909]: I0202 12:09:31.189770 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5683dbc-e166-419a-97a4-6ae32908deff-operator-scripts\") pod \"octavia-f673-account-create-update-hn76l\" (UID: \"e5683dbc-e166-419a-97a4-6ae32908deff\") " pod="openstack/octavia-f673-account-create-update-hn76l" Feb 02 12:09:31 crc kubenswrapper[4909]: I0202 12:09:31.292312 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5683dbc-e166-419a-97a4-6ae32908deff-operator-scripts\") pod \"octavia-f673-account-create-update-hn76l\" (UID: \"e5683dbc-e166-419a-97a4-6ae32908deff\") " pod="openstack/octavia-f673-account-create-update-hn76l" Feb 02 12:09:31 crc 
kubenswrapper[4909]: I0202 12:09:31.292543 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdpgj\" (UniqueName: \"kubernetes.io/projected/e5683dbc-e166-419a-97a4-6ae32908deff-kube-api-access-kdpgj\") pod \"octavia-f673-account-create-update-hn76l\" (UID: \"e5683dbc-e166-419a-97a4-6ae32908deff\") " pod="openstack/octavia-f673-account-create-update-hn76l" Feb 02 12:09:31 crc kubenswrapper[4909]: I0202 12:09:31.293152 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5683dbc-e166-419a-97a4-6ae32908deff-operator-scripts\") pod \"octavia-f673-account-create-update-hn76l\" (UID: \"e5683dbc-e166-419a-97a4-6ae32908deff\") " pod="openstack/octavia-f673-account-create-update-hn76l" Feb 02 12:09:31 crc kubenswrapper[4909]: I0202 12:09:31.311030 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdpgj\" (UniqueName: \"kubernetes.io/projected/e5683dbc-e166-419a-97a4-6ae32908deff-kube-api-access-kdpgj\") pod \"octavia-f673-account-create-update-hn76l\" (UID: \"e5683dbc-e166-419a-97a4-6ae32908deff\") " pod="openstack/octavia-f673-account-create-update-hn76l" Feb 02 12:09:31 crc kubenswrapper[4909]: I0202 12:09:31.337290 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-f673-account-create-update-hn76l" Feb 02 12:09:31 crc kubenswrapper[4909]: I0202 12:09:31.793827 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-f673-account-create-update-hn76l"] Feb 02 12:09:31 crc kubenswrapper[4909]: W0202 12:09:31.798154 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5683dbc_e166_419a_97a4_6ae32908deff.slice/crio-00dcc85ba3cd8ae544bb45557ac1b13aa438005c595fc28bffebf173ee32bfba WatchSource:0}: Error finding container 00dcc85ba3cd8ae544bb45557ac1b13aa438005c595fc28bffebf173ee32bfba: Status 404 returned error can't find the container with id 00dcc85ba3cd8ae544bb45557ac1b13aa438005c595fc28bffebf173ee32bfba Feb 02 12:09:31 crc kubenswrapper[4909]: I0202 12:09:31.987236 4909 generic.go:334] "Generic (PLEG): container finished" podID="138ed7b8-dd8c-43f5-a928-092f3d8dd670" containerID="e7ba5ff8af450cd2ed5d8fd43c29c4180d2f9c17d21310f46bdcfd4f4ac0eaa7" exitCode=0 Feb 02 12:09:31 crc kubenswrapper[4909]: I0202 12:09:31.987541 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-8brcl" event={"ID":"138ed7b8-dd8c-43f5-a928-092f3d8dd670","Type":"ContainerDied","Data":"e7ba5ff8af450cd2ed5d8fd43c29c4180d2f9c17d21310f46bdcfd4f4ac0eaa7"} Feb 02 12:09:31 crc kubenswrapper[4909]: I0202 12:09:31.990100 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-f673-account-create-update-hn76l" event={"ID":"e5683dbc-e166-419a-97a4-6ae32908deff","Type":"ContainerStarted","Data":"00dcc85ba3cd8ae544bb45557ac1b13aa438005c595fc28bffebf173ee32bfba"} Feb 02 12:09:33 crc kubenswrapper[4909]: I0202 12:09:33.000836 4909 generic.go:334] "Generic (PLEG): container finished" podID="e5683dbc-e166-419a-97a4-6ae32908deff" containerID="495eefbb942e4827063b82e831562cc26299407117ec95bec19891bde00bd1bd" exitCode=0 Feb 02 12:09:33 crc kubenswrapper[4909]: I0202 
12:09:33.001434 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-f673-account-create-update-hn76l" event={"ID":"e5683dbc-e166-419a-97a4-6ae32908deff","Type":"ContainerDied","Data":"495eefbb942e4827063b82e831562cc26299407117ec95bec19891bde00bd1bd"} Feb 02 12:09:33 crc kubenswrapper[4909]: I0202 12:09:33.034723 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32bf17e5-c734-4f5d-99cc-3e934dc0be5d" path="/var/lib/kubelet/pods/32bf17e5-c734-4f5d-99cc-3e934dc0be5d/volumes" Feb 02 12:09:33 crc kubenswrapper[4909]: I0202 12:09:33.372558 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-8brcl" Feb 02 12:09:33 crc kubenswrapper[4909]: I0202 12:09:33.533628 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/138ed7b8-dd8c-43f5-a928-092f3d8dd670-operator-scripts\") pod \"138ed7b8-dd8c-43f5-a928-092f3d8dd670\" (UID: \"138ed7b8-dd8c-43f5-a928-092f3d8dd670\") " Feb 02 12:09:33 crc kubenswrapper[4909]: I0202 12:09:33.533960 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6pc7\" (UniqueName: \"kubernetes.io/projected/138ed7b8-dd8c-43f5-a928-092f3d8dd670-kube-api-access-q6pc7\") pod \"138ed7b8-dd8c-43f5-a928-092f3d8dd670\" (UID: \"138ed7b8-dd8c-43f5-a928-092f3d8dd670\") " Feb 02 12:09:33 crc kubenswrapper[4909]: I0202 12:09:33.534040 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/138ed7b8-dd8c-43f5-a928-092f3d8dd670-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "138ed7b8-dd8c-43f5-a928-092f3d8dd670" (UID: "138ed7b8-dd8c-43f5-a928-092f3d8dd670"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:09:33 crc kubenswrapper[4909]: I0202 12:09:33.534492 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/138ed7b8-dd8c-43f5-a928-092f3d8dd670-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:09:33 crc kubenswrapper[4909]: I0202 12:09:33.542740 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/138ed7b8-dd8c-43f5-a928-092f3d8dd670-kube-api-access-q6pc7" (OuterVolumeSpecName: "kube-api-access-q6pc7") pod "138ed7b8-dd8c-43f5-a928-092f3d8dd670" (UID: "138ed7b8-dd8c-43f5-a928-092f3d8dd670"). InnerVolumeSpecName "kube-api-access-q6pc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:09:33 crc kubenswrapper[4909]: I0202 12:09:33.636588 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6pc7\" (UniqueName: \"kubernetes.io/projected/138ed7b8-dd8c-43f5-a928-092f3d8dd670-kube-api-access-q6pc7\") on node \"crc\" DevicePath \"\"" Feb 02 12:09:34 crc kubenswrapper[4909]: I0202 12:09:34.013373 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-8brcl" event={"ID":"138ed7b8-dd8c-43f5-a928-092f3d8dd670","Type":"ContainerDied","Data":"4aa7858fc63c01f06c57d2e412d732f66b62a45e77ee7ced1d305e1dd2aba384"} Feb 02 12:09:34 crc kubenswrapper[4909]: I0202 12:09:34.016612 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4aa7858fc63c01f06c57d2e412d732f66b62a45e77ee7ced1d305e1dd2aba384" Feb 02 12:09:34 crc kubenswrapper[4909]: I0202 12:09:34.013428 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-8brcl" Feb 02 12:09:34 crc kubenswrapper[4909]: I0202 12:09:34.314187 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-f673-account-create-update-hn76l" Feb 02 12:09:34 crc kubenswrapper[4909]: I0202 12:09:34.449623 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5683dbc-e166-419a-97a4-6ae32908deff-operator-scripts\") pod \"e5683dbc-e166-419a-97a4-6ae32908deff\" (UID: \"e5683dbc-e166-419a-97a4-6ae32908deff\") " Feb 02 12:09:34 crc kubenswrapper[4909]: I0202 12:09:34.449698 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdpgj\" (UniqueName: \"kubernetes.io/projected/e5683dbc-e166-419a-97a4-6ae32908deff-kube-api-access-kdpgj\") pod \"e5683dbc-e166-419a-97a4-6ae32908deff\" (UID: \"e5683dbc-e166-419a-97a4-6ae32908deff\") " Feb 02 12:09:34 crc kubenswrapper[4909]: I0202 12:09:34.450156 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5683dbc-e166-419a-97a4-6ae32908deff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e5683dbc-e166-419a-97a4-6ae32908deff" (UID: "e5683dbc-e166-419a-97a4-6ae32908deff"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:09:34 crc kubenswrapper[4909]: I0202 12:09:34.450701 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5683dbc-e166-419a-97a4-6ae32908deff-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:09:34 crc kubenswrapper[4909]: I0202 12:09:34.453399 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5683dbc-e166-419a-97a4-6ae32908deff-kube-api-access-kdpgj" (OuterVolumeSpecName: "kube-api-access-kdpgj") pod "e5683dbc-e166-419a-97a4-6ae32908deff" (UID: "e5683dbc-e166-419a-97a4-6ae32908deff"). InnerVolumeSpecName "kube-api-access-kdpgj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:09:34 crc kubenswrapper[4909]: I0202 12:09:34.552797 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdpgj\" (UniqueName: \"kubernetes.io/projected/e5683dbc-e166-419a-97a4-6ae32908deff-kube-api-access-kdpgj\") on node \"crc\" DevicePath \"\"" Feb 02 12:09:35 crc kubenswrapper[4909]: I0202 12:09:35.024273 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-f673-account-create-update-hn76l" Feb 02 12:09:35 crc kubenswrapper[4909]: I0202 12:09:35.026962 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-f673-account-create-update-hn76l" event={"ID":"e5683dbc-e166-419a-97a4-6ae32908deff","Type":"ContainerDied","Data":"00dcc85ba3cd8ae544bb45557ac1b13aa438005c595fc28bffebf173ee32bfba"} Feb 02 12:09:35 crc kubenswrapper[4909]: I0202 12:09:35.027003 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00dcc85ba3cd8ae544bb45557ac1b13aa438005c595fc28bffebf173ee32bfba" Feb 02 12:09:36 crc kubenswrapper[4909]: I0202 12:09:36.378742 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-cv7jt"] Feb 02 12:09:36 crc kubenswrapper[4909]: E0202 12:09:36.379442 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="138ed7b8-dd8c-43f5-a928-092f3d8dd670" containerName="mariadb-database-create" Feb 02 12:09:36 crc kubenswrapper[4909]: I0202 12:09:36.379457 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="138ed7b8-dd8c-43f5-a928-092f3d8dd670" containerName="mariadb-database-create" Feb 02 12:09:36 crc kubenswrapper[4909]: E0202 12:09:36.379499 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5683dbc-e166-419a-97a4-6ae32908deff" containerName="mariadb-account-create-update" Feb 02 12:09:36 crc kubenswrapper[4909]: I0202 12:09:36.379507 4909 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e5683dbc-e166-419a-97a4-6ae32908deff" containerName="mariadb-account-create-update" Feb 02 12:09:36 crc kubenswrapper[4909]: I0202 12:09:36.379693 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5683dbc-e166-419a-97a4-6ae32908deff" containerName="mariadb-account-create-update" Feb 02 12:09:36 crc kubenswrapper[4909]: I0202 12:09:36.379718 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="138ed7b8-dd8c-43f5-a928-092f3d8dd670" containerName="mariadb-database-create" Feb 02 12:09:36 crc kubenswrapper[4909]: I0202 12:09:36.380364 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-cv7jt" Feb 02 12:09:36 crc kubenswrapper[4909]: I0202 12:09:36.390653 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-cv7jt"] Feb 02 12:09:36 crc kubenswrapper[4909]: I0202 12:09:36.498476 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d48e572-d205-42b6-87ba-232cea9b3d35-operator-scripts\") pod \"octavia-persistence-db-create-cv7jt\" (UID: \"8d48e572-d205-42b6-87ba-232cea9b3d35\") " pod="openstack/octavia-persistence-db-create-cv7jt" Feb 02 12:09:36 crc kubenswrapper[4909]: I0202 12:09:36.498611 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp82w\" (UniqueName: \"kubernetes.io/projected/8d48e572-d205-42b6-87ba-232cea9b3d35-kube-api-access-rp82w\") pod \"octavia-persistence-db-create-cv7jt\" (UID: \"8d48e572-d205-42b6-87ba-232cea9b3d35\") " pod="openstack/octavia-persistence-db-create-cv7jt" Feb 02 12:09:36 crc kubenswrapper[4909]: I0202 12:09:36.600492 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/8d48e572-d205-42b6-87ba-232cea9b3d35-operator-scripts\") pod \"octavia-persistence-db-create-cv7jt\" (UID: \"8d48e572-d205-42b6-87ba-232cea9b3d35\") " pod="openstack/octavia-persistence-db-create-cv7jt" Feb 02 12:09:36 crc kubenswrapper[4909]: I0202 12:09:36.600585 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp82w\" (UniqueName: \"kubernetes.io/projected/8d48e572-d205-42b6-87ba-232cea9b3d35-kube-api-access-rp82w\") pod \"octavia-persistence-db-create-cv7jt\" (UID: \"8d48e572-d205-42b6-87ba-232cea9b3d35\") " pod="openstack/octavia-persistence-db-create-cv7jt" Feb 02 12:09:36 crc kubenswrapper[4909]: I0202 12:09:36.601335 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d48e572-d205-42b6-87ba-232cea9b3d35-operator-scripts\") pod \"octavia-persistence-db-create-cv7jt\" (UID: \"8d48e572-d205-42b6-87ba-232cea9b3d35\") " pod="openstack/octavia-persistence-db-create-cv7jt" Feb 02 12:09:36 crc kubenswrapper[4909]: I0202 12:09:36.623826 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp82w\" (UniqueName: \"kubernetes.io/projected/8d48e572-d205-42b6-87ba-232cea9b3d35-kube-api-access-rp82w\") pod \"octavia-persistence-db-create-cv7jt\" (UID: \"8d48e572-d205-42b6-87ba-232cea9b3d35\") " pod="openstack/octavia-persistence-db-create-cv7jt" Feb 02 12:09:36 crc kubenswrapper[4909]: I0202 12:09:36.746926 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-cv7jt" Feb 02 12:09:37 crc kubenswrapper[4909]: I0202 12:09:37.016723 4909 scope.go:117] "RemoveContainer" containerID="2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783" Feb 02 12:09:37 crc kubenswrapper[4909]: E0202 12:09:37.017066 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:09:37 crc kubenswrapper[4909]: I0202 12:09:37.193270 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-cv7jt"] Feb 02 12:09:37 crc kubenswrapper[4909]: I0202 12:09:37.650257 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-6b5e-account-create-update-dgmlx"] Feb 02 12:09:37 crc kubenswrapper[4909]: I0202 12:09:37.651447 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-6b5e-account-create-update-dgmlx" Feb 02 12:09:37 crc kubenswrapper[4909]: I0202 12:09:37.654420 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Feb 02 12:09:37 crc kubenswrapper[4909]: I0202 12:09:37.665181 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-6b5e-account-create-update-dgmlx"] Feb 02 12:09:37 crc kubenswrapper[4909]: I0202 12:09:37.824977 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx2ft\" (UniqueName: \"kubernetes.io/projected/cc5c8186-caff-40d7-9273-5cbb5864bf99-kube-api-access-cx2ft\") pod \"octavia-6b5e-account-create-update-dgmlx\" (UID: \"cc5c8186-caff-40d7-9273-5cbb5864bf99\") " pod="openstack/octavia-6b5e-account-create-update-dgmlx" Feb 02 12:09:37 crc kubenswrapper[4909]: I0202 12:09:37.825401 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc5c8186-caff-40d7-9273-5cbb5864bf99-operator-scripts\") pod \"octavia-6b5e-account-create-update-dgmlx\" (UID: \"cc5c8186-caff-40d7-9273-5cbb5864bf99\") " pod="openstack/octavia-6b5e-account-create-update-dgmlx" Feb 02 12:09:37 crc kubenswrapper[4909]: I0202 12:09:37.926788 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc5c8186-caff-40d7-9273-5cbb5864bf99-operator-scripts\") pod \"octavia-6b5e-account-create-update-dgmlx\" (UID: \"cc5c8186-caff-40d7-9273-5cbb5864bf99\") " pod="openstack/octavia-6b5e-account-create-update-dgmlx" Feb 02 12:09:37 crc kubenswrapper[4909]: I0202 12:09:37.926873 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx2ft\" (UniqueName: 
\"kubernetes.io/projected/cc5c8186-caff-40d7-9273-5cbb5864bf99-kube-api-access-cx2ft\") pod \"octavia-6b5e-account-create-update-dgmlx\" (UID: \"cc5c8186-caff-40d7-9273-5cbb5864bf99\") " pod="openstack/octavia-6b5e-account-create-update-dgmlx" Feb 02 12:09:37 crc kubenswrapper[4909]: I0202 12:09:37.927832 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc5c8186-caff-40d7-9273-5cbb5864bf99-operator-scripts\") pod \"octavia-6b5e-account-create-update-dgmlx\" (UID: \"cc5c8186-caff-40d7-9273-5cbb5864bf99\") " pod="openstack/octavia-6b5e-account-create-update-dgmlx" Feb 02 12:09:37 crc kubenswrapper[4909]: I0202 12:09:37.946263 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx2ft\" (UniqueName: \"kubernetes.io/projected/cc5c8186-caff-40d7-9273-5cbb5864bf99-kube-api-access-cx2ft\") pod \"octavia-6b5e-account-create-update-dgmlx\" (UID: \"cc5c8186-caff-40d7-9273-5cbb5864bf99\") " pod="openstack/octavia-6b5e-account-create-update-dgmlx" Feb 02 12:09:38 crc kubenswrapper[4909]: I0202 12:09:38.014754 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-6b5e-account-create-update-dgmlx" Feb 02 12:09:38 crc kubenswrapper[4909]: I0202 12:09:38.052326 4909 generic.go:334] "Generic (PLEG): container finished" podID="8d48e572-d205-42b6-87ba-232cea9b3d35" containerID="7186e22863fcb9269053e02643dac50be88c09728944307c150773560cf9bc33" exitCode=0 Feb 02 12:09:38 crc kubenswrapper[4909]: I0202 12:09:38.052377 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-cv7jt" event={"ID":"8d48e572-d205-42b6-87ba-232cea9b3d35","Type":"ContainerDied","Data":"7186e22863fcb9269053e02643dac50be88c09728944307c150773560cf9bc33"} Feb 02 12:09:38 crc kubenswrapper[4909]: I0202 12:09:38.052406 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-cv7jt" event={"ID":"8d48e572-d205-42b6-87ba-232cea9b3d35","Type":"ContainerStarted","Data":"d1d8093286dc902d65fe0395f5fb7098c02985eb305bdfb1c22cfa8ba1e7e34a"} Feb 02 12:09:38 crc kubenswrapper[4909]: I0202 12:09:38.456848 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-6b5e-account-create-update-dgmlx"] Feb 02 12:09:39 crc kubenswrapper[4909]: I0202 12:09:39.061922 4909 generic.go:334] "Generic (PLEG): container finished" podID="cc5c8186-caff-40d7-9273-5cbb5864bf99" containerID="c1657283ee2b61b2109fee0fa83c07814e58d7ac19beb46ee4632428ef03e0d9" exitCode=0 Feb 02 12:09:39 crc kubenswrapper[4909]: I0202 12:09:39.061973 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-6b5e-account-create-update-dgmlx" event={"ID":"cc5c8186-caff-40d7-9273-5cbb5864bf99","Type":"ContainerDied","Data":"c1657283ee2b61b2109fee0fa83c07814e58d7ac19beb46ee4632428ef03e0d9"} Feb 02 12:09:39 crc kubenswrapper[4909]: I0202 12:09:39.062290 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-6b5e-account-create-update-dgmlx" 
event={"ID":"cc5c8186-caff-40d7-9273-5cbb5864bf99","Type":"ContainerStarted","Data":"bd084761a5442ab0be36a8582a8e963c048d5b27a43172a21d3c08e25ba9cfcf"} Feb 02 12:09:39 crc kubenswrapper[4909]: I0202 12:09:39.379200 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-cv7jt" Feb 02 12:09:39 crc kubenswrapper[4909]: I0202 12:09:39.457322 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp82w\" (UniqueName: \"kubernetes.io/projected/8d48e572-d205-42b6-87ba-232cea9b3d35-kube-api-access-rp82w\") pod \"8d48e572-d205-42b6-87ba-232cea9b3d35\" (UID: \"8d48e572-d205-42b6-87ba-232cea9b3d35\") " Feb 02 12:09:39 crc kubenswrapper[4909]: I0202 12:09:39.457681 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d48e572-d205-42b6-87ba-232cea9b3d35-operator-scripts\") pod \"8d48e572-d205-42b6-87ba-232cea9b3d35\" (UID: \"8d48e572-d205-42b6-87ba-232cea9b3d35\") " Feb 02 12:09:39 crc kubenswrapper[4909]: I0202 12:09:39.458153 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d48e572-d205-42b6-87ba-232cea9b3d35-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8d48e572-d205-42b6-87ba-232cea9b3d35" (UID: "8d48e572-d205-42b6-87ba-232cea9b3d35"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:09:39 crc kubenswrapper[4909]: I0202 12:09:39.461640 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d48e572-d205-42b6-87ba-232cea9b3d35-kube-api-access-rp82w" (OuterVolumeSpecName: "kube-api-access-rp82w") pod "8d48e572-d205-42b6-87ba-232cea9b3d35" (UID: "8d48e572-d205-42b6-87ba-232cea9b3d35"). InnerVolumeSpecName "kube-api-access-rp82w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:09:39 crc kubenswrapper[4909]: I0202 12:09:39.560376 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp82w\" (UniqueName: \"kubernetes.io/projected/8d48e572-d205-42b6-87ba-232cea9b3d35-kube-api-access-rp82w\") on node \"crc\" DevicePath \"\"" Feb 02 12:09:39 crc kubenswrapper[4909]: I0202 12:09:39.560420 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d48e572-d205-42b6-87ba-232cea9b3d35-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:09:40 crc kubenswrapper[4909]: I0202 12:09:40.072988 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-cv7jt" Feb 02 12:09:40 crc kubenswrapper[4909]: I0202 12:09:40.072993 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-cv7jt" event={"ID":"8d48e572-d205-42b6-87ba-232cea9b3d35","Type":"ContainerDied","Data":"d1d8093286dc902d65fe0395f5fb7098c02985eb305bdfb1c22cfa8ba1e7e34a"} Feb 02 12:09:40 crc kubenswrapper[4909]: I0202 12:09:40.073038 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1d8093286dc902d65fe0395f5fb7098c02985eb305bdfb1c22cfa8ba1e7e34a" Feb 02 12:09:40 crc kubenswrapper[4909]: I0202 12:09:40.383683 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-6b5e-account-create-update-dgmlx" Feb 02 12:09:40 crc kubenswrapper[4909]: I0202 12:09:40.477720 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc5c8186-caff-40d7-9273-5cbb5864bf99-operator-scripts\") pod \"cc5c8186-caff-40d7-9273-5cbb5864bf99\" (UID: \"cc5c8186-caff-40d7-9273-5cbb5864bf99\") " Feb 02 12:09:40 crc kubenswrapper[4909]: I0202 12:09:40.478090 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cx2ft\" (UniqueName: \"kubernetes.io/projected/cc5c8186-caff-40d7-9273-5cbb5864bf99-kube-api-access-cx2ft\") pod \"cc5c8186-caff-40d7-9273-5cbb5864bf99\" (UID: \"cc5c8186-caff-40d7-9273-5cbb5864bf99\") " Feb 02 12:09:40 crc kubenswrapper[4909]: I0202 12:09:40.478295 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc5c8186-caff-40d7-9273-5cbb5864bf99-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cc5c8186-caff-40d7-9273-5cbb5864bf99" (UID: "cc5c8186-caff-40d7-9273-5cbb5864bf99"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:09:40 crc kubenswrapper[4909]: I0202 12:09:40.479003 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cc5c8186-caff-40d7-9273-5cbb5864bf99-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:09:40 crc kubenswrapper[4909]: I0202 12:09:40.481392 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc5c8186-caff-40d7-9273-5cbb5864bf99-kube-api-access-cx2ft" (OuterVolumeSpecName: "kube-api-access-cx2ft") pod "cc5c8186-caff-40d7-9273-5cbb5864bf99" (UID: "cc5c8186-caff-40d7-9273-5cbb5864bf99"). InnerVolumeSpecName "kube-api-access-cx2ft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:09:40 crc kubenswrapper[4909]: I0202 12:09:40.581179 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cx2ft\" (UniqueName: \"kubernetes.io/projected/cc5c8186-caff-40d7-9273-5cbb5864bf99-kube-api-access-cx2ft\") on node \"crc\" DevicePath \"\"" Feb 02 12:09:41 crc kubenswrapper[4909]: I0202 12:09:41.095171 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-6b5e-account-create-update-dgmlx" event={"ID":"cc5c8186-caff-40d7-9273-5cbb5864bf99","Type":"ContainerDied","Data":"bd084761a5442ab0be36a8582a8e963c048d5b27a43172a21d3c08e25ba9cfcf"} Feb 02 12:09:41 crc kubenswrapper[4909]: I0202 12:09:41.095229 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-6b5e-account-create-update-dgmlx" Feb 02 12:09:41 crc kubenswrapper[4909]: I0202 12:09:41.095234 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd084761a5442ab0be36a8582a8e963c048d5b27a43172a21d3c08e25ba9cfcf" Feb 02 12:09:41 crc kubenswrapper[4909]: I0202 12:09:41.726963 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-2vh4d" Feb 02 12:09:41 crc kubenswrapper[4909]: I0202 12:09:41.749709 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-jgdxf" Feb 02 12:09:41 crc kubenswrapper[4909]: I0202 12:09:41.761111 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-jgdxf" Feb 02 12:09:41 crc kubenswrapper[4909]: I0202 12:09:41.891998 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-2vh4d-config-mw2s5"] Feb 02 12:09:41 crc kubenswrapper[4909]: E0202 12:09:41.892405 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc5c8186-caff-40d7-9273-5cbb5864bf99" containerName="mariadb-account-create-update" 
Feb 02 12:09:41 crc kubenswrapper[4909]: I0202 12:09:41.892418 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc5c8186-caff-40d7-9273-5cbb5864bf99" containerName="mariadb-account-create-update"
Feb 02 12:09:41 crc kubenswrapper[4909]: E0202 12:09:41.892442 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d48e572-d205-42b6-87ba-232cea9b3d35" containerName="mariadb-database-create"
Feb 02 12:09:41 crc kubenswrapper[4909]: I0202 12:09:41.892448 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d48e572-d205-42b6-87ba-232cea9b3d35" containerName="mariadb-database-create"
Feb 02 12:09:41 crc kubenswrapper[4909]: I0202 12:09:41.892644 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d48e572-d205-42b6-87ba-232cea9b3d35" containerName="mariadb-database-create"
Feb 02 12:09:41 crc kubenswrapper[4909]: I0202 12:09:41.892656 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc5c8186-caff-40d7-9273-5cbb5864bf99" containerName="mariadb-account-create-update"
Feb 02 12:09:41 crc kubenswrapper[4909]: I0202 12:09:41.893307 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2vh4d-config-mw2s5"
Feb 02 12:09:41 crc kubenswrapper[4909]: I0202 12:09:41.899711 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Feb 02 12:09:41 crc kubenswrapper[4909]: I0202 12:09:41.911986 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2vh4d-config-mw2s5"]
Feb 02 12:09:42 crc kubenswrapper[4909]: I0202 12:09:42.005884 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04a1cef1-7283-4a56-bda9-17142a32a84e-scripts\") pod \"ovn-controller-2vh4d-config-mw2s5\" (UID: \"04a1cef1-7283-4a56-bda9-17142a32a84e\") " pod="openstack/ovn-controller-2vh4d-config-mw2s5"
Feb 02 12:09:42 crc kubenswrapper[4909]: I0202 12:09:42.005960 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/04a1cef1-7283-4a56-bda9-17142a32a84e-var-run-ovn\") pod \"ovn-controller-2vh4d-config-mw2s5\" (UID: \"04a1cef1-7283-4a56-bda9-17142a32a84e\") " pod="openstack/ovn-controller-2vh4d-config-mw2s5"
Feb 02 12:09:42 crc kubenswrapper[4909]: I0202 12:09:42.006220 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8kcz\" (UniqueName: \"kubernetes.io/projected/04a1cef1-7283-4a56-bda9-17142a32a84e-kube-api-access-g8kcz\") pod \"ovn-controller-2vh4d-config-mw2s5\" (UID: \"04a1cef1-7283-4a56-bda9-17142a32a84e\") " pod="openstack/ovn-controller-2vh4d-config-mw2s5"
Feb 02 12:09:42 crc kubenswrapper[4909]: I0202 12:09:42.006294 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/04a1cef1-7283-4a56-bda9-17142a32a84e-var-run\") pod \"ovn-controller-2vh4d-config-mw2s5\" (UID: \"04a1cef1-7283-4a56-bda9-17142a32a84e\") " pod="openstack/ovn-controller-2vh4d-config-mw2s5"
Feb 02 12:09:42 crc kubenswrapper[4909]: I0202 12:09:42.006356 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/04a1cef1-7283-4a56-bda9-17142a32a84e-var-log-ovn\") pod \"ovn-controller-2vh4d-config-mw2s5\" (UID: \"04a1cef1-7283-4a56-bda9-17142a32a84e\") " pod="openstack/ovn-controller-2vh4d-config-mw2s5"
Feb 02 12:09:42 crc kubenswrapper[4909]: I0202 12:09:42.006445 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/04a1cef1-7283-4a56-bda9-17142a32a84e-additional-scripts\") pod \"ovn-controller-2vh4d-config-mw2s5\" (UID: \"04a1cef1-7283-4a56-bda9-17142a32a84e\") " pod="openstack/ovn-controller-2vh4d-config-mw2s5"
Feb 02 12:09:42 crc kubenswrapper[4909]: I0202 12:09:42.107519 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8kcz\" (UniqueName: \"kubernetes.io/projected/04a1cef1-7283-4a56-bda9-17142a32a84e-kube-api-access-g8kcz\") pod \"ovn-controller-2vh4d-config-mw2s5\" (UID: \"04a1cef1-7283-4a56-bda9-17142a32a84e\") " pod="openstack/ovn-controller-2vh4d-config-mw2s5"
Feb 02 12:09:42 crc kubenswrapper[4909]: I0202 12:09:42.107567 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/04a1cef1-7283-4a56-bda9-17142a32a84e-var-run\") pod \"ovn-controller-2vh4d-config-mw2s5\" (UID: \"04a1cef1-7283-4a56-bda9-17142a32a84e\") " pod="openstack/ovn-controller-2vh4d-config-mw2s5"
Feb 02 12:09:42 crc kubenswrapper[4909]: I0202 12:09:42.107596 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/04a1cef1-7283-4a56-bda9-17142a32a84e-var-log-ovn\") pod \"ovn-controller-2vh4d-config-mw2s5\" (UID: \"04a1cef1-7283-4a56-bda9-17142a32a84e\") " pod="openstack/ovn-controller-2vh4d-config-mw2s5"
Feb 02 12:09:42 crc kubenswrapper[4909]: I0202 12:09:42.107632 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/04a1cef1-7283-4a56-bda9-17142a32a84e-additional-scripts\") pod \"ovn-controller-2vh4d-config-mw2s5\" (UID: \"04a1cef1-7283-4a56-bda9-17142a32a84e\") " pod="openstack/ovn-controller-2vh4d-config-mw2s5"
Feb 02 12:09:42 crc kubenswrapper[4909]: I0202 12:09:42.107724 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04a1cef1-7283-4a56-bda9-17142a32a84e-scripts\") pod \"ovn-controller-2vh4d-config-mw2s5\" (UID: \"04a1cef1-7283-4a56-bda9-17142a32a84e\") " pod="openstack/ovn-controller-2vh4d-config-mw2s5"
Feb 02 12:09:42 crc kubenswrapper[4909]: I0202 12:09:42.107788 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/04a1cef1-7283-4a56-bda9-17142a32a84e-var-run-ovn\") pod \"ovn-controller-2vh4d-config-mw2s5\" (UID: \"04a1cef1-7283-4a56-bda9-17142a32a84e\") " pod="openstack/ovn-controller-2vh4d-config-mw2s5"
Feb 02 12:09:42 crc kubenswrapper[4909]: I0202 12:09:42.108164 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/04a1cef1-7283-4a56-bda9-17142a32a84e-var-run-ovn\") pod \"ovn-controller-2vh4d-config-mw2s5\" (UID: \"04a1cef1-7283-4a56-bda9-17142a32a84e\") " pod="openstack/ovn-controller-2vh4d-config-mw2s5"
Feb 02 12:09:42 crc kubenswrapper[4909]: I0202 12:09:42.108501 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/04a1cef1-7283-4a56-bda9-17142a32a84e-var-run\") pod \"ovn-controller-2vh4d-config-mw2s5\" (UID: \"04a1cef1-7283-4a56-bda9-17142a32a84e\") " pod="openstack/ovn-controller-2vh4d-config-mw2s5"
Feb 02 12:09:42 crc kubenswrapper[4909]: I0202 12:09:42.108546 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/04a1cef1-7283-4a56-bda9-17142a32a84e-var-log-ovn\") pod \"ovn-controller-2vh4d-config-mw2s5\" (UID: \"04a1cef1-7283-4a56-bda9-17142a32a84e\") " pod="openstack/ovn-controller-2vh4d-config-mw2s5"
Feb 02 12:09:42 crc kubenswrapper[4909]: I0202 12:09:42.110711 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/04a1cef1-7283-4a56-bda9-17142a32a84e-additional-scripts\") pod \"ovn-controller-2vh4d-config-mw2s5\" (UID: \"04a1cef1-7283-4a56-bda9-17142a32a84e\") " pod="openstack/ovn-controller-2vh4d-config-mw2s5"
Feb 02 12:09:42 crc kubenswrapper[4909]: I0202 12:09:42.113296 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04a1cef1-7283-4a56-bda9-17142a32a84e-scripts\") pod \"ovn-controller-2vh4d-config-mw2s5\" (UID: \"04a1cef1-7283-4a56-bda9-17142a32a84e\") " pod="openstack/ovn-controller-2vh4d-config-mw2s5"
Feb 02 12:09:42 crc kubenswrapper[4909]: I0202 12:09:42.140129 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8kcz\" (UniqueName: \"kubernetes.io/projected/04a1cef1-7283-4a56-bda9-17142a32a84e-kube-api-access-g8kcz\") pod \"ovn-controller-2vh4d-config-mw2s5\" (UID: \"04a1cef1-7283-4a56-bda9-17142a32a84e\") " pod="openstack/ovn-controller-2vh4d-config-mw2s5"
Feb 02 12:09:42 crc kubenswrapper[4909]: I0202 12:09:42.215228 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2vh4d-config-mw2s5"
Feb 02 12:09:42 crc kubenswrapper[4909]: I0202 12:09:42.672784 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2vh4d-config-mw2s5"]
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.052020 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-6c4564d67-6x9nd"]
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.057656 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-6c4564d67-6x9nd"
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.061896 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-octavia-ovndbs"
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.062130 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts"
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.062296 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data"
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.062667 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-kpq65"
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.067495 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-6c4564d67-6x9nd"]
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.120701 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2vh4d-config-mw2s5" event={"ID":"04a1cef1-7283-4a56-bda9-17142a32a84e","Type":"ContainerStarted","Data":"7ee880a2cb6d634dec71fbea463a53e84345ffca12464196465fac4c754087c3"}
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.120754 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2vh4d-config-mw2s5" event={"ID":"04a1cef1-7283-4a56-bda9-17142a32a84e","Type":"ContainerStarted","Data":"8442df377d829b842d0eba3b667a716ae9cf0b135f6dcc010bcce49ef0660315"}
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.125406 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/d5e46e05-22cf-4148-b7de-7f6330986a54-octavia-run\") pod \"octavia-api-6c4564d67-6x9nd\" (UID: \"d5e46e05-22cf-4148-b7de-7f6330986a54\") " pod="openstack/octavia-api-6c4564d67-6x9nd"
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.125485 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5e46e05-22cf-4148-b7de-7f6330986a54-ovndb-tls-certs\") pod \"octavia-api-6c4564d67-6x9nd\" (UID: \"d5e46e05-22cf-4148-b7de-7f6330986a54\") " pod="openstack/octavia-api-6c4564d67-6x9nd"
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.125520 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e46e05-22cf-4148-b7de-7f6330986a54-config-data\") pod \"octavia-api-6c4564d67-6x9nd\" (UID: \"d5e46e05-22cf-4148-b7de-7f6330986a54\") " pod="openstack/octavia-api-6c4564d67-6x9nd"
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.125599 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d5e46e05-22cf-4148-b7de-7f6330986a54-config-data-merged\") pod \"octavia-api-6c4564d67-6x9nd\" (UID: \"d5e46e05-22cf-4148-b7de-7f6330986a54\") " pod="openstack/octavia-api-6c4564d67-6x9nd"
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.125704 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5e46e05-22cf-4148-b7de-7f6330986a54-scripts\") pod \"octavia-api-6c4564d67-6x9nd\" (UID: \"d5e46e05-22cf-4148-b7de-7f6330986a54\") " pod="openstack/octavia-api-6c4564d67-6x9nd"
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.125776 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e46e05-22cf-4148-b7de-7f6330986a54-combined-ca-bundle\") pod \"octavia-api-6c4564d67-6x9nd\" (UID: \"d5e46e05-22cf-4148-b7de-7f6330986a54\") " pod="openstack/octavia-api-6c4564d67-6x9nd"
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.147148 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-2vh4d-config-mw2s5" podStartSLOduration=2.147124494 podStartE2EDuration="2.147124494s" podCreationTimestamp="2026-02-02 12:09:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:09:43.143430029 +0000 UTC m=+5908.889530774" watchObservedRunningTime="2026-02-02 12:09:43.147124494 +0000 UTC m=+5908.893225239"
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.227887 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5e46e05-22cf-4148-b7de-7f6330986a54-scripts\") pod \"octavia-api-6c4564d67-6x9nd\" (UID: \"d5e46e05-22cf-4148-b7de-7f6330986a54\") " pod="openstack/octavia-api-6c4564d67-6x9nd"
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.227998 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e46e05-22cf-4148-b7de-7f6330986a54-combined-ca-bundle\") pod \"octavia-api-6c4564d67-6x9nd\" (UID: \"d5e46e05-22cf-4148-b7de-7f6330986a54\") " pod="openstack/octavia-api-6c4564d67-6x9nd"
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.228106 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/d5e46e05-22cf-4148-b7de-7f6330986a54-octavia-run\") pod \"octavia-api-6c4564d67-6x9nd\" (UID: \"d5e46e05-22cf-4148-b7de-7f6330986a54\") " pod="openstack/octavia-api-6c4564d67-6x9nd"
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.228160 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5e46e05-22cf-4148-b7de-7f6330986a54-ovndb-tls-certs\") pod \"octavia-api-6c4564d67-6x9nd\" (UID: \"d5e46e05-22cf-4148-b7de-7f6330986a54\") " pod="openstack/octavia-api-6c4564d67-6x9nd"
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.228206 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e46e05-22cf-4148-b7de-7f6330986a54-config-data\") pod \"octavia-api-6c4564d67-6x9nd\" (UID: \"d5e46e05-22cf-4148-b7de-7f6330986a54\") " pod="openstack/octavia-api-6c4564d67-6x9nd"
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.228281 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d5e46e05-22cf-4148-b7de-7f6330986a54-config-data-merged\") pod \"octavia-api-6c4564d67-6x9nd\" (UID: \"d5e46e05-22cf-4148-b7de-7f6330986a54\") " pod="openstack/octavia-api-6c4564d67-6x9nd"
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.229096 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/d5e46e05-22cf-4148-b7de-7f6330986a54-octavia-run\") pod \"octavia-api-6c4564d67-6x9nd\" (UID: \"d5e46e05-22cf-4148-b7de-7f6330986a54\") " pod="openstack/octavia-api-6c4564d67-6x9nd"
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.229894 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d5e46e05-22cf-4148-b7de-7f6330986a54-config-data-merged\") pod \"octavia-api-6c4564d67-6x9nd\" (UID: \"d5e46e05-22cf-4148-b7de-7f6330986a54\") " pod="openstack/octavia-api-6c4564d67-6x9nd"
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.234680 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5e46e05-22cf-4148-b7de-7f6330986a54-ovndb-tls-certs\") pod \"octavia-api-6c4564d67-6x9nd\" (UID: \"d5e46e05-22cf-4148-b7de-7f6330986a54\") " pod="openstack/octavia-api-6c4564d67-6x9nd"
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.235979 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5e46e05-22cf-4148-b7de-7f6330986a54-scripts\") pod \"octavia-api-6c4564d67-6x9nd\" (UID: \"d5e46e05-22cf-4148-b7de-7f6330986a54\") " pod="openstack/octavia-api-6c4564d67-6x9nd"
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.237616 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e46e05-22cf-4148-b7de-7f6330986a54-config-data\") pod \"octavia-api-6c4564d67-6x9nd\" (UID: \"d5e46e05-22cf-4148-b7de-7f6330986a54\") " pod="openstack/octavia-api-6c4564d67-6x9nd"
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.243771 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e46e05-22cf-4148-b7de-7f6330986a54-combined-ca-bundle\") pod \"octavia-api-6c4564d67-6x9nd\" (UID: \"d5e46e05-22cf-4148-b7de-7f6330986a54\") " pod="openstack/octavia-api-6c4564d67-6x9nd"
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.398293 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-6c4564d67-6x9nd"
Feb 02 12:09:43 crc kubenswrapper[4909]: I0202 12:09:43.941959 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-6c4564d67-6x9nd"]
Feb 02 12:09:44 crc kubenswrapper[4909]: I0202 12:09:44.143851 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6c4564d67-6x9nd" event={"ID":"d5e46e05-22cf-4148-b7de-7f6330986a54","Type":"ContainerStarted","Data":"f0b8b15c1103f63e74ac080cc0de1d8a5611a57f2ee24b26eabec07f31d90f98"}
Feb 02 12:09:44 crc kubenswrapper[4909]: I0202 12:09:44.150491 4909 generic.go:334] "Generic (PLEG): container finished" podID="04a1cef1-7283-4a56-bda9-17142a32a84e" containerID="7ee880a2cb6d634dec71fbea463a53e84345ffca12464196465fac4c754087c3" exitCode=0
Feb 02 12:09:44 crc kubenswrapper[4909]: I0202 12:09:44.150534 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2vh4d-config-mw2s5" event={"ID":"04a1cef1-7283-4a56-bda9-17142a32a84e","Type":"ContainerDied","Data":"7ee880a2cb6d634dec71fbea463a53e84345ffca12464196465fac4c754087c3"}
Feb 02 12:09:45 crc kubenswrapper[4909]: I0202 12:09:45.611673 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2vh4d-config-mw2s5"
Feb 02 12:09:45 crc kubenswrapper[4909]: I0202 12:09:45.685593 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8kcz\" (UniqueName: \"kubernetes.io/projected/04a1cef1-7283-4a56-bda9-17142a32a84e-kube-api-access-g8kcz\") pod \"04a1cef1-7283-4a56-bda9-17142a32a84e\" (UID: \"04a1cef1-7283-4a56-bda9-17142a32a84e\") "
Feb 02 12:09:45 crc kubenswrapper[4909]: I0202 12:09:45.685647 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/04a1cef1-7283-4a56-bda9-17142a32a84e-var-run-ovn\") pod \"04a1cef1-7283-4a56-bda9-17142a32a84e\" (UID: \"04a1cef1-7283-4a56-bda9-17142a32a84e\") "
Feb 02 12:09:45 crc kubenswrapper[4909]: I0202 12:09:45.685678 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/04a1cef1-7283-4a56-bda9-17142a32a84e-var-log-ovn\") pod \"04a1cef1-7283-4a56-bda9-17142a32a84e\" (UID: \"04a1cef1-7283-4a56-bda9-17142a32a84e\") "
Feb 02 12:09:45 crc kubenswrapper[4909]: I0202 12:09:45.685750 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/04a1cef1-7283-4a56-bda9-17142a32a84e-additional-scripts\") pod \"04a1cef1-7283-4a56-bda9-17142a32a84e\" (UID: \"04a1cef1-7283-4a56-bda9-17142a32a84e\") "
Feb 02 12:09:45 crc kubenswrapper[4909]: I0202 12:09:45.685835 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/04a1cef1-7283-4a56-bda9-17142a32a84e-var-run\") pod \"04a1cef1-7283-4a56-bda9-17142a32a84e\" (UID: \"04a1cef1-7283-4a56-bda9-17142a32a84e\") "
Feb 02 12:09:45 crc kubenswrapper[4909]: I0202 12:09:45.685918 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04a1cef1-7283-4a56-bda9-17142a32a84e-scripts\") pod \"04a1cef1-7283-4a56-bda9-17142a32a84e\" (UID: \"04a1cef1-7283-4a56-bda9-17142a32a84e\") "
Feb 02 12:09:45 crc kubenswrapper[4909]: I0202 12:09:45.686636 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/04a1cef1-7283-4a56-bda9-17142a32a84e-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "04a1cef1-7283-4a56-bda9-17142a32a84e" (UID: "04a1cef1-7283-4a56-bda9-17142a32a84e"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 02 12:09:45 crc kubenswrapper[4909]: I0202 12:09:45.686674 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/04a1cef1-7283-4a56-bda9-17142a32a84e-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "04a1cef1-7283-4a56-bda9-17142a32a84e" (UID: "04a1cef1-7283-4a56-bda9-17142a32a84e"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 02 12:09:45 crc kubenswrapper[4909]: I0202 12:09:45.687237 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04a1cef1-7283-4a56-bda9-17142a32a84e-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "04a1cef1-7283-4a56-bda9-17142a32a84e" (UID: "04a1cef1-7283-4a56-bda9-17142a32a84e"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 12:09:45 crc kubenswrapper[4909]: I0202 12:09:45.687272 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/04a1cef1-7283-4a56-bda9-17142a32a84e-var-run" (OuterVolumeSpecName: "var-run") pod "04a1cef1-7283-4a56-bda9-17142a32a84e" (UID: "04a1cef1-7283-4a56-bda9-17142a32a84e"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 02 12:09:45 crc kubenswrapper[4909]: I0202 12:09:45.687423 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04a1cef1-7283-4a56-bda9-17142a32a84e-scripts" (OuterVolumeSpecName: "scripts") pod "04a1cef1-7283-4a56-bda9-17142a32a84e" (UID: "04a1cef1-7283-4a56-bda9-17142a32a84e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 12:09:45 crc kubenswrapper[4909]: I0202 12:09:45.697114 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04a1cef1-7283-4a56-bda9-17142a32a84e-kube-api-access-g8kcz" (OuterVolumeSpecName: "kube-api-access-g8kcz") pod "04a1cef1-7283-4a56-bda9-17142a32a84e" (UID: "04a1cef1-7283-4a56-bda9-17142a32a84e"). InnerVolumeSpecName "kube-api-access-g8kcz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 12:09:45 crc kubenswrapper[4909]: I0202 12:09:45.789058 4909 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/04a1cef1-7283-4a56-bda9-17142a32a84e-var-run\") on node \"crc\" DevicePath \"\""
Feb 02 12:09:45 crc kubenswrapper[4909]: I0202 12:09:45.789094 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/04a1cef1-7283-4a56-bda9-17142a32a84e-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 12:09:45 crc kubenswrapper[4909]: I0202 12:09:45.789104 4909 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/04a1cef1-7283-4a56-bda9-17142a32a84e-var-run-ovn\") on node \"crc\" DevicePath \"\""
Feb 02 12:09:45 crc kubenswrapper[4909]: I0202 12:09:45.789114 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8kcz\" (UniqueName: \"kubernetes.io/projected/04a1cef1-7283-4a56-bda9-17142a32a84e-kube-api-access-g8kcz\") on node \"crc\" DevicePath \"\""
Feb 02 12:09:45 crc kubenswrapper[4909]: I0202 12:09:45.789125 4909 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/04a1cef1-7283-4a56-bda9-17142a32a84e-var-log-ovn\") on node \"crc\" DevicePath \"\""
Feb 02 12:09:45 crc kubenswrapper[4909]: I0202 12:09:45.789136 4909 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/04a1cef1-7283-4a56-bda9-17142a32a84e-additional-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.178145 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2vh4d-config-mw2s5" event={"ID":"04a1cef1-7283-4a56-bda9-17142a32a84e","Type":"ContainerDied","Data":"8442df377d829b842d0eba3b667a716ae9cf0b135f6dcc010bcce49ef0660315"}
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.178429 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8442df377d829b842d0eba3b667a716ae9cf0b135f6dcc010bcce49ef0660315"
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.178494 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2vh4d-config-mw2s5"
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.232925 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-2vh4d-config-mw2s5"]
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.239973 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-2vh4d-config-mw2s5"]
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.365187 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-2vh4d-config-tw9rz"]
Feb 02 12:09:46 crc kubenswrapper[4909]: E0202 12:09:46.365662 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04a1cef1-7283-4a56-bda9-17142a32a84e" containerName="ovn-config"
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.365681 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="04a1cef1-7283-4a56-bda9-17142a32a84e" containerName="ovn-config"
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.365871 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="04a1cef1-7283-4a56-bda9-17142a32a84e" containerName="ovn-config"
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.366524 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2vh4d-config-tw9rz"
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.368915 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.384418 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2vh4d-config-tw9rz"]
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.509210 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a45332c4-e5b6-49d5-b55b-d546d9a93478-additional-scripts\") pod \"ovn-controller-2vh4d-config-tw9rz\" (UID: \"a45332c4-e5b6-49d5-b55b-d546d9a93478\") " pod="openstack/ovn-controller-2vh4d-config-tw9rz"
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.509287 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a45332c4-e5b6-49d5-b55b-d546d9a93478-var-log-ovn\") pod \"ovn-controller-2vh4d-config-tw9rz\" (UID: \"a45332c4-e5b6-49d5-b55b-d546d9a93478\") " pod="openstack/ovn-controller-2vh4d-config-tw9rz"
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.509314 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a45332c4-e5b6-49d5-b55b-d546d9a93478-var-run\") pod \"ovn-controller-2vh4d-config-tw9rz\" (UID: \"a45332c4-e5b6-49d5-b55b-d546d9a93478\") " pod="openstack/ovn-controller-2vh4d-config-tw9rz"
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.509384 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a45332c4-e5b6-49d5-b55b-d546d9a93478-var-run-ovn\") pod \"ovn-controller-2vh4d-config-tw9rz\" (UID: \"a45332c4-e5b6-49d5-b55b-d546d9a93478\") " pod="openstack/ovn-controller-2vh4d-config-tw9rz"
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.509400 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a45332c4-e5b6-49d5-b55b-d546d9a93478-scripts\") pod \"ovn-controller-2vh4d-config-tw9rz\" (UID: \"a45332c4-e5b6-49d5-b55b-d546d9a93478\") " pod="openstack/ovn-controller-2vh4d-config-tw9rz"
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.509438 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d574\" (UniqueName: \"kubernetes.io/projected/a45332c4-e5b6-49d5-b55b-d546d9a93478-kube-api-access-5d574\") pod \"ovn-controller-2vh4d-config-tw9rz\" (UID: \"a45332c4-e5b6-49d5-b55b-d546d9a93478\") " pod="openstack/ovn-controller-2vh4d-config-tw9rz"
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.611746 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a45332c4-e5b6-49d5-b55b-d546d9a93478-additional-scripts\") pod \"ovn-controller-2vh4d-config-tw9rz\" (UID: \"a45332c4-e5b6-49d5-b55b-d546d9a93478\") " pod="openstack/ovn-controller-2vh4d-config-tw9rz"
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.611851 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a45332c4-e5b6-49d5-b55b-d546d9a93478-var-log-ovn\") pod \"ovn-controller-2vh4d-config-tw9rz\" (UID: \"a45332c4-e5b6-49d5-b55b-d546d9a93478\") " pod="openstack/ovn-controller-2vh4d-config-tw9rz"
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.611878 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a45332c4-e5b6-49d5-b55b-d546d9a93478-var-run\") pod \"ovn-controller-2vh4d-config-tw9rz\" (UID: \"a45332c4-e5b6-49d5-b55b-d546d9a93478\") " pod="openstack/ovn-controller-2vh4d-config-tw9rz"
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.612004 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a45332c4-e5b6-49d5-b55b-d546d9a93478-var-run-ovn\") pod \"ovn-controller-2vh4d-config-tw9rz\" (UID: \"a45332c4-e5b6-49d5-b55b-d546d9a93478\") " pod="openstack/ovn-controller-2vh4d-config-tw9rz"
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.612030 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a45332c4-e5b6-49d5-b55b-d546d9a93478-scripts\") pod \"ovn-controller-2vh4d-config-tw9rz\" (UID: \"a45332c4-e5b6-49d5-b55b-d546d9a93478\") " pod="openstack/ovn-controller-2vh4d-config-tw9rz"
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.612195 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a45332c4-e5b6-49d5-b55b-d546d9a93478-var-run-ovn\") pod \"ovn-controller-2vh4d-config-tw9rz\" (UID: \"a45332c4-e5b6-49d5-b55b-d546d9a93478\") " pod="openstack/ovn-controller-2vh4d-config-tw9rz"
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.612199 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a45332c4-e5b6-49d5-b55b-d546d9a93478-var-log-ovn\") pod \"ovn-controller-2vh4d-config-tw9rz\" (UID: \"a45332c4-e5b6-49d5-b55b-d546d9a93478\") " pod="openstack/ovn-controller-2vh4d-config-tw9rz"
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.612195 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a45332c4-e5b6-49d5-b55b-d546d9a93478-var-run\") pod \"ovn-controller-2vh4d-config-tw9rz\" (UID: \"a45332c4-e5b6-49d5-b55b-d546d9a93478\") " pod="openstack/ovn-controller-2vh4d-config-tw9rz"
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.612330 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d574\" (UniqueName: \"kubernetes.io/projected/a45332c4-e5b6-49d5-b55b-d546d9a93478-kube-api-access-5d574\") pod \"ovn-controller-2vh4d-config-tw9rz\" (UID: \"a45332c4-e5b6-49d5-b55b-d546d9a93478\") " pod="openstack/ovn-controller-2vh4d-config-tw9rz"
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.612874 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a45332c4-e5b6-49d5-b55b-d546d9a93478-additional-scripts\") pod \"ovn-controller-2vh4d-config-tw9rz\" (UID: \"a45332c4-e5b6-49d5-b55b-d546d9a93478\") " pod="openstack/ovn-controller-2vh4d-config-tw9rz"
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.614456 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a45332c4-e5b6-49d5-b55b-d546d9a93478-scripts\") pod \"ovn-controller-2vh4d-config-tw9rz\" (UID: \"a45332c4-e5b6-49d5-b55b-d546d9a93478\") " pod="openstack/ovn-controller-2vh4d-config-tw9rz"
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.632391 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d574\" (UniqueName: \"kubernetes.io/projected/a45332c4-e5b6-49d5-b55b-d546d9a93478-kube-api-access-5d574\") pod \"ovn-controller-2vh4d-config-tw9rz\" (UID: \"a45332c4-e5b6-49d5-b55b-d546d9a93478\") " pod="openstack/ovn-controller-2vh4d-config-tw9rz"
Feb 02 12:09:46 crc kubenswrapper[4909]: I0202 12:09:46.690562 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2vh4d-config-tw9rz"
Feb 02 12:09:47 crc kubenswrapper[4909]: I0202 12:09:47.031172 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04a1cef1-7283-4a56-bda9-17142a32a84e" path="/var/lib/kubelet/pods/04a1cef1-7283-4a56-bda9-17142a32a84e/volumes"
Feb 02 12:09:47 crc kubenswrapper[4909]: I0202 12:09:47.241040 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2vh4d-config-tw9rz"]
Feb 02 12:09:47 crc kubenswrapper[4909]: W0202 12:09:47.247652 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda45332c4_e5b6_49d5_b55b_d546d9a93478.slice/crio-f86f604b9cb6acbee4fd890ebd99f55a0966ec9728c169ba6c9ce8c864e88512 WatchSource:0}: Error finding container f86f604b9cb6acbee4fd890ebd99f55a0966ec9728c169ba6c9ce8c864e88512: Status 404 returned error can't find the container with id f86f604b9cb6acbee4fd890ebd99f55a0966ec9728c169ba6c9ce8c864e88512
Feb 02 12:09:48 crc kubenswrapper[4909]: I0202 12:09:48.199036 4909 generic.go:334] "Generic (PLEG): container finished" podID="a45332c4-e5b6-49d5-b55b-d546d9a93478" containerID="458bc11f9c82911eec716c469623b3293d4b82ddfd5dac84759fc610c1f9f30c" exitCode=0
Feb 02 12:09:48 crc kubenswrapper[4909]: I0202 12:09:48.199090 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2vh4d-config-tw9rz" event={"ID":"a45332c4-e5b6-49d5-b55b-d546d9a93478","Type":"ContainerDied","Data":"458bc11f9c82911eec716c469623b3293d4b82ddfd5dac84759fc610c1f9f30c"}
Feb 02 12:09:48 crc kubenswrapper[4909]: I0202 12:09:48.199534 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2vh4d-config-tw9rz" event={"ID":"a45332c4-e5b6-49d5-b55b-d546d9a93478","Type":"ContainerStarted","Data":"f86f604b9cb6acbee4fd890ebd99f55a0966ec9728c169ba6c9ce8c864e88512"}
Feb 02 12:09:49 crc kubenswrapper[4909]: I0202 12:09:49.016374 4909 scope.go:117] "RemoveContainer" containerID="2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783"
Feb 02 12:09:49 crc kubenswrapper[4909]: E0202 12:09:49.017018 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 12:09:54 crc kubenswrapper[4909]: I0202 12:09:54.906942 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2vh4d-config-tw9rz"
Feb 02 12:09:55 crc kubenswrapper[4909]: I0202 12:09:55.019004 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a45332c4-e5b6-49d5-b55b-d546d9a93478-additional-scripts\") pod \"a45332c4-e5b6-49d5-b55b-d546d9a93478\" (UID: \"a45332c4-e5b6-49d5-b55b-d546d9a93478\") "
Feb 02 12:09:55 crc kubenswrapper[4909]: I0202 12:09:55.019189 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a45332c4-e5b6-49d5-b55b-d546d9a93478-var-run-ovn\") pod \"a45332c4-e5b6-49d5-b55b-d546d9a93478\" (UID: \"a45332c4-e5b6-49d5-b55b-d546d9a93478\") "
Feb 02 12:09:55 crc kubenswrapper[4909]: I0202 12:09:55.019320 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a45332c4-e5b6-49d5-b55b-d546d9a93478-scripts\") pod \"a45332c4-e5b6-49d5-b55b-d546d9a93478\" (UID: \"a45332c4-e5b6-49d5-b55b-d546d9a93478\") "
Feb 02 12:09:55 crc kubenswrapper[4909]: I0202 12:09:55.019715 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"var-run\" (UniqueName: \"kubernetes.io/host-path/a45332c4-e5b6-49d5-b55b-d546d9a93478-var-run\") pod \"a45332c4-e5b6-49d5-b55b-d546d9a93478\" (UID: \"a45332c4-e5b6-49d5-b55b-d546d9a93478\") " Feb 02 12:09:55 crc kubenswrapper[4909]: I0202 12:09:55.019739 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a45332c4-e5b6-49d5-b55b-d546d9a93478-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "a45332c4-e5b6-49d5-b55b-d546d9a93478" (UID: "a45332c4-e5b6-49d5-b55b-d546d9a93478"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:09:55 crc kubenswrapper[4909]: I0202 12:09:55.019784 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a45332c4-e5b6-49d5-b55b-d546d9a93478-var-log-ovn\") pod \"a45332c4-e5b6-49d5-b55b-d546d9a93478\" (UID: \"a45332c4-e5b6-49d5-b55b-d546d9a93478\") " Feb 02 12:09:55 crc kubenswrapper[4909]: I0202 12:09:55.019841 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a45332c4-e5b6-49d5-b55b-d546d9a93478-var-run" (OuterVolumeSpecName: "var-run") pod "a45332c4-e5b6-49d5-b55b-d546d9a93478" (UID: "a45332c4-e5b6-49d5-b55b-d546d9a93478"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 12:09:55 crc kubenswrapper[4909]: I0202 12:09:55.019857 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d574\" (UniqueName: \"kubernetes.io/projected/a45332c4-e5b6-49d5-b55b-d546d9a93478-kube-api-access-5d574\") pod \"a45332c4-e5b6-49d5-b55b-d546d9a93478\" (UID: \"a45332c4-e5b6-49d5-b55b-d546d9a93478\") " Feb 02 12:09:55 crc kubenswrapper[4909]: I0202 12:09:55.019951 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a45332c4-e5b6-49d5-b55b-d546d9a93478-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "a45332c4-e5b6-49d5-b55b-d546d9a93478" (UID: "a45332c4-e5b6-49d5-b55b-d546d9a93478"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 12:09:55 crc kubenswrapper[4909]: I0202 12:09:55.020350 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a45332c4-e5b6-49d5-b55b-d546d9a93478-scripts" (OuterVolumeSpecName: "scripts") pod "a45332c4-e5b6-49d5-b55b-d546d9a93478" (UID: "a45332c4-e5b6-49d5-b55b-d546d9a93478"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:09:55 crc kubenswrapper[4909]: I0202 12:09:55.019466 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a45332c4-e5b6-49d5-b55b-d546d9a93478-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "a45332c4-e5b6-49d5-b55b-d546d9a93478" (UID: "a45332c4-e5b6-49d5-b55b-d546d9a93478"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 12:09:55 crc kubenswrapper[4909]: I0202 12:09:55.021288 4909 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a45332c4-e5b6-49d5-b55b-d546d9a93478-var-run\") on node \"crc\" DevicePath \"\"" Feb 02 12:09:55 crc kubenswrapper[4909]: I0202 12:09:55.021312 4909 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a45332c4-e5b6-49d5-b55b-d546d9a93478-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 02 12:09:55 crc kubenswrapper[4909]: I0202 12:09:55.021325 4909 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a45332c4-e5b6-49d5-b55b-d546d9a93478-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:09:55 crc kubenswrapper[4909]: I0202 12:09:55.021338 4909 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a45332c4-e5b6-49d5-b55b-d546d9a93478-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 02 12:09:55 crc kubenswrapper[4909]: I0202 12:09:55.021350 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a45332c4-e5b6-49d5-b55b-d546d9a93478-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:09:55 crc kubenswrapper[4909]: I0202 12:09:55.031037 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a45332c4-e5b6-49d5-b55b-d546d9a93478-kube-api-access-5d574" (OuterVolumeSpecName: "kube-api-access-5d574") pod "a45332c4-e5b6-49d5-b55b-d546d9a93478" (UID: "a45332c4-e5b6-49d5-b55b-d546d9a93478"). InnerVolumeSpecName "kube-api-access-5d574". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:09:55 crc kubenswrapper[4909]: I0202 12:09:55.123665 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5d574\" (UniqueName: \"kubernetes.io/projected/a45332c4-e5b6-49d5-b55b-d546d9a93478-kube-api-access-5d574\") on node \"crc\" DevicePath \"\"" Feb 02 12:09:55 crc kubenswrapper[4909]: I0202 12:09:55.283157 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2vh4d-config-tw9rz" Feb 02 12:09:55 crc kubenswrapper[4909]: I0202 12:09:55.283973 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2vh4d-config-tw9rz" event={"ID":"a45332c4-e5b6-49d5-b55b-d546d9a93478","Type":"ContainerDied","Data":"f86f604b9cb6acbee4fd890ebd99f55a0966ec9728c169ba6c9ce8c864e88512"} Feb 02 12:09:55 crc kubenswrapper[4909]: I0202 12:09:55.284002 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f86f604b9cb6acbee4fd890ebd99f55a0966ec9728c169ba6c9ce8c864e88512" Feb 02 12:09:55 crc kubenswrapper[4909]: I0202 12:09:55.287642 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6c4564d67-6x9nd" event={"ID":"d5e46e05-22cf-4148-b7de-7f6330986a54","Type":"ContainerStarted","Data":"612f69c857b17828a3eaffc72cf8d268c5c50b3a0e8ae12a707d5f96f60eca5d"} Feb 02 12:09:55 crc kubenswrapper[4909]: I0202 12:09:55.996418 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-2vh4d-config-tw9rz"] Feb 02 12:09:56 crc kubenswrapper[4909]: I0202 12:09:56.005991 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-2vh4d-config-tw9rz"] Feb 02 12:09:56 crc kubenswrapper[4909]: I0202 12:09:56.297626 4909 generic.go:334] "Generic (PLEG): container finished" podID="d5e46e05-22cf-4148-b7de-7f6330986a54" containerID="612f69c857b17828a3eaffc72cf8d268c5c50b3a0e8ae12a707d5f96f60eca5d" exitCode=0 Feb 02 12:09:56 crc 
kubenswrapper[4909]: I0202 12:09:56.297668 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6c4564d67-6x9nd" event={"ID":"d5e46e05-22cf-4148-b7de-7f6330986a54","Type":"ContainerDied","Data":"612f69c857b17828a3eaffc72cf8d268c5c50b3a0e8ae12a707d5f96f60eca5d"} Feb 02 12:09:57 crc kubenswrapper[4909]: I0202 12:09:57.030240 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a45332c4-e5b6-49d5-b55b-d546d9a93478" path="/var/lib/kubelet/pods/a45332c4-e5b6-49d5-b55b-d546d9a93478/volumes" Feb 02 12:09:57 crc kubenswrapper[4909]: I0202 12:09:57.308635 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6c4564d67-6x9nd" event={"ID":"d5e46e05-22cf-4148-b7de-7f6330986a54","Type":"ContainerStarted","Data":"d9fb5658160bc102a00d229574579f5702d7aba83ce0e5675773e70369311d79"} Feb 02 12:09:57 crc kubenswrapper[4909]: I0202 12:09:57.308676 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6c4564d67-6x9nd" event={"ID":"d5e46e05-22cf-4148-b7de-7f6330986a54","Type":"ContainerStarted","Data":"67fb41ad4f0b2573c67eb8566c1fd90245597bd8af2164d4024f2a5e50382ce2"} Feb 02 12:09:57 crc kubenswrapper[4909]: I0202 12:09:57.308906 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-6c4564d67-6x9nd" Feb 02 12:09:57 crc kubenswrapper[4909]: I0202 12:09:57.308930 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-6c4564d67-6x9nd" Feb 02 12:09:57 crc kubenswrapper[4909]: I0202 12:09:57.334334 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-6c4564d67-6x9nd" podStartSLOduration=3.390891634 podStartE2EDuration="14.334313305s" podCreationTimestamp="2026-02-02 12:09:43 +0000 UTC" firstStartedPulling="2026-02-02 12:09:43.954593772 +0000 UTC m=+5909.700694507" lastFinishedPulling="2026-02-02 12:09:54.898015443 +0000 UTC m=+5920.644116178" 
observedRunningTime="2026-02-02 12:09:57.334281544 +0000 UTC m=+5923.080382279" watchObservedRunningTime="2026-02-02 12:09:57.334313305 +0000 UTC m=+5923.080414040" Feb 02 12:10:00 crc kubenswrapper[4909]: I0202 12:10:00.016772 4909 scope.go:117] "RemoveContainer" containerID="2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783" Feb 02 12:10:00 crc kubenswrapper[4909]: E0202 12:10:00.018127 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:10:12 crc kubenswrapper[4909]: I0202 12:10:12.879380 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-fjmbh"] Feb 02 12:10:12 crc kubenswrapper[4909]: E0202 12:10:12.880323 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a45332c4-e5b6-49d5-b55b-d546d9a93478" containerName="ovn-config" Feb 02 12:10:12 crc kubenswrapper[4909]: I0202 12:10:12.880340 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a45332c4-e5b6-49d5-b55b-d546d9a93478" containerName="ovn-config" Feb 02 12:10:12 crc kubenswrapper[4909]: I0202 12:10:12.880582 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a45332c4-e5b6-49d5-b55b-d546d9a93478" containerName="ovn-config" Feb 02 12:10:12 crc kubenswrapper[4909]: I0202 12:10:12.881651 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-rsyslog-fjmbh" Feb 02 12:10:12 crc kubenswrapper[4909]: I0202 12:10:12.884714 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Feb 02 12:10:12 crc kubenswrapper[4909]: I0202 12:10:12.884956 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Feb 02 12:10:12 crc kubenswrapper[4909]: I0202 12:10:12.885272 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Feb 02 12:10:12 crc kubenswrapper[4909]: I0202 12:10:12.893110 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-fjmbh"] Feb 02 12:10:12 crc kubenswrapper[4909]: I0202 12:10:12.960635 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/2a570f17-3d51-4c5f-a813-e3f233631ef7-hm-ports\") pod \"octavia-rsyslog-fjmbh\" (UID: \"2a570f17-3d51-4c5f-a813-e3f233631ef7\") " pod="openstack/octavia-rsyslog-fjmbh" Feb 02 12:10:12 crc kubenswrapper[4909]: I0202 12:10:12.960869 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a570f17-3d51-4c5f-a813-e3f233631ef7-config-data\") pod \"octavia-rsyslog-fjmbh\" (UID: \"2a570f17-3d51-4c5f-a813-e3f233631ef7\") " pod="openstack/octavia-rsyslog-fjmbh" Feb 02 12:10:12 crc kubenswrapper[4909]: I0202 12:10:12.961013 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a570f17-3d51-4c5f-a813-e3f233631ef7-scripts\") pod \"octavia-rsyslog-fjmbh\" (UID: \"2a570f17-3d51-4c5f-a813-e3f233631ef7\") " pod="openstack/octavia-rsyslog-fjmbh" Feb 02 12:10:12 crc kubenswrapper[4909]: I0202 12:10:12.961045 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2a570f17-3d51-4c5f-a813-e3f233631ef7-config-data-merged\") pod \"octavia-rsyslog-fjmbh\" (UID: \"2a570f17-3d51-4c5f-a813-e3f233631ef7\") " pod="openstack/octavia-rsyslog-fjmbh" Feb 02 12:10:13 crc kubenswrapper[4909]: I0202 12:10:13.062737 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a570f17-3d51-4c5f-a813-e3f233631ef7-scripts\") pod \"octavia-rsyslog-fjmbh\" (UID: \"2a570f17-3d51-4c5f-a813-e3f233631ef7\") " pod="openstack/octavia-rsyslog-fjmbh" Feb 02 12:10:13 crc kubenswrapper[4909]: I0202 12:10:13.062787 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2a570f17-3d51-4c5f-a813-e3f233631ef7-config-data-merged\") pod \"octavia-rsyslog-fjmbh\" (UID: \"2a570f17-3d51-4c5f-a813-e3f233631ef7\") " pod="openstack/octavia-rsyslog-fjmbh" Feb 02 12:10:13 crc kubenswrapper[4909]: I0202 12:10:13.062898 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/2a570f17-3d51-4c5f-a813-e3f233631ef7-hm-ports\") pod \"octavia-rsyslog-fjmbh\" (UID: \"2a570f17-3d51-4c5f-a813-e3f233631ef7\") " pod="openstack/octavia-rsyslog-fjmbh" Feb 02 12:10:13 crc kubenswrapper[4909]: I0202 12:10:13.062965 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a570f17-3d51-4c5f-a813-e3f233631ef7-config-data\") pod \"octavia-rsyslog-fjmbh\" (UID: \"2a570f17-3d51-4c5f-a813-e3f233631ef7\") " pod="openstack/octavia-rsyslog-fjmbh" Feb 02 12:10:13 crc kubenswrapper[4909]: I0202 12:10:13.065939 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: 
\"kubernetes.io/configmap/2a570f17-3d51-4c5f-a813-e3f233631ef7-hm-ports\") pod \"octavia-rsyslog-fjmbh\" (UID: \"2a570f17-3d51-4c5f-a813-e3f233631ef7\") " pod="openstack/octavia-rsyslog-fjmbh" Feb 02 12:10:13 crc kubenswrapper[4909]: I0202 12:10:13.066362 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2a570f17-3d51-4c5f-a813-e3f233631ef7-config-data-merged\") pod \"octavia-rsyslog-fjmbh\" (UID: \"2a570f17-3d51-4c5f-a813-e3f233631ef7\") " pod="openstack/octavia-rsyslog-fjmbh" Feb 02 12:10:13 crc kubenswrapper[4909]: I0202 12:10:13.069326 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a570f17-3d51-4c5f-a813-e3f233631ef7-config-data\") pod \"octavia-rsyslog-fjmbh\" (UID: \"2a570f17-3d51-4c5f-a813-e3f233631ef7\") " pod="openstack/octavia-rsyslog-fjmbh" Feb 02 12:10:13 crc kubenswrapper[4909]: I0202 12:10:13.083755 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a570f17-3d51-4c5f-a813-e3f233631ef7-scripts\") pod \"octavia-rsyslog-fjmbh\" (UID: \"2a570f17-3d51-4c5f-a813-e3f233631ef7\") " pod="openstack/octavia-rsyslog-fjmbh" Feb 02 12:10:13 crc kubenswrapper[4909]: I0202 12:10:13.246403 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-fjmbh" Feb 02 12:10:13 crc kubenswrapper[4909]: I0202 12:10:13.706137 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-kbcql"] Feb 02 12:10:13 crc kubenswrapper[4909]: I0202 12:10:13.708633 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-65dd99cb46-kbcql" Feb 02 12:10:13 crc kubenswrapper[4909]: I0202 12:10:13.721280 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Feb 02 12:10:13 crc kubenswrapper[4909]: I0202 12:10:13.737493 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-kbcql"] Feb 02 12:10:13 crc kubenswrapper[4909]: I0202 12:10:13.782392 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a4727a1e-e236-4f4f-afb6-baf063cbf292-httpd-config\") pod \"octavia-image-upload-65dd99cb46-kbcql\" (UID: \"a4727a1e-e236-4f4f-afb6-baf063cbf292\") " pod="openstack/octavia-image-upload-65dd99cb46-kbcql" Feb 02 12:10:13 crc kubenswrapper[4909]: I0202 12:10:13.782550 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/a4727a1e-e236-4f4f-afb6-baf063cbf292-amphora-image\") pod \"octavia-image-upload-65dd99cb46-kbcql\" (UID: \"a4727a1e-e236-4f4f-afb6-baf063cbf292\") " pod="openstack/octavia-image-upload-65dd99cb46-kbcql" Feb 02 12:10:13 crc kubenswrapper[4909]: I0202 12:10:13.838378 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-fjmbh"] Feb 02 12:10:13 crc kubenswrapper[4909]: I0202 12:10:13.884974 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a4727a1e-e236-4f4f-afb6-baf063cbf292-httpd-config\") pod \"octavia-image-upload-65dd99cb46-kbcql\" (UID: \"a4727a1e-e236-4f4f-afb6-baf063cbf292\") " pod="openstack/octavia-image-upload-65dd99cb46-kbcql" Feb 02 12:10:13 crc kubenswrapper[4909]: I0202 12:10:13.885152 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: 
\"kubernetes.io/empty-dir/a4727a1e-e236-4f4f-afb6-baf063cbf292-amphora-image\") pod \"octavia-image-upload-65dd99cb46-kbcql\" (UID: \"a4727a1e-e236-4f4f-afb6-baf063cbf292\") " pod="openstack/octavia-image-upload-65dd99cb46-kbcql" Feb 02 12:10:13 crc kubenswrapper[4909]: I0202 12:10:13.885880 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/a4727a1e-e236-4f4f-afb6-baf063cbf292-amphora-image\") pod \"octavia-image-upload-65dd99cb46-kbcql\" (UID: \"a4727a1e-e236-4f4f-afb6-baf063cbf292\") " pod="openstack/octavia-image-upload-65dd99cb46-kbcql" Feb 02 12:10:13 crc kubenswrapper[4909]: I0202 12:10:13.891729 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a4727a1e-e236-4f4f-afb6-baf063cbf292-httpd-config\") pod \"octavia-image-upload-65dd99cb46-kbcql\" (UID: \"a4727a1e-e236-4f4f-afb6-baf063cbf292\") " pod="openstack/octavia-image-upload-65dd99cb46-kbcql" Feb 02 12:10:13 crc kubenswrapper[4909]: I0202 12:10:13.989784 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-fjmbh"] Feb 02 12:10:14 crc kubenswrapper[4909]: I0202 12:10:14.051988 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-65dd99cb46-kbcql" Feb 02 12:10:14 crc kubenswrapper[4909]: I0202 12:10:14.459598 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-fjmbh" event={"ID":"2a570f17-3d51-4c5f-a813-e3f233631ef7","Type":"ContainerStarted","Data":"b9e299e470d558ade0f22611450ee4898fc8c4df7dec84e9b8d2997827833a9e"} Feb 02 12:10:14 crc kubenswrapper[4909]: I0202 12:10:14.533004 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-kbcql"] Feb 02 12:10:14 crc kubenswrapper[4909]: I0202 12:10:14.747232 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-85xwk"] Feb 02 12:10:14 crc kubenswrapper[4909]: I0202 12:10:14.748996 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-85xwk" Feb 02 12:10:14 crc kubenswrapper[4909]: I0202 12:10:14.751202 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Feb 02 12:10:14 crc kubenswrapper[4909]: I0202 12:10:14.762718 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-85xwk"] Feb 02 12:10:14 crc kubenswrapper[4909]: I0202 12:10:14.818330 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/45214a2c-8581-4551-b61e-67df0ba0fc95-config-data-merged\") pod \"octavia-db-sync-85xwk\" (UID: \"45214a2c-8581-4551-b61e-67df0ba0fc95\") " pod="openstack/octavia-db-sync-85xwk" Feb 02 12:10:14 crc kubenswrapper[4909]: I0202 12:10:14.818469 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45214a2c-8581-4551-b61e-67df0ba0fc95-scripts\") pod \"octavia-db-sync-85xwk\" (UID: \"45214a2c-8581-4551-b61e-67df0ba0fc95\") " pod="openstack/octavia-db-sync-85xwk" Feb 02 
12:10:14 crc kubenswrapper[4909]: I0202 12:10:14.818522 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45214a2c-8581-4551-b61e-67df0ba0fc95-config-data\") pod \"octavia-db-sync-85xwk\" (UID: \"45214a2c-8581-4551-b61e-67df0ba0fc95\") " pod="openstack/octavia-db-sync-85xwk" Feb 02 12:10:14 crc kubenswrapper[4909]: I0202 12:10:14.818699 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45214a2c-8581-4551-b61e-67df0ba0fc95-combined-ca-bundle\") pod \"octavia-db-sync-85xwk\" (UID: \"45214a2c-8581-4551-b61e-67df0ba0fc95\") " pod="openstack/octavia-db-sync-85xwk" Feb 02 12:10:14 crc kubenswrapper[4909]: I0202 12:10:14.921049 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/45214a2c-8581-4551-b61e-67df0ba0fc95-config-data-merged\") pod \"octavia-db-sync-85xwk\" (UID: \"45214a2c-8581-4551-b61e-67df0ba0fc95\") " pod="openstack/octavia-db-sync-85xwk" Feb 02 12:10:14 crc kubenswrapper[4909]: I0202 12:10:14.921145 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45214a2c-8581-4551-b61e-67df0ba0fc95-scripts\") pod \"octavia-db-sync-85xwk\" (UID: \"45214a2c-8581-4551-b61e-67df0ba0fc95\") " pod="openstack/octavia-db-sync-85xwk" Feb 02 12:10:14 crc kubenswrapper[4909]: I0202 12:10:14.921201 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45214a2c-8581-4551-b61e-67df0ba0fc95-config-data\") pod \"octavia-db-sync-85xwk\" (UID: \"45214a2c-8581-4551-b61e-67df0ba0fc95\") " pod="openstack/octavia-db-sync-85xwk" Feb 02 12:10:14 crc kubenswrapper[4909]: I0202 12:10:14.921255 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45214a2c-8581-4551-b61e-67df0ba0fc95-combined-ca-bundle\") pod \"octavia-db-sync-85xwk\" (UID: \"45214a2c-8581-4551-b61e-67df0ba0fc95\") " pod="openstack/octavia-db-sync-85xwk" Feb 02 12:10:14 crc kubenswrapper[4909]: I0202 12:10:14.921735 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/45214a2c-8581-4551-b61e-67df0ba0fc95-config-data-merged\") pod \"octavia-db-sync-85xwk\" (UID: \"45214a2c-8581-4551-b61e-67df0ba0fc95\") " pod="openstack/octavia-db-sync-85xwk" Feb 02 12:10:14 crc kubenswrapper[4909]: I0202 12:10:14.927176 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45214a2c-8581-4551-b61e-67df0ba0fc95-config-data\") pod \"octavia-db-sync-85xwk\" (UID: \"45214a2c-8581-4551-b61e-67df0ba0fc95\") " pod="openstack/octavia-db-sync-85xwk" Feb 02 12:10:14 crc kubenswrapper[4909]: I0202 12:10:14.929274 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45214a2c-8581-4551-b61e-67df0ba0fc95-scripts\") pod \"octavia-db-sync-85xwk\" (UID: \"45214a2c-8581-4551-b61e-67df0ba0fc95\") " pod="openstack/octavia-db-sync-85xwk" Feb 02 12:10:14 crc kubenswrapper[4909]: I0202 12:10:14.929695 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45214a2c-8581-4551-b61e-67df0ba0fc95-combined-ca-bundle\") pod \"octavia-db-sync-85xwk\" (UID: \"45214a2c-8581-4551-b61e-67df0ba0fc95\") " pod="openstack/octavia-db-sync-85xwk" Feb 02 12:10:15 crc kubenswrapper[4909]: I0202 12:10:15.036102 4909 scope.go:117] "RemoveContainer" containerID="2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783" Feb 02 12:10:15 crc kubenswrapper[4909]: E0202 12:10:15.036899 4909 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:10:15 crc kubenswrapper[4909]: I0202 12:10:15.087546 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-85xwk" Feb 02 12:10:15 crc kubenswrapper[4909]: I0202 12:10:15.479587 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-kbcql" event={"ID":"a4727a1e-e236-4f4f-afb6-baf063cbf292","Type":"ContainerStarted","Data":"e10f7117b52552fdb2e4a11159b4c21dbc9802a7d7f3ea870fe08df5b499e901"} Feb 02 12:10:15 crc kubenswrapper[4909]: I0202 12:10:15.613020 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-85xwk"] Feb 02 12:10:16 crc kubenswrapper[4909]: I0202 12:10:16.500937 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-85xwk" event={"ID":"45214a2c-8581-4551-b61e-67df0ba0fc95","Type":"ContainerStarted","Data":"4fd9ab104737c22de985eec737ba2ff364bbbf46839e534c28ca326da1c9eec2"} Feb 02 12:10:16 crc kubenswrapper[4909]: I0202 12:10:16.500983 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-85xwk" event={"ID":"45214a2c-8581-4551-b61e-67df0ba0fc95","Type":"ContainerStarted","Data":"981406a853028c74c03ec7a146fbe59b92fb7092e006eec8be90aeaa72d129a4"} Feb 02 12:10:17 crc kubenswrapper[4909]: I0202 12:10:17.513366 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-fjmbh" 
event={"ID":"2a570f17-3d51-4c5f-a813-e3f233631ef7","Type":"ContainerStarted","Data":"f739f92ac6206ad4ab9ff153029680bd738e44f454fd1772af6396e58d5aa44e"} Feb 02 12:10:17 crc kubenswrapper[4909]: I0202 12:10:17.522628 4909 generic.go:334] "Generic (PLEG): container finished" podID="45214a2c-8581-4551-b61e-67df0ba0fc95" containerID="4fd9ab104737c22de985eec737ba2ff364bbbf46839e534c28ca326da1c9eec2" exitCode=0 Feb 02 12:10:17 crc kubenswrapper[4909]: I0202 12:10:17.522717 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-85xwk" event={"ID":"45214a2c-8581-4551-b61e-67df0ba0fc95","Type":"ContainerDied","Data":"4fd9ab104737c22de985eec737ba2ff364bbbf46839e534c28ca326da1c9eec2"} Feb 02 12:10:18 crc kubenswrapper[4909]: I0202 12:10:18.534303 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-85xwk" event={"ID":"45214a2c-8581-4551-b61e-67df0ba0fc95","Type":"ContainerStarted","Data":"55f98ab1d8456725fac00c3381b3f7e9c27849430d6b36039f807db276b30cf2"} Feb 02 12:10:18 crc kubenswrapper[4909]: I0202 12:10:18.565665 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-85xwk" podStartSLOduration=4.565646714 podStartE2EDuration="4.565646714s" podCreationTimestamp="2026-02-02 12:10:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:10:18.552497831 +0000 UTC m=+5944.298598556" watchObservedRunningTime="2026-02-02 12:10:18.565646714 +0000 UTC m=+5944.311747449" Feb 02 12:10:18 crc kubenswrapper[4909]: I0202 12:10:18.586717 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-6c4564d67-6x9nd" Feb 02 12:10:18 crc kubenswrapper[4909]: I0202 12:10:18.587876 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-6c4564d67-6x9nd" Feb 02 12:10:19 crc kubenswrapper[4909]: I0202 
12:10:19.548034 4909 generic.go:334] "Generic (PLEG): container finished" podID="2a570f17-3d51-4c5f-a813-e3f233631ef7" containerID="f739f92ac6206ad4ab9ff153029680bd738e44f454fd1772af6396e58d5aa44e" exitCode=0 Feb 02 12:10:19 crc kubenswrapper[4909]: I0202 12:10:19.548136 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-fjmbh" event={"ID":"2a570f17-3d51-4c5f-a813-e3f233631ef7","Type":"ContainerDied","Data":"f739f92ac6206ad4ab9ff153029680bd738e44f454fd1772af6396e58d5aa44e"} Feb 02 12:10:21 crc kubenswrapper[4909]: I0202 12:10:21.568167 4909 generic.go:334] "Generic (PLEG): container finished" podID="45214a2c-8581-4551-b61e-67df0ba0fc95" containerID="55f98ab1d8456725fac00c3381b3f7e9c27849430d6b36039f807db276b30cf2" exitCode=0 Feb 02 12:10:21 crc kubenswrapper[4909]: I0202 12:10:21.568362 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-85xwk" event={"ID":"45214a2c-8581-4551-b61e-67df0ba0fc95","Type":"ContainerDied","Data":"55f98ab1d8456725fac00c3381b3f7e9c27849430d6b36039f807db276b30cf2"} Feb 02 12:10:21 crc kubenswrapper[4909]: I0202 12:10:21.708026 4909 scope.go:117] "RemoveContainer" containerID="656cfb0a9ef48145fe7201d7cc35dc862681571cc74b41c47675bab7c96240a9" Feb 02 12:10:25 crc kubenswrapper[4909]: I0202 12:10:25.199694 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-85xwk" Feb 02 12:10:25 crc kubenswrapper[4909]: I0202 12:10:25.248547 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45214a2c-8581-4551-b61e-67df0ba0fc95-scripts\") pod \"45214a2c-8581-4551-b61e-67df0ba0fc95\" (UID: \"45214a2c-8581-4551-b61e-67df0ba0fc95\") " Feb 02 12:10:25 crc kubenswrapper[4909]: I0202 12:10:25.248606 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45214a2c-8581-4551-b61e-67df0ba0fc95-combined-ca-bundle\") pod \"45214a2c-8581-4551-b61e-67df0ba0fc95\" (UID: \"45214a2c-8581-4551-b61e-67df0ba0fc95\") " Feb 02 12:10:25 crc kubenswrapper[4909]: I0202 12:10:25.248855 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45214a2c-8581-4551-b61e-67df0ba0fc95-config-data\") pod \"45214a2c-8581-4551-b61e-67df0ba0fc95\" (UID: \"45214a2c-8581-4551-b61e-67df0ba0fc95\") " Feb 02 12:10:25 crc kubenswrapper[4909]: I0202 12:10:25.249077 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/45214a2c-8581-4551-b61e-67df0ba0fc95-config-data-merged\") pod \"45214a2c-8581-4551-b61e-67df0ba0fc95\" (UID: \"45214a2c-8581-4551-b61e-67df0ba0fc95\") " Feb 02 12:10:25 crc kubenswrapper[4909]: I0202 12:10:25.254048 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45214a2c-8581-4551-b61e-67df0ba0fc95-scripts" (OuterVolumeSpecName: "scripts") pod "45214a2c-8581-4551-b61e-67df0ba0fc95" (UID: "45214a2c-8581-4551-b61e-67df0ba0fc95"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:10:25 crc kubenswrapper[4909]: I0202 12:10:25.270633 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45214a2c-8581-4551-b61e-67df0ba0fc95-config-data" (OuterVolumeSpecName: "config-data") pod "45214a2c-8581-4551-b61e-67df0ba0fc95" (UID: "45214a2c-8581-4551-b61e-67df0ba0fc95"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:10:25 crc kubenswrapper[4909]: I0202 12:10:25.277860 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45214a2c-8581-4551-b61e-67df0ba0fc95-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "45214a2c-8581-4551-b61e-67df0ba0fc95" (UID: "45214a2c-8581-4551-b61e-67df0ba0fc95"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:10:25 crc kubenswrapper[4909]: I0202 12:10:25.289991 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45214a2c-8581-4551-b61e-67df0ba0fc95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "45214a2c-8581-4551-b61e-67df0ba0fc95" (UID: "45214a2c-8581-4551-b61e-67df0ba0fc95"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:10:25 crc kubenswrapper[4909]: I0202 12:10:25.352027 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45214a2c-8581-4551-b61e-67df0ba0fc95-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:10:25 crc kubenswrapper[4909]: I0202 12:10:25.352068 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45214a2c-8581-4551-b61e-67df0ba0fc95-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:10:25 crc kubenswrapper[4909]: I0202 12:10:25.352082 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45214a2c-8581-4551-b61e-67df0ba0fc95-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:10:25 crc kubenswrapper[4909]: I0202 12:10:25.352094 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/45214a2c-8581-4551-b61e-67df0ba0fc95-config-data-merged\") on node \"crc\" DevicePath \"\"" Feb 02 12:10:25 crc kubenswrapper[4909]: I0202 12:10:25.616735 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-85xwk" event={"ID":"45214a2c-8581-4551-b61e-67df0ba0fc95","Type":"ContainerDied","Data":"981406a853028c74c03ec7a146fbe59b92fb7092e006eec8be90aeaa72d129a4"} Feb 02 12:10:25 crc kubenswrapper[4909]: I0202 12:10:25.617078 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="981406a853028c74c03ec7a146fbe59b92fb7092e006eec8be90aeaa72d129a4" Feb 02 12:10:25 crc kubenswrapper[4909]: I0202 12:10:25.616790 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-85xwk" Feb 02 12:10:26 crc kubenswrapper[4909]: I0202 12:10:26.635359 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-kbcql" event={"ID":"a4727a1e-e236-4f4f-afb6-baf063cbf292","Type":"ContainerStarted","Data":"802ba0cef3bb440883b7e5de8a341ab0fc3a491c7fe62180801162b39f35d245"} Feb 02 12:10:26 crc kubenswrapper[4909]: I0202 12:10:26.640079 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-fjmbh" event={"ID":"2a570f17-3d51-4c5f-a813-e3f233631ef7","Type":"ContainerStarted","Data":"4cfd02b8ad1860a155098b15b9c03a111ed0f85234b06e188af8af503ee426de"} Feb 02 12:10:26 crc kubenswrapper[4909]: I0202 12:10:26.641252 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-fjmbh" Feb 02 12:10:26 crc kubenswrapper[4909]: I0202 12:10:26.690990 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-fjmbh" podStartSLOduration=2.8497735239999997 podStartE2EDuration="14.690968308s" podCreationTimestamp="2026-02-02 12:10:12 +0000 UTC" firstStartedPulling="2026-02-02 12:10:13.858701104 +0000 UTC m=+5939.604801849" lastFinishedPulling="2026-02-02 12:10:25.699895898 +0000 UTC m=+5951.445996633" observedRunningTime="2026-02-02 12:10:26.680988234 +0000 UTC m=+5952.427088969" watchObservedRunningTime="2026-02-02 12:10:26.690968308 +0000 UTC m=+5952.437069053" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.005988 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-6744779749-884lr"] Feb 02 12:10:27 crc kubenswrapper[4909]: E0202 12:10:27.006537 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45214a2c-8581-4551-b61e-67df0ba0fc95" containerName="octavia-db-sync" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.006561 4909 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="45214a2c-8581-4551-b61e-67df0ba0fc95" containerName="octavia-db-sync" Feb 02 12:10:27 crc kubenswrapper[4909]: E0202 12:10:27.006585 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45214a2c-8581-4551-b61e-67df0ba0fc95" containerName="init" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.006594 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="45214a2c-8581-4551-b61e-67df0ba0fc95" containerName="init" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.006840 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="45214a2c-8581-4551-b61e-67df0ba0fc95" containerName="octavia-db-sync" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.008883 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.012626 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-octavia-public-svc" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.012911 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-octavia-internal-svc" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.016919 4909 scope.go:117] "RemoveContainer" containerID="2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783" Feb 02 12:10:27 crc kubenswrapper[4909]: E0202 12:10:27.017239 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.030272 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/octavia-api-6744779749-884lr"] Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.091873 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3d43a55d-45af-4682-9319-ef98614343d4-config-data-merged\") pod \"octavia-api-6744779749-884lr\" (UID: \"3d43a55d-45af-4682-9319-ef98614343d4\") " pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.092896 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d43a55d-45af-4682-9319-ef98614343d4-combined-ca-bundle\") pod \"octavia-api-6744779749-884lr\" (UID: \"3d43a55d-45af-4682-9319-ef98614343d4\") " pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.093016 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d43a55d-45af-4682-9319-ef98614343d4-scripts\") pod \"octavia-api-6744779749-884lr\" (UID: \"3d43a55d-45af-4682-9319-ef98614343d4\") " pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.093406 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d43a55d-45af-4682-9319-ef98614343d4-public-tls-certs\") pod \"octavia-api-6744779749-884lr\" (UID: \"3d43a55d-45af-4682-9319-ef98614343d4\") " pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.093505 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d43a55d-45af-4682-9319-ef98614343d4-ovndb-tls-certs\") pod \"octavia-api-6744779749-884lr\" (UID: 
\"3d43a55d-45af-4682-9319-ef98614343d4\") " pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.093572 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d43a55d-45af-4682-9319-ef98614343d4-config-data\") pod \"octavia-api-6744779749-884lr\" (UID: \"3d43a55d-45af-4682-9319-ef98614343d4\") " pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.093669 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/3d43a55d-45af-4682-9319-ef98614343d4-octavia-run\") pod \"octavia-api-6744779749-884lr\" (UID: \"3d43a55d-45af-4682-9319-ef98614343d4\") " pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.094182 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d43a55d-45af-4682-9319-ef98614343d4-internal-tls-certs\") pod \"octavia-api-6744779749-884lr\" (UID: \"3d43a55d-45af-4682-9319-ef98614343d4\") " pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.196050 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d43a55d-45af-4682-9319-ef98614343d4-internal-tls-certs\") pod \"octavia-api-6744779749-884lr\" (UID: \"3d43a55d-45af-4682-9319-ef98614343d4\") " pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.196163 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3d43a55d-45af-4682-9319-ef98614343d4-config-data-merged\") pod \"octavia-api-6744779749-884lr\" 
(UID: \"3d43a55d-45af-4682-9319-ef98614343d4\") " pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.196233 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d43a55d-45af-4682-9319-ef98614343d4-combined-ca-bundle\") pod \"octavia-api-6744779749-884lr\" (UID: \"3d43a55d-45af-4682-9319-ef98614343d4\") " pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.196285 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d43a55d-45af-4682-9319-ef98614343d4-scripts\") pod \"octavia-api-6744779749-884lr\" (UID: \"3d43a55d-45af-4682-9319-ef98614343d4\") " pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.196313 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d43a55d-45af-4682-9319-ef98614343d4-public-tls-certs\") pod \"octavia-api-6744779749-884lr\" (UID: \"3d43a55d-45af-4682-9319-ef98614343d4\") " pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.196357 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d43a55d-45af-4682-9319-ef98614343d4-ovndb-tls-certs\") pod \"octavia-api-6744779749-884lr\" (UID: \"3d43a55d-45af-4682-9319-ef98614343d4\") " pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.196384 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d43a55d-45af-4682-9319-ef98614343d4-config-data\") pod \"octavia-api-6744779749-884lr\" (UID: \"3d43a55d-45af-4682-9319-ef98614343d4\") " 
pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.196434 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/3d43a55d-45af-4682-9319-ef98614343d4-octavia-run\") pod \"octavia-api-6744779749-884lr\" (UID: \"3d43a55d-45af-4682-9319-ef98614343d4\") " pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.196681 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3d43a55d-45af-4682-9319-ef98614343d4-config-data-merged\") pod \"octavia-api-6744779749-884lr\" (UID: \"3d43a55d-45af-4682-9319-ef98614343d4\") " pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.197732 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/3d43a55d-45af-4682-9319-ef98614343d4-octavia-run\") pod \"octavia-api-6744779749-884lr\" (UID: \"3d43a55d-45af-4682-9319-ef98614343d4\") " pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.215783 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d43a55d-45af-4682-9319-ef98614343d4-public-tls-certs\") pod \"octavia-api-6744779749-884lr\" (UID: \"3d43a55d-45af-4682-9319-ef98614343d4\") " pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.215977 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d43a55d-45af-4682-9319-ef98614343d4-combined-ca-bundle\") pod \"octavia-api-6744779749-884lr\" (UID: \"3d43a55d-45af-4682-9319-ef98614343d4\") " pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:27 crc 
kubenswrapper[4909]: I0202 12:10:27.216752 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d43a55d-45af-4682-9319-ef98614343d4-internal-tls-certs\") pod \"octavia-api-6744779749-884lr\" (UID: \"3d43a55d-45af-4682-9319-ef98614343d4\") " pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.218332 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d43a55d-45af-4682-9319-ef98614343d4-config-data\") pod \"octavia-api-6744779749-884lr\" (UID: \"3d43a55d-45af-4682-9319-ef98614343d4\") " pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.219183 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d43a55d-45af-4682-9319-ef98614343d4-scripts\") pod \"octavia-api-6744779749-884lr\" (UID: \"3d43a55d-45af-4682-9319-ef98614343d4\") " pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.220337 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d43a55d-45af-4682-9319-ef98614343d4-ovndb-tls-certs\") pod \"octavia-api-6744779749-884lr\" (UID: \"3d43a55d-45af-4682-9319-ef98614343d4\") " pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.348665 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:27 crc kubenswrapper[4909]: I0202 12:10:27.817862 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-6744779749-884lr"] Feb 02 12:10:28 crc kubenswrapper[4909]: I0202 12:10:28.658406 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6744779749-884lr" event={"ID":"3d43a55d-45af-4682-9319-ef98614343d4","Type":"ContainerStarted","Data":"4c2d6f2f052c9b37b68088464e43a14d12da20babe3f52d89492e87bc0824752"} Feb 02 12:10:28 crc kubenswrapper[4909]: I0202 12:10:28.658823 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6744779749-884lr" event={"ID":"3d43a55d-45af-4682-9319-ef98614343d4","Type":"ContainerStarted","Data":"f1cdef2aabc9ae05a38a9e91e448bbfa0911bc3dec4020e5c42177f3f73c2239"} Feb 02 12:10:29 crc kubenswrapper[4909]: I0202 12:10:29.669338 4909 generic.go:334] "Generic (PLEG): container finished" podID="a4727a1e-e236-4f4f-afb6-baf063cbf292" containerID="802ba0cef3bb440883b7e5de8a341ab0fc3a491c7fe62180801162b39f35d245" exitCode=0 Feb 02 12:10:29 crc kubenswrapper[4909]: I0202 12:10:29.670198 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-kbcql" event={"ID":"a4727a1e-e236-4f4f-afb6-baf063cbf292","Type":"ContainerDied","Data":"802ba0cef3bb440883b7e5de8a341ab0fc3a491c7fe62180801162b39f35d245"} Feb 02 12:10:29 crc kubenswrapper[4909]: E0202 12:10:29.860469 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d43a55d_45af_4682_9319_ef98614343d4.slice/crio-conmon-4c2d6f2f052c9b37b68088464e43a14d12da20babe3f52d89492e87bc0824752.scope\": RecentStats: unable to find data in memory cache]" Feb 02 12:10:30 crc kubenswrapper[4909]: I0202 12:10:30.682519 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/octavia-image-upload-65dd99cb46-kbcql" event={"ID":"a4727a1e-e236-4f4f-afb6-baf063cbf292","Type":"ContainerStarted","Data":"0f82beca1f448ea6951e647b6c89f07f08f256d0054517a1b1bb802cba314d0f"} Feb 02 12:10:30 crc kubenswrapper[4909]: I0202 12:10:30.685421 4909 generic.go:334] "Generic (PLEG): container finished" podID="3d43a55d-45af-4682-9319-ef98614343d4" containerID="4c2d6f2f052c9b37b68088464e43a14d12da20babe3f52d89492e87bc0824752" exitCode=0 Feb 02 12:10:30 crc kubenswrapper[4909]: I0202 12:10:30.685495 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6744779749-884lr" event={"ID":"3d43a55d-45af-4682-9319-ef98614343d4","Type":"ContainerDied","Data":"4c2d6f2f052c9b37b68088464e43a14d12da20babe3f52d89492e87bc0824752"} Feb 02 12:10:30 crc kubenswrapper[4909]: I0202 12:10:30.707459 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-65dd99cb46-kbcql" podStartSLOduration=6.511021276 podStartE2EDuration="17.707436459s" podCreationTimestamp="2026-02-02 12:10:13 +0000 UTC" firstStartedPulling="2026-02-02 12:10:14.539081626 +0000 UTC m=+5940.285182361" lastFinishedPulling="2026-02-02 12:10:25.735496809 +0000 UTC m=+5951.481597544" observedRunningTime="2026-02-02 12:10:30.702741696 +0000 UTC m=+5956.448842431" watchObservedRunningTime="2026-02-02 12:10:30.707436459 +0000 UTC m=+5956.453537194" Feb 02 12:10:31 crc kubenswrapper[4909]: I0202 12:10:31.706760 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6744779749-884lr" event={"ID":"3d43a55d-45af-4682-9319-ef98614343d4","Type":"ContainerStarted","Data":"26ccdf730300aafb4acd52b5454a2a37c9e5538210b59e7a7bc69a910024a2fa"} Feb 02 12:10:31 crc kubenswrapper[4909]: I0202 12:10:31.707224 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6744779749-884lr" 
event={"ID":"3d43a55d-45af-4682-9319-ef98614343d4","Type":"ContainerStarted","Data":"28a249ccbb557d0c67156c5501d71c60e2b293bd36fcdcf579efbbc6677b5085"} Feb 02 12:10:31 crc kubenswrapper[4909]: I0202 12:10:31.707467 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:31 crc kubenswrapper[4909]: I0202 12:10:31.707493 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:31 crc kubenswrapper[4909]: I0202 12:10:31.740266 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-6744779749-884lr" podStartSLOduration=5.740251354 podStartE2EDuration="5.740251354s" podCreationTimestamp="2026-02-02 12:10:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:10:31.738956147 +0000 UTC m=+5957.485056872" watchObservedRunningTime="2026-02-02 12:10:31.740251354 +0000 UTC m=+5957.486352089" Feb 02 12:10:42 crc kubenswrapper[4909]: I0202 12:10:42.017356 4909 scope.go:117] "RemoveContainer" containerID="2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783" Feb 02 12:10:42 crc kubenswrapper[4909]: E0202 12:10:42.018194 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:10:43 crc kubenswrapper[4909]: I0202 12:10:43.283003 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-fjmbh" Feb 02 12:10:47 crc kubenswrapper[4909]: I0202 12:10:47.029518 4909 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:47 crc kubenswrapper[4909]: I0202 12:10:47.330824 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-6744779749-884lr" Feb 02 12:10:47 crc kubenswrapper[4909]: I0202 12:10:47.430609 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-api-6c4564d67-6x9nd"] Feb 02 12:10:47 crc kubenswrapper[4909]: I0202 12:10:47.436745 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-api-6c4564d67-6x9nd" podUID="d5e46e05-22cf-4148-b7de-7f6330986a54" containerName="octavia-api" containerID="cri-o://67fb41ad4f0b2573c67eb8566c1fd90245597bd8af2164d4024f2a5e50382ce2" gracePeriod=30 Feb 02 12:10:47 crc kubenswrapper[4909]: I0202 12:10:47.437109 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-api-6c4564d67-6x9nd" podUID="d5e46e05-22cf-4148-b7de-7f6330986a54" containerName="octavia-api-provider-agent" containerID="cri-o://d9fb5658160bc102a00d229574579f5702d7aba83ce0e5675773e70369311d79" gracePeriod=30 Feb 02 12:10:48 crc kubenswrapper[4909]: I0202 12:10:48.890508 4909 generic.go:334] "Generic (PLEG): container finished" podID="d5e46e05-22cf-4148-b7de-7f6330986a54" containerID="d9fb5658160bc102a00d229574579f5702d7aba83ce0e5675773e70369311d79" exitCode=0 Feb 02 12:10:48 crc kubenswrapper[4909]: I0202 12:10:48.890549 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6c4564d67-6x9nd" event={"ID":"d5e46e05-22cf-4148-b7de-7f6330986a54","Type":"ContainerDied","Data":"d9fb5658160bc102a00d229574579f5702d7aba83ce0e5675773e70369311d79"} Feb 02 12:10:50 crc kubenswrapper[4909]: I0202 12:10:50.910661 4909 generic.go:334] "Generic (PLEG): container finished" podID="d5e46e05-22cf-4148-b7de-7f6330986a54" 
containerID="67fb41ad4f0b2573c67eb8566c1fd90245597bd8af2164d4024f2a5e50382ce2" exitCode=0 Feb 02 12:10:50 crc kubenswrapper[4909]: I0202 12:10:50.910903 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6c4564d67-6x9nd" event={"ID":"d5e46e05-22cf-4148-b7de-7f6330986a54","Type":"ContainerDied","Data":"67fb41ad4f0b2573c67eb8566c1fd90245597bd8af2164d4024f2a5e50382ce2"} Feb 02 12:10:51 crc kubenswrapper[4909]: I0202 12:10:51.090913 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-6c4564d67-6x9nd" Feb 02 12:10:51 crc kubenswrapper[4909]: I0202 12:10:51.188306 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d5e46e05-22cf-4148-b7de-7f6330986a54-config-data-merged\") pod \"d5e46e05-22cf-4148-b7de-7f6330986a54\" (UID: \"d5e46e05-22cf-4148-b7de-7f6330986a54\") " Feb 02 12:10:51 crc kubenswrapper[4909]: I0202 12:10:51.188987 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5e46e05-22cf-4148-b7de-7f6330986a54-ovndb-tls-certs\") pod \"d5e46e05-22cf-4148-b7de-7f6330986a54\" (UID: \"d5e46e05-22cf-4148-b7de-7f6330986a54\") " Feb 02 12:10:51 crc kubenswrapper[4909]: I0202 12:10:51.189203 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e46e05-22cf-4148-b7de-7f6330986a54-combined-ca-bundle\") pod \"d5e46e05-22cf-4148-b7de-7f6330986a54\" (UID: \"d5e46e05-22cf-4148-b7de-7f6330986a54\") " Feb 02 12:10:51 crc kubenswrapper[4909]: I0202 12:10:51.189334 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/d5e46e05-22cf-4148-b7de-7f6330986a54-octavia-run\") pod \"d5e46e05-22cf-4148-b7de-7f6330986a54\" (UID: 
\"d5e46e05-22cf-4148-b7de-7f6330986a54\") " Feb 02 12:10:51 crc kubenswrapper[4909]: I0202 12:10:51.189458 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5e46e05-22cf-4148-b7de-7f6330986a54-scripts\") pod \"d5e46e05-22cf-4148-b7de-7f6330986a54\" (UID: \"d5e46e05-22cf-4148-b7de-7f6330986a54\") " Feb 02 12:10:51 crc kubenswrapper[4909]: I0202 12:10:51.189595 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e46e05-22cf-4148-b7de-7f6330986a54-config-data\") pod \"d5e46e05-22cf-4148-b7de-7f6330986a54\" (UID: \"d5e46e05-22cf-4148-b7de-7f6330986a54\") " Feb 02 12:10:51 crc kubenswrapper[4909]: I0202 12:10:51.191020 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5e46e05-22cf-4148-b7de-7f6330986a54-octavia-run" (OuterVolumeSpecName: "octavia-run") pod "d5e46e05-22cf-4148-b7de-7f6330986a54" (UID: "d5e46e05-22cf-4148-b7de-7f6330986a54"). InnerVolumeSpecName "octavia-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:10:51 crc kubenswrapper[4909]: I0202 12:10:51.213843 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5e46e05-22cf-4148-b7de-7f6330986a54-scripts" (OuterVolumeSpecName: "scripts") pod "d5e46e05-22cf-4148-b7de-7f6330986a54" (UID: "d5e46e05-22cf-4148-b7de-7f6330986a54"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:10:51 crc kubenswrapper[4909]: I0202 12:10:51.218208 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5e46e05-22cf-4148-b7de-7f6330986a54-config-data" (OuterVolumeSpecName: "config-data") pod "d5e46e05-22cf-4148-b7de-7f6330986a54" (UID: "d5e46e05-22cf-4148-b7de-7f6330986a54"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:10:51 crc kubenswrapper[4909]: I0202 12:10:51.250012 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5e46e05-22cf-4148-b7de-7f6330986a54-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "d5e46e05-22cf-4148-b7de-7f6330986a54" (UID: "d5e46e05-22cf-4148-b7de-7f6330986a54"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:10:51 crc kubenswrapper[4909]: I0202 12:10:51.274234 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5e46e05-22cf-4148-b7de-7f6330986a54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5e46e05-22cf-4148-b7de-7f6330986a54" (UID: "d5e46e05-22cf-4148-b7de-7f6330986a54"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:10:51 crc kubenswrapper[4909]: I0202 12:10:51.293970 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e46e05-22cf-4148-b7de-7f6330986a54-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:10:51 crc kubenswrapper[4909]: I0202 12:10:51.294070 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d5e46e05-22cf-4148-b7de-7f6330986a54-config-data-merged\") on node \"crc\" DevicePath \"\"" Feb 02 12:10:51 crc kubenswrapper[4909]: I0202 12:10:51.294088 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e46e05-22cf-4148-b7de-7f6330986a54-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:10:51 crc kubenswrapper[4909]: I0202 12:10:51.294097 4909 reconciler_common.go:293] "Volume detached for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/d5e46e05-22cf-4148-b7de-7f6330986a54-octavia-run\") on node 
\"crc\" DevicePath \"\"" Feb 02 12:10:51 crc kubenswrapper[4909]: I0202 12:10:51.294108 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5e46e05-22cf-4148-b7de-7f6330986a54-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:10:51 crc kubenswrapper[4909]: I0202 12:10:51.349350 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5e46e05-22cf-4148-b7de-7f6330986a54-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "d5e46e05-22cf-4148-b7de-7f6330986a54" (UID: "d5e46e05-22cf-4148-b7de-7f6330986a54"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:10:51 crc kubenswrapper[4909]: I0202 12:10:51.396477 4909 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5e46e05-22cf-4148-b7de-7f6330986a54-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 12:10:51 crc kubenswrapper[4909]: I0202 12:10:51.923184 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6c4564d67-6x9nd" event={"ID":"d5e46e05-22cf-4148-b7de-7f6330986a54","Type":"ContainerDied","Data":"f0b8b15c1103f63e74ac080cc0de1d8a5611a57f2ee24b26eabec07f31d90f98"} Feb 02 12:10:51 crc kubenswrapper[4909]: I0202 12:10:51.923244 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-6c4564d67-6x9nd" Feb 02 12:10:51 crc kubenswrapper[4909]: I0202 12:10:51.923537 4909 scope.go:117] "RemoveContainer" containerID="d9fb5658160bc102a00d229574579f5702d7aba83ce0e5675773e70369311d79" Feb 02 12:10:51 crc kubenswrapper[4909]: I0202 12:10:51.956704 4909 scope.go:117] "RemoveContainer" containerID="67fb41ad4f0b2573c67eb8566c1fd90245597bd8af2164d4024f2a5e50382ce2" Feb 02 12:10:51 crc kubenswrapper[4909]: I0202 12:10:51.959599 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-api-6c4564d67-6x9nd"] Feb 02 12:10:51 crc kubenswrapper[4909]: I0202 12:10:51.969415 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-api-6c4564d67-6x9nd"] Feb 02 12:10:51 crc kubenswrapper[4909]: I0202 12:10:51.979514 4909 scope.go:117] "RemoveContainer" containerID="612f69c857b17828a3eaffc72cf8d268c5c50b3a0e8ae12a707d5f96f60eca5d" Feb 02 12:10:52 crc kubenswrapper[4909]: I0202 12:10:52.805059 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-kbcql"] Feb 02 12:10:52 crc kubenswrapper[4909]: I0202 12:10:52.805274 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-65dd99cb46-kbcql" podUID="a4727a1e-e236-4f4f-afb6-baf063cbf292" containerName="octavia-amphora-httpd" containerID="cri-o://0f82beca1f448ea6951e647b6c89f07f08f256d0054517a1b1bb802cba314d0f" gracePeriod=30 Feb 02 12:10:52 crc kubenswrapper[4909]: I0202 12:10:52.936223 4909 generic.go:334] "Generic (PLEG): container finished" podID="a4727a1e-e236-4f4f-afb6-baf063cbf292" containerID="0f82beca1f448ea6951e647b6c89f07f08f256d0054517a1b1bb802cba314d0f" exitCode=0 Feb 02 12:10:52 crc kubenswrapper[4909]: I0202 12:10:52.936280 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-kbcql" 
event={"ID":"a4727a1e-e236-4f4f-afb6-baf063cbf292","Type":"ContainerDied","Data":"0f82beca1f448ea6951e647b6c89f07f08f256d0054517a1b1bb802cba314d0f"} Feb 02 12:10:53 crc kubenswrapper[4909]: I0202 12:10:53.028559 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5e46e05-22cf-4148-b7de-7f6330986a54" path="/var/lib/kubelet/pods/d5e46e05-22cf-4148-b7de-7f6330986a54/volumes" Feb 02 12:10:53 crc kubenswrapper[4909]: I0202 12:10:53.341283 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-65dd99cb46-kbcql" Feb 02 12:10:53 crc kubenswrapper[4909]: I0202 12:10:53.431843 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/a4727a1e-e236-4f4f-afb6-baf063cbf292-amphora-image\") pod \"a4727a1e-e236-4f4f-afb6-baf063cbf292\" (UID: \"a4727a1e-e236-4f4f-afb6-baf063cbf292\") " Feb 02 12:10:53 crc kubenswrapper[4909]: I0202 12:10:53.432118 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a4727a1e-e236-4f4f-afb6-baf063cbf292-httpd-config\") pod \"a4727a1e-e236-4f4f-afb6-baf063cbf292\" (UID: \"a4727a1e-e236-4f4f-afb6-baf063cbf292\") " Feb 02 12:10:53 crc kubenswrapper[4909]: I0202 12:10:53.467544 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4727a1e-e236-4f4f-afb6-baf063cbf292-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "a4727a1e-e236-4f4f-afb6-baf063cbf292" (UID: "a4727a1e-e236-4f4f-afb6-baf063cbf292"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:10:53 crc kubenswrapper[4909]: I0202 12:10:53.517427 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4727a1e-e236-4f4f-afb6-baf063cbf292-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "a4727a1e-e236-4f4f-afb6-baf063cbf292" (UID: "a4727a1e-e236-4f4f-afb6-baf063cbf292"). InnerVolumeSpecName "amphora-image". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:10:53 crc kubenswrapper[4909]: I0202 12:10:53.534052 4909 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a4727a1e-e236-4f4f-afb6-baf063cbf292-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 02 12:10:53 crc kubenswrapper[4909]: I0202 12:10:53.534091 4909 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/a4727a1e-e236-4f4f-afb6-baf063cbf292-amphora-image\") on node \"crc\" DevicePath \"\"" Feb 02 12:10:53 crc kubenswrapper[4909]: I0202 12:10:53.948955 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-65dd99cb46-kbcql" event={"ID":"a4727a1e-e236-4f4f-afb6-baf063cbf292","Type":"ContainerDied","Data":"e10f7117b52552fdb2e4a11159b4c21dbc9802a7d7f3ea870fe08df5b499e901"} Feb 02 12:10:53 crc kubenswrapper[4909]: I0202 12:10:53.948987 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-65dd99cb46-kbcql" Feb 02 12:10:53 crc kubenswrapper[4909]: I0202 12:10:53.949034 4909 scope.go:117] "RemoveContainer" containerID="0f82beca1f448ea6951e647b6c89f07f08f256d0054517a1b1bb802cba314d0f" Feb 02 12:10:53 crc kubenswrapper[4909]: I0202 12:10:53.978049 4909 scope.go:117] "RemoveContainer" containerID="802ba0cef3bb440883b7e5de8a341ab0fc3a491c7fe62180801162b39f35d245" Feb 02 12:10:53 crc kubenswrapper[4909]: I0202 12:10:53.994137 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-kbcql"] Feb 02 12:10:54 crc kubenswrapper[4909]: I0202 12:10:54.004554 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-65dd99cb46-kbcql"] Feb 02 12:10:55 crc kubenswrapper[4909]: I0202 12:10:55.022542 4909 scope.go:117] "RemoveContainer" containerID="2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783" Feb 02 12:10:55 crc kubenswrapper[4909]: I0202 12:10:55.030945 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4727a1e-e236-4f4f-afb6-baf063cbf292" path="/var/lib/kubelet/pods/a4727a1e-e236-4f4f-afb6-baf063cbf292/volumes" Feb 02 12:10:55 crc kubenswrapper[4909]: I0202 12:10:55.986176 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"7efda102e84c90fb99553b44e1e0fb3c586800eb829f32e3cb14404521e127c2"} Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.641652 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-94tp4"] Feb 02 12:11:17 crc kubenswrapper[4909]: E0202 12:11:17.642685 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e46e05-22cf-4148-b7de-7f6330986a54" containerName="init" Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.642700 4909 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="d5e46e05-22cf-4148-b7de-7f6330986a54" containerName="init" Feb 02 12:11:17 crc kubenswrapper[4909]: E0202 12:11:17.642709 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4727a1e-e236-4f4f-afb6-baf063cbf292" containerName="octavia-amphora-httpd" Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.642717 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4727a1e-e236-4f4f-afb6-baf063cbf292" containerName="octavia-amphora-httpd" Feb 02 12:11:17 crc kubenswrapper[4909]: E0202 12:11:17.642742 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e46e05-22cf-4148-b7de-7f6330986a54" containerName="octavia-api-provider-agent" Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.642751 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e46e05-22cf-4148-b7de-7f6330986a54" containerName="octavia-api-provider-agent" Feb 02 12:11:17 crc kubenswrapper[4909]: E0202 12:11:17.642769 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e46e05-22cf-4148-b7de-7f6330986a54" containerName="octavia-api" Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.642777 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e46e05-22cf-4148-b7de-7f6330986a54" containerName="octavia-api" Feb 02 12:11:17 crc kubenswrapper[4909]: E0202 12:11:17.642791 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4727a1e-e236-4f4f-afb6-baf063cbf292" containerName="init" Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.642800 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4727a1e-e236-4f4f-afb6-baf063cbf292" containerName="init" Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.643010 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5e46e05-22cf-4148-b7de-7f6330986a54" containerName="octavia-api-provider-agent" Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.643042 4909 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="d5e46e05-22cf-4148-b7de-7f6330986a54" containerName="octavia-api" Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.643054 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4727a1e-e236-4f4f-afb6-baf063cbf292" containerName="octavia-amphora-httpd" Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.644182 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-94tp4" Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.647461 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.651299 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.651484 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.651996 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-94tp4"] Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.728243 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3674af27-feb0-492c-9195-5557c3d392c1-scripts\") pod \"octavia-healthmanager-94tp4\" (UID: \"3674af27-feb0-492c-9195-5557c3d392c1\") " pod="openstack/octavia-healthmanager-94tp4" Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.728299 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3674af27-feb0-492c-9195-5557c3d392c1-config-data\") pod \"octavia-healthmanager-94tp4\" (UID: \"3674af27-feb0-492c-9195-5557c3d392c1\") " pod="openstack/octavia-healthmanager-94tp4" Feb 02 12:11:17 crc 
kubenswrapper[4909]: I0202 12:11:17.728596 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3674af27-feb0-492c-9195-5557c3d392c1-combined-ca-bundle\") pod \"octavia-healthmanager-94tp4\" (UID: \"3674af27-feb0-492c-9195-5557c3d392c1\") " pod="openstack/octavia-healthmanager-94tp4" Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.728664 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/3674af27-feb0-492c-9195-5557c3d392c1-amphora-certs\") pod \"octavia-healthmanager-94tp4\" (UID: \"3674af27-feb0-492c-9195-5557c3d392c1\") " pod="openstack/octavia-healthmanager-94tp4" Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.729101 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/3674af27-feb0-492c-9195-5557c3d392c1-hm-ports\") pod \"octavia-healthmanager-94tp4\" (UID: \"3674af27-feb0-492c-9195-5557c3d392c1\") " pod="openstack/octavia-healthmanager-94tp4" Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.729219 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3674af27-feb0-492c-9195-5557c3d392c1-config-data-merged\") pod \"octavia-healthmanager-94tp4\" (UID: \"3674af27-feb0-492c-9195-5557c3d392c1\") " pod="openstack/octavia-healthmanager-94tp4" Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.830694 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3674af27-feb0-492c-9195-5557c3d392c1-combined-ca-bundle\") pod \"octavia-healthmanager-94tp4\" (UID: \"3674af27-feb0-492c-9195-5557c3d392c1\") " pod="openstack/octavia-healthmanager-94tp4" Feb 02 
12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.830749 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/3674af27-feb0-492c-9195-5557c3d392c1-amphora-certs\") pod \"octavia-healthmanager-94tp4\" (UID: \"3674af27-feb0-492c-9195-5557c3d392c1\") " pod="openstack/octavia-healthmanager-94tp4" Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.830915 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/3674af27-feb0-492c-9195-5557c3d392c1-hm-ports\") pod \"octavia-healthmanager-94tp4\" (UID: \"3674af27-feb0-492c-9195-5557c3d392c1\") " pod="openstack/octavia-healthmanager-94tp4" Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.830966 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3674af27-feb0-492c-9195-5557c3d392c1-config-data-merged\") pod \"octavia-healthmanager-94tp4\" (UID: \"3674af27-feb0-492c-9195-5557c3d392c1\") " pod="openstack/octavia-healthmanager-94tp4" Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.831011 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3674af27-feb0-492c-9195-5557c3d392c1-scripts\") pod \"octavia-healthmanager-94tp4\" (UID: \"3674af27-feb0-492c-9195-5557c3d392c1\") " pod="openstack/octavia-healthmanager-94tp4" Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.831027 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3674af27-feb0-492c-9195-5557c3d392c1-config-data\") pod \"octavia-healthmanager-94tp4\" (UID: \"3674af27-feb0-492c-9195-5557c3d392c1\") " pod="openstack/octavia-healthmanager-94tp4" Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.831734 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3674af27-feb0-492c-9195-5557c3d392c1-config-data-merged\") pod \"octavia-healthmanager-94tp4\" (UID: \"3674af27-feb0-492c-9195-5557c3d392c1\") " pod="openstack/octavia-healthmanager-94tp4" Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.832776 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/3674af27-feb0-492c-9195-5557c3d392c1-hm-ports\") pod \"octavia-healthmanager-94tp4\" (UID: \"3674af27-feb0-492c-9195-5557c3d392c1\") " pod="openstack/octavia-healthmanager-94tp4" Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.838109 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3674af27-feb0-492c-9195-5557c3d392c1-combined-ca-bundle\") pod \"octavia-healthmanager-94tp4\" (UID: \"3674af27-feb0-492c-9195-5557c3d392c1\") " pod="openstack/octavia-healthmanager-94tp4" Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.838248 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3674af27-feb0-492c-9195-5557c3d392c1-config-data\") pod \"octavia-healthmanager-94tp4\" (UID: \"3674af27-feb0-492c-9195-5557c3d392c1\") " pod="openstack/octavia-healthmanager-94tp4" Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.839158 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/3674af27-feb0-492c-9195-5557c3d392c1-amphora-certs\") pod \"octavia-healthmanager-94tp4\" (UID: \"3674af27-feb0-492c-9195-5557c3d392c1\") " pod="openstack/octavia-healthmanager-94tp4" Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.850685 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3674af27-feb0-492c-9195-5557c3d392c1-scripts\") pod \"octavia-healthmanager-94tp4\" (UID: \"3674af27-feb0-492c-9195-5557c3d392c1\") " pod="openstack/octavia-healthmanager-94tp4" Feb 02 12:11:17 crc kubenswrapper[4909]: I0202 12:11:17.965999 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-94tp4" Feb 02 12:11:19 crc kubenswrapper[4909]: I0202 12:11:19.152865 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-snk8x"] Feb 02 12:11:19 crc kubenswrapper[4909]: I0202 12:11:19.166464 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-snk8x"] Feb 02 12:11:19 crc kubenswrapper[4909]: I0202 12:11:19.166562 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-snk8x" Feb 02 12:11:19 crc kubenswrapper[4909]: I0202 12:11:19.224754 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Feb 02 12:11:19 crc kubenswrapper[4909]: I0202 12:11:19.225133 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Feb 02 12:11:19 crc kubenswrapper[4909]: I0202 12:11:19.257933 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e6d66230-f723-4042-86c4-6f19c99ae749-config-data-merged\") pod \"octavia-housekeeping-snk8x\" (UID: \"e6d66230-f723-4042-86c4-6f19c99ae749\") " pod="openstack/octavia-housekeeping-snk8x" Feb 02 12:11:19 crc kubenswrapper[4909]: I0202 12:11:19.257974 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6d66230-f723-4042-86c4-6f19c99ae749-scripts\") pod \"octavia-housekeeping-snk8x\" (UID: \"e6d66230-f723-4042-86c4-6f19c99ae749\") " 
pod="openstack/octavia-housekeeping-snk8x" Feb 02 12:11:19 crc kubenswrapper[4909]: I0202 12:11:19.257994 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/e6d66230-f723-4042-86c4-6f19c99ae749-amphora-certs\") pod \"octavia-housekeeping-snk8x\" (UID: \"e6d66230-f723-4042-86c4-6f19c99ae749\") " pod="openstack/octavia-housekeeping-snk8x" Feb 02 12:11:19 crc kubenswrapper[4909]: I0202 12:11:19.258065 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d66230-f723-4042-86c4-6f19c99ae749-combined-ca-bundle\") pod \"octavia-housekeeping-snk8x\" (UID: \"e6d66230-f723-4042-86c4-6f19c99ae749\") " pod="openstack/octavia-housekeeping-snk8x" Feb 02 12:11:19 crc kubenswrapper[4909]: I0202 12:11:19.258085 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/e6d66230-f723-4042-86c4-6f19c99ae749-hm-ports\") pod \"octavia-housekeeping-snk8x\" (UID: \"e6d66230-f723-4042-86c4-6f19c99ae749\") " pod="openstack/octavia-housekeeping-snk8x" Feb 02 12:11:19 crc kubenswrapper[4909]: I0202 12:11:19.258105 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6d66230-f723-4042-86c4-6f19c99ae749-config-data\") pod \"octavia-housekeeping-snk8x\" (UID: \"e6d66230-f723-4042-86c4-6f19c99ae749\") " pod="openstack/octavia-housekeeping-snk8x" Feb 02 12:11:19 crc kubenswrapper[4909]: I0202 12:11:19.359644 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/e6d66230-f723-4042-86c4-6f19c99ae749-amphora-certs\") pod \"octavia-housekeeping-snk8x\" (UID: \"e6d66230-f723-4042-86c4-6f19c99ae749\") " 
pod="openstack/octavia-housekeeping-snk8x" Feb 02 12:11:19 crc kubenswrapper[4909]: I0202 12:11:19.360108 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d66230-f723-4042-86c4-6f19c99ae749-combined-ca-bundle\") pod \"octavia-housekeeping-snk8x\" (UID: \"e6d66230-f723-4042-86c4-6f19c99ae749\") " pod="openstack/octavia-housekeeping-snk8x" Feb 02 12:11:19 crc kubenswrapper[4909]: I0202 12:11:19.360134 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/e6d66230-f723-4042-86c4-6f19c99ae749-hm-ports\") pod \"octavia-housekeeping-snk8x\" (UID: \"e6d66230-f723-4042-86c4-6f19c99ae749\") " pod="openstack/octavia-housekeeping-snk8x" Feb 02 12:11:19 crc kubenswrapper[4909]: I0202 12:11:19.360159 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6d66230-f723-4042-86c4-6f19c99ae749-config-data\") pod \"octavia-housekeeping-snk8x\" (UID: \"e6d66230-f723-4042-86c4-6f19c99ae749\") " pod="openstack/octavia-housekeeping-snk8x" Feb 02 12:11:19 crc kubenswrapper[4909]: I0202 12:11:19.360338 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6d66230-f723-4042-86c4-6f19c99ae749-scripts\") pod \"octavia-housekeeping-snk8x\" (UID: \"e6d66230-f723-4042-86c4-6f19c99ae749\") " pod="openstack/octavia-housekeeping-snk8x" Feb 02 12:11:19 crc kubenswrapper[4909]: I0202 12:11:19.360361 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e6d66230-f723-4042-86c4-6f19c99ae749-config-data-merged\") pod \"octavia-housekeeping-snk8x\" (UID: \"e6d66230-f723-4042-86c4-6f19c99ae749\") " pod="openstack/octavia-housekeeping-snk8x" Feb 02 12:11:19 crc kubenswrapper[4909]: I0202 
12:11:19.360833 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e6d66230-f723-4042-86c4-6f19c99ae749-config-data-merged\") pod \"octavia-housekeeping-snk8x\" (UID: \"e6d66230-f723-4042-86c4-6f19c99ae749\") " pod="openstack/octavia-housekeeping-snk8x" Feb 02 12:11:19 crc kubenswrapper[4909]: I0202 12:11:19.361995 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/e6d66230-f723-4042-86c4-6f19c99ae749-hm-ports\") pod \"octavia-housekeeping-snk8x\" (UID: \"e6d66230-f723-4042-86c4-6f19c99ae749\") " pod="openstack/octavia-housekeeping-snk8x" Feb 02 12:11:19 crc kubenswrapper[4909]: I0202 12:11:19.366783 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6d66230-f723-4042-86c4-6f19c99ae749-scripts\") pod \"octavia-housekeeping-snk8x\" (UID: \"e6d66230-f723-4042-86c4-6f19c99ae749\") " pod="openstack/octavia-housekeeping-snk8x" Feb 02 12:11:19 crc kubenswrapper[4909]: I0202 12:11:19.367321 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/e6d66230-f723-4042-86c4-6f19c99ae749-amphora-certs\") pod \"octavia-housekeeping-snk8x\" (UID: \"e6d66230-f723-4042-86c4-6f19c99ae749\") " pod="openstack/octavia-housekeeping-snk8x" Feb 02 12:11:19 crc kubenswrapper[4909]: I0202 12:11:19.367446 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d66230-f723-4042-86c4-6f19c99ae749-combined-ca-bundle\") pod \"octavia-housekeeping-snk8x\" (UID: \"e6d66230-f723-4042-86c4-6f19c99ae749\") " pod="openstack/octavia-housekeeping-snk8x" Feb 02 12:11:19 crc kubenswrapper[4909]: I0202 12:11:19.378223 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e6d66230-f723-4042-86c4-6f19c99ae749-config-data\") pod \"octavia-housekeeping-snk8x\" (UID: \"e6d66230-f723-4042-86c4-6f19c99ae749\") " pod="openstack/octavia-housekeeping-snk8x" Feb 02 12:11:19 crc kubenswrapper[4909]: I0202 12:11:19.555751 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-snk8x" Feb 02 12:11:19 crc kubenswrapper[4909]: I0202 12:11:19.741546 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-94tp4"] Feb 02 12:11:19 crc kubenswrapper[4909]: W0202 12:11:19.747702 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3674af27_feb0_492c_9195_5557c3d392c1.slice/crio-e5de1b418f135aa3100a31fb577c4c92655e52baa11ebcfd43ecde007032b6e8 WatchSource:0}: Error finding container e5de1b418f135aa3100a31fb577c4c92655e52baa11ebcfd43ecde007032b6e8: Status 404 returned error can't find the container with id e5de1b418f135aa3100a31fb577c4c92655e52baa11ebcfd43ecde007032b6e8 Feb 02 12:11:20 crc kubenswrapper[4909]: I0202 12:11:20.160870 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-snk8x"] Feb 02 12:11:20 crc kubenswrapper[4909]: W0202 12:11:20.168059 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6d66230_f723_4042_86c4_6f19c99ae749.slice/crio-b7805007405df3d915a0f0ab7c68b26fffbefa69a700a56dd537b18eea29f6d3 WatchSource:0}: Error finding container b7805007405df3d915a0f0ab7c68b26fffbefa69a700a56dd537b18eea29f6d3: Status 404 returned error can't find the container with id b7805007405df3d915a0f0ab7c68b26fffbefa69a700a56dd537b18eea29f6d3 Feb 02 12:11:20 crc kubenswrapper[4909]: I0202 12:11:20.171338 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 12:11:20 crc kubenswrapper[4909]: 
I0202 12:11:20.247633 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-94tp4" event={"ID":"3674af27-feb0-492c-9195-5557c3d392c1","Type":"ContainerStarted","Data":"e5de1b418f135aa3100a31fb577c4c92655e52baa11ebcfd43ecde007032b6e8"} Feb 02 12:11:20 crc kubenswrapper[4909]: I0202 12:11:20.249906 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-snk8x" event={"ID":"e6d66230-f723-4042-86c4-6f19c99ae749","Type":"ContainerStarted","Data":"b7805007405df3d915a0f0ab7c68b26fffbefa69a700a56dd537b18eea29f6d3"} Feb 02 12:11:20 crc kubenswrapper[4909]: I0202 12:11:20.877408 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-8ml42"] Feb 02 12:11:20 crc kubenswrapper[4909]: I0202 12:11:20.880443 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-8ml42" Feb 02 12:11:20 crc kubenswrapper[4909]: I0202 12:11:20.883839 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Feb 02 12:11:20 crc kubenswrapper[4909]: I0202 12:11:20.884019 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Feb 02 12:11:20 crc kubenswrapper[4909]: I0202 12:11:20.892981 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebae09da-3646-40dd-98f7-d49907beacd1-config-data\") pod \"octavia-worker-8ml42\" (UID: \"ebae09da-3646-40dd-98f7-d49907beacd1\") " pod="openstack/octavia-worker-8ml42" Feb 02 12:11:20 crc kubenswrapper[4909]: I0202 12:11:20.893078 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebae09da-3646-40dd-98f7-d49907beacd1-scripts\") pod \"octavia-worker-8ml42\" (UID: \"ebae09da-3646-40dd-98f7-d49907beacd1\") " 
pod="openstack/octavia-worker-8ml42" Feb 02 12:11:20 crc kubenswrapper[4909]: I0202 12:11:20.893109 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/ebae09da-3646-40dd-98f7-d49907beacd1-hm-ports\") pod \"octavia-worker-8ml42\" (UID: \"ebae09da-3646-40dd-98f7-d49907beacd1\") " pod="openstack/octavia-worker-8ml42" Feb 02 12:11:20 crc kubenswrapper[4909]: I0202 12:11:20.893164 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ebae09da-3646-40dd-98f7-d49907beacd1-config-data-merged\") pod \"octavia-worker-8ml42\" (UID: \"ebae09da-3646-40dd-98f7-d49907beacd1\") " pod="openstack/octavia-worker-8ml42" Feb 02 12:11:20 crc kubenswrapper[4909]: I0202 12:11:20.893192 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/ebae09da-3646-40dd-98f7-d49907beacd1-amphora-certs\") pod \"octavia-worker-8ml42\" (UID: \"ebae09da-3646-40dd-98f7-d49907beacd1\") " pod="openstack/octavia-worker-8ml42" Feb 02 12:11:20 crc kubenswrapper[4909]: I0202 12:11:20.893226 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebae09da-3646-40dd-98f7-d49907beacd1-combined-ca-bundle\") pod \"octavia-worker-8ml42\" (UID: \"ebae09da-3646-40dd-98f7-d49907beacd1\") " pod="openstack/octavia-worker-8ml42" Feb 02 12:11:20 crc kubenswrapper[4909]: I0202 12:11:20.900835 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-8ml42"] Feb 02 12:11:20 crc kubenswrapper[4909]: I0202 12:11:20.995496 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ebae09da-3646-40dd-98f7-d49907beacd1-combined-ca-bundle\") pod \"octavia-worker-8ml42\" (UID: \"ebae09da-3646-40dd-98f7-d49907beacd1\") " pod="openstack/octavia-worker-8ml42" Feb 02 12:11:20 crc kubenswrapper[4909]: I0202 12:11:20.995676 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebae09da-3646-40dd-98f7-d49907beacd1-config-data\") pod \"octavia-worker-8ml42\" (UID: \"ebae09da-3646-40dd-98f7-d49907beacd1\") " pod="openstack/octavia-worker-8ml42" Feb 02 12:11:20 crc kubenswrapper[4909]: I0202 12:11:20.995847 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebae09da-3646-40dd-98f7-d49907beacd1-scripts\") pod \"octavia-worker-8ml42\" (UID: \"ebae09da-3646-40dd-98f7-d49907beacd1\") " pod="openstack/octavia-worker-8ml42" Feb 02 12:11:20 crc kubenswrapper[4909]: I0202 12:11:20.995891 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/ebae09da-3646-40dd-98f7-d49907beacd1-hm-ports\") pod \"octavia-worker-8ml42\" (UID: \"ebae09da-3646-40dd-98f7-d49907beacd1\") " pod="openstack/octavia-worker-8ml42" Feb 02 12:11:20 crc kubenswrapper[4909]: I0202 12:11:20.996056 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ebae09da-3646-40dd-98f7-d49907beacd1-config-data-merged\") pod \"octavia-worker-8ml42\" (UID: \"ebae09da-3646-40dd-98f7-d49907beacd1\") " pod="openstack/octavia-worker-8ml42" Feb 02 12:11:20 crc kubenswrapper[4909]: I0202 12:11:20.996596 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ebae09da-3646-40dd-98f7-d49907beacd1-config-data-merged\") pod \"octavia-worker-8ml42\" (UID: \"ebae09da-3646-40dd-98f7-d49907beacd1\") " 
pod="openstack/octavia-worker-8ml42" Feb 02 12:11:20 crc kubenswrapper[4909]: I0202 12:11:20.996710 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/ebae09da-3646-40dd-98f7-d49907beacd1-amphora-certs\") pod \"octavia-worker-8ml42\" (UID: \"ebae09da-3646-40dd-98f7-d49907beacd1\") " pod="openstack/octavia-worker-8ml42" Feb 02 12:11:20 crc kubenswrapper[4909]: I0202 12:11:20.996986 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/ebae09da-3646-40dd-98f7-d49907beacd1-hm-ports\") pod \"octavia-worker-8ml42\" (UID: \"ebae09da-3646-40dd-98f7-d49907beacd1\") " pod="openstack/octavia-worker-8ml42" Feb 02 12:11:21 crc kubenswrapper[4909]: I0202 12:11:21.003618 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebae09da-3646-40dd-98f7-d49907beacd1-scripts\") pod \"octavia-worker-8ml42\" (UID: \"ebae09da-3646-40dd-98f7-d49907beacd1\") " pod="openstack/octavia-worker-8ml42" Feb 02 12:11:21 crc kubenswrapper[4909]: I0202 12:11:21.004029 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/ebae09da-3646-40dd-98f7-d49907beacd1-amphora-certs\") pod \"octavia-worker-8ml42\" (UID: \"ebae09da-3646-40dd-98f7-d49907beacd1\") " pod="openstack/octavia-worker-8ml42" Feb 02 12:11:21 crc kubenswrapper[4909]: I0202 12:11:21.004070 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebae09da-3646-40dd-98f7-d49907beacd1-combined-ca-bundle\") pod \"octavia-worker-8ml42\" (UID: \"ebae09da-3646-40dd-98f7-d49907beacd1\") " pod="openstack/octavia-worker-8ml42" Feb 02 12:11:21 crc kubenswrapper[4909]: I0202 12:11:21.022557 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ebae09da-3646-40dd-98f7-d49907beacd1-config-data\") pod \"octavia-worker-8ml42\" (UID: \"ebae09da-3646-40dd-98f7-d49907beacd1\") " pod="openstack/octavia-worker-8ml42" Feb 02 12:11:21 crc kubenswrapper[4909]: I0202 12:11:21.209790 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-8ml42" Feb 02 12:11:21 crc kubenswrapper[4909]: I0202 12:11:21.261626 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-94tp4" event={"ID":"3674af27-feb0-492c-9195-5557c3d392c1","Type":"ContainerStarted","Data":"196daf70e8040e1bc1a28a378d7c6aa6a5c0f776bec5d55d0afcdc2df5591cf8"} Feb 02 12:11:21 crc kubenswrapper[4909]: I0202 12:11:21.763438 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-94tp4"] Feb 02 12:11:22 crc kubenswrapper[4909]: I0202 12:11:22.113152 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-8ml42"] Feb 02 12:11:22 crc kubenswrapper[4909]: W0202 12:11:22.127230 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebae09da_3646_40dd_98f7_d49907beacd1.slice/crio-e77041b92694838067255b8fd5e77678648d0fd7d15e60410065f028248b6e4c WatchSource:0}: Error finding container e77041b92694838067255b8fd5e77678648d0fd7d15e60410065f028248b6e4c: Status 404 returned error can't find the container with id e77041b92694838067255b8fd5e77678648d0fd7d15e60410065f028248b6e4c Feb 02 12:11:22 crc kubenswrapper[4909]: I0202 12:11:22.272010 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-snk8x" event={"ID":"e6d66230-f723-4042-86c4-6f19c99ae749","Type":"ContainerStarted","Data":"6d241395f38a2ebe979246a1347e3ef67f3fbcdd877aed5abcafe3bc917160a3"} Feb 02 12:11:22 crc kubenswrapper[4909]: I0202 12:11:22.274293 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/octavia-worker-8ml42" event={"ID":"ebae09da-3646-40dd-98f7-d49907beacd1","Type":"ContainerStarted","Data":"e77041b92694838067255b8fd5e77678648d0fd7d15e60410065f028248b6e4c"} Feb 02 12:11:23 crc kubenswrapper[4909]: I0202 12:11:23.287658 4909 generic.go:334] "Generic (PLEG): container finished" podID="e6d66230-f723-4042-86c4-6f19c99ae749" containerID="6d241395f38a2ebe979246a1347e3ef67f3fbcdd877aed5abcafe3bc917160a3" exitCode=0 Feb 02 12:11:23 crc kubenswrapper[4909]: I0202 12:11:23.287763 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-snk8x" event={"ID":"e6d66230-f723-4042-86c4-6f19c99ae749","Type":"ContainerDied","Data":"6d241395f38a2ebe979246a1347e3ef67f3fbcdd877aed5abcafe3bc917160a3"} Feb 02 12:11:23 crc kubenswrapper[4909]: I0202 12:11:23.291261 4909 generic.go:334] "Generic (PLEG): container finished" podID="3674af27-feb0-492c-9195-5557c3d392c1" containerID="196daf70e8040e1bc1a28a378d7c6aa6a5c0f776bec5d55d0afcdc2df5591cf8" exitCode=0 Feb 02 12:11:23 crc kubenswrapper[4909]: I0202 12:11:23.291315 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-94tp4" event={"ID":"3674af27-feb0-492c-9195-5557c3d392c1","Type":"ContainerDied","Data":"196daf70e8040e1bc1a28a378d7c6aa6a5c0f776bec5d55d0afcdc2df5591cf8"} Feb 02 12:11:24 crc kubenswrapper[4909]: I0202 12:11:24.305670 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-snk8x" event={"ID":"e6d66230-f723-4042-86c4-6f19c99ae749","Type":"ContainerStarted","Data":"36cfca1cb07c10600feab625f7afb3fac7187bb30ea7e879527e2764383751c6"} Feb 02 12:11:24 crc kubenswrapper[4909]: I0202 12:11:24.308399 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-snk8x" Feb 02 12:11:24 crc kubenswrapper[4909]: I0202 12:11:24.311123 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-8ml42" 
event={"ID":"ebae09da-3646-40dd-98f7-d49907beacd1","Type":"ContainerStarted","Data":"e900f7ada3ad1443c688d9b6e27972cd992b6315861bbf830dc6752ced3008eb"} Feb 02 12:11:24 crc kubenswrapper[4909]: I0202 12:11:24.315229 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-94tp4" event={"ID":"3674af27-feb0-492c-9195-5557c3d392c1","Type":"ContainerStarted","Data":"8b4a9ef46c5b75eebc6912c5d552e32a5c1016af27de32a9a8d623a5f92d9277"} Feb 02 12:11:24 crc kubenswrapper[4909]: I0202 12:11:24.316116 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-94tp4" Feb 02 12:11:24 crc kubenswrapper[4909]: I0202 12:11:24.331710 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-snk8x" podStartSLOduration=3.939919087 podStartE2EDuration="5.331690831s" podCreationTimestamp="2026-02-02 12:11:19 +0000 UTC" firstStartedPulling="2026-02-02 12:11:20.170956825 +0000 UTC m=+6005.917057560" lastFinishedPulling="2026-02-02 12:11:21.562728559 +0000 UTC m=+6007.308829304" observedRunningTime="2026-02-02 12:11:24.326953287 +0000 UTC m=+6010.073054022" watchObservedRunningTime="2026-02-02 12:11:24.331690831 +0000 UTC m=+6010.077791566" Feb 02 12:11:24 crc kubenswrapper[4909]: I0202 12:11:24.351995 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-94tp4" podStartSLOduration=7.351974777 podStartE2EDuration="7.351974777s" podCreationTimestamp="2026-02-02 12:11:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:11:24.349389264 +0000 UTC m=+6010.095490009" watchObservedRunningTime="2026-02-02 12:11:24.351974777 +0000 UTC m=+6010.098075512" Feb 02 12:11:25 crc kubenswrapper[4909]: I0202 12:11:25.329345 4909 generic.go:334] "Generic (PLEG): container finished" 
podID="ebae09da-3646-40dd-98f7-d49907beacd1" containerID="e900f7ada3ad1443c688d9b6e27972cd992b6315861bbf830dc6752ced3008eb" exitCode=0 Feb 02 12:11:25 crc kubenswrapper[4909]: I0202 12:11:25.329979 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-8ml42" event={"ID":"ebae09da-3646-40dd-98f7-d49907beacd1","Type":"ContainerDied","Data":"e900f7ada3ad1443c688d9b6e27972cd992b6315861bbf830dc6752ced3008eb"} Feb 02 12:11:26 crc kubenswrapper[4909]: I0202 12:11:26.341929 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-8ml42" event={"ID":"ebae09da-3646-40dd-98f7-d49907beacd1","Type":"ContainerStarted","Data":"617a8fc59f4c815d5426840ff5734d73c6eae7ac51577ca0a9080f74e716dc03"} Feb 02 12:11:26 crc kubenswrapper[4909]: I0202 12:11:26.367429 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-8ml42" podStartSLOduration=4.960017625 podStartE2EDuration="6.367406032s" podCreationTimestamp="2026-02-02 12:11:20 +0000 UTC" firstStartedPulling="2026-02-02 12:11:22.129778984 +0000 UTC m=+6007.875879719" lastFinishedPulling="2026-02-02 12:11:23.537167391 +0000 UTC m=+6009.283268126" observedRunningTime="2026-02-02 12:11:26.359977052 +0000 UTC m=+6012.106077787" watchObservedRunningTime="2026-02-02 12:11:26.367406032 +0000 UTC m=+6012.113506757" Feb 02 12:11:27 crc kubenswrapper[4909]: I0202 12:11:27.350141 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-8ml42" Feb 02 12:11:32 crc kubenswrapper[4909]: I0202 12:11:32.996560 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-94tp4" Feb 02 12:11:34 crc kubenswrapper[4909]: I0202 12:11:34.595535 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-snk8x" Feb 02 12:11:36 crc kubenswrapper[4909]: I0202 12:11:36.246288 4909 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/octavia-worker-8ml42" Feb 02 12:11:43 crc kubenswrapper[4909]: I0202 12:11:43.064894 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-28c9-account-create-update-rsw9x"] Feb 02 12:11:43 crc kubenswrapper[4909]: I0202 12:11:43.074705 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-28c9-account-create-update-rsw9x"] Feb 02 12:11:44 crc kubenswrapper[4909]: I0202 12:11:44.029438 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-78x7k"] Feb 02 12:11:44 crc kubenswrapper[4909]: I0202 12:11:44.039566 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-78x7k"] Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.029380 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b9a3fd6-be55-42e2-b83f-077ccb698019" path="/var/lib/kubelet/pods/2b9a3fd6-be55-42e2-b83f-077ccb698019/volumes" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.031214 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e69165bb-109c-41ad-84c0-bb7614862840" path="/var/lib/kubelet/pods/e69165bb-109c-41ad-84c0-bb7614862840/volumes" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.265846 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5477bcbbcc-smk66"] Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.267687 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5477bcbbcc-smk66" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.275056 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.275298 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.275556 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-7m7x5" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.275747 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.291263 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5477bcbbcc-smk66"] Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.319035 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.319248 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5dc8b554-88ed-470a-b46a-3c0f611f75a8" containerName="glance-log" containerID="cri-o://96f4ccc6f4ef62d04ab2978fcd1557cbd0b70ab96e442bdeae7648989d511a40" gracePeriod=30 Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.319691 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="5dc8b554-88ed-470a-b46a-3c0f611f75a8" containerName="glance-httpd" containerID="cri-o://9fde8894f21d4504d06d52c3811b3597f98f90e7e5fd7842792938797aaa82fa" gracePeriod=30 Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.420557 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.420970 
4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3e06882f-c657-49fc-a5a1-0657121844f9" containerName="glance-log" containerID="cri-o://7cc5359348c3520e4ea084e1f1374a0dfb12683f12fec55c1aafc56234509af1" gracePeriod=30 Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.421050 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3e06882f-c657-49fc-a5a1-0657121844f9" containerName="glance-httpd" containerID="cri-o://d04ce1909f7f75c4a6b91d6cba96d86f768ab4138ad910634c3ddafeb3c56643" gracePeriod=30 Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.425910 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78c1e92d-0b25-4b3f-9554-1e5427c36e97-scripts\") pod \"horizon-5477bcbbcc-smk66\" (UID: \"78c1e92d-0b25-4b3f-9554-1e5427c36e97\") " pod="openstack/horizon-5477bcbbcc-smk66" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.426032 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/78c1e92d-0b25-4b3f-9554-1e5427c36e97-horizon-secret-key\") pod \"horizon-5477bcbbcc-smk66\" (UID: \"78c1e92d-0b25-4b3f-9554-1e5427c36e97\") " pod="openstack/horizon-5477bcbbcc-smk66" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.426082 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78c1e92d-0b25-4b3f-9554-1e5427c36e97-logs\") pod \"horizon-5477bcbbcc-smk66\" (UID: \"78c1e92d-0b25-4b3f-9554-1e5427c36e97\") " pod="openstack/horizon-5477bcbbcc-smk66" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.426163 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-wbppr\" (UniqueName: \"kubernetes.io/projected/78c1e92d-0b25-4b3f-9554-1e5427c36e97-kube-api-access-wbppr\") pod \"horizon-5477bcbbcc-smk66\" (UID: \"78c1e92d-0b25-4b3f-9554-1e5427c36e97\") " pod="openstack/horizon-5477bcbbcc-smk66" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.426232 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78c1e92d-0b25-4b3f-9554-1e5427c36e97-config-data\") pod \"horizon-5477bcbbcc-smk66\" (UID: \"78c1e92d-0b25-4b3f-9554-1e5427c36e97\") " pod="openstack/horizon-5477bcbbcc-smk66" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.471714 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-74bd999885-xbm74"] Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.473461 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-74bd999885-xbm74" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.509109 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-74bd999885-xbm74"] Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.527659 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78c1e92d-0b25-4b3f-9554-1e5427c36e97-scripts\") pod \"horizon-5477bcbbcc-smk66\" (UID: \"78c1e92d-0b25-4b3f-9554-1e5427c36e97\") " pod="openstack/horizon-5477bcbbcc-smk66" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.527767 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/78c1e92d-0b25-4b3f-9554-1e5427c36e97-horizon-secret-key\") pod \"horizon-5477bcbbcc-smk66\" (UID: \"78c1e92d-0b25-4b3f-9554-1e5427c36e97\") " pod="openstack/horizon-5477bcbbcc-smk66" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.527824 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78c1e92d-0b25-4b3f-9554-1e5427c36e97-logs\") pod \"horizon-5477bcbbcc-smk66\" (UID: \"78c1e92d-0b25-4b3f-9554-1e5427c36e97\") " pod="openstack/horizon-5477bcbbcc-smk66" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.527883 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbppr\" (UniqueName: \"kubernetes.io/projected/78c1e92d-0b25-4b3f-9554-1e5427c36e97-kube-api-access-wbppr\") pod \"horizon-5477bcbbcc-smk66\" (UID: \"78c1e92d-0b25-4b3f-9554-1e5427c36e97\") " pod="openstack/horizon-5477bcbbcc-smk66" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.527938 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78c1e92d-0b25-4b3f-9554-1e5427c36e97-config-data\") pod \"horizon-5477bcbbcc-smk66\" (UID: \"78c1e92d-0b25-4b3f-9554-1e5427c36e97\") " pod="openstack/horizon-5477bcbbcc-smk66" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.529320 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78c1e92d-0b25-4b3f-9554-1e5427c36e97-config-data\") pod \"horizon-5477bcbbcc-smk66\" (UID: \"78c1e92d-0b25-4b3f-9554-1e5427c36e97\") " pod="openstack/horizon-5477bcbbcc-smk66" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.529591 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78c1e92d-0b25-4b3f-9554-1e5427c36e97-logs\") pod \"horizon-5477bcbbcc-smk66\" (UID: \"78c1e92d-0b25-4b3f-9554-1e5427c36e97\") " pod="openstack/horizon-5477bcbbcc-smk66" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.530902 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78c1e92d-0b25-4b3f-9554-1e5427c36e97-scripts\") pod 
\"horizon-5477bcbbcc-smk66\" (UID: \"78c1e92d-0b25-4b3f-9554-1e5427c36e97\") " pod="openstack/horizon-5477bcbbcc-smk66" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.541202 4909 generic.go:334] "Generic (PLEG): container finished" podID="5dc8b554-88ed-470a-b46a-3c0f611f75a8" containerID="96f4ccc6f4ef62d04ab2978fcd1557cbd0b70ab96e442bdeae7648989d511a40" exitCode=143 Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.541272 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5dc8b554-88ed-470a-b46a-3c0f611f75a8","Type":"ContainerDied","Data":"96f4ccc6f4ef62d04ab2978fcd1557cbd0b70ab96e442bdeae7648989d511a40"} Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.556922 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbppr\" (UniqueName: \"kubernetes.io/projected/78c1e92d-0b25-4b3f-9554-1e5427c36e97-kube-api-access-wbppr\") pod \"horizon-5477bcbbcc-smk66\" (UID: \"78c1e92d-0b25-4b3f-9554-1e5427c36e97\") " pod="openstack/horizon-5477bcbbcc-smk66" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.568453 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/78c1e92d-0b25-4b3f-9554-1e5427c36e97-horizon-secret-key\") pod \"horizon-5477bcbbcc-smk66\" (UID: \"78c1e92d-0b25-4b3f-9554-1e5427c36e97\") " pod="openstack/horizon-5477bcbbcc-smk66" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.597309 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5477bcbbcc-smk66" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.629495 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzqgc\" (UniqueName: \"kubernetes.io/projected/5a78aee9-062e-458e-93b7-be8197c84610-kube-api-access-rzqgc\") pod \"horizon-74bd999885-xbm74\" (UID: \"5a78aee9-062e-458e-93b7-be8197c84610\") " pod="openstack/horizon-74bd999885-xbm74" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.629776 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5a78aee9-062e-458e-93b7-be8197c84610-horizon-secret-key\") pod \"horizon-74bd999885-xbm74\" (UID: \"5a78aee9-062e-458e-93b7-be8197c84610\") " pod="openstack/horizon-74bd999885-xbm74" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.630142 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5a78aee9-062e-458e-93b7-be8197c84610-config-data\") pod \"horizon-74bd999885-xbm74\" (UID: \"5a78aee9-062e-458e-93b7-be8197c84610\") " pod="openstack/horizon-74bd999885-xbm74" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.630286 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a78aee9-062e-458e-93b7-be8197c84610-scripts\") pod \"horizon-74bd999885-xbm74\" (UID: \"5a78aee9-062e-458e-93b7-be8197c84610\") " pod="openstack/horizon-74bd999885-xbm74" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.630378 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a78aee9-062e-458e-93b7-be8197c84610-logs\") pod \"horizon-74bd999885-xbm74\" (UID: 
\"5a78aee9-062e-458e-93b7-be8197c84610\") " pod="openstack/horizon-74bd999885-xbm74" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.735377 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzqgc\" (UniqueName: \"kubernetes.io/projected/5a78aee9-062e-458e-93b7-be8197c84610-kube-api-access-rzqgc\") pod \"horizon-74bd999885-xbm74\" (UID: \"5a78aee9-062e-458e-93b7-be8197c84610\") " pod="openstack/horizon-74bd999885-xbm74" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.735788 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5a78aee9-062e-458e-93b7-be8197c84610-horizon-secret-key\") pod \"horizon-74bd999885-xbm74\" (UID: \"5a78aee9-062e-458e-93b7-be8197c84610\") " pod="openstack/horizon-74bd999885-xbm74" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.735864 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5a78aee9-062e-458e-93b7-be8197c84610-config-data\") pod \"horizon-74bd999885-xbm74\" (UID: \"5a78aee9-062e-458e-93b7-be8197c84610\") " pod="openstack/horizon-74bd999885-xbm74" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.735937 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a78aee9-062e-458e-93b7-be8197c84610-scripts\") pod \"horizon-74bd999885-xbm74\" (UID: \"5a78aee9-062e-458e-93b7-be8197c84610\") " pod="openstack/horizon-74bd999885-xbm74" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.735970 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a78aee9-062e-458e-93b7-be8197c84610-logs\") pod \"horizon-74bd999885-xbm74\" (UID: \"5a78aee9-062e-458e-93b7-be8197c84610\") " pod="openstack/horizon-74bd999885-xbm74" Feb 02 12:11:45 crc 
kubenswrapper[4909]: I0202 12:11:45.736440 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a78aee9-062e-458e-93b7-be8197c84610-logs\") pod \"horizon-74bd999885-xbm74\" (UID: \"5a78aee9-062e-458e-93b7-be8197c84610\") " pod="openstack/horizon-74bd999885-xbm74" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.741642 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5a78aee9-062e-458e-93b7-be8197c84610-horizon-secret-key\") pod \"horizon-74bd999885-xbm74\" (UID: \"5a78aee9-062e-458e-93b7-be8197c84610\") " pod="openstack/horizon-74bd999885-xbm74" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.742446 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5a78aee9-062e-458e-93b7-be8197c84610-config-data\") pod \"horizon-74bd999885-xbm74\" (UID: \"5a78aee9-062e-458e-93b7-be8197c84610\") " pod="openstack/horizon-74bd999885-xbm74" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.742519 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a78aee9-062e-458e-93b7-be8197c84610-scripts\") pod \"horizon-74bd999885-xbm74\" (UID: \"5a78aee9-062e-458e-93b7-be8197c84610\") " pod="openstack/horizon-74bd999885-xbm74" Feb 02 12:11:45 crc kubenswrapper[4909]: I0202 12:11:45.759559 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzqgc\" (UniqueName: \"kubernetes.io/projected/5a78aee9-062e-458e-93b7-be8197c84610-kube-api-access-rzqgc\") pod \"horizon-74bd999885-xbm74\" (UID: \"5a78aee9-062e-458e-93b7-be8197c84610\") " pod="openstack/horizon-74bd999885-xbm74" Feb 02 12:11:46 crc kubenswrapper[4909]: I0202 12:11:46.012749 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-74bd999885-xbm74"
Feb 02 12:11:46 crc kubenswrapper[4909]: I0202 12:11:46.105788 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5477bcbbcc-smk66"]
Feb 02 12:11:46 crc kubenswrapper[4909]: W0202 12:11:46.131517 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78c1e92d_0b25_4b3f_9554_1e5427c36e97.slice/crio-19f12c2fb15aa462b4602db6bd0504b905e8a68cec3ca389c4e097bfe9cf538f WatchSource:0}: Error finding container 19f12c2fb15aa462b4602db6bd0504b905e8a68cec3ca389c4e097bfe9cf538f: Status 404 returned error can't find the container with id 19f12c2fb15aa462b4602db6bd0504b905e8a68cec3ca389c4e097bfe9cf538f
Feb 02 12:11:46 crc kubenswrapper[4909]: I0202 12:11:46.557906 4909 generic.go:334] "Generic (PLEG): container finished" podID="3e06882f-c657-49fc-a5a1-0657121844f9" containerID="7cc5359348c3520e4ea084e1f1374a0dfb12683f12fec55c1aafc56234509af1" exitCode=143
Feb 02 12:11:46 crc kubenswrapper[4909]: I0202 12:11:46.557982 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3e06882f-c657-49fc-a5a1-0657121844f9","Type":"ContainerDied","Data":"7cc5359348c3520e4ea084e1f1374a0dfb12683f12fec55c1aafc56234509af1"}
Feb 02 12:11:46 crc kubenswrapper[4909]: I0202 12:11:46.558352 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-74bd999885-xbm74"]
Feb 02 12:11:46 crc kubenswrapper[4909]: I0202 12:11:46.559349 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5477bcbbcc-smk66" event={"ID":"78c1e92d-0b25-4b3f-9554-1e5427c36e97","Type":"ContainerStarted","Data":"19f12c2fb15aa462b4602db6bd0504b905e8a68cec3ca389c4e097bfe9cf538f"}
Feb 02 12:11:46 crc kubenswrapper[4909]: W0202 12:11:46.567372 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a78aee9_062e_458e_93b7_be8197c84610.slice/crio-c38379070d6554a0bbfb570c9ae53af6d90d3c41f90bb6d07db82c9670f5a610 WatchSource:0}: Error finding container c38379070d6554a0bbfb570c9ae53af6d90d3c41f90bb6d07db82c9670f5a610: Status 404 returned error can't find the container with id c38379070d6554a0bbfb570c9ae53af6d90d3c41f90bb6d07db82c9670f5a610
Feb 02 12:11:47 crc kubenswrapper[4909]: I0202 12:11:47.570146 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74bd999885-xbm74" event={"ID":"5a78aee9-062e-458e-93b7-be8197c84610","Type":"ContainerStarted","Data":"c38379070d6554a0bbfb570c9ae53af6d90d3c41f90bb6d07db82c9670f5a610"}
Feb 02 12:11:47 crc kubenswrapper[4909]: I0202 12:11:47.740356 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5477bcbbcc-smk66"]
Feb 02 12:11:47 crc kubenswrapper[4909]: I0202 12:11:47.791943 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-57d579b9bb-2znmd"]
Feb 02 12:11:47 crc kubenswrapper[4909]: I0202 12:11:47.797744 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57d579b9bb-2znmd"
Feb 02 12:11:47 crc kubenswrapper[4909]: I0202 12:11:47.807769 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-57d579b9bb-2znmd"]
Feb 02 12:11:47 crc kubenswrapper[4909]: I0202 12:11:47.823023 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc"
Feb 02 12:11:47 crc kubenswrapper[4909]: I0202 12:11:47.889569 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-74bd999885-xbm74"]
Feb 02 12:11:47 crc kubenswrapper[4909]: I0202 12:11:47.921201 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-64775b7466-p6d8c"]
Feb 02 12:11:47 crc kubenswrapper[4909]: I0202 12:11:47.924184 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64775b7466-p6d8c"
Feb 02 12:11:47 crc kubenswrapper[4909]: I0202 12:11:47.933198 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-64775b7466-p6d8c"]
Feb 02 12:11:47 crc kubenswrapper[4909]: I0202 12:11:47.934282 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-logs\") pod \"horizon-57d579b9bb-2znmd\" (UID: \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\") " pod="openstack/horizon-57d579b9bb-2znmd"
Feb 02 12:11:47 crc kubenswrapper[4909]: I0202 12:11:47.952190 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-combined-ca-bundle\") pod \"horizon-57d579b9bb-2znmd\" (UID: \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\") " pod="openstack/horizon-57d579b9bb-2znmd"
Feb 02 12:11:47 crc kubenswrapper[4909]: I0202 12:11:47.952526 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-horizon-secret-key\") pod \"horizon-57d579b9bb-2znmd\" (UID: \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\") " pod="openstack/horizon-57d579b9bb-2znmd"
Feb 02 12:11:47 crc kubenswrapper[4909]: I0202 12:11:47.953034 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvrz8\" (UniqueName: \"kubernetes.io/projected/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-kube-api-access-qvrz8\") pod \"horizon-57d579b9bb-2znmd\" (UID: \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\") " pod="openstack/horizon-57d579b9bb-2znmd"
Feb 02 12:11:47 crc kubenswrapper[4909]: I0202 12:11:47.953260 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-config-data\") pod \"horizon-57d579b9bb-2znmd\" (UID: \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\") " pod="openstack/horizon-57d579b9bb-2znmd"
Feb 02 12:11:47 crc kubenswrapper[4909]: I0202 12:11:47.953407 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-scripts\") pod \"horizon-57d579b9bb-2znmd\" (UID: \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\") " pod="openstack/horizon-57d579b9bb-2znmd"
Feb 02 12:11:47 crc kubenswrapper[4909]: I0202 12:11:47.953532 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-horizon-tls-certs\") pod \"horizon-57d579b9bb-2znmd\" (UID: \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\") " pod="openstack/horizon-57d579b9bb-2znmd"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.055007 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-horizon-tls-certs\") pod \"horizon-64775b7466-p6d8c\" (UID: \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\") " pod="openstack/horizon-64775b7466-p6d8c"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.055076 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-logs\") pod \"horizon-57d579b9bb-2znmd\" (UID: \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\") " pod="openstack/horizon-57d579b9bb-2znmd"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.055119 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-combined-ca-bundle\") pod \"horizon-57d579b9bb-2znmd\" (UID: \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\") " pod="openstack/horizon-57d579b9bb-2znmd"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.055153 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-horizon-secret-key\") pod \"horizon-57d579b9bb-2znmd\" (UID: \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\") " pod="openstack/horizon-57d579b9bb-2znmd"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.055210 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npkcj\" (UniqueName: \"kubernetes.io/projected/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-kube-api-access-npkcj\") pod \"horizon-64775b7466-p6d8c\" (UID: \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\") " pod="openstack/horizon-64775b7466-p6d8c"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.055249 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-logs\") pod \"horizon-64775b7466-p6d8c\" (UID: \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\") " pod="openstack/horizon-64775b7466-p6d8c"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.055308 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvrz8\" (UniqueName: \"kubernetes.io/projected/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-kube-api-access-qvrz8\") pod \"horizon-57d579b9bb-2znmd\" (UID: \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\") " pod="openstack/horizon-57d579b9bb-2znmd"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.055333 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-combined-ca-bundle\") pod \"horizon-64775b7466-p6d8c\" (UID: \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\") " pod="openstack/horizon-64775b7466-p6d8c"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.055407 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-config-data\") pod \"horizon-64775b7466-p6d8c\" (UID: \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\") " pod="openstack/horizon-64775b7466-p6d8c"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.055468 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-config-data\") pod \"horizon-57d579b9bb-2znmd\" (UID: \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\") " pod="openstack/horizon-57d579b9bb-2znmd"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.055553 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-scripts\") pod \"horizon-57d579b9bb-2znmd\" (UID: \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\") " pod="openstack/horizon-57d579b9bb-2znmd"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.055580 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-horizon-tls-certs\") pod \"horizon-57d579b9bb-2znmd\" (UID: \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\") " pod="openstack/horizon-57d579b9bb-2znmd"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.055655 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-horizon-secret-key\") pod \"horizon-64775b7466-p6d8c\" (UID: \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\") " pod="openstack/horizon-64775b7466-p6d8c"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.055823 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-scripts\") pod \"horizon-64775b7466-p6d8c\" (UID: \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\") " pod="openstack/horizon-64775b7466-p6d8c"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.057973 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-logs\") pod \"horizon-57d579b9bb-2znmd\" (UID: \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\") " pod="openstack/horizon-57d579b9bb-2znmd"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.059523 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-config-data\") pod \"horizon-57d579b9bb-2znmd\" (UID: \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\") " pod="openstack/horizon-57d579b9bb-2znmd"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.060349 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-scripts\") pod \"horizon-57d579b9bb-2znmd\" (UID: \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\") " pod="openstack/horizon-57d579b9bb-2znmd"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.066104 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-horizon-tls-certs\") pod \"horizon-57d579b9bb-2znmd\" (UID: \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\") " pod="openstack/horizon-57d579b9bb-2znmd"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.069055 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-horizon-secret-key\") pod \"horizon-57d579b9bb-2znmd\" (UID: \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\") " pod="openstack/horizon-57d579b9bb-2znmd"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.079183 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-combined-ca-bundle\") pod \"horizon-57d579b9bb-2znmd\" (UID: \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\") " pod="openstack/horizon-57d579b9bb-2znmd"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.081580 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvrz8\" (UniqueName: \"kubernetes.io/projected/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-kube-api-access-qvrz8\") pod \"horizon-57d579b9bb-2znmd\" (UID: \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\") " pod="openstack/horizon-57d579b9bb-2znmd"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.142328 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57d579b9bb-2znmd"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.163455 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-combined-ca-bundle\") pod \"horizon-64775b7466-p6d8c\" (UID: \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\") " pod="openstack/horizon-64775b7466-p6d8c"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.164076 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-config-data\") pod \"horizon-64775b7466-p6d8c\" (UID: \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\") " pod="openstack/horizon-64775b7466-p6d8c"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.164156 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-horizon-secret-key\") pod \"horizon-64775b7466-p6d8c\" (UID: \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\") " pod="openstack/horizon-64775b7466-p6d8c"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.164235 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-scripts\") pod \"horizon-64775b7466-p6d8c\" (UID: \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\") " pod="openstack/horizon-64775b7466-p6d8c"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.164275 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-horizon-tls-certs\") pod \"horizon-64775b7466-p6d8c\" (UID: \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\") " pod="openstack/horizon-64775b7466-p6d8c"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.164316 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npkcj\" (UniqueName: \"kubernetes.io/projected/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-kube-api-access-npkcj\") pod \"horizon-64775b7466-p6d8c\" (UID: \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\") " pod="openstack/horizon-64775b7466-p6d8c"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.164341 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-logs\") pod \"horizon-64775b7466-p6d8c\" (UID: \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\") " pod="openstack/horizon-64775b7466-p6d8c"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.165040 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-logs\") pod \"horizon-64775b7466-p6d8c\" (UID: \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\") " pod="openstack/horizon-64775b7466-p6d8c"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.168050 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-scripts\") pod \"horizon-64775b7466-p6d8c\" (UID: \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\") " pod="openstack/horizon-64775b7466-p6d8c"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.168218 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-config-data\") pod \"horizon-64775b7466-p6d8c\" (UID: \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\") " pod="openstack/horizon-64775b7466-p6d8c"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.170333 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-horizon-secret-key\") pod \"horizon-64775b7466-p6d8c\" (UID: \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\") " pod="openstack/horizon-64775b7466-p6d8c"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.170338 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-horizon-tls-certs\") pod \"horizon-64775b7466-p6d8c\" (UID: \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\") " pod="openstack/horizon-64775b7466-p6d8c"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.171107 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-combined-ca-bundle\") pod \"horizon-64775b7466-p6d8c\" (UID: \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\") " pod="openstack/horizon-64775b7466-p6d8c"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.182250 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npkcj\" (UniqueName: \"kubernetes.io/projected/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-kube-api-access-npkcj\") pod \"horizon-64775b7466-p6d8c\" (UID: \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\") " pod="openstack/horizon-64775b7466-p6d8c"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.256267 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64775b7466-p6d8c"
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.694495 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-57d579b9bb-2znmd"]
Feb 02 12:11:48 crc kubenswrapper[4909]: W0202 12:11:48.732163 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded357b84_9b8b_40ac_b2a9_4ae891e248a7.slice/crio-14e3b458796c0cf98273d3b185857c171f420f893ad770f6ca149d2f06c8048f WatchSource:0}: Error finding container 14e3b458796c0cf98273d3b185857c171f420f893ad770f6ca149d2f06c8048f: Status 404 returned error can't find the container with id 14e3b458796c0cf98273d3b185857c171f420f893ad770f6ca149d2f06c8048f
Feb 02 12:11:48 crc kubenswrapper[4909]: I0202 12:11:48.841667 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-64775b7466-p6d8c"]
Feb 02 12:11:48 crc kubenswrapper[4909]: W0202 12:11:48.853966 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20c51ec9_94ae_4b2a_b16e_15877df6b3e6.slice/crio-3f0d2e7c227086b2fd8daab321ce6831d7b90f59bf84d3090a9cb0a8b48e9f86 WatchSource:0}: Error finding container 3f0d2e7c227086b2fd8daab321ce6831d7b90f59bf84d3090a9cb0a8b48e9f86: Status 404 returned error can't find the container with id 3f0d2e7c227086b2fd8daab321ce6831d7b90f59bf84d3090a9cb0a8b48e9f86
Feb 02 12:11:49 crc kubenswrapper[4909]: I0202 12:11:49.043247 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-46w84"]
Feb 02 12:11:49 crc kubenswrapper[4909]: I0202 12:11:49.043286 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-46w84"]
Feb 02 12:11:49 crc kubenswrapper[4909]: I0202 12:11:49.642118 4909 generic.go:334] "Generic (PLEG): container finished" podID="5dc8b554-88ed-470a-b46a-3c0f611f75a8" containerID="9fde8894f21d4504d06d52c3811b3597f98f90e7e5fd7842792938797aaa82fa" exitCode=0
Feb 02 12:11:49 crc kubenswrapper[4909]: I0202 12:11:49.642290 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5dc8b554-88ed-470a-b46a-3c0f611f75a8","Type":"ContainerDied","Data":"9fde8894f21d4504d06d52c3811b3597f98f90e7e5fd7842792938797aaa82fa"}
Feb 02 12:11:49 crc kubenswrapper[4909]: I0202 12:11:49.660352 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57d579b9bb-2znmd" event={"ID":"ed357b84-9b8b-40ac-b2a9-4ae891e248a7","Type":"ContainerStarted","Data":"14e3b458796c0cf98273d3b185857c171f420f893ad770f6ca149d2f06c8048f"}
Feb 02 12:11:49 crc kubenswrapper[4909]: I0202 12:11:49.662312 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64775b7466-p6d8c" event={"ID":"20c51ec9-94ae-4b2a-b16e-15877df6b3e6","Type":"ContainerStarted","Data":"3f0d2e7c227086b2fd8daab321ce6831d7b90f59bf84d3090a9cb0a8b48e9f86"}
Feb 02 12:11:49 crc kubenswrapper[4909]: I0202 12:11:49.682357 4909 generic.go:334] "Generic (PLEG): container finished" podID="3e06882f-c657-49fc-a5a1-0657121844f9" containerID="d04ce1909f7f75c4a6b91d6cba96d86f768ab4138ad910634c3ddafeb3c56643" exitCode=0
Feb 02 12:11:49 crc kubenswrapper[4909]: I0202 12:11:49.682411 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3e06882f-c657-49fc-a5a1-0657121844f9","Type":"ContainerDied","Data":"d04ce1909f7f75c4a6b91d6cba96d86f768ab4138ad910634c3ddafeb3c56643"}
Feb 02 12:11:51 crc kubenswrapper[4909]: I0202 12:11:51.030070 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0416ea7e-0584-4585-9d94-75f1df10d436" path="/var/lib/kubelet/pods/0416ea7e-0584-4585-9d94-75f1df10d436/volumes"
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.534069 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.707763 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dc8b554-88ed-470a-b46a-3c0f611f75a8-public-tls-certs\") pod \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\" (UID: \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\") "
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.708186 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpcfq\" (UniqueName: \"kubernetes.io/projected/5dc8b554-88ed-470a-b46a-3c0f611f75a8-kube-api-access-hpcfq\") pod \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\" (UID: \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\") "
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.708253 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5dc8b554-88ed-470a-b46a-3c0f611f75a8-httpd-run\") pod \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\" (UID: \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\") "
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.708536 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dc8b554-88ed-470a-b46a-3c0f611f75a8-combined-ca-bundle\") pod \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\" (UID: \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\") "
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.708609 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dc8b554-88ed-470a-b46a-3c0f611f75a8-config-data\") pod \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\" (UID: \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\") "
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.708801 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dc8b554-88ed-470a-b46a-3c0f611f75a8-logs\") pod \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\" (UID: \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\") "
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.708876 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dc8b554-88ed-470a-b46a-3c0f611f75a8-scripts\") pod \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\" (UID: \"5dc8b554-88ed-470a-b46a-3c0f611f75a8\") "
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.708951 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dc8b554-88ed-470a-b46a-3c0f611f75a8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5dc8b554-88ed-470a-b46a-3c0f611f75a8" (UID: "5dc8b554-88ed-470a-b46a-3c0f611f75a8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.709337 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dc8b554-88ed-470a-b46a-3c0f611f75a8-logs" (OuterVolumeSpecName: "logs") pod "5dc8b554-88ed-470a-b46a-3c0f611f75a8" (UID: "5dc8b554-88ed-470a-b46a-3c0f611f75a8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.710069 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5dc8b554-88ed-470a-b46a-3c0f611f75a8-logs\") on node \"crc\" DevicePath \"\""
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.710087 4909 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5dc8b554-88ed-470a-b46a-3c0f611f75a8-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.713661 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dc8b554-88ed-470a-b46a-3c0f611f75a8-scripts" (OuterVolumeSpecName: "scripts") pod "5dc8b554-88ed-470a-b46a-3c0f611f75a8" (UID: "5dc8b554-88ed-470a-b46a-3c0f611f75a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.714159 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dc8b554-88ed-470a-b46a-3c0f611f75a8-kube-api-access-hpcfq" (OuterVolumeSpecName: "kube-api-access-hpcfq") pod "5dc8b554-88ed-470a-b46a-3c0f611f75a8" (UID: "5dc8b554-88ed-470a-b46a-3c0f611f75a8"). InnerVolumeSpecName "kube-api-access-hpcfq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.744614 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dc8b554-88ed-470a-b46a-3c0f611f75a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5dc8b554-88ed-470a-b46a-3c0f611f75a8" (UID: "5dc8b554-88ed-470a-b46a-3c0f611f75a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.747140 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"5dc8b554-88ed-470a-b46a-3c0f611f75a8","Type":"ContainerDied","Data":"eba133e8c2a6f7a0ff6d9a6faaf3cc08378656e4600394dfc8bf8b8ae440c122"}
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.747195 4909 scope.go:117] "RemoveContainer" containerID="9fde8894f21d4504d06d52c3811b3597f98f90e7e5fd7842792938797aaa82fa"
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.747646 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.757602 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3e06882f-c657-49fc-a5a1-0657121844f9","Type":"ContainerDied","Data":"bc4d2dc893be6343f1e743af17426fa3406b5ac22bf25d7be5183ac9267b615a"}
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.757636 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc4d2dc893be6343f1e743af17426fa3406b5ac22bf25d7be5183ac9267b615a"
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.762182 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.764878 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dc8b554-88ed-470a-b46a-3c0f611f75a8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5dc8b554-88ed-470a-b46a-3c0f611f75a8" (UID: "5dc8b554-88ed-470a-b46a-3c0f611f75a8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.777072 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dc8b554-88ed-470a-b46a-3c0f611f75a8-config-data" (OuterVolumeSpecName: "config-data") pod "5dc8b554-88ed-470a-b46a-3c0f611f75a8" (UID: "5dc8b554-88ed-470a-b46a-3c0f611f75a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.811994 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dc8b554-88ed-470a-b46a-3c0f611f75a8-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.812175 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dc8b554-88ed-470a-b46a-3c0f611f75a8-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.812254 4909 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5dc8b554-88ed-470a-b46a-3c0f611f75a8-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.812348 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpcfq\" (UniqueName: \"kubernetes.io/projected/5dc8b554-88ed-470a-b46a-3c0f611f75a8-kube-api-access-hpcfq\") on node \"crc\" DevicePath \"\""
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.812428 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dc8b554-88ed-470a-b46a-3c0f611f75a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.848499 4909 scope.go:117] "RemoveContainer" containerID="96f4ccc6f4ef62d04ab2978fcd1557cbd0b70ab96e442bdeae7648989d511a40"
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.914480 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfckc\" (UniqueName: \"kubernetes.io/projected/3e06882f-c657-49fc-a5a1-0657121844f9-kube-api-access-vfckc\") pod \"3e06882f-c657-49fc-a5a1-0657121844f9\" (UID: \"3e06882f-c657-49fc-a5a1-0657121844f9\") "
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.914570 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e06882f-c657-49fc-a5a1-0657121844f9-combined-ca-bundle\") pod \"3e06882f-c657-49fc-a5a1-0657121844f9\" (UID: \"3e06882f-c657-49fc-a5a1-0657121844f9\") "
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.914617 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e06882f-c657-49fc-a5a1-0657121844f9-logs\") pod \"3e06882f-c657-49fc-a5a1-0657121844f9\" (UID: \"3e06882f-c657-49fc-a5a1-0657121844f9\") "
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.914646 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e06882f-c657-49fc-a5a1-0657121844f9-httpd-run\") pod \"3e06882f-c657-49fc-a5a1-0657121844f9\" (UID: \"3e06882f-c657-49fc-a5a1-0657121844f9\") "
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.914718 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e06882f-c657-49fc-a5a1-0657121844f9-scripts\") pod \"3e06882f-c657-49fc-a5a1-0657121844f9\" (UID: \"3e06882f-c657-49fc-a5a1-0657121844f9\") "
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.914849 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e06882f-c657-49fc-a5a1-0657121844f9-internal-tls-certs\") pod \"3e06882f-c657-49fc-a5a1-0657121844f9\" (UID: \"3e06882f-c657-49fc-a5a1-0657121844f9\") "
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.914873 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e06882f-c657-49fc-a5a1-0657121844f9-config-data\") pod \"3e06882f-c657-49fc-a5a1-0657121844f9\" (UID: \"3e06882f-c657-49fc-a5a1-0657121844f9\") "
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.915652 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e06882f-c657-49fc-a5a1-0657121844f9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3e06882f-c657-49fc-a5a1-0657121844f9" (UID: "3e06882f-c657-49fc-a5a1-0657121844f9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.916057 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e06882f-c657-49fc-a5a1-0657121844f9-logs" (OuterVolumeSpecName: "logs") pod "3e06882f-c657-49fc-a5a1-0657121844f9" (UID: "3e06882f-c657-49fc-a5a1-0657121844f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.916885 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e06882f-c657-49fc-a5a1-0657121844f9-logs\") on node \"crc\" DevicePath \"\""
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.916908 4909 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e06882f-c657-49fc-a5a1-0657121844f9-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.929453 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e06882f-c657-49fc-a5a1-0657121844f9-kube-api-access-vfckc" (OuterVolumeSpecName: "kube-api-access-vfckc") pod "3e06882f-c657-49fc-a5a1-0657121844f9" (UID: "3e06882f-c657-49fc-a5a1-0657121844f9"). InnerVolumeSpecName "kube-api-access-vfckc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 12:11:55 crc kubenswrapper[4909]: I0202 12:11:55.959611 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e06882f-c657-49fc-a5a1-0657121844f9-scripts" (OuterVolumeSpecName: "scripts") pod "3e06882f-c657-49fc-a5a1-0657121844f9" (UID: "3e06882f-c657-49fc-a5a1-0657121844f9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.020708 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfckc\" (UniqueName: \"kubernetes.io/projected/3e06882f-c657-49fc-a5a1-0657121844f9-kube-api-access-vfckc\") on node \"crc\" DevicePath \"\""
Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.020738 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e06882f-c657-49fc-a5a1-0657121844f9-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.112327 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.118233 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e06882f-c657-49fc-a5a1-0657121844f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e06882f-c657-49fc-a5a1-0657121844f9" (UID: "3e06882f-c657-49fc-a5a1-0657121844f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.124430 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e06882f-c657-49fc-a5a1-0657121844f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.125284 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.151053 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e06882f-c657-49fc-a5a1-0657121844f9-config-data" (OuterVolumeSpecName: "config-data") pod "3e06882f-c657-49fc-a5a1-0657121844f9" (UID: "3e06882f-c657-49fc-a5a1-0657121844f9").
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.157774 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 12:11:56 crc kubenswrapper[4909]: E0202 12:11:56.158363 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc8b554-88ed-470a-b46a-3c0f611f75a8" containerName="glance-httpd" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.158385 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc8b554-88ed-470a-b46a-3c0f611f75a8" containerName="glance-httpd" Feb 02 12:11:56 crc kubenswrapper[4909]: E0202 12:11:56.158426 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e06882f-c657-49fc-a5a1-0657121844f9" containerName="glance-log" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.158435 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e06882f-c657-49fc-a5a1-0657121844f9" containerName="glance-log" Feb 02 12:11:56 crc kubenswrapper[4909]: E0202 12:11:56.158463 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc8b554-88ed-470a-b46a-3c0f611f75a8" containerName="glance-log" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.158471 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc8b554-88ed-470a-b46a-3c0f611f75a8" containerName="glance-log" Feb 02 12:11:56 crc kubenswrapper[4909]: E0202 12:11:56.158480 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e06882f-c657-49fc-a5a1-0657121844f9" containerName="glance-httpd" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.158489 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e06882f-c657-49fc-a5a1-0657121844f9" containerName="glance-httpd" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.158699 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e06882f-c657-49fc-a5a1-0657121844f9" 
containerName="glance-log" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.158724 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e06882f-c657-49fc-a5a1-0657121844f9" containerName="glance-httpd" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.158743 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc8b554-88ed-470a-b46a-3c0f611f75a8" containerName="glance-httpd" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.158765 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc8b554-88ed-470a-b46a-3c0f611f75a8" containerName="glance-log" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.160034 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.162826 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.163047 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.167300 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e06882f-c657-49fc-a5a1-0657121844f9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3e06882f-c657-49fc-a5a1-0657121844f9" (UID: "3e06882f-c657-49fc-a5a1-0657121844f9"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.171687 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.226360 4909 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e06882f-c657-49fc-a5a1-0657121844f9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.226395 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e06882f-c657-49fc-a5a1-0657121844f9-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.327769 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b99d8fda-2d28-4e7b-9df5-e5bb8750e52a-logs\") pod \"glance-default-external-api-0\" (UID: \"b99d8fda-2d28-4e7b-9df5-e5bb8750e52a\") " pod="openstack/glance-default-external-api-0" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.328173 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b99d8fda-2d28-4e7b-9df5-e5bb8750e52a-scripts\") pod \"glance-default-external-api-0\" (UID: \"b99d8fda-2d28-4e7b-9df5-e5bb8750e52a\") " pod="openstack/glance-default-external-api-0" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.328273 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b99d8fda-2d28-4e7b-9df5-e5bb8750e52a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b99d8fda-2d28-4e7b-9df5-e5bb8750e52a\") " pod="openstack/glance-default-external-api-0" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.328305 
4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b99d8fda-2d28-4e7b-9df5-e5bb8750e52a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b99d8fda-2d28-4e7b-9df5-e5bb8750e52a\") " pod="openstack/glance-default-external-api-0" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.328367 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw6ht\" (UniqueName: \"kubernetes.io/projected/b99d8fda-2d28-4e7b-9df5-e5bb8750e52a-kube-api-access-bw6ht\") pod \"glance-default-external-api-0\" (UID: \"b99d8fda-2d28-4e7b-9df5-e5bb8750e52a\") " pod="openstack/glance-default-external-api-0" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.328418 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b99d8fda-2d28-4e7b-9df5-e5bb8750e52a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b99d8fda-2d28-4e7b-9df5-e5bb8750e52a\") " pod="openstack/glance-default-external-api-0" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.328487 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b99d8fda-2d28-4e7b-9df5-e5bb8750e52a-config-data\") pod \"glance-default-external-api-0\" (UID: \"b99d8fda-2d28-4e7b-9df5-e5bb8750e52a\") " pod="openstack/glance-default-external-api-0" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.432005 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b99d8fda-2d28-4e7b-9df5-e5bb8750e52a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b99d8fda-2d28-4e7b-9df5-e5bb8750e52a\") " pod="openstack/glance-default-external-api-0" Feb 02 12:11:56 crc 
kubenswrapper[4909]: I0202 12:11:56.432129 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b99d8fda-2d28-4e7b-9df5-e5bb8750e52a-config-data\") pod \"glance-default-external-api-0\" (UID: \"b99d8fda-2d28-4e7b-9df5-e5bb8750e52a\") " pod="openstack/glance-default-external-api-0" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.432177 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b99d8fda-2d28-4e7b-9df5-e5bb8750e52a-logs\") pod \"glance-default-external-api-0\" (UID: \"b99d8fda-2d28-4e7b-9df5-e5bb8750e52a\") " pod="openstack/glance-default-external-api-0" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.432242 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b99d8fda-2d28-4e7b-9df5-e5bb8750e52a-scripts\") pod \"glance-default-external-api-0\" (UID: \"b99d8fda-2d28-4e7b-9df5-e5bb8750e52a\") " pod="openstack/glance-default-external-api-0" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.432364 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b99d8fda-2d28-4e7b-9df5-e5bb8750e52a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b99d8fda-2d28-4e7b-9df5-e5bb8750e52a\") " pod="openstack/glance-default-external-api-0" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.432673 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b99d8fda-2d28-4e7b-9df5-e5bb8750e52a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b99d8fda-2d28-4e7b-9df5-e5bb8750e52a\") " pod="openstack/glance-default-external-api-0" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.432764 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bw6ht\" (UniqueName: \"kubernetes.io/projected/b99d8fda-2d28-4e7b-9df5-e5bb8750e52a-kube-api-access-bw6ht\") pod \"glance-default-external-api-0\" (UID: \"b99d8fda-2d28-4e7b-9df5-e5bb8750e52a\") " pod="openstack/glance-default-external-api-0" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.433444 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b99d8fda-2d28-4e7b-9df5-e5bb8750e52a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b99d8fda-2d28-4e7b-9df5-e5bb8750e52a\") " pod="openstack/glance-default-external-api-0" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.433866 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b99d8fda-2d28-4e7b-9df5-e5bb8750e52a-logs\") pod \"glance-default-external-api-0\" (UID: \"b99d8fda-2d28-4e7b-9df5-e5bb8750e52a\") " pod="openstack/glance-default-external-api-0" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.439492 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b99d8fda-2d28-4e7b-9df5-e5bb8750e52a-scripts\") pod \"glance-default-external-api-0\" (UID: \"b99d8fda-2d28-4e7b-9df5-e5bb8750e52a\") " pod="openstack/glance-default-external-api-0" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.442239 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b99d8fda-2d28-4e7b-9df5-e5bb8750e52a-config-data\") pod \"glance-default-external-api-0\" (UID: \"b99d8fda-2d28-4e7b-9df5-e5bb8750e52a\") " pod="openstack/glance-default-external-api-0" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.442395 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b99d8fda-2d28-4e7b-9df5-e5bb8750e52a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b99d8fda-2d28-4e7b-9df5-e5bb8750e52a\") " pod="openstack/glance-default-external-api-0" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.447274 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b99d8fda-2d28-4e7b-9df5-e5bb8750e52a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b99d8fda-2d28-4e7b-9df5-e5bb8750e52a\") " pod="openstack/glance-default-external-api-0" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.460932 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw6ht\" (UniqueName: \"kubernetes.io/projected/b99d8fda-2d28-4e7b-9df5-e5bb8750e52a-kube-api-access-bw6ht\") pod \"glance-default-external-api-0\" (UID: \"b99d8fda-2d28-4e7b-9df5-e5bb8750e52a\") " pod="openstack/glance-default-external-api-0" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.485843 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.812436 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57d579b9bb-2znmd" event={"ID":"ed357b84-9b8b-40ac-b2a9-4ae891e248a7","Type":"ContainerStarted","Data":"1540b67a373b365e32b8ba90ccc12d0000855bf43b0e5c3ab7fbe6997e859116"} Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.812842 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57d579b9bb-2znmd" event={"ID":"ed357b84-9b8b-40ac-b2a9-4ae891e248a7","Type":"ContainerStarted","Data":"877bac677ebdad4604a78946e22260fc3fd3d3d9563461be6b0030f1cfd77b1a"} Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.824886 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64775b7466-p6d8c" event={"ID":"20c51ec9-94ae-4b2a-b16e-15877df6b3e6","Type":"ContainerStarted","Data":"bae7e028e4266fea529772f976481a28252d0e91ca7ab2592827f2f5863017f3"} Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.824956 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64775b7466-p6d8c" event={"ID":"20c51ec9-94ae-4b2a-b16e-15877df6b3e6","Type":"ContainerStarted","Data":"1339be205d32f9ae3651f10f23090655a6a424a80731083143e8a4f4eb39b519"} Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.829435 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5477bcbbcc-smk66" event={"ID":"78c1e92d-0b25-4b3f-9554-1e5427c36e97","Type":"ContainerStarted","Data":"96e735086c0c6df7af6c8606f7e182a4ab467f3a4db3f6f794b92cca05873bf7"} Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.829494 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5477bcbbcc-smk66" event={"ID":"78c1e92d-0b25-4b3f-9554-1e5427c36e97","Type":"ContainerStarted","Data":"15f89a865819b640d7ca1377e3b4971b1562d430744d75a274e1ecfd98268c27"} Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.829490 
4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5477bcbbcc-smk66" podUID="78c1e92d-0b25-4b3f-9554-1e5427c36e97" containerName="horizon-log" containerID="cri-o://15f89a865819b640d7ca1377e3b4971b1562d430744d75a274e1ecfd98268c27" gracePeriod=30 Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.829564 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5477bcbbcc-smk66" podUID="78c1e92d-0b25-4b3f-9554-1e5427c36e97" containerName="horizon" containerID="cri-o://96e735086c0c6df7af6c8606f7e182a4ab467f3a4db3f6f794b92cca05873bf7" gracePeriod=30 Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.860119 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.864178 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-74bd999885-xbm74" podUID="5a78aee9-062e-458e-93b7-be8197c84610" containerName="horizon-log" containerID="cri-o://2338469d660dc8c860be39bbd8f8edd90d79068adb6e302000fefe580bfb913e" gracePeriod=30 Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.864510 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74bd999885-xbm74" event={"ID":"5a78aee9-062e-458e-93b7-be8197c84610","Type":"ContainerStarted","Data":"a5f6c40bb50703e7f9592ea75c786975ed50dd7dfaf472060441f222b4e4001f"} Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.864535 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74bd999885-xbm74" event={"ID":"5a78aee9-062e-458e-93b7-be8197c84610","Type":"ContainerStarted","Data":"2338469d660dc8c860be39bbd8f8edd90d79068adb6e302000fefe580bfb913e"} Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.864636 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-74bd999885-xbm74" 
podUID="5a78aee9-062e-458e-93b7-be8197c84610" containerName="horizon" containerID="cri-o://a5f6c40bb50703e7f9592ea75c786975ed50dd7dfaf472060441f222b4e4001f" gracePeriod=30 Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.880868 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-57d579b9bb-2znmd" podStartSLOduration=3.097465542 podStartE2EDuration="9.880846508s" podCreationTimestamp="2026-02-02 12:11:47 +0000 UTC" firstStartedPulling="2026-02-02 12:11:48.738358336 +0000 UTC m=+6034.484459071" lastFinishedPulling="2026-02-02 12:11:55.521739302 +0000 UTC m=+6041.267840037" observedRunningTime="2026-02-02 12:11:56.854924732 +0000 UTC m=+6042.601025467" watchObservedRunningTime="2026-02-02 12:11:56.880846508 +0000 UTC m=+6042.626947243" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.886976 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-64775b7466-p6d8c" podStartSLOduration=3.137521128 podStartE2EDuration="9.886958451s" podCreationTimestamp="2026-02-02 12:11:47 +0000 UTC" firstStartedPulling="2026-02-02 12:11:48.860209825 +0000 UTC m=+6034.606310560" lastFinishedPulling="2026-02-02 12:11:55.609647148 +0000 UTC m=+6041.355747883" observedRunningTime="2026-02-02 12:11:56.885287464 +0000 UTC m=+6042.631388199" watchObservedRunningTime="2026-02-02 12:11:56.886958451 +0000 UTC m=+6042.633059186" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.946415 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5477bcbbcc-smk66" podStartSLOduration=2.602104835 podStartE2EDuration="11.946392698s" podCreationTimestamp="2026-02-02 12:11:45 +0000 UTC" firstStartedPulling="2026-02-02 12:11:46.141703025 +0000 UTC m=+6031.887803760" lastFinishedPulling="2026-02-02 12:11:55.485990888 +0000 UTC m=+6041.232091623" observedRunningTime="2026-02-02 12:11:56.943451575 +0000 UTC m=+6042.689552360" watchObservedRunningTime="2026-02-02 
12:11:56.946392698 +0000 UTC m=+6042.692493433" Feb 02 12:11:56 crc kubenswrapper[4909]: I0202 12:11:56.990871 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-74bd999885-xbm74" podStartSLOduration=3.08071573 podStartE2EDuration="11.99083517s" podCreationTimestamp="2026-02-02 12:11:45 +0000 UTC" firstStartedPulling="2026-02-02 12:11:46.570760893 +0000 UTC m=+6032.316861628" lastFinishedPulling="2026-02-02 12:11:55.480880333 +0000 UTC m=+6041.226981068" observedRunningTime="2026-02-02 12:11:56.963546705 +0000 UTC m=+6042.709647460" watchObservedRunningTime="2026-02-02 12:11:56.99083517 +0000 UTC m=+6042.736935905" Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.035649 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dc8b554-88ed-470a-b46a-3c0f611f75a8" path="/var/lib/kubelet/pods/5dc8b554-88ed-470a-b46a-3c0f611f75a8/volumes" Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.036586 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.036615 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.043886 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.049496 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.053141 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.053355 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.053628 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.160678 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcbkg\" (UniqueName: \"kubernetes.io/projected/74e16c15-dd31-4471-9b1d-80d20044c41a-kube-api-access-kcbkg\") pod \"glance-default-internal-api-0\" (UID: \"74e16c15-dd31-4471-9b1d-80d20044c41a\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.160739 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74e16c15-dd31-4471-9b1d-80d20044c41a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"74e16c15-dd31-4471-9b1d-80d20044c41a\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.160767 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e16c15-dd31-4471-9b1d-80d20044c41a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"74e16c15-dd31-4471-9b1d-80d20044c41a\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.160796 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/74e16c15-dd31-4471-9b1d-80d20044c41a-logs\") pod \"glance-default-internal-api-0\" (UID: \"74e16c15-dd31-4471-9b1d-80d20044c41a\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.161049 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74e16c15-dd31-4471-9b1d-80d20044c41a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"74e16c15-dd31-4471-9b1d-80d20044c41a\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.161114 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74e16c15-dd31-4471-9b1d-80d20044c41a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"74e16c15-dd31-4471-9b1d-80d20044c41a\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.161530 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e16c15-dd31-4471-9b1d-80d20044c41a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"74e16c15-dd31-4471-9b1d-80d20044c41a\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.207168 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.263742 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcbkg\" (UniqueName: \"kubernetes.io/projected/74e16c15-dd31-4471-9b1d-80d20044c41a-kube-api-access-kcbkg\") pod \"glance-default-internal-api-0\" (UID: \"74e16c15-dd31-4471-9b1d-80d20044c41a\") " 
pod="openstack/glance-default-internal-api-0" Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.264104 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74e16c15-dd31-4471-9b1d-80d20044c41a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"74e16c15-dd31-4471-9b1d-80d20044c41a\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.264245 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e16c15-dd31-4471-9b1d-80d20044c41a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"74e16c15-dd31-4471-9b1d-80d20044c41a\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.264284 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74e16c15-dd31-4471-9b1d-80d20044c41a-logs\") pod \"glance-default-internal-api-0\" (UID: \"74e16c15-dd31-4471-9b1d-80d20044c41a\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.264345 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74e16c15-dd31-4471-9b1d-80d20044c41a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"74e16c15-dd31-4471-9b1d-80d20044c41a\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.264367 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74e16c15-dd31-4471-9b1d-80d20044c41a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"74e16c15-dd31-4471-9b1d-80d20044c41a\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 
12:11:57.264522 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e16c15-dd31-4471-9b1d-80d20044c41a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"74e16c15-dd31-4471-9b1d-80d20044c41a\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.265201 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74e16c15-dd31-4471-9b1d-80d20044c41a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"74e16c15-dd31-4471-9b1d-80d20044c41a\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.265201 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74e16c15-dd31-4471-9b1d-80d20044c41a-logs\") pod \"glance-default-internal-api-0\" (UID: \"74e16c15-dd31-4471-9b1d-80d20044c41a\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.274068 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74e16c15-dd31-4471-9b1d-80d20044c41a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"74e16c15-dd31-4471-9b1d-80d20044c41a\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.275344 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e16c15-dd31-4471-9b1d-80d20044c41a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"74e16c15-dd31-4471-9b1d-80d20044c41a\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.277538 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/74e16c15-dd31-4471-9b1d-80d20044c41a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"74e16c15-dd31-4471-9b1d-80d20044c41a\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.290008 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcbkg\" (UniqueName: \"kubernetes.io/projected/74e16c15-dd31-4471-9b1d-80d20044c41a-kube-api-access-kcbkg\") pod \"glance-default-internal-api-0\" (UID: \"74e16c15-dd31-4471-9b1d-80d20044c41a\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.302454 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e16c15-dd31-4471-9b1d-80d20044c41a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"74e16c15-dd31-4471-9b1d-80d20044c41a\") " pod="openstack/glance-default-internal-api-0" Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.373365 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 02 12:11:57 crc kubenswrapper[4909]: I0202 12:11:57.920336 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b99d8fda-2d28-4e7b-9df5-e5bb8750e52a","Type":"ContainerStarted","Data":"dbf97bfb89830361357e05ad0861b5cba5b741ad1a3123247ae6ffe1ad3758f3"} Feb 02 12:11:58 crc kubenswrapper[4909]: I0202 12:11:58.099678 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 02 12:11:58 crc kubenswrapper[4909]: W0202 12:11:58.127353 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74e16c15_dd31_4471_9b1d_80d20044c41a.slice/crio-c1b8efd2f95a29f52a6a80af08b9bf49be35aff6c5a4a3c0072a90e060da94dd WatchSource:0}: Error finding container c1b8efd2f95a29f52a6a80af08b9bf49be35aff6c5a4a3c0072a90e060da94dd: Status 404 returned error can't find the container with id c1b8efd2f95a29f52a6a80af08b9bf49be35aff6c5a4a3c0072a90e060da94dd Feb 02 12:11:58 crc kubenswrapper[4909]: I0202 12:11:58.156880 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-57d579b9bb-2znmd" Feb 02 12:11:58 crc kubenswrapper[4909]: I0202 12:11:58.156921 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-57d579b9bb-2znmd" Feb 02 12:11:58 crc kubenswrapper[4909]: I0202 12:11:58.257379 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-64775b7466-p6d8c" Feb 02 12:11:58 crc kubenswrapper[4909]: I0202 12:11:58.257663 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-64775b7466-p6d8c" Feb 02 12:11:58 crc kubenswrapper[4909]: I0202 12:11:58.979409 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"b99d8fda-2d28-4e7b-9df5-e5bb8750e52a","Type":"ContainerStarted","Data":"eba95bbec9b943a89b1fb6a9080256c4d1af854eab53b6b0cc38c302edf0b551"} Feb 02 12:11:58 crc kubenswrapper[4909]: I0202 12:11:58.981767 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74e16c15-dd31-4471-9b1d-80d20044c41a","Type":"ContainerStarted","Data":"c1b8efd2f95a29f52a6a80af08b9bf49be35aff6c5a4a3c0072a90e060da94dd"} Feb 02 12:11:59 crc kubenswrapper[4909]: I0202 12:11:59.032120 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e06882f-c657-49fc-a5a1-0657121844f9" path="/var/lib/kubelet/pods/3e06882f-c657-49fc-a5a1-0657121844f9/volumes" Feb 02 12:11:59 crc kubenswrapper[4909]: I0202 12:11:59.992500 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b99d8fda-2d28-4e7b-9df5-e5bb8750e52a","Type":"ContainerStarted","Data":"71e5cd09b1b155adda6ebb1fc3810aa727340694ff7ab339baca8b2727355af1"} Feb 02 12:11:59 crc kubenswrapper[4909]: I0202 12:11:59.994162 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74e16c15-dd31-4471-9b1d-80d20044c41a","Type":"ContainerStarted","Data":"cc55b0cdcac27ade887a53bb133f7e49c84775777b377c78c968f1c4ee3a9ad4"} Feb 02 12:11:59 crc kubenswrapper[4909]: I0202 12:11:59.994208 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74e16c15-dd31-4471-9b1d-80d20044c41a","Type":"ContainerStarted","Data":"251db843f8d1c1b44d262b3213b3c412fcc14d3bcaa09f0b04d07eb68dd5f604"} Feb 02 12:12:00 crc kubenswrapper[4909]: I0202 12:12:00.021158 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.02113883 podStartE2EDuration="4.02113883s" podCreationTimestamp="2026-02-02 12:11:56 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:12:00.009900841 +0000 UTC m=+6045.756001586" watchObservedRunningTime="2026-02-02 12:12:00.02113883 +0000 UTC m=+6045.767239565" Feb 02 12:12:00 crc kubenswrapper[4909]: I0202 12:12:00.046097 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.046077718 podStartE2EDuration="4.046077718s" podCreationTimestamp="2026-02-02 12:11:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:12:00.035353944 +0000 UTC m=+6045.781454679" watchObservedRunningTime="2026-02-02 12:12:00.046077718 +0000 UTC m=+6045.792178453" Feb 02 12:12:05 crc kubenswrapper[4909]: I0202 12:12:05.597637 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5477bcbbcc-smk66" Feb 02 12:12:06 crc kubenswrapper[4909]: I0202 12:12:06.013933 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-74bd999885-xbm74" Feb 02 12:12:06 crc kubenswrapper[4909]: I0202 12:12:06.486098 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 02 12:12:06 crc kubenswrapper[4909]: I0202 12:12:06.486150 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 02 12:12:06 crc kubenswrapper[4909]: I0202 12:12:06.525178 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 02 12:12:06 crc kubenswrapper[4909]: I0202 12:12:06.534088 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 02 12:12:07 crc kubenswrapper[4909]: I0202 12:12:07.069462 4909 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 02 12:12:07 crc kubenswrapper[4909]: I0202 12:12:07.069785 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 02 12:12:07 crc kubenswrapper[4909]: I0202 12:12:07.374603 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 02 12:12:07 crc kubenswrapper[4909]: I0202 12:12:07.374646 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 02 12:12:07 crc kubenswrapper[4909]: I0202 12:12:07.409631 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 02 12:12:07 crc kubenswrapper[4909]: I0202 12:12:07.420779 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 02 12:12:08 crc kubenswrapper[4909]: I0202 12:12:08.077438 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 02 12:12:08 crc kubenswrapper[4909]: I0202 12:12:08.077730 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 02 12:12:08 crc kubenswrapper[4909]: I0202 12:12:08.146456 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-57d579b9bb-2znmd" podUID="ed357b84-9b8b-40ac-b2a9-4ae891e248a7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.114:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.114:8443: connect: connection refused" Feb 02 12:12:08 crc kubenswrapper[4909]: I0202 12:12:08.259630 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-64775b7466-p6d8c" podUID="20c51ec9-94ae-4b2a-b16e-15877df6b3e6" containerName="horizon" probeResult="failure" output="Get 
\"https://10.217.1.115:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.115:8443: connect: connection refused" Feb 02 12:12:09 crc kubenswrapper[4909]: I0202 12:12:09.089482 4909 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 12:12:09 crc kubenswrapper[4909]: I0202 12:12:09.089881 4909 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 12:12:09 crc kubenswrapper[4909]: I0202 12:12:09.213277 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 02 12:12:09 crc kubenswrapper[4909]: I0202 12:12:09.503616 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 02 12:12:10 crc kubenswrapper[4909]: I0202 12:12:10.562451 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 02 12:12:10 crc kubenswrapper[4909]: I0202 12:12:10.562897 4909 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 02 12:12:10 crc kubenswrapper[4909]: I0202 12:12:10.564304 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 02 12:12:20 crc kubenswrapper[4909]: I0202 12:12:20.078509 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-57d579b9bb-2znmd" Feb 02 12:12:20 crc kubenswrapper[4909]: I0202 12:12:20.342182 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-64775b7466-p6d8c" Feb 02 12:12:21 crc kubenswrapper[4909]: I0202 12:12:21.895958 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-57d579b9bb-2znmd" Feb 02 12:12:22 crc kubenswrapper[4909]: I0202 12:12:22.082622 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-64775b7466-p6d8c" 
Feb 02 12:12:22 crc kubenswrapper[4909]: I0202 12:12:22.138051 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-57d579b9bb-2znmd"] Feb 02 12:12:22 crc kubenswrapper[4909]: I0202 12:12:22.208913 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-57d579b9bb-2znmd" podUID="ed357b84-9b8b-40ac-b2a9-4ae891e248a7" containerName="horizon-log" containerID="cri-o://877bac677ebdad4604a78946e22260fc3fd3d3d9563461be6b0030f1cfd77b1a" gracePeriod=30 Feb 02 12:12:22 crc kubenswrapper[4909]: I0202 12:12:22.208987 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-57d579b9bb-2znmd" podUID="ed357b84-9b8b-40ac-b2a9-4ae891e248a7" containerName="horizon" containerID="cri-o://1540b67a373b365e32b8ba90ccc12d0000855bf43b0e5c3ab7fbe6997e859116" gracePeriod=30 Feb 02 12:12:25 crc kubenswrapper[4909]: I0202 12:12:25.759711 4909 scope.go:117] "RemoveContainer" containerID="aef0d7739f5850b7efa2c6deaa88d3aa390e0403fd0f633df0b8e60d8ff2d7ec" Feb 02 12:12:25 crc kubenswrapper[4909]: I0202 12:12:25.793943 4909 scope.go:117] "RemoveContainer" containerID="7cc5359348c3520e4ea084e1f1374a0dfb12683f12fec55c1aafc56234509af1" Feb 02 12:12:25 crc kubenswrapper[4909]: I0202 12:12:25.835302 4909 scope.go:117] "RemoveContainer" containerID="d04ce1909f7f75c4a6b91d6cba96d86f768ab4138ad910634c3ddafeb3c56643" Feb 02 12:12:25 crc kubenswrapper[4909]: I0202 12:12:25.863000 4909 scope.go:117] "RemoveContainer" containerID="a9eef1c1da5d8440a2dbdec197a5148ee2baf66e29aafecf11db339264067861" Feb 02 12:12:25 crc kubenswrapper[4909]: I0202 12:12:25.891162 4909 scope.go:117] "RemoveContainer" containerID="531041ee90724a687a125c32125e6673bc461283892efe90aa011da6456629ff" Feb 02 12:12:25 crc kubenswrapper[4909]: I0202 12:12:25.918646 4909 scope.go:117] "RemoveContainer" containerID="c3b9e512432981cec4bd914da4776c45148c84603f304325d6c664bad9fee04f" Feb 02 12:12:25 crc kubenswrapper[4909]: I0202 
12:12:25.978236 4909 scope.go:117] "RemoveContainer" containerID="4bc2a9d643c3553bf9a59f4de5e4577950491b2d51652c90ef473ee3e1c85397" Feb 02 12:12:26 crc kubenswrapper[4909]: I0202 12:12:26.045628 4909 scope.go:117] "RemoveContainer" containerID="3f83b39fe092d064c3cacacc4bf52521ca277d5da3f333ee14e7391c542a6793" Feb 02 12:12:26 crc kubenswrapper[4909]: I0202 12:12:26.265705 4909 generic.go:334] "Generic (PLEG): container finished" podID="ed357b84-9b8b-40ac-b2a9-4ae891e248a7" containerID="1540b67a373b365e32b8ba90ccc12d0000855bf43b0e5c3ab7fbe6997e859116" exitCode=0 Feb 02 12:12:26 crc kubenswrapper[4909]: I0202 12:12:26.265768 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57d579b9bb-2znmd" event={"ID":"ed357b84-9b8b-40ac-b2a9-4ae891e248a7","Type":"ContainerDied","Data":"1540b67a373b365e32b8ba90ccc12d0000855bf43b0e5c3ab7fbe6997e859116"} Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.278947 4909 generic.go:334] "Generic (PLEG): container finished" podID="78c1e92d-0b25-4b3f-9554-1e5427c36e97" containerID="96e735086c0c6df7af6c8606f7e182a4ab467f3a4db3f6f794b92cca05873bf7" exitCode=137 Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.279532 4909 generic.go:334] "Generic (PLEG): container finished" podID="78c1e92d-0b25-4b3f-9554-1e5427c36e97" containerID="15f89a865819b640d7ca1377e3b4971b1562d430744d75a274e1ecfd98268c27" exitCode=137 Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.279069 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5477bcbbcc-smk66" event={"ID":"78c1e92d-0b25-4b3f-9554-1e5427c36e97","Type":"ContainerDied","Data":"96e735086c0c6df7af6c8606f7e182a4ab467f3a4db3f6f794b92cca05873bf7"} Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.279616 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5477bcbbcc-smk66" 
event={"ID":"78c1e92d-0b25-4b3f-9554-1e5427c36e97","Type":"ContainerDied","Data":"15f89a865819b640d7ca1377e3b4971b1562d430744d75a274e1ecfd98268c27"} Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.279659 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5477bcbbcc-smk66" event={"ID":"78c1e92d-0b25-4b3f-9554-1e5427c36e97","Type":"ContainerDied","Data":"19f12c2fb15aa462b4602db6bd0504b905e8a68cec3ca389c4e097bfe9cf538f"} Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.279674 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19f12c2fb15aa462b4602db6bd0504b905e8a68cec3ca389c4e097bfe9cf538f" Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.283248 4909 generic.go:334] "Generic (PLEG): container finished" podID="5a78aee9-062e-458e-93b7-be8197c84610" containerID="a5f6c40bb50703e7f9592ea75c786975ed50dd7dfaf472060441f222b4e4001f" exitCode=137 Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.283275 4909 generic.go:334] "Generic (PLEG): container finished" podID="5a78aee9-062e-458e-93b7-be8197c84610" containerID="2338469d660dc8c860be39bbd8f8edd90d79068adb6e302000fefe580bfb913e" exitCode=137 Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.283296 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74bd999885-xbm74" event={"ID":"5a78aee9-062e-458e-93b7-be8197c84610","Type":"ContainerDied","Data":"a5f6c40bb50703e7f9592ea75c786975ed50dd7dfaf472060441f222b4e4001f"} Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.283315 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74bd999885-xbm74" event={"ID":"5a78aee9-062e-458e-93b7-be8197c84610","Type":"ContainerDied","Data":"2338469d660dc8c860be39bbd8f8edd90d79068adb6e302000fefe580bfb913e"} Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.341530 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5477bcbbcc-smk66" Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.362738 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-74bd999885-xbm74" Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.435489 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5a78aee9-062e-458e-93b7-be8197c84610-horizon-secret-key\") pod \"5a78aee9-062e-458e-93b7-be8197c84610\" (UID: \"5a78aee9-062e-458e-93b7-be8197c84610\") " Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.435565 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78c1e92d-0b25-4b3f-9554-1e5427c36e97-logs\") pod \"78c1e92d-0b25-4b3f-9554-1e5427c36e97\" (UID: \"78c1e92d-0b25-4b3f-9554-1e5427c36e97\") " Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.435585 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a78aee9-062e-458e-93b7-be8197c84610-logs\") pod \"5a78aee9-062e-458e-93b7-be8197c84610\" (UID: \"5a78aee9-062e-458e-93b7-be8197c84610\") " Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.435630 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78c1e92d-0b25-4b3f-9554-1e5427c36e97-scripts\") pod \"78c1e92d-0b25-4b3f-9554-1e5427c36e97\" (UID: \"78c1e92d-0b25-4b3f-9554-1e5427c36e97\") " Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.435741 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzqgc\" (UniqueName: \"kubernetes.io/projected/5a78aee9-062e-458e-93b7-be8197c84610-kube-api-access-rzqgc\") pod \"5a78aee9-062e-458e-93b7-be8197c84610\" (UID: \"5a78aee9-062e-458e-93b7-be8197c84610\") " Feb 02 
12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.435777 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78c1e92d-0b25-4b3f-9554-1e5427c36e97-config-data\") pod \"78c1e92d-0b25-4b3f-9554-1e5427c36e97\" (UID: \"78c1e92d-0b25-4b3f-9554-1e5427c36e97\") " Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.435795 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5a78aee9-062e-458e-93b7-be8197c84610-config-data\") pod \"5a78aee9-062e-458e-93b7-be8197c84610\" (UID: \"5a78aee9-062e-458e-93b7-be8197c84610\") " Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.435852 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/78c1e92d-0b25-4b3f-9554-1e5427c36e97-horizon-secret-key\") pod \"78c1e92d-0b25-4b3f-9554-1e5427c36e97\" (UID: \"78c1e92d-0b25-4b3f-9554-1e5427c36e97\") " Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.435883 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbppr\" (UniqueName: \"kubernetes.io/projected/78c1e92d-0b25-4b3f-9554-1e5427c36e97-kube-api-access-wbppr\") pod \"78c1e92d-0b25-4b3f-9554-1e5427c36e97\" (UID: \"78c1e92d-0b25-4b3f-9554-1e5427c36e97\") " Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.435924 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a78aee9-062e-458e-93b7-be8197c84610-scripts\") pod \"5a78aee9-062e-458e-93b7-be8197c84610\" (UID: \"5a78aee9-062e-458e-93b7-be8197c84610\") " Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.436036 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78c1e92d-0b25-4b3f-9554-1e5427c36e97-logs" (OuterVolumeSpecName: "logs") pod 
"78c1e92d-0b25-4b3f-9554-1e5427c36e97" (UID: "78c1e92d-0b25-4b3f-9554-1e5427c36e97"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.436166 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a78aee9-062e-458e-93b7-be8197c84610-logs" (OuterVolumeSpecName: "logs") pod "5a78aee9-062e-458e-93b7-be8197c84610" (UID: "5a78aee9-062e-458e-93b7-be8197c84610"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.436409 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78c1e92d-0b25-4b3f-9554-1e5427c36e97-logs\") on node \"crc\" DevicePath \"\"" Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.436424 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5a78aee9-062e-458e-93b7-be8197c84610-logs\") on node \"crc\" DevicePath \"\"" Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.443892 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78c1e92d-0b25-4b3f-9554-1e5427c36e97-kube-api-access-wbppr" (OuterVolumeSpecName: "kube-api-access-wbppr") pod "78c1e92d-0b25-4b3f-9554-1e5427c36e97" (UID: "78c1e92d-0b25-4b3f-9554-1e5427c36e97"). InnerVolumeSpecName "kube-api-access-wbppr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.446552 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a78aee9-062e-458e-93b7-be8197c84610-kube-api-access-rzqgc" (OuterVolumeSpecName: "kube-api-access-rzqgc") pod "5a78aee9-062e-458e-93b7-be8197c84610" (UID: "5a78aee9-062e-458e-93b7-be8197c84610"). InnerVolumeSpecName "kube-api-access-rzqgc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.447349 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78c1e92d-0b25-4b3f-9554-1e5427c36e97-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "78c1e92d-0b25-4b3f-9554-1e5427c36e97" (UID: "78c1e92d-0b25-4b3f-9554-1e5427c36e97"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.449275 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a78aee9-062e-458e-93b7-be8197c84610-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "5a78aee9-062e-458e-93b7-be8197c84610" (UID: "5a78aee9-062e-458e-93b7-be8197c84610"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.468284 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78c1e92d-0b25-4b3f-9554-1e5427c36e97-scripts" (OuterVolumeSpecName: "scripts") pod "78c1e92d-0b25-4b3f-9554-1e5427c36e97" (UID: "78c1e92d-0b25-4b3f-9554-1e5427c36e97"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.468367 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a78aee9-062e-458e-93b7-be8197c84610-config-data" (OuterVolumeSpecName: "config-data") pod "5a78aee9-062e-458e-93b7-be8197c84610" (UID: "5a78aee9-062e-458e-93b7-be8197c84610"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.471964 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a78aee9-062e-458e-93b7-be8197c84610-scripts" (OuterVolumeSpecName: "scripts") pod "5a78aee9-062e-458e-93b7-be8197c84610" (UID: "5a78aee9-062e-458e-93b7-be8197c84610"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.473259 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78c1e92d-0b25-4b3f-9554-1e5427c36e97-config-data" (OuterVolumeSpecName: "config-data") pod "78c1e92d-0b25-4b3f-9554-1e5427c36e97" (UID: "78c1e92d-0b25-4b3f-9554-1e5427c36e97"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.538003 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78c1e92d-0b25-4b3f-9554-1e5427c36e97-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.538041 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzqgc\" (UniqueName: \"kubernetes.io/projected/5a78aee9-062e-458e-93b7-be8197c84610-kube-api-access-rzqgc\") on node \"crc\" DevicePath \"\"" Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.538053 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/78c1e92d-0b25-4b3f-9554-1e5427c36e97-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.538063 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5a78aee9-062e-458e-93b7-be8197c84610-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:12:27 crc 
kubenswrapper[4909]: I0202 12:12:27.538072 4909 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/78c1e92d-0b25-4b3f-9554-1e5427c36e97-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.538080 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbppr\" (UniqueName: \"kubernetes.io/projected/78c1e92d-0b25-4b3f-9554-1e5427c36e97-kube-api-access-wbppr\") on node \"crc\" DevicePath \"\"" Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.538089 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a78aee9-062e-458e-93b7-be8197c84610-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:12:27 crc kubenswrapper[4909]: I0202 12:12:27.538099 4909 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5a78aee9-062e-458e-93b7-be8197c84610-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 02 12:12:28 crc kubenswrapper[4909]: I0202 12:12:28.146856 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-57d579b9bb-2znmd" podUID="ed357b84-9b8b-40ac-b2a9-4ae891e248a7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.114:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.114:8443: connect: connection refused" Feb 02 12:12:28 crc kubenswrapper[4909]: I0202 12:12:28.293396 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5477bcbbcc-smk66" Feb 02 12:12:28 crc kubenswrapper[4909]: I0202 12:12:28.299291 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-74bd999885-xbm74" Feb 02 12:12:28 crc kubenswrapper[4909]: I0202 12:12:28.302895 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74bd999885-xbm74" event={"ID":"5a78aee9-062e-458e-93b7-be8197c84610","Type":"ContainerDied","Data":"c38379070d6554a0bbfb570c9ae53af6d90d3c41f90bb6d07db82c9670f5a610"} Feb 02 12:12:28 crc kubenswrapper[4909]: I0202 12:12:28.302954 4909 scope.go:117] "RemoveContainer" containerID="a5f6c40bb50703e7f9592ea75c786975ed50dd7dfaf472060441f222b4e4001f" Feb 02 12:12:28 crc kubenswrapper[4909]: I0202 12:12:28.335193 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5477bcbbcc-smk66"] Feb 02 12:12:28 crc kubenswrapper[4909]: I0202 12:12:28.354146 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5477bcbbcc-smk66"] Feb 02 12:12:28 crc kubenswrapper[4909]: I0202 12:12:28.364995 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-74bd999885-xbm74"] Feb 02 12:12:28 crc kubenswrapper[4909]: I0202 12:12:28.375579 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-74bd999885-xbm74"] Feb 02 12:12:28 crc kubenswrapper[4909]: I0202 12:12:28.492766 4909 scope.go:117] "RemoveContainer" containerID="2338469d660dc8c860be39bbd8f8edd90d79068adb6e302000fefe580bfb913e" Feb 02 12:12:29 crc kubenswrapper[4909]: I0202 12:12:29.029465 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a78aee9-062e-458e-93b7-be8197c84610" path="/var/lib/kubelet/pods/5a78aee9-062e-458e-93b7-be8197c84610/volumes" Feb 02 12:12:29 crc kubenswrapper[4909]: I0202 12:12:29.030651 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78c1e92d-0b25-4b3f-9554-1e5427c36e97" path="/var/lib/kubelet/pods/78c1e92d-0b25-4b3f-9554-1e5427c36e97/volumes" Feb 02 12:12:30 crc kubenswrapper[4909]: I0202 12:12:30.040376 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/neutron-db-create-wv6jm"]
Feb 02 12:12:30 crc kubenswrapper[4909]: I0202 12:12:30.051140 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ff19-account-create-update-r4s5q"]
Feb 02 12:12:30 crc kubenswrapper[4909]: I0202 12:12:30.060018 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-ff19-account-create-update-r4s5q"]
Feb 02 12:12:30 crc kubenswrapper[4909]: I0202 12:12:30.067871 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-wv6jm"]
Feb 02 12:12:31 crc kubenswrapper[4909]: I0202 12:12:31.028130 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c842981-834f-414f-b66e-08d8f20a4b06" path="/var/lib/kubelet/pods/4c842981-834f-414f-b66e-08d8f20a4b06/volumes"
Feb 02 12:12:31 crc kubenswrapper[4909]: I0202 12:12:31.028758 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="749a32f4-4c60-4f2c-bc33-eb38e8eaddd6" path="/var/lib/kubelet/pods/749a32f4-4c60-4f2c-bc33-eb38e8eaddd6/volumes"
Feb 02 12:12:38 crc kubenswrapper[4909]: I0202 12:12:38.035026 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-n5npn"]
Feb 02 12:12:38 crc kubenswrapper[4909]: I0202 12:12:38.051004 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-n5npn"]
Feb 02 12:12:38 crc kubenswrapper[4909]: I0202 12:12:38.146021 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-57d579b9bb-2znmd" podUID="ed357b84-9b8b-40ac-b2a9-4ae891e248a7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.114:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.114:8443: connect: connection refused"
Feb 02 12:12:39 crc kubenswrapper[4909]: I0202 12:12:39.027402 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5164255-168d-4b6f-85de-d544b0b0642e" path="/var/lib/kubelet/pods/c5164255-168d-4b6f-85de-d544b0b0642e/volumes"
Feb 02 12:12:48 crc kubenswrapper[4909]: I0202 12:12:48.144917 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-57d579b9bb-2znmd" podUID="ed357b84-9b8b-40ac-b2a9-4ae891e248a7" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.114:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.114:8443: connect: connection refused"
Feb 02 12:12:48 crc kubenswrapper[4909]: I0202 12:12:48.145713 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-57d579b9bb-2znmd"
Feb 02 12:12:52 crc kubenswrapper[4909]: E0202 12:12:52.474028 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a78aee9_062e_458e_93b7_be8197c84610.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded357b84_9b8b_40ac_b2a9_4ae891e248a7.slice/crio-conmon-877bac677ebdad4604a78946e22260fc3fd3d3d9563461be6b0030f1cfd77b1a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a78aee9_062e_458e_93b7_be8197c84610.slice/crio-c38379070d6554a0bbfb570c9ae53af6d90d3c41f90bb6d07db82c9670f5a610\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a78aee9_062e_458e_93b7_be8197c84610.slice/crio-conmon-a5f6c40bb50703e7f9592ea75c786975ed50dd7dfaf472060441f222b4e4001f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a78aee9_062e_458e_93b7_be8197c84610.slice/crio-a5f6c40bb50703e7f9592ea75c786975ed50dd7dfaf472060441f222b4e4001f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a78aee9_062e_458e_93b7_be8197c84610.slice/crio-conmon-2338469d660dc8c860be39bbd8f8edd90d79068adb6e302000fefe580bfb913e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78c1e92d_0b25_4b3f_9554_1e5427c36e97.slice/crio-19f12c2fb15aa462b4602db6bd0504b905e8a68cec3ca389c4e097bfe9cf538f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78c1e92d_0b25_4b3f_9554_1e5427c36e97.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a78aee9_062e_458e_93b7_be8197c84610.slice/crio-2338469d660dc8c860be39bbd8f8edd90d79068adb6e302000fefe580bfb913e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded357b84_9b8b_40ac_b2a9_4ae891e248a7.slice/crio-877bac677ebdad4604a78946e22260fc3fd3d3d9563461be6b0030f1cfd77b1a.scope\": RecentStats: unable to find data in memory cache]"
Feb 02 12:12:52 crc kubenswrapper[4909]: I0202 12:12:52.493717 4909 generic.go:334] "Generic (PLEG): container finished" podID="ed357b84-9b8b-40ac-b2a9-4ae891e248a7" containerID="877bac677ebdad4604a78946e22260fc3fd3d3d9563461be6b0030f1cfd77b1a" exitCode=137
Feb 02 12:12:52 crc kubenswrapper[4909]: I0202 12:12:52.493760 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57d579b9bb-2znmd" event={"ID":"ed357b84-9b8b-40ac-b2a9-4ae891e248a7","Type":"ContainerDied","Data":"877bac677ebdad4604a78946e22260fc3fd3d3d9563461be6b0030f1cfd77b1a"}
Feb 02 12:12:52 crc kubenswrapper[4909]: I0202 12:12:52.620153 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57d579b9bb-2znmd"
Feb 02 12:12:52 crc kubenswrapper[4909]: I0202 12:12:52.664800 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-scripts\") pod \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\" (UID: \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\") "
Feb 02 12:12:52 crc kubenswrapper[4909]: I0202 12:12:52.665194 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-horizon-secret-key\") pod \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\" (UID: \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\") "
Feb 02 12:12:52 crc kubenswrapper[4909]: I0202 12:12:52.665228 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-horizon-tls-certs\") pod \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\" (UID: \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\") "
Feb 02 12:12:52 crc kubenswrapper[4909]: I0202 12:12:52.665329 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-combined-ca-bundle\") pod \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\" (UID: \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\") "
Feb 02 12:12:52 crc kubenswrapper[4909]: I0202 12:12:52.665736 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-config-data\") pod \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\" (UID: \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\") "
Feb 02 12:12:52 crc kubenswrapper[4909]: I0202 12:12:52.665782 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-logs\") pod \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\" (UID: \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\") "
Feb 02 12:12:52 crc kubenswrapper[4909]: I0202 12:12:52.665848 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvrz8\" (UniqueName: \"kubernetes.io/projected/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-kube-api-access-qvrz8\") pod \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\" (UID: \"ed357b84-9b8b-40ac-b2a9-4ae891e248a7\") "
Feb 02 12:12:52 crc kubenswrapper[4909]: I0202 12:12:52.669672 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-logs" (OuterVolumeSpecName: "logs") pod "ed357b84-9b8b-40ac-b2a9-4ae891e248a7" (UID: "ed357b84-9b8b-40ac-b2a9-4ae891e248a7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 12:12:52 crc kubenswrapper[4909]: I0202 12:12:52.670562 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ed357b84-9b8b-40ac-b2a9-4ae891e248a7" (UID: "ed357b84-9b8b-40ac-b2a9-4ae891e248a7"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 12:12:52 crc kubenswrapper[4909]: I0202 12:12:52.670965 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-kube-api-access-qvrz8" (OuterVolumeSpecName: "kube-api-access-qvrz8") pod "ed357b84-9b8b-40ac-b2a9-4ae891e248a7" (UID: "ed357b84-9b8b-40ac-b2a9-4ae891e248a7"). InnerVolumeSpecName "kube-api-access-qvrz8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 12:12:52 crc kubenswrapper[4909]: I0202 12:12:52.690173 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-scripts" (OuterVolumeSpecName: "scripts") pod "ed357b84-9b8b-40ac-b2a9-4ae891e248a7" (UID: "ed357b84-9b8b-40ac-b2a9-4ae891e248a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 12:12:52 crc kubenswrapper[4909]: I0202 12:12:52.690263 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-config-data" (OuterVolumeSpecName: "config-data") pod "ed357b84-9b8b-40ac-b2a9-4ae891e248a7" (UID: "ed357b84-9b8b-40ac-b2a9-4ae891e248a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 12:12:52 crc kubenswrapper[4909]: I0202 12:12:52.693302 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed357b84-9b8b-40ac-b2a9-4ae891e248a7" (UID: "ed357b84-9b8b-40ac-b2a9-4ae891e248a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 12:12:52 crc kubenswrapper[4909]: I0202 12:12:52.717692 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "ed357b84-9b8b-40ac-b2a9-4ae891e248a7" (UID: "ed357b84-9b8b-40ac-b2a9-4ae891e248a7"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 12:12:52 crc kubenswrapper[4909]: I0202 12:12:52.768446 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-config-data\") on node \"crc\" DevicePath \"\""
Feb 02 12:12:52 crc kubenswrapper[4909]: I0202 12:12:52.768486 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-logs\") on node \"crc\" DevicePath \"\""
Feb 02 12:12:52 crc kubenswrapper[4909]: I0202 12:12:52.768503 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvrz8\" (UniqueName: \"kubernetes.io/projected/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-kube-api-access-qvrz8\") on node \"crc\" DevicePath \"\""
Feb 02 12:12:52 crc kubenswrapper[4909]: I0202 12:12:52.768518 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-scripts\") on node \"crc\" DevicePath \"\""
Feb 02 12:12:52 crc kubenswrapper[4909]: I0202 12:12:52.768528 4909 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Feb 02 12:12:52 crc kubenswrapper[4909]: I0202 12:12:52.768539 4909 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-horizon-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 02 12:12:52 crc kubenswrapper[4909]: I0202 12:12:52.768549 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed357b84-9b8b-40ac-b2a9-4ae891e248a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 12:12:53 crc kubenswrapper[4909]: I0202 12:12:53.506025 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57d579b9bb-2znmd" event={"ID":"ed357b84-9b8b-40ac-b2a9-4ae891e248a7","Type":"ContainerDied","Data":"14e3b458796c0cf98273d3b185857c171f420f893ad770f6ca149d2f06c8048f"}
Feb 02 12:12:53 crc kubenswrapper[4909]: I0202 12:12:53.506093 4909 scope.go:117] "RemoveContainer" containerID="1540b67a373b365e32b8ba90ccc12d0000855bf43b0e5c3ab7fbe6997e859116"
Feb 02 12:12:53 crc kubenswrapper[4909]: I0202 12:12:53.506125 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57d579b9bb-2znmd"
Feb 02 12:12:53 crc kubenswrapper[4909]: I0202 12:12:53.543497 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-57d579b9bb-2znmd"]
Feb 02 12:12:53 crc kubenswrapper[4909]: I0202 12:12:53.556151 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-57d579b9bb-2znmd"]
Feb 02 12:12:53 crc kubenswrapper[4909]: I0202 12:12:53.688823 4909 scope.go:117] "RemoveContainer" containerID="877bac677ebdad4604a78946e22260fc3fd3d3d9563461be6b0030f1cfd77b1a"
Feb 02 12:12:55 crc kubenswrapper[4909]: I0202 12:12:55.041406 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed357b84-9b8b-40ac-b2a9-4ae891e248a7" path="/var/lib/kubelet/pods/ed357b84-9b8b-40ac-b2a9-4ae891e248a7/volumes"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.548893 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6d78c46cb4-52qcm"]
Feb 02 12:13:02 crc kubenswrapper[4909]: E0202 12:13:02.550091 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78c1e92d-0b25-4b3f-9554-1e5427c36e97" containerName="horizon-log"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.550110 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="78c1e92d-0b25-4b3f-9554-1e5427c36e97" containerName="horizon-log"
Feb 02 12:13:02 crc kubenswrapper[4909]: E0202 12:13:02.550140 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a78aee9-062e-458e-93b7-be8197c84610" containerName="horizon"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.550149 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a78aee9-062e-458e-93b7-be8197c84610" containerName="horizon"
Feb 02 12:13:02 crc kubenswrapper[4909]: E0202 12:13:02.550167 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a78aee9-062e-458e-93b7-be8197c84610" containerName="horizon-log"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.550175 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a78aee9-062e-458e-93b7-be8197c84610" containerName="horizon-log"
Feb 02 12:13:02 crc kubenswrapper[4909]: E0202 12:13:02.550192 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed357b84-9b8b-40ac-b2a9-4ae891e248a7" containerName="horizon"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.550200 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed357b84-9b8b-40ac-b2a9-4ae891e248a7" containerName="horizon"
Feb 02 12:13:02 crc kubenswrapper[4909]: E0202 12:13:02.550219 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78c1e92d-0b25-4b3f-9554-1e5427c36e97" containerName="horizon"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.550229 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="78c1e92d-0b25-4b3f-9554-1e5427c36e97" containerName="horizon"
Feb 02 12:13:02 crc kubenswrapper[4909]: E0202 12:13:02.550259 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed357b84-9b8b-40ac-b2a9-4ae891e248a7" containerName="horizon-log"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.550267 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed357b84-9b8b-40ac-b2a9-4ae891e248a7" containerName="horizon-log"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.550535 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="78c1e92d-0b25-4b3f-9554-1e5427c36e97" containerName="horizon-log"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.550555 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="78c1e92d-0b25-4b3f-9554-1e5427c36e97" containerName="horizon"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.550567 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed357b84-9b8b-40ac-b2a9-4ae891e248a7" containerName="horizon-log"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.550588 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed357b84-9b8b-40ac-b2a9-4ae891e248a7" containerName="horizon"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.550601 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a78aee9-062e-458e-93b7-be8197c84610" containerName="horizon"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.550612 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a78aee9-062e-458e-93b7-be8197c84610" containerName="horizon-log"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.552109 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d78c46cb4-52qcm"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.569504 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d78c46cb4-52qcm"]
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.665217 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/343841e4-be3f-444a-a51e-89d5aeb87fa0-horizon-secret-key\") pod \"horizon-6d78c46cb4-52qcm\" (UID: \"343841e4-be3f-444a-a51e-89d5aeb87fa0\") " pod="openstack/horizon-6d78c46cb4-52qcm"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.665280 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/343841e4-be3f-444a-a51e-89d5aeb87fa0-config-data\") pod \"horizon-6d78c46cb4-52qcm\" (UID: \"343841e4-be3f-444a-a51e-89d5aeb87fa0\") " pod="openstack/horizon-6d78c46cb4-52qcm"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.665311 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/343841e4-be3f-444a-a51e-89d5aeb87fa0-scripts\") pod \"horizon-6d78c46cb4-52qcm\" (UID: \"343841e4-be3f-444a-a51e-89d5aeb87fa0\") " pod="openstack/horizon-6d78c46cb4-52qcm"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.665332 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/343841e4-be3f-444a-a51e-89d5aeb87fa0-horizon-tls-certs\") pod \"horizon-6d78c46cb4-52qcm\" (UID: \"343841e4-be3f-444a-a51e-89d5aeb87fa0\") " pod="openstack/horizon-6d78c46cb4-52qcm"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.665350 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/343841e4-be3f-444a-a51e-89d5aeb87fa0-logs\") pod \"horizon-6d78c46cb4-52qcm\" (UID: \"343841e4-be3f-444a-a51e-89d5aeb87fa0\") " pod="openstack/horizon-6d78c46cb4-52qcm"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.665378 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz2jw\" (UniqueName: \"kubernetes.io/projected/343841e4-be3f-444a-a51e-89d5aeb87fa0-kube-api-access-qz2jw\") pod \"horizon-6d78c46cb4-52qcm\" (UID: \"343841e4-be3f-444a-a51e-89d5aeb87fa0\") " pod="openstack/horizon-6d78c46cb4-52qcm"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.665423 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/343841e4-be3f-444a-a51e-89d5aeb87fa0-combined-ca-bundle\") pod \"horizon-6d78c46cb4-52qcm\" (UID: \"343841e4-be3f-444a-a51e-89d5aeb87fa0\") " pod="openstack/horizon-6d78c46cb4-52qcm"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.767614 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/343841e4-be3f-444a-a51e-89d5aeb87fa0-horizon-secret-key\") pod \"horizon-6d78c46cb4-52qcm\" (UID: \"343841e4-be3f-444a-a51e-89d5aeb87fa0\") " pod="openstack/horizon-6d78c46cb4-52qcm"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.768005 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/343841e4-be3f-444a-a51e-89d5aeb87fa0-config-data\") pod \"horizon-6d78c46cb4-52qcm\" (UID: \"343841e4-be3f-444a-a51e-89d5aeb87fa0\") " pod="openstack/horizon-6d78c46cb4-52qcm"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.768059 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/343841e4-be3f-444a-a51e-89d5aeb87fa0-scripts\") pod \"horizon-6d78c46cb4-52qcm\" (UID: \"343841e4-be3f-444a-a51e-89d5aeb87fa0\") " pod="openstack/horizon-6d78c46cb4-52qcm"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.768093 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/343841e4-be3f-444a-a51e-89d5aeb87fa0-horizon-tls-certs\") pod \"horizon-6d78c46cb4-52qcm\" (UID: \"343841e4-be3f-444a-a51e-89d5aeb87fa0\") " pod="openstack/horizon-6d78c46cb4-52qcm"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.768112 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/343841e4-be3f-444a-a51e-89d5aeb87fa0-logs\") pod \"horizon-6d78c46cb4-52qcm\" (UID: \"343841e4-be3f-444a-a51e-89d5aeb87fa0\") " pod="openstack/horizon-6d78c46cb4-52qcm"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.768136 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz2jw\" (UniqueName: \"kubernetes.io/projected/343841e4-be3f-444a-a51e-89d5aeb87fa0-kube-api-access-qz2jw\") pod \"horizon-6d78c46cb4-52qcm\" (UID: \"343841e4-be3f-444a-a51e-89d5aeb87fa0\") " pod="openstack/horizon-6d78c46cb4-52qcm"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.768196 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/343841e4-be3f-444a-a51e-89d5aeb87fa0-combined-ca-bundle\") pod \"horizon-6d78c46cb4-52qcm\" (UID: \"343841e4-be3f-444a-a51e-89d5aeb87fa0\") " pod="openstack/horizon-6d78c46cb4-52qcm"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.769650 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/343841e4-be3f-444a-a51e-89d5aeb87fa0-scripts\") pod \"horizon-6d78c46cb4-52qcm\" (UID: \"343841e4-be3f-444a-a51e-89d5aeb87fa0\") " pod="openstack/horizon-6d78c46cb4-52qcm"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.769732 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/343841e4-be3f-444a-a51e-89d5aeb87fa0-logs\") pod \"horizon-6d78c46cb4-52qcm\" (UID: \"343841e4-be3f-444a-a51e-89d5aeb87fa0\") " pod="openstack/horizon-6d78c46cb4-52qcm"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.770124 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/343841e4-be3f-444a-a51e-89d5aeb87fa0-config-data\") pod \"horizon-6d78c46cb4-52qcm\" (UID: \"343841e4-be3f-444a-a51e-89d5aeb87fa0\") " pod="openstack/horizon-6d78c46cb4-52qcm"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.774656 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/343841e4-be3f-444a-a51e-89d5aeb87fa0-horizon-tls-certs\") pod \"horizon-6d78c46cb4-52qcm\" (UID: \"343841e4-be3f-444a-a51e-89d5aeb87fa0\") " pod="openstack/horizon-6d78c46cb4-52qcm"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.775371 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/343841e4-be3f-444a-a51e-89d5aeb87fa0-combined-ca-bundle\") pod \"horizon-6d78c46cb4-52qcm\" (UID: \"343841e4-be3f-444a-a51e-89d5aeb87fa0\") " pod="openstack/horizon-6d78c46cb4-52qcm"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.776623 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/343841e4-be3f-444a-a51e-89d5aeb87fa0-horizon-secret-key\") pod \"horizon-6d78c46cb4-52qcm\" (UID: \"343841e4-be3f-444a-a51e-89d5aeb87fa0\") " pod="openstack/horizon-6d78c46cb4-52qcm"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.798389 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz2jw\" (UniqueName: \"kubernetes.io/projected/343841e4-be3f-444a-a51e-89d5aeb87fa0-kube-api-access-qz2jw\") pod \"horizon-6d78c46cb4-52qcm\" (UID: \"343841e4-be3f-444a-a51e-89d5aeb87fa0\") " pod="openstack/horizon-6d78c46cb4-52qcm"
Feb 02 12:13:02 crc kubenswrapper[4909]: I0202 12:13:02.888412 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d78c46cb4-52qcm"
Feb 02 12:13:03 crc kubenswrapper[4909]: I0202 12:13:03.436006 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d78c46cb4-52qcm"]
Feb 02 12:13:03 crc kubenswrapper[4909]: I0202 12:13:03.600554 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d78c46cb4-52qcm" event={"ID":"343841e4-be3f-444a-a51e-89d5aeb87fa0","Type":"ContainerStarted","Data":"93c3bd8db6d27a42a6eeb97184b3778423ed551292811613b42be8a4ff33680a"}
Feb 02 12:13:04 crc kubenswrapper[4909]: I0202 12:13:04.043344 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-jfzjt"]
Feb 02 12:13:04 crc kubenswrapper[4909]: I0202 12:13:04.045473 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-jfzjt"
Feb 02 12:13:04 crc kubenswrapper[4909]: I0202 12:13:04.052041 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-jfzjt"]
Feb 02 12:13:04 crc kubenswrapper[4909]: I0202 12:13:04.147602 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-f0d4-account-create-update-7f2tp"]
Feb 02 12:13:04 crc kubenswrapper[4909]: I0202 12:13:04.149614 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-f0d4-account-create-update-7f2tp"
Feb 02 12:13:04 crc kubenswrapper[4909]: I0202 12:13:04.152558 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret"
Feb 02 12:13:04 crc kubenswrapper[4909]: I0202 12:13:04.163625 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-f0d4-account-create-update-7f2tp"]
Feb 02 12:13:04 crc kubenswrapper[4909]: I0202 12:13:04.198234 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnrsc\" (UniqueName: \"kubernetes.io/projected/c503a762-1f95-4131-8fca-dfbced528ce4-kube-api-access-tnrsc\") pod \"heat-db-create-jfzjt\" (UID: \"c503a762-1f95-4131-8fca-dfbced528ce4\") " pod="openstack/heat-db-create-jfzjt"
Feb 02 12:13:04 crc kubenswrapper[4909]: I0202 12:13:04.198332 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c503a762-1f95-4131-8fca-dfbced528ce4-operator-scripts\") pod \"heat-db-create-jfzjt\" (UID: \"c503a762-1f95-4131-8fca-dfbced528ce4\") " pod="openstack/heat-db-create-jfzjt"
Feb 02 12:13:04 crc kubenswrapper[4909]: I0202 12:13:04.299947 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c503a762-1f95-4131-8fca-dfbced528ce4-operator-scripts\") pod \"heat-db-create-jfzjt\" (UID: \"c503a762-1f95-4131-8fca-dfbced528ce4\") " pod="openstack/heat-db-create-jfzjt"
Feb 02 12:13:04 crc kubenswrapper[4909]: I0202 12:13:04.300112 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz262\" (UniqueName: \"kubernetes.io/projected/6828af76-a320-4845-9d2f-584f161f5c8d-kube-api-access-sz262\") pod \"heat-f0d4-account-create-update-7f2tp\" (UID: \"6828af76-a320-4845-9d2f-584f161f5c8d\") " pod="openstack/heat-f0d4-account-create-update-7f2tp"
Feb 02 12:13:04 crc kubenswrapper[4909]: I0202 12:13:04.300233 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnrsc\" (UniqueName: \"kubernetes.io/projected/c503a762-1f95-4131-8fca-dfbced528ce4-kube-api-access-tnrsc\") pod \"heat-db-create-jfzjt\" (UID: \"c503a762-1f95-4131-8fca-dfbced528ce4\") " pod="openstack/heat-db-create-jfzjt"
Feb 02 12:13:04 crc kubenswrapper[4909]: I0202 12:13:04.300260 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6828af76-a320-4845-9d2f-584f161f5c8d-operator-scripts\") pod \"heat-f0d4-account-create-update-7f2tp\" (UID: \"6828af76-a320-4845-9d2f-584f161f5c8d\") " pod="openstack/heat-f0d4-account-create-update-7f2tp"
Feb 02 12:13:04 crc kubenswrapper[4909]: I0202 12:13:04.301764 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c503a762-1f95-4131-8fca-dfbced528ce4-operator-scripts\") pod \"heat-db-create-jfzjt\" (UID: \"c503a762-1f95-4131-8fca-dfbced528ce4\") " pod="openstack/heat-db-create-jfzjt"
Feb 02 12:13:04 crc kubenswrapper[4909]: I0202 12:13:04.327538 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnrsc\" (UniqueName: \"kubernetes.io/projected/c503a762-1f95-4131-8fca-dfbced528ce4-kube-api-access-tnrsc\") pod \"heat-db-create-jfzjt\" (UID: \"c503a762-1f95-4131-8fca-dfbced528ce4\") " pod="openstack/heat-db-create-jfzjt"
Feb 02 12:13:04 crc kubenswrapper[4909]: I0202 12:13:04.383089 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-jfzjt"
Feb 02 12:13:04 crc kubenswrapper[4909]: I0202 12:13:04.402196 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6828af76-a320-4845-9d2f-584f161f5c8d-operator-scripts\") pod \"heat-f0d4-account-create-update-7f2tp\" (UID: \"6828af76-a320-4845-9d2f-584f161f5c8d\") " pod="openstack/heat-f0d4-account-create-update-7f2tp"
Feb 02 12:13:04 crc kubenswrapper[4909]: I0202 12:13:04.402369 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz262\" (UniqueName: \"kubernetes.io/projected/6828af76-a320-4845-9d2f-584f161f5c8d-kube-api-access-sz262\") pod \"heat-f0d4-account-create-update-7f2tp\" (UID: \"6828af76-a320-4845-9d2f-584f161f5c8d\") " pod="openstack/heat-f0d4-account-create-update-7f2tp"
Feb 02 12:13:04 crc kubenswrapper[4909]: I0202 12:13:04.404405 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6828af76-a320-4845-9d2f-584f161f5c8d-operator-scripts\") pod \"heat-f0d4-account-create-update-7f2tp\" (UID: \"6828af76-a320-4845-9d2f-584f161f5c8d\") " pod="openstack/heat-f0d4-account-create-update-7f2tp"
Feb 02 12:13:04 crc kubenswrapper[4909]: I0202 12:13:04.421211 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz262\" (UniqueName: \"kubernetes.io/projected/6828af76-a320-4845-9d2f-584f161f5c8d-kube-api-access-sz262\") pod \"heat-f0d4-account-create-update-7f2tp\" (UID: \"6828af76-a320-4845-9d2f-584f161f5c8d\") " pod="openstack/heat-f0d4-account-create-update-7f2tp"
Feb 02 12:13:04 crc kubenswrapper[4909]: I0202 12:13:04.471049 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-f0d4-account-create-update-7f2tp"
Feb 02 12:13:04 crc kubenswrapper[4909]: I0202 12:13:04.618643 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d78c46cb4-52qcm" event={"ID":"343841e4-be3f-444a-a51e-89d5aeb87fa0","Type":"ContainerStarted","Data":"19d64041b5f8f056a89be447fa99ad7ba38a3be62978c4e6fbb8cb0f3a56c174"}
Feb 02 12:13:04 crc kubenswrapper[4909]: I0202 12:13:04.618981 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d78c46cb4-52qcm" event={"ID":"343841e4-be3f-444a-a51e-89d5aeb87fa0","Type":"ContainerStarted","Data":"84c6d47d45a1000299409fbbf6e6b7451e01560566a5739593b930e8fe51e5a9"}
Feb 02 12:13:04 crc kubenswrapper[4909]: I0202 12:13:04.660217 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6d78c46cb4-52qcm" podStartSLOduration=2.66020155 podStartE2EDuration="2.66020155s" podCreationTimestamp="2026-02-02 12:13:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:13:04.644736951 +0000 UTC m=+6110.390837686" watchObservedRunningTime="2026-02-02 12:13:04.66020155 +0000 UTC m=+6110.406302285"
Feb 02 12:13:04 crc kubenswrapper[4909]: I0202 12:13:04.906633 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-jfzjt"]
Feb 02 12:13:05 crc kubenswrapper[4909]: I0202 12:13:05.003780 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-f0d4-account-create-update-7f2tp"]
Feb 02 12:13:05 crc kubenswrapper[4909]: I0202 12:13:05.628157 4909 generic.go:334] "Generic (PLEG): container finished" podID="6828af76-a320-4845-9d2f-584f161f5c8d" containerID="8044a9e89bcc0f2f50dcbd6bfcc65e225cba711291d3258c9425f7367a790d75" exitCode=0
Feb 02 12:13:05 crc kubenswrapper[4909]: I0202 12:13:05.628227 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-f0d4-account-create-update-7f2tp" event={"ID":"6828af76-a320-4845-9d2f-584f161f5c8d","Type":"ContainerDied","Data":"8044a9e89bcc0f2f50dcbd6bfcc65e225cba711291d3258c9425f7367a790d75"}
Feb 02 12:13:05 crc kubenswrapper[4909]: I0202 12:13:05.628256 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-f0d4-account-create-update-7f2tp" event={"ID":"6828af76-a320-4845-9d2f-584f161f5c8d","Type":"ContainerStarted","Data":"4281ae49841f9f72e983b733727194594b2297b8aadb15d07e131a92d47e570f"}
Feb 02 12:13:05 crc kubenswrapper[4909]: I0202 12:13:05.631116 4909 generic.go:334] "Generic (PLEG): container finished" podID="c503a762-1f95-4131-8fca-dfbced528ce4" containerID="810f86a4485c966a50ea2804a5d809ff2f855f29e1333d6e8a7636a9e60855b3" exitCode=0
Feb 02 12:13:05 crc kubenswrapper[4909]: I0202 12:13:05.631154 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-jfzjt" event={"ID":"c503a762-1f95-4131-8fca-dfbced528ce4","Type":"ContainerDied","Data":"810f86a4485c966a50ea2804a5d809ff2f855f29e1333d6e8a7636a9e60855b3"}
Feb 02 12:13:05 crc kubenswrapper[4909]: I0202 12:13:05.631177 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-jfzjt" event={"ID":"c503a762-1f95-4131-8fca-dfbced528ce4","Type":"ContainerStarted","Data":"d7606078efa5332fb53bf546bb789453e637b5324081c5d8c5a96ea0b9e17b50"}
Feb 02 12:13:07 crc kubenswrapper[4909]: I0202 12:13:07.114633 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-jfzjt"
Feb 02 12:13:07 crc kubenswrapper[4909]: I0202 12:13:07.131543 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-f0d4-account-create-update-7f2tp"
Feb 02 12:13:07 crc kubenswrapper[4909]: I0202 12:13:07.263714 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6828af76-a320-4845-9d2f-584f161f5c8d-operator-scripts\") pod \"6828af76-a320-4845-9d2f-584f161f5c8d\" (UID: \"6828af76-a320-4845-9d2f-584f161f5c8d\") "
Feb 02 12:13:07 crc kubenswrapper[4909]: I0202 12:13:07.263770 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz262\" (UniqueName: \"kubernetes.io/projected/6828af76-a320-4845-9d2f-584f161f5c8d-kube-api-access-sz262\") pod \"6828af76-a320-4845-9d2f-584f161f5c8d\" (UID: \"6828af76-a320-4845-9d2f-584f161f5c8d\") "
Feb 02 12:13:07 crc kubenswrapper[4909]: I0202 12:13:07.263803 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnrsc\" (UniqueName: \"kubernetes.io/projected/c503a762-1f95-4131-8fca-dfbced528ce4-kube-api-access-tnrsc\") pod \"c503a762-1f95-4131-8fca-dfbced528ce4\" (UID: \"c503a762-1f95-4131-8fca-dfbced528ce4\") "
Feb 02 12:13:07 crc kubenswrapper[4909]: I0202 12:13:07.264125 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c503a762-1f95-4131-8fca-dfbced528ce4-operator-scripts\") pod \"c503a762-1f95-4131-8fca-dfbced528ce4\" (UID: \"c503a762-1f95-4131-8fca-dfbced528ce4\") "
Feb 02 12:13:07 crc kubenswrapper[4909]: I0202 12:13:07.265095 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c503a762-1f95-4131-8fca-dfbced528ce4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c503a762-1f95-4131-8fca-dfbced528ce4" (UID: "c503a762-1f95-4131-8fca-dfbced528ce4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 12:13:07 crc kubenswrapper[4909]: I0202 12:13:07.265538 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6828af76-a320-4845-9d2f-584f161f5c8d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6828af76-a320-4845-9d2f-584f161f5c8d" (UID: "6828af76-a320-4845-9d2f-584f161f5c8d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 02 12:13:07 crc kubenswrapper[4909]: I0202 12:13:07.271450 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6828af76-a320-4845-9d2f-584f161f5c8d-kube-api-access-sz262" (OuterVolumeSpecName: "kube-api-access-sz262") pod "6828af76-a320-4845-9d2f-584f161f5c8d" (UID: "6828af76-a320-4845-9d2f-584f161f5c8d"). InnerVolumeSpecName "kube-api-access-sz262". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 12:13:07 crc kubenswrapper[4909]: I0202 12:13:07.272907 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c503a762-1f95-4131-8fca-dfbced528ce4-kube-api-access-tnrsc" (OuterVolumeSpecName: "kube-api-access-tnrsc") pod "c503a762-1f95-4131-8fca-dfbced528ce4" (UID: "c503a762-1f95-4131-8fca-dfbced528ce4"). InnerVolumeSpecName "kube-api-access-tnrsc".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:13:07 crc kubenswrapper[4909]: I0202 12:13:07.366756 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c503a762-1f95-4131-8fca-dfbced528ce4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:13:07 crc kubenswrapper[4909]: I0202 12:13:07.366795 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6828af76-a320-4845-9d2f-584f161f5c8d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:13:07 crc kubenswrapper[4909]: I0202 12:13:07.366813 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sz262\" (UniqueName: \"kubernetes.io/projected/6828af76-a320-4845-9d2f-584f161f5c8d-kube-api-access-sz262\") on node \"crc\" DevicePath \"\"" Feb 02 12:13:07 crc kubenswrapper[4909]: I0202 12:13:07.366842 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnrsc\" (UniqueName: \"kubernetes.io/projected/c503a762-1f95-4131-8fca-dfbced528ce4-kube-api-access-tnrsc\") on node \"crc\" DevicePath \"\"" Feb 02 12:13:07 crc kubenswrapper[4909]: I0202 12:13:07.650373 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-f0d4-account-create-update-7f2tp" event={"ID":"6828af76-a320-4845-9d2f-584f161f5c8d","Type":"ContainerDied","Data":"4281ae49841f9f72e983b733727194594b2297b8aadb15d07e131a92d47e570f"} Feb 02 12:13:07 crc kubenswrapper[4909]: I0202 12:13:07.650437 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-f0d4-account-create-update-7f2tp" Feb 02 12:13:07 crc kubenswrapper[4909]: I0202 12:13:07.650441 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4281ae49841f9f72e983b733727194594b2297b8aadb15d07e131a92d47e570f" Feb 02 12:13:07 crc kubenswrapper[4909]: I0202 12:13:07.652019 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-jfzjt" event={"ID":"c503a762-1f95-4131-8fca-dfbced528ce4","Type":"ContainerDied","Data":"d7606078efa5332fb53bf546bb789453e637b5324081c5d8c5a96ea0b9e17b50"} Feb 02 12:13:07 crc kubenswrapper[4909]: I0202 12:13:07.652052 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7606078efa5332fb53bf546bb789453e637b5324081c5d8c5a96ea0b9e17b50" Feb 02 12:13:07 crc kubenswrapper[4909]: I0202 12:13:07.652134 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-jfzjt" Feb 02 12:13:09 crc kubenswrapper[4909]: I0202 12:13:09.233943 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-cs882"] Feb 02 12:13:09 crc kubenswrapper[4909]: E0202 12:13:09.234626 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6828af76-a320-4845-9d2f-584f161f5c8d" containerName="mariadb-account-create-update" Feb 02 12:13:09 crc kubenswrapper[4909]: I0202 12:13:09.234647 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6828af76-a320-4845-9d2f-584f161f5c8d" containerName="mariadb-account-create-update" Feb 02 12:13:09 crc kubenswrapper[4909]: E0202 12:13:09.234724 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c503a762-1f95-4131-8fca-dfbced528ce4" containerName="mariadb-database-create" Feb 02 12:13:09 crc kubenswrapper[4909]: I0202 12:13:09.234731 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="c503a762-1f95-4131-8fca-dfbced528ce4" containerName="mariadb-database-create" Feb 02 
12:13:09 crc kubenswrapper[4909]: I0202 12:13:09.234985 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="6828af76-a320-4845-9d2f-584f161f5c8d" containerName="mariadb-account-create-update" Feb 02 12:13:09 crc kubenswrapper[4909]: I0202 12:13:09.235001 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="c503a762-1f95-4131-8fca-dfbced528ce4" containerName="mariadb-database-create" Feb 02 12:13:09 crc kubenswrapper[4909]: I0202 12:13:09.235916 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-cs882" Feb 02 12:13:09 crc kubenswrapper[4909]: I0202 12:13:09.238982 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-52pqf" Feb 02 12:13:09 crc kubenswrapper[4909]: I0202 12:13:09.239069 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 02 12:13:09 crc kubenswrapper[4909]: I0202 12:13:09.251491 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-cs882"] Feb 02 12:13:09 crc kubenswrapper[4909]: I0202 12:13:09.304722 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e-config-data\") pod \"heat-db-sync-cs882\" (UID: \"7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e\") " pod="openstack/heat-db-sync-cs882" Feb 02 12:13:09 crc kubenswrapper[4909]: I0202 12:13:09.304859 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e-combined-ca-bundle\") pod \"heat-db-sync-cs882\" (UID: \"7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e\") " pod="openstack/heat-db-sync-cs882" Feb 02 12:13:09 crc kubenswrapper[4909]: I0202 12:13:09.304884 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-2v8g5\" (UniqueName: \"kubernetes.io/projected/7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e-kube-api-access-2v8g5\") pod \"heat-db-sync-cs882\" (UID: \"7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e\") " pod="openstack/heat-db-sync-cs882" Feb 02 12:13:09 crc kubenswrapper[4909]: I0202 12:13:09.406386 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e-config-data\") pod \"heat-db-sync-cs882\" (UID: \"7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e\") " pod="openstack/heat-db-sync-cs882" Feb 02 12:13:09 crc kubenswrapper[4909]: I0202 12:13:09.406508 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e-combined-ca-bundle\") pod \"heat-db-sync-cs882\" (UID: \"7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e\") " pod="openstack/heat-db-sync-cs882" Feb 02 12:13:09 crc kubenswrapper[4909]: I0202 12:13:09.406538 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v8g5\" (UniqueName: \"kubernetes.io/projected/7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e-kube-api-access-2v8g5\") pod \"heat-db-sync-cs882\" (UID: \"7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e\") " pod="openstack/heat-db-sync-cs882" Feb 02 12:13:09 crc kubenswrapper[4909]: I0202 12:13:09.427043 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e-config-data\") pod \"heat-db-sync-cs882\" (UID: \"7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e\") " pod="openstack/heat-db-sync-cs882" Feb 02 12:13:09 crc kubenswrapper[4909]: I0202 12:13:09.427369 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e-combined-ca-bundle\") pod 
\"heat-db-sync-cs882\" (UID: \"7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e\") " pod="openstack/heat-db-sync-cs882" Feb 02 12:13:09 crc kubenswrapper[4909]: I0202 12:13:09.442641 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v8g5\" (UniqueName: \"kubernetes.io/projected/7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e-kube-api-access-2v8g5\") pod \"heat-db-sync-cs882\" (UID: \"7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e\") " pod="openstack/heat-db-sync-cs882" Feb 02 12:13:09 crc kubenswrapper[4909]: I0202 12:13:09.553165 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-cs882" Feb 02 12:13:10 crc kubenswrapper[4909]: I0202 12:13:10.134668 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-cs882"] Feb 02 12:13:10 crc kubenswrapper[4909]: W0202 12:13:10.137506 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a1e5bfe_7920_4cd4_84ea_2ff4bf7b8b0e.slice/crio-d36bd363a8385862f9fc60ba3bdc888c035f884ebc5c66ef45c01d87fb7fa9e3 WatchSource:0}: Error finding container d36bd363a8385862f9fc60ba3bdc888c035f884ebc5c66ef45c01d87fb7fa9e3: Status 404 returned error can't find the container with id d36bd363a8385862f9fc60ba3bdc888c035f884ebc5c66ef45c01d87fb7fa9e3 Feb 02 12:13:10 crc kubenswrapper[4909]: I0202 12:13:10.680438 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-cs882" event={"ID":"7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e","Type":"ContainerStarted","Data":"d36bd363a8385862f9fc60ba3bdc888c035f884ebc5c66ef45c01d87fb7fa9e3"} Feb 02 12:13:12 crc kubenswrapper[4909]: I0202 12:13:12.889207 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6d78c46cb4-52qcm" Feb 02 12:13:12 crc kubenswrapper[4909]: I0202 12:13:12.889581 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/horizon-6d78c46cb4-52qcm" Feb 02 12:13:16 crc kubenswrapper[4909]: I0202 12:13:16.741345 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-cs882" event={"ID":"7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e","Type":"ContainerStarted","Data":"58411504fe3d8433ec2120550ba87a3e8c64bf94853cfdb54f5b27484ebcf7ee"} Feb 02 12:13:16 crc kubenswrapper[4909]: I0202 12:13:16.769454 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-cs882" podStartSLOduration=1.760969442 podStartE2EDuration="7.769436623s" podCreationTimestamp="2026-02-02 12:13:09 +0000 UTC" firstStartedPulling="2026-02-02 12:13:10.139732929 +0000 UTC m=+6115.885833674" lastFinishedPulling="2026-02-02 12:13:16.14820012 +0000 UTC m=+6121.894300855" observedRunningTime="2026-02-02 12:13:16.760203861 +0000 UTC m=+6122.506304596" watchObservedRunningTime="2026-02-02 12:13:16.769436623 +0000 UTC m=+6122.515537358" Feb 02 12:13:18 crc kubenswrapper[4909]: I0202 12:13:18.762324 4909 generic.go:334] "Generic (PLEG): container finished" podID="7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e" containerID="58411504fe3d8433ec2120550ba87a3e8c64bf94853cfdb54f5b27484ebcf7ee" exitCode=0 Feb 02 12:13:18 crc kubenswrapper[4909]: I0202 12:13:18.762397 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-cs882" event={"ID":"7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e","Type":"ContainerDied","Data":"58411504fe3d8433ec2120550ba87a3e8c64bf94853cfdb54f5b27484ebcf7ee"} Feb 02 12:13:19 crc kubenswrapper[4909]: I0202 12:13:19.510790 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:13:19 crc kubenswrapper[4909]: I0202 12:13:19.510915 4909 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:13:20 crc kubenswrapper[4909]: I0202 12:13:20.124651 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-cs882" Feb 02 12:13:20 crc kubenswrapper[4909]: I0202 12:13:20.238695 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v8g5\" (UniqueName: \"kubernetes.io/projected/7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e-kube-api-access-2v8g5\") pod \"7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e\" (UID: \"7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e\") " Feb 02 12:13:20 crc kubenswrapper[4909]: I0202 12:13:20.238918 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e-config-data\") pod \"7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e\" (UID: \"7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e\") " Feb 02 12:13:20 crc kubenswrapper[4909]: I0202 12:13:20.239058 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e-combined-ca-bundle\") pod \"7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e\" (UID: \"7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e\") " Feb 02 12:13:20 crc kubenswrapper[4909]: I0202 12:13:20.245388 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e-kube-api-access-2v8g5" (OuterVolumeSpecName: "kube-api-access-2v8g5") pod "7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e" (UID: "7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e"). InnerVolumeSpecName "kube-api-access-2v8g5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:13:20 crc kubenswrapper[4909]: I0202 12:13:20.273480 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e" (UID: "7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:13:20 crc kubenswrapper[4909]: I0202 12:13:20.315059 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e-config-data" (OuterVolumeSpecName: "config-data") pod "7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e" (UID: "7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:13:20 crc kubenswrapper[4909]: I0202 12:13:20.341154 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:13:20 crc kubenswrapper[4909]: I0202 12:13:20.341188 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:13:20 crc kubenswrapper[4909]: I0202 12:13:20.341203 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2v8g5\" (UniqueName: \"kubernetes.io/projected/7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e-kube-api-access-2v8g5\") on node \"crc\" DevicePath \"\"" Feb 02 12:13:20 crc kubenswrapper[4909]: I0202 12:13:20.780564 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-cs882" 
event={"ID":"7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e","Type":"ContainerDied","Data":"d36bd363a8385862f9fc60ba3bdc888c035f884ebc5c66ef45c01d87fb7fa9e3"} Feb 02 12:13:20 crc kubenswrapper[4909]: I0202 12:13:20.780602 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d36bd363a8385862f9fc60ba3bdc888c035f884ebc5c66ef45c01d87fb7fa9e3" Feb 02 12:13:20 crc kubenswrapper[4909]: I0202 12:13:20.780643 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-cs882" Feb 02 12:13:21 crc kubenswrapper[4909]: I0202 12:13:21.825986 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-54cb865ffb-9dzj7"] Feb 02 12:13:21 crc kubenswrapper[4909]: E0202 12:13:21.826936 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e" containerName="heat-db-sync" Feb 02 12:13:21 crc kubenswrapper[4909]: I0202 12:13:21.826954 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e" containerName="heat-db-sync" Feb 02 12:13:21 crc kubenswrapper[4909]: I0202 12:13:21.827211 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e" containerName="heat-db-sync" Feb 02 12:13:21 crc kubenswrapper[4909]: I0202 12:13:21.831329 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-54cb865ffb-9dzj7" Feb 02 12:13:21 crc kubenswrapper[4909]: I0202 12:13:21.842687 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-52pqf" Feb 02 12:13:21 crc kubenswrapper[4909]: I0202 12:13:21.842746 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 02 12:13:21 crc kubenswrapper[4909]: I0202 12:13:21.843094 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Feb 02 12:13:21 crc kubenswrapper[4909]: I0202 12:13:21.853058 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-54cb865ffb-9dzj7"] Feb 02 12:13:21 crc kubenswrapper[4909]: I0202 12:13:21.978077 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f733c718-d19b-490f-8fc8-a91fc83f00e3-config-data-custom\") pod \"heat-engine-54cb865ffb-9dzj7\" (UID: \"f733c718-d19b-490f-8fc8-a91fc83f00e3\") " pod="openstack/heat-engine-54cb865ffb-9dzj7" Feb 02 12:13:21 crc kubenswrapper[4909]: I0202 12:13:21.978182 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f733c718-d19b-490f-8fc8-a91fc83f00e3-config-data\") pod \"heat-engine-54cb865ffb-9dzj7\" (UID: \"f733c718-d19b-490f-8fc8-a91fc83f00e3\") " pod="openstack/heat-engine-54cb865ffb-9dzj7" Feb 02 12:13:21 crc kubenswrapper[4909]: I0202 12:13:21.978214 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f733c718-d19b-490f-8fc8-a91fc83f00e3-combined-ca-bundle\") pod \"heat-engine-54cb865ffb-9dzj7\" (UID: \"f733c718-d19b-490f-8fc8-a91fc83f00e3\") " pod="openstack/heat-engine-54cb865ffb-9dzj7" Feb 02 12:13:21 crc kubenswrapper[4909]: 
I0202 12:13:21.978377 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb8lf\" (UniqueName: \"kubernetes.io/projected/f733c718-d19b-490f-8fc8-a91fc83f00e3-kube-api-access-pb8lf\") pod \"heat-engine-54cb865ffb-9dzj7\" (UID: \"f733c718-d19b-490f-8fc8-a91fc83f00e3\") " pod="openstack/heat-engine-54cb865ffb-9dzj7" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.074580 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7678cc9877-s78fm"] Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.076371 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7678cc9877-s78fm" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.080259 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f733c718-d19b-490f-8fc8-a91fc83f00e3-config-data-custom\") pod \"heat-engine-54cb865ffb-9dzj7\" (UID: \"f733c718-d19b-490f-8fc8-a91fc83f00e3\") " pod="openstack/heat-engine-54cb865ffb-9dzj7" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.080362 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f733c718-d19b-490f-8fc8-a91fc83f00e3-config-data\") pod \"heat-engine-54cb865ffb-9dzj7\" (UID: \"f733c718-d19b-490f-8fc8-a91fc83f00e3\") " pod="openstack/heat-engine-54cb865ffb-9dzj7" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.080394 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f733c718-d19b-490f-8fc8-a91fc83f00e3-combined-ca-bundle\") pod \"heat-engine-54cb865ffb-9dzj7\" (UID: \"f733c718-d19b-490f-8fc8-a91fc83f00e3\") " pod="openstack/heat-engine-54cb865ffb-9dzj7" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.080270 4909 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"heat-api-config-data" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.080492 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb8lf\" (UniqueName: \"kubernetes.io/projected/f733c718-d19b-490f-8fc8-a91fc83f00e3-kube-api-access-pb8lf\") pod \"heat-engine-54cb865ffb-9dzj7\" (UID: \"f733c718-d19b-490f-8fc8-a91fc83f00e3\") " pod="openstack/heat-engine-54cb865ffb-9dzj7" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.090320 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f733c718-d19b-490f-8fc8-a91fc83f00e3-combined-ca-bundle\") pod \"heat-engine-54cb865ffb-9dzj7\" (UID: \"f733c718-d19b-490f-8fc8-a91fc83f00e3\") " pod="openstack/heat-engine-54cb865ffb-9dzj7" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.091138 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f733c718-d19b-490f-8fc8-a91fc83f00e3-config-data\") pod \"heat-engine-54cb865ffb-9dzj7\" (UID: \"f733c718-d19b-490f-8fc8-a91fc83f00e3\") " pod="openstack/heat-engine-54cb865ffb-9dzj7" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.091217 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-95f4d959b-pj66c"] Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.093582 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-95f4d959b-pj66c" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.095977 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.106688 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f733c718-d19b-490f-8fc8-a91fc83f00e3-config-data-custom\") pod \"heat-engine-54cb865ffb-9dzj7\" (UID: \"f733c718-d19b-490f-8fc8-a91fc83f00e3\") " pod="openstack/heat-engine-54cb865ffb-9dzj7" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.118978 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7678cc9877-s78fm"] Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.131791 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb8lf\" (UniqueName: \"kubernetes.io/projected/f733c718-d19b-490f-8fc8-a91fc83f00e3-kube-api-access-pb8lf\") pod \"heat-engine-54cb865ffb-9dzj7\" (UID: \"f733c718-d19b-490f-8fc8-a91fc83f00e3\") " pod="openstack/heat-engine-54cb865ffb-9dzj7" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.152834 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-95f4d959b-pj66c"] Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.172156 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-54cb865ffb-9dzj7" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.182467 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca4ef3ac-ce80-439f-b041-abc4ff019f4e-config-data-custom\") pod \"heat-cfnapi-95f4d959b-pj66c\" (UID: \"ca4ef3ac-ce80-439f-b041-abc4ff019f4e\") " pod="openstack/heat-cfnapi-95f4d959b-pj66c" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.183001 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca4ef3ac-ce80-439f-b041-abc4ff019f4e-config-data\") pod \"heat-cfnapi-95f4d959b-pj66c\" (UID: \"ca4ef3ac-ce80-439f-b041-abc4ff019f4e\") " pod="openstack/heat-cfnapi-95f4d959b-pj66c" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.183049 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aaf5406-b4d0-4d24-995b-2185ac9edffb-combined-ca-bundle\") pod \"heat-api-7678cc9877-s78fm\" (UID: \"9aaf5406-b4d0-4d24-995b-2185ac9edffb\") " pod="openstack/heat-api-7678cc9877-s78fm" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.183122 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4ef3ac-ce80-439f-b041-abc4ff019f4e-combined-ca-bundle\") pod \"heat-cfnapi-95f4d959b-pj66c\" (UID: \"ca4ef3ac-ce80-439f-b041-abc4ff019f4e\") " pod="openstack/heat-cfnapi-95f4d959b-pj66c" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.183180 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9aaf5406-b4d0-4d24-995b-2185ac9edffb-config-data-custom\") pod 
\"heat-api-7678cc9877-s78fm\" (UID: \"9aaf5406-b4d0-4d24-995b-2185ac9edffb\") " pod="openstack/heat-api-7678cc9877-s78fm" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.183215 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aaf5406-b4d0-4d24-995b-2185ac9edffb-config-data\") pod \"heat-api-7678cc9877-s78fm\" (UID: \"9aaf5406-b4d0-4d24-995b-2185ac9edffb\") " pod="openstack/heat-api-7678cc9877-s78fm" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.183240 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wcdj\" (UniqueName: \"kubernetes.io/projected/9aaf5406-b4d0-4d24-995b-2185ac9edffb-kube-api-access-6wcdj\") pod \"heat-api-7678cc9877-s78fm\" (UID: \"9aaf5406-b4d0-4d24-995b-2185ac9edffb\") " pod="openstack/heat-api-7678cc9877-s78fm" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.183366 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlsvg\" (UniqueName: \"kubernetes.io/projected/ca4ef3ac-ce80-439f-b041-abc4ff019f4e-kube-api-access-vlsvg\") pod \"heat-cfnapi-95f4d959b-pj66c\" (UID: \"ca4ef3ac-ce80-439f-b041-abc4ff019f4e\") " pod="openstack/heat-cfnapi-95f4d959b-pj66c" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.284835 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9aaf5406-b4d0-4d24-995b-2185ac9edffb-config-data-custom\") pod \"heat-api-7678cc9877-s78fm\" (UID: \"9aaf5406-b4d0-4d24-995b-2185ac9edffb\") " pod="openstack/heat-api-7678cc9877-s78fm" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.284876 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aaf5406-b4d0-4d24-995b-2185ac9edffb-config-data\") pod 
\"heat-api-7678cc9877-s78fm\" (UID: \"9aaf5406-b4d0-4d24-995b-2185ac9edffb\") " pod="openstack/heat-api-7678cc9877-s78fm" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.284898 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wcdj\" (UniqueName: \"kubernetes.io/projected/9aaf5406-b4d0-4d24-995b-2185ac9edffb-kube-api-access-6wcdj\") pod \"heat-api-7678cc9877-s78fm\" (UID: \"9aaf5406-b4d0-4d24-995b-2185ac9edffb\") " pod="openstack/heat-api-7678cc9877-s78fm" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.284961 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlsvg\" (UniqueName: \"kubernetes.io/projected/ca4ef3ac-ce80-439f-b041-abc4ff019f4e-kube-api-access-vlsvg\") pod \"heat-cfnapi-95f4d959b-pj66c\" (UID: \"ca4ef3ac-ce80-439f-b041-abc4ff019f4e\") " pod="openstack/heat-cfnapi-95f4d959b-pj66c" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.285001 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca4ef3ac-ce80-439f-b041-abc4ff019f4e-config-data-custom\") pod \"heat-cfnapi-95f4d959b-pj66c\" (UID: \"ca4ef3ac-ce80-439f-b041-abc4ff019f4e\") " pod="openstack/heat-cfnapi-95f4d959b-pj66c" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.285072 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca4ef3ac-ce80-439f-b041-abc4ff019f4e-config-data\") pod \"heat-cfnapi-95f4d959b-pj66c\" (UID: \"ca4ef3ac-ce80-439f-b041-abc4ff019f4e\") " pod="openstack/heat-cfnapi-95f4d959b-pj66c" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.285110 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aaf5406-b4d0-4d24-995b-2185ac9edffb-combined-ca-bundle\") pod \"heat-api-7678cc9877-s78fm\" (UID: 
\"9aaf5406-b4d0-4d24-995b-2185ac9edffb\") " pod="openstack/heat-api-7678cc9877-s78fm" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.285157 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4ef3ac-ce80-439f-b041-abc4ff019f4e-combined-ca-bundle\") pod \"heat-cfnapi-95f4d959b-pj66c\" (UID: \"ca4ef3ac-ce80-439f-b041-abc4ff019f4e\") " pod="openstack/heat-cfnapi-95f4d959b-pj66c" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.292455 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9aaf5406-b4d0-4d24-995b-2185ac9edffb-config-data-custom\") pod \"heat-api-7678cc9877-s78fm\" (UID: \"9aaf5406-b4d0-4d24-995b-2185ac9edffb\") " pod="openstack/heat-api-7678cc9877-s78fm" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.293426 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca4ef3ac-ce80-439f-b041-abc4ff019f4e-config-data-custom\") pod \"heat-cfnapi-95f4d959b-pj66c\" (UID: \"ca4ef3ac-ce80-439f-b041-abc4ff019f4e\") " pod="openstack/heat-cfnapi-95f4d959b-pj66c" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.305536 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca4ef3ac-ce80-439f-b041-abc4ff019f4e-config-data\") pod \"heat-cfnapi-95f4d959b-pj66c\" (UID: \"ca4ef3ac-ce80-439f-b041-abc4ff019f4e\") " pod="openstack/heat-cfnapi-95f4d959b-pj66c" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.308573 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aaf5406-b4d0-4d24-995b-2185ac9edffb-config-data\") pod \"heat-api-7678cc9877-s78fm\" (UID: \"9aaf5406-b4d0-4d24-995b-2185ac9edffb\") " pod="openstack/heat-api-7678cc9877-s78fm" Feb 02 12:13:22 crc 
kubenswrapper[4909]: I0202 12:13:22.309244 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4ef3ac-ce80-439f-b041-abc4ff019f4e-combined-ca-bundle\") pod \"heat-cfnapi-95f4d959b-pj66c\" (UID: \"ca4ef3ac-ce80-439f-b041-abc4ff019f4e\") " pod="openstack/heat-cfnapi-95f4d959b-pj66c" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.314942 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlsvg\" (UniqueName: \"kubernetes.io/projected/ca4ef3ac-ce80-439f-b041-abc4ff019f4e-kube-api-access-vlsvg\") pod \"heat-cfnapi-95f4d959b-pj66c\" (UID: \"ca4ef3ac-ce80-439f-b041-abc4ff019f4e\") " pod="openstack/heat-cfnapi-95f4d959b-pj66c" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.318529 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wcdj\" (UniqueName: \"kubernetes.io/projected/9aaf5406-b4d0-4d24-995b-2185ac9edffb-kube-api-access-6wcdj\") pod \"heat-api-7678cc9877-s78fm\" (UID: \"9aaf5406-b4d0-4d24-995b-2185ac9edffb\") " pod="openstack/heat-api-7678cc9877-s78fm" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.333720 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aaf5406-b4d0-4d24-995b-2185ac9edffb-combined-ca-bundle\") pod \"heat-api-7678cc9877-s78fm\" (UID: \"9aaf5406-b4d0-4d24-995b-2185ac9edffb\") " pod="openstack/heat-api-7678cc9877-s78fm" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.354617 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7678cc9877-s78fm" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.362547 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-95f4d959b-pj66c" Feb 02 12:13:22 crc kubenswrapper[4909]: W0202 12:13:22.700137 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf733c718_d19b_490f_8fc8_a91fc83f00e3.slice/crio-efab0ba1deae90e8c76f5f9ad91f0220e6a1fc87bd2b54cab6ae29c942611fb4 WatchSource:0}: Error finding container efab0ba1deae90e8c76f5f9ad91f0220e6a1fc87bd2b54cab6ae29c942611fb4: Status 404 returned error can't find the container with id efab0ba1deae90e8c76f5f9ad91f0220e6a1fc87bd2b54cab6ae29c942611fb4 Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.705484 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-54cb865ffb-9dzj7"] Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.809015 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-54cb865ffb-9dzj7" event={"ID":"f733c718-d19b-490f-8fc8-a91fc83f00e3","Type":"ContainerStarted","Data":"efab0ba1deae90e8c76f5f9ad91f0220e6a1fc87bd2b54cab6ae29c942611fb4"} Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.891676 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6d78c46cb4-52qcm" podUID="343841e4-be3f-444a-a51e-89d5aeb87fa0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.118:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.118:8443: connect: connection refused" Feb 02 12:13:22 crc kubenswrapper[4909]: I0202 12:13:22.910154 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-95f4d959b-pj66c"] Feb 02 12:13:23 crc kubenswrapper[4909]: W0202 12:13:23.032191 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9aaf5406_b4d0_4d24_995b_2185ac9edffb.slice/crio-2b741c12f48416e4a242875aff93edd96fabbcc45ec4305e8e48d5801cc56eb5 WatchSource:0}: Error finding container 
2b741c12f48416e4a242875aff93edd96fabbcc45ec4305e8e48d5801cc56eb5: Status 404 returned error can't find the container with id 2b741c12f48416e4a242875aff93edd96fabbcc45ec4305e8e48d5801cc56eb5 Feb 02 12:13:23 crc kubenswrapper[4909]: I0202 12:13:23.034483 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7678cc9877-s78fm"] Feb 02 12:13:23 crc kubenswrapper[4909]: I0202 12:13:23.826790 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-95f4d959b-pj66c" event={"ID":"ca4ef3ac-ce80-439f-b041-abc4ff019f4e","Type":"ContainerStarted","Data":"fc02c21ee96f2607a30808d9120215ed7b775d564eeed0b9595f91e9a07a014d"} Feb 02 12:13:23 crc kubenswrapper[4909]: I0202 12:13:23.836043 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7678cc9877-s78fm" event={"ID":"9aaf5406-b4d0-4d24-995b-2185ac9edffb","Type":"ContainerStarted","Data":"2b741c12f48416e4a242875aff93edd96fabbcc45ec4305e8e48d5801cc56eb5"} Feb 02 12:13:23 crc kubenswrapper[4909]: I0202 12:13:23.839484 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-54cb865ffb-9dzj7" event={"ID":"f733c718-d19b-490f-8fc8-a91fc83f00e3","Type":"ContainerStarted","Data":"f03e25a73b2ceab70d4f3ace75ce667491307f3c39f583415e7e341d4556aee8"} Feb 02 12:13:23 crc kubenswrapper[4909]: I0202 12:13:23.839665 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-54cb865ffb-9dzj7" Feb 02 12:13:23 crc kubenswrapper[4909]: I0202 12:13:23.863827 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-54cb865ffb-9dzj7" podStartSLOduration=2.863784735 podStartE2EDuration="2.863784735s" podCreationTimestamp="2026-02-02 12:13:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:13:23.857485146 +0000 UTC m=+6129.603585881" watchObservedRunningTime="2026-02-02 
12:13:23.863784735 +0000 UTC m=+6129.609885470" Feb 02 12:13:25 crc kubenswrapper[4909]: I0202 12:13:25.866792 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-95f4d959b-pj66c" event={"ID":"ca4ef3ac-ce80-439f-b041-abc4ff019f4e","Type":"ContainerStarted","Data":"1f0cda32a2b9188f387a823f39f3c3e6ef06e199f61b7b23a82817112b28f1f5"} Feb 02 12:13:25 crc kubenswrapper[4909]: I0202 12:13:25.867563 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-95f4d959b-pj66c" Feb 02 12:13:25 crc kubenswrapper[4909]: I0202 12:13:25.869296 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7678cc9877-s78fm" event={"ID":"9aaf5406-b4d0-4d24-995b-2185ac9edffb","Type":"ContainerStarted","Data":"c0d32ae889bcdd2f6d376279c5e1720cf3e91246bf297cc07f703849b09b1d4c"} Feb 02 12:13:25 crc kubenswrapper[4909]: I0202 12:13:25.869634 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7678cc9877-s78fm" Feb 02 12:13:25 crc kubenswrapper[4909]: I0202 12:13:25.884752 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-95f4d959b-pj66c" podStartSLOduration=3.079010133 podStartE2EDuration="4.884733336s" podCreationTimestamp="2026-02-02 12:13:21 +0000 UTC" firstStartedPulling="2026-02-02 12:13:22.915598242 +0000 UTC m=+6128.661698977" lastFinishedPulling="2026-02-02 12:13:24.721321445 +0000 UTC m=+6130.467422180" observedRunningTime="2026-02-02 12:13:25.880063623 +0000 UTC m=+6131.626164358" watchObservedRunningTime="2026-02-02 12:13:25.884733336 +0000 UTC m=+6131.630834071" Feb 02 12:13:25 crc kubenswrapper[4909]: I0202 12:13:25.907507 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-7678cc9877-s78fm" podStartSLOduration=3.218239014 podStartE2EDuration="4.907490962s" podCreationTimestamp="2026-02-02 12:13:21 +0000 UTC" firstStartedPulling="2026-02-02 12:13:23.036366209 
+0000 UTC m=+6128.782466944" lastFinishedPulling="2026-02-02 12:13:24.725618157 +0000 UTC m=+6130.471718892" observedRunningTime="2026-02-02 12:13:25.899643779 +0000 UTC m=+6131.645744514" watchObservedRunningTime="2026-02-02 12:13:25.907490962 +0000 UTC m=+6131.653591697" Feb 02 12:13:26 crc kubenswrapper[4909]: I0202 12:13:26.230080 4909 scope.go:117] "RemoveContainer" containerID="372ed309172d584b41df6708891739e8ebbfcf567d87604e8135b3774679eb16" Feb 02 12:13:26 crc kubenswrapper[4909]: I0202 12:13:26.252501 4909 scope.go:117] "RemoveContainer" containerID="61427af6e27e432f4da4bd3a05d4ac6b158193e75d7477d20ddea56e4baee4dd" Feb 02 12:13:26 crc kubenswrapper[4909]: I0202 12:13:26.314555 4909 scope.go:117] "RemoveContainer" containerID="48580c5d41b1035b23a3ff31fb0e276385d4a59af11fd04d587c3fc8e18931b7" Feb 02 12:13:26 crc kubenswrapper[4909]: I0202 12:13:26.389618 4909 scope.go:117] "RemoveContainer" containerID="44c9928a76f06252169ec8da41f7fe6c50bc1d1848ed5fe6dd867101383f49fc" Feb 02 12:13:26 crc kubenswrapper[4909]: I0202 12:13:26.424044 4909 scope.go:117] "RemoveContainer" containerID="f9907d6fba5061a45123d1fbecc75ca822de08d7ef12ae69af5f51716153c2ce" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.440804 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-9478bc769-cxwt5"] Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.442576 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-9478bc769-cxwt5" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.454153 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5fbd846955-7fxfq"] Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.455741 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5fbd846955-7fxfq" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.469324 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-77c65bc69c-n4w79"] Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.470948 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-77c65bc69c-n4w79" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.482260 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-9478bc769-cxwt5"] Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.510776 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5fbd846955-7fxfq"] Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.550270 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22709a9d-0d92-48ef-8967-588ecaf1b50b-combined-ca-bundle\") pod \"heat-api-5fbd846955-7fxfq\" (UID: \"22709a9d-0d92-48ef-8967-588ecaf1b50b\") " pod="openstack/heat-api-5fbd846955-7fxfq" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.550343 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cprwf\" (UniqueName: \"kubernetes.io/projected/2f53270c-2995-4557-8010-3315762c06c8-kube-api-access-cprwf\") pod \"heat-engine-9478bc769-cxwt5\" (UID: \"2f53270c-2995-4557-8010-3315762c06c8\") " pod="openstack/heat-engine-9478bc769-cxwt5" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.550367 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ede287-b375-4953-b46c-bb7afb372891-combined-ca-bundle\") pod \"heat-cfnapi-77c65bc69c-n4w79\" (UID: \"d5ede287-b375-4953-b46c-bb7afb372891\") " pod="openstack/heat-cfnapi-77c65bc69c-n4w79" Feb 02 12:13:29 crc 
kubenswrapper[4909]: I0202 12:13:29.550424 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22709a9d-0d92-48ef-8967-588ecaf1b50b-config-data\") pod \"heat-api-5fbd846955-7fxfq\" (UID: \"22709a9d-0d92-48ef-8967-588ecaf1b50b\") " pod="openstack/heat-api-5fbd846955-7fxfq" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.550497 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5ede287-b375-4953-b46c-bb7afb372891-config-data-custom\") pod \"heat-cfnapi-77c65bc69c-n4w79\" (UID: \"d5ede287-b375-4953-b46c-bb7afb372891\") " pod="openstack/heat-cfnapi-77c65bc69c-n4w79" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.550516 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f53270c-2995-4557-8010-3315762c06c8-config-data\") pod \"heat-engine-9478bc769-cxwt5\" (UID: \"2f53270c-2995-4557-8010-3315762c06c8\") " pod="openstack/heat-engine-9478bc769-cxwt5" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.550615 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssjx6\" (UniqueName: \"kubernetes.io/projected/d5ede287-b375-4953-b46c-bb7afb372891-kube-api-access-ssjx6\") pod \"heat-cfnapi-77c65bc69c-n4w79\" (UID: \"d5ede287-b375-4953-b46c-bb7afb372891\") " pod="openstack/heat-cfnapi-77c65bc69c-n4w79" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.550670 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22709a9d-0d92-48ef-8967-588ecaf1b50b-config-data-custom\") pod \"heat-api-5fbd846955-7fxfq\" (UID: \"22709a9d-0d92-48ef-8967-588ecaf1b50b\") " 
pod="openstack/heat-api-5fbd846955-7fxfq" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.550776 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ede287-b375-4953-b46c-bb7afb372891-config-data\") pod \"heat-cfnapi-77c65bc69c-n4w79\" (UID: \"d5ede287-b375-4953-b46c-bb7afb372891\") " pod="openstack/heat-cfnapi-77c65bc69c-n4w79" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.550926 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47p8z\" (UniqueName: \"kubernetes.io/projected/22709a9d-0d92-48ef-8967-588ecaf1b50b-kube-api-access-47p8z\") pod \"heat-api-5fbd846955-7fxfq\" (UID: \"22709a9d-0d92-48ef-8967-588ecaf1b50b\") " pod="openstack/heat-api-5fbd846955-7fxfq" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.551066 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f53270c-2995-4557-8010-3315762c06c8-config-data-custom\") pod \"heat-engine-9478bc769-cxwt5\" (UID: \"2f53270c-2995-4557-8010-3315762c06c8\") " pod="openstack/heat-engine-9478bc769-cxwt5" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.551197 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f53270c-2995-4557-8010-3315762c06c8-combined-ca-bundle\") pod \"heat-engine-9478bc769-cxwt5\" (UID: \"2f53270c-2995-4557-8010-3315762c06c8\") " pod="openstack/heat-engine-9478bc769-cxwt5" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.559370 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-77c65bc69c-n4w79"] Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.653290 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22709a9d-0d92-48ef-8967-588ecaf1b50b-config-data-custom\") pod \"heat-api-5fbd846955-7fxfq\" (UID: \"22709a9d-0d92-48ef-8967-588ecaf1b50b\") " pod="openstack/heat-api-5fbd846955-7fxfq" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.653343 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ede287-b375-4953-b46c-bb7afb372891-config-data\") pod \"heat-cfnapi-77c65bc69c-n4w79\" (UID: \"d5ede287-b375-4953-b46c-bb7afb372891\") " pod="openstack/heat-cfnapi-77c65bc69c-n4w79" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.653384 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47p8z\" (UniqueName: \"kubernetes.io/projected/22709a9d-0d92-48ef-8967-588ecaf1b50b-kube-api-access-47p8z\") pod \"heat-api-5fbd846955-7fxfq\" (UID: \"22709a9d-0d92-48ef-8967-588ecaf1b50b\") " pod="openstack/heat-api-5fbd846955-7fxfq" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.653779 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f53270c-2995-4557-8010-3315762c06c8-config-data-custom\") pod \"heat-engine-9478bc769-cxwt5\" (UID: \"2f53270c-2995-4557-8010-3315762c06c8\") " pod="openstack/heat-engine-9478bc769-cxwt5" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.653906 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f53270c-2995-4557-8010-3315762c06c8-combined-ca-bundle\") pod \"heat-engine-9478bc769-cxwt5\" (UID: \"2f53270c-2995-4557-8010-3315762c06c8\") " pod="openstack/heat-engine-9478bc769-cxwt5" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.654483 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/22709a9d-0d92-48ef-8967-588ecaf1b50b-combined-ca-bundle\") pod \"heat-api-5fbd846955-7fxfq\" (UID: \"22709a9d-0d92-48ef-8967-588ecaf1b50b\") " pod="openstack/heat-api-5fbd846955-7fxfq" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.654520 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cprwf\" (UniqueName: \"kubernetes.io/projected/2f53270c-2995-4557-8010-3315762c06c8-kube-api-access-cprwf\") pod \"heat-engine-9478bc769-cxwt5\" (UID: \"2f53270c-2995-4557-8010-3315762c06c8\") " pod="openstack/heat-engine-9478bc769-cxwt5" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.654543 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ede287-b375-4953-b46c-bb7afb372891-combined-ca-bundle\") pod \"heat-cfnapi-77c65bc69c-n4w79\" (UID: \"d5ede287-b375-4953-b46c-bb7afb372891\") " pod="openstack/heat-cfnapi-77c65bc69c-n4w79" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.654562 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22709a9d-0d92-48ef-8967-588ecaf1b50b-config-data\") pod \"heat-api-5fbd846955-7fxfq\" (UID: \"22709a9d-0d92-48ef-8967-588ecaf1b50b\") " pod="openstack/heat-api-5fbd846955-7fxfq" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.654823 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5ede287-b375-4953-b46c-bb7afb372891-config-data-custom\") pod \"heat-cfnapi-77c65bc69c-n4w79\" (UID: \"d5ede287-b375-4953-b46c-bb7afb372891\") " pod="openstack/heat-cfnapi-77c65bc69c-n4w79" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.656217 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2f53270c-2995-4557-8010-3315762c06c8-config-data\") pod \"heat-engine-9478bc769-cxwt5\" (UID: \"2f53270c-2995-4557-8010-3315762c06c8\") " pod="openstack/heat-engine-9478bc769-cxwt5" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.656323 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssjx6\" (UniqueName: \"kubernetes.io/projected/d5ede287-b375-4953-b46c-bb7afb372891-kube-api-access-ssjx6\") pod \"heat-cfnapi-77c65bc69c-n4w79\" (UID: \"d5ede287-b375-4953-b46c-bb7afb372891\") " pod="openstack/heat-cfnapi-77c65bc69c-n4w79" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.660307 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22709a9d-0d92-48ef-8967-588ecaf1b50b-combined-ca-bundle\") pod \"heat-api-5fbd846955-7fxfq\" (UID: \"22709a9d-0d92-48ef-8967-588ecaf1b50b\") " pod="openstack/heat-api-5fbd846955-7fxfq" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.660622 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2f53270c-2995-4557-8010-3315762c06c8-config-data-custom\") pod \"heat-engine-9478bc769-cxwt5\" (UID: \"2f53270c-2995-4557-8010-3315762c06c8\") " pod="openstack/heat-engine-9478bc769-cxwt5" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.663639 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22709a9d-0d92-48ef-8967-588ecaf1b50b-config-data-custom\") pod \"heat-api-5fbd846955-7fxfq\" (UID: \"22709a9d-0d92-48ef-8967-588ecaf1b50b\") " pod="openstack/heat-api-5fbd846955-7fxfq" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.667588 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22709a9d-0d92-48ef-8967-588ecaf1b50b-config-data\") pod 
\"heat-api-5fbd846955-7fxfq\" (UID: \"22709a9d-0d92-48ef-8967-588ecaf1b50b\") " pod="openstack/heat-api-5fbd846955-7fxfq" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.668681 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f53270c-2995-4557-8010-3315762c06c8-combined-ca-bundle\") pod \"heat-engine-9478bc769-cxwt5\" (UID: \"2f53270c-2995-4557-8010-3315762c06c8\") " pod="openstack/heat-engine-9478bc769-cxwt5" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.669269 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5ede287-b375-4953-b46c-bb7afb372891-config-data-custom\") pod \"heat-cfnapi-77c65bc69c-n4w79\" (UID: \"d5ede287-b375-4953-b46c-bb7afb372891\") " pod="openstack/heat-cfnapi-77c65bc69c-n4w79" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.671138 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f53270c-2995-4557-8010-3315762c06c8-config-data\") pod \"heat-engine-9478bc769-cxwt5\" (UID: \"2f53270c-2995-4557-8010-3315762c06c8\") " pod="openstack/heat-engine-9478bc769-cxwt5" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.671802 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47p8z\" (UniqueName: \"kubernetes.io/projected/22709a9d-0d92-48ef-8967-588ecaf1b50b-kube-api-access-47p8z\") pod \"heat-api-5fbd846955-7fxfq\" (UID: \"22709a9d-0d92-48ef-8967-588ecaf1b50b\") " pod="openstack/heat-api-5fbd846955-7fxfq" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.672192 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssjx6\" (UniqueName: \"kubernetes.io/projected/d5ede287-b375-4953-b46c-bb7afb372891-kube-api-access-ssjx6\") pod \"heat-cfnapi-77c65bc69c-n4w79\" (UID: \"d5ede287-b375-4953-b46c-bb7afb372891\") 
" pod="openstack/heat-cfnapi-77c65bc69c-n4w79" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.673266 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ede287-b375-4953-b46c-bb7afb372891-config-data\") pod \"heat-cfnapi-77c65bc69c-n4w79\" (UID: \"d5ede287-b375-4953-b46c-bb7afb372891\") " pod="openstack/heat-cfnapi-77c65bc69c-n4w79" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.675417 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cprwf\" (UniqueName: \"kubernetes.io/projected/2f53270c-2995-4557-8010-3315762c06c8-kube-api-access-cprwf\") pod \"heat-engine-9478bc769-cxwt5\" (UID: \"2f53270c-2995-4557-8010-3315762c06c8\") " pod="openstack/heat-engine-9478bc769-cxwt5" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.682588 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ede287-b375-4953-b46c-bb7afb372891-combined-ca-bundle\") pod \"heat-cfnapi-77c65bc69c-n4w79\" (UID: \"d5ede287-b375-4953-b46c-bb7afb372891\") " pod="openstack/heat-cfnapi-77c65bc69c-n4w79" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.765643 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-9478bc769-cxwt5" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.782655 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5fbd846955-7fxfq" Feb 02 12:13:29 crc kubenswrapper[4909]: I0202 12:13:29.798702 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-77c65bc69c-n4w79" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.307995 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-9478bc769-cxwt5"] Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.316017 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5fbd846955-7fxfq"] Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.461870 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-77c65bc69c-n4w79"] Feb 02 12:13:30 crc kubenswrapper[4909]: W0202 12:13:30.481139 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5ede287_b375_4953_b46c_bb7afb372891.slice/crio-96a3853d21ae7646a8be4388932779cc62f8c846324b7ba35c129111a54ac2e3 WatchSource:0}: Error finding container 96a3853d21ae7646a8be4388932779cc62f8c846324b7ba35c129111a54ac2e3: Status 404 returned error can't find the container with id 96a3853d21ae7646a8be4388932779cc62f8c846324b7ba35c129111a54ac2e3 Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.629676 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-95f4d959b-pj66c"] Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.630127 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-95f4d959b-pj66c" podUID="ca4ef3ac-ce80-439f-b041-abc4ff019f4e" containerName="heat-cfnapi" containerID="cri-o://1f0cda32a2b9188f387a823f39f3c3e6ef06e199f61b7b23a82817112b28f1f5" gracePeriod=60 Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.697238 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7678cc9877-s78fm"] Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.697482 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-7678cc9877-s78fm" 
podUID="9aaf5406-b4d0-4d24-995b-2185ac9edffb" containerName="heat-api" containerID="cri-o://c0d32ae889bcdd2f6d376279c5e1720cf3e91246bf297cc07f703849b09b1d4c" gracePeriod=60 Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.712087 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-7678cc9877-s78fm" podUID="9aaf5406-b4d0-4d24-995b-2185ac9edffb" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.1.123:8004/healthcheck\": EOF" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.721866 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5476649989-mfkg8"] Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.723212 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5476649989-mfkg8" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.733944 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.734178 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.778546 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48c2c982-be37-412b-83e4-7e572a2a2422-config-data-custom\") pod \"heat-cfnapi-5476649989-mfkg8\" (UID: \"48c2c982-be37-412b-83e4-7e572a2a2422\") " pod="openstack/heat-cfnapi-5476649989-mfkg8" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.778859 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48c2c982-be37-412b-83e4-7e572a2a2422-config-data\") pod \"heat-cfnapi-5476649989-mfkg8\" (UID: \"48c2c982-be37-412b-83e4-7e572a2a2422\") " pod="openstack/heat-cfnapi-5476649989-mfkg8" Feb 02 
12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.778924 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c2c982-be37-412b-83e4-7e572a2a2422-combined-ca-bundle\") pod \"heat-cfnapi-5476649989-mfkg8\" (UID: \"48c2c982-be37-412b-83e4-7e572a2a2422\") " pod="openstack/heat-cfnapi-5476649989-mfkg8" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.778973 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48c2c982-be37-412b-83e4-7e572a2a2422-internal-tls-certs\") pod \"heat-cfnapi-5476649989-mfkg8\" (UID: \"48c2c982-be37-412b-83e4-7e572a2a2422\") " pod="openstack/heat-cfnapi-5476649989-mfkg8" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.779015 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97sft\" (UniqueName: \"kubernetes.io/projected/48c2c982-be37-412b-83e4-7e572a2a2422-kube-api-access-97sft\") pod \"heat-cfnapi-5476649989-mfkg8\" (UID: \"48c2c982-be37-412b-83e4-7e572a2a2422\") " pod="openstack/heat-cfnapi-5476649989-mfkg8" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.779046 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48c2c982-be37-412b-83e4-7e572a2a2422-public-tls-certs\") pod \"heat-cfnapi-5476649989-mfkg8\" (UID: \"48c2c982-be37-412b-83e4-7e572a2a2422\") " pod="openstack/heat-cfnapi-5476649989-mfkg8" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.779165 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5bd5f6b94d-f6rqx"] Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.780661 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5bd5f6b94d-f6rqx" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.792093 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5476649989-mfkg8"] Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.794373 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.794566 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.846976 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5bd5f6b94d-f6rqx"] Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.880738 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48c2c982-be37-412b-83e4-7e572a2a2422-internal-tls-certs\") pod \"heat-cfnapi-5476649989-mfkg8\" (UID: \"48c2c982-be37-412b-83e4-7e572a2a2422\") " pod="openstack/heat-cfnapi-5476649989-mfkg8" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.880793 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae5779d-25ee-4282-9474-8306081d28b5-public-tls-certs\") pod \"heat-api-5bd5f6b94d-f6rqx\" (UID: \"fae5779d-25ee-4282-9474-8306081d28b5\") " pod="openstack/heat-api-5bd5f6b94d-f6rqx" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.880841 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae5779d-25ee-4282-9474-8306081d28b5-internal-tls-certs\") pod \"heat-api-5bd5f6b94d-f6rqx\" (UID: \"fae5779d-25ee-4282-9474-8306081d28b5\") " pod="openstack/heat-api-5bd5f6b94d-f6rqx" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.880865 
4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97sft\" (UniqueName: \"kubernetes.io/projected/48c2c982-be37-412b-83e4-7e572a2a2422-kube-api-access-97sft\") pod \"heat-cfnapi-5476649989-mfkg8\" (UID: \"48c2c982-be37-412b-83e4-7e572a2a2422\") " pod="openstack/heat-cfnapi-5476649989-mfkg8" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.880891 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fae5779d-25ee-4282-9474-8306081d28b5-config-data-custom\") pod \"heat-api-5bd5f6b94d-f6rqx\" (UID: \"fae5779d-25ee-4282-9474-8306081d28b5\") " pod="openstack/heat-api-5bd5f6b94d-f6rqx" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.880915 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48c2c982-be37-412b-83e4-7e572a2a2422-public-tls-certs\") pod \"heat-cfnapi-5476649989-mfkg8\" (UID: \"48c2c982-be37-412b-83e4-7e572a2a2422\") " pod="openstack/heat-cfnapi-5476649989-mfkg8" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.880969 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48c2c982-be37-412b-83e4-7e572a2a2422-config-data-custom\") pod \"heat-cfnapi-5476649989-mfkg8\" (UID: \"48c2c982-be37-412b-83e4-7e572a2a2422\") " pod="openstack/heat-cfnapi-5476649989-mfkg8" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.881019 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae5779d-25ee-4282-9474-8306081d28b5-combined-ca-bundle\") pod \"heat-api-5bd5f6b94d-f6rqx\" (UID: \"fae5779d-25ee-4282-9474-8306081d28b5\") " pod="openstack/heat-api-5bd5f6b94d-f6rqx" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 
12:13:30.881061 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48c2c982-be37-412b-83e4-7e572a2a2422-config-data\") pod \"heat-cfnapi-5476649989-mfkg8\" (UID: \"48c2c982-be37-412b-83e4-7e572a2a2422\") " pod="openstack/heat-cfnapi-5476649989-mfkg8" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.881080 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c2c982-be37-412b-83e4-7e572a2a2422-combined-ca-bundle\") pod \"heat-cfnapi-5476649989-mfkg8\" (UID: \"48c2c982-be37-412b-83e4-7e572a2a2422\") " pod="openstack/heat-cfnapi-5476649989-mfkg8" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.881098 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae5779d-25ee-4282-9474-8306081d28b5-config-data\") pod \"heat-api-5bd5f6b94d-f6rqx\" (UID: \"fae5779d-25ee-4282-9474-8306081d28b5\") " pod="openstack/heat-api-5bd5f6b94d-f6rqx" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.881119 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmvhz\" (UniqueName: \"kubernetes.io/projected/fae5779d-25ee-4282-9474-8306081d28b5-kube-api-access-lmvhz\") pod \"heat-api-5bd5f6b94d-f6rqx\" (UID: \"fae5779d-25ee-4282-9474-8306081d28b5\") " pod="openstack/heat-api-5bd5f6b94d-f6rqx" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.890621 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48c2c982-be37-412b-83e4-7e572a2a2422-public-tls-certs\") pod \"heat-cfnapi-5476649989-mfkg8\" (UID: \"48c2c982-be37-412b-83e4-7e572a2a2422\") " pod="openstack/heat-cfnapi-5476649989-mfkg8" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.896602 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48c2c982-be37-412b-83e4-7e572a2a2422-internal-tls-certs\") pod \"heat-cfnapi-5476649989-mfkg8\" (UID: \"48c2c982-be37-412b-83e4-7e572a2a2422\") " pod="openstack/heat-cfnapi-5476649989-mfkg8" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.896711 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48c2c982-be37-412b-83e4-7e572a2a2422-config-data-custom\") pod \"heat-cfnapi-5476649989-mfkg8\" (UID: \"48c2c982-be37-412b-83e4-7e572a2a2422\") " pod="openstack/heat-cfnapi-5476649989-mfkg8" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.897055 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c2c982-be37-412b-83e4-7e572a2a2422-combined-ca-bundle\") pod \"heat-cfnapi-5476649989-mfkg8\" (UID: \"48c2c982-be37-412b-83e4-7e572a2a2422\") " pod="openstack/heat-cfnapi-5476649989-mfkg8" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.904837 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48c2c982-be37-412b-83e4-7e572a2a2422-config-data\") pod \"heat-cfnapi-5476649989-mfkg8\" (UID: \"48c2c982-be37-412b-83e4-7e572a2a2422\") " pod="openstack/heat-cfnapi-5476649989-mfkg8" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.923320 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97sft\" (UniqueName: \"kubernetes.io/projected/48c2c982-be37-412b-83e4-7e572a2a2422-kube-api-access-97sft\") pod \"heat-cfnapi-5476649989-mfkg8\" (UID: \"48c2c982-be37-412b-83e4-7e572a2a2422\") " pod="openstack/heat-cfnapi-5476649989-mfkg8" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.939735 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heat-api-5fbd846955-7fxfq" event={"ID":"22709a9d-0d92-48ef-8967-588ecaf1b50b","Type":"ContainerStarted","Data":"b0d930a4e2ba61b768420a024e7ab2dda008f4aab987cc76c2a89bc9877b89f4"} Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.940104 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5fbd846955-7fxfq" event={"ID":"22709a9d-0d92-48ef-8967-588ecaf1b50b","Type":"ContainerStarted","Data":"b27751aba3b2812a0138ffa9de3c8fe3fa5cdb121057850d6077983d07287c0d"} Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.940139 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5fbd846955-7fxfq" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.947108 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-9478bc769-cxwt5" event={"ID":"2f53270c-2995-4557-8010-3315762c06c8","Type":"ContainerStarted","Data":"dd09585faa7781d2e977202cb5a1c23ac135634646c1192f31cb42503db9ba60"} Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.973999 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-77c65bc69c-n4w79" event={"ID":"d5ede287-b375-4953-b46c-bb7afb372891","Type":"ContainerStarted","Data":"96a3853d21ae7646a8be4388932779cc62f8c846324b7ba35c129111a54ac2e3"} Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.976072 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5fbd846955-7fxfq" podStartSLOduration=1.9760511950000001 podStartE2EDuration="1.976051195s" podCreationTimestamp="2026-02-02 12:13:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:13:30.964530138 +0000 UTC m=+6136.710630873" watchObservedRunningTime="2026-02-02 12:13:30.976051195 +0000 UTC m=+6136.722151930" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.983991 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae5779d-25ee-4282-9474-8306081d28b5-config-data\") pod \"heat-api-5bd5f6b94d-f6rqx\" (UID: \"fae5779d-25ee-4282-9474-8306081d28b5\") " pod="openstack/heat-api-5bd5f6b94d-f6rqx" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.984042 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmvhz\" (UniqueName: \"kubernetes.io/projected/fae5779d-25ee-4282-9474-8306081d28b5-kube-api-access-lmvhz\") pod \"heat-api-5bd5f6b94d-f6rqx\" (UID: \"fae5779d-25ee-4282-9474-8306081d28b5\") " pod="openstack/heat-api-5bd5f6b94d-f6rqx" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.984095 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae5779d-25ee-4282-9474-8306081d28b5-public-tls-certs\") pod \"heat-api-5bd5f6b94d-f6rqx\" (UID: \"fae5779d-25ee-4282-9474-8306081d28b5\") " pod="openstack/heat-api-5bd5f6b94d-f6rqx" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.984164 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae5779d-25ee-4282-9474-8306081d28b5-internal-tls-certs\") pod \"heat-api-5bd5f6b94d-f6rqx\" (UID: \"fae5779d-25ee-4282-9474-8306081d28b5\") " pod="openstack/heat-api-5bd5f6b94d-f6rqx" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.984194 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fae5779d-25ee-4282-9474-8306081d28b5-config-data-custom\") pod \"heat-api-5bd5f6b94d-f6rqx\" (UID: \"fae5779d-25ee-4282-9474-8306081d28b5\") " pod="openstack/heat-api-5bd5f6b94d-f6rqx" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.984316 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae5779d-25ee-4282-9474-8306081d28b5-combined-ca-bundle\") pod \"heat-api-5bd5f6b94d-f6rqx\" (UID: \"fae5779d-25ee-4282-9474-8306081d28b5\") " pod="openstack/heat-api-5bd5f6b94d-f6rqx" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.988058 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fae5779d-25ee-4282-9474-8306081d28b5-config-data\") pod \"heat-api-5bd5f6b94d-f6rqx\" (UID: \"fae5779d-25ee-4282-9474-8306081d28b5\") " pod="openstack/heat-api-5bd5f6b94d-f6rqx" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.988795 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae5779d-25ee-4282-9474-8306081d28b5-public-tls-certs\") pod \"heat-api-5bd5f6b94d-f6rqx\" (UID: \"fae5779d-25ee-4282-9474-8306081d28b5\") " pod="openstack/heat-api-5bd5f6b94d-f6rqx" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.990884 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fae5779d-25ee-4282-9474-8306081d28b5-config-data-custom\") pod \"heat-api-5bd5f6b94d-f6rqx\" (UID: \"fae5779d-25ee-4282-9474-8306081d28b5\") " pod="openstack/heat-api-5bd5f6b94d-f6rqx" Feb 02 12:13:30 crc kubenswrapper[4909]: I0202 12:13:30.995454 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fae5779d-25ee-4282-9474-8306081d28b5-internal-tls-certs\") pod \"heat-api-5bd5f6b94d-f6rqx\" (UID: \"fae5779d-25ee-4282-9474-8306081d28b5\") " pod="openstack/heat-api-5bd5f6b94d-f6rqx" Feb 02 12:13:31 crc kubenswrapper[4909]: I0202 12:13:31.000597 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fae5779d-25ee-4282-9474-8306081d28b5-combined-ca-bundle\") 
pod \"heat-api-5bd5f6b94d-f6rqx\" (UID: \"fae5779d-25ee-4282-9474-8306081d28b5\") " pod="openstack/heat-api-5bd5f6b94d-f6rqx" Feb 02 12:13:31 crc kubenswrapper[4909]: I0202 12:13:31.010417 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmvhz\" (UniqueName: \"kubernetes.io/projected/fae5779d-25ee-4282-9474-8306081d28b5-kube-api-access-lmvhz\") pod \"heat-api-5bd5f6b94d-f6rqx\" (UID: \"fae5779d-25ee-4282-9474-8306081d28b5\") " pod="openstack/heat-api-5bd5f6b94d-f6rqx" Feb 02 12:13:31 crc kubenswrapper[4909]: I0202 12:13:31.097301 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5476649989-mfkg8" Feb 02 12:13:31 crc kubenswrapper[4909]: I0202 12:13:31.136268 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5bd5f6b94d-f6rqx" Feb 02 12:13:31 crc kubenswrapper[4909]: I0202 12:13:31.685383 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5476649989-mfkg8"] Feb 02 12:13:31 crc kubenswrapper[4909]: W0202 12:13:31.709024 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48c2c982_be37_412b_83e4_7e572a2a2422.slice/crio-2857494c9eb5496e212a99f318ba1d0b1019878ae4d27c5282c8f23e40b3ecd0 WatchSource:0}: Error finding container 2857494c9eb5496e212a99f318ba1d0b1019878ae4d27c5282c8f23e40b3ecd0: Status 404 returned error can't find the container with id 2857494c9eb5496e212a99f318ba1d0b1019878ae4d27c5282c8f23e40b3ecd0 Feb 02 12:13:31 crc kubenswrapper[4909]: I0202 12:13:31.810888 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5bd5f6b94d-f6rqx"] Feb 02 12:13:31 crc kubenswrapper[4909]: W0202 12:13:31.830543 4909 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfae5779d_25ee_4282_9474_8306081d28b5.slice/crio-ce8853f04896e7f04635f60e5695fd8443ba6eaf8d6c317ebc97e43bbb7e061b WatchSource:0}: Error finding container ce8853f04896e7f04635f60e5695fd8443ba6eaf8d6c317ebc97e43bbb7e061b: Status 404 returned error can't find the container with id ce8853f04896e7f04635f60e5695fd8443ba6eaf8d6c317ebc97e43bbb7e061b Feb 02 12:13:31 crc kubenswrapper[4909]: I0202 12:13:31.987015 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5476649989-mfkg8" event={"ID":"48c2c982-be37-412b-83e4-7e572a2a2422","Type":"ContainerStarted","Data":"2857494c9eb5496e212a99f318ba1d0b1019878ae4d27c5282c8f23e40b3ecd0"} Feb 02 12:13:31 crc kubenswrapper[4909]: I0202 12:13:31.989319 4909 generic.go:334] "Generic (PLEG): container finished" podID="22709a9d-0d92-48ef-8967-588ecaf1b50b" containerID="b0d930a4e2ba61b768420a024e7ab2dda008f4aab987cc76c2a89bc9877b89f4" exitCode=1 Feb 02 12:13:31 crc kubenswrapper[4909]: I0202 12:13:31.989380 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5fbd846955-7fxfq" event={"ID":"22709a9d-0d92-48ef-8967-588ecaf1b50b","Type":"ContainerDied","Data":"b0d930a4e2ba61b768420a024e7ab2dda008f4aab987cc76c2a89bc9877b89f4"} Feb 02 12:13:31 crc kubenswrapper[4909]: I0202 12:13:31.990077 4909 scope.go:117] "RemoveContainer" containerID="b0d930a4e2ba61b768420a024e7ab2dda008f4aab987cc76c2a89bc9877b89f4" Feb 02 12:13:31 crc kubenswrapper[4909]: I0202 12:13:31.995442 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-9478bc769-cxwt5" event={"ID":"2f53270c-2995-4557-8010-3315762c06c8","Type":"ContainerStarted","Data":"21d29b7686e660664e1b6ab035928c7db8e4972fd8192f405e7d454604539454"} Feb 02 12:13:31 crc kubenswrapper[4909]: I0202 12:13:31.996233 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-9478bc769-cxwt5" Feb 02 12:13:31 crc 
kubenswrapper[4909]: I0202 12:13:31.999552 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5bd5f6b94d-f6rqx" event={"ID":"fae5779d-25ee-4282-9474-8306081d28b5","Type":"ContainerStarted","Data":"ce8853f04896e7f04635f60e5695fd8443ba6eaf8d6c317ebc97e43bbb7e061b"} Feb 02 12:13:32 crc kubenswrapper[4909]: I0202 12:13:32.012548 4909 generic.go:334] "Generic (PLEG): container finished" podID="d5ede287-b375-4953-b46c-bb7afb372891" containerID="8e9983198b05da440bb2328ed675982826cef37767d541b734bcc608baa1b3eb" exitCode=1 Feb 02 12:13:32 crc kubenswrapper[4909]: I0202 12:13:32.012587 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-77c65bc69c-n4w79" event={"ID":"d5ede287-b375-4953-b46c-bb7afb372891","Type":"ContainerDied","Data":"8e9983198b05da440bb2328ed675982826cef37767d541b734bcc608baa1b3eb"} Feb 02 12:13:32 crc kubenswrapper[4909]: I0202 12:13:32.013207 4909 scope.go:117] "RemoveContainer" containerID="8e9983198b05da440bb2328ed675982826cef37767d541b734bcc608baa1b3eb" Feb 02 12:13:32 crc kubenswrapper[4909]: I0202 12:13:32.064241 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-9478bc769-cxwt5" podStartSLOduration=3.064210701 podStartE2EDuration="3.064210701s" podCreationTimestamp="2026-02-02 12:13:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:13:32.024900926 +0000 UTC m=+6137.771001661" watchObservedRunningTime="2026-02-02 12:13:32.064210701 +0000 UTC m=+6137.810311456" Feb 02 12:13:32 crc kubenswrapper[4909]: I0202 12:13:32.372242 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mbdhn"] Feb 02 12:13:32 crc kubenswrapper[4909]: I0202 12:13:32.374325 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mbdhn" Feb 02 12:13:32 crc kubenswrapper[4909]: I0202 12:13:32.384436 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mbdhn"] Feb 02 12:13:32 crc kubenswrapper[4909]: I0202 12:13:32.429711 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-95f4d959b-pj66c" Feb 02 12:13:32 crc kubenswrapper[4909]: I0202 12:13:32.529958 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dckhl\" (UniqueName: \"kubernetes.io/projected/79d31cb4-9244-41d0-b49d-c49e0f5534a7-kube-api-access-dckhl\") pod \"community-operators-mbdhn\" (UID: \"79d31cb4-9244-41d0-b49d-c49e0f5534a7\") " pod="openshift-marketplace/community-operators-mbdhn" Feb 02 12:13:32 crc kubenswrapper[4909]: I0202 12:13:32.530034 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79d31cb4-9244-41d0-b49d-c49e0f5534a7-catalog-content\") pod \"community-operators-mbdhn\" (UID: \"79d31cb4-9244-41d0-b49d-c49e0f5534a7\") " pod="openshift-marketplace/community-operators-mbdhn" Feb 02 12:13:32 crc kubenswrapper[4909]: I0202 12:13:32.530089 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79d31cb4-9244-41d0-b49d-c49e0f5534a7-utilities\") pod \"community-operators-mbdhn\" (UID: \"79d31cb4-9244-41d0-b49d-c49e0f5534a7\") " pod="openshift-marketplace/community-operators-mbdhn" Feb 02 12:13:32 crc kubenswrapper[4909]: I0202 12:13:32.632447 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79d31cb4-9244-41d0-b49d-c49e0f5534a7-catalog-content\") pod \"community-operators-mbdhn\" (UID: 
\"79d31cb4-9244-41d0-b49d-c49e0f5534a7\") " pod="openshift-marketplace/community-operators-mbdhn" Feb 02 12:13:32 crc kubenswrapper[4909]: I0202 12:13:32.632561 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79d31cb4-9244-41d0-b49d-c49e0f5534a7-utilities\") pod \"community-operators-mbdhn\" (UID: \"79d31cb4-9244-41d0-b49d-c49e0f5534a7\") " pod="openshift-marketplace/community-operators-mbdhn" Feb 02 12:13:32 crc kubenswrapper[4909]: I0202 12:13:32.632707 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dckhl\" (UniqueName: \"kubernetes.io/projected/79d31cb4-9244-41d0-b49d-c49e0f5534a7-kube-api-access-dckhl\") pod \"community-operators-mbdhn\" (UID: \"79d31cb4-9244-41d0-b49d-c49e0f5534a7\") " pod="openshift-marketplace/community-operators-mbdhn" Feb 02 12:13:32 crc kubenswrapper[4909]: I0202 12:13:32.633514 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79d31cb4-9244-41d0-b49d-c49e0f5534a7-catalog-content\") pod \"community-operators-mbdhn\" (UID: \"79d31cb4-9244-41d0-b49d-c49e0f5534a7\") " pod="openshift-marketplace/community-operators-mbdhn" Feb 02 12:13:32 crc kubenswrapper[4909]: I0202 12:13:32.638523 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79d31cb4-9244-41d0-b49d-c49e0f5534a7-utilities\") pod \"community-operators-mbdhn\" (UID: \"79d31cb4-9244-41d0-b49d-c49e0f5534a7\") " pod="openshift-marketplace/community-operators-mbdhn" Feb 02 12:13:32 crc kubenswrapper[4909]: I0202 12:13:32.656435 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dckhl\" (UniqueName: \"kubernetes.io/projected/79d31cb4-9244-41d0-b49d-c49e0f5534a7-kube-api-access-dckhl\") pod \"community-operators-mbdhn\" (UID: 
\"79d31cb4-9244-41d0-b49d-c49e0f5534a7\") " pod="openshift-marketplace/community-operators-mbdhn" Feb 02 12:13:32 crc kubenswrapper[4909]: I0202 12:13:32.692993 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mbdhn" Feb 02 12:13:33 crc kubenswrapper[4909]: I0202 12:13:33.038488 4909 scope.go:117] "RemoveContainer" containerID="9c168ef3d62c8984e694a539a0b378e74491cdecf8d3897783e77021c2fba293" Feb 02 12:13:33 crc kubenswrapper[4909]: E0202 12:13:33.039094 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-77c65bc69c-n4w79_openstack(d5ede287-b375-4953-b46c-bb7afb372891)\"" pod="openstack/heat-cfnapi-77c65bc69c-n4w79" podUID="d5ede287-b375-4953-b46c-bb7afb372891" Feb 02 12:13:33 crc kubenswrapper[4909]: I0202 12:13:33.039858 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-77c65bc69c-n4w79" event={"ID":"d5ede287-b375-4953-b46c-bb7afb372891","Type":"ContainerStarted","Data":"9c168ef3d62c8984e694a539a0b378e74491cdecf8d3897783e77021c2fba293"} Feb 02 12:13:33 crc kubenswrapper[4909]: I0202 12:13:33.072420 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5476649989-mfkg8" event={"ID":"48c2c982-be37-412b-83e4-7e572a2a2422","Type":"ContainerStarted","Data":"53ea2b40eaad75020d24c6e6415502dd2cebe7f889cf437afb5951459120993f"} Feb 02 12:13:33 crc kubenswrapper[4909]: I0202 12:13:33.073616 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5476649989-mfkg8" Feb 02 12:13:33 crc kubenswrapper[4909]: I0202 12:13:33.131205 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-5476649989-mfkg8" podStartSLOduration=3.131189815 podStartE2EDuration="3.131189815s" podCreationTimestamp="2026-02-02 12:13:30 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:13:33.103269703 +0000 UTC m=+6138.849370438" watchObservedRunningTime="2026-02-02 12:13:33.131189815 +0000 UTC m=+6138.877290540" Feb 02 12:13:33 crc kubenswrapper[4909]: I0202 12:13:33.133294 4909 generic.go:334] "Generic (PLEG): container finished" podID="22709a9d-0d92-48ef-8967-588ecaf1b50b" containerID="2742ff993b9ddf74fb0a6b717468ef77757b92d1227fe87959dcd2c366e759f5" exitCode=1 Feb 02 12:13:33 crc kubenswrapper[4909]: I0202 12:13:33.134878 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5fbd846955-7fxfq" event={"ID":"22709a9d-0d92-48ef-8967-588ecaf1b50b","Type":"ContainerDied","Data":"2742ff993b9ddf74fb0a6b717468ef77757b92d1227fe87959dcd2c366e759f5"} Feb 02 12:13:33 crc kubenswrapper[4909]: I0202 12:13:33.134920 4909 scope.go:117] "RemoveContainer" containerID="b0d930a4e2ba61b768420a024e7ab2dda008f4aab987cc76c2a89bc9877b89f4" Feb 02 12:13:33 crc kubenswrapper[4909]: I0202 12:13:33.135581 4909 scope.go:117] "RemoveContainer" containerID="2742ff993b9ddf74fb0a6b717468ef77757b92d1227fe87959dcd2c366e759f5" Feb 02 12:13:33 crc kubenswrapper[4909]: E0202 12:13:33.145585 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-5fbd846955-7fxfq_openstack(22709a9d-0d92-48ef-8967-588ecaf1b50b)\"" pod="openstack/heat-api-5fbd846955-7fxfq" podUID="22709a9d-0d92-48ef-8967-588ecaf1b50b" Feb 02 12:13:33 crc kubenswrapper[4909]: I0202 12:13:33.202869 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5bd5f6b94d-f6rqx" Feb 02 12:13:33 crc kubenswrapper[4909]: I0202 12:13:33.202898 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5bd5f6b94d-f6rqx" 
event={"ID":"fae5779d-25ee-4282-9474-8306081d28b5","Type":"ContainerStarted","Data":"ac4bd2e28fc898a8e50d273876879325717bd342455bfb72e0e8221fe566f06e"} Feb 02 12:13:33 crc kubenswrapper[4909]: I0202 12:13:33.224861 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5bd5f6b94d-f6rqx" podStartSLOduration=3.224845283 podStartE2EDuration="3.224845283s" podCreationTimestamp="2026-02-02 12:13:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:13:33.223364221 +0000 UTC m=+6138.969464966" watchObservedRunningTime="2026-02-02 12:13:33.224845283 +0000 UTC m=+6138.970946018" Feb 02 12:13:33 crc kubenswrapper[4909]: I0202 12:13:33.406690 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mbdhn"] Feb 02 12:13:33 crc kubenswrapper[4909]: W0202 12:13:33.410076 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79d31cb4_9244_41d0_b49d_c49e0f5534a7.slice/crio-b56a8917309d74a54cf475f44b3491e75e8e722b6da8110d1290ae8c54f151d9 WatchSource:0}: Error finding container b56a8917309d74a54cf475f44b3491e75e8e722b6da8110d1290ae8c54f151d9: Status 404 returned error can't find the container with id b56a8917309d74a54cf475f44b3491e75e8e722b6da8110d1290ae8c54f151d9 Feb 02 12:13:34 crc kubenswrapper[4909]: I0202 12:13:34.212609 4909 generic.go:334] "Generic (PLEG): container finished" podID="d5ede287-b375-4953-b46c-bb7afb372891" containerID="9c168ef3d62c8984e694a539a0b378e74491cdecf8d3897783e77021c2fba293" exitCode=1 Feb 02 12:13:34 crc kubenswrapper[4909]: I0202 12:13:34.212698 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-77c65bc69c-n4w79" event={"ID":"d5ede287-b375-4953-b46c-bb7afb372891","Type":"ContainerDied","Data":"9c168ef3d62c8984e694a539a0b378e74491cdecf8d3897783e77021c2fba293"} 
Feb 02 12:13:34 crc kubenswrapper[4909]: I0202 12:13:34.213187 4909 scope.go:117] "RemoveContainer" containerID="8e9983198b05da440bb2328ed675982826cef37767d541b734bcc608baa1b3eb" Feb 02 12:13:34 crc kubenswrapper[4909]: I0202 12:13:34.213374 4909 scope.go:117] "RemoveContainer" containerID="9c168ef3d62c8984e694a539a0b378e74491cdecf8d3897783e77021c2fba293" Feb 02 12:13:34 crc kubenswrapper[4909]: E0202 12:13:34.213597 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-77c65bc69c-n4w79_openstack(d5ede287-b375-4953-b46c-bb7afb372891)\"" pod="openstack/heat-cfnapi-77c65bc69c-n4w79" podUID="d5ede287-b375-4953-b46c-bb7afb372891" Feb 02 12:13:34 crc kubenswrapper[4909]: I0202 12:13:34.216270 4909 generic.go:334] "Generic (PLEG): container finished" podID="79d31cb4-9244-41d0-b49d-c49e0f5534a7" containerID="7da288cabb0051a161429863fb022fb5d4e4b697754a33459489d2f0e28c7459" exitCode=0 Feb 02 12:13:34 crc kubenswrapper[4909]: I0202 12:13:34.216612 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbdhn" event={"ID":"79d31cb4-9244-41d0-b49d-c49e0f5534a7","Type":"ContainerDied","Data":"7da288cabb0051a161429863fb022fb5d4e4b697754a33459489d2f0e28c7459"} Feb 02 12:13:34 crc kubenswrapper[4909]: I0202 12:13:34.216661 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbdhn" event={"ID":"79d31cb4-9244-41d0-b49d-c49e0f5534a7","Type":"ContainerStarted","Data":"b56a8917309d74a54cf475f44b3491e75e8e722b6da8110d1290ae8c54f151d9"} Feb 02 12:13:34 crc kubenswrapper[4909]: I0202 12:13:34.221652 4909 scope.go:117] "RemoveContainer" containerID="2742ff993b9ddf74fb0a6b717468ef77757b92d1227fe87959dcd2c366e759f5" Feb 02 12:13:34 crc kubenswrapper[4909]: E0202 12:13:34.221839 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-5fbd846955-7fxfq_openstack(22709a9d-0d92-48ef-8967-588ecaf1b50b)\"" pod="openstack/heat-api-5fbd846955-7fxfq" podUID="22709a9d-0d92-48ef-8967-588ecaf1b50b" Feb 02 12:13:34 crc kubenswrapper[4909]: I0202 12:13:34.783299 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-5fbd846955-7fxfq" Feb 02 12:13:34 crc kubenswrapper[4909]: I0202 12:13:34.783346 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5fbd846955-7fxfq" Feb 02 12:13:34 crc kubenswrapper[4909]: I0202 12:13:34.798928 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-77c65bc69c-n4w79" Feb 02 12:13:34 crc kubenswrapper[4909]: I0202 12:13:34.799013 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-77c65bc69c-n4w79" Feb 02 12:13:35 crc kubenswrapper[4909]: I0202 12:13:35.234498 4909 scope.go:117] "RemoveContainer" containerID="2742ff993b9ddf74fb0a6b717468ef77757b92d1227fe87959dcd2c366e759f5" Feb 02 12:13:35 crc kubenswrapper[4909]: E0202 12:13:35.234802 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-5fbd846955-7fxfq_openstack(22709a9d-0d92-48ef-8967-588ecaf1b50b)\"" pod="openstack/heat-api-5fbd846955-7fxfq" podUID="22709a9d-0d92-48ef-8967-588ecaf1b50b" Feb 02 12:13:35 crc kubenswrapper[4909]: I0202 12:13:35.236377 4909 scope.go:117] "RemoveContainer" containerID="9c168ef3d62c8984e694a539a0b378e74491cdecf8d3897783e77021c2fba293" Feb 02 12:13:35 crc kubenswrapper[4909]: E0202 12:13:35.236685 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=heat-cfnapi pod=heat-cfnapi-77c65bc69c-n4w79_openstack(d5ede287-b375-4953-b46c-bb7afb372891)\"" pod="openstack/heat-cfnapi-77c65bc69c-n4w79" podUID="d5ede287-b375-4953-b46c-bb7afb372891" Feb 02 12:13:35 crc kubenswrapper[4909]: I0202 12:13:35.776028 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6d78c46cb4-52qcm" Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.087303 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-95f4d959b-pj66c" podUID="ca4ef3ac-ce80-439f-b041-abc4ff019f4e" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.1.124:8000/healthcheck\": read tcp 10.217.0.2:34472->10.217.1.124:8000: read: connection reset by peer" Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.195687 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-7678cc9877-s78fm" podUID="9aaf5406-b4d0-4d24-995b-2185ac9edffb" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.1.123:8004/healthcheck\": read tcp 10.217.0.2:50312->10.217.1.123:8004: read: connection reset by peer" Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.268966 4909 generic.go:334] "Generic (PLEG): container finished" podID="9aaf5406-b4d0-4d24-995b-2185ac9edffb" containerID="c0d32ae889bcdd2f6d376279c5e1720cf3e91246bf297cc07f703849b09b1d4c" exitCode=0 Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.269292 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7678cc9877-s78fm" event={"ID":"9aaf5406-b4d0-4d24-995b-2185ac9edffb","Type":"ContainerDied","Data":"c0d32ae889bcdd2f6d376279c5e1720cf3e91246bf297cc07f703849b09b1d4c"} Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.271597 4909 generic.go:334] "Generic (PLEG): container finished" podID="79d31cb4-9244-41d0-b49d-c49e0f5534a7" containerID="f85a23472c8fbff2be548289b891e068417c6e870773040eff8852255fb2a595" exitCode=0 Feb 02 12:13:36 crc 
kubenswrapper[4909]: I0202 12:13:36.271669 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbdhn" event={"ID":"79d31cb4-9244-41d0-b49d-c49e0f5534a7","Type":"ContainerDied","Data":"f85a23472c8fbff2be548289b891e068417c6e870773040eff8852255fb2a595"} Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.276661 4909 generic.go:334] "Generic (PLEG): container finished" podID="ca4ef3ac-ce80-439f-b041-abc4ff019f4e" containerID="1f0cda32a2b9188f387a823f39f3c3e6ef06e199f61b7b23a82817112b28f1f5" exitCode=0 Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.277386 4909 scope.go:117] "RemoveContainer" containerID="9c168ef3d62c8984e694a539a0b378e74491cdecf8d3897783e77021c2fba293" Feb 02 12:13:36 crc kubenswrapper[4909]: E0202 12:13:36.277641 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-77c65bc69c-n4w79_openstack(d5ede287-b375-4953-b46c-bb7afb372891)\"" pod="openstack/heat-cfnapi-77c65bc69c-n4w79" podUID="d5ede287-b375-4953-b46c-bb7afb372891" Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.277713 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-95f4d959b-pj66c" event={"ID":"ca4ef3ac-ce80-439f-b041-abc4ff019f4e","Type":"ContainerDied","Data":"1f0cda32a2b9188f387a823f39f3c3e6ef06e199f61b7b23a82817112b28f1f5"} Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.635173 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-95f4d959b-pj66c" Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.641430 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-7678cc9877-s78fm" Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.769117 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aaf5406-b4d0-4d24-995b-2185ac9edffb-config-data\") pod \"9aaf5406-b4d0-4d24-995b-2185ac9edffb\" (UID: \"9aaf5406-b4d0-4d24-995b-2185ac9edffb\") " Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.769499 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wcdj\" (UniqueName: \"kubernetes.io/projected/9aaf5406-b4d0-4d24-995b-2185ac9edffb-kube-api-access-6wcdj\") pod \"9aaf5406-b4d0-4d24-995b-2185ac9edffb\" (UID: \"9aaf5406-b4d0-4d24-995b-2185ac9edffb\") " Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.769524 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlsvg\" (UniqueName: \"kubernetes.io/projected/ca4ef3ac-ce80-439f-b041-abc4ff019f4e-kube-api-access-vlsvg\") pod \"ca4ef3ac-ce80-439f-b041-abc4ff019f4e\" (UID: \"ca4ef3ac-ce80-439f-b041-abc4ff019f4e\") " Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.769572 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca4ef3ac-ce80-439f-b041-abc4ff019f4e-config-data-custom\") pod \"ca4ef3ac-ce80-439f-b041-abc4ff019f4e\" (UID: \"ca4ef3ac-ce80-439f-b041-abc4ff019f4e\") " Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.769658 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca4ef3ac-ce80-439f-b041-abc4ff019f4e-config-data\") pod \"ca4ef3ac-ce80-439f-b041-abc4ff019f4e\" (UID: \"ca4ef3ac-ce80-439f-b041-abc4ff019f4e\") " Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.769721 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4ef3ac-ce80-439f-b041-abc4ff019f4e-combined-ca-bundle\") pod \"ca4ef3ac-ce80-439f-b041-abc4ff019f4e\" (UID: \"ca4ef3ac-ce80-439f-b041-abc4ff019f4e\") " Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.769765 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9aaf5406-b4d0-4d24-995b-2185ac9edffb-config-data-custom\") pod \"9aaf5406-b4d0-4d24-995b-2185ac9edffb\" (UID: \"9aaf5406-b4d0-4d24-995b-2185ac9edffb\") " Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.769852 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aaf5406-b4d0-4d24-995b-2185ac9edffb-combined-ca-bundle\") pod \"9aaf5406-b4d0-4d24-995b-2185ac9edffb\" (UID: \"9aaf5406-b4d0-4d24-995b-2185ac9edffb\") " Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.775278 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aaf5406-b4d0-4d24-995b-2185ac9edffb-kube-api-access-6wcdj" (OuterVolumeSpecName: "kube-api-access-6wcdj") pod "9aaf5406-b4d0-4d24-995b-2185ac9edffb" (UID: "9aaf5406-b4d0-4d24-995b-2185ac9edffb"). InnerVolumeSpecName "kube-api-access-6wcdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.776987 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca4ef3ac-ce80-439f-b041-abc4ff019f4e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ca4ef3ac-ce80-439f-b041-abc4ff019f4e" (UID: "ca4ef3ac-ce80-439f-b041-abc4ff019f4e"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.778332 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca4ef3ac-ce80-439f-b041-abc4ff019f4e-kube-api-access-vlsvg" (OuterVolumeSpecName: "kube-api-access-vlsvg") pod "ca4ef3ac-ce80-439f-b041-abc4ff019f4e" (UID: "ca4ef3ac-ce80-439f-b041-abc4ff019f4e"). InnerVolumeSpecName "kube-api-access-vlsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.788186 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aaf5406-b4d0-4d24-995b-2185ac9edffb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9aaf5406-b4d0-4d24-995b-2185ac9edffb" (UID: "9aaf5406-b4d0-4d24-995b-2185ac9edffb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.802037 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca4ef3ac-ce80-439f-b041-abc4ff019f4e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca4ef3ac-ce80-439f-b041-abc4ff019f4e" (UID: "ca4ef3ac-ce80-439f-b041-abc4ff019f4e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.813951 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aaf5406-b4d0-4d24-995b-2185ac9edffb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9aaf5406-b4d0-4d24-995b-2185ac9edffb" (UID: "9aaf5406-b4d0-4d24-995b-2185ac9edffb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.832854 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aaf5406-b4d0-4d24-995b-2185ac9edffb-config-data" (OuterVolumeSpecName: "config-data") pod "9aaf5406-b4d0-4d24-995b-2185ac9edffb" (UID: "9aaf5406-b4d0-4d24-995b-2185ac9edffb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.843209 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca4ef3ac-ce80-439f-b041-abc4ff019f4e-config-data" (OuterVolumeSpecName: "config-data") pod "ca4ef3ac-ce80-439f-b041-abc4ff019f4e" (UID: "ca4ef3ac-ce80-439f-b041-abc4ff019f4e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.872613 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aaf5406-b4d0-4d24-995b-2185ac9edffb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.872643 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aaf5406-b4d0-4d24-995b-2185ac9edffb-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.872652 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wcdj\" (UniqueName: \"kubernetes.io/projected/9aaf5406-b4d0-4d24-995b-2185ac9edffb-kube-api-access-6wcdj\") on node \"crc\" DevicePath \"\"" Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.872666 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlsvg\" (UniqueName: \"kubernetes.io/projected/ca4ef3ac-ce80-439f-b041-abc4ff019f4e-kube-api-access-vlsvg\") on node \"crc\" DevicePath 
\"\"" Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.872674 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca4ef3ac-ce80-439f-b041-abc4ff019f4e-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.872683 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca4ef3ac-ce80-439f-b041-abc4ff019f4e-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.872690 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4ef3ac-ce80-439f-b041-abc4ff019f4e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:13:36 crc kubenswrapper[4909]: I0202 12:13:36.872720 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9aaf5406-b4d0-4d24-995b-2185ac9edffb-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 12:13:37 crc kubenswrapper[4909]: I0202 12:13:37.287279 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbdhn" event={"ID":"79d31cb4-9244-41d0-b49d-c49e0f5534a7","Type":"ContainerStarted","Data":"72882e55a0328cad1ba98adaf9846bd3856d646e3f71d9de19bd8c97d4bb8199"} Feb 02 12:13:37 crc kubenswrapper[4909]: I0202 12:13:37.290556 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-95f4d959b-pj66c" event={"ID":"ca4ef3ac-ce80-439f-b041-abc4ff019f4e","Type":"ContainerDied","Data":"fc02c21ee96f2607a30808d9120215ed7b775d564eeed0b9595f91e9a07a014d"} Feb 02 12:13:37 crc kubenswrapper[4909]: I0202 12:13:37.290592 4909 scope.go:117] "RemoveContainer" containerID="1f0cda32a2b9188f387a823f39f3c3e6ef06e199f61b7b23a82817112b28f1f5" Feb 02 12:13:37 crc kubenswrapper[4909]: I0202 12:13:37.290662 4909 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-95f4d959b-pj66c" Feb 02 12:13:37 crc kubenswrapper[4909]: I0202 12:13:37.294892 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7678cc9877-s78fm" event={"ID":"9aaf5406-b4d0-4d24-995b-2185ac9edffb","Type":"ContainerDied","Data":"2b741c12f48416e4a242875aff93edd96fabbcc45ec4305e8e48d5801cc56eb5"} Feb 02 12:13:37 crc kubenswrapper[4909]: I0202 12:13:37.294948 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7678cc9877-s78fm" Feb 02 12:13:37 crc kubenswrapper[4909]: I0202 12:13:37.313001 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mbdhn" podStartSLOduration=2.750157799 podStartE2EDuration="5.31297612s" podCreationTimestamp="2026-02-02 12:13:32 +0000 UTC" firstStartedPulling="2026-02-02 12:13:34.219451275 +0000 UTC m=+6139.965552010" lastFinishedPulling="2026-02-02 12:13:36.782269596 +0000 UTC m=+6142.528370331" observedRunningTime="2026-02-02 12:13:37.311052855 +0000 UTC m=+6143.057153850" watchObservedRunningTime="2026-02-02 12:13:37.31297612 +0000 UTC m=+6143.059076865" Feb 02 12:13:37 crc kubenswrapper[4909]: I0202 12:13:37.328015 4909 scope.go:117] "RemoveContainer" containerID="c0d32ae889bcdd2f6d376279c5e1720cf3e91246bf297cc07f703849b09b1d4c" Feb 02 12:13:37 crc kubenswrapper[4909]: I0202 12:13:37.337884 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-95f4d959b-pj66c"] Feb 02 12:13:37 crc kubenswrapper[4909]: I0202 12:13:37.347904 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-95f4d959b-pj66c"] Feb 02 12:13:37 crc kubenswrapper[4909]: I0202 12:13:37.358474 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7678cc9877-s78fm"] Feb 02 12:13:37 crc kubenswrapper[4909]: I0202 12:13:37.373978 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/heat-api-7678cc9877-s78fm"] Feb 02 12:13:37 crc kubenswrapper[4909]: I0202 12:13:37.499407 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6d78c46cb4-52qcm" Feb 02 12:13:37 crc kubenswrapper[4909]: I0202 12:13:37.582013 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-64775b7466-p6d8c"] Feb 02 12:13:37 crc kubenswrapper[4909]: I0202 12:13:37.582408 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-64775b7466-p6d8c" podUID="20c51ec9-94ae-4b2a-b16e-15877df6b3e6" containerName="horizon-log" containerID="cri-o://1339be205d32f9ae3651f10f23090655a6a424a80731083143e8a4f4eb39b519" gracePeriod=30 Feb 02 12:13:37 crc kubenswrapper[4909]: I0202 12:13:37.582722 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-64775b7466-p6d8c" podUID="20c51ec9-94ae-4b2a-b16e-15877df6b3e6" containerName="horizon" containerID="cri-o://bae7e028e4266fea529772f976481a28252d0e91ca7ab2592827f2f5863017f3" gracePeriod=30 Feb 02 12:13:38 crc kubenswrapper[4909]: I0202 12:13:38.134309 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qm4n9"] Feb 02 12:13:38 crc kubenswrapper[4909]: E0202 12:13:38.134857 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aaf5406-b4d0-4d24-995b-2185ac9edffb" containerName="heat-api" Feb 02 12:13:38 crc kubenswrapper[4909]: I0202 12:13:38.134881 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aaf5406-b4d0-4d24-995b-2185ac9edffb" containerName="heat-api" Feb 02 12:13:38 crc kubenswrapper[4909]: E0202 12:13:38.134913 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca4ef3ac-ce80-439f-b041-abc4ff019f4e" containerName="heat-cfnapi" Feb 02 12:13:38 crc kubenswrapper[4909]: I0202 12:13:38.134922 4909 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ca4ef3ac-ce80-439f-b041-abc4ff019f4e" containerName="heat-cfnapi" Feb 02 12:13:38 crc kubenswrapper[4909]: I0202 12:13:38.135149 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aaf5406-b4d0-4d24-995b-2185ac9edffb" containerName="heat-api" Feb 02 12:13:38 crc kubenswrapper[4909]: I0202 12:13:38.135190 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca4ef3ac-ce80-439f-b041-abc4ff019f4e" containerName="heat-cfnapi" Feb 02 12:13:38 crc kubenswrapper[4909]: I0202 12:13:38.136749 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qm4n9" Feb 02 12:13:38 crc kubenswrapper[4909]: I0202 12:13:38.146383 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qm4n9"] Feb 02 12:13:38 crc kubenswrapper[4909]: I0202 12:13:38.318774 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5bcc30a-426d-49c9-acf6-983e0a195c2c-catalog-content\") pod \"certified-operators-qm4n9\" (UID: \"e5bcc30a-426d-49c9-acf6-983e0a195c2c\") " pod="openshift-marketplace/certified-operators-qm4n9" Feb 02 12:13:38 crc kubenswrapper[4909]: I0202 12:13:38.318898 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5bcc30a-426d-49c9-acf6-983e0a195c2c-utilities\") pod \"certified-operators-qm4n9\" (UID: \"e5bcc30a-426d-49c9-acf6-983e0a195c2c\") " pod="openshift-marketplace/certified-operators-qm4n9" Feb 02 12:13:38 crc kubenswrapper[4909]: I0202 12:13:38.319023 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxxwq\" (UniqueName: \"kubernetes.io/projected/e5bcc30a-426d-49c9-acf6-983e0a195c2c-kube-api-access-dxxwq\") pod \"certified-operators-qm4n9\" (UID: 
\"e5bcc30a-426d-49c9-acf6-983e0a195c2c\") " pod="openshift-marketplace/certified-operators-qm4n9" Feb 02 12:13:38 crc kubenswrapper[4909]: I0202 12:13:38.421307 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5bcc30a-426d-49c9-acf6-983e0a195c2c-catalog-content\") pod \"certified-operators-qm4n9\" (UID: \"e5bcc30a-426d-49c9-acf6-983e0a195c2c\") " pod="openshift-marketplace/certified-operators-qm4n9" Feb 02 12:13:38 crc kubenswrapper[4909]: I0202 12:13:38.421418 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5bcc30a-426d-49c9-acf6-983e0a195c2c-utilities\") pod \"certified-operators-qm4n9\" (UID: \"e5bcc30a-426d-49c9-acf6-983e0a195c2c\") " pod="openshift-marketplace/certified-operators-qm4n9" Feb 02 12:13:38 crc kubenswrapper[4909]: I0202 12:13:38.421480 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxxwq\" (UniqueName: \"kubernetes.io/projected/e5bcc30a-426d-49c9-acf6-983e0a195c2c-kube-api-access-dxxwq\") pod \"certified-operators-qm4n9\" (UID: \"e5bcc30a-426d-49c9-acf6-983e0a195c2c\") " pod="openshift-marketplace/certified-operators-qm4n9" Feb 02 12:13:38 crc kubenswrapper[4909]: I0202 12:13:38.421942 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5bcc30a-426d-49c9-acf6-983e0a195c2c-utilities\") pod \"certified-operators-qm4n9\" (UID: \"e5bcc30a-426d-49c9-acf6-983e0a195c2c\") " pod="openshift-marketplace/certified-operators-qm4n9" Feb 02 12:13:38 crc kubenswrapper[4909]: I0202 12:13:38.422145 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5bcc30a-426d-49c9-acf6-983e0a195c2c-catalog-content\") pod \"certified-operators-qm4n9\" (UID: \"e5bcc30a-426d-49c9-acf6-983e0a195c2c\") 
" pod="openshift-marketplace/certified-operators-qm4n9" Feb 02 12:13:38 crc kubenswrapper[4909]: I0202 12:13:38.447673 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxxwq\" (UniqueName: \"kubernetes.io/projected/e5bcc30a-426d-49c9-acf6-983e0a195c2c-kube-api-access-dxxwq\") pod \"certified-operators-qm4n9\" (UID: \"e5bcc30a-426d-49c9-acf6-983e0a195c2c\") " pod="openshift-marketplace/certified-operators-qm4n9" Feb 02 12:13:38 crc kubenswrapper[4909]: I0202 12:13:38.474431 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qm4n9" Feb 02 12:13:39 crc kubenswrapper[4909]: I0202 12:13:39.028080 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aaf5406-b4d0-4d24-995b-2185ac9edffb" path="/var/lib/kubelet/pods/9aaf5406-b4d0-4d24-995b-2185ac9edffb/volumes" Feb 02 12:13:39 crc kubenswrapper[4909]: I0202 12:13:39.029474 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca4ef3ac-ce80-439f-b041-abc4ff019f4e" path="/var/lib/kubelet/pods/ca4ef3ac-ce80-439f-b041-abc4ff019f4e/volumes" Feb 02 12:13:39 crc kubenswrapper[4909]: W0202 12:13:39.581785 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5bcc30a_426d_49c9_acf6_983e0a195c2c.slice/crio-90b809ad9e3d49d754b09f5be6e2988fac7da95d2fc422a0c6827b12f39eb1d9 WatchSource:0}: Error finding container 90b809ad9e3d49d754b09f5be6e2988fac7da95d2fc422a0c6827b12f39eb1d9: Status 404 returned error can't find the container with id 90b809ad9e3d49d754b09f5be6e2988fac7da95d2fc422a0c6827b12f39eb1d9 Feb 02 12:13:39 crc kubenswrapper[4909]: I0202 12:13:39.582207 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qm4n9"] Feb 02 12:13:40 crc kubenswrapper[4909]: I0202 12:13:40.333406 4909 generic.go:334] "Generic (PLEG): container finished" 
podID="e5bcc30a-426d-49c9-acf6-983e0a195c2c" containerID="0c60b6d2bc68cb2195dd88439df17c0b745bcd0f3661440da6ebd1681de2779a" exitCode=0 Feb 02 12:13:40 crc kubenswrapper[4909]: I0202 12:13:40.333455 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qm4n9" event={"ID":"e5bcc30a-426d-49c9-acf6-983e0a195c2c","Type":"ContainerDied","Data":"0c60b6d2bc68cb2195dd88439df17c0b745bcd0f3661440da6ebd1681de2779a"} Feb 02 12:13:40 crc kubenswrapper[4909]: I0202 12:13:40.333482 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qm4n9" event={"ID":"e5bcc30a-426d-49c9-acf6-983e0a195c2c","Type":"ContainerStarted","Data":"90b809ad9e3d49d754b09f5be6e2988fac7da95d2fc422a0c6827b12f39eb1d9"} Feb 02 12:13:40 crc kubenswrapper[4909]: I0202 12:13:40.711069 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-64775b7466-p6d8c" podUID="20c51ec9-94ae-4b2a-b16e-15877df6b3e6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.115:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:45782->10.217.1.115:8443: read: connection reset by peer" Feb 02 12:13:41 crc kubenswrapper[4909]: I0202 12:13:41.352975 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qm4n9" event={"ID":"e5bcc30a-426d-49c9-acf6-983e0a195c2c","Type":"ContainerStarted","Data":"04955cec97f877c60559027092da3c5df9162905bc11b458414155d41701aa1d"} Feb 02 12:13:41 crc kubenswrapper[4909]: I0202 12:13:41.356708 4909 generic.go:334] "Generic (PLEG): container finished" podID="20c51ec9-94ae-4b2a-b16e-15877df6b3e6" containerID="bae7e028e4266fea529772f976481a28252d0e91ca7ab2592827f2f5863017f3" exitCode=0 Feb 02 12:13:41 crc kubenswrapper[4909]: I0202 12:13:41.356753 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64775b7466-p6d8c" 
event={"ID":"20c51ec9-94ae-4b2a-b16e-15877df6b3e6","Type":"ContainerDied","Data":"bae7e028e4266fea529772f976481a28252d0e91ca7ab2592827f2f5863017f3"} Feb 02 12:13:42 crc kubenswrapper[4909]: I0202 12:13:42.204221 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-54cb865ffb-9dzj7" Feb 02 12:13:42 crc kubenswrapper[4909]: I0202 12:13:42.370882 4909 generic.go:334] "Generic (PLEG): container finished" podID="e5bcc30a-426d-49c9-acf6-983e0a195c2c" containerID="04955cec97f877c60559027092da3c5df9162905bc11b458414155d41701aa1d" exitCode=0 Feb 02 12:13:42 crc kubenswrapper[4909]: I0202 12:13:42.370926 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qm4n9" event={"ID":"e5bcc30a-426d-49c9-acf6-983e0a195c2c","Type":"ContainerDied","Data":"04955cec97f877c60559027092da3c5df9162905bc11b458414155d41701aa1d"} Feb 02 12:13:42 crc kubenswrapper[4909]: I0202 12:13:42.526792 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-5bd5f6b94d-f6rqx" Feb 02 12:13:42 crc kubenswrapper[4909]: I0202 12:13:42.532777 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-5476649989-mfkg8" Feb 02 12:13:42 crc kubenswrapper[4909]: I0202 12:13:42.621096 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5fbd846955-7fxfq"] Feb 02 12:13:42 crc kubenswrapper[4909]: I0202 12:13:42.634255 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-77c65bc69c-n4w79"] Feb 02 12:13:42 crc kubenswrapper[4909]: I0202 12:13:42.694240 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mbdhn" Feb 02 12:13:42 crc kubenswrapper[4909]: I0202 12:13:42.694294 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mbdhn" Feb 02 12:13:42 crc 
kubenswrapper[4909]: I0202 12:13:42.764374 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mbdhn" Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.213144 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5fbd846955-7fxfq" Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.218327 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-77c65bc69c-n4w79" Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.335515 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ede287-b375-4953-b46c-bb7afb372891-config-data\") pod \"d5ede287-b375-4953-b46c-bb7afb372891\" (UID: \"d5ede287-b375-4953-b46c-bb7afb372891\") " Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.335656 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22709a9d-0d92-48ef-8967-588ecaf1b50b-config-data-custom\") pod \"22709a9d-0d92-48ef-8967-588ecaf1b50b\" (UID: \"22709a9d-0d92-48ef-8967-588ecaf1b50b\") " Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.335685 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22709a9d-0d92-48ef-8967-588ecaf1b50b-combined-ca-bundle\") pod \"22709a9d-0d92-48ef-8967-588ecaf1b50b\" (UID: \"22709a9d-0d92-48ef-8967-588ecaf1b50b\") " Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.336417 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22709a9d-0d92-48ef-8967-588ecaf1b50b-config-data\") pod \"22709a9d-0d92-48ef-8967-588ecaf1b50b\" (UID: \"22709a9d-0d92-48ef-8967-588ecaf1b50b\") " Feb 02 12:13:43 crc kubenswrapper[4909]: 
I0202 12:13:43.336472 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47p8z\" (UniqueName: \"kubernetes.io/projected/22709a9d-0d92-48ef-8967-588ecaf1b50b-kube-api-access-47p8z\") pod \"22709a9d-0d92-48ef-8967-588ecaf1b50b\" (UID: \"22709a9d-0d92-48ef-8967-588ecaf1b50b\") " Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.336513 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ede287-b375-4953-b46c-bb7afb372891-combined-ca-bundle\") pod \"d5ede287-b375-4953-b46c-bb7afb372891\" (UID: \"d5ede287-b375-4953-b46c-bb7afb372891\") " Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.336579 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssjx6\" (UniqueName: \"kubernetes.io/projected/d5ede287-b375-4953-b46c-bb7afb372891-kube-api-access-ssjx6\") pod \"d5ede287-b375-4953-b46c-bb7afb372891\" (UID: \"d5ede287-b375-4953-b46c-bb7afb372891\") " Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.336604 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5ede287-b375-4953-b46c-bb7afb372891-config-data-custom\") pod \"d5ede287-b375-4953-b46c-bb7afb372891\" (UID: \"d5ede287-b375-4953-b46c-bb7afb372891\") " Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.341397 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ede287-b375-4953-b46c-bb7afb372891-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d5ede287-b375-4953-b46c-bb7afb372891" (UID: "d5ede287-b375-4953-b46c-bb7afb372891"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.341542 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22709a9d-0d92-48ef-8967-588ecaf1b50b-kube-api-access-47p8z" (OuterVolumeSpecName: "kube-api-access-47p8z") pod "22709a9d-0d92-48ef-8967-588ecaf1b50b" (UID: "22709a9d-0d92-48ef-8967-588ecaf1b50b"). InnerVolumeSpecName "kube-api-access-47p8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.341882 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5ede287-b375-4953-b46c-bb7afb372891-kube-api-access-ssjx6" (OuterVolumeSpecName: "kube-api-access-ssjx6") pod "d5ede287-b375-4953-b46c-bb7afb372891" (UID: "d5ede287-b375-4953-b46c-bb7afb372891"). InnerVolumeSpecName "kube-api-access-ssjx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.342000 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22709a9d-0d92-48ef-8967-588ecaf1b50b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "22709a9d-0d92-48ef-8967-588ecaf1b50b" (UID: "22709a9d-0d92-48ef-8967-588ecaf1b50b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.366528 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ede287-b375-4953-b46c-bb7afb372891-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5ede287-b375-4953-b46c-bb7afb372891" (UID: "d5ede287-b375-4953-b46c-bb7afb372891"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.376265 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22709a9d-0d92-48ef-8967-588ecaf1b50b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22709a9d-0d92-48ef-8967-588ecaf1b50b" (UID: "22709a9d-0d92-48ef-8967-588ecaf1b50b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.387979 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-77c65bc69c-n4w79" Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.387979 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-77c65bc69c-n4w79" event={"ID":"d5ede287-b375-4953-b46c-bb7afb372891","Type":"ContainerDied","Data":"96a3853d21ae7646a8be4388932779cc62f8c846324b7ba35c129111a54ac2e3"} Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.388093 4909 scope.go:117] "RemoveContainer" containerID="9c168ef3d62c8984e694a539a0b378e74491cdecf8d3897783e77021c2fba293" Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.391786 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qm4n9" event={"ID":"e5bcc30a-426d-49c9-acf6-983e0a195c2c","Type":"ContainerStarted","Data":"61c234a311d26a642aec0bb2edcbd2aad689d1222b48471d9ecb24e95eb1b10b"} Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.393042 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5ede287-b375-4953-b46c-bb7afb372891-config-data" (OuterVolumeSpecName: "config-data") pod "d5ede287-b375-4953-b46c-bb7afb372891" (UID: "d5ede287-b375-4953-b46c-bb7afb372891"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.396925 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5fbd846955-7fxfq" event={"ID":"22709a9d-0d92-48ef-8967-588ecaf1b50b","Type":"ContainerDied","Data":"b27751aba3b2812a0138ffa9de3c8fe3fa5cdb121057850d6077983d07287c0d"} Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.397054 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5fbd846955-7fxfq" Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.404469 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22709a9d-0d92-48ef-8967-588ecaf1b50b-config-data" (OuterVolumeSpecName: "config-data") pod "22709a9d-0d92-48ef-8967-588ecaf1b50b" (UID: "22709a9d-0d92-48ef-8967-588ecaf1b50b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.417657 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qm4n9" podStartSLOduration=2.966037795 podStartE2EDuration="5.41761262s" podCreationTimestamp="2026-02-02 12:13:38 +0000 UTC" firstStartedPulling="2026-02-02 12:13:40.335429767 +0000 UTC m=+6146.081530502" lastFinishedPulling="2026-02-02 12:13:42.787004592 +0000 UTC m=+6148.533105327" observedRunningTime="2026-02-02 12:13:43.407131102 +0000 UTC m=+6149.153231837" watchObservedRunningTime="2026-02-02 12:13:43.41761262 +0000 UTC m=+6149.163713355" Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.425360 4909 scope.go:117] "RemoveContainer" containerID="2742ff993b9ddf74fb0a6b717468ef77757b92d1227fe87959dcd2c366e759f5" Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.439833 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssjx6\" (UniqueName: 
\"kubernetes.io/projected/d5ede287-b375-4953-b46c-bb7afb372891-kube-api-access-ssjx6\") on node \"crc\" DevicePath \"\"" Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.439881 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5ede287-b375-4953-b46c-bb7afb372891-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.439895 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5ede287-b375-4953-b46c-bb7afb372891-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.439905 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/22709a9d-0d92-48ef-8967-588ecaf1b50b-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.439914 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22709a9d-0d92-48ef-8967-588ecaf1b50b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.439922 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22709a9d-0d92-48ef-8967-588ecaf1b50b-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.439930 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47p8z\" (UniqueName: \"kubernetes.io/projected/22709a9d-0d92-48ef-8967-588ecaf1b50b-kube-api-access-47p8z\") on node \"crc\" DevicePath \"\"" Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.439938 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5ede287-b375-4953-b46c-bb7afb372891-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.461346 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mbdhn" Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.771966 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-77c65bc69c-n4w79"] Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.799790 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-77c65bc69c-n4w79"] Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.813187 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5fbd846955-7fxfq"] Feb 02 12:13:43 crc kubenswrapper[4909]: I0202 12:13:43.824513 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-5fbd846955-7fxfq"] Feb 02 12:13:45 crc kubenswrapper[4909]: I0202 12:13:45.031418 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22709a9d-0d92-48ef-8967-588ecaf1b50b" path="/var/lib/kubelet/pods/22709a9d-0d92-48ef-8967-588ecaf1b50b/volumes" Feb 02 12:13:45 crc kubenswrapper[4909]: I0202 12:13:45.032875 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5ede287-b375-4953-b46c-bb7afb372891" path="/var/lib/kubelet/pods/d5ede287-b375-4953-b46c-bb7afb372891/volumes" Feb 02 12:13:45 crc kubenswrapper[4909]: I0202 12:13:45.128565 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mbdhn"] Feb 02 12:13:45 crc kubenswrapper[4909]: I0202 12:13:45.421831 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mbdhn" podUID="79d31cb4-9244-41d0-b49d-c49e0f5534a7" containerName="registry-server" containerID="cri-o://72882e55a0328cad1ba98adaf9846bd3856d646e3f71d9de19bd8c97d4bb8199" gracePeriod=2 Feb 02 12:13:46 crc kubenswrapper[4909]: I0202 12:13:46.448637 4909 generic.go:334] "Generic 
(PLEG): container finished" podID="79d31cb4-9244-41d0-b49d-c49e0f5534a7" containerID="72882e55a0328cad1ba98adaf9846bd3856d646e3f71d9de19bd8c97d4bb8199" exitCode=0 Feb 02 12:13:46 crc kubenswrapper[4909]: I0202 12:13:46.448740 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbdhn" event={"ID":"79d31cb4-9244-41d0-b49d-c49e0f5534a7","Type":"ContainerDied","Data":"72882e55a0328cad1ba98adaf9846bd3856d646e3f71d9de19bd8c97d4bb8199"} Feb 02 12:13:46 crc kubenswrapper[4909]: I0202 12:13:46.449182 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mbdhn" event={"ID":"79d31cb4-9244-41d0-b49d-c49e0f5534a7","Type":"ContainerDied","Data":"b56a8917309d74a54cf475f44b3491e75e8e722b6da8110d1290ae8c54f151d9"} Feb 02 12:13:46 crc kubenswrapper[4909]: I0202 12:13:46.449241 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b56a8917309d74a54cf475f44b3491e75e8e722b6da8110d1290ae8c54f151d9" Feb 02 12:13:46 crc kubenswrapper[4909]: I0202 12:13:46.490644 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mbdhn" Feb 02 12:13:46 crc kubenswrapper[4909]: I0202 12:13:46.629221 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dckhl\" (UniqueName: \"kubernetes.io/projected/79d31cb4-9244-41d0-b49d-c49e0f5534a7-kube-api-access-dckhl\") pod \"79d31cb4-9244-41d0-b49d-c49e0f5534a7\" (UID: \"79d31cb4-9244-41d0-b49d-c49e0f5534a7\") " Feb 02 12:13:46 crc kubenswrapper[4909]: I0202 12:13:46.630031 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79d31cb4-9244-41d0-b49d-c49e0f5534a7-catalog-content\") pod \"79d31cb4-9244-41d0-b49d-c49e0f5534a7\" (UID: \"79d31cb4-9244-41d0-b49d-c49e0f5534a7\") " Feb 02 12:13:46 crc kubenswrapper[4909]: I0202 12:13:46.632174 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79d31cb4-9244-41d0-b49d-c49e0f5534a7-utilities\") pod \"79d31cb4-9244-41d0-b49d-c49e0f5534a7\" (UID: \"79d31cb4-9244-41d0-b49d-c49e0f5534a7\") " Feb 02 12:13:46 crc kubenswrapper[4909]: I0202 12:13:46.633468 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79d31cb4-9244-41d0-b49d-c49e0f5534a7-utilities" (OuterVolumeSpecName: "utilities") pod "79d31cb4-9244-41d0-b49d-c49e0f5534a7" (UID: "79d31cb4-9244-41d0-b49d-c49e0f5534a7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:13:46 crc kubenswrapper[4909]: I0202 12:13:46.635430 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79d31cb4-9244-41d0-b49d-c49e0f5534a7-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 12:13:46 crc kubenswrapper[4909]: I0202 12:13:46.645477 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79d31cb4-9244-41d0-b49d-c49e0f5534a7-kube-api-access-dckhl" (OuterVolumeSpecName: "kube-api-access-dckhl") pod "79d31cb4-9244-41d0-b49d-c49e0f5534a7" (UID: "79d31cb4-9244-41d0-b49d-c49e0f5534a7"). InnerVolumeSpecName "kube-api-access-dckhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:13:46 crc kubenswrapper[4909]: I0202 12:13:46.682606 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79d31cb4-9244-41d0-b49d-c49e0f5534a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79d31cb4-9244-41d0-b49d-c49e0f5534a7" (UID: "79d31cb4-9244-41d0-b49d-c49e0f5534a7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:13:46 crc kubenswrapper[4909]: I0202 12:13:46.737531 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79d31cb4-9244-41d0-b49d-c49e0f5534a7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 12:13:46 crc kubenswrapper[4909]: I0202 12:13:46.737557 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dckhl\" (UniqueName: \"kubernetes.io/projected/79d31cb4-9244-41d0-b49d-c49e0f5534a7-kube-api-access-dckhl\") on node \"crc\" DevicePath \"\"" Feb 02 12:13:47 crc kubenswrapper[4909]: I0202 12:13:47.458622 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mbdhn" Feb 02 12:13:47 crc kubenswrapper[4909]: I0202 12:13:47.515801 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mbdhn"] Feb 02 12:13:47 crc kubenswrapper[4909]: I0202 12:13:47.530124 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mbdhn"] Feb 02 12:13:48 crc kubenswrapper[4909]: I0202 12:13:48.257377 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-64775b7466-p6d8c" podUID="20c51ec9-94ae-4b2a-b16e-15877df6b3e6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.115:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.115:8443: connect: connection refused" Feb 02 12:13:48 crc kubenswrapper[4909]: I0202 12:13:48.474881 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qm4n9" Feb 02 12:13:48 crc kubenswrapper[4909]: I0202 12:13:48.474933 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qm4n9" Feb 02 12:13:48 crc kubenswrapper[4909]: I0202 12:13:48.595115 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qm4n9" Feb 02 12:13:49 crc kubenswrapper[4909]: I0202 12:13:49.027840 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79d31cb4-9244-41d0-b49d-c49e0f5534a7" path="/var/lib/kubelet/pods/79d31cb4-9244-41d0-b49d-c49e0f5534a7/volumes" Feb 02 12:13:49 crc kubenswrapper[4909]: I0202 12:13:49.510710 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 
12:13:49 crc kubenswrapper[4909]: I0202 12:13:49.510786 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:13:49 crc kubenswrapper[4909]: I0202 12:13:49.520919 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qm4n9" Feb 02 12:13:49 crc kubenswrapper[4909]: I0202 12:13:49.796431 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-9478bc769-cxwt5" Feb 02 12:13:49 crc kubenswrapper[4909]: I0202 12:13:49.846904 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-54cb865ffb-9dzj7"] Feb 02 12:13:49 crc kubenswrapper[4909]: I0202 12:13:49.847138 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-54cb865ffb-9dzj7" podUID="f733c718-d19b-490f-8fc8-a91fc83f00e3" containerName="heat-engine" containerID="cri-o://f03e25a73b2ceab70d4f3ace75ce667491307f3c39f583415e7e341d4556aee8" gracePeriod=60 Feb 02 12:13:50 crc kubenswrapper[4909]: I0202 12:13:50.126371 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qm4n9"] Feb 02 12:13:51 crc kubenswrapper[4909]: I0202 12:13:51.494478 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qm4n9" podUID="e5bcc30a-426d-49c9-acf6-983e0a195c2c" containerName="registry-server" containerID="cri-o://61c234a311d26a642aec0bb2edcbd2aad689d1222b48471d9ecb24e95eb1b10b" gracePeriod=2 Feb 02 12:13:52 crc kubenswrapper[4909]: I0202 12:13:52.078321 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qm4n9" Feb 02 12:13:52 crc kubenswrapper[4909]: I0202 12:13:52.162321 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5bcc30a-426d-49c9-acf6-983e0a195c2c-utilities\") pod \"e5bcc30a-426d-49c9-acf6-983e0a195c2c\" (UID: \"e5bcc30a-426d-49c9-acf6-983e0a195c2c\") " Feb 02 12:13:52 crc kubenswrapper[4909]: I0202 12:13:52.162708 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxxwq\" (UniqueName: \"kubernetes.io/projected/e5bcc30a-426d-49c9-acf6-983e0a195c2c-kube-api-access-dxxwq\") pod \"e5bcc30a-426d-49c9-acf6-983e0a195c2c\" (UID: \"e5bcc30a-426d-49c9-acf6-983e0a195c2c\") " Feb 02 12:13:52 crc kubenswrapper[4909]: I0202 12:13:52.162930 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5bcc30a-426d-49c9-acf6-983e0a195c2c-catalog-content\") pod \"e5bcc30a-426d-49c9-acf6-983e0a195c2c\" (UID: \"e5bcc30a-426d-49c9-acf6-983e0a195c2c\") " Feb 02 12:13:52 crc kubenswrapper[4909]: I0202 12:13:52.163353 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5bcc30a-426d-49c9-acf6-983e0a195c2c-utilities" (OuterVolumeSpecName: "utilities") pod "e5bcc30a-426d-49c9-acf6-983e0a195c2c" (UID: "e5bcc30a-426d-49c9-acf6-983e0a195c2c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:13:52 crc kubenswrapper[4909]: I0202 12:13:52.175308 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5bcc30a-426d-49c9-acf6-983e0a195c2c-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 12:13:52 crc kubenswrapper[4909]: E0202 12:13:52.180942 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f03e25a73b2ceab70d4f3ace75ce667491307f3c39f583415e7e341d4556aee8" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 02 12:13:52 crc kubenswrapper[4909]: I0202 12:13:52.183188 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5bcc30a-426d-49c9-acf6-983e0a195c2c-kube-api-access-dxxwq" (OuterVolumeSpecName: "kube-api-access-dxxwq") pod "e5bcc30a-426d-49c9-acf6-983e0a195c2c" (UID: "e5bcc30a-426d-49c9-acf6-983e0a195c2c"). InnerVolumeSpecName "kube-api-access-dxxwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:13:52 crc kubenswrapper[4909]: E0202 12:13:52.183965 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f03e25a73b2ceab70d4f3ace75ce667491307f3c39f583415e7e341d4556aee8" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 02 12:13:52 crc kubenswrapper[4909]: E0202 12:13:52.187891 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f03e25a73b2ceab70d4f3ace75ce667491307f3c39f583415e7e341d4556aee8" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Feb 02 12:13:52 crc kubenswrapper[4909]: E0202 12:13:52.187938 4909 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-54cb865ffb-9dzj7" podUID="f733c718-d19b-490f-8fc8-a91fc83f00e3" containerName="heat-engine" Feb 02 12:13:52 crc kubenswrapper[4909]: I0202 12:13:52.223545 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5bcc30a-426d-49c9-acf6-983e0a195c2c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5bcc30a-426d-49c9-acf6-983e0a195c2c" (UID: "e5bcc30a-426d-49c9-acf6-983e0a195c2c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:13:52 crc kubenswrapper[4909]: I0202 12:13:52.277397 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxxwq\" (UniqueName: \"kubernetes.io/projected/e5bcc30a-426d-49c9-acf6-983e0a195c2c-kube-api-access-dxxwq\") on node \"crc\" DevicePath \"\"" Feb 02 12:13:52 crc kubenswrapper[4909]: I0202 12:13:52.277443 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5bcc30a-426d-49c9-acf6-983e0a195c2c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 12:13:52 crc kubenswrapper[4909]: I0202 12:13:52.505605 4909 generic.go:334] "Generic (PLEG): container finished" podID="e5bcc30a-426d-49c9-acf6-983e0a195c2c" containerID="61c234a311d26a642aec0bb2edcbd2aad689d1222b48471d9ecb24e95eb1b10b" exitCode=0 Feb 02 12:13:52 crc kubenswrapper[4909]: I0202 12:13:52.505642 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qm4n9" event={"ID":"e5bcc30a-426d-49c9-acf6-983e0a195c2c","Type":"ContainerDied","Data":"61c234a311d26a642aec0bb2edcbd2aad689d1222b48471d9ecb24e95eb1b10b"} Feb 02 12:13:52 crc kubenswrapper[4909]: I0202 12:13:52.505667 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qm4n9" event={"ID":"e5bcc30a-426d-49c9-acf6-983e0a195c2c","Type":"ContainerDied","Data":"90b809ad9e3d49d754b09f5be6e2988fac7da95d2fc422a0c6827b12f39eb1d9"} Feb 02 12:13:52 crc kubenswrapper[4909]: I0202 12:13:52.505686 4909 scope.go:117] "RemoveContainer" containerID="61c234a311d26a642aec0bb2edcbd2aad689d1222b48471d9ecb24e95eb1b10b" Feb 02 12:13:52 crc kubenswrapper[4909]: I0202 12:13:52.505794 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qm4n9" Feb 02 12:13:52 crc kubenswrapper[4909]: I0202 12:13:52.531240 4909 scope.go:117] "RemoveContainer" containerID="04955cec97f877c60559027092da3c5df9162905bc11b458414155d41701aa1d" Feb 02 12:13:52 crc kubenswrapper[4909]: I0202 12:13:52.544067 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qm4n9"] Feb 02 12:13:52 crc kubenswrapper[4909]: I0202 12:13:52.558371 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qm4n9"] Feb 02 12:13:52 crc kubenswrapper[4909]: I0202 12:13:52.561283 4909 scope.go:117] "RemoveContainer" containerID="0c60b6d2bc68cb2195dd88439df17c0b745bcd0f3661440da6ebd1681de2779a" Feb 02 12:13:52 crc kubenswrapper[4909]: I0202 12:13:52.601313 4909 scope.go:117] "RemoveContainer" containerID="61c234a311d26a642aec0bb2edcbd2aad689d1222b48471d9ecb24e95eb1b10b" Feb 02 12:13:52 crc kubenswrapper[4909]: E0202 12:13:52.601702 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61c234a311d26a642aec0bb2edcbd2aad689d1222b48471d9ecb24e95eb1b10b\": container with ID starting with 61c234a311d26a642aec0bb2edcbd2aad689d1222b48471d9ecb24e95eb1b10b not found: ID does not exist" containerID="61c234a311d26a642aec0bb2edcbd2aad689d1222b48471d9ecb24e95eb1b10b" Feb 02 12:13:52 crc kubenswrapper[4909]: I0202 12:13:52.601733 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61c234a311d26a642aec0bb2edcbd2aad689d1222b48471d9ecb24e95eb1b10b"} err="failed to get container status \"61c234a311d26a642aec0bb2edcbd2aad689d1222b48471d9ecb24e95eb1b10b\": rpc error: code = NotFound desc = could not find container \"61c234a311d26a642aec0bb2edcbd2aad689d1222b48471d9ecb24e95eb1b10b\": container with ID starting with 61c234a311d26a642aec0bb2edcbd2aad689d1222b48471d9ecb24e95eb1b10b not 
found: ID does not exist" Feb 02 12:13:52 crc kubenswrapper[4909]: I0202 12:13:52.601756 4909 scope.go:117] "RemoveContainer" containerID="04955cec97f877c60559027092da3c5df9162905bc11b458414155d41701aa1d" Feb 02 12:13:52 crc kubenswrapper[4909]: E0202 12:13:52.602464 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04955cec97f877c60559027092da3c5df9162905bc11b458414155d41701aa1d\": container with ID starting with 04955cec97f877c60559027092da3c5df9162905bc11b458414155d41701aa1d not found: ID does not exist" containerID="04955cec97f877c60559027092da3c5df9162905bc11b458414155d41701aa1d" Feb 02 12:13:52 crc kubenswrapper[4909]: I0202 12:13:52.602499 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04955cec97f877c60559027092da3c5df9162905bc11b458414155d41701aa1d"} err="failed to get container status \"04955cec97f877c60559027092da3c5df9162905bc11b458414155d41701aa1d\": rpc error: code = NotFound desc = could not find container \"04955cec97f877c60559027092da3c5df9162905bc11b458414155d41701aa1d\": container with ID starting with 04955cec97f877c60559027092da3c5df9162905bc11b458414155d41701aa1d not found: ID does not exist" Feb 02 12:13:52 crc kubenswrapper[4909]: I0202 12:13:52.602518 4909 scope.go:117] "RemoveContainer" containerID="0c60b6d2bc68cb2195dd88439df17c0b745bcd0f3661440da6ebd1681de2779a" Feb 02 12:13:52 crc kubenswrapper[4909]: E0202 12:13:52.603519 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c60b6d2bc68cb2195dd88439df17c0b745bcd0f3661440da6ebd1681de2779a\": container with ID starting with 0c60b6d2bc68cb2195dd88439df17c0b745bcd0f3661440da6ebd1681de2779a not found: ID does not exist" containerID="0c60b6d2bc68cb2195dd88439df17c0b745bcd0f3661440da6ebd1681de2779a" Feb 02 12:13:52 crc kubenswrapper[4909]: I0202 12:13:52.603549 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c60b6d2bc68cb2195dd88439df17c0b745bcd0f3661440da6ebd1681de2779a"} err="failed to get container status \"0c60b6d2bc68cb2195dd88439df17c0b745bcd0f3661440da6ebd1681de2779a\": rpc error: code = NotFound desc = could not find container \"0c60b6d2bc68cb2195dd88439df17c0b745bcd0f3661440da6ebd1681de2779a\": container with ID starting with 0c60b6d2bc68cb2195dd88439df17c0b745bcd0f3661440da6ebd1681de2779a not found: ID does not exist" Feb 02 12:13:53 crc kubenswrapper[4909]: I0202 12:13:53.038125 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5bcc30a-426d-49c9-acf6-983e0a195c2c" path="/var/lib/kubelet/pods/e5bcc30a-426d-49c9-acf6-983e0a195c2c/volumes" Feb 02 12:13:58 crc kubenswrapper[4909]: I0202 12:13:58.257474 4909 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-64775b7466-p6d8c" podUID="20c51ec9-94ae-4b2a-b16e-15877df6b3e6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.115:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.115:8443: connect: connection refused" Feb 02 12:13:58 crc kubenswrapper[4909]: I0202 12:13:58.258101 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-64775b7466-p6d8c" Feb 02 12:14:01 crc kubenswrapper[4909]: I0202 12:14:01.444227 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-54cb865ffb-9dzj7" Feb 02 12:14:01 crc kubenswrapper[4909]: I0202 12:14:01.584881 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f733c718-d19b-490f-8fc8-a91fc83f00e3-combined-ca-bundle\") pod \"f733c718-d19b-490f-8fc8-a91fc83f00e3\" (UID: \"f733c718-d19b-490f-8fc8-a91fc83f00e3\") " Feb 02 12:14:01 crc kubenswrapper[4909]: I0202 12:14:01.585045 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f733c718-d19b-490f-8fc8-a91fc83f00e3-config-data-custom\") pod \"f733c718-d19b-490f-8fc8-a91fc83f00e3\" (UID: \"f733c718-d19b-490f-8fc8-a91fc83f00e3\") " Feb 02 12:14:01 crc kubenswrapper[4909]: I0202 12:14:01.585092 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb8lf\" (UniqueName: \"kubernetes.io/projected/f733c718-d19b-490f-8fc8-a91fc83f00e3-kube-api-access-pb8lf\") pod \"f733c718-d19b-490f-8fc8-a91fc83f00e3\" (UID: \"f733c718-d19b-490f-8fc8-a91fc83f00e3\") " Feb 02 12:14:01 crc kubenswrapper[4909]: I0202 12:14:01.585111 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f733c718-d19b-490f-8fc8-a91fc83f00e3-config-data\") pod \"f733c718-d19b-490f-8fc8-a91fc83f00e3\" (UID: \"f733c718-d19b-490f-8fc8-a91fc83f00e3\") " Feb 02 12:14:01 crc kubenswrapper[4909]: I0202 12:14:01.593979 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f733c718-d19b-490f-8fc8-a91fc83f00e3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f733c718-d19b-490f-8fc8-a91fc83f00e3" (UID: "f733c718-d19b-490f-8fc8-a91fc83f00e3"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:14:01 crc kubenswrapper[4909]: I0202 12:14:01.593994 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f733c718-d19b-490f-8fc8-a91fc83f00e3-kube-api-access-pb8lf" (OuterVolumeSpecName: "kube-api-access-pb8lf") pod "f733c718-d19b-490f-8fc8-a91fc83f00e3" (UID: "f733c718-d19b-490f-8fc8-a91fc83f00e3"). InnerVolumeSpecName "kube-api-access-pb8lf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:14:01 crc kubenswrapper[4909]: I0202 12:14:01.594329 4909 generic.go:334] "Generic (PLEG): container finished" podID="f733c718-d19b-490f-8fc8-a91fc83f00e3" containerID="f03e25a73b2ceab70d4f3ace75ce667491307f3c39f583415e7e341d4556aee8" exitCode=0 Feb 02 12:14:01 crc kubenswrapper[4909]: I0202 12:14:01.594375 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-54cb865ffb-9dzj7" event={"ID":"f733c718-d19b-490f-8fc8-a91fc83f00e3","Type":"ContainerDied","Data":"f03e25a73b2ceab70d4f3ace75ce667491307f3c39f583415e7e341d4556aee8"} Feb 02 12:14:01 crc kubenswrapper[4909]: I0202 12:14:01.594403 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-54cb865ffb-9dzj7" event={"ID":"f733c718-d19b-490f-8fc8-a91fc83f00e3","Type":"ContainerDied","Data":"efab0ba1deae90e8c76f5f9ad91f0220e6a1fc87bd2b54cab6ae29c942611fb4"} Feb 02 12:14:01 crc kubenswrapper[4909]: I0202 12:14:01.594421 4909 scope.go:117] "RemoveContainer" containerID="f03e25a73b2ceab70d4f3ace75ce667491307f3c39f583415e7e341d4556aee8" Feb 02 12:14:01 crc kubenswrapper[4909]: I0202 12:14:01.594425 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-54cb865ffb-9dzj7" Feb 02 12:14:01 crc kubenswrapper[4909]: I0202 12:14:01.618584 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f733c718-d19b-490f-8fc8-a91fc83f00e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f733c718-d19b-490f-8fc8-a91fc83f00e3" (UID: "f733c718-d19b-490f-8fc8-a91fc83f00e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:14:01 crc kubenswrapper[4909]: I0202 12:14:01.640272 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f733c718-d19b-490f-8fc8-a91fc83f00e3-config-data" (OuterVolumeSpecName: "config-data") pod "f733c718-d19b-490f-8fc8-a91fc83f00e3" (UID: "f733c718-d19b-490f-8fc8-a91fc83f00e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:14:01 crc kubenswrapper[4909]: I0202 12:14:01.687736 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f733c718-d19b-490f-8fc8-a91fc83f00e3-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:14:01 crc kubenswrapper[4909]: I0202 12:14:01.687938 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb8lf\" (UniqueName: \"kubernetes.io/projected/f733c718-d19b-490f-8fc8-a91fc83f00e3-kube-api-access-pb8lf\") on node \"crc\" DevicePath \"\"" Feb 02 12:14:01 crc kubenswrapper[4909]: I0202 12:14:01.688045 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f733c718-d19b-490f-8fc8-a91fc83f00e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:14:01 crc kubenswrapper[4909]: I0202 12:14:01.688129 4909 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f733c718-d19b-490f-8fc8-a91fc83f00e3-config-data-custom\") 
on node \"crc\" DevicePath \"\"" Feb 02 12:14:01 crc kubenswrapper[4909]: I0202 12:14:01.709326 4909 scope.go:117] "RemoveContainer" containerID="f03e25a73b2ceab70d4f3ace75ce667491307f3c39f583415e7e341d4556aee8" Feb 02 12:14:01 crc kubenswrapper[4909]: E0202 12:14:01.709743 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f03e25a73b2ceab70d4f3ace75ce667491307f3c39f583415e7e341d4556aee8\": container with ID starting with f03e25a73b2ceab70d4f3ace75ce667491307f3c39f583415e7e341d4556aee8 not found: ID does not exist" containerID="f03e25a73b2ceab70d4f3ace75ce667491307f3c39f583415e7e341d4556aee8" Feb 02 12:14:01 crc kubenswrapper[4909]: I0202 12:14:01.709774 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f03e25a73b2ceab70d4f3ace75ce667491307f3c39f583415e7e341d4556aee8"} err="failed to get container status \"f03e25a73b2ceab70d4f3ace75ce667491307f3c39f583415e7e341d4556aee8\": rpc error: code = NotFound desc = could not find container \"f03e25a73b2ceab70d4f3ace75ce667491307f3c39f583415e7e341d4556aee8\": container with ID starting with f03e25a73b2ceab70d4f3ace75ce667491307f3c39f583415e7e341d4556aee8 not found: ID does not exist" Feb 02 12:14:01 crc kubenswrapper[4909]: I0202 12:14:01.927788 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-54cb865ffb-9dzj7"] Feb 02 12:14:01 crc kubenswrapper[4909]: I0202 12:14:01.938210 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-54cb865ffb-9dzj7"] Feb 02 12:14:03 crc kubenswrapper[4909]: I0202 12:14:03.028085 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f733c718-d19b-490f-8fc8-a91fc83f00e3" path="/var/lib/kubelet/pods/f733c718-d19b-490f-8fc8-a91fc83f00e3/volumes" Feb 02 12:14:04 crc kubenswrapper[4909]: I0202 12:14:04.054172 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-d45e-account-create-update-cpm9n"] Feb 02 12:14:04 crc kubenswrapper[4909]: I0202 12:14:04.064263 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-l2vls"] Feb 02 12:14:04 crc kubenswrapper[4909]: I0202 12:14:04.074792 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-d45e-account-create-update-cpm9n"] Feb 02 12:14:04 crc kubenswrapper[4909]: I0202 12:14:04.082767 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-l2vls"] Feb 02 12:14:05 crc kubenswrapper[4909]: I0202 12:14:05.033058 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e1d7afa-1e6f-4271-b1ab-e4052d238647" path="/var/lib/kubelet/pods/5e1d7afa-1e6f-4271-b1ab-e4052d238647/volumes" Feb 02 12:14:05 crc kubenswrapper[4909]: I0202 12:14:05.036882 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faaef286-81e0-4931-863a-0df86ef982c2" path="/var/lib/kubelet/pods/faaef286-81e0-4931-863a-0df86ef982c2/volumes" Feb 02 12:14:07 crc kubenswrapper[4909]: I0202 12:14:07.651656 4909 generic.go:334] "Generic (PLEG): container finished" podID="20c51ec9-94ae-4b2a-b16e-15877df6b3e6" containerID="1339be205d32f9ae3651f10f23090655a6a424a80731083143e8a4f4eb39b519" exitCode=137 Feb 02 12:14:07 crc kubenswrapper[4909]: I0202 12:14:07.651736 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64775b7466-p6d8c" event={"ID":"20c51ec9-94ae-4b2a-b16e-15877df6b3e6","Type":"ContainerDied","Data":"1339be205d32f9ae3651f10f23090655a6a424a80731083143e8a4f4eb39b519"} Feb 02 12:14:08 crc kubenswrapper[4909]: I0202 12:14:08.027408 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-64775b7466-p6d8c" Feb 02 12:14:08 crc kubenswrapper[4909]: I0202 12:14:08.120536 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-horizon-secret-key\") pod \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\" (UID: \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\") " Feb 02 12:14:08 crc kubenswrapper[4909]: I0202 12:14:08.120623 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-logs\") pod \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\" (UID: \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\") " Feb 02 12:14:08 crc kubenswrapper[4909]: I0202 12:14:08.120715 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-combined-ca-bundle\") pod \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\" (UID: \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\") " Feb 02 12:14:08 crc kubenswrapper[4909]: I0202 12:14:08.120742 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npkcj\" (UniqueName: \"kubernetes.io/projected/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-kube-api-access-npkcj\") pod \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\" (UID: \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\") " Feb 02 12:14:08 crc kubenswrapper[4909]: I0202 12:14:08.120880 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-scripts\") pod \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\" (UID: \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\") " Feb 02 12:14:08 crc kubenswrapper[4909]: I0202 12:14:08.120920 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-horizon-tls-certs\") pod \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\" (UID: \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\") " Feb 02 12:14:08 crc kubenswrapper[4909]: I0202 12:14:08.121190 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-config-data\") pod \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\" (UID: \"20c51ec9-94ae-4b2a-b16e-15877df6b3e6\") " Feb 02 12:14:08 crc kubenswrapper[4909]: I0202 12:14:08.129582 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-logs" (OuterVolumeSpecName: "logs") pod "20c51ec9-94ae-4b2a-b16e-15877df6b3e6" (UID: "20c51ec9-94ae-4b2a-b16e-15877df6b3e6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:14:08 crc kubenswrapper[4909]: I0202 12:14:08.133097 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-kube-api-access-npkcj" (OuterVolumeSpecName: "kube-api-access-npkcj") pod "20c51ec9-94ae-4b2a-b16e-15877df6b3e6" (UID: "20c51ec9-94ae-4b2a-b16e-15877df6b3e6"). InnerVolumeSpecName "kube-api-access-npkcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:14:08 crc kubenswrapper[4909]: I0202 12:14:08.134065 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "20c51ec9-94ae-4b2a-b16e-15877df6b3e6" (UID: "20c51ec9-94ae-4b2a-b16e-15877df6b3e6"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:14:08 crc kubenswrapper[4909]: I0202 12:14:08.158021 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-config-data" (OuterVolumeSpecName: "config-data") pod "20c51ec9-94ae-4b2a-b16e-15877df6b3e6" (UID: "20c51ec9-94ae-4b2a-b16e-15877df6b3e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:14:08 crc kubenswrapper[4909]: I0202 12:14:08.167386 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20c51ec9-94ae-4b2a-b16e-15877df6b3e6" (UID: "20c51ec9-94ae-4b2a-b16e-15877df6b3e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:14:08 crc kubenswrapper[4909]: I0202 12:14:08.169080 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-scripts" (OuterVolumeSpecName: "scripts") pod "20c51ec9-94ae-4b2a-b16e-15877df6b3e6" (UID: "20c51ec9-94ae-4b2a-b16e-15877df6b3e6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:14:08 crc kubenswrapper[4909]: I0202 12:14:08.210003 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "20c51ec9-94ae-4b2a-b16e-15877df6b3e6" (UID: "20c51ec9-94ae-4b2a-b16e-15877df6b3e6"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:14:08 crc kubenswrapper[4909]: I0202 12:14:08.225517 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:14:08 crc kubenswrapper[4909]: I0202 12:14:08.225572 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npkcj\" (UniqueName: \"kubernetes.io/projected/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-kube-api-access-npkcj\") on node \"crc\" DevicePath \"\"" Feb 02 12:14:08 crc kubenswrapper[4909]: I0202 12:14:08.225587 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:14:08 crc kubenswrapper[4909]: I0202 12:14:08.225597 4909 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 12:14:08 crc kubenswrapper[4909]: I0202 12:14:08.225606 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:14:08 crc kubenswrapper[4909]: I0202 12:14:08.225616 4909 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 02 12:14:08 crc kubenswrapper[4909]: I0202 12:14:08.225628 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20c51ec9-94ae-4b2a-b16e-15877df6b3e6-logs\") on node \"crc\" DevicePath \"\"" Feb 02 12:14:08 crc kubenswrapper[4909]: I0202 12:14:08.670340 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64775b7466-p6d8c" event={"ID":"20c51ec9-94ae-4b2a-b16e-15877df6b3e6","Type":"ContainerDied","Data":"3f0d2e7c227086b2fd8daab321ce6831d7b90f59bf84d3090a9cb0a8b48e9f86"} Feb 02 12:14:08 crc kubenswrapper[4909]: I0202 12:14:08.670402 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64775b7466-p6d8c" Feb 02 12:14:08 crc kubenswrapper[4909]: I0202 12:14:08.670441 4909 scope.go:117] "RemoveContainer" containerID="bae7e028e4266fea529772f976481a28252d0e91ca7ab2592827f2f5863017f3" Feb 02 12:14:08 crc kubenswrapper[4909]: I0202 12:14:08.737364 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-64775b7466-p6d8c"] Feb 02 12:14:08 crc kubenswrapper[4909]: I0202 12:14:08.744856 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-64775b7466-p6d8c"] Feb 02 12:14:08 crc kubenswrapper[4909]: I0202 12:14:08.963498 4909 scope.go:117] "RemoveContainer" containerID="1339be205d32f9ae3651f10f23090655a6a424a80731083143e8a4f4eb39b519" Feb 02 12:14:09 crc kubenswrapper[4909]: I0202 12:14:09.029566 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20c51ec9-94ae-4b2a-b16e-15877df6b3e6" path="/var/lib/kubelet/pods/20c51ec9-94ae-4b2a-b16e-15877df6b3e6/volumes" Feb 02 12:14:12 crc kubenswrapper[4909]: I0202 12:14:12.030963 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-h4tm9"] Feb 02 12:14:12 crc kubenswrapper[4909]: I0202 12:14:12.038836 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-h4tm9"] Feb 02 12:14:13 crc kubenswrapper[4909]: I0202 12:14:13.032895 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="312db841-556a-4d31-b5c0-e2d4dc5cf3e4" path="/var/lib/kubelet/pods/312db841-556a-4d31-b5c0-e2d4dc5cf3e4/volumes" Feb 02 12:14:19 crc kubenswrapper[4909]: I0202 12:14:19.510567 4909 
patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:14:19 crc kubenswrapper[4909]: I0202 12:14:19.511080 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:14:19 crc kubenswrapper[4909]: I0202 12:14:19.511173 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 12:14:19 crc kubenswrapper[4909]: I0202 12:14:19.511976 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7efda102e84c90fb99553b44e1e0fb3c586800eb829f32e3cb14404521e127c2"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 12:14:19 crc kubenswrapper[4909]: I0202 12:14:19.512037 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://7efda102e84c90fb99553b44e1e0fb3c586800eb829f32e3cb14404521e127c2" gracePeriod=600 Feb 02 12:14:19 crc kubenswrapper[4909]: I0202 12:14:19.788300 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="7efda102e84c90fb99553b44e1e0fb3c586800eb829f32e3cb14404521e127c2" exitCode=0 Feb 02 12:14:19 crc 
kubenswrapper[4909]: I0202 12:14:19.788366 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"7efda102e84c90fb99553b44e1e0fb3c586800eb829f32e3cb14404521e127c2"} Feb 02 12:14:19 crc kubenswrapper[4909]: I0202 12:14:19.788728 4909 scope.go:117] "RemoveContainer" containerID="2eea8a4afa64137efa082f002e4799eccab7cbebd23a4ab4b54a9ae921952783" Feb 02 12:14:20 crc kubenswrapper[4909]: I0202 12:14:20.799862 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0"} Feb 02 12:14:23 crc kubenswrapper[4909]: I0202 12:14:23.972641 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb"] Feb 02 12:14:23 crc kubenswrapper[4909]: E0202 12:14:23.973704 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22709a9d-0d92-48ef-8967-588ecaf1b50b" containerName="heat-api" Feb 02 12:14:23 crc kubenswrapper[4909]: I0202 12:14:23.973725 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="22709a9d-0d92-48ef-8967-588ecaf1b50b" containerName="heat-api" Feb 02 12:14:23 crc kubenswrapper[4909]: E0202 12:14:23.973744 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79d31cb4-9244-41d0-b49d-c49e0f5534a7" containerName="extract-utilities" Feb 02 12:14:23 crc kubenswrapper[4909]: I0202 12:14:23.973757 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="79d31cb4-9244-41d0-b49d-c49e0f5534a7" containerName="extract-utilities" Feb 02 12:14:23 crc kubenswrapper[4909]: E0202 12:14:23.973772 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ede287-b375-4953-b46c-bb7afb372891" 
containerName="heat-cfnapi" Feb 02 12:14:23 crc kubenswrapper[4909]: I0202 12:14:23.973781 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ede287-b375-4953-b46c-bb7afb372891" containerName="heat-cfnapi" Feb 02 12:14:23 crc kubenswrapper[4909]: E0202 12:14:23.973788 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79d31cb4-9244-41d0-b49d-c49e0f5534a7" containerName="extract-content" Feb 02 12:14:23 crc kubenswrapper[4909]: I0202 12:14:23.973795 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="79d31cb4-9244-41d0-b49d-c49e0f5534a7" containerName="extract-content" Feb 02 12:14:23 crc kubenswrapper[4909]: E0202 12:14:23.973844 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79d31cb4-9244-41d0-b49d-c49e0f5534a7" containerName="registry-server" Feb 02 12:14:23 crc kubenswrapper[4909]: I0202 12:14:23.973851 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="79d31cb4-9244-41d0-b49d-c49e0f5534a7" containerName="registry-server" Feb 02 12:14:23 crc kubenswrapper[4909]: E0202 12:14:23.973866 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5bcc30a-426d-49c9-acf6-983e0a195c2c" containerName="extract-utilities" Feb 02 12:14:23 crc kubenswrapper[4909]: I0202 12:14:23.973871 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5bcc30a-426d-49c9-acf6-983e0a195c2c" containerName="extract-utilities" Feb 02 12:14:23 crc kubenswrapper[4909]: E0202 12:14:23.973881 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f733c718-d19b-490f-8fc8-a91fc83f00e3" containerName="heat-engine" Feb 02 12:14:23 crc kubenswrapper[4909]: I0202 12:14:23.973887 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f733c718-d19b-490f-8fc8-a91fc83f00e3" containerName="heat-engine" Feb 02 12:14:23 crc kubenswrapper[4909]: E0202 12:14:23.973897 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22709a9d-0d92-48ef-8967-588ecaf1b50b" containerName="heat-api" 
Feb 02 12:14:23 crc kubenswrapper[4909]: I0202 12:14:23.973903 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="22709a9d-0d92-48ef-8967-588ecaf1b50b" containerName="heat-api" Feb 02 12:14:23 crc kubenswrapper[4909]: E0202 12:14:23.973915 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5bcc30a-426d-49c9-acf6-983e0a195c2c" containerName="extract-content" Feb 02 12:14:23 crc kubenswrapper[4909]: I0202 12:14:23.973922 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5bcc30a-426d-49c9-acf6-983e0a195c2c" containerName="extract-content" Feb 02 12:14:23 crc kubenswrapper[4909]: E0202 12:14:23.973934 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5bcc30a-426d-49c9-acf6-983e0a195c2c" containerName="registry-server" Feb 02 12:14:23 crc kubenswrapper[4909]: I0202 12:14:23.973941 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5bcc30a-426d-49c9-acf6-983e0a195c2c" containerName="registry-server" Feb 02 12:14:23 crc kubenswrapper[4909]: E0202 12:14:23.973954 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20c51ec9-94ae-4b2a-b16e-15877df6b3e6" containerName="horizon" Feb 02 12:14:23 crc kubenswrapper[4909]: I0202 12:14:23.973961 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="20c51ec9-94ae-4b2a-b16e-15877df6b3e6" containerName="horizon" Feb 02 12:14:23 crc kubenswrapper[4909]: E0202 12:14:23.973978 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20c51ec9-94ae-4b2a-b16e-15877df6b3e6" containerName="horizon-log" Feb 02 12:14:23 crc kubenswrapper[4909]: I0202 12:14:23.973989 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="20c51ec9-94ae-4b2a-b16e-15877df6b3e6" containerName="horizon-log" Feb 02 12:14:23 crc kubenswrapper[4909]: I0202 12:14:23.974228 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="79d31cb4-9244-41d0-b49d-c49e0f5534a7" containerName="registry-server" Feb 02 12:14:23 crc kubenswrapper[4909]: I0202 
12:14:23.974247 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5bcc30a-426d-49c9-acf6-983e0a195c2c" containerName="registry-server" Feb 02 12:14:23 crc kubenswrapper[4909]: I0202 12:14:23.974259 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5ede287-b375-4953-b46c-bb7afb372891" containerName="heat-cfnapi" Feb 02 12:14:23 crc kubenswrapper[4909]: I0202 12:14:23.974270 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="20c51ec9-94ae-4b2a-b16e-15877df6b3e6" containerName="horizon" Feb 02 12:14:23 crc kubenswrapper[4909]: I0202 12:14:23.974284 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="20c51ec9-94ae-4b2a-b16e-15877df6b3e6" containerName="horizon-log" Feb 02 12:14:23 crc kubenswrapper[4909]: I0202 12:14:23.974299 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f733c718-d19b-490f-8fc8-a91fc83f00e3" containerName="heat-engine" Feb 02 12:14:23 crc kubenswrapper[4909]: I0202 12:14:23.974313 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="22709a9d-0d92-48ef-8967-588ecaf1b50b" containerName="heat-api" Feb 02 12:14:23 crc kubenswrapper[4909]: I0202 12:14:23.974328 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="22709a9d-0d92-48ef-8967-588ecaf1b50b" containerName="heat-api" Feb 02 12:14:23 crc kubenswrapper[4909]: E0202 12:14:23.974541 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5ede287-b375-4953-b46c-bb7afb372891" containerName="heat-cfnapi" Feb 02 12:14:23 crc kubenswrapper[4909]: I0202 12:14:23.974550 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5ede287-b375-4953-b46c-bb7afb372891" containerName="heat-cfnapi" Feb 02 12:14:23 crc kubenswrapper[4909]: I0202 12:14:23.974727 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5ede287-b375-4953-b46c-bb7afb372891" containerName="heat-cfnapi" Feb 02 12:14:23 crc kubenswrapper[4909]: I0202 12:14:23.976453 4909 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb" Feb 02 12:14:23 crc kubenswrapper[4909]: I0202 12:14:23.982998 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 02 12:14:23 crc kubenswrapper[4909]: I0202 12:14:23.987127 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb"] Feb 02 12:14:24 crc kubenswrapper[4909]: I0202 12:14:24.063193 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08a88ffb-9ba5-45a7-91de-aae63dfd8b5c-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb\" (UID: \"08a88ffb-9ba5-45a7-91de-aae63dfd8b5c\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb" Feb 02 12:14:24 crc kubenswrapper[4909]: I0202 12:14:24.063521 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08a88ffb-9ba5-45a7-91de-aae63dfd8b5c-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb\" (UID: \"08a88ffb-9ba5-45a7-91de-aae63dfd8b5c\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb" Feb 02 12:14:24 crc kubenswrapper[4909]: I0202 12:14:24.063600 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gmdq\" (UniqueName: \"kubernetes.io/projected/08a88ffb-9ba5-45a7-91de-aae63dfd8b5c-kube-api-access-8gmdq\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb\" (UID: \"08a88ffb-9ba5-45a7-91de-aae63dfd8b5c\") " 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb" Feb 02 12:14:24 crc kubenswrapper[4909]: I0202 12:14:24.165742 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08a88ffb-9ba5-45a7-91de-aae63dfd8b5c-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb\" (UID: \"08a88ffb-9ba5-45a7-91de-aae63dfd8b5c\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb" Feb 02 12:14:24 crc kubenswrapper[4909]: I0202 12:14:24.165825 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gmdq\" (UniqueName: \"kubernetes.io/projected/08a88ffb-9ba5-45a7-91de-aae63dfd8b5c-kube-api-access-8gmdq\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb\" (UID: \"08a88ffb-9ba5-45a7-91de-aae63dfd8b5c\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb" Feb 02 12:14:24 crc kubenswrapper[4909]: I0202 12:14:24.166014 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08a88ffb-9ba5-45a7-91de-aae63dfd8b5c-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb\" (UID: \"08a88ffb-9ba5-45a7-91de-aae63dfd8b5c\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb" Feb 02 12:14:24 crc kubenswrapper[4909]: I0202 12:14:24.166256 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08a88ffb-9ba5-45a7-91de-aae63dfd8b5c-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb\" (UID: \"08a88ffb-9ba5-45a7-91de-aae63dfd8b5c\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb" Feb 02 12:14:24 crc kubenswrapper[4909]: I0202 12:14:24.166327 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08a88ffb-9ba5-45a7-91de-aae63dfd8b5c-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb\" (UID: \"08a88ffb-9ba5-45a7-91de-aae63dfd8b5c\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb" Feb 02 12:14:24 crc kubenswrapper[4909]: I0202 12:14:24.186036 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gmdq\" (UniqueName: \"kubernetes.io/projected/08a88ffb-9ba5-45a7-91de-aae63dfd8b5c-kube-api-access-8gmdq\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb\" (UID: \"08a88ffb-9ba5-45a7-91de-aae63dfd8b5c\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb" Feb 02 12:14:24 crc kubenswrapper[4909]: I0202 12:14:24.298134 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb" Feb 02 12:14:24 crc kubenswrapper[4909]: I0202 12:14:24.725399 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb"] Feb 02 12:14:24 crc kubenswrapper[4909]: W0202 12:14:24.741475 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08a88ffb_9ba5_45a7_91de_aae63dfd8b5c.slice/crio-e65ae6f7e7254aef6fb420d45633b44d5b85271dc3cd255720dea8245a415320 WatchSource:0}: Error finding container e65ae6f7e7254aef6fb420d45633b44d5b85271dc3cd255720dea8245a415320: Status 404 returned error can't find the container with id e65ae6f7e7254aef6fb420d45633b44d5b85271dc3cd255720dea8245a415320 Feb 02 12:14:24 crc kubenswrapper[4909]: I0202 12:14:24.839786 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb" event={"ID":"08a88ffb-9ba5-45a7-91de-aae63dfd8b5c","Type":"ContainerStarted","Data":"e65ae6f7e7254aef6fb420d45633b44d5b85271dc3cd255720dea8245a415320"} Feb 02 12:14:25 crc kubenswrapper[4909]: I0202 12:14:25.851079 4909 generic.go:334] "Generic (PLEG): container finished" podID="08a88ffb-9ba5-45a7-91de-aae63dfd8b5c" containerID="ff1168160238f92884a0e33e2a208c0623d6bea62905dafa0ffb671c969cb270" exitCode=0 Feb 02 12:14:25 crc kubenswrapper[4909]: I0202 12:14:25.851288 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb" event={"ID":"08a88ffb-9ba5-45a7-91de-aae63dfd8b5c","Type":"ContainerDied","Data":"ff1168160238f92884a0e33e2a208c0623d6bea62905dafa0ffb671c969cb270"} Feb 02 12:14:26 crc kubenswrapper[4909]: I0202 12:14:26.571498 4909 scope.go:117] "RemoveContainer" containerID="64c09969500bb7b421cdfcbd7ed9710827601fcd712ecc9c1501c68919dc1a4b" Feb 02 12:14:26 crc kubenswrapper[4909]: I0202 12:14:26.599194 4909 scope.go:117] "RemoveContainer" containerID="6d71f70849dc4c93fe50630422c63883636886d2746a0cac0fec0b21b702f79c" Feb 02 12:14:26 crc kubenswrapper[4909]: I0202 12:14:26.635702 4909 scope.go:117] "RemoveContainer" containerID="b4e9a41594afcc210335a3299cd21dc3bdf8a2a49c7f8e50334867d1514052c7" Feb 02 12:14:26 crc kubenswrapper[4909]: I0202 12:14:26.670546 4909 scope.go:117] "RemoveContainer" containerID="60c17e6fbbb93ef6b2d9c1b8fb0647e9c3dae092820bf72a473a5c0855a7e532" Feb 02 12:14:26 crc kubenswrapper[4909]: I0202 12:14:26.725339 4909 scope.go:117] "RemoveContainer" containerID="1e22a58adfb790c4dcc149d293cb56c9c93cf1e035e4bc68748662266b1baa0d" Feb 02 12:14:26 crc kubenswrapper[4909]: I0202 12:14:26.745338 4909 scope.go:117] "RemoveContainer" containerID="ba71b7556d483fd7259c209e27b2c8c194cfd84d2e0f6156107f56599d7b9f0b" Feb 02 12:14:26 crc kubenswrapper[4909]: I0202 12:14:26.766578 
4909 scope.go:117] "RemoveContainer" containerID="1412daa98da71cf6ba008933a60fbdb47f5bbffef23ba414f41991ea55d17659" Feb 02 12:14:30 crc kubenswrapper[4909]: I0202 12:14:30.906432 4909 generic.go:334] "Generic (PLEG): container finished" podID="08a88ffb-9ba5-45a7-91de-aae63dfd8b5c" containerID="ccee437ac7a29608b18d30a0dacb3418db6aaf81c52154d3387ba0bed8e0aa76" exitCode=0 Feb 02 12:14:30 crc kubenswrapper[4909]: I0202 12:14:30.906524 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb" event={"ID":"08a88ffb-9ba5-45a7-91de-aae63dfd8b5c","Type":"ContainerDied","Data":"ccee437ac7a29608b18d30a0dacb3418db6aaf81c52154d3387ba0bed8e0aa76"} Feb 02 12:14:31 crc kubenswrapper[4909]: I0202 12:14:31.917793 4909 generic.go:334] "Generic (PLEG): container finished" podID="08a88ffb-9ba5-45a7-91de-aae63dfd8b5c" containerID="0e25625aa2b1d3992b116b7c70ddaeede58e73f979cf381bad443fe7de57ab78" exitCode=0 Feb 02 12:14:31 crc kubenswrapper[4909]: I0202 12:14:31.917840 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb" event={"ID":"08a88ffb-9ba5-45a7-91de-aae63dfd8b5c","Type":"ContainerDied","Data":"0e25625aa2b1d3992b116b7c70ddaeede58e73f979cf381bad443fe7de57ab78"} Feb 02 12:14:33 crc kubenswrapper[4909]: I0202 12:14:33.278933 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb" Feb 02 12:14:33 crc kubenswrapper[4909]: I0202 12:14:33.406779 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gmdq\" (UniqueName: \"kubernetes.io/projected/08a88ffb-9ba5-45a7-91de-aae63dfd8b5c-kube-api-access-8gmdq\") pod \"08a88ffb-9ba5-45a7-91de-aae63dfd8b5c\" (UID: \"08a88ffb-9ba5-45a7-91de-aae63dfd8b5c\") " Feb 02 12:14:33 crc kubenswrapper[4909]: I0202 12:14:33.406957 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08a88ffb-9ba5-45a7-91de-aae63dfd8b5c-util\") pod \"08a88ffb-9ba5-45a7-91de-aae63dfd8b5c\" (UID: \"08a88ffb-9ba5-45a7-91de-aae63dfd8b5c\") " Feb 02 12:14:33 crc kubenswrapper[4909]: I0202 12:14:33.407034 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08a88ffb-9ba5-45a7-91de-aae63dfd8b5c-bundle\") pod \"08a88ffb-9ba5-45a7-91de-aae63dfd8b5c\" (UID: \"08a88ffb-9ba5-45a7-91de-aae63dfd8b5c\") " Feb 02 12:14:33 crc kubenswrapper[4909]: I0202 12:14:33.409192 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08a88ffb-9ba5-45a7-91de-aae63dfd8b5c-bundle" (OuterVolumeSpecName: "bundle") pod "08a88ffb-9ba5-45a7-91de-aae63dfd8b5c" (UID: "08a88ffb-9ba5-45a7-91de-aae63dfd8b5c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:14:33 crc kubenswrapper[4909]: I0202 12:14:33.415258 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08a88ffb-9ba5-45a7-91de-aae63dfd8b5c-kube-api-access-8gmdq" (OuterVolumeSpecName: "kube-api-access-8gmdq") pod "08a88ffb-9ba5-45a7-91de-aae63dfd8b5c" (UID: "08a88ffb-9ba5-45a7-91de-aae63dfd8b5c"). InnerVolumeSpecName "kube-api-access-8gmdq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:14:33 crc kubenswrapper[4909]: I0202 12:14:33.420313 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08a88ffb-9ba5-45a7-91de-aae63dfd8b5c-util" (OuterVolumeSpecName: "util") pod "08a88ffb-9ba5-45a7-91de-aae63dfd8b5c" (UID: "08a88ffb-9ba5-45a7-91de-aae63dfd8b5c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:14:33 crc kubenswrapper[4909]: I0202 12:14:33.509485 4909 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08a88ffb-9ba5-45a7-91de-aae63dfd8b5c-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:14:33 crc kubenswrapper[4909]: I0202 12:14:33.509525 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gmdq\" (UniqueName: \"kubernetes.io/projected/08a88ffb-9ba5-45a7-91de-aae63dfd8b5c-kube-api-access-8gmdq\") on node \"crc\" DevicePath \"\"" Feb 02 12:14:33 crc kubenswrapper[4909]: I0202 12:14:33.509535 4909 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08a88ffb-9ba5-45a7-91de-aae63dfd8b5c-util\") on node \"crc\" DevicePath \"\"" Feb 02 12:14:33 crc kubenswrapper[4909]: I0202 12:14:33.937746 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb" event={"ID":"08a88ffb-9ba5-45a7-91de-aae63dfd8b5c","Type":"ContainerDied","Data":"e65ae6f7e7254aef6fb420d45633b44d5b85271dc3cd255720dea8245a415320"} Feb 02 12:14:33 crc kubenswrapper[4909]: I0202 12:14:33.937782 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb" Feb 02 12:14:33 crc kubenswrapper[4909]: I0202 12:14:33.937828 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e65ae6f7e7254aef6fb420d45633b44d5b85271dc3cd255720dea8245a415320" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.558120 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-2fsps"] Feb 02 12:14:47 crc kubenswrapper[4909]: E0202 12:14:47.559778 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08a88ffb-9ba5-45a7-91de-aae63dfd8b5c" containerName="pull" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.559907 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a88ffb-9ba5-45a7-91de-aae63dfd8b5c" containerName="pull" Feb 02 12:14:47 crc kubenswrapper[4909]: E0202 12:14:47.559970 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08a88ffb-9ba5-45a7-91de-aae63dfd8b5c" containerName="util" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.560020 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a88ffb-9ba5-45a7-91de-aae63dfd8b5c" containerName="util" Feb 02 12:14:47 crc kubenswrapper[4909]: E0202 12:14:47.560083 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08a88ffb-9ba5-45a7-91de-aae63dfd8b5c" containerName="extract" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.560130 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a88ffb-9ba5-45a7-91de-aae63dfd8b5c" containerName="extract" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.560428 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="08a88ffb-9ba5-45a7-91de-aae63dfd8b5c" containerName="extract" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.562997 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2fsps" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.571959 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.572295 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.572454 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-wk748" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.584918 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-2fsps"] Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.705079 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-848d587754-bgkpz"] Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.706978 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-848d587754-bgkpz" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.708629 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-dvp2w" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.713737 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.718423 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22fhl\" (UniqueName: \"kubernetes.io/projected/c38385f8-d673-4443-91ff-2e2bb10686bf-kube-api-access-22fhl\") pod \"obo-prometheus-operator-68bc856cb9-2fsps\" (UID: \"c38385f8-d673-4443-91ff-2e2bb10686bf\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2fsps" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.725486 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-848d587754-hlmxx"] Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.727181 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-848d587754-hlmxx" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.747697 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-848d587754-bgkpz"] Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.782105 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-848d587754-hlmxx"] Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.824043 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6dd6d59f-1261-4704-a129-6361cb00de58-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-848d587754-hlmxx\" (UID: \"6dd6d59f-1261-4704-a129-6361cb00de58\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-848d587754-hlmxx" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.824218 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0c1c9acf-33b3-4b87-ba96-0def73b7e9f9-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-848d587754-bgkpz\" (UID: \"0c1c9acf-33b3-4b87-ba96-0def73b7e9f9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-848d587754-bgkpz" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.824315 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22fhl\" (UniqueName: \"kubernetes.io/projected/c38385f8-d673-4443-91ff-2e2bb10686bf-kube-api-access-22fhl\") pod \"obo-prometheus-operator-68bc856cb9-2fsps\" (UID: \"c38385f8-d673-4443-91ff-2e2bb10686bf\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2fsps" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.824443 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0c1c9acf-33b3-4b87-ba96-0def73b7e9f9-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-848d587754-bgkpz\" (UID: \"0c1c9acf-33b3-4b87-ba96-0def73b7e9f9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-848d587754-bgkpz" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.824490 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6dd6d59f-1261-4704-a129-6361cb00de58-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-848d587754-hlmxx\" (UID: \"6dd6d59f-1261-4704-a129-6361cb00de58\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-848d587754-hlmxx" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.872439 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22fhl\" (UniqueName: \"kubernetes.io/projected/c38385f8-d673-4443-91ff-2e2bb10686bf-kube-api-access-22fhl\") pod \"obo-prometheus-operator-68bc856cb9-2fsps\" (UID: \"c38385f8-d673-4443-91ff-2e2bb10686bf\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2fsps" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.924985 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-zqtcd"] Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.926351 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6dd6d59f-1261-4704-a129-6361cb00de58-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-848d587754-hlmxx\" (UID: \"6dd6d59f-1261-4704-a129-6361cb00de58\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-848d587754-hlmxx" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.926501 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0c1c9acf-33b3-4b87-ba96-0def73b7e9f9-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-848d587754-bgkpz\" (UID: \"0c1c9acf-33b3-4b87-ba96-0def73b7e9f9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-848d587754-bgkpz" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.926605 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0c1c9acf-33b3-4b87-ba96-0def73b7e9f9-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-848d587754-bgkpz\" (UID: \"0c1c9acf-33b3-4b87-ba96-0def73b7e9f9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-848d587754-bgkpz" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.926646 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6dd6d59f-1261-4704-a129-6361cb00de58-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-848d587754-hlmxx\" (UID: \"6dd6d59f-1261-4704-a129-6361cb00de58\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-848d587754-hlmxx" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.927471 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-zqtcd" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.930339 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6dd6d59f-1261-4704-a129-6361cb00de58-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-848d587754-hlmxx\" (UID: \"6dd6d59f-1261-4704-a129-6361cb00de58\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-848d587754-hlmxx" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.930829 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0c1c9acf-33b3-4b87-ba96-0def73b7e9f9-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-848d587754-bgkpz\" (UID: \"0c1c9acf-33b3-4b87-ba96-0def73b7e9f9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-848d587754-bgkpz" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.932729 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0c1c9acf-33b3-4b87-ba96-0def73b7e9f9-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-848d587754-bgkpz\" (UID: \"0c1c9acf-33b3-4b87-ba96-0def73b7e9f9\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-848d587754-bgkpz" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.933632 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2fsps" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.934435 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-ch5gh" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.934594 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.947555 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6dd6d59f-1261-4704-a129-6361cb00de58-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-848d587754-hlmxx\" (UID: \"6dd6d59f-1261-4704-a129-6361cb00de58\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-848d587754-hlmxx" Feb 02 12:14:47 crc kubenswrapper[4909]: I0202 12:14:47.961105 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-zqtcd"] Feb 02 12:14:48 crc kubenswrapper[4909]: I0202 12:14:48.028784 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6682f\" (UniqueName: \"kubernetes.io/projected/da7697a0-be27-4dac-b2bb-6af40732994e-kube-api-access-6682f\") pod \"observability-operator-59bdc8b94-zqtcd\" (UID: \"da7697a0-be27-4dac-b2bb-6af40732994e\") " pod="openshift-operators/observability-operator-59bdc8b94-zqtcd" Feb 02 12:14:48 crc kubenswrapper[4909]: I0202 12:14:48.029195 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-848d587754-bgkpz" Feb 02 12:14:48 crc kubenswrapper[4909]: I0202 12:14:48.029574 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/da7697a0-be27-4dac-b2bb-6af40732994e-observability-operator-tls\") pod \"observability-operator-59bdc8b94-zqtcd\" (UID: \"da7697a0-be27-4dac-b2bb-6af40732994e\") " pod="openshift-operators/observability-operator-59bdc8b94-zqtcd" Feb 02 12:14:48 crc kubenswrapper[4909]: I0202 12:14:48.048452 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-848d587754-hlmxx" Feb 02 12:14:48 crc kubenswrapper[4909]: I0202 12:14:48.117474 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-tnrzd"] Feb 02 12:14:48 crc kubenswrapper[4909]: I0202 12:14:48.119617 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-tnrzd" Feb 02 12:14:48 crc kubenswrapper[4909]: I0202 12:14:48.130709 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-tnrzd"] Feb 02 12:14:48 crc kubenswrapper[4909]: I0202 12:14:48.131484 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/da7697a0-be27-4dac-b2bb-6af40732994e-observability-operator-tls\") pod \"observability-operator-59bdc8b94-zqtcd\" (UID: \"da7697a0-be27-4dac-b2bb-6af40732994e\") " pod="openshift-operators/observability-operator-59bdc8b94-zqtcd" Feb 02 12:14:48 crc kubenswrapper[4909]: I0202 12:14:48.131675 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6682f\" (UniqueName: \"kubernetes.io/projected/da7697a0-be27-4dac-b2bb-6af40732994e-kube-api-access-6682f\") pod \"observability-operator-59bdc8b94-zqtcd\" (UID: \"da7697a0-be27-4dac-b2bb-6af40732994e\") " pod="openshift-operators/observability-operator-59bdc8b94-zqtcd" Feb 02 12:14:48 crc kubenswrapper[4909]: I0202 12:14:48.135082 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-nl7cr" Feb 02 12:14:48 crc kubenswrapper[4909]: I0202 12:14:48.136786 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/da7697a0-be27-4dac-b2bb-6af40732994e-observability-operator-tls\") pod \"observability-operator-59bdc8b94-zqtcd\" (UID: \"da7697a0-be27-4dac-b2bb-6af40732994e\") " pod="openshift-operators/observability-operator-59bdc8b94-zqtcd" Feb 02 12:14:48 crc kubenswrapper[4909]: I0202 12:14:48.179671 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6682f\" (UniqueName: 
\"kubernetes.io/projected/da7697a0-be27-4dac-b2bb-6af40732994e-kube-api-access-6682f\") pod \"observability-operator-59bdc8b94-zqtcd\" (UID: \"da7697a0-be27-4dac-b2bb-6af40732994e\") " pod="openshift-operators/observability-operator-59bdc8b94-zqtcd" Feb 02 12:14:48 crc kubenswrapper[4909]: I0202 12:14:48.199449 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-zqtcd" Feb 02 12:14:48 crc kubenswrapper[4909]: I0202 12:14:48.235388 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/2de24d60-d18d-44ad-ac44-c89d52fdd86a-openshift-service-ca\") pod \"perses-operator-5bf474d74f-tnrzd\" (UID: \"2de24d60-d18d-44ad-ac44-c89d52fdd86a\") " pod="openshift-operators/perses-operator-5bf474d74f-tnrzd" Feb 02 12:14:48 crc kubenswrapper[4909]: I0202 12:14:48.235709 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4dkh\" (UniqueName: \"kubernetes.io/projected/2de24d60-d18d-44ad-ac44-c89d52fdd86a-kube-api-access-k4dkh\") pod \"perses-operator-5bf474d74f-tnrzd\" (UID: \"2de24d60-d18d-44ad-ac44-c89d52fdd86a\") " pod="openshift-operators/perses-operator-5bf474d74f-tnrzd" Feb 02 12:14:48 crc kubenswrapper[4909]: I0202 12:14:48.338952 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4dkh\" (UniqueName: \"kubernetes.io/projected/2de24d60-d18d-44ad-ac44-c89d52fdd86a-kube-api-access-k4dkh\") pod \"perses-operator-5bf474d74f-tnrzd\" (UID: \"2de24d60-d18d-44ad-ac44-c89d52fdd86a\") " pod="openshift-operators/perses-operator-5bf474d74f-tnrzd" Feb 02 12:14:48 crc kubenswrapper[4909]: I0202 12:14:48.339221 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/2de24d60-d18d-44ad-ac44-c89d52fdd86a-openshift-service-ca\") pod \"perses-operator-5bf474d74f-tnrzd\" (UID: \"2de24d60-d18d-44ad-ac44-c89d52fdd86a\") " pod="openshift-operators/perses-operator-5bf474d74f-tnrzd" Feb 02 12:14:48 crc kubenswrapper[4909]: I0202 12:14:48.341254 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/2de24d60-d18d-44ad-ac44-c89d52fdd86a-openshift-service-ca\") pod \"perses-operator-5bf474d74f-tnrzd\" (UID: \"2de24d60-d18d-44ad-ac44-c89d52fdd86a\") " pod="openshift-operators/perses-operator-5bf474d74f-tnrzd" Feb 02 12:14:48 crc kubenswrapper[4909]: I0202 12:14:48.375635 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4dkh\" (UniqueName: \"kubernetes.io/projected/2de24d60-d18d-44ad-ac44-c89d52fdd86a-kube-api-access-k4dkh\") pod \"perses-operator-5bf474d74f-tnrzd\" (UID: \"2de24d60-d18d-44ad-ac44-c89d52fdd86a\") " pod="openshift-operators/perses-operator-5bf474d74f-tnrzd" Feb 02 12:14:48 crc kubenswrapper[4909]: I0202 12:14:48.518471 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-tnrzd" Feb 02 12:14:48 crc kubenswrapper[4909]: I0202 12:14:48.871649 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-2fsps"] Feb 02 12:14:48 crc kubenswrapper[4909]: I0202 12:14:48.966687 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-848d587754-hlmxx"] Feb 02 12:14:49 crc kubenswrapper[4909]: I0202 12:14:49.174902 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-848d587754-hlmxx" event={"ID":"6dd6d59f-1261-4704-a129-6361cb00de58","Type":"ContainerStarted","Data":"6098765994f80a7a8219b07b93d5d78bf4f0afe0ae8807537e3a239141e81f5e"} Feb 02 12:14:49 crc kubenswrapper[4909]: I0202 12:14:49.180406 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2fsps" event={"ID":"c38385f8-d673-4443-91ff-2e2bb10686bf","Type":"ContainerStarted","Data":"95f6362afc61e7b31a9fc5e4d1bb1841fc4407d723564839d69affbc99279e9f"} Feb 02 12:14:49 crc kubenswrapper[4909]: I0202 12:14:49.217116 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-848d587754-bgkpz"] Feb 02 12:14:49 crc kubenswrapper[4909]: I0202 12:14:49.339449 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-zqtcd"] Feb 02 12:14:49 crc kubenswrapper[4909]: W0202 12:14:49.464801 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2de24d60_d18d_44ad_ac44_c89d52fdd86a.slice/crio-f5d3a95ba25cd0e8d5e9bbaa6c7022db5e602379c8fdb3a253d2fed1ce962b6f WatchSource:0}: Error finding container f5d3a95ba25cd0e8d5e9bbaa6c7022db5e602379c8fdb3a253d2fed1ce962b6f: Status 404 returned error 
can't find the container with id f5d3a95ba25cd0e8d5e9bbaa6c7022db5e602379c8fdb3a253d2fed1ce962b6f Feb 02 12:14:49 crc kubenswrapper[4909]: I0202 12:14:49.469970 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-tnrzd"] Feb 02 12:14:50 crc kubenswrapper[4909]: I0202 12:14:50.200953 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-zqtcd" event={"ID":"da7697a0-be27-4dac-b2bb-6af40732994e","Type":"ContainerStarted","Data":"4efb6677e8fb82866fe9ba424339a30cbca71058df17705583a6564fd386de29"} Feb 02 12:14:50 crc kubenswrapper[4909]: I0202 12:14:50.202248 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-848d587754-bgkpz" event={"ID":"0c1c9acf-33b3-4b87-ba96-0def73b7e9f9","Type":"ContainerStarted","Data":"a45622982fab4ddd022db0ca0114c4a008e17cb082aed2954eb5c310aa05f3e6"} Feb 02 12:14:50 crc kubenswrapper[4909]: I0202 12:14:50.216417 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-tnrzd" event={"ID":"2de24d60-d18d-44ad-ac44-c89d52fdd86a","Type":"ContainerStarted","Data":"f5d3a95ba25cd0e8d5e9bbaa6c7022db5e602379c8fdb3a253d2fed1ce962b6f"} Feb 02 12:15:00 crc kubenswrapper[4909]: I0202 12:15:00.167874 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500575-f8dqf"] Feb 02 12:15:00 crc kubenswrapper[4909]: I0202 12:15:00.170123 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500575-f8dqf" Feb 02 12:15:00 crc kubenswrapper[4909]: I0202 12:15:00.174205 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 02 12:15:00 crc kubenswrapper[4909]: I0202 12:15:00.174485 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 02 12:15:00 crc kubenswrapper[4909]: I0202 12:15:00.189089 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500575-f8dqf"] Feb 02 12:15:00 crc kubenswrapper[4909]: I0202 12:15:00.280590 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87fc55fc-05b5-44b1-9af2-d9ff019235fc-secret-volume\") pod \"collect-profiles-29500575-f8dqf\" (UID: \"87fc55fc-05b5-44b1-9af2-d9ff019235fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500575-f8dqf" Feb 02 12:15:00 crc kubenswrapper[4909]: I0202 12:15:00.280931 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbt87\" (UniqueName: \"kubernetes.io/projected/87fc55fc-05b5-44b1-9af2-d9ff019235fc-kube-api-access-jbt87\") pod \"collect-profiles-29500575-f8dqf\" (UID: \"87fc55fc-05b5-44b1-9af2-d9ff019235fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500575-f8dqf" Feb 02 12:15:00 crc kubenswrapper[4909]: I0202 12:15:00.281097 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87fc55fc-05b5-44b1-9af2-d9ff019235fc-config-volume\") pod \"collect-profiles-29500575-f8dqf\" (UID: \"87fc55fc-05b5-44b1-9af2-d9ff019235fc\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29500575-f8dqf" Feb 02 12:15:00 crc kubenswrapper[4909]: I0202 12:15:00.384514 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbt87\" (UniqueName: \"kubernetes.io/projected/87fc55fc-05b5-44b1-9af2-d9ff019235fc-kube-api-access-jbt87\") pod \"collect-profiles-29500575-f8dqf\" (UID: \"87fc55fc-05b5-44b1-9af2-d9ff019235fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500575-f8dqf" Feb 02 12:15:00 crc kubenswrapper[4909]: I0202 12:15:00.384626 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87fc55fc-05b5-44b1-9af2-d9ff019235fc-config-volume\") pod \"collect-profiles-29500575-f8dqf\" (UID: \"87fc55fc-05b5-44b1-9af2-d9ff019235fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500575-f8dqf" Feb 02 12:15:00 crc kubenswrapper[4909]: I0202 12:15:00.384794 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87fc55fc-05b5-44b1-9af2-d9ff019235fc-secret-volume\") pod \"collect-profiles-29500575-f8dqf\" (UID: \"87fc55fc-05b5-44b1-9af2-d9ff019235fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500575-f8dqf" Feb 02 12:15:00 crc kubenswrapper[4909]: I0202 12:15:00.386341 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87fc55fc-05b5-44b1-9af2-d9ff019235fc-config-volume\") pod \"collect-profiles-29500575-f8dqf\" (UID: \"87fc55fc-05b5-44b1-9af2-d9ff019235fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500575-f8dqf" Feb 02 12:15:00 crc kubenswrapper[4909]: I0202 12:15:00.410914 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/87fc55fc-05b5-44b1-9af2-d9ff019235fc-secret-volume\") pod \"collect-profiles-29500575-f8dqf\" (UID: \"87fc55fc-05b5-44b1-9af2-d9ff019235fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500575-f8dqf" Feb 02 12:15:00 crc kubenswrapper[4909]: I0202 12:15:00.455099 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbt87\" (UniqueName: \"kubernetes.io/projected/87fc55fc-05b5-44b1-9af2-d9ff019235fc-kube-api-access-jbt87\") pod \"collect-profiles-29500575-f8dqf\" (UID: \"87fc55fc-05b5-44b1-9af2-d9ff019235fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500575-f8dqf" Feb 02 12:15:00 crc kubenswrapper[4909]: I0202 12:15:00.514509 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500575-f8dqf" Feb 02 12:15:04 crc kubenswrapper[4909]: E0202 12:15:04.417068 4909 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:b5c8526d2ae660fe092dd8a7acf18ec4957d5c265890a222f55396fc2cdaeed8" Feb 02 12:15:04 crc kubenswrapper[4909]: E0202 12:15:04.417916 4909 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:perses-operator,Image:registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:b5c8526d2ae660fe092dd8a7acf18ec4957d5c265890a222f55396fc2cdaeed8,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{134217728 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openshift-service-ca,ReadOnly:true,MountPath:/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k4dkh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod perses-operator-5bf474d74f-tnrzd_openshift-operators(2de24d60-d18d-44ad-ac44-c89d52fdd86a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 02 12:15:04 crc kubenswrapper[4909]: E0202 12:15:04.419376 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"perses-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/perses-operator-5bf474d74f-tnrzd" podUID="2de24d60-d18d-44ad-ac44-c89d52fdd86a" Feb 02 12:15:05 crc kubenswrapper[4909]: I0202 12:15:05.013517 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500575-f8dqf"] Feb 02 12:15:05 crc kubenswrapper[4909]: I0202 12:15:05.396675 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-848d587754-bgkpz" event={"ID":"0c1c9acf-33b3-4b87-ba96-0def73b7e9f9","Type":"ContainerStarted","Data":"4a4d2b37c7cfc65647cb93bb277935875664e6a7cc1e24a0e66ef4e327e397b0"} Feb 02 12:15:05 crc kubenswrapper[4909]: I0202 12:15:05.400834 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500575-f8dqf" event={"ID":"87fc55fc-05b5-44b1-9af2-d9ff019235fc","Type":"ContainerStarted","Data":"09524936d3d2a4a3a0cfe95a26364adc09352cfd88260c68d9a4b10ab84a8ac5"} Feb 02 12:15:05 crc kubenswrapper[4909]: I0202 12:15:05.400892 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500575-f8dqf" event={"ID":"87fc55fc-05b5-44b1-9af2-d9ff019235fc","Type":"ContainerStarted","Data":"c117af1d2f7d61a29aaf6839a65ee876442e18993bd493125110f5b3782110a1"} Feb 02 12:15:05 crc kubenswrapper[4909]: I0202 12:15:05.404872 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-848d587754-hlmxx" event={"ID":"6dd6d59f-1261-4704-a129-6361cb00de58","Type":"ContainerStarted","Data":"6e56125f147079a1ef48f8ec469cdb56784175cba7451a8459f6f7b4cad8fe68"} Feb 02 12:15:05 crc kubenswrapper[4909]: I0202 12:15:05.406734 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2fsps" event={"ID":"c38385f8-d673-4443-91ff-2e2bb10686bf","Type":"ContainerStarted","Data":"d0e298e2fe89cc5843fdb5a55092f972d67e787df8a8f261e90ae7279313edeb"} Feb 02 12:15:05 crc kubenswrapper[4909]: I0202 12:15:05.410023 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-zqtcd" event={"ID":"da7697a0-be27-4dac-b2bb-6af40732994e","Type":"ContainerStarted","Data":"5fd835eda1263f697779b06212f15255477bb450ee12e93db8514c28db23d05c"} Feb 02 12:15:05 crc kubenswrapper[4909]: I0202 12:15:05.410072 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-zqtcd" Feb 02 12:15:05 crc kubenswrapper[4909]: E0202 12:15:05.411561 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:b5c8526d2ae660fe092dd8a7acf18ec4957d5c265890a222f55396fc2cdaeed8\\\"\"" pod="openshift-operators/perses-operator-5bf474d74f-tnrzd" podUID="2de24d60-d18d-44ad-ac44-c89d52fdd86a" Feb 02 12:15:05 crc kubenswrapper[4909]: I0202 12:15:05.411942 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-zqtcd" Feb 02 12:15:05 crc kubenswrapper[4909]: I0202 12:15:05.434668 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-848d587754-bgkpz" podStartSLOduration=3.187617967 podStartE2EDuration="18.434647032s" podCreationTimestamp="2026-02-02 12:14:47 +0000 UTC" firstStartedPulling="2026-02-02 12:14:49.235206939 +0000 UTC m=+6214.981307674" lastFinishedPulling="2026-02-02 12:15:04.482236004 +0000 UTC m=+6230.228336739" observedRunningTime="2026-02-02 12:15:05.417716792 +0000 
UTC m=+6231.163817527" watchObservedRunningTime="2026-02-02 12:15:05.434647032 +0000 UTC m=+6231.180747777" Feb 02 12:15:05 crc kubenswrapper[4909]: I0202 12:15:05.509836 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2fsps" podStartSLOduration=3.016184424 podStartE2EDuration="18.509798764s" podCreationTimestamp="2026-02-02 12:14:47 +0000 UTC" firstStartedPulling="2026-02-02 12:14:48.95501594 +0000 UTC m=+6214.701116665" lastFinishedPulling="2026-02-02 12:15:04.44863028 +0000 UTC m=+6230.194731005" observedRunningTime="2026-02-02 12:15:05.483943241 +0000 UTC m=+6231.230043976" watchObservedRunningTime="2026-02-02 12:15:05.509798764 +0000 UTC m=+6231.255899499" Feb 02 12:15:05 crc kubenswrapper[4909]: I0202 12:15:05.531098 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-848d587754-hlmxx" podStartSLOduration=3.089785382 podStartE2EDuration="18.531076248s" podCreationTimestamp="2026-02-02 12:14:47 +0000 UTC" firstStartedPulling="2026-02-02 12:14:49.007858919 +0000 UTC m=+6214.753959664" lastFinishedPulling="2026-02-02 12:15:04.449149795 +0000 UTC m=+6230.195250530" observedRunningTime="2026-02-02 12:15:05.516087833 +0000 UTC m=+6231.262188568" watchObservedRunningTime="2026-02-02 12:15:05.531076248 +0000 UTC m=+6231.277176973" Feb 02 12:15:05 crc kubenswrapper[4909]: I0202 12:15:05.542466 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-zqtcd" podStartSLOduration=3.408655008 podStartE2EDuration="18.54244827s" podCreationTimestamp="2026-02-02 12:14:47 +0000 UTC" firstStartedPulling="2026-02-02 12:14:49.349981175 +0000 UTC m=+6215.096081910" lastFinishedPulling="2026-02-02 12:15:04.483774437 +0000 UTC m=+6230.229875172" observedRunningTime="2026-02-02 12:15:05.541272897 +0000 UTC m=+6231.287373632" 
watchObservedRunningTime="2026-02-02 12:15:05.54244827 +0000 UTC m=+6231.288549005" Feb 02 12:15:05 crc kubenswrapper[4909]: I0202 12:15:05.580752 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29500575-f8dqf" podStartSLOduration=5.580727795 podStartE2EDuration="5.580727795s" podCreationTimestamp="2026-02-02 12:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:15:05.562290432 +0000 UTC m=+6231.308391167" watchObservedRunningTime="2026-02-02 12:15:05.580727795 +0000 UTC m=+6231.326828520" Feb 02 12:15:06 crc kubenswrapper[4909]: I0202 12:15:06.430093 4909 generic.go:334] "Generic (PLEG): container finished" podID="87fc55fc-05b5-44b1-9af2-d9ff019235fc" containerID="09524936d3d2a4a3a0cfe95a26364adc09352cfd88260c68d9a4b10ab84a8ac5" exitCode=0 Feb 02 12:15:06 crc kubenswrapper[4909]: I0202 12:15:06.430162 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500575-f8dqf" event={"ID":"87fc55fc-05b5-44b1-9af2-d9ff019235fc","Type":"ContainerDied","Data":"09524936d3d2a4a3a0cfe95a26364adc09352cfd88260c68d9a4b10ab84a8ac5"} Feb 02 12:15:07 crc kubenswrapper[4909]: I0202 12:15:07.901422 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500575-f8dqf" Feb 02 12:15:07 crc kubenswrapper[4909]: I0202 12:15:07.967113 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87fc55fc-05b5-44b1-9af2-d9ff019235fc-config-volume\") pod \"87fc55fc-05b5-44b1-9af2-d9ff019235fc\" (UID: \"87fc55fc-05b5-44b1-9af2-d9ff019235fc\") " Feb 02 12:15:07 crc kubenswrapper[4909]: I0202 12:15:07.967265 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87fc55fc-05b5-44b1-9af2-d9ff019235fc-secret-volume\") pod \"87fc55fc-05b5-44b1-9af2-d9ff019235fc\" (UID: \"87fc55fc-05b5-44b1-9af2-d9ff019235fc\") " Feb 02 12:15:07 crc kubenswrapper[4909]: I0202 12:15:07.967291 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbt87\" (UniqueName: \"kubernetes.io/projected/87fc55fc-05b5-44b1-9af2-d9ff019235fc-kube-api-access-jbt87\") pod \"87fc55fc-05b5-44b1-9af2-d9ff019235fc\" (UID: \"87fc55fc-05b5-44b1-9af2-d9ff019235fc\") " Feb 02 12:15:07 crc kubenswrapper[4909]: I0202 12:15:07.968007 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87fc55fc-05b5-44b1-9af2-d9ff019235fc-config-volume" (OuterVolumeSpecName: "config-volume") pod "87fc55fc-05b5-44b1-9af2-d9ff019235fc" (UID: "87fc55fc-05b5-44b1-9af2-d9ff019235fc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:15:07 crc kubenswrapper[4909]: I0202 12:15:07.973155 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87fc55fc-05b5-44b1-9af2-d9ff019235fc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "87fc55fc-05b5-44b1-9af2-d9ff019235fc" (UID: "87fc55fc-05b5-44b1-9af2-d9ff019235fc"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:15:07 crc kubenswrapper[4909]: I0202 12:15:07.974032 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87fc55fc-05b5-44b1-9af2-d9ff019235fc-kube-api-access-jbt87" (OuterVolumeSpecName: "kube-api-access-jbt87") pod "87fc55fc-05b5-44b1-9af2-d9ff019235fc" (UID: "87fc55fc-05b5-44b1-9af2-d9ff019235fc"). InnerVolumeSpecName "kube-api-access-jbt87". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:15:08 crc kubenswrapper[4909]: I0202 12:15:08.069585 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87fc55fc-05b5-44b1-9af2-d9ff019235fc-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 12:15:08 crc kubenswrapper[4909]: I0202 12:15:08.069619 4909 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87fc55fc-05b5-44b1-9af2-d9ff019235fc-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 02 12:15:08 crc kubenswrapper[4909]: I0202 12:15:08.069629 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbt87\" (UniqueName: \"kubernetes.io/projected/87fc55fc-05b5-44b1-9af2-d9ff019235fc-kube-api-access-jbt87\") on node \"crc\" DevicePath \"\"" Feb 02 12:15:08 crc kubenswrapper[4909]: I0202 12:15:08.086201 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500530-xv575"] Feb 02 12:15:08 crc kubenswrapper[4909]: I0202 12:15:08.097902 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500530-xv575"] Feb 02 12:15:08 crc kubenswrapper[4909]: I0202 12:15:08.459695 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500575-f8dqf" 
event={"ID":"87fc55fc-05b5-44b1-9af2-d9ff019235fc","Type":"ContainerDied","Data":"c117af1d2f7d61a29aaf6839a65ee876442e18993bd493125110f5b3782110a1"} Feb 02 12:15:08 crc kubenswrapper[4909]: I0202 12:15:08.459740 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c117af1d2f7d61a29aaf6839a65ee876442e18993bd493125110f5b3782110a1" Feb 02 12:15:08 crc kubenswrapper[4909]: I0202 12:15:08.459768 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500575-f8dqf" Feb 02 12:15:09 crc kubenswrapper[4909]: I0202 12:15:09.028871 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7d28527-82aa-4c77-b150-d94c4e9c4f32" path="/var/lib/kubelet/pods/f7d28527-82aa-4c77-b150-d94c4e9c4f32/volumes" Feb 02 12:15:15 crc kubenswrapper[4909]: I0202 12:15:15.038716 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-4633-account-create-update-h9zgz"] Feb 02 12:15:15 crc kubenswrapper[4909]: I0202 12:15:15.051919 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-4633-account-create-update-h9zgz"] Feb 02 12:15:15 crc kubenswrapper[4909]: I0202 12:15:15.062527 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-86npr"] Feb 02 12:15:15 crc kubenswrapper[4909]: I0202 12:15:15.072369 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-86npr"] Feb 02 12:15:17 crc kubenswrapper[4909]: I0202 12:15:17.028104 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94b252cb-ab91-45b8-90e3-848e58d8e15e" path="/var/lib/kubelet/pods/94b252cb-ab91-45b8-90e3-848e58d8e15e/volumes" Feb 02 12:15:17 crc kubenswrapper[4909]: I0202 12:15:17.031005 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a73632c6-a76c-4c6f-bf8b-b08b9e3c6092" 
path="/var/lib/kubelet/pods/a73632c6-a76c-4c6f-bf8b-b08b9e3c6092/volumes" Feb 02 12:15:20 crc kubenswrapper[4909]: I0202 12:15:20.568981 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-tnrzd" event={"ID":"2de24d60-d18d-44ad-ac44-c89d52fdd86a","Type":"ContainerStarted","Data":"9d098473740631f2a31429580e6c547aedffdacc97b616920504113ffa2c9f74"} Feb 02 12:15:20 crc kubenswrapper[4909]: I0202 12:15:20.569712 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-tnrzd" Feb 02 12:15:20 crc kubenswrapper[4909]: I0202 12:15:20.590411 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-tnrzd" podStartSLOduration=2.145091941 podStartE2EDuration="32.590388706s" podCreationTimestamp="2026-02-02 12:14:48 +0000 UTC" firstStartedPulling="2026-02-02 12:14:49.468654962 +0000 UTC m=+6215.214755697" lastFinishedPulling="2026-02-02 12:15:19.913951737 +0000 UTC m=+6245.660052462" observedRunningTime="2026-02-02 12:15:20.585500687 +0000 UTC m=+6246.331601422" watchObservedRunningTime="2026-02-02 12:15:20.590388706 +0000 UTC m=+6246.336489441" Feb 02 12:15:24 crc kubenswrapper[4909]: I0202 12:15:24.043467 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-zp7nf"] Feb 02 12:15:24 crc kubenswrapper[4909]: I0202 12:15:24.054909 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-zp7nf"] Feb 02 12:15:25 crc kubenswrapper[4909]: I0202 12:15:25.029064 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f879cfdb-2ad6-47ca-ac7c-cb892538ca36" path="/var/lib/kubelet/pods/f879cfdb-2ad6-47ca-ac7c-cb892538ca36/volumes" Feb 02 12:15:27 crc kubenswrapper[4909]: I0202 12:15:27.033563 4909 scope.go:117] "RemoveContainer" containerID="44e184f7db4057268a2723de85c6bf922fa1b132fbffd98e4fef4350870a0b6f" Feb 02 12:15:27 crc 
kubenswrapper[4909]: I0202 12:15:27.078511 4909 scope.go:117] "RemoveContainer" containerID="24f82203c598179471b653e1ca099546b9a9e9872e0df83fbebcc69ed7005811" Feb 02 12:15:27 crc kubenswrapper[4909]: I0202 12:15:27.117551 4909 scope.go:117] "RemoveContainer" containerID="29fcba3fe2ea1b72f8d6aa8a8a4c017a890fec1a0a7a6c956786a50d42a118bc" Feb 02 12:15:27 crc kubenswrapper[4909]: I0202 12:15:27.166558 4909 scope.go:117] "RemoveContainer" containerID="d0c26779153e8bc2446d85ab7c2136b32ffa0643deeac78846eb0e073ad7a6ba" Feb 02 12:15:28 crc kubenswrapper[4909]: I0202 12:15:28.521945 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-tnrzd" Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.525929 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.526461 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="a710e525-42ba-4dd6-baf7-514f315b2c26" containerName="openstackclient" containerID="cri-o://83f04bf35bbd5c3e95033ce0c843ef9043fd4dfff663e2b71a1315bd36aa94f8" gracePeriod=2 Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.541563 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.597411 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 02 12:15:30 crc kubenswrapper[4909]: E0202 12:15:30.597986 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a710e525-42ba-4dd6-baf7-514f315b2c26" containerName="openstackclient" Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.598005 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a710e525-42ba-4dd6-baf7-514f315b2c26" containerName="openstackclient" Feb 02 12:15:30 crc kubenswrapper[4909]: E0202 12:15:30.598044 4909 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="87fc55fc-05b5-44b1-9af2-d9ff019235fc" containerName="collect-profiles" Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.598053 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="87fc55fc-05b5-44b1-9af2-d9ff019235fc" containerName="collect-profiles" Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.598300 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="87fc55fc-05b5-44b1-9af2-d9ff019235fc" containerName="collect-profiles" Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.598325 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a710e525-42ba-4dd6-baf7-514f315b2c26" containerName="openstackclient" Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.599270 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.605505 4909 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="a710e525-42ba-4dd6-baf7-514f315b2c26" podUID="ff2c3534-4d6f-4325-ab60-8390fdaf0e91" Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.620405 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.690321 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ff2c3534-4d6f-4325-ab60-8390fdaf0e91-openstack-config-secret\") pod \"openstackclient\" (UID: \"ff2c3534-4d6f-4325-ab60-8390fdaf0e91\") " pod="openstack/openstackclient" Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.690438 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ff2c3534-4d6f-4325-ab60-8390fdaf0e91-openstack-config\") pod 
\"openstackclient\" (UID: \"ff2c3534-4d6f-4325-ab60-8390fdaf0e91\") " pod="openstack/openstackclient" Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.690575 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff2c3534-4d6f-4325-ab60-8390fdaf0e91-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ff2c3534-4d6f-4325-ab60-8390fdaf0e91\") " pod="openstack/openstackclient" Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.690598 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn8sx\" (UniqueName: \"kubernetes.io/projected/ff2c3534-4d6f-4325-ab60-8390fdaf0e91-kube-api-access-vn8sx\") pod \"openstackclient\" (UID: \"ff2c3534-4d6f-4325-ab60-8390fdaf0e91\") " pod="openstack/openstackclient" Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.744970 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.746615 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.751256 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-2lkkl" Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.768153 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.793971 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ff2c3534-4d6f-4325-ab60-8390fdaf0e91-openstack-config-secret\") pod \"openstackclient\" (UID: \"ff2c3534-4d6f-4325-ab60-8390fdaf0e91\") " pod="openstack/openstackclient" Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.794070 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ff2c3534-4d6f-4325-ab60-8390fdaf0e91-openstack-config\") pod \"openstackclient\" (UID: \"ff2c3534-4d6f-4325-ab60-8390fdaf0e91\") " pod="openstack/openstackclient" Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.794210 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff2c3534-4d6f-4325-ab60-8390fdaf0e91-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ff2c3534-4d6f-4325-ab60-8390fdaf0e91\") " pod="openstack/openstackclient" Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.794232 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn8sx\" (UniqueName: \"kubernetes.io/projected/ff2c3534-4d6f-4325-ab60-8390fdaf0e91-kube-api-access-vn8sx\") pod \"openstackclient\" (UID: \"ff2c3534-4d6f-4325-ab60-8390fdaf0e91\") " pod="openstack/openstackclient" Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.799057 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ff2c3534-4d6f-4325-ab60-8390fdaf0e91-openstack-config\") pod \"openstackclient\" (UID: \"ff2c3534-4d6f-4325-ab60-8390fdaf0e91\") " pod="openstack/openstackclient" Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.808670 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff2c3534-4d6f-4325-ab60-8390fdaf0e91-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ff2c3534-4d6f-4325-ab60-8390fdaf0e91\") " pod="openstack/openstackclient" Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.830334 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ff2c3534-4d6f-4325-ab60-8390fdaf0e91-openstack-config-secret\") pod \"openstackclient\" (UID: \"ff2c3534-4d6f-4325-ab60-8390fdaf0e91\") " pod="openstack/openstackclient" Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.843171 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn8sx\" (UniqueName: \"kubernetes.io/projected/ff2c3534-4d6f-4325-ab60-8390fdaf0e91-kube-api-access-vn8sx\") pod \"openstackclient\" (UID: \"ff2c3534-4d6f-4325-ab60-8390fdaf0e91\") " pod="openstack/openstackclient" Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.896082 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wprtw\" (UniqueName: \"kubernetes.io/projected/664bbecb-dcb0-42c9-8da4-83110fbcf138-kube-api-access-wprtw\") pod \"kube-state-metrics-0\" (UID: \"664bbecb-dcb0-42c9-8da4-83110fbcf138\") " pod="openstack/kube-state-metrics-0" Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.917689 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 02 12:15:30 crc kubenswrapper[4909]: I0202 12:15:30.997769 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wprtw\" (UniqueName: \"kubernetes.io/projected/664bbecb-dcb0-42c9-8da4-83110fbcf138-kube-api-access-wprtw\") pod \"kube-state-metrics-0\" (UID: \"664bbecb-dcb0-42c9-8da4-83110fbcf138\") " pod="openstack/kube-state-metrics-0" Feb 02 12:15:31 crc kubenswrapper[4909]: I0202 12:15:31.019936 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wprtw\" (UniqueName: \"kubernetes.io/projected/664bbecb-dcb0-42c9-8da4-83110fbcf138-kube-api-access-wprtw\") pod \"kube-state-metrics-0\" (UID: \"664bbecb-dcb0-42c9-8da4-83110fbcf138\") " pod="openstack/kube-state-metrics-0" Feb 02 12:15:31 crc kubenswrapper[4909]: I0202 12:15:31.064786 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 12:15:31 crc kubenswrapper[4909]: I0202 12:15:31.731690 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 02 12:15:31 crc kubenswrapper[4909]: I0202 12:15:31.745114 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 02 12:15:31 crc kubenswrapper[4909]: I0202 12:15:31.752533 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Feb 02 12:15:31 crc kubenswrapper[4909]: I0202 12:15:31.757175 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Feb 02 12:15:31 crc kubenswrapper[4909]: I0202 12:15:31.757474 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Feb 02 12:15:31 crc kubenswrapper[4909]: I0202 12:15:31.757609 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-pv9c8" Feb 02 12:15:31 crc kubenswrapper[4909]: I0202 12:15:31.757706 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Feb 02 12:15:31 crc kubenswrapper[4909]: I0202 12:15:31.770835 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 02 12:15:31 crc kubenswrapper[4909]: I0202 12:15:31.871132 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmwnv\" (UniqueName: \"kubernetes.io/projected/01a92576-40e1-46c8-b08b-13f40c8c4892-kube-api-access-cmwnv\") pod \"alertmanager-metric-storage-0\" (UID: \"01a92576-40e1-46c8-b08b-13f40c8c4892\") " pod="openstack/alertmanager-metric-storage-0" Feb 02 12:15:31 crc kubenswrapper[4909]: I0202 12:15:31.871264 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/01a92576-40e1-46c8-b08b-13f40c8c4892-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"01a92576-40e1-46c8-b08b-13f40c8c4892\") " 
pod="openstack/alertmanager-metric-storage-0" Feb 02 12:15:31 crc kubenswrapper[4909]: I0202 12:15:31.871301 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/01a92576-40e1-46c8-b08b-13f40c8c4892-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"01a92576-40e1-46c8-b08b-13f40c8c4892\") " pod="openstack/alertmanager-metric-storage-0" Feb 02 12:15:31 crc kubenswrapper[4909]: I0202 12:15:31.871327 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/01a92576-40e1-46c8-b08b-13f40c8c4892-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"01a92576-40e1-46c8-b08b-13f40c8c4892\") " pod="openstack/alertmanager-metric-storage-0" Feb 02 12:15:31 crc kubenswrapper[4909]: I0202 12:15:31.871363 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/01a92576-40e1-46c8-b08b-13f40c8c4892-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"01a92576-40e1-46c8-b08b-13f40c8c4892\") " pod="openstack/alertmanager-metric-storage-0" Feb 02 12:15:31 crc kubenswrapper[4909]: I0202 12:15:31.871389 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/01a92576-40e1-46c8-b08b-13f40c8c4892-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"01a92576-40e1-46c8-b08b-13f40c8c4892\") " pod="openstack/alertmanager-metric-storage-0" Feb 02 12:15:31 crc kubenswrapper[4909]: I0202 12:15:31.871501 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/01a92576-40e1-46c8-b08b-13f40c8c4892-config-volume\") pod \"alertmanager-metric-storage-0\" 
(UID: \"01a92576-40e1-46c8-b08b-13f40c8c4892\") " pod="openstack/alertmanager-metric-storage-0" Feb 02 12:15:31 crc kubenswrapper[4909]: I0202 12:15:31.938275 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 02 12:15:31 crc kubenswrapper[4909]: I0202 12:15:31.974938 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/01a92576-40e1-46c8-b08b-13f40c8c4892-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"01a92576-40e1-46c8-b08b-13f40c8c4892\") " pod="openstack/alertmanager-metric-storage-0" Feb 02 12:15:31 crc kubenswrapper[4909]: I0202 12:15:31.975057 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmwnv\" (UniqueName: \"kubernetes.io/projected/01a92576-40e1-46c8-b08b-13f40c8c4892-kube-api-access-cmwnv\") pod \"alertmanager-metric-storage-0\" (UID: \"01a92576-40e1-46c8-b08b-13f40c8c4892\") " pod="openstack/alertmanager-metric-storage-0" Feb 02 12:15:31 crc kubenswrapper[4909]: I0202 12:15:31.975133 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/01a92576-40e1-46c8-b08b-13f40c8c4892-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"01a92576-40e1-46c8-b08b-13f40c8c4892\") " pod="openstack/alertmanager-metric-storage-0" Feb 02 12:15:31 crc kubenswrapper[4909]: I0202 12:15:31.975163 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/01a92576-40e1-46c8-b08b-13f40c8c4892-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"01a92576-40e1-46c8-b08b-13f40c8c4892\") " pod="openstack/alertmanager-metric-storage-0" Feb 02 12:15:31 crc kubenswrapper[4909]: I0202 12:15:31.975183 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"web-config\" (UniqueName: \"kubernetes.io/secret/01a92576-40e1-46c8-b08b-13f40c8c4892-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"01a92576-40e1-46c8-b08b-13f40c8c4892\") " pod="openstack/alertmanager-metric-storage-0" Feb 02 12:15:31 crc kubenswrapper[4909]: I0202 12:15:31.975210 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/01a92576-40e1-46c8-b08b-13f40c8c4892-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"01a92576-40e1-46c8-b08b-13f40c8c4892\") " pod="openstack/alertmanager-metric-storage-0" Feb 02 12:15:31 crc kubenswrapper[4909]: I0202 12:15:31.975232 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/01a92576-40e1-46c8-b08b-13f40c8c4892-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"01a92576-40e1-46c8-b08b-13f40c8c4892\") " pod="openstack/alertmanager-metric-storage-0" Feb 02 12:15:31 crc kubenswrapper[4909]: I0202 12:15:31.976047 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/01a92576-40e1-46c8-b08b-13f40c8c4892-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"01a92576-40e1-46c8-b08b-13f40c8c4892\") " pod="openstack/alertmanager-metric-storage-0" Feb 02 12:15:31 crc kubenswrapper[4909]: I0202 12:15:31.980548 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/01a92576-40e1-46c8-b08b-13f40c8c4892-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"01a92576-40e1-46c8-b08b-13f40c8c4892\") " pod="openstack/alertmanager-metric-storage-0" Feb 02 12:15:31 crc kubenswrapper[4909]: I0202 12:15:31.996601 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/01a92576-40e1-46c8-b08b-13f40c8c4892-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"01a92576-40e1-46c8-b08b-13f40c8c4892\") " pod="openstack/alertmanager-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.017485 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/01a92576-40e1-46c8-b08b-13f40c8c4892-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"01a92576-40e1-46c8-b08b-13f40c8c4892\") " pod="openstack/alertmanager-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.019559 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/01a92576-40e1-46c8-b08b-13f40c8c4892-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"01a92576-40e1-46c8-b08b-13f40c8c4892\") " pod="openstack/alertmanager-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.020778 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/01a92576-40e1-46c8-b08b-13f40c8c4892-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"01a92576-40e1-46c8-b08b-13f40c8c4892\") " pod="openstack/alertmanager-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.038928 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmwnv\" (UniqueName: \"kubernetes.io/projected/01a92576-40e1-46c8-b08b-13f40c8c4892-kube-api-access-cmwnv\") pod \"alertmanager-metric-storage-0\" (UID: \"01a92576-40e1-46c8-b08b-13f40c8c4892\") " pod="openstack/alertmanager-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.153879 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.157611 4909 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.168307 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.168341 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.168387 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.168541 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-8bc7b" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.168579 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.168707 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.168732 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.168837 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.171042 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.280992 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.359635 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.381063 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/79ff4955-01c1-4a09-b622-1c29d68a1a95-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.381198 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/79ff4955-01c1-4a09-b622-1c29d68a1a95-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.381226 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/79ff4955-01c1-4a09-b622-1c29d68a1a95-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.381706 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/79ff4955-01c1-4a09-b622-1c29d68a1a95-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc 
kubenswrapper[4909]: I0202 12:15:32.381797 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snk8n\" (UniqueName: \"kubernetes.io/projected/79ff4955-01c1-4a09-b622-1c29d68a1a95-kube-api-access-snk8n\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.381938 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/79ff4955-01c1-4a09-b622-1c29d68a1a95-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.381961 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0480defb-d066-47ed-a96f-13341ddc88d7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0480defb-d066-47ed-a96f-13341ddc88d7\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.381979 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/79ff4955-01c1-4a09-b622-1c29d68a1a95-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.382015 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/79ff4955-01c1-4a09-b622-1c29d68a1a95-config\") pod \"prometheus-metric-storage-0\" (UID: 
\"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.382063 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/79ff4955-01c1-4a09-b622-1c29d68a1a95-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.484736 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/79ff4955-01c1-4a09-b622-1c29d68a1a95-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.484919 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/79ff4955-01c1-4a09-b622-1c29d68a1a95-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.484956 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/79ff4955-01c1-4a09-b622-1c29d68a1a95-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.485013 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/79ff4955-01c1-4a09-b622-1c29d68a1a95-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.485069 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snk8n\" (UniqueName: \"kubernetes.io/projected/79ff4955-01c1-4a09-b622-1c29d68a1a95-kube-api-access-snk8n\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.485152 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/79ff4955-01c1-4a09-b622-1c29d68a1a95-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.485177 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0480defb-d066-47ed-a96f-13341ddc88d7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0480defb-d066-47ed-a96f-13341ddc88d7\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.485193 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/79ff4955-01c1-4a09-b622-1c29d68a1a95-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.485219 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/secret/79ff4955-01c1-4a09-b622-1c29d68a1a95-config\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.485253 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/79ff4955-01c1-4a09-b622-1c29d68a1a95-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.486622 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/79ff4955-01c1-4a09-b622-1c29d68a1a95-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.486624 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/79ff4955-01c1-4a09-b622-1c29d68a1a95-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.487910 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/79ff4955-01c1-4a09-b622-1c29d68a1a95-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.491103 4909 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/79ff4955-01c1-4a09-b622-1c29d68a1a95-config\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.491674 4909 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.491700 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0480defb-d066-47ed-a96f-13341ddc88d7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0480defb-d066-47ed-a96f-13341ddc88d7\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/01d20f6e279cfc29a4c60dc2c6f23d34cac9e38a012d3a024f6e025f113c80d1/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.491736 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/79ff4955-01c1-4a09-b622-1c29d68a1a95-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.493038 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/79ff4955-01c1-4a09-b622-1c29d68a1a95-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.493131 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" 
(UniqueName: \"kubernetes.io/empty-dir/79ff4955-01c1-4a09-b622-1c29d68a1a95-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.499351 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/79ff4955-01c1-4a09-b622-1c29d68a1a95-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.510511 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snk8n\" (UniqueName: \"kubernetes.io/projected/79ff4955-01c1-4a09-b622-1c29d68a1a95-kube-api-access-snk8n\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.562396 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0480defb-d066-47ed-a96f-13341ddc88d7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0480defb-d066-47ed-a96f-13341ddc88d7\") pod \"prometheus-metric-storage-0\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.735039 4909 generic.go:334] "Generic (PLEG): container finished" podID="a710e525-42ba-4dd6-baf7-514f315b2c26" containerID="83f04bf35bbd5c3e95033ce0c843ef9043fd4dfff663e2b71a1315bd36aa94f8" exitCode=137 Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.737352 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ff2c3534-4d6f-4325-ab60-8390fdaf0e91","Type":"ContainerStarted","Data":"7bce57cb58810c557aa2afc9c680dbbd339438cbd85185f74a5799b27ef75bc4"} Feb 02 12:15:32 crc 
kubenswrapper[4909]: I0202 12:15:32.737404 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ff2c3534-4d6f-4325-ab60-8390fdaf0e91","Type":"ContainerStarted","Data":"2e4498632dbfb9dbfc05eea9c37c8b206996652b2a849c733ede6a42ac26a477"} Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.747050 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"664bbecb-dcb0-42c9-8da4-83110fbcf138","Type":"ContainerStarted","Data":"6173cc6bda83fd22d5f90fe1a12ebc3acd4a20e3d52faf032a1c52d84504b57a"} Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.758125 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.758098985 podStartE2EDuration="2.758098985s" podCreationTimestamp="2026-02-02 12:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:15:32.753037272 +0000 UTC m=+6258.499138007" watchObservedRunningTime="2026-02-02 12:15:32.758098985 +0000 UTC m=+6258.504199720" Feb 02 12:15:32 crc kubenswrapper[4909]: I0202 12:15:32.852933 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 02 12:15:33 crc kubenswrapper[4909]: I0202 12:15:33.011705 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 02 12:15:33 crc kubenswrapper[4909]: W0202 12:15:33.029589 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01a92576_40e1_46c8_b08b_13f40c8c4892.slice/crio-4f2e9d51af8eb859049189312b2722eeeb8d4295f956350db8de89d5105b8d66 WatchSource:0}: Error finding container 4f2e9d51af8eb859049189312b2722eeeb8d4295f956350db8de89d5105b8d66: Status 404 returned error can't find the container with id 4f2e9d51af8eb859049189312b2722eeeb8d4295f956350db8de89d5105b8d66 Feb 02 12:15:33 crc kubenswrapper[4909]: I0202 12:15:33.200047 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 02 12:15:33 crc kubenswrapper[4909]: I0202 12:15:33.309224 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a710e525-42ba-4dd6-baf7-514f315b2c26-openstack-config-secret\") pod \"a710e525-42ba-4dd6-baf7-514f315b2c26\" (UID: \"a710e525-42ba-4dd6-baf7-514f315b2c26\") " Feb 02 12:15:33 crc kubenswrapper[4909]: I0202 12:15:33.309641 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a710e525-42ba-4dd6-baf7-514f315b2c26-openstack-config\") pod \"a710e525-42ba-4dd6-baf7-514f315b2c26\" (UID: \"a710e525-42ba-4dd6-baf7-514f315b2c26\") " Feb 02 12:15:33 crc kubenswrapper[4909]: I0202 12:15:33.309719 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zpbc\" (UniqueName: \"kubernetes.io/projected/a710e525-42ba-4dd6-baf7-514f315b2c26-kube-api-access-6zpbc\") pod \"a710e525-42ba-4dd6-baf7-514f315b2c26\" 
(UID: \"a710e525-42ba-4dd6-baf7-514f315b2c26\") " Feb 02 12:15:33 crc kubenswrapper[4909]: I0202 12:15:33.309789 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a710e525-42ba-4dd6-baf7-514f315b2c26-combined-ca-bundle\") pod \"a710e525-42ba-4dd6-baf7-514f315b2c26\" (UID: \"a710e525-42ba-4dd6-baf7-514f315b2c26\") " Feb 02 12:15:33 crc kubenswrapper[4909]: I0202 12:15:33.394165 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a710e525-42ba-4dd6-baf7-514f315b2c26-kube-api-access-6zpbc" (OuterVolumeSpecName: "kube-api-access-6zpbc") pod "a710e525-42ba-4dd6-baf7-514f315b2c26" (UID: "a710e525-42ba-4dd6-baf7-514f315b2c26"). InnerVolumeSpecName "kube-api-access-6zpbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:15:33 crc kubenswrapper[4909]: I0202 12:15:33.425488 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zpbc\" (UniqueName: \"kubernetes.io/projected/a710e525-42ba-4dd6-baf7-514f315b2c26-kube-api-access-6zpbc\") on node \"crc\" DevicePath \"\"" Feb 02 12:15:33 crc kubenswrapper[4909]: I0202 12:15:33.489147 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a710e525-42ba-4dd6-baf7-514f315b2c26-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "a710e525-42ba-4dd6-baf7-514f315b2c26" (UID: "a710e525-42ba-4dd6-baf7-514f315b2c26"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:15:33 crc kubenswrapper[4909]: I0202 12:15:33.522132 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a710e525-42ba-4dd6-baf7-514f315b2c26-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a710e525-42ba-4dd6-baf7-514f315b2c26" (UID: "a710e525-42ba-4dd6-baf7-514f315b2c26"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:15:33 crc kubenswrapper[4909]: I0202 12:15:33.538443 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a710e525-42ba-4dd6-baf7-514f315b2c26-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:15:33 crc kubenswrapper[4909]: I0202 12:15:33.538482 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a710e525-42ba-4dd6-baf7-514f315b2c26-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 02 12:15:33 crc kubenswrapper[4909]: I0202 12:15:33.549005 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a710e525-42ba-4dd6-baf7-514f315b2c26-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "a710e525-42ba-4dd6-baf7-514f315b2c26" (UID: "a710e525-42ba-4dd6-baf7-514f315b2c26"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:15:33 crc kubenswrapper[4909]: I0202 12:15:33.640493 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a710e525-42ba-4dd6-baf7-514f315b2c26-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 02 12:15:33 crc kubenswrapper[4909]: I0202 12:15:33.680032 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 12:15:33 crc kubenswrapper[4909]: W0202 12:15:33.687337 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79ff4955_01c1_4a09_b622_1c29d68a1a95.slice/crio-773524db041d9bb1249d32f9c2521693b026f1b81da58fc70b5d77138f698cd9 WatchSource:0}: Error finding container 773524db041d9bb1249d32f9c2521693b026f1b81da58fc70b5d77138f698cd9: Status 404 returned error can't find the container with id 773524db041d9bb1249d32f9c2521693b026f1b81da58fc70b5d77138f698cd9 Feb 02 12:15:33 crc kubenswrapper[4909]: I0202 12:15:33.757501 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"79ff4955-01c1-4a09-b622-1c29d68a1a95","Type":"ContainerStarted","Data":"773524db041d9bb1249d32f9c2521693b026f1b81da58fc70b5d77138f698cd9"} Feb 02 12:15:33 crc kubenswrapper[4909]: I0202 12:15:33.760628 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"664bbecb-dcb0-42c9-8da4-83110fbcf138","Type":"ContainerStarted","Data":"c2cf9a24c99f77fc0a23813f79b9805519b53df0cb341e4afdbb4213d1cdb48a"} Feb 02 12:15:33 crc kubenswrapper[4909]: I0202 12:15:33.760692 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 02 12:15:33 crc kubenswrapper[4909]: I0202 12:15:33.761719 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" 
event={"ID":"01a92576-40e1-46c8-b08b-13f40c8c4892","Type":"ContainerStarted","Data":"4f2e9d51af8eb859049189312b2722eeeb8d4295f956350db8de89d5105b8d66"} Feb 02 12:15:33 crc kubenswrapper[4909]: I0202 12:15:33.767685 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 02 12:15:33 crc kubenswrapper[4909]: I0202 12:15:33.769062 4909 scope.go:117] "RemoveContainer" containerID="83f04bf35bbd5c3e95033ce0c843ef9043fd4dfff663e2b71a1315bd36aa94f8" Feb 02 12:15:33 crc kubenswrapper[4909]: I0202 12:15:33.788415 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.3516817850000002 podStartE2EDuration="3.788397154s" podCreationTimestamp="2026-02-02 12:15:30 +0000 UTC" firstStartedPulling="2026-02-02 12:15:32.284942153 +0000 UTC m=+6258.031042888" lastFinishedPulling="2026-02-02 12:15:32.721657522 +0000 UTC m=+6258.467758257" observedRunningTime="2026-02-02 12:15:33.787290882 +0000 UTC m=+6259.533391617" watchObservedRunningTime="2026-02-02 12:15:33.788397154 +0000 UTC m=+6259.534497889" Feb 02 12:15:33 crc kubenswrapper[4909]: I0202 12:15:33.791485 4909 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="a710e525-42ba-4dd6-baf7-514f315b2c26" podUID="ff2c3534-4d6f-4325-ab60-8390fdaf0e91" Feb 02 12:15:35 crc kubenswrapper[4909]: I0202 12:15:35.034700 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a710e525-42ba-4dd6-baf7-514f315b2c26" path="/var/lib/kubelet/pods/a710e525-42ba-4dd6-baf7-514f315b2c26/volumes" Feb 02 12:15:39 crc kubenswrapper[4909]: I0202 12:15:39.854967 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"01a92576-40e1-46c8-b08b-13f40c8c4892","Type":"ContainerStarted","Data":"fab55c53c42c09cfbdadc2d8f2355967ed286e762d8185d69bfc87763e65cee3"} Feb 02 12:15:39 crc 
kubenswrapper[4909]: I0202 12:15:39.857681 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"79ff4955-01c1-4a09-b622-1c29d68a1a95","Type":"ContainerStarted","Data":"19939baf56adb5ef0a2ef7effca179c320cb0f8b84f618dc1ba1bf5aa1d3b086"} Feb 02 12:15:41 crc kubenswrapper[4909]: I0202 12:15:41.068733 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 02 12:15:45 crc kubenswrapper[4909]: I0202 12:15:45.908557 4909 generic.go:334] "Generic (PLEG): container finished" podID="79ff4955-01c1-4a09-b622-1c29d68a1a95" containerID="19939baf56adb5ef0a2ef7effca179c320cb0f8b84f618dc1ba1bf5aa1d3b086" exitCode=0 Feb 02 12:15:45 crc kubenswrapper[4909]: I0202 12:15:45.908640 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"79ff4955-01c1-4a09-b622-1c29d68a1a95","Type":"ContainerDied","Data":"19939baf56adb5ef0a2ef7effca179c320cb0f8b84f618dc1ba1bf5aa1d3b086"} Feb 02 12:15:45 crc kubenswrapper[4909]: I0202 12:15:45.912020 4909 generic.go:334] "Generic (PLEG): container finished" podID="01a92576-40e1-46c8-b08b-13f40c8c4892" containerID="fab55c53c42c09cfbdadc2d8f2355967ed286e762d8185d69bfc87763e65cee3" exitCode=0 Feb 02 12:15:45 crc kubenswrapper[4909]: I0202 12:15:45.912100 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"01a92576-40e1-46c8-b08b-13f40c8c4892","Type":"ContainerDied","Data":"fab55c53c42c09cfbdadc2d8f2355967ed286e762d8185d69bfc87763e65cee3"} Feb 02 12:15:48 crc kubenswrapper[4909]: I0202 12:15:48.945666 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"01a92576-40e1-46c8-b08b-13f40c8c4892","Type":"ContainerStarted","Data":"d677452f6b04bd750385a1cdb8176381fdf088fe1b351e85667d761ad93c6712"} Feb 02 12:15:51 crc kubenswrapper[4909]: I0202 12:15:51.973660 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"01a92576-40e1-46c8-b08b-13f40c8c4892","Type":"ContainerStarted","Data":"93ee27857b7b62e3b53cd5763fd0a0738440753b30503d32c85044d2ec4a6487"} Feb 02 12:15:51 crc kubenswrapper[4909]: I0202 12:15:51.974225 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Feb 02 12:15:51 crc kubenswrapper[4909]: I0202 12:15:51.976465 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Feb 02 12:15:51 crc kubenswrapper[4909]: I0202 12:15:51.976832 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"79ff4955-01c1-4a09-b622-1c29d68a1a95","Type":"ContainerStarted","Data":"ad13553b622818cf7fd80d3d5c84fbfc1788a7da980f7fbdcb7ca1e5720fd0c8"} Feb 02 12:15:52 crc kubenswrapper[4909]: I0202 12:15:52.011175 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=5.904969536 podStartE2EDuration="21.011151965s" podCreationTimestamp="2026-02-02 12:15:31 +0000 UTC" firstStartedPulling="2026-02-02 12:15:33.032800818 +0000 UTC m=+6258.778901553" lastFinishedPulling="2026-02-02 12:15:48.138983247 +0000 UTC m=+6273.885083982" observedRunningTime="2026-02-02 12:15:52.001606564 +0000 UTC m=+6277.747707299" watchObservedRunningTime="2026-02-02 12:15:52.011151965 +0000 UTC m=+6277.757252700" Feb 02 12:15:55 crc kubenswrapper[4909]: I0202 12:15:55.007334 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"79ff4955-01c1-4a09-b622-1c29d68a1a95","Type":"ContainerStarted","Data":"e35f0bd98f6b9eda19127bb2da002631c125c237ccc7d848b01ca226b4542293"} Feb 02 12:15:55 crc kubenswrapper[4909]: I0202 12:15:55.048562 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/placement-f2d4-account-create-update-wsl5j"] Feb 02 12:15:55 crc kubenswrapper[4909]: I0202 12:15:55.059010 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-f2d4-account-create-update-wsl5j"] Feb 02 12:15:56 crc kubenswrapper[4909]: I0202 12:15:56.034475 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-tk8f2"] Feb 02 12:15:56 crc kubenswrapper[4909]: I0202 12:15:56.048676 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-tk8f2"] Feb 02 12:15:57 crc kubenswrapper[4909]: I0202 12:15:57.039948 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1428ae65-f2dd-4d55-8e4b-5da119761240" path="/var/lib/kubelet/pods/1428ae65-f2dd-4d55-8e4b-5da119761240/volumes" Feb 02 12:15:57 crc kubenswrapper[4909]: I0202 12:15:57.042689 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19ceb788-98ed-4b63-86c6-c872b71e4a4c" path="/var/lib/kubelet/pods/19ceb788-98ed-4b63-86c6-c872b71e4a4c/volumes" Feb 02 12:15:58 crc kubenswrapper[4909]: I0202 12:15:58.060133 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"79ff4955-01c1-4a09-b622-1c29d68a1a95","Type":"ContainerStarted","Data":"23ed671a7117b2d29494692136ae2680fdc5576606348a60de74e77a30850eb9"} Feb 02 12:15:58 crc kubenswrapper[4909]: I0202 12:15:58.104424 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.230061773 podStartE2EDuration="27.10439372s" podCreationTimestamp="2026-02-02 12:15:31 +0000 UTC" firstStartedPulling="2026-02-02 12:15:33.691956678 +0000 UTC m=+6259.438057403" lastFinishedPulling="2026-02-02 12:15:57.566288615 +0000 UTC m=+6283.312389350" observedRunningTime="2026-02-02 12:15:58.096198568 +0000 UTC m=+6283.842299313" watchObservedRunningTime="2026-02-02 12:15:58.10439372 +0000 UTC m=+6283.850494455" 
Feb 02 12:16:01 crc kubenswrapper[4909]: I0202 12:16:01.033003 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-mj84r"] Feb 02 12:16:01 crc kubenswrapper[4909]: I0202 12:16:01.044257 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-mj84r"] Feb 02 12:16:02 crc kubenswrapper[4909]: I0202 12:16:02.854666 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:02 crc kubenswrapper[4909]: I0202 12:16:02.854721 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:02 crc kubenswrapper[4909]: I0202 12:16:02.857203 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:03 crc kubenswrapper[4909]: I0202 12:16:03.028236 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="907a52c1-86b1-4e4f-a18e-5089411f1704" path="/var/lib/kubelet/pods/907a52c1-86b1-4e4f-a18e-5089411f1704/volumes" Feb 02 12:16:03 crc kubenswrapper[4909]: I0202 12:16:03.121657 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:04 crc kubenswrapper[4909]: I0202 12:16:04.084680 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 02 12:16:04 crc kubenswrapper[4909]: I0202 12:16:04.085393 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="ff2c3534-4d6f-4325-ab60-8390fdaf0e91" containerName="openstackclient" containerID="cri-o://7bce57cb58810c557aa2afc9c680dbbd339438cbd85185f74a5799b27ef75bc4" gracePeriod=2 Feb 02 12:16:04 crc kubenswrapper[4909]: I0202 12:16:04.096244 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 02 12:16:04 crc kubenswrapper[4909]: I0202 
12:16:04.120763 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 02 12:16:04 crc kubenswrapper[4909]: E0202 12:16:04.121317 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff2c3534-4d6f-4325-ab60-8390fdaf0e91" containerName="openstackclient" Feb 02 12:16:04 crc kubenswrapper[4909]: I0202 12:16:04.121344 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff2c3534-4d6f-4325-ab60-8390fdaf0e91" containerName="openstackclient" Feb 02 12:16:04 crc kubenswrapper[4909]: I0202 12:16:04.121619 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff2c3534-4d6f-4325-ab60-8390fdaf0e91" containerName="openstackclient" Feb 02 12:16:04 crc kubenswrapper[4909]: I0202 12:16:04.122507 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 02 12:16:04 crc kubenswrapper[4909]: I0202 12:16:04.130638 4909 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="ff2c3534-4d6f-4325-ab60-8390fdaf0e91" podUID="f2e9096b-893f-4741-92f0-70a241bcd035" Feb 02 12:16:04 crc kubenswrapper[4909]: I0202 12:16:04.132077 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 02 12:16:04 crc kubenswrapper[4909]: I0202 12:16:04.188398 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhk4s\" (UniqueName: \"kubernetes.io/projected/f2e9096b-893f-4741-92f0-70a241bcd035-kube-api-access-lhk4s\") pod \"openstackclient\" (UID: \"f2e9096b-893f-4741-92f0-70a241bcd035\") " pod="openstack/openstackclient" Feb 02 12:16:04 crc kubenswrapper[4909]: I0202 12:16:04.188497 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f2e9096b-893f-4741-92f0-70a241bcd035-openstack-config\") pod 
\"openstackclient\" (UID: \"f2e9096b-893f-4741-92f0-70a241bcd035\") " pod="openstack/openstackclient" Feb 02 12:16:04 crc kubenswrapper[4909]: I0202 12:16:04.188523 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f2e9096b-893f-4741-92f0-70a241bcd035-openstack-config-secret\") pod \"openstackclient\" (UID: \"f2e9096b-893f-4741-92f0-70a241bcd035\") " pod="openstack/openstackclient" Feb 02 12:16:04 crc kubenswrapper[4909]: I0202 12:16:04.188582 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2e9096b-893f-4741-92f0-70a241bcd035-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f2e9096b-893f-4741-92f0-70a241bcd035\") " pod="openstack/openstackclient" Feb 02 12:16:04 crc kubenswrapper[4909]: I0202 12:16:04.290249 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhk4s\" (UniqueName: \"kubernetes.io/projected/f2e9096b-893f-4741-92f0-70a241bcd035-kube-api-access-lhk4s\") pod \"openstackclient\" (UID: \"f2e9096b-893f-4741-92f0-70a241bcd035\") " pod="openstack/openstackclient" Feb 02 12:16:04 crc kubenswrapper[4909]: I0202 12:16:04.290318 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f2e9096b-893f-4741-92f0-70a241bcd035-openstack-config\") pod \"openstackclient\" (UID: \"f2e9096b-893f-4741-92f0-70a241bcd035\") " pod="openstack/openstackclient" Feb 02 12:16:04 crc kubenswrapper[4909]: I0202 12:16:04.290339 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f2e9096b-893f-4741-92f0-70a241bcd035-openstack-config-secret\") pod \"openstackclient\" (UID: \"f2e9096b-893f-4741-92f0-70a241bcd035\") " 
pod="openstack/openstackclient" Feb 02 12:16:04 crc kubenswrapper[4909]: I0202 12:16:04.290389 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2e9096b-893f-4741-92f0-70a241bcd035-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f2e9096b-893f-4741-92f0-70a241bcd035\") " pod="openstack/openstackclient" Feb 02 12:16:04 crc kubenswrapper[4909]: I0202 12:16:04.291857 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f2e9096b-893f-4741-92f0-70a241bcd035-openstack-config\") pod \"openstackclient\" (UID: \"f2e9096b-893f-4741-92f0-70a241bcd035\") " pod="openstack/openstackclient" Feb 02 12:16:04 crc kubenswrapper[4909]: I0202 12:16:04.299550 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f2e9096b-893f-4741-92f0-70a241bcd035-openstack-config-secret\") pod \"openstackclient\" (UID: \"f2e9096b-893f-4741-92f0-70a241bcd035\") " pod="openstack/openstackclient" Feb 02 12:16:04 crc kubenswrapper[4909]: I0202 12:16:04.299666 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2e9096b-893f-4741-92f0-70a241bcd035-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f2e9096b-893f-4741-92f0-70a241bcd035\") " pod="openstack/openstackclient" Feb 02 12:16:04 crc kubenswrapper[4909]: I0202 12:16:04.319355 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhk4s\" (UniqueName: \"kubernetes.io/projected/f2e9096b-893f-4741-92f0-70a241bcd035-kube-api-access-lhk4s\") pod \"openstackclient\" (UID: \"f2e9096b-893f-4741-92f0-70a241bcd035\") " pod="openstack/openstackclient" Feb 02 12:16:04 crc kubenswrapper[4909]: I0202 12:16:04.445769 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 02 12:16:05 crc kubenswrapper[4909]: I0202 12:16:05.040523 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 02 12:16:05 crc kubenswrapper[4909]: I0202 12:16:05.144174 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f2e9096b-893f-4741-92f0-70a241bcd035","Type":"ContainerStarted","Data":"b7d51875077d70e1c8fd8823132df854a18b2a3194ab284b9024fd4c19dce6db"} Feb 02 12:16:05 crc kubenswrapper[4909]: I0202 12:16:05.596513 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 12:16:05 crc kubenswrapper[4909]: I0202 12:16:05.597126 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="79ff4955-01c1-4a09-b622-1c29d68a1a95" containerName="prometheus" containerID="cri-o://ad13553b622818cf7fd80d3d5c84fbfc1788a7da980f7fbdcb7ca1e5720fd0c8" gracePeriod=600 Feb 02 12:16:05 crc kubenswrapper[4909]: I0202 12:16:05.597295 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="79ff4955-01c1-4a09-b622-1c29d68a1a95" containerName="config-reloader" containerID="cri-o://e35f0bd98f6b9eda19127bb2da002631c125c237ccc7d848b01ca226b4542293" gracePeriod=600 Feb 02 12:16:05 crc kubenswrapper[4909]: I0202 12:16:05.597295 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="79ff4955-01c1-4a09-b622-1c29d68a1a95" containerName="thanos-sidecar" containerID="cri-o://23ed671a7117b2d29494692136ae2680fdc5576606348a60de74e77a30850eb9" gracePeriod=600 Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.156960 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"f2e9096b-893f-4741-92f0-70a241bcd035","Type":"ContainerStarted","Data":"e2e0c200018ac4e9eaa5025f32d64a520c486a62b701ef15db627207610735d5"} Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.168848 4909 generic.go:334] "Generic (PLEG): container finished" podID="79ff4955-01c1-4a09-b622-1c29d68a1a95" containerID="23ed671a7117b2d29494692136ae2680fdc5576606348a60de74e77a30850eb9" exitCode=0 Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.169165 4909 generic.go:334] "Generic (PLEG): container finished" podID="79ff4955-01c1-4a09-b622-1c29d68a1a95" containerID="e35f0bd98f6b9eda19127bb2da002631c125c237ccc7d848b01ca226b4542293" exitCode=0 Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.169174 4909 generic.go:334] "Generic (PLEG): container finished" podID="79ff4955-01c1-4a09-b622-1c29d68a1a95" containerID="ad13553b622818cf7fd80d3d5c84fbfc1788a7da980f7fbdcb7ca1e5720fd0c8" exitCode=0 Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.168931 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"79ff4955-01c1-4a09-b622-1c29d68a1a95","Type":"ContainerDied","Data":"23ed671a7117b2d29494692136ae2680fdc5576606348a60de74e77a30850eb9"} Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.169215 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"79ff4955-01c1-4a09-b622-1c29d68a1a95","Type":"ContainerDied","Data":"e35f0bd98f6b9eda19127bb2da002631c125c237ccc7d848b01ca226b4542293"} Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.169232 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"79ff4955-01c1-4a09-b622-1c29d68a1a95","Type":"ContainerDied","Data":"ad13553b622818cf7fd80d3d5c84fbfc1788a7da980f7fbdcb7ca1e5720fd0c8"} Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.182361 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/openstackclient" podStartSLOduration=2.182341999 podStartE2EDuration="2.182341999s" podCreationTimestamp="2026-02-02 12:16:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:16:06.17815588 +0000 UTC m=+6291.924256615" watchObservedRunningTime="2026-02-02 12:16:06.182341999 +0000 UTC m=+6291.928442734" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.453335 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.459052 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.463964 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.466173 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.473376 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.476572 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.491406 4909 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="ff2c3534-4d6f-4325-ab60-8390fdaf0e91" podUID="f2e9096b-893f-4741-92f0-70a241bcd035" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.564793 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff2c3534-4d6f-4325-ab60-8390fdaf0e91-combined-ca-bundle\") pod \"ff2c3534-4d6f-4325-ab60-8390fdaf0e91\" (UID: \"ff2c3534-4d6f-4325-ab60-8390fdaf0e91\") " Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.564973 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ff2c3534-4d6f-4325-ab60-8390fdaf0e91-openstack-config-secret\") pod \"ff2c3534-4d6f-4325-ab60-8390fdaf0e91\" (UID: \"ff2c3534-4d6f-4325-ab60-8390fdaf0e91\") " Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.565100 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ff2c3534-4d6f-4325-ab60-8390fdaf0e91-openstack-config\") pod \"ff2c3534-4d6f-4325-ab60-8390fdaf0e91\" (UID: \"ff2c3534-4d6f-4325-ab60-8390fdaf0e91\") " Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.565167 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn8sx\" (UniqueName: \"kubernetes.io/projected/ff2c3534-4d6f-4325-ab60-8390fdaf0e91-kube-api-access-vn8sx\") pod \"ff2c3534-4d6f-4325-ab60-8390fdaf0e91\" (UID: \"ff2c3534-4d6f-4325-ab60-8390fdaf0e91\") " Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 
12:16:06.565505 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\") " pod="openstack/ceilometer-0" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.565537 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-config-data\") pod \"ceilometer-0\" (UID: \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\") " pod="openstack/ceilometer-0" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.565553 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl7wd\" (UniqueName: \"kubernetes.io/projected/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-kube-api-access-bl7wd\") pod \"ceilometer-0\" (UID: \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\") " pod="openstack/ceilometer-0" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.565603 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-run-httpd\") pod \"ceilometer-0\" (UID: \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\") " pod="openstack/ceilometer-0" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.565654 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-scripts\") pod \"ceilometer-0\" (UID: \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\") " pod="openstack/ceilometer-0" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.565674 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-log-httpd\") pod \"ceilometer-0\" (UID: \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\") " pod="openstack/ceilometer-0" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.565698 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\") " pod="openstack/ceilometer-0" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.571361 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff2c3534-4d6f-4325-ab60-8390fdaf0e91-kube-api-access-vn8sx" (OuterVolumeSpecName: "kube-api-access-vn8sx") pod "ff2c3534-4d6f-4325-ab60-8390fdaf0e91" (UID: "ff2c3534-4d6f-4325-ab60-8390fdaf0e91"). InnerVolumeSpecName "kube-api-access-vn8sx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.607560 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff2c3534-4d6f-4325-ab60-8390fdaf0e91-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "ff2c3534-4d6f-4325-ab60-8390fdaf0e91" (UID: "ff2c3534-4d6f-4325-ab60-8390fdaf0e91"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.620445 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff2c3534-4d6f-4325-ab60-8390fdaf0e91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff2c3534-4d6f-4325-ab60-8390fdaf0e91" (UID: "ff2c3534-4d6f-4325-ab60-8390fdaf0e91"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.650821 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff2c3534-4d6f-4325-ab60-8390fdaf0e91-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "ff2c3534-4d6f-4325-ab60-8390fdaf0e91" (UID: "ff2c3534-4d6f-4325-ab60-8390fdaf0e91"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.668144 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\") " pod="openstack/ceilometer-0" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.668203 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-config-data\") pod \"ceilometer-0\" (UID: \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\") " pod="openstack/ceilometer-0" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.668223 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl7wd\" (UniqueName: \"kubernetes.io/projected/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-kube-api-access-bl7wd\") pod \"ceilometer-0\" (UID: \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\") " pod="openstack/ceilometer-0" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.668279 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-run-httpd\") pod \"ceilometer-0\" (UID: \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\") " pod="openstack/ceilometer-0" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 
12:16:06.668331 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-scripts\") pod \"ceilometer-0\" (UID: \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\") " pod="openstack/ceilometer-0" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.668351 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-log-httpd\") pod \"ceilometer-0\" (UID: \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\") " pod="openstack/ceilometer-0" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.668374 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\") " pod="openstack/ceilometer-0" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.668504 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ff2c3534-4d6f-4325-ab60-8390fdaf0e91-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.668515 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ff2c3534-4d6f-4325-ab60-8390fdaf0e91-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.668525 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vn8sx\" (UniqueName: \"kubernetes.io/projected/ff2c3534-4d6f-4325-ab60-8390fdaf0e91-kube-api-access-vn8sx\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.668533 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ff2c3534-4d6f-4325-ab60-8390fdaf0e91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.670116 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-log-httpd\") pod \"ceilometer-0\" (UID: \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\") " pod="openstack/ceilometer-0" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.670729 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-run-httpd\") pod \"ceilometer-0\" (UID: \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\") " pod="openstack/ceilometer-0" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.672186 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\") " pod="openstack/ceilometer-0" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.672701 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-scripts\") pod \"ceilometer-0\" (UID: \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\") " pod="openstack/ceilometer-0" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.673311 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\") " pod="openstack/ceilometer-0" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.681683 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-config-data\") pod \"ceilometer-0\" (UID: \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\") " pod="openstack/ceilometer-0" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.688381 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl7wd\" (UniqueName: \"kubernetes.io/projected/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-kube-api-access-bl7wd\") pod \"ceilometer-0\" (UID: \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\") " pod="openstack/ceilometer-0" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.719424 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.771557 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/79ff4955-01c1-4a09-b622-1c29d68a1a95-config\") pod \"79ff4955-01c1-4a09-b622-1c29d68a1a95\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.771710 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/79ff4955-01c1-4a09-b622-1c29d68a1a95-prometheus-metric-storage-rulefiles-0\") pod \"79ff4955-01c1-4a09-b622-1c29d68a1a95\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.771736 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/79ff4955-01c1-4a09-b622-1c29d68a1a95-prometheus-metric-storage-rulefiles-1\") pod \"79ff4955-01c1-4a09-b622-1c29d68a1a95\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.771866 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/79ff4955-01c1-4a09-b622-1c29d68a1a95-tls-assets\") pod \"79ff4955-01c1-4a09-b622-1c29d68a1a95\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.771898 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/79ff4955-01c1-4a09-b622-1c29d68a1a95-thanos-prometheus-http-client-file\") pod \"79ff4955-01c1-4a09-b622-1c29d68a1a95\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.771957 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/79ff4955-01c1-4a09-b622-1c29d68a1a95-prometheus-metric-storage-rulefiles-2\") pod \"79ff4955-01c1-4a09-b622-1c29d68a1a95\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.771990 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/79ff4955-01c1-4a09-b622-1c29d68a1a95-config-out\") pod \"79ff4955-01c1-4a09-b622-1c29d68a1a95\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.772093 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0480defb-d066-47ed-a96f-13341ddc88d7\") pod \"79ff4955-01c1-4a09-b622-1c29d68a1a95\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.772134 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/79ff4955-01c1-4a09-b622-1c29d68a1a95-web-config\") pod 
\"79ff4955-01c1-4a09-b622-1c29d68a1a95\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.772228 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snk8n\" (UniqueName: \"kubernetes.io/projected/79ff4955-01c1-4a09-b622-1c29d68a1a95-kube-api-access-snk8n\") pod \"79ff4955-01c1-4a09-b622-1c29d68a1a95\" (UID: \"79ff4955-01c1-4a09-b622-1c29d68a1a95\") " Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.772315 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79ff4955-01c1-4a09-b622-1c29d68a1a95-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "79ff4955-01c1-4a09-b622-1c29d68a1a95" (UID: "79ff4955-01c1-4a09-b622-1c29d68a1a95"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.772742 4909 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/79ff4955-01c1-4a09-b622-1c29d68a1a95-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.773016 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79ff4955-01c1-4a09-b622-1c29d68a1a95-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "79ff4955-01c1-4a09-b622-1c29d68a1a95" (UID: "79ff4955-01c1-4a09-b622-1c29d68a1a95"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.777374 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79ff4955-01c1-4a09-b622-1c29d68a1a95-config" (OuterVolumeSpecName: "config") pod "79ff4955-01c1-4a09-b622-1c29d68a1a95" (UID: "79ff4955-01c1-4a09-b622-1c29d68a1a95"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.777952 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79ff4955-01c1-4a09-b622-1c29d68a1a95-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "79ff4955-01c1-4a09-b622-1c29d68a1a95" (UID: "79ff4955-01c1-4a09-b622-1c29d68a1a95"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.783598 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79ff4955-01c1-4a09-b622-1c29d68a1a95-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "79ff4955-01c1-4a09-b622-1c29d68a1a95" (UID: "79ff4955-01c1-4a09-b622-1c29d68a1a95"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.785375 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79ff4955-01c1-4a09-b622-1c29d68a1a95-config-out" (OuterVolumeSpecName: "config-out") pod "79ff4955-01c1-4a09-b622-1c29d68a1a95" (UID: "79ff4955-01c1-4a09-b622-1c29d68a1a95"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.795282 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79ff4955-01c1-4a09-b622-1c29d68a1a95-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "79ff4955-01c1-4a09-b622-1c29d68a1a95" (UID: "79ff4955-01c1-4a09-b622-1c29d68a1a95"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.798619 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79ff4955-01c1-4a09-b622-1c29d68a1a95-kube-api-access-snk8n" (OuterVolumeSpecName: "kube-api-access-snk8n") pod "79ff4955-01c1-4a09-b622-1c29d68a1a95" (UID: "79ff4955-01c1-4a09-b622-1c29d68a1a95"). InnerVolumeSpecName "kube-api-access-snk8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.806880 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.813272 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0480defb-d066-47ed-a96f-13341ddc88d7" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "79ff4955-01c1-4a09-b622-1c29d68a1a95" (UID: "79ff4955-01c1-4a09-b622-1c29d68a1a95"). InnerVolumeSpecName "pvc-0480defb-d066-47ed-a96f-13341ddc88d7". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.847793 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79ff4955-01c1-4a09-b622-1c29d68a1a95-web-config" (OuterVolumeSpecName: "web-config") pod "79ff4955-01c1-4a09-b622-1c29d68a1a95" (UID: "79ff4955-01c1-4a09-b622-1c29d68a1a95"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.875505 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snk8n\" (UniqueName: \"kubernetes.io/projected/79ff4955-01c1-4a09-b622-1c29d68a1a95-kube-api-access-snk8n\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.875558 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/79ff4955-01c1-4a09-b622-1c29d68a1a95-config\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.875573 4909 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/79ff4955-01c1-4a09-b622-1c29d68a1a95-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.875587 4909 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/79ff4955-01c1-4a09-b622-1c29d68a1a95-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.875600 4909 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/79ff4955-01c1-4a09-b622-1c29d68a1a95-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.875612 4909 reconciler_common.go:293] "Volume detached for 
volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/79ff4955-01c1-4a09-b622-1c29d68a1a95-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.875625 4909 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/79ff4955-01c1-4a09-b622-1c29d68a1a95-config-out\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.875670 4909 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-0480defb-d066-47ed-a96f-13341ddc88d7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0480defb-d066-47ed-a96f-13341ddc88d7\") on node \"crc\" " Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.875684 4909 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/79ff4955-01c1-4a09-b622-1c29d68a1a95-web-config\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.907097 4909 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.907242 4909 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-0480defb-d066-47ed-a96f-13341ddc88d7" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0480defb-d066-47ed-a96f-13341ddc88d7") on node "crc" Feb 02 12:16:06 crc kubenswrapper[4909]: I0202 12:16:06.977645 4909 reconciler_common.go:293] "Volume detached for volume \"pvc-0480defb-d066-47ed-a96f-13341ddc88d7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0480defb-d066-47ed-a96f-13341ddc88d7\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.036929 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff2c3534-4d6f-4325-ab60-8390fdaf0e91" path="/var/lib/kubelet/pods/ff2c3534-4d6f-4325-ab60-8390fdaf0e91/volumes" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.180395 4909 generic.go:334] "Generic (PLEG): container finished" podID="ff2c3534-4d6f-4325-ab60-8390fdaf0e91" containerID="7bce57cb58810c557aa2afc9c680dbbd339438cbd85185f74a5799b27ef75bc4" exitCode=137 Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.180563 4909 scope.go:117] "RemoveContainer" containerID="7bce57cb58810c557aa2afc9c680dbbd339438cbd85185f74a5799b27ef75bc4" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.180641 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.186625 4909 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="ff2c3534-4d6f-4325-ab60-8390fdaf0e91" podUID="f2e9096b-893f-4741-92f0-70a241bcd035" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.187260 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.187426 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"79ff4955-01c1-4a09-b622-1c29d68a1a95","Type":"ContainerDied","Data":"773524db041d9bb1249d32f9c2521693b026f1b81da58fc70b5d77138f698cd9"} Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.209484 4909 scope.go:117] "RemoveContainer" containerID="7bce57cb58810c557aa2afc9c680dbbd339438cbd85185f74a5799b27ef75bc4" Feb 02 12:16:07 crc kubenswrapper[4909]: E0202 12:16:07.210604 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bce57cb58810c557aa2afc9c680dbbd339438cbd85185f74a5799b27ef75bc4\": container with ID starting with 7bce57cb58810c557aa2afc9c680dbbd339438cbd85185f74a5799b27ef75bc4 not found: ID does not exist" containerID="7bce57cb58810c557aa2afc9c680dbbd339438cbd85185f74a5799b27ef75bc4" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.210722 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bce57cb58810c557aa2afc9c680dbbd339438cbd85185f74a5799b27ef75bc4"} err="failed to get container status \"7bce57cb58810c557aa2afc9c680dbbd339438cbd85185f74a5799b27ef75bc4\": rpc error: code = NotFound desc = could not find container \"7bce57cb58810c557aa2afc9c680dbbd339438cbd85185f74a5799b27ef75bc4\": container with ID starting with 7bce57cb58810c557aa2afc9c680dbbd339438cbd85185f74a5799b27ef75bc4 not found: ID does not exist" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.210841 4909 scope.go:117] "RemoveContainer" containerID="23ed671a7117b2d29494692136ae2680fdc5576606348a60de74e77a30850eb9" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.218841 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 12:16:07 crc kubenswrapper[4909]: 
I0202 12:16:07.229219 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.246102 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 12:16:07 crc kubenswrapper[4909]: E0202 12:16:07.246832 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79ff4955-01c1-4a09-b622-1c29d68a1a95" containerName="init-config-reloader" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.246948 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="79ff4955-01c1-4a09-b622-1c29d68a1a95" containerName="init-config-reloader" Feb 02 12:16:07 crc kubenswrapper[4909]: E0202 12:16:07.247027 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79ff4955-01c1-4a09-b622-1c29d68a1a95" containerName="config-reloader" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.247090 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="79ff4955-01c1-4a09-b622-1c29d68a1a95" containerName="config-reloader" Feb 02 12:16:07 crc kubenswrapper[4909]: E0202 12:16:07.247180 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79ff4955-01c1-4a09-b622-1c29d68a1a95" containerName="prometheus" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.247249 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="79ff4955-01c1-4a09-b622-1c29d68a1a95" containerName="prometheus" Feb 02 12:16:07 crc kubenswrapper[4909]: E0202 12:16:07.247412 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79ff4955-01c1-4a09-b622-1c29d68a1a95" containerName="thanos-sidecar" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.247488 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="79ff4955-01c1-4a09-b622-1c29d68a1a95" containerName="thanos-sidecar" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.247797 4909 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="79ff4955-01c1-4a09-b622-1c29d68a1a95" containerName="config-reloader" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.247911 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="79ff4955-01c1-4a09-b622-1c29d68a1a95" containerName="prometheus" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.248027 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="79ff4955-01c1-4a09-b622-1c29d68a1a95" containerName="thanos-sidecar" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.248619 4909 scope.go:117] "RemoveContainer" containerID="e35f0bd98f6b9eda19127bb2da002631c125c237ccc7d848b01ca226b4542293" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.252145 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.260676 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.260988 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-8bc7b" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.260737 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.260770 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.260774 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.261588 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 
12:16:07.261924 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.262039 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.265899 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.286228 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d11c3571-1885-455a-bd2f-e6acfdae15ba-config\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.286634 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c3571-1885-455a-bd2f-e6acfdae15ba-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.288787 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d11c3571-1885-455a-bd2f-e6acfdae15ba-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.289030 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" 
(UniqueName: \"kubernetes.io/empty-dir/d11c3571-1885-455a-bd2f-e6acfdae15ba-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.289150 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0480defb-d066-47ed-a96f-13341ddc88d7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0480defb-d066-47ed-a96f-13341ddc88d7\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.289230 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6r7c\" (UniqueName: \"kubernetes.io/projected/d11c3571-1885-455a-bd2f-e6acfdae15ba-kube-api-access-m6r7c\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.289359 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d11c3571-1885-455a-bd2f-e6acfdae15ba-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.289428 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d11c3571-1885-455a-bd2f-e6acfdae15ba-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 
12:16:07.289496 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d11c3571-1885-455a-bd2f-e6acfdae15ba-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.289706 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d11c3571-1885-455a-bd2f-e6acfdae15ba-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.289829 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d11c3571-1885-455a-bd2f-e6acfdae15ba-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.289902 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d11c3571-1885-455a-bd2f-e6acfdae15ba-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.289990 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: 
\"kubernetes.io/configmap/d11c3571-1885-455a-bd2f-e6acfdae15ba-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.292727 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.295962 4909 scope.go:117] "RemoveContainer" containerID="ad13553b622818cf7fd80d3d5c84fbfc1788a7da980f7fbdcb7ca1e5720fd0c8" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.320957 4909 scope.go:117] "RemoveContainer" containerID="19939baf56adb5ef0a2ef7effca179c320cb0f8b84f618dc1ba1bf5aa1d3b086" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.340024 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 12:16:07 crc kubenswrapper[4909]: W0202 12:16:07.361484 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb4d0f0c_f3c2_4710_95a9_4b655b756ace.slice/crio-db8b2dba106a0e343977a1e43c37c3d33896f73babc04c8ce8d58b48e33b444c WatchSource:0}: Error finding container db8b2dba106a0e343977a1e43c37c3d33896f73babc04c8ce8d58b48e33b444c: Status 404 returned error can't find the container with id db8b2dba106a0e343977a1e43c37c3d33896f73babc04c8ce8d58b48e33b444c Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.393203 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d11c3571-1885-455a-bd2f-e6acfdae15ba-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.393280 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-0480defb-d066-47ed-a96f-13341ddc88d7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0480defb-d066-47ed-a96f-13341ddc88d7\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.393305 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6r7c\" (UniqueName: \"kubernetes.io/projected/d11c3571-1885-455a-bd2f-e6acfdae15ba-kube-api-access-m6r7c\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.393344 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d11c3571-1885-455a-bd2f-e6acfdae15ba-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.393364 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d11c3571-1885-455a-bd2f-e6acfdae15ba-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.393381 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d11c3571-1885-455a-bd2f-e6acfdae15ba-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.393455 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d11c3571-1885-455a-bd2f-e6acfdae15ba-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.393483 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d11c3571-1885-455a-bd2f-e6acfdae15ba-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.393508 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d11c3571-1885-455a-bd2f-e6acfdae15ba-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.393544 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d11c3571-1885-455a-bd2f-e6acfdae15ba-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.393588 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d11c3571-1885-455a-bd2f-e6acfdae15ba-config\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " 
pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.393620 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c3571-1885-455a-bd2f-e6acfdae15ba-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.393658 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d11c3571-1885-455a-bd2f-e6acfdae15ba-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.409236 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d11c3571-1885-455a-bd2f-e6acfdae15ba-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.409703 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d11c3571-1885-455a-bd2f-e6acfdae15ba-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.412671 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/d11c3571-1885-455a-bd2f-e6acfdae15ba-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.413511 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d11c3571-1885-455a-bd2f-e6acfdae15ba-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.415281 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/d11c3571-1885-455a-bd2f-e6acfdae15ba-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.415832 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d11c3571-1885-455a-bd2f-e6acfdae15ba-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.416992 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d11c3571-1885-455a-bd2f-e6acfdae15ba-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 
12:16:07.418307 4909 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.418356 4909 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0480defb-d066-47ed-a96f-13341ddc88d7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0480defb-d066-47ed-a96f-13341ddc88d7\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/01d20f6e279cfc29a4c60dc2c6f23d34cac9e38a012d3a024f6e025f113c80d1/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.423486 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d11c3571-1885-455a-bd2f-e6acfdae15ba-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.431392 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d11c3571-1885-455a-bd2f-e6acfdae15ba-config\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.445285 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/d11c3571-1885-455a-bd2f-e6acfdae15ba-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.460490 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d11c3571-1885-455a-bd2f-e6acfdae15ba-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.473616 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6r7c\" (UniqueName: \"kubernetes.io/projected/d11c3571-1885-455a-bd2f-e6acfdae15ba-kube-api-access-m6r7c\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.510555 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0480defb-d066-47ed-a96f-13341ddc88d7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0480defb-d066-47ed-a96f-13341ddc88d7\") pod \"prometheus-metric-storage-0\" (UID: \"d11c3571-1885-455a-bd2f-e6acfdae15ba\") " pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:07 crc kubenswrapper[4909]: I0202 12:16:07.600709 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:08 crc kubenswrapper[4909]: W0202 12:16:08.000640 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd11c3571_1885_455a_bd2f_e6acfdae15ba.slice/crio-1f11b4940a48275622b1dedd51542f9ad4cc0b2cb299634e4651e031ca80f575 WatchSource:0}: Error finding container 1f11b4940a48275622b1dedd51542f9ad4cc0b2cb299634e4651e031ca80f575: Status 404 returned error can't find the container with id 1f11b4940a48275622b1dedd51542f9ad4cc0b2cb299634e4651e031ca80f575 Feb 02 12:16:08 crc kubenswrapper[4909]: I0202 12:16:08.002798 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 02 12:16:08 crc kubenswrapper[4909]: I0202 12:16:08.196624 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb4d0f0c-f3c2-4710-95a9-4b655b756ace","Type":"ContainerStarted","Data":"db8b2dba106a0e343977a1e43c37c3d33896f73babc04c8ce8d58b48e33b444c"} Feb 02 12:16:08 crc kubenswrapper[4909]: I0202 12:16:08.216307 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d11c3571-1885-455a-bd2f-e6acfdae15ba","Type":"ContainerStarted","Data":"1f11b4940a48275622b1dedd51542f9ad4cc0b2cb299634e4651e031ca80f575"} Feb 02 12:16:09 crc kubenswrapper[4909]: I0202 12:16:09.029796 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79ff4955-01c1-4a09-b622-1c29d68a1a95" path="/var/lib/kubelet/pods/79ff4955-01c1-4a09-b622-1c29d68a1a95/volumes" Feb 02 12:16:09 crc kubenswrapper[4909]: I0202 12:16:09.231768 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb4d0f0c-f3c2-4710-95a9-4b655b756ace","Type":"ContainerStarted","Data":"ce8b943ce00d5011fe71ea6711b90db7fd83c405803c03625984ec233f3d6134"} Feb 02 12:16:09 crc kubenswrapper[4909]: I0202 12:16:09.231855 
4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb4d0f0c-f3c2-4710-95a9-4b655b756ace","Type":"ContainerStarted","Data":"cce01c314859a26d70199888887e5864b78cfddcaacbfd4f69419880a4b40240"} Feb 02 12:16:11 crc kubenswrapper[4909]: I0202 12:16:11.272563 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb4d0f0c-f3c2-4710-95a9-4b655b756ace","Type":"ContainerStarted","Data":"5d59068639bc313f8b5c586cc1f10a6ae7dcecbfba69b314530ab86d48e68642"} Feb 02 12:16:12 crc kubenswrapper[4909]: I0202 12:16:12.283672 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d11c3571-1885-455a-bd2f-e6acfdae15ba","Type":"ContainerStarted","Data":"ae7b7ba56347f3c10df82f7539899815da16b045e7c92753ff7ef51754a2d902"} Feb 02 12:16:13 crc kubenswrapper[4909]: I0202 12:16:13.298077 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb4d0f0c-f3c2-4710-95a9-4b655b756ace","Type":"ContainerStarted","Data":"d4e9ae1321b650b7f4d214174ad66c5fca528a4fdbc920e1d444dbff4a716379"} Feb 02 12:16:13 crc kubenswrapper[4909]: I0202 12:16:13.324485 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.958892767 podStartE2EDuration="7.32446193s" podCreationTimestamp="2026-02-02 12:16:06 +0000 UTC" firstStartedPulling="2026-02-02 12:16:07.366128071 +0000 UTC m=+6293.112228806" lastFinishedPulling="2026-02-02 12:16:12.731697244 +0000 UTC m=+6298.477797969" observedRunningTime="2026-02-02 12:16:13.321412634 +0000 UTC m=+6299.067513379" watchObservedRunningTime="2026-02-02 12:16:13.32446193 +0000 UTC m=+6299.070562665" Feb 02 12:16:14 crc kubenswrapper[4909]: I0202 12:16:14.307287 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 02 12:16:16 crc kubenswrapper[4909]: I0202 12:16:16.103704 4909 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-f2xgf"] Feb 02 12:16:16 crc kubenswrapper[4909]: I0202 12:16:16.106982 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-f2xgf" Feb 02 12:16:16 crc kubenswrapper[4909]: I0202 12:16:16.118665 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-f2xgf"] Feb 02 12:16:16 crc kubenswrapper[4909]: I0202 12:16:16.208692 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-2ac2-account-create-update-944cv"] Feb 02 12:16:16 crc kubenswrapper[4909]: I0202 12:16:16.210247 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-2ac2-account-create-update-944cv" Feb 02 12:16:16 crc kubenswrapper[4909]: I0202 12:16:16.212636 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Feb 02 12:16:16 crc kubenswrapper[4909]: I0202 12:16:16.229877 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-2ac2-account-create-update-944cv"] Feb 02 12:16:16 crc kubenswrapper[4909]: I0202 12:16:16.239781 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl6q2\" (UniqueName: \"kubernetes.io/projected/a8eb4f3a-633d-49d9-bb32-6be5b029f5cf-kube-api-access-jl6q2\") pod \"aodh-db-create-f2xgf\" (UID: \"a8eb4f3a-633d-49d9-bb32-6be5b029f5cf\") " pod="openstack/aodh-db-create-f2xgf" Feb 02 12:16:16 crc kubenswrapper[4909]: I0202 12:16:16.240023 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8eb4f3a-633d-49d9-bb32-6be5b029f5cf-operator-scripts\") pod \"aodh-db-create-f2xgf\" (UID: \"a8eb4f3a-633d-49d9-bb32-6be5b029f5cf\") " pod="openstack/aodh-db-create-f2xgf" Feb 02 12:16:16 crc kubenswrapper[4909]: I0202 12:16:16.341509 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf6xm\" (UniqueName: \"kubernetes.io/projected/55c55dfe-a573-4cfd-8d73-cce2d262e7ff-kube-api-access-zf6xm\") pod \"aodh-2ac2-account-create-update-944cv\" (UID: \"55c55dfe-a573-4cfd-8d73-cce2d262e7ff\") " pod="openstack/aodh-2ac2-account-create-update-944cv" Feb 02 12:16:16 crc kubenswrapper[4909]: I0202 12:16:16.341666 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl6q2\" (UniqueName: \"kubernetes.io/projected/a8eb4f3a-633d-49d9-bb32-6be5b029f5cf-kube-api-access-jl6q2\") pod \"aodh-db-create-f2xgf\" (UID: \"a8eb4f3a-633d-49d9-bb32-6be5b029f5cf\") " pod="openstack/aodh-db-create-f2xgf" Feb 02 12:16:16 crc kubenswrapper[4909]: I0202 12:16:16.341757 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8eb4f3a-633d-49d9-bb32-6be5b029f5cf-operator-scripts\") pod \"aodh-db-create-f2xgf\" (UID: \"a8eb4f3a-633d-49d9-bb32-6be5b029f5cf\") " pod="openstack/aodh-db-create-f2xgf" Feb 02 12:16:16 crc kubenswrapper[4909]: I0202 12:16:16.341864 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55c55dfe-a573-4cfd-8d73-cce2d262e7ff-operator-scripts\") pod \"aodh-2ac2-account-create-update-944cv\" (UID: \"55c55dfe-a573-4cfd-8d73-cce2d262e7ff\") " pod="openstack/aodh-2ac2-account-create-update-944cv" Feb 02 12:16:16 crc kubenswrapper[4909]: I0202 12:16:16.343145 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8eb4f3a-633d-49d9-bb32-6be5b029f5cf-operator-scripts\") pod \"aodh-db-create-f2xgf\" (UID: \"a8eb4f3a-633d-49d9-bb32-6be5b029f5cf\") " pod="openstack/aodh-db-create-f2xgf" Feb 02 12:16:16 crc kubenswrapper[4909]: I0202 12:16:16.369449 
4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl6q2\" (UniqueName: \"kubernetes.io/projected/a8eb4f3a-633d-49d9-bb32-6be5b029f5cf-kube-api-access-jl6q2\") pod \"aodh-db-create-f2xgf\" (UID: \"a8eb4f3a-633d-49d9-bb32-6be5b029f5cf\") " pod="openstack/aodh-db-create-f2xgf" Feb 02 12:16:16 crc kubenswrapper[4909]: I0202 12:16:16.443590 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55c55dfe-a573-4cfd-8d73-cce2d262e7ff-operator-scripts\") pod \"aodh-2ac2-account-create-update-944cv\" (UID: \"55c55dfe-a573-4cfd-8d73-cce2d262e7ff\") " pod="openstack/aodh-2ac2-account-create-update-944cv" Feb 02 12:16:16 crc kubenswrapper[4909]: I0202 12:16:16.444023 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf6xm\" (UniqueName: \"kubernetes.io/projected/55c55dfe-a573-4cfd-8d73-cce2d262e7ff-kube-api-access-zf6xm\") pod \"aodh-2ac2-account-create-update-944cv\" (UID: \"55c55dfe-a573-4cfd-8d73-cce2d262e7ff\") " pod="openstack/aodh-2ac2-account-create-update-944cv" Feb 02 12:16:16 crc kubenswrapper[4909]: I0202 12:16:16.444657 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55c55dfe-a573-4cfd-8d73-cce2d262e7ff-operator-scripts\") pod \"aodh-2ac2-account-create-update-944cv\" (UID: \"55c55dfe-a573-4cfd-8d73-cce2d262e7ff\") " pod="openstack/aodh-2ac2-account-create-update-944cv" Feb 02 12:16:16 crc kubenswrapper[4909]: I0202 12:16:16.466095 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf6xm\" (UniqueName: \"kubernetes.io/projected/55c55dfe-a573-4cfd-8d73-cce2d262e7ff-kube-api-access-zf6xm\") pod \"aodh-2ac2-account-create-update-944cv\" (UID: \"55c55dfe-a573-4cfd-8d73-cce2d262e7ff\") " pod="openstack/aodh-2ac2-account-create-update-944cv" Feb 02 12:16:16 crc 
kubenswrapper[4909]: I0202 12:16:16.482421 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-f2xgf" Feb 02 12:16:16 crc kubenswrapper[4909]: I0202 12:16:16.526946 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-2ac2-account-create-update-944cv" Feb 02 12:16:17 crc kubenswrapper[4909]: I0202 12:16:17.218005 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-f2xgf"] Feb 02 12:16:17 crc kubenswrapper[4909]: W0202 12:16:17.221052 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8eb4f3a_633d_49d9_bb32_6be5b029f5cf.slice/crio-9c02123dc723cf16d85025e4b2a826cd8a039c31a3531c305c77d1b1d23c6282 WatchSource:0}: Error finding container 9c02123dc723cf16d85025e4b2a826cd8a039c31a3531c305c77d1b1d23c6282: Status 404 returned error can't find the container with id 9c02123dc723cf16d85025e4b2a826cd8a039c31a3531c305c77d1b1d23c6282 Feb 02 12:16:17 crc kubenswrapper[4909]: I0202 12:16:17.342535 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-2ac2-account-create-update-944cv"] Feb 02 12:16:17 crc kubenswrapper[4909]: W0202 12:16:17.350370 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55c55dfe_a573_4cfd_8d73_cce2d262e7ff.slice/crio-41a3b67aad21c7e24e320396776942765b505b01f9d43c5d26421d5b5d387c0a WatchSource:0}: Error finding container 41a3b67aad21c7e24e320396776942765b505b01f9d43c5d26421d5b5d387c0a: Status 404 returned error can't find the container with id 41a3b67aad21c7e24e320396776942765b505b01f9d43c5d26421d5b5d387c0a Feb 02 12:16:17 crc kubenswrapper[4909]: I0202 12:16:17.375121 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-f2xgf" 
event={"ID":"a8eb4f3a-633d-49d9-bb32-6be5b029f5cf","Type":"ContainerStarted","Data":"9c02123dc723cf16d85025e4b2a826cd8a039c31a3531c305c77d1b1d23c6282"} Feb 02 12:16:17 crc kubenswrapper[4909]: I0202 12:16:17.387674 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-2ac2-account-create-update-944cv" event={"ID":"55c55dfe-a573-4cfd-8d73-cce2d262e7ff","Type":"ContainerStarted","Data":"41a3b67aad21c7e24e320396776942765b505b01f9d43c5d26421d5b5d387c0a"} Feb 02 12:16:18 crc kubenswrapper[4909]: E0202 12:16:18.045310 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55c55dfe_a573_4cfd_8d73_cce2d262e7ff.slice/crio-conmon-4960d871dac71e24fd970d12f598741a6ccef16d2599222031692a97570b19bf.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd11c3571_1885_455a_bd2f_e6acfdae15ba.slice/crio-ae7b7ba56347f3c10df82f7539899815da16b045e7c92753ff7ef51754a2d902.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd11c3571_1885_455a_bd2f_e6acfdae15ba.slice/crio-conmon-ae7b7ba56347f3c10df82f7539899815da16b045e7c92753ff7ef51754a2d902.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55c55dfe_a573_4cfd_8d73_cce2d262e7ff.slice/crio-4960d871dac71e24fd970d12f598741a6ccef16d2599222031692a97570b19bf.scope\": RecentStats: unable to find data in memory cache]" Feb 02 12:16:18 crc kubenswrapper[4909]: I0202 12:16:18.409713 4909 generic.go:334] "Generic (PLEG): container finished" podID="55c55dfe-a573-4cfd-8d73-cce2d262e7ff" containerID="4960d871dac71e24fd970d12f598741a6ccef16d2599222031692a97570b19bf" exitCode=0 Feb 02 12:16:18 crc kubenswrapper[4909]: I0202 12:16:18.409887 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/aodh-2ac2-account-create-update-944cv" event={"ID":"55c55dfe-a573-4cfd-8d73-cce2d262e7ff","Type":"ContainerDied","Data":"4960d871dac71e24fd970d12f598741a6ccef16d2599222031692a97570b19bf"} Feb 02 12:16:18 crc kubenswrapper[4909]: I0202 12:16:18.414562 4909 generic.go:334] "Generic (PLEG): container finished" podID="d11c3571-1885-455a-bd2f-e6acfdae15ba" containerID="ae7b7ba56347f3c10df82f7539899815da16b045e7c92753ff7ef51754a2d902" exitCode=0 Feb 02 12:16:18 crc kubenswrapper[4909]: I0202 12:16:18.414617 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d11c3571-1885-455a-bd2f-e6acfdae15ba","Type":"ContainerDied","Data":"ae7b7ba56347f3c10df82f7539899815da16b045e7c92753ff7ef51754a2d902"} Feb 02 12:16:18 crc kubenswrapper[4909]: I0202 12:16:18.419268 4909 generic.go:334] "Generic (PLEG): container finished" podID="a8eb4f3a-633d-49d9-bb32-6be5b029f5cf" containerID="9cc8f86b911fd76be3c2301f273a7400c5629a8d2d974d69ad961da19709b1ab" exitCode=0 Feb 02 12:16:18 crc kubenswrapper[4909]: I0202 12:16:18.419315 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-f2xgf" event={"ID":"a8eb4f3a-633d-49d9-bb32-6be5b029f5cf","Type":"ContainerDied","Data":"9cc8f86b911fd76be3c2301f273a7400c5629a8d2d974d69ad961da19709b1ab"} Feb 02 12:16:19 crc kubenswrapper[4909]: I0202 12:16:19.434368 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d11c3571-1885-455a-bd2f-e6acfdae15ba","Type":"ContainerStarted","Data":"cf5221870c679331d41d5aa7a49ca79b2f8a8d5bff5ba03aeac91ce2830f1d54"} Feb 02 12:16:19 crc kubenswrapper[4909]: I0202 12:16:19.510436 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= 
Feb 02 12:16:19 crc kubenswrapper[4909]: I0202 12:16:19.510786 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:16:19 crc kubenswrapper[4909]: I0202 12:16:19.962786 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-2ac2-account-create-update-944cv" Feb 02 12:16:19 crc kubenswrapper[4909]: I0202 12:16:19.970534 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-f2xgf" Feb 02 12:16:20 crc kubenswrapper[4909]: I0202 12:16:20.029507 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8eb4f3a-633d-49d9-bb32-6be5b029f5cf-operator-scripts\") pod \"a8eb4f3a-633d-49d9-bb32-6be5b029f5cf\" (UID: \"a8eb4f3a-633d-49d9-bb32-6be5b029f5cf\") " Feb 02 12:16:20 crc kubenswrapper[4909]: I0202 12:16:20.029748 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55c55dfe-a573-4cfd-8d73-cce2d262e7ff-operator-scripts\") pod \"55c55dfe-a573-4cfd-8d73-cce2d262e7ff\" (UID: \"55c55dfe-a573-4cfd-8d73-cce2d262e7ff\") " Feb 02 12:16:20 crc kubenswrapper[4909]: I0202 12:16:20.029949 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf6xm\" (UniqueName: \"kubernetes.io/projected/55c55dfe-a573-4cfd-8d73-cce2d262e7ff-kube-api-access-zf6xm\") pod \"55c55dfe-a573-4cfd-8d73-cce2d262e7ff\" (UID: \"55c55dfe-a573-4cfd-8d73-cce2d262e7ff\") " Feb 02 12:16:20 crc kubenswrapper[4909]: I0202 12:16:20.030022 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-jl6q2\" (UniqueName: \"kubernetes.io/projected/a8eb4f3a-633d-49d9-bb32-6be5b029f5cf-kube-api-access-jl6q2\") pod \"a8eb4f3a-633d-49d9-bb32-6be5b029f5cf\" (UID: \"a8eb4f3a-633d-49d9-bb32-6be5b029f5cf\") " Feb 02 12:16:20 crc kubenswrapper[4909]: I0202 12:16:20.030685 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8eb4f3a-633d-49d9-bb32-6be5b029f5cf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a8eb4f3a-633d-49d9-bb32-6be5b029f5cf" (UID: "a8eb4f3a-633d-49d9-bb32-6be5b029f5cf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:16:20 crc kubenswrapper[4909]: I0202 12:16:20.030754 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55c55dfe-a573-4cfd-8d73-cce2d262e7ff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "55c55dfe-a573-4cfd-8d73-cce2d262e7ff" (UID: "55c55dfe-a573-4cfd-8d73-cce2d262e7ff"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:16:20 crc kubenswrapper[4909]: I0202 12:16:20.030885 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a8eb4f3a-633d-49d9-bb32-6be5b029f5cf-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:20 crc kubenswrapper[4909]: I0202 12:16:20.030910 4909 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55c55dfe-a573-4cfd-8d73-cce2d262e7ff-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:20 crc kubenswrapper[4909]: I0202 12:16:20.039610 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55c55dfe-a573-4cfd-8d73-cce2d262e7ff-kube-api-access-zf6xm" (OuterVolumeSpecName: "kube-api-access-zf6xm") pod "55c55dfe-a573-4cfd-8d73-cce2d262e7ff" (UID: "55c55dfe-a573-4cfd-8d73-cce2d262e7ff"). InnerVolumeSpecName "kube-api-access-zf6xm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:16:20 crc kubenswrapper[4909]: I0202 12:16:20.039883 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8eb4f3a-633d-49d9-bb32-6be5b029f5cf-kube-api-access-jl6q2" (OuterVolumeSpecName: "kube-api-access-jl6q2") pod "a8eb4f3a-633d-49d9-bb32-6be5b029f5cf" (UID: "a8eb4f3a-633d-49d9-bb32-6be5b029f5cf"). InnerVolumeSpecName "kube-api-access-jl6q2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:16:20 crc kubenswrapper[4909]: I0202 12:16:20.139196 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf6xm\" (UniqueName: \"kubernetes.io/projected/55c55dfe-a573-4cfd-8d73-cce2d262e7ff-kube-api-access-zf6xm\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:20 crc kubenswrapper[4909]: I0202 12:16:20.139344 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jl6q2\" (UniqueName: \"kubernetes.io/projected/a8eb4f3a-633d-49d9-bb32-6be5b029f5cf-kube-api-access-jl6q2\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:20 crc kubenswrapper[4909]: I0202 12:16:20.446980 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-f2xgf" event={"ID":"a8eb4f3a-633d-49d9-bb32-6be5b029f5cf","Type":"ContainerDied","Data":"9c02123dc723cf16d85025e4b2a826cd8a039c31a3531c305c77d1b1d23c6282"} Feb 02 12:16:20 crc kubenswrapper[4909]: I0202 12:16:20.447297 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c02123dc723cf16d85025e4b2a826cd8a039c31a3531c305c77d1b1d23c6282" Feb 02 12:16:20 crc kubenswrapper[4909]: I0202 12:16:20.446997 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-f2xgf" Feb 02 12:16:20 crc kubenswrapper[4909]: I0202 12:16:20.450901 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-2ac2-account-create-update-944cv" event={"ID":"55c55dfe-a573-4cfd-8d73-cce2d262e7ff","Type":"ContainerDied","Data":"41a3b67aad21c7e24e320396776942765b505b01f9d43c5d26421d5b5d387c0a"} Feb 02 12:16:20 crc kubenswrapper[4909]: I0202 12:16:20.450944 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41a3b67aad21c7e24e320396776942765b505b01f9d43c5d26421d5b5d387c0a" Feb 02 12:16:20 crc kubenswrapper[4909]: I0202 12:16:20.450974 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-2ac2-account-create-update-944cv" Feb 02 12:16:21 crc kubenswrapper[4909]: I0202 12:16:21.767879 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-j6khp"] Feb 02 12:16:21 crc kubenswrapper[4909]: E0202 12:16:21.768932 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8eb4f3a-633d-49d9-bb32-6be5b029f5cf" containerName="mariadb-database-create" Feb 02 12:16:21 crc kubenswrapper[4909]: I0202 12:16:21.768977 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8eb4f3a-633d-49d9-bb32-6be5b029f5cf" containerName="mariadb-database-create" Feb 02 12:16:21 crc kubenswrapper[4909]: E0202 12:16:21.769000 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55c55dfe-a573-4cfd-8d73-cce2d262e7ff" containerName="mariadb-account-create-update" Feb 02 12:16:21 crc kubenswrapper[4909]: I0202 12:16:21.769007 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c55dfe-a573-4cfd-8d73-cce2d262e7ff" containerName="mariadb-account-create-update" Feb 02 12:16:21 crc kubenswrapper[4909]: I0202 12:16:21.769392 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="55c55dfe-a573-4cfd-8d73-cce2d262e7ff" containerName="mariadb-account-create-update" Feb 02 12:16:21 crc kubenswrapper[4909]: I0202 12:16:21.769434 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8eb4f3a-633d-49d9-bb32-6be5b029f5cf" containerName="mariadb-database-create" Feb 02 12:16:21 crc kubenswrapper[4909]: I0202 12:16:21.770670 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-j6khp" Feb 02 12:16:21 crc kubenswrapper[4909]: I0202 12:16:21.792361 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 02 12:16:21 crc kubenswrapper[4909]: I0202 12:16:21.792435 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-9kf59" Feb 02 12:16:21 crc kubenswrapper[4909]: I0202 12:16:21.792604 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Feb 02 12:16:21 crc kubenswrapper[4909]: I0202 12:16:21.792850 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Feb 02 12:16:21 crc kubenswrapper[4909]: I0202 12:16:21.800700 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-j6khp"] Feb 02 12:16:21 crc kubenswrapper[4909]: I0202 12:16:21.895127 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b642a8d-95a4-4129-8a23-81739ace58d8-config-data\") pod \"aodh-db-sync-j6khp\" (UID: \"9b642a8d-95a4-4129-8a23-81739ace58d8\") " pod="openstack/aodh-db-sync-j6khp" Feb 02 12:16:21 crc kubenswrapper[4909]: I0202 12:16:21.895337 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b642a8d-95a4-4129-8a23-81739ace58d8-combined-ca-bundle\") pod \"aodh-db-sync-j6khp\" (UID: \"9b642a8d-95a4-4129-8a23-81739ace58d8\") " pod="openstack/aodh-db-sync-j6khp" Feb 02 12:16:21 crc kubenswrapper[4909]: I0202 12:16:21.895366 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b642a8d-95a4-4129-8a23-81739ace58d8-scripts\") pod \"aodh-db-sync-j6khp\" (UID: \"9b642a8d-95a4-4129-8a23-81739ace58d8\") " 
pod="openstack/aodh-db-sync-j6khp" Feb 02 12:16:21 crc kubenswrapper[4909]: I0202 12:16:21.895398 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45sl4\" (UniqueName: \"kubernetes.io/projected/9b642a8d-95a4-4129-8a23-81739ace58d8-kube-api-access-45sl4\") pod \"aodh-db-sync-j6khp\" (UID: \"9b642a8d-95a4-4129-8a23-81739ace58d8\") " pod="openstack/aodh-db-sync-j6khp" Feb 02 12:16:21 crc kubenswrapper[4909]: I0202 12:16:21.997971 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b642a8d-95a4-4129-8a23-81739ace58d8-combined-ca-bundle\") pod \"aodh-db-sync-j6khp\" (UID: \"9b642a8d-95a4-4129-8a23-81739ace58d8\") " pod="openstack/aodh-db-sync-j6khp" Feb 02 12:16:21 crc kubenswrapper[4909]: I0202 12:16:21.998040 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b642a8d-95a4-4129-8a23-81739ace58d8-scripts\") pod \"aodh-db-sync-j6khp\" (UID: \"9b642a8d-95a4-4129-8a23-81739ace58d8\") " pod="openstack/aodh-db-sync-j6khp" Feb 02 12:16:21 crc kubenswrapper[4909]: I0202 12:16:21.998080 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45sl4\" (UniqueName: \"kubernetes.io/projected/9b642a8d-95a4-4129-8a23-81739ace58d8-kube-api-access-45sl4\") pod \"aodh-db-sync-j6khp\" (UID: \"9b642a8d-95a4-4129-8a23-81739ace58d8\") " pod="openstack/aodh-db-sync-j6khp" Feb 02 12:16:21 crc kubenswrapper[4909]: I0202 12:16:21.998197 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b642a8d-95a4-4129-8a23-81739ace58d8-config-data\") pod \"aodh-db-sync-j6khp\" (UID: \"9b642a8d-95a4-4129-8a23-81739ace58d8\") " pod="openstack/aodh-db-sync-j6khp" Feb 02 12:16:22 crc kubenswrapper[4909]: I0202 12:16:22.005213 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b642a8d-95a4-4129-8a23-81739ace58d8-scripts\") pod \"aodh-db-sync-j6khp\" (UID: \"9b642a8d-95a4-4129-8a23-81739ace58d8\") " pod="openstack/aodh-db-sync-j6khp" Feb 02 12:16:22 crc kubenswrapper[4909]: I0202 12:16:22.006237 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b642a8d-95a4-4129-8a23-81739ace58d8-config-data\") pod \"aodh-db-sync-j6khp\" (UID: \"9b642a8d-95a4-4129-8a23-81739ace58d8\") " pod="openstack/aodh-db-sync-j6khp" Feb 02 12:16:22 crc kubenswrapper[4909]: I0202 12:16:22.297422 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b642a8d-95a4-4129-8a23-81739ace58d8-combined-ca-bundle\") pod \"aodh-db-sync-j6khp\" (UID: \"9b642a8d-95a4-4129-8a23-81739ace58d8\") " pod="openstack/aodh-db-sync-j6khp" Feb 02 12:16:22 crc kubenswrapper[4909]: I0202 12:16:22.297432 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45sl4\" (UniqueName: \"kubernetes.io/projected/9b642a8d-95a4-4129-8a23-81739ace58d8-kube-api-access-45sl4\") pod \"aodh-db-sync-j6khp\" (UID: \"9b642a8d-95a4-4129-8a23-81739ace58d8\") " pod="openstack/aodh-db-sync-j6khp" Feb 02 12:16:22 crc kubenswrapper[4909]: I0202 12:16:22.414632 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-j6khp" Feb 02 12:16:22 crc kubenswrapper[4909]: W0202 12:16:22.975455 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b642a8d_95a4_4129_8a23_81739ace58d8.slice/crio-ea81107939828ef7462cd1926ec0a2b8fc9779e7b15656a62236e4f5a086d8eb WatchSource:0}: Error finding container ea81107939828ef7462cd1926ec0a2b8fc9779e7b15656a62236e4f5a086d8eb: Status 404 returned error can't find the container with id ea81107939828ef7462cd1926ec0a2b8fc9779e7b15656a62236e4f5a086d8eb Feb 02 12:16:22 crc kubenswrapper[4909]: I0202 12:16:22.978019 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 12:16:22 crc kubenswrapper[4909]: I0202 12:16:22.984740 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-j6khp"] Feb 02 12:16:23 crc kubenswrapper[4909]: I0202 12:16:23.530930 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d11c3571-1885-455a-bd2f-e6acfdae15ba","Type":"ContainerStarted","Data":"86b313ae7db99d5782cec3d4aa8eaaa9716bd37510b13e13b413af58f7c1e996"} Feb 02 12:16:23 crc kubenswrapper[4909]: I0202 12:16:23.531178 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d11c3571-1885-455a-bd2f-e6acfdae15ba","Type":"ContainerStarted","Data":"1a1ba2c29659098027f4ea717f99069a76326bb8878182a290b7e52e3700d943"} Feb 02 12:16:23 crc kubenswrapper[4909]: I0202 12:16:23.533528 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-j6khp" event={"ID":"9b642a8d-95a4-4129-8a23-81739ace58d8","Type":"ContainerStarted","Data":"ea81107939828ef7462cd1926ec0a2b8fc9779e7b15656a62236e4f5a086d8eb"} Feb 02 12:16:23 crc kubenswrapper[4909]: I0202 12:16:23.572761 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/prometheus-metric-storage-0" podStartSLOduration=16.572737227 podStartE2EDuration="16.572737227s" podCreationTimestamp="2026-02-02 12:16:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:16:23.562520488 +0000 UTC m=+6309.308621233" watchObservedRunningTime="2026-02-02 12:16:23.572737227 +0000 UTC m=+6309.318837962" Feb 02 12:16:27 crc kubenswrapper[4909]: I0202 12:16:27.318701 4909 scope.go:117] "RemoveContainer" containerID="7ee880a2cb6d634dec71fbea463a53e84345ffca12464196465fac4c754087c3" Feb 02 12:16:27 crc kubenswrapper[4909]: I0202 12:16:27.601172 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:28 crc kubenswrapper[4909]: I0202 12:16:28.502080 4909 scope.go:117] "RemoveContainer" containerID="1051899be5519f40d1171dc941739379b8ad062d29734ff26b8910234cac96f4" Feb 02 12:16:28 crc kubenswrapper[4909]: I0202 12:16:28.554334 4909 scope.go:117] "RemoveContainer" containerID="98ba386a8f3d4542e0c86b3588a8190f5dfb775ddca30d03a1168434e01530dc" Feb 02 12:16:28 crc kubenswrapper[4909]: I0202 12:16:28.716249 4909 scope.go:117] "RemoveContainer" containerID="458bc11f9c82911eec716c469623b3293d4b82ddfd5dac84759fc610c1f9f30c" Feb 02 12:16:28 crc kubenswrapper[4909]: I0202 12:16:28.812983 4909 scope.go:117] "RemoveContainer" containerID="a965a9b2e3adc2da89a159d42f257f60e3f094b62ee5418c4c0e324cbaa8294f" Feb 02 12:16:29 crc kubenswrapper[4909]: I0202 12:16:29.600557 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-j6khp" event={"ID":"9b642a8d-95a4-4129-8a23-81739ace58d8","Type":"ContainerStarted","Data":"73c688761d96589b2d874c1b8568bea569e4f43517e6fe0e6776372ade0924a1"} Feb 02 12:16:29 crc kubenswrapper[4909]: I0202 12:16:29.621895 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-j6khp" 
podStartSLOduration=3.040986971 podStartE2EDuration="8.621882122s" podCreationTimestamp="2026-02-02 12:16:21 +0000 UTC" firstStartedPulling="2026-02-02 12:16:22.977772369 +0000 UTC m=+6308.723873104" lastFinishedPulling="2026-02-02 12:16:28.55866752 +0000 UTC m=+6314.304768255" observedRunningTime="2026-02-02 12:16:29.617417185 +0000 UTC m=+6315.363517940" watchObservedRunningTime="2026-02-02 12:16:29.621882122 +0000 UTC m=+6315.367982857" Feb 02 12:16:31 crc kubenswrapper[4909]: I0202 12:16:31.624470 4909 generic.go:334] "Generic (PLEG): container finished" podID="9b642a8d-95a4-4129-8a23-81739ace58d8" containerID="73c688761d96589b2d874c1b8568bea569e4f43517e6fe0e6776372ade0924a1" exitCode=0 Feb 02 12:16:31 crc kubenswrapper[4909]: I0202 12:16:31.624530 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-j6khp" event={"ID":"9b642a8d-95a4-4129-8a23-81739ace58d8","Type":"ContainerDied","Data":"73c688761d96589b2d874c1b8568bea569e4f43517e6fe0e6776372ade0924a1"} Feb 02 12:16:33 crc kubenswrapper[4909]: I0202 12:16:33.093957 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-j6khp" Feb 02 12:16:33 crc kubenswrapper[4909]: I0202 12:16:33.256422 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b642a8d-95a4-4129-8a23-81739ace58d8-scripts\") pod \"9b642a8d-95a4-4129-8a23-81739ace58d8\" (UID: \"9b642a8d-95a4-4129-8a23-81739ace58d8\") " Feb 02 12:16:33 crc kubenswrapper[4909]: I0202 12:16:33.256549 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b642a8d-95a4-4129-8a23-81739ace58d8-config-data\") pod \"9b642a8d-95a4-4129-8a23-81739ace58d8\" (UID: \"9b642a8d-95a4-4129-8a23-81739ace58d8\") " Feb 02 12:16:33 crc kubenswrapper[4909]: I0202 12:16:33.256707 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45sl4\" (UniqueName: \"kubernetes.io/projected/9b642a8d-95a4-4129-8a23-81739ace58d8-kube-api-access-45sl4\") pod \"9b642a8d-95a4-4129-8a23-81739ace58d8\" (UID: \"9b642a8d-95a4-4129-8a23-81739ace58d8\") " Feb 02 12:16:33 crc kubenswrapper[4909]: I0202 12:16:33.256792 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b642a8d-95a4-4129-8a23-81739ace58d8-combined-ca-bundle\") pod \"9b642a8d-95a4-4129-8a23-81739ace58d8\" (UID: \"9b642a8d-95a4-4129-8a23-81739ace58d8\") " Feb 02 12:16:33 crc kubenswrapper[4909]: I0202 12:16:33.262005 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b642a8d-95a4-4129-8a23-81739ace58d8-scripts" (OuterVolumeSpecName: "scripts") pod "9b642a8d-95a4-4129-8a23-81739ace58d8" (UID: "9b642a8d-95a4-4129-8a23-81739ace58d8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:16:33 crc kubenswrapper[4909]: I0202 12:16:33.262535 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b642a8d-95a4-4129-8a23-81739ace58d8-kube-api-access-45sl4" (OuterVolumeSpecName: "kube-api-access-45sl4") pod "9b642a8d-95a4-4129-8a23-81739ace58d8" (UID: "9b642a8d-95a4-4129-8a23-81739ace58d8"). InnerVolumeSpecName "kube-api-access-45sl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:16:33 crc kubenswrapper[4909]: I0202 12:16:33.290126 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b642a8d-95a4-4129-8a23-81739ace58d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b642a8d-95a4-4129-8a23-81739ace58d8" (UID: "9b642a8d-95a4-4129-8a23-81739ace58d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:16:33 crc kubenswrapper[4909]: I0202 12:16:33.300942 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b642a8d-95a4-4129-8a23-81739ace58d8-config-data" (OuterVolumeSpecName: "config-data") pod "9b642a8d-95a4-4129-8a23-81739ace58d8" (UID: "9b642a8d-95a4-4129-8a23-81739ace58d8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:16:33 crc kubenswrapper[4909]: I0202 12:16:33.359254 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45sl4\" (UniqueName: \"kubernetes.io/projected/9b642a8d-95a4-4129-8a23-81739ace58d8-kube-api-access-45sl4\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:33 crc kubenswrapper[4909]: I0202 12:16:33.359515 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b642a8d-95a4-4129-8a23-81739ace58d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:33 crc kubenswrapper[4909]: I0202 12:16:33.359602 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b642a8d-95a4-4129-8a23-81739ace58d8-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:33 crc kubenswrapper[4909]: I0202 12:16:33.359685 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b642a8d-95a4-4129-8a23-81739ace58d8-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:33 crc kubenswrapper[4909]: I0202 12:16:33.650641 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-j6khp" event={"ID":"9b642a8d-95a4-4129-8a23-81739ace58d8","Type":"ContainerDied","Data":"ea81107939828ef7462cd1926ec0a2b8fc9779e7b15656a62236e4f5a086d8eb"} Feb 02 12:16:33 crc kubenswrapper[4909]: I0202 12:16:33.650681 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-j6khp" Feb 02 12:16:33 crc kubenswrapper[4909]: I0202 12:16:33.650690 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea81107939828ef7462cd1926ec0a2b8fc9779e7b15656a62236e4f5a086d8eb" Feb 02 12:16:36 crc kubenswrapper[4909]: I0202 12:16:36.858998 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 02 12:16:36 crc kubenswrapper[4909]: I0202 12:16:36.875477 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Feb 02 12:16:36 crc kubenswrapper[4909]: E0202 12:16:36.875935 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b642a8d-95a4-4129-8a23-81739ace58d8" containerName="aodh-db-sync" Feb 02 12:16:36 crc kubenswrapper[4909]: I0202 12:16:36.875952 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b642a8d-95a4-4129-8a23-81739ace58d8" containerName="aodh-db-sync" Feb 02 12:16:36 crc kubenswrapper[4909]: I0202 12:16:36.876183 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b642a8d-95a4-4129-8a23-81739ace58d8" containerName="aodh-db-sync" Feb 02 12:16:36 crc kubenswrapper[4909]: I0202 12:16:36.878106 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Feb 02 12:16:36 crc kubenswrapper[4909]: I0202 12:16:36.884533 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Feb 02 12:16:36 crc kubenswrapper[4909]: I0202 12:16:36.885056 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-9kf59" Feb 02 12:16:36 crc kubenswrapper[4909]: I0202 12:16:36.885062 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Feb 02 12:16:36 crc kubenswrapper[4909]: I0202 12:16:36.914203 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 02 12:16:36 crc kubenswrapper[4909]: I0202 12:16:36.943222 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/475ccef7-f010-4322-8223-5956a6e879f3-config-data\") pod \"aodh-0\" (UID: \"475ccef7-f010-4322-8223-5956a6e879f3\") " pod="openstack/aodh-0" Feb 02 12:16:36 crc kubenswrapper[4909]: I0202 12:16:36.943371 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q25m\" (UniqueName: \"kubernetes.io/projected/475ccef7-f010-4322-8223-5956a6e879f3-kube-api-access-7q25m\") pod \"aodh-0\" (UID: \"475ccef7-f010-4322-8223-5956a6e879f3\") " pod="openstack/aodh-0" Feb 02 12:16:36 crc kubenswrapper[4909]: I0202 12:16:36.943424 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/475ccef7-f010-4322-8223-5956a6e879f3-combined-ca-bundle\") pod \"aodh-0\" (UID: \"475ccef7-f010-4322-8223-5956a6e879f3\") " pod="openstack/aodh-0" Feb 02 12:16:36 crc kubenswrapper[4909]: I0202 12:16:36.943516 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/475ccef7-f010-4322-8223-5956a6e879f3-scripts\") pod \"aodh-0\" (UID: \"475ccef7-f010-4322-8223-5956a6e879f3\") " pod="openstack/aodh-0" Feb 02 12:16:37 crc kubenswrapper[4909]: I0202 12:16:37.045111 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/475ccef7-f010-4322-8223-5956a6e879f3-combined-ca-bundle\") pod \"aodh-0\" (UID: \"475ccef7-f010-4322-8223-5956a6e879f3\") " pod="openstack/aodh-0" Feb 02 12:16:37 crc kubenswrapper[4909]: I0202 12:16:37.045233 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/475ccef7-f010-4322-8223-5956a6e879f3-scripts\") pod \"aodh-0\" (UID: \"475ccef7-f010-4322-8223-5956a6e879f3\") " pod="openstack/aodh-0" Feb 02 12:16:37 crc kubenswrapper[4909]: I0202 12:16:37.045280 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/475ccef7-f010-4322-8223-5956a6e879f3-config-data\") pod \"aodh-0\" (UID: \"475ccef7-f010-4322-8223-5956a6e879f3\") " pod="openstack/aodh-0" Feb 02 12:16:37 crc kubenswrapper[4909]: I0202 12:16:37.045375 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q25m\" (UniqueName: \"kubernetes.io/projected/475ccef7-f010-4322-8223-5956a6e879f3-kube-api-access-7q25m\") pod \"aodh-0\" (UID: \"475ccef7-f010-4322-8223-5956a6e879f3\") " pod="openstack/aodh-0" Feb 02 12:16:37 crc kubenswrapper[4909]: I0202 12:16:37.052133 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/475ccef7-f010-4322-8223-5956a6e879f3-combined-ca-bundle\") pod \"aodh-0\" (UID: \"475ccef7-f010-4322-8223-5956a6e879f3\") " pod="openstack/aodh-0" Feb 02 12:16:37 crc kubenswrapper[4909]: I0202 12:16:37.055148 4909 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/475ccef7-f010-4322-8223-5956a6e879f3-config-data\") pod \"aodh-0\" (UID: \"475ccef7-f010-4322-8223-5956a6e879f3\") " pod="openstack/aodh-0" Feb 02 12:16:37 crc kubenswrapper[4909]: I0202 12:16:37.055386 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/475ccef7-f010-4322-8223-5956a6e879f3-scripts\") pod \"aodh-0\" (UID: \"475ccef7-f010-4322-8223-5956a6e879f3\") " pod="openstack/aodh-0" Feb 02 12:16:37 crc kubenswrapper[4909]: I0202 12:16:37.061221 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q25m\" (UniqueName: \"kubernetes.io/projected/475ccef7-f010-4322-8223-5956a6e879f3-kube-api-access-7q25m\") pod \"aodh-0\" (UID: \"475ccef7-f010-4322-8223-5956a6e879f3\") " pod="openstack/aodh-0" Feb 02 12:16:37 crc kubenswrapper[4909]: I0202 12:16:37.218165 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Feb 02 12:16:37 crc kubenswrapper[4909]: I0202 12:16:37.602190 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:37 crc kubenswrapper[4909]: I0202 12:16:37.613903 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:37 crc kubenswrapper[4909]: I0202 12:16:37.692317 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 02 12:16:37 crc kubenswrapper[4909]: I0202 12:16:37.816885 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 02 12:16:38 crc kubenswrapper[4909]: I0202 12:16:38.698894 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"475ccef7-f010-4322-8223-5956a6e879f3","Type":"ContainerStarted","Data":"51c3727c6e3a0f6576de9032af07bea9a409584ad0522e2345eac9dbcd3afee1"} Feb 02 12:16:39 crc kubenswrapper[4909]: I0202 12:16:39.440986 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 12:16:39 crc kubenswrapper[4909]: I0202 12:16:39.441548 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb4d0f0c-f3c2-4710-95a9-4b655b756ace" containerName="ceilometer-central-agent" containerID="cri-o://cce01c314859a26d70199888887e5864b78cfddcaacbfd4f69419880a4b40240" gracePeriod=30 Feb 02 12:16:39 crc kubenswrapper[4909]: I0202 12:16:39.442061 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb4d0f0c-f3c2-4710-95a9-4b655b756ace" containerName="proxy-httpd" containerID="cri-o://d4e9ae1321b650b7f4d214174ad66c5fca528a4fdbc920e1d444dbff4a716379" gracePeriod=30 Feb 02 12:16:39 crc kubenswrapper[4909]: I0202 12:16:39.442123 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb4d0f0c-f3c2-4710-95a9-4b655b756ace" containerName="sg-core" containerID="cri-o://5d59068639bc313f8b5c586cc1f10a6ae7dcecbfba69b314530ab86d48e68642" gracePeriod=30 Feb 02 12:16:39 crc kubenswrapper[4909]: I0202 12:16:39.442154 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bb4d0f0c-f3c2-4710-95a9-4b655b756ace" containerName="ceilometer-notification-agent" containerID="cri-o://ce8b943ce00d5011fe71ea6711b90db7fd83c405803c03625984ec233f3d6134" gracePeriod=30 Feb 02 12:16:39 crc kubenswrapper[4909]: I0202 12:16:39.720394 4909 generic.go:334] "Generic (PLEG): container finished" podID="bb4d0f0c-f3c2-4710-95a9-4b655b756ace" containerID="5d59068639bc313f8b5c586cc1f10a6ae7dcecbfba69b314530ab86d48e68642" exitCode=2 Feb 02 12:16:39 crc kubenswrapper[4909]: 
I0202 12:16:39.720791 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb4d0f0c-f3c2-4710-95a9-4b655b756ace","Type":"ContainerDied","Data":"5d59068639bc313f8b5c586cc1f10a6ae7dcecbfba69b314530ab86d48e68642"} Feb 02 12:16:39 crc kubenswrapper[4909]: I0202 12:16:39.724458 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"475ccef7-f010-4322-8223-5956a6e879f3","Type":"ContainerStarted","Data":"f510893acf9c99f6ba8a298938ae047fd5c30ea95efee1d2e6924af1cd280502"} Feb 02 12:16:40 crc kubenswrapper[4909]: I0202 12:16:40.737314 4909 generic.go:334] "Generic (PLEG): container finished" podID="bb4d0f0c-f3c2-4710-95a9-4b655b756ace" containerID="d4e9ae1321b650b7f4d214174ad66c5fca528a4fdbc920e1d444dbff4a716379" exitCode=0 Feb 02 12:16:40 crc kubenswrapper[4909]: I0202 12:16:40.738132 4909 generic.go:334] "Generic (PLEG): container finished" podID="bb4d0f0c-f3c2-4710-95a9-4b655b756ace" containerID="cce01c314859a26d70199888887e5864b78cfddcaacbfd4f69419880a4b40240" exitCode=0 Feb 02 12:16:40 crc kubenswrapper[4909]: I0202 12:16:40.737415 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb4d0f0c-f3c2-4710-95a9-4b655b756ace","Type":"ContainerDied","Data":"d4e9ae1321b650b7f4d214174ad66c5fca528a4fdbc920e1d444dbff4a716379"} Feb 02 12:16:40 crc kubenswrapper[4909]: I0202 12:16:40.738219 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb4d0f0c-f3c2-4710-95a9-4b655b756ace","Type":"ContainerDied","Data":"cce01c314859a26d70199888887e5864b78cfddcaacbfd4f69419880a4b40240"} Feb 02 12:16:40 crc kubenswrapper[4909]: I0202 12:16:40.740577 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"475ccef7-f010-4322-8223-5956a6e879f3","Type":"ContainerStarted","Data":"549a66a6fc307e5613198759d5250573531e750b1a453aecbffdd44060893730"} Feb 02 12:16:41 crc kubenswrapper[4909]: I0202 
12:16:41.761365 4909 generic.go:334] "Generic (PLEG): container finished" podID="bb4d0f0c-f3c2-4710-95a9-4b655b756ace" containerID="ce8b943ce00d5011fe71ea6711b90db7fd83c405803c03625984ec233f3d6134" exitCode=0 Feb 02 12:16:41 crc kubenswrapper[4909]: I0202 12:16:41.761956 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb4d0f0c-f3c2-4710-95a9-4b655b756ace","Type":"ContainerDied","Data":"ce8b943ce00d5011fe71ea6711b90db7fd83c405803c03625984ec233f3d6134"} Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.314163 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.384728 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl7wd\" (UniqueName: \"kubernetes.io/projected/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-kube-api-access-bl7wd\") pod \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\" (UID: \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\") " Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.384845 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-config-data\") pod \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\" (UID: \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\") " Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.384894 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-run-httpd\") pod \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\" (UID: \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\") " Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.384967 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-sg-core-conf-yaml\") 
pod \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\" (UID: \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\") " Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.385015 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-combined-ca-bundle\") pod \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\" (UID: \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\") " Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.385080 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-log-httpd\") pod \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\" (UID: \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\") " Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.385123 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-scripts\") pod \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\" (UID: \"bb4d0f0c-f3c2-4710-95a9-4b655b756ace\") " Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.387129 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bb4d0f0c-f3c2-4710-95a9-4b655b756ace" (UID: "bb4d0f0c-f3c2-4710-95a9-4b655b756ace"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.387358 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bb4d0f0c-f3c2-4710-95a9-4b655b756ace" (UID: "bb4d0f0c-f3c2-4710-95a9-4b655b756ace"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.394958 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-scripts" (OuterVolumeSpecName: "scripts") pod "bb4d0f0c-f3c2-4710-95a9-4b655b756ace" (UID: "bb4d0f0c-f3c2-4710-95a9-4b655b756ace"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.396988 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-kube-api-access-bl7wd" (OuterVolumeSpecName: "kube-api-access-bl7wd") pod "bb4d0f0c-f3c2-4710-95a9-4b655b756ace" (UID: "bb4d0f0c-f3c2-4710-95a9-4b655b756ace"). InnerVolumeSpecName "kube-api-access-bl7wd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.445006 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bb4d0f0c-f3c2-4710-95a9-4b655b756ace" (UID: "bb4d0f0c-f3c2-4710-95a9-4b655b756ace"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.488147 4909 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.488178 4909 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.488188 4909 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.488199 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.488207 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl7wd\" (UniqueName: \"kubernetes.io/projected/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-kube-api-access-bl7wd\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.512350 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb4d0f0c-f3c2-4710-95a9-4b655b756ace" (UID: "bb4d0f0c-f3c2-4710-95a9-4b655b756ace"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.542857 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-config-data" (OuterVolumeSpecName: "config-data") pod "bb4d0f0c-f3c2-4710-95a9-4b655b756ace" (UID: "bb4d0f0c-f3c2-4710-95a9-4b655b756ace"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.590676 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.590707 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4d0f0c-f3c2-4710-95a9-4b655b756ace-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.793365 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bb4d0f0c-f3c2-4710-95a9-4b655b756ace","Type":"ContainerDied","Data":"db8b2dba106a0e343977a1e43c37c3d33896f73babc04c8ce8d58b48e33b444c"} Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.793479 4909 scope.go:117] "RemoveContainer" containerID="d4e9ae1321b650b7f4d214174ad66c5fca528a4fdbc920e1d444dbff4a716379" Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.793770 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.811370 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"475ccef7-f010-4322-8223-5956a6e879f3","Type":"ContainerStarted","Data":"60b4221d1cbaae54edeeb382625e87dff88e9171d9ef35d73be20c40b6380c2e"} Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.853977 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.864110 4909 scope.go:117] "RemoveContainer" containerID="5d59068639bc313f8b5c586cc1f10a6ae7dcecbfba69b314530ab86d48e68642" Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.870517 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.925905 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 02 12:16:42 crc kubenswrapper[4909]: E0202 12:16:42.926425 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb4d0f0c-f3c2-4710-95a9-4b655b756ace" containerName="ceilometer-notification-agent" Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.926441 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb4d0f0c-f3c2-4710-95a9-4b655b756ace" containerName="ceilometer-notification-agent" Feb 02 12:16:42 crc kubenswrapper[4909]: E0202 12:16:42.926465 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb4d0f0c-f3c2-4710-95a9-4b655b756ace" containerName="sg-core" Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.926474 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb4d0f0c-f3c2-4710-95a9-4b655b756ace" containerName="sg-core" Feb 02 12:16:42 crc kubenswrapper[4909]: E0202 12:16:42.926488 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb4d0f0c-f3c2-4710-95a9-4b655b756ace" containerName="ceilometer-central-agent" Feb 02 
12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.926494 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb4d0f0c-f3c2-4710-95a9-4b655b756ace" containerName="ceilometer-central-agent" Feb 02 12:16:42 crc kubenswrapper[4909]: E0202 12:16:42.926527 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb4d0f0c-f3c2-4710-95a9-4b655b756ace" containerName="proxy-httpd" Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.926533 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb4d0f0c-f3c2-4710-95a9-4b655b756ace" containerName="proxy-httpd" Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.926733 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb4d0f0c-f3c2-4710-95a9-4b655b756ace" containerName="ceilometer-central-agent" Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.926745 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb4d0f0c-f3c2-4710-95a9-4b655b756ace" containerName="sg-core" Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.926761 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb4d0f0c-f3c2-4710-95a9-4b655b756ace" containerName="proxy-httpd" Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.926775 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb4d0f0c-f3c2-4710-95a9-4b655b756ace" containerName="ceilometer-notification-agent" Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.928783 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.941336 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.941851 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.949172 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 12:16:42 crc kubenswrapper[4909]: I0202 12:16:42.978891 4909 scope.go:117] "RemoveContainer" containerID="ce8b943ce00d5011fe71ea6711b90db7fd83c405803c03625984ec233f3d6134" Feb 02 12:16:43 crc kubenswrapper[4909]: I0202 12:16:43.000903 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-scripts\") pod \"ceilometer-0\" (UID: \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\") " pod="openstack/ceilometer-0" Feb 02 12:16:43 crc kubenswrapper[4909]: I0202 12:16:43.001010 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-run-httpd\") pod \"ceilometer-0\" (UID: \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\") " pod="openstack/ceilometer-0" Feb 02 12:16:43 crc kubenswrapper[4909]: I0202 12:16:43.001045 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\") " pod="openstack/ceilometer-0" Feb 02 12:16:43 crc kubenswrapper[4909]: I0202 12:16:43.001074 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-sdhhh\" (UniqueName: \"kubernetes.io/projected/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-kube-api-access-sdhhh\") pod \"ceilometer-0\" (UID: \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\") " pod="openstack/ceilometer-0" Feb 02 12:16:43 crc kubenswrapper[4909]: I0202 12:16:43.001106 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-config-data\") pod \"ceilometer-0\" (UID: \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\") " pod="openstack/ceilometer-0" Feb 02 12:16:43 crc kubenswrapper[4909]: I0202 12:16:43.001157 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\") " pod="openstack/ceilometer-0" Feb 02 12:16:43 crc kubenswrapper[4909]: I0202 12:16:43.001196 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-log-httpd\") pod \"ceilometer-0\" (UID: \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\") " pod="openstack/ceilometer-0" Feb 02 12:16:43 crc kubenswrapper[4909]: I0202 12:16:43.022989 4909 scope.go:117] "RemoveContainer" containerID="cce01c314859a26d70199888887e5864b78cfddcaacbfd4f69419880a4b40240" Feb 02 12:16:43 crc kubenswrapper[4909]: I0202 12:16:43.031306 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb4d0f0c-f3c2-4710-95a9-4b655b756ace" path="/var/lib/kubelet/pods/bb4d0f0c-f3c2-4710-95a9-4b655b756ace/volumes" Feb 02 12:16:43 crc kubenswrapper[4909]: I0202 12:16:43.103138 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-scripts\") pod \"ceilometer-0\" (UID: \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\") " pod="openstack/ceilometer-0" Feb 02 12:16:43 crc kubenswrapper[4909]: I0202 12:16:43.103510 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-run-httpd\") pod \"ceilometer-0\" (UID: \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\") " pod="openstack/ceilometer-0" Feb 02 12:16:43 crc kubenswrapper[4909]: I0202 12:16:43.103555 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\") " pod="openstack/ceilometer-0" Feb 02 12:16:43 crc kubenswrapper[4909]: I0202 12:16:43.104057 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-run-httpd\") pod \"ceilometer-0\" (UID: \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\") " pod="openstack/ceilometer-0" Feb 02 12:16:43 crc kubenswrapper[4909]: I0202 12:16:43.104264 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdhhh\" (UniqueName: \"kubernetes.io/projected/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-kube-api-access-sdhhh\") pod \"ceilometer-0\" (UID: \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\") " pod="openstack/ceilometer-0" Feb 02 12:16:43 crc kubenswrapper[4909]: I0202 12:16:43.104752 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-config-data\") pod \"ceilometer-0\" (UID: \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\") " pod="openstack/ceilometer-0" Feb 02 12:16:43 crc kubenswrapper[4909]: I0202 
12:16:43.104831 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\") " pod="openstack/ceilometer-0" Feb 02 12:16:43 crc kubenswrapper[4909]: I0202 12:16:43.104868 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-log-httpd\") pod \"ceilometer-0\" (UID: \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\") " pod="openstack/ceilometer-0" Feb 02 12:16:43 crc kubenswrapper[4909]: I0202 12:16:43.105436 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-log-httpd\") pod \"ceilometer-0\" (UID: \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\") " pod="openstack/ceilometer-0" Feb 02 12:16:43 crc kubenswrapper[4909]: I0202 12:16:43.111377 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-scripts\") pod \"ceilometer-0\" (UID: \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\") " pod="openstack/ceilometer-0" Feb 02 12:16:43 crc kubenswrapper[4909]: I0202 12:16:43.111671 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-config-data\") pod \"ceilometer-0\" (UID: \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\") " pod="openstack/ceilometer-0" Feb 02 12:16:43 crc kubenswrapper[4909]: I0202 12:16:43.114366 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\") " 
pod="openstack/ceilometer-0" Feb 02 12:16:43 crc kubenswrapper[4909]: I0202 12:16:43.116539 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\") " pod="openstack/ceilometer-0" Feb 02 12:16:43 crc kubenswrapper[4909]: I0202 12:16:43.126413 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdhhh\" (UniqueName: \"kubernetes.io/projected/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-kube-api-access-sdhhh\") pod \"ceilometer-0\" (UID: \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\") " pod="openstack/ceilometer-0" Feb 02 12:16:43 crc kubenswrapper[4909]: I0202 12:16:43.290852 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 12:16:43 crc kubenswrapper[4909]: I0202 12:16:43.546318 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Feb 02 12:16:44 crc kubenswrapper[4909]: I0202 12:16:44.182635 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 12:16:44 crc kubenswrapper[4909]: I0202 12:16:44.592942 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 12:16:44 crc kubenswrapper[4909]: I0202 12:16:44.593756 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="664bbecb-dcb0-42c9-8da4-83110fbcf138" containerName="kube-state-metrics" containerID="cri-o://c2cf9a24c99f77fc0a23813f79b9805519b53df0cb341e4afdbb4213d1cdb48a" gracePeriod=30 Feb 02 12:16:44 crc kubenswrapper[4909]: I0202 12:16:44.865283 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2","Type":"ContainerStarted","Data":"b7cc53d3e439cfb4e9ba2e501007a9680bceba275a83f40315fd5e1e04ef92dc"} Feb 02 12:16:44 crc kubenswrapper[4909]: I0202 12:16:44.884577 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"475ccef7-f010-4322-8223-5956a6e879f3","Type":"ContainerStarted","Data":"7461a34b33bd3499f3147e377bcd0b7d277edacf26a3da86ea63438385b40f45"} Feb 02 12:16:44 crc kubenswrapper[4909]: I0202 12:16:44.884751 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="475ccef7-f010-4322-8223-5956a6e879f3" containerName="aodh-api" containerID="cri-o://f510893acf9c99f6ba8a298938ae047fd5c30ea95efee1d2e6924af1cd280502" gracePeriod=30 Feb 02 12:16:44 crc kubenswrapper[4909]: I0202 12:16:44.885153 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="475ccef7-f010-4322-8223-5956a6e879f3" containerName="aodh-listener" containerID="cri-o://7461a34b33bd3499f3147e377bcd0b7d277edacf26a3da86ea63438385b40f45" gracePeriod=30 Feb 02 12:16:44 crc kubenswrapper[4909]: I0202 12:16:44.890394 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="475ccef7-f010-4322-8223-5956a6e879f3" containerName="aodh-notifier" containerID="cri-o://60b4221d1cbaae54edeeb382625e87dff88e9171d9ef35d73be20c40b6380c2e" gracePeriod=30 Feb 02 12:16:44 crc kubenswrapper[4909]: I0202 12:16:44.890534 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="475ccef7-f010-4322-8223-5956a6e879f3" containerName="aodh-evaluator" containerID="cri-o://549a66a6fc307e5613198759d5250573531e750b1a453aecbffdd44060893730" gracePeriod=30 Feb 02 12:16:44 crc kubenswrapper[4909]: I0202 12:16:44.907569 4909 generic.go:334] "Generic (PLEG): container finished" podID="664bbecb-dcb0-42c9-8da4-83110fbcf138" 
containerID="c2cf9a24c99f77fc0a23813f79b9805519b53df0cb341e4afdbb4213d1cdb48a" exitCode=2 Feb 02 12:16:44 crc kubenswrapper[4909]: I0202 12:16:44.907627 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"664bbecb-dcb0-42c9-8da4-83110fbcf138","Type":"ContainerDied","Data":"c2cf9a24c99f77fc0a23813f79b9805519b53df0cb341e4afdbb4213d1cdb48a"} Feb 02 12:16:44 crc kubenswrapper[4909]: I0202 12:16:44.931674 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.995217429 podStartE2EDuration="8.931647796s" podCreationTimestamp="2026-02-02 12:16:36 +0000 UTC" firstStartedPulling="2026-02-02 12:16:37.829043307 +0000 UTC m=+6323.575144042" lastFinishedPulling="2026-02-02 12:16:43.765473674 +0000 UTC m=+6329.511574409" observedRunningTime="2026-02-02 12:16:44.919157422 +0000 UTC m=+6330.665258157" watchObservedRunningTime="2026-02-02 12:16:44.931647796 +0000 UTC m=+6330.677748531" Feb 02 12:16:45 crc kubenswrapper[4909]: I0202 12:16:45.269107 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 12:16:45 crc kubenswrapper[4909]: I0202 12:16:45.389797 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wprtw\" (UniqueName: \"kubernetes.io/projected/664bbecb-dcb0-42c9-8da4-83110fbcf138-kube-api-access-wprtw\") pod \"664bbecb-dcb0-42c9-8da4-83110fbcf138\" (UID: \"664bbecb-dcb0-42c9-8da4-83110fbcf138\") " Feb 02 12:16:45 crc kubenswrapper[4909]: I0202 12:16:45.395927 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/664bbecb-dcb0-42c9-8da4-83110fbcf138-kube-api-access-wprtw" (OuterVolumeSpecName: "kube-api-access-wprtw") pod "664bbecb-dcb0-42c9-8da4-83110fbcf138" (UID: "664bbecb-dcb0-42c9-8da4-83110fbcf138"). InnerVolumeSpecName "kube-api-access-wprtw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:16:45 crc kubenswrapper[4909]: I0202 12:16:45.492722 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wprtw\" (UniqueName: \"kubernetes.io/projected/664bbecb-dcb0-42c9-8da4-83110fbcf138-kube-api-access-wprtw\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:45 crc kubenswrapper[4909]: I0202 12:16:45.920725 4909 generic.go:334] "Generic (PLEG): container finished" podID="475ccef7-f010-4322-8223-5956a6e879f3" containerID="60b4221d1cbaae54edeeb382625e87dff88e9171d9ef35d73be20c40b6380c2e" exitCode=0 Feb 02 12:16:45 crc kubenswrapper[4909]: I0202 12:16:45.921052 4909 generic.go:334] "Generic (PLEG): container finished" podID="475ccef7-f010-4322-8223-5956a6e879f3" containerID="549a66a6fc307e5613198759d5250573531e750b1a453aecbffdd44060893730" exitCode=0 Feb 02 12:16:45 crc kubenswrapper[4909]: I0202 12:16:45.921062 4909 generic.go:334] "Generic (PLEG): container finished" podID="475ccef7-f010-4322-8223-5956a6e879f3" containerID="f510893acf9c99f6ba8a298938ae047fd5c30ea95efee1d2e6924af1cd280502" exitCode=0 Feb 02 12:16:45 crc kubenswrapper[4909]: I0202 12:16:45.921109 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"475ccef7-f010-4322-8223-5956a6e879f3","Type":"ContainerDied","Data":"60b4221d1cbaae54edeeb382625e87dff88e9171d9ef35d73be20c40b6380c2e"} Feb 02 12:16:45 crc kubenswrapper[4909]: I0202 12:16:45.921140 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"475ccef7-f010-4322-8223-5956a6e879f3","Type":"ContainerDied","Data":"549a66a6fc307e5613198759d5250573531e750b1a453aecbffdd44060893730"} Feb 02 12:16:45 crc kubenswrapper[4909]: I0202 12:16:45.921151 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"475ccef7-f010-4322-8223-5956a6e879f3","Type":"ContainerDied","Data":"f510893acf9c99f6ba8a298938ae047fd5c30ea95efee1d2e6924af1cd280502"} Feb 02 12:16:45 crc 
kubenswrapper[4909]: I0202 12:16:45.924450 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"664bbecb-dcb0-42c9-8da4-83110fbcf138","Type":"ContainerDied","Data":"6173cc6bda83fd22d5f90fe1a12ebc3acd4a20e3d52faf032a1c52d84504b57a"} Feb 02 12:16:45 crc kubenswrapper[4909]: I0202 12:16:45.924500 4909 scope.go:117] "RemoveContainer" containerID="c2cf9a24c99f77fc0a23813f79b9805519b53df0cb341e4afdbb4213d1cdb48a" Feb 02 12:16:45 crc kubenswrapper[4909]: I0202 12:16:45.924620 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 12:16:45 crc kubenswrapper[4909]: I0202 12:16:45.938690 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2","Type":"ContainerStarted","Data":"84b9a321f6796e8ff9947c150644b9a9073a86cfe34dd8c2368a17ab78eb259d"} Feb 02 12:16:45 crc kubenswrapper[4909]: I0202 12:16:45.938742 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2","Type":"ContainerStarted","Data":"d30e42df699923788848a6d68e7e1c062c27f9c08cb19f7978409264e2e2b346"} Feb 02 12:16:45 crc kubenswrapper[4909]: I0202 12:16:45.978001 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 12:16:45 crc kubenswrapper[4909]: I0202 12:16:45.995534 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 12:16:46 crc kubenswrapper[4909]: I0202 12:16:46.016776 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 12:16:46 crc kubenswrapper[4909]: E0202 12:16:46.017312 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="664bbecb-dcb0-42c9-8da4-83110fbcf138" containerName="kube-state-metrics" Feb 02 12:16:46 crc kubenswrapper[4909]: I0202 12:16:46.017331 4909 
state_mem.go:107] "Deleted CPUSet assignment" podUID="664bbecb-dcb0-42c9-8da4-83110fbcf138" containerName="kube-state-metrics" Feb 02 12:16:46 crc kubenswrapper[4909]: I0202 12:16:46.017544 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="664bbecb-dcb0-42c9-8da4-83110fbcf138" containerName="kube-state-metrics" Feb 02 12:16:46 crc kubenswrapper[4909]: I0202 12:16:46.018427 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 12:16:46 crc kubenswrapper[4909]: I0202 12:16:46.021004 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 02 12:16:46 crc kubenswrapper[4909]: I0202 12:16:46.021004 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 02 12:16:46 crc kubenswrapper[4909]: I0202 12:16:46.051016 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 12:16:46 crc kubenswrapper[4909]: I0202 12:16:46.104073 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ae91886-34b7-4b28-99ac-6f5e7bced3c7-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5ae91886-34b7-4b28-99ac-6f5e7bced3c7\") " pod="openstack/kube-state-metrics-0" Feb 02 12:16:46 crc kubenswrapper[4909]: I0202 12:16:46.104259 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5ae91886-34b7-4b28-99ac-6f5e7bced3c7-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5ae91886-34b7-4b28-99ac-6f5e7bced3c7\") " pod="openstack/kube-state-metrics-0" Feb 02 12:16:46 crc kubenswrapper[4909]: I0202 12:16:46.104307 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-klwd9\" (UniqueName: \"kubernetes.io/projected/5ae91886-34b7-4b28-99ac-6f5e7bced3c7-kube-api-access-klwd9\") pod \"kube-state-metrics-0\" (UID: \"5ae91886-34b7-4b28-99ac-6f5e7bced3c7\") " pod="openstack/kube-state-metrics-0" Feb 02 12:16:46 crc kubenswrapper[4909]: I0202 12:16:46.104354 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ae91886-34b7-4b28-99ac-6f5e7bced3c7-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5ae91886-34b7-4b28-99ac-6f5e7bced3c7\") " pod="openstack/kube-state-metrics-0" Feb 02 12:16:46 crc kubenswrapper[4909]: I0202 12:16:46.206684 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ae91886-34b7-4b28-99ac-6f5e7bced3c7-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5ae91886-34b7-4b28-99ac-6f5e7bced3c7\") " pod="openstack/kube-state-metrics-0" Feb 02 12:16:46 crc kubenswrapper[4909]: I0202 12:16:46.206800 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5ae91886-34b7-4b28-99ac-6f5e7bced3c7-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5ae91886-34b7-4b28-99ac-6f5e7bced3c7\") " pod="openstack/kube-state-metrics-0" Feb 02 12:16:46 crc kubenswrapper[4909]: I0202 12:16:46.206847 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klwd9\" (UniqueName: \"kubernetes.io/projected/5ae91886-34b7-4b28-99ac-6f5e7bced3c7-kube-api-access-klwd9\") pod \"kube-state-metrics-0\" (UID: \"5ae91886-34b7-4b28-99ac-6f5e7bced3c7\") " pod="openstack/kube-state-metrics-0" Feb 02 12:16:46 crc kubenswrapper[4909]: I0202 12:16:46.206896 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ae91886-34b7-4b28-99ac-6f5e7bced3c7-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5ae91886-34b7-4b28-99ac-6f5e7bced3c7\") " pod="openstack/kube-state-metrics-0" Feb 02 12:16:46 crc kubenswrapper[4909]: I0202 12:16:46.213438 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5ae91886-34b7-4b28-99ac-6f5e7bced3c7-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5ae91886-34b7-4b28-99ac-6f5e7bced3c7\") " pod="openstack/kube-state-metrics-0" Feb 02 12:16:46 crc kubenswrapper[4909]: I0202 12:16:46.215234 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ae91886-34b7-4b28-99ac-6f5e7bced3c7-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5ae91886-34b7-4b28-99ac-6f5e7bced3c7\") " pod="openstack/kube-state-metrics-0" Feb 02 12:16:46 crc kubenswrapper[4909]: I0202 12:16:46.227644 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ae91886-34b7-4b28-99ac-6f5e7bced3c7-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5ae91886-34b7-4b28-99ac-6f5e7bced3c7\") " pod="openstack/kube-state-metrics-0" Feb 02 12:16:46 crc kubenswrapper[4909]: I0202 12:16:46.243511 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klwd9\" (UniqueName: \"kubernetes.io/projected/5ae91886-34b7-4b28-99ac-6f5e7bced3c7-kube-api-access-klwd9\") pod \"kube-state-metrics-0\" (UID: \"5ae91886-34b7-4b28-99ac-6f5e7bced3c7\") " pod="openstack/kube-state-metrics-0" Feb 02 12:16:46 crc kubenswrapper[4909]: I0202 12:16:46.344467 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 02 12:16:46 crc kubenswrapper[4909]: I0202 12:16:46.936043 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 02 12:16:46 crc kubenswrapper[4909]: W0202 12:16:46.936872 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ae91886_34b7_4b28_99ac_6f5e7bced3c7.slice/crio-80968b28735e5bc0aab15f98e18afc019199a4199e97ddc28b848c5a48961187 WatchSource:0}: Error finding container 80968b28735e5bc0aab15f98e18afc019199a4199e97ddc28b848c5a48961187: Status 404 returned error can't find the container with id 80968b28735e5bc0aab15f98e18afc019199a4199e97ddc28b848c5a48961187 Feb 02 12:16:46 crc kubenswrapper[4909]: I0202 12:16:46.948925 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5ae91886-34b7-4b28-99ac-6f5e7bced3c7","Type":"ContainerStarted","Data":"80968b28735e5bc0aab15f98e18afc019199a4199e97ddc28b848c5a48961187"} Feb 02 12:16:46 crc kubenswrapper[4909]: I0202 12:16:46.951682 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2","Type":"ContainerStarted","Data":"109eb7ab0bf1d7f9f9bd507d81ad4374346f9a7802bed873f2f80457edfff0a3"} Feb 02 12:16:47 crc kubenswrapper[4909]: I0202 12:16:47.027697 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="664bbecb-dcb0-42c9-8da4-83110fbcf138" path="/var/lib/kubelet/pods/664bbecb-dcb0-42c9-8da4-83110fbcf138/volumes" Feb 02 12:16:47 crc kubenswrapper[4909]: I0202 12:16:47.627008 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 12:16:47 crc kubenswrapper[4909]: I0202 12:16:47.963233 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"5ae91886-34b7-4b28-99ac-6f5e7bced3c7","Type":"ContainerStarted","Data":"3e2257c74be40f388a3301573ce028fcb6ffde5ab95f42f4b5f47af27e7673d8"} Feb 02 12:16:47 crc kubenswrapper[4909]: I0202 12:16:47.964479 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 02 12:16:47 crc kubenswrapper[4909]: I0202 12:16:47.985200 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.629875121 podStartE2EDuration="2.98517847s" podCreationTimestamp="2026-02-02 12:16:45 +0000 UTC" firstStartedPulling="2026-02-02 12:16:46.939854406 +0000 UTC m=+6332.685955141" lastFinishedPulling="2026-02-02 12:16:47.295157765 +0000 UTC m=+6333.041258490" observedRunningTime="2026-02-02 12:16:47.98412354 +0000 UTC m=+6333.730224275" watchObservedRunningTime="2026-02-02 12:16:47.98517847 +0000 UTC m=+6333.731279205" Feb 02 12:16:48 crc kubenswrapper[4909]: I0202 12:16:48.979385 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2","Type":"ContainerStarted","Data":"da4ef6638af145724b5fe64fe577490de64b0c3e89a710a3f050547f13bf2809"} Feb 02 12:16:48 crc kubenswrapper[4909]: I0202 12:16:48.979572 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bf9b0317-8316-4ebd-9ae4-dae875d9bbf2" containerName="ceilometer-central-agent" containerID="cri-o://d30e42df699923788848a6d68e7e1c062c27f9c08cb19f7978409264e2e2b346" gracePeriod=30 Feb 02 12:16:48 crc kubenswrapper[4909]: I0202 12:16:48.979601 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bf9b0317-8316-4ebd-9ae4-dae875d9bbf2" containerName="proxy-httpd" containerID="cri-o://da4ef6638af145724b5fe64fe577490de64b0c3e89a710a3f050547f13bf2809" gracePeriod=30 Feb 02 12:16:48 crc kubenswrapper[4909]: I0202 12:16:48.979654 
4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bf9b0317-8316-4ebd-9ae4-dae875d9bbf2" containerName="ceilometer-notification-agent" containerID="cri-o://84b9a321f6796e8ff9947c150644b9a9073a86cfe34dd8c2368a17ab78eb259d" gracePeriod=30 Feb 02 12:16:48 crc kubenswrapper[4909]: I0202 12:16:48.979673 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bf9b0317-8316-4ebd-9ae4-dae875d9bbf2" containerName="sg-core" containerID="cri-o://109eb7ab0bf1d7f9f9bd507d81ad4374346f9a7802bed873f2f80457edfff0a3" gracePeriod=30 Feb 02 12:16:48 crc kubenswrapper[4909]: I0202 12:16:48.981041 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 02 12:16:49 crc kubenswrapper[4909]: I0202 12:16:49.015992 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.598648179 podStartE2EDuration="7.015974942s" podCreationTimestamp="2026-02-02 12:16:42 +0000 UTC" firstStartedPulling="2026-02-02 12:16:44.20105316 +0000 UTC m=+6329.947153905" lastFinishedPulling="2026-02-02 12:16:48.618379933 +0000 UTC m=+6334.364480668" observedRunningTime="2026-02-02 12:16:49.006046281 +0000 UTC m=+6334.752147026" watchObservedRunningTime="2026-02-02 12:16:49.015974942 +0000 UTC m=+6334.762075677" Feb 02 12:16:49 crc kubenswrapper[4909]: I0202 12:16:49.511455 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:16:49 crc kubenswrapper[4909]: I0202 12:16:49.511519 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:16:49 crc kubenswrapper[4909]: I0202 12:16:49.990339 4909 generic.go:334] "Generic (PLEG): container finished" podID="bf9b0317-8316-4ebd-9ae4-dae875d9bbf2" containerID="da4ef6638af145724b5fe64fe577490de64b0c3e89a710a3f050547f13bf2809" exitCode=0 Feb 02 12:16:49 crc kubenswrapper[4909]: I0202 12:16:49.990552 4909 generic.go:334] "Generic (PLEG): container finished" podID="bf9b0317-8316-4ebd-9ae4-dae875d9bbf2" containerID="109eb7ab0bf1d7f9f9bd507d81ad4374346f9a7802bed873f2f80457edfff0a3" exitCode=2 Feb 02 12:16:49 crc kubenswrapper[4909]: I0202 12:16:49.990562 4909 generic.go:334] "Generic (PLEG): container finished" podID="bf9b0317-8316-4ebd-9ae4-dae875d9bbf2" containerID="84b9a321f6796e8ff9947c150644b9a9073a86cfe34dd8c2368a17ab78eb259d" exitCode=0 Feb 02 12:16:49 crc kubenswrapper[4909]: I0202 12:16:49.990432 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2","Type":"ContainerDied","Data":"da4ef6638af145724b5fe64fe577490de64b0c3e89a710a3f050547f13bf2809"} Feb 02 12:16:49 crc kubenswrapper[4909]: I0202 12:16:49.990669 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2","Type":"ContainerDied","Data":"109eb7ab0bf1d7f9f9bd507d81ad4374346f9a7802bed873f2f80457edfff0a3"} Feb 02 12:16:49 crc kubenswrapper[4909]: I0202 12:16:49.990686 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2","Type":"ContainerDied","Data":"84b9a321f6796e8ff9947c150644b9a9073a86cfe34dd8c2368a17ab78eb259d"} Feb 02 12:16:52 crc kubenswrapper[4909]: I0202 12:16:52.016232 4909 generic.go:334] "Generic (PLEG): container finished" podID="bf9b0317-8316-4ebd-9ae4-dae875d9bbf2" 
containerID="d30e42df699923788848a6d68e7e1c062c27f9c08cb19f7978409264e2e2b346" exitCode=0 Feb 02 12:16:52 crc kubenswrapper[4909]: I0202 12:16:52.016260 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2","Type":"ContainerDied","Data":"d30e42df699923788848a6d68e7e1c062c27f9c08cb19f7978409264e2e2b346"} Feb 02 12:16:52 crc kubenswrapper[4909]: I0202 12:16:52.490293 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 12:16:52 crc kubenswrapper[4909]: I0202 12:16:52.645350 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-log-httpd\") pod \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\" (UID: \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\") " Feb 02 12:16:52 crc kubenswrapper[4909]: I0202 12:16:52.645641 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdhhh\" (UniqueName: \"kubernetes.io/projected/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-kube-api-access-sdhhh\") pod \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\" (UID: \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\") " Feb 02 12:16:52 crc kubenswrapper[4909]: I0202 12:16:52.645725 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-sg-core-conf-yaml\") pod \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\" (UID: \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\") " Feb 02 12:16:52 crc kubenswrapper[4909]: I0202 12:16:52.645768 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-scripts\") pod \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\" (UID: \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\") " Feb 02 12:16:52 crc 
kubenswrapper[4909]: I0202 12:16:52.645832 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-config-data\") pod \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\" (UID: \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\") " Feb 02 12:16:52 crc kubenswrapper[4909]: I0202 12:16:52.645890 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-combined-ca-bundle\") pod \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\" (UID: \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\") " Feb 02 12:16:52 crc kubenswrapper[4909]: I0202 12:16:52.645956 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bf9b0317-8316-4ebd-9ae4-dae875d9bbf2" (UID: "bf9b0317-8316-4ebd-9ae4-dae875d9bbf2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:16:52 crc kubenswrapper[4909]: I0202 12:16:52.645977 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-run-httpd\") pod \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\" (UID: \"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2\") " Feb 02 12:16:52 crc kubenswrapper[4909]: I0202 12:16:52.647508 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bf9b0317-8316-4ebd-9ae4-dae875d9bbf2" (UID: "bf9b0317-8316-4ebd-9ae4-dae875d9bbf2"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:16:52 crc kubenswrapper[4909]: I0202 12:16:52.647698 4909 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:52 crc kubenswrapper[4909]: I0202 12:16:52.647713 4909 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:52 crc kubenswrapper[4909]: I0202 12:16:52.653458 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-scripts" (OuterVolumeSpecName: "scripts") pod "bf9b0317-8316-4ebd-9ae4-dae875d9bbf2" (UID: "bf9b0317-8316-4ebd-9ae4-dae875d9bbf2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:16:52 crc kubenswrapper[4909]: I0202 12:16:52.656009 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-kube-api-access-sdhhh" (OuterVolumeSpecName: "kube-api-access-sdhhh") pod "bf9b0317-8316-4ebd-9ae4-dae875d9bbf2" (UID: "bf9b0317-8316-4ebd-9ae4-dae875d9bbf2"). InnerVolumeSpecName "kube-api-access-sdhhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:16:52 crc kubenswrapper[4909]: I0202 12:16:52.678058 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bf9b0317-8316-4ebd-9ae4-dae875d9bbf2" (UID: "bf9b0317-8316-4ebd-9ae4-dae875d9bbf2"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:16:52 crc kubenswrapper[4909]: I0202 12:16:52.725562 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf9b0317-8316-4ebd-9ae4-dae875d9bbf2" (UID: "bf9b0317-8316-4ebd-9ae4-dae875d9bbf2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:16:52 crc kubenswrapper[4909]: I0202 12:16:52.749307 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:52 crc kubenswrapper[4909]: I0202 12:16:52.749353 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdhhh\" (UniqueName: \"kubernetes.io/projected/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-kube-api-access-sdhhh\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:52 crc kubenswrapper[4909]: I0202 12:16:52.749367 4909 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:52 crc kubenswrapper[4909]: I0202 12:16:52.749382 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:52 crc kubenswrapper[4909]: I0202 12:16:52.754556 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-config-data" (OuterVolumeSpecName: "config-data") pod "bf9b0317-8316-4ebd-9ae4-dae875d9bbf2" (UID: "bf9b0317-8316-4ebd-9ae4-dae875d9bbf2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:16:52 crc kubenswrapper[4909]: I0202 12:16:52.850928 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.029356 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bf9b0317-8316-4ebd-9ae4-dae875d9bbf2","Type":"ContainerDied","Data":"b7cc53d3e439cfb4e9ba2e501007a9680bceba275a83f40315fd5e1e04ef92dc"} Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.029432 4909 scope.go:117] "RemoveContainer" containerID="da4ef6638af145724b5fe64fe577490de64b0c3e89a710a3f050547f13bf2809" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.029450 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.062705 4909 scope.go:117] "RemoveContainer" containerID="109eb7ab0bf1d7f9f9bd507d81ad4374346f9a7802bed873f2f80457edfff0a3" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.086349 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.093383 4909 scope.go:117] "RemoveContainer" containerID="84b9a321f6796e8ff9947c150644b9a9073a86cfe34dd8c2368a17ab78eb259d" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.098309 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.111388 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 02 12:16:53 crc kubenswrapper[4909]: E0202 12:16:53.113488 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf9b0317-8316-4ebd-9ae4-dae875d9bbf2" containerName="sg-core" Feb 02 12:16:53 crc kubenswrapper[4909]: 
I0202 12:16:53.113526 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf9b0317-8316-4ebd-9ae4-dae875d9bbf2" containerName="sg-core" Feb 02 12:16:53 crc kubenswrapper[4909]: E0202 12:16:53.113543 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf9b0317-8316-4ebd-9ae4-dae875d9bbf2" containerName="ceilometer-central-agent" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.113552 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf9b0317-8316-4ebd-9ae4-dae875d9bbf2" containerName="ceilometer-central-agent" Feb 02 12:16:53 crc kubenswrapper[4909]: E0202 12:16:53.113578 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf9b0317-8316-4ebd-9ae4-dae875d9bbf2" containerName="ceilometer-notification-agent" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.113586 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf9b0317-8316-4ebd-9ae4-dae875d9bbf2" containerName="ceilometer-notification-agent" Feb 02 12:16:53 crc kubenswrapper[4909]: E0202 12:16:53.113600 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf9b0317-8316-4ebd-9ae4-dae875d9bbf2" containerName="proxy-httpd" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.113608 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf9b0317-8316-4ebd-9ae4-dae875d9bbf2" containerName="proxy-httpd" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.114289 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf9b0317-8316-4ebd-9ae4-dae875d9bbf2" containerName="proxy-httpd" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.114328 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf9b0317-8316-4ebd-9ae4-dae875d9bbf2" containerName="ceilometer-notification-agent" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.114354 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf9b0317-8316-4ebd-9ae4-dae875d9bbf2" containerName="ceilometer-central-agent" 
Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.114367 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf9b0317-8316-4ebd-9ae4-dae875d9bbf2" containerName="sg-core" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.116875 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.118854 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.118966 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.119192 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.134731 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.151735 4909 scope.go:117] "RemoveContainer" containerID="d30e42df699923788848a6d68e7e1c062c27f9c08cb19f7978409264e2e2b346" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.259023 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4650b583-6dc6-4707-8b7a-8742a9275288-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4650b583-6dc6-4707-8b7a-8742a9275288\") " pod="openstack/ceilometer-0" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.259101 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4650b583-6dc6-4707-8b7a-8742a9275288-scripts\") pod \"ceilometer-0\" (UID: \"4650b583-6dc6-4707-8b7a-8742a9275288\") " pod="openstack/ceilometer-0" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.259181 
4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4650b583-6dc6-4707-8b7a-8742a9275288-config-data\") pod \"ceilometer-0\" (UID: \"4650b583-6dc6-4707-8b7a-8742a9275288\") " pod="openstack/ceilometer-0" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.259258 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4650b583-6dc6-4707-8b7a-8742a9275288-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4650b583-6dc6-4707-8b7a-8742a9275288\") " pod="openstack/ceilometer-0" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.259355 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4650b583-6dc6-4707-8b7a-8742a9275288-run-httpd\") pod \"ceilometer-0\" (UID: \"4650b583-6dc6-4707-8b7a-8742a9275288\") " pod="openstack/ceilometer-0" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.259390 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4650b583-6dc6-4707-8b7a-8742a9275288-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4650b583-6dc6-4707-8b7a-8742a9275288\") " pod="openstack/ceilometer-0" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.259524 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9q5j\" (UniqueName: \"kubernetes.io/projected/4650b583-6dc6-4707-8b7a-8742a9275288-kube-api-access-b9q5j\") pod \"ceilometer-0\" (UID: \"4650b583-6dc6-4707-8b7a-8742a9275288\") " pod="openstack/ceilometer-0" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.259553 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/4650b583-6dc6-4707-8b7a-8742a9275288-log-httpd\") pod \"ceilometer-0\" (UID: \"4650b583-6dc6-4707-8b7a-8742a9275288\") " pod="openstack/ceilometer-0" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.361534 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4650b583-6dc6-4707-8b7a-8742a9275288-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4650b583-6dc6-4707-8b7a-8742a9275288\") " pod="openstack/ceilometer-0" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.361592 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4650b583-6dc6-4707-8b7a-8742a9275288-scripts\") pod \"ceilometer-0\" (UID: \"4650b583-6dc6-4707-8b7a-8742a9275288\") " pod="openstack/ceilometer-0" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.361626 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4650b583-6dc6-4707-8b7a-8742a9275288-config-data\") pod \"ceilometer-0\" (UID: \"4650b583-6dc6-4707-8b7a-8742a9275288\") " pod="openstack/ceilometer-0" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.361676 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4650b583-6dc6-4707-8b7a-8742a9275288-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4650b583-6dc6-4707-8b7a-8742a9275288\") " pod="openstack/ceilometer-0" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.361750 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4650b583-6dc6-4707-8b7a-8742a9275288-run-httpd\") pod \"ceilometer-0\" (UID: \"4650b583-6dc6-4707-8b7a-8742a9275288\") " pod="openstack/ceilometer-0" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 
12:16:53.361778 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4650b583-6dc6-4707-8b7a-8742a9275288-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4650b583-6dc6-4707-8b7a-8742a9275288\") " pod="openstack/ceilometer-0" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.361855 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9q5j\" (UniqueName: \"kubernetes.io/projected/4650b583-6dc6-4707-8b7a-8742a9275288-kube-api-access-b9q5j\") pod \"ceilometer-0\" (UID: \"4650b583-6dc6-4707-8b7a-8742a9275288\") " pod="openstack/ceilometer-0" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.361877 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4650b583-6dc6-4707-8b7a-8742a9275288-log-httpd\") pod \"ceilometer-0\" (UID: \"4650b583-6dc6-4707-8b7a-8742a9275288\") " pod="openstack/ceilometer-0" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.362412 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4650b583-6dc6-4707-8b7a-8742a9275288-log-httpd\") pod \"ceilometer-0\" (UID: \"4650b583-6dc6-4707-8b7a-8742a9275288\") " pod="openstack/ceilometer-0" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.362428 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4650b583-6dc6-4707-8b7a-8742a9275288-run-httpd\") pod \"ceilometer-0\" (UID: \"4650b583-6dc6-4707-8b7a-8742a9275288\") " pod="openstack/ceilometer-0" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.366092 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4650b583-6dc6-4707-8b7a-8742a9275288-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"4650b583-6dc6-4707-8b7a-8742a9275288\") " pod="openstack/ceilometer-0" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.366099 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4650b583-6dc6-4707-8b7a-8742a9275288-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4650b583-6dc6-4707-8b7a-8742a9275288\") " pod="openstack/ceilometer-0" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.366516 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4650b583-6dc6-4707-8b7a-8742a9275288-config-data\") pod \"ceilometer-0\" (UID: \"4650b583-6dc6-4707-8b7a-8742a9275288\") " pod="openstack/ceilometer-0" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.379247 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9q5j\" (UniqueName: \"kubernetes.io/projected/4650b583-6dc6-4707-8b7a-8742a9275288-kube-api-access-b9q5j\") pod \"ceilometer-0\" (UID: \"4650b583-6dc6-4707-8b7a-8742a9275288\") " pod="openstack/ceilometer-0" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.379248 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4650b583-6dc6-4707-8b7a-8742a9275288-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4650b583-6dc6-4707-8b7a-8742a9275288\") " pod="openstack/ceilometer-0" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.385704 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4650b583-6dc6-4707-8b7a-8742a9275288-scripts\") pod \"ceilometer-0\" (UID: \"4650b583-6dc6-4707-8b7a-8742a9275288\") " pod="openstack/ceilometer-0" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.452309 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 02 12:16:53 crc kubenswrapper[4909]: I0202 12:16:53.925353 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 02 12:16:54 crc kubenswrapper[4909]: I0202 12:16:54.040755 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4650b583-6dc6-4707-8b7a-8742a9275288","Type":"ContainerStarted","Data":"7fe1e152ab483a1a77e2691a5aee40867af8b3a69924860057ba1ba1db5555f6"} Feb 02 12:16:55 crc kubenswrapper[4909]: I0202 12:16:55.042199 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf9b0317-8316-4ebd-9ae4-dae875d9bbf2" path="/var/lib/kubelet/pods/bf9b0317-8316-4ebd-9ae4-dae875d9bbf2/volumes" Feb 02 12:16:55 crc kubenswrapper[4909]: I0202 12:16:55.060063 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4650b583-6dc6-4707-8b7a-8742a9275288","Type":"ContainerStarted","Data":"f3b1d2612768b96b49a3140d2e91c9817cdaa7ebf08dafc8182c27ea4c70ae77"} Feb 02 12:16:56 crc kubenswrapper[4909]: I0202 12:16:56.070714 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4650b583-6dc6-4707-8b7a-8742a9275288","Type":"ContainerStarted","Data":"3b12230a1bf23d2bf04cf581866796191914b3892a2d5698d75031af0143c4d5"} Feb 02 12:16:56 crc kubenswrapper[4909]: I0202 12:16:56.071346 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4650b583-6dc6-4707-8b7a-8742a9275288","Type":"ContainerStarted","Data":"4894977dee0fd4e01a881880962ef554f11e6e94cba3ab3200aa70c334e1a623"} Feb 02 12:16:56 crc kubenswrapper[4909]: I0202 12:16:56.354429 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 02 12:16:58 crc kubenswrapper[4909]: I0202 12:16:58.063624 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-hrrp5"] Feb 02 12:16:58 
crc kubenswrapper[4909]: I0202 12:16:58.073672 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-hrrp5"] Feb 02 12:16:59 crc kubenswrapper[4909]: I0202 12:16:59.028966 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44f1c177-7f74-49e3-a737-1cf825d08c5d" path="/var/lib/kubelet/pods/44f1c177-7f74-49e3-a737-1cf825d08c5d/volumes" Feb 02 12:16:59 crc kubenswrapper[4909]: I0202 12:16:59.036469 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-5743-account-create-update-v2jqx"] Feb 02 12:16:59 crc kubenswrapper[4909]: I0202 12:16:59.049632 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-9rcwj"] Feb 02 12:16:59 crc kubenswrapper[4909]: I0202 12:16:59.062179 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-m8g58"] Feb 02 12:16:59 crc kubenswrapper[4909]: I0202 12:16:59.074704 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-5399-account-create-update-rlcdb"] Feb 02 12:16:59 crc kubenswrapper[4909]: I0202 12:16:59.086012 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-b701-account-create-update-vpmq8"] Feb 02 12:16:59 crc kubenswrapper[4909]: I0202 12:16:59.101281 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-b701-account-create-update-vpmq8"] Feb 02 12:16:59 crc kubenswrapper[4909]: I0202 12:16:59.113320 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-9rcwj"] Feb 02 12:16:59 crc kubenswrapper[4909]: I0202 12:16:59.117596 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4650b583-6dc6-4707-8b7a-8742a9275288","Type":"ContainerStarted","Data":"fba82141eab221858f3ee42b3515a99db1e59176ecb3cb5ed85ca937ff0a2f39"} Feb 02 12:16:59 crc kubenswrapper[4909]: I0202 12:16:59.118766 4909 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ceilometer-0" Feb 02 12:16:59 crc kubenswrapper[4909]: I0202 12:16:59.124691 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-5399-account-create-update-rlcdb"] Feb 02 12:16:59 crc kubenswrapper[4909]: I0202 12:16:59.142125 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-m8g58"] Feb 02 12:16:59 crc kubenswrapper[4909]: I0202 12:16:59.151877 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-5743-account-create-update-v2jqx"] Feb 02 12:16:59 crc kubenswrapper[4909]: I0202 12:16:59.157119 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.219943229 podStartE2EDuration="6.15710026s" podCreationTimestamp="2026-02-02 12:16:53 +0000 UTC" firstStartedPulling="2026-02-02 12:16:53.935061769 +0000 UTC m=+6339.681162514" lastFinishedPulling="2026-02-02 12:16:57.87221881 +0000 UTC m=+6343.618319545" observedRunningTime="2026-02-02 12:16:59.13841037 +0000 UTC m=+6344.884511105" watchObservedRunningTime="2026-02-02 12:16:59.15710026 +0000 UTC m=+6344.903200995" Feb 02 12:17:01 crc kubenswrapper[4909]: I0202 12:17:01.026905 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42e427d5-c936-4052-958c-18d89b07527c" path="/var/lib/kubelet/pods/42e427d5-c936-4052-958c-18d89b07527c/volumes" Feb 02 12:17:01 crc kubenswrapper[4909]: I0202 12:17:01.028213 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4848315d-b355-43b8-961b-440dd5a94e2b" path="/var/lib/kubelet/pods/4848315d-b355-43b8-961b-440dd5a94e2b/volumes" Feb 02 12:17:01 crc kubenswrapper[4909]: I0202 12:17:01.028879 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8714421b-a562-42dc-8b61-262ddf02239f" path="/var/lib/kubelet/pods/8714421b-a562-42dc-8b61-262ddf02239f/volumes" Feb 02 12:17:01 crc kubenswrapper[4909]: I0202 12:17:01.029466 4909 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5d11b84-2bb6-432f-9a25-04877030be31" path="/var/lib/kubelet/pods/c5d11b84-2bb6-432f-9a25-04877030be31/volumes" Feb 02 12:17:01 crc kubenswrapper[4909]: I0202 12:17:01.030921 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2986b45-6e46-4553-999c-6a89b4565b88" path="/var/lib/kubelet/pods/d2986b45-6e46-4553-999c-6a89b4565b88/volumes" Feb 02 12:17:13 crc kubenswrapper[4909]: I0202 12:17:13.034415 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lljl6"] Feb 02 12:17:13 crc kubenswrapper[4909]: I0202 12:17:13.064488 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-lljl6"] Feb 02 12:17:15 crc kubenswrapper[4909]: I0202 12:17:15.033525 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c8ec03f-2941-41a2-b43f-10c7041993d0" path="/var/lib/kubelet/pods/5c8ec03f-2941-41a2-b43f-10c7041993d0/volumes" Feb 02 12:17:15 crc kubenswrapper[4909]: I0202 12:17:15.267089 4909 generic.go:334] "Generic (PLEG): container finished" podID="475ccef7-f010-4322-8223-5956a6e879f3" containerID="7461a34b33bd3499f3147e377bcd0b7d277edacf26a3da86ea63438385b40f45" exitCode=137 Feb 02 12:17:15 crc kubenswrapper[4909]: I0202 12:17:15.267178 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"475ccef7-f010-4322-8223-5956a6e879f3","Type":"ContainerDied","Data":"7461a34b33bd3499f3147e377bcd0b7d277edacf26a3da86ea63438385b40f45"} Feb 02 12:17:15 crc kubenswrapper[4909]: I0202 12:17:15.267450 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"475ccef7-f010-4322-8223-5956a6e879f3","Type":"ContainerDied","Data":"51c3727c6e3a0f6576de9032af07bea9a409584ad0522e2345eac9dbcd3afee1"} Feb 02 12:17:15 crc kubenswrapper[4909]: I0202 12:17:15.267467 4909 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="51c3727c6e3a0f6576de9032af07bea9a409584ad0522e2345eac9dbcd3afee1" Feb 02 12:17:15 crc kubenswrapper[4909]: I0202 12:17:15.360511 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Feb 02 12:17:15 crc kubenswrapper[4909]: I0202 12:17:15.415996 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/475ccef7-f010-4322-8223-5956a6e879f3-combined-ca-bundle\") pod \"475ccef7-f010-4322-8223-5956a6e879f3\" (UID: \"475ccef7-f010-4322-8223-5956a6e879f3\") " Feb 02 12:17:15 crc kubenswrapper[4909]: I0202 12:17:15.416120 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7q25m\" (UniqueName: \"kubernetes.io/projected/475ccef7-f010-4322-8223-5956a6e879f3-kube-api-access-7q25m\") pod \"475ccef7-f010-4322-8223-5956a6e879f3\" (UID: \"475ccef7-f010-4322-8223-5956a6e879f3\") " Feb 02 12:17:15 crc kubenswrapper[4909]: I0202 12:17:15.416225 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/475ccef7-f010-4322-8223-5956a6e879f3-config-data\") pod \"475ccef7-f010-4322-8223-5956a6e879f3\" (UID: \"475ccef7-f010-4322-8223-5956a6e879f3\") " Feb 02 12:17:15 crc kubenswrapper[4909]: I0202 12:17:15.416335 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/475ccef7-f010-4322-8223-5956a6e879f3-scripts\") pod \"475ccef7-f010-4322-8223-5956a6e879f3\" (UID: \"475ccef7-f010-4322-8223-5956a6e879f3\") " Feb 02 12:17:15 crc kubenswrapper[4909]: I0202 12:17:15.425038 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/475ccef7-f010-4322-8223-5956a6e879f3-kube-api-access-7q25m" (OuterVolumeSpecName: "kube-api-access-7q25m") pod "475ccef7-f010-4322-8223-5956a6e879f3" (UID: 
"475ccef7-f010-4322-8223-5956a6e879f3"). InnerVolumeSpecName "kube-api-access-7q25m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:17:15 crc kubenswrapper[4909]: I0202 12:17:15.433007 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/475ccef7-f010-4322-8223-5956a6e879f3-scripts" (OuterVolumeSpecName: "scripts") pod "475ccef7-f010-4322-8223-5956a6e879f3" (UID: "475ccef7-f010-4322-8223-5956a6e879f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:17:15 crc kubenswrapper[4909]: I0202 12:17:15.519842 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7q25m\" (UniqueName: \"kubernetes.io/projected/475ccef7-f010-4322-8223-5956a6e879f3-kube-api-access-7q25m\") on node \"crc\" DevicePath \"\"" Feb 02 12:17:15 crc kubenswrapper[4909]: I0202 12:17:15.520152 4909 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/475ccef7-f010-4322-8223-5956a6e879f3-scripts\") on node \"crc\" DevicePath \"\"" Feb 02 12:17:15 crc kubenswrapper[4909]: I0202 12:17:15.593921 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/475ccef7-f010-4322-8223-5956a6e879f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "475ccef7-f010-4322-8223-5956a6e879f3" (UID: "475ccef7-f010-4322-8223-5956a6e879f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:17:15 crc kubenswrapper[4909]: I0202 12:17:15.608059 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/475ccef7-f010-4322-8223-5956a6e879f3-config-data" (OuterVolumeSpecName: "config-data") pod "475ccef7-f010-4322-8223-5956a6e879f3" (UID: "475ccef7-f010-4322-8223-5956a6e879f3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:17:15 crc kubenswrapper[4909]: I0202 12:17:15.621490 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/475ccef7-f010-4322-8223-5956a6e879f3-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:17:15 crc kubenswrapper[4909]: I0202 12:17:15.621521 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/475ccef7-f010-4322-8223-5956a6e879f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.275632 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.315611 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.328373 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.342084 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Feb 02 12:17:16 crc kubenswrapper[4909]: E0202 12:17:16.342755 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="475ccef7-f010-4322-8223-5956a6e879f3" containerName="aodh-notifier" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.342906 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="475ccef7-f010-4322-8223-5956a6e879f3" containerName="aodh-notifier" Feb 02 12:17:16 crc kubenswrapper[4909]: E0202 12:17:16.343003 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="475ccef7-f010-4322-8223-5956a6e879f3" containerName="aodh-api" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.343075 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="475ccef7-f010-4322-8223-5956a6e879f3" containerName="aodh-api" Feb 02 12:17:16 crc kubenswrapper[4909]: E0202 
12:17:16.343150 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="475ccef7-f010-4322-8223-5956a6e879f3" containerName="aodh-evaluator" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.343201 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="475ccef7-f010-4322-8223-5956a6e879f3" containerName="aodh-evaluator" Feb 02 12:17:16 crc kubenswrapper[4909]: E0202 12:17:16.343338 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="475ccef7-f010-4322-8223-5956a6e879f3" containerName="aodh-listener" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.343418 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="475ccef7-f010-4322-8223-5956a6e879f3" containerName="aodh-listener" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.343713 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="475ccef7-f010-4322-8223-5956a6e879f3" containerName="aodh-api" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.343802 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="475ccef7-f010-4322-8223-5956a6e879f3" containerName="aodh-evaluator" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.343884 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="475ccef7-f010-4322-8223-5956a6e879f3" containerName="aodh-notifier" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.343948 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="475ccef7-f010-4322-8223-5956a6e879f3" containerName="aodh-listener" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.345853 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.349845 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.350050 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.350183 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.351719 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-9kf59" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.356189 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.357265 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.543010 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e799c0-b0ec-4362-8d6a-0d490bb38aab-internal-tls-certs\") pod \"aodh-0\" (UID: \"18e799c0-b0ec-4362-8d6a-0d490bb38aab\") " pod="openstack/aodh-0" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.543349 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e799c0-b0ec-4362-8d6a-0d490bb38aab-public-tls-certs\") pod \"aodh-0\" (UID: \"18e799c0-b0ec-4362-8d6a-0d490bb38aab\") " pod="openstack/aodh-0" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.544315 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97bc5\" (UniqueName: 
\"kubernetes.io/projected/18e799c0-b0ec-4362-8d6a-0d490bb38aab-kube-api-access-97bc5\") pod \"aodh-0\" (UID: \"18e799c0-b0ec-4362-8d6a-0d490bb38aab\") " pod="openstack/aodh-0" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.544532 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e799c0-b0ec-4362-8d6a-0d490bb38aab-combined-ca-bundle\") pod \"aodh-0\" (UID: \"18e799c0-b0ec-4362-8d6a-0d490bb38aab\") " pod="openstack/aodh-0" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.544640 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e799c0-b0ec-4362-8d6a-0d490bb38aab-config-data\") pod \"aodh-0\" (UID: \"18e799c0-b0ec-4362-8d6a-0d490bb38aab\") " pod="openstack/aodh-0" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.544673 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18e799c0-b0ec-4362-8d6a-0d490bb38aab-scripts\") pod \"aodh-0\" (UID: \"18e799c0-b0ec-4362-8d6a-0d490bb38aab\") " pod="openstack/aodh-0" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.647874 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e799c0-b0ec-4362-8d6a-0d490bb38aab-public-tls-certs\") pod \"aodh-0\" (UID: \"18e799c0-b0ec-4362-8d6a-0d490bb38aab\") " pod="openstack/aodh-0" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.647958 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97bc5\" (UniqueName: \"kubernetes.io/projected/18e799c0-b0ec-4362-8d6a-0d490bb38aab-kube-api-access-97bc5\") pod \"aodh-0\" (UID: \"18e799c0-b0ec-4362-8d6a-0d490bb38aab\") " pod="openstack/aodh-0" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 
12:17:16.647995 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e799c0-b0ec-4362-8d6a-0d490bb38aab-combined-ca-bundle\") pod \"aodh-0\" (UID: \"18e799c0-b0ec-4362-8d6a-0d490bb38aab\") " pod="openstack/aodh-0" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.648015 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e799c0-b0ec-4362-8d6a-0d490bb38aab-config-data\") pod \"aodh-0\" (UID: \"18e799c0-b0ec-4362-8d6a-0d490bb38aab\") " pod="openstack/aodh-0" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.648030 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18e799c0-b0ec-4362-8d6a-0d490bb38aab-scripts\") pod \"aodh-0\" (UID: \"18e799c0-b0ec-4362-8d6a-0d490bb38aab\") " pod="openstack/aodh-0" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.648104 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e799c0-b0ec-4362-8d6a-0d490bb38aab-internal-tls-certs\") pod \"aodh-0\" (UID: \"18e799c0-b0ec-4362-8d6a-0d490bb38aab\") " pod="openstack/aodh-0" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.653178 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e799c0-b0ec-4362-8d6a-0d490bb38aab-public-tls-certs\") pod \"aodh-0\" (UID: \"18e799c0-b0ec-4362-8d6a-0d490bb38aab\") " pod="openstack/aodh-0" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.655876 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18e799c0-b0ec-4362-8d6a-0d490bb38aab-scripts\") pod \"aodh-0\" (UID: \"18e799c0-b0ec-4362-8d6a-0d490bb38aab\") " pod="openstack/aodh-0" Feb 02 12:17:16 crc 
kubenswrapper[4909]: I0202 12:17:16.656134 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e799c0-b0ec-4362-8d6a-0d490bb38aab-internal-tls-certs\") pod \"aodh-0\" (UID: \"18e799c0-b0ec-4362-8d6a-0d490bb38aab\") " pod="openstack/aodh-0" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.656258 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e799c0-b0ec-4362-8d6a-0d490bb38aab-combined-ca-bundle\") pod \"aodh-0\" (UID: \"18e799c0-b0ec-4362-8d6a-0d490bb38aab\") " pod="openstack/aodh-0" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.656592 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e799c0-b0ec-4362-8d6a-0d490bb38aab-config-data\") pod \"aodh-0\" (UID: \"18e799c0-b0ec-4362-8d6a-0d490bb38aab\") " pod="openstack/aodh-0" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.669394 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97bc5\" (UniqueName: \"kubernetes.io/projected/18e799c0-b0ec-4362-8d6a-0d490bb38aab-kube-api-access-97bc5\") pod \"aodh-0\" (UID: \"18e799c0-b0ec-4362-8d6a-0d490bb38aab\") " pod="openstack/aodh-0" Feb 02 12:17:16 crc kubenswrapper[4909]: I0202 12:17:16.967546 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Feb 02 12:17:17 crc kubenswrapper[4909]: I0202 12:17:17.046466 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="475ccef7-f010-4322-8223-5956a6e879f3" path="/var/lib/kubelet/pods/475ccef7-f010-4322-8223-5956a6e879f3/volumes" Feb 02 12:17:17 crc kubenswrapper[4909]: I0202 12:17:17.492662 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 02 12:17:17 crc kubenswrapper[4909]: W0202 12:17:17.496795 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18e799c0_b0ec_4362_8d6a_0d490bb38aab.slice/crio-837fa9b3b7f643c2bb75af2a5b873e251016e2e8854c1e3d432e2cb9f9fbfaad WatchSource:0}: Error finding container 837fa9b3b7f643c2bb75af2a5b873e251016e2e8854c1e3d432e2cb9f9fbfaad: Status 404 returned error can't find the container with id 837fa9b3b7f643c2bb75af2a5b873e251016e2e8854c1e3d432e2cb9f9fbfaad Feb 02 12:17:18 crc kubenswrapper[4909]: I0202 12:17:18.297081 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"18e799c0-b0ec-4362-8d6a-0d490bb38aab","Type":"ContainerStarted","Data":"837fa9b3b7f643c2bb75af2a5b873e251016e2e8854c1e3d432e2cb9f9fbfaad"} Feb 02 12:17:18 crc kubenswrapper[4909]: I0202 12:17:18.581925 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84bcffb485-mkz9x"] Feb 02 12:17:18 crc kubenswrapper[4909]: I0202 12:17:18.584042 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" Feb 02 12:17:18 crc kubenswrapper[4909]: I0202 12:17:18.604005 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Feb 02 12:17:18 crc kubenswrapper[4909]: I0202 12:17:18.610449 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bcffb485-mkz9x"] Feb 02 12:17:18 crc kubenswrapper[4909]: I0202 12:17:18.690271 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-openstack-cell1\") pod \"dnsmasq-dns-84bcffb485-mkz9x\" (UID: \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\") " pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" Feb 02 12:17:18 crc kubenswrapper[4909]: I0202 12:17:18.690356 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsf9h\" (UniqueName: \"kubernetes.io/projected/589e6fdd-d3b7-4dba-9a23-4a669e8befab-kube-api-access-gsf9h\") pod \"dnsmasq-dns-84bcffb485-mkz9x\" (UID: \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\") " pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" Feb 02 12:17:18 crc kubenswrapper[4909]: I0202 12:17:18.690418 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-config\") pod \"dnsmasq-dns-84bcffb485-mkz9x\" (UID: \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\") " pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" Feb 02 12:17:18 crc kubenswrapper[4909]: I0202 12:17:18.690541 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-ovsdbserver-sb\") pod \"dnsmasq-dns-84bcffb485-mkz9x\" (UID: \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\") " 
pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" Feb 02 12:17:18 crc kubenswrapper[4909]: I0202 12:17:18.690826 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-dns-svc\") pod \"dnsmasq-dns-84bcffb485-mkz9x\" (UID: \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\") " pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" Feb 02 12:17:18 crc kubenswrapper[4909]: I0202 12:17:18.690849 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-ovsdbserver-nb\") pod \"dnsmasq-dns-84bcffb485-mkz9x\" (UID: \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\") " pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" Feb 02 12:17:18 crc kubenswrapper[4909]: I0202 12:17:18.794449 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-openstack-cell1\") pod \"dnsmasq-dns-84bcffb485-mkz9x\" (UID: \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\") " pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" Feb 02 12:17:18 crc kubenswrapper[4909]: I0202 12:17:18.794723 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsf9h\" (UniqueName: \"kubernetes.io/projected/589e6fdd-d3b7-4dba-9a23-4a669e8befab-kube-api-access-gsf9h\") pod \"dnsmasq-dns-84bcffb485-mkz9x\" (UID: \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\") " pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" Feb 02 12:17:18 crc kubenswrapper[4909]: I0202 12:17:18.795130 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-config\") pod \"dnsmasq-dns-84bcffb485-mkz9x\" (UID: \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\") " 
pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" Feb 02 12:17:18 crc kubenswrapper[4909]: I0202 12:17:18.795241 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-ovsdbserver-sb\") pod \"dnsmasq-dns-84bcffb485-mkz9x\" (UID: \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\") " pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" Feb 02 12:17:18 crc kubenswrapper[4909]: I0202 12:17:18.795459 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-dns-svc\") pod \"dnsmasq-dns-84bcffb485-mkz9x\" (UID: \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\") " pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" Feb 02 12:17:18 crc kubenswrapper[4909]: I0202 12:17:18.795567 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-ovsdbserver-nb\") pod \"dnsmasq-dns-84bcffb485-mkz9x\" (UID: \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\") " pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" Feb 02 12:17:18 crc kubenswrapper[4909]: I0202 12:17:18.795904 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-openstack-cell1\") pod \"dnsmasq-dns-84bcffb485-mkz9x\" (UID: \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\") " pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" Feb 02 12:17:18 crc kubenswrapper[4909]: I0202 12:17:18.796648 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-ovsdbserver-nb\") pod \"dnsmasq-dns-84bcffb485-mkz9x\" (UID: \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\") " pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" Feb 02 12:17:18 crc 
kubenswrapper[4909]: I0202 12:17:18.797707 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-dns-svc\") pod \"dnsmasq-dns-84bcffb485-mkz9x\" (UID: \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\") " pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" Feb 02 12:17:18 crc kubenswrapper[4909]: I0202 12:17:18.797828 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-ovsdbserver-sb\") pod \"dnsmasq-dns-84bcffb485-mkz9x\" (UID: \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\") " pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" Feb 02 12:17:18 crc kubenswrapper[4909]: I0202 12:17:18.798919 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-config\") pod \"dnsmasq-dns-84bcffb485-mkz9x\" (UID: \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\") " pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" Feb 02 12:17:18 crc kubenswrapper[4909]: I0202 12:17:18.828686 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsf9h\" (UniqueName: \"kubernetes.io/projected/589e6fdd-d3b7-4dba-9a23-4a669e8befab-kube-api-access-gsf9h\") pod \"dnsmasq-dns-84bcffb485-mkz9x\" (UID: \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\") " pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" Feb 02 12:17:18 crc kubenswrapper[4909]: I0202 12:17:18.905568 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" Feb 02 12:17:19 crc kubenswrapper[4909]: I0202 12:17:19.310246 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"18e799c0-b0ec-4362-8d6a-0d490bb38aab","Type":"ContainerStarted","Data":"81ee818c35a053fc463b01668e15d787ccbd5dcabfd0e619dfa2e5329079aa58"} Feb 02 12:17:19 crc kubenswrapper[4909]: I0202 12:17:19.310676 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"18e799c0-b0ec-4362-8d6a-0d490bb38aab","Type":"ContainerStarted","Data":"e65243849f97c43e614ef3a432138c72958d7721371ec07404041e8ef5069558"} Feb 02 12:17:19 crc kubenswrapper[4909]: I0202 12:17:19.420364 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bcffb485-mkz9x"] Feb 02 12:17:19 crc kubenswrapper[4909]: I0202 12:17:19.511140 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:17:19 crc kubenswrapper[4909]: I0202 12:17:19.511744 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:17:19 crc kubenswrapper[4909]: I0202 12:17:19.511824 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 12:17:19 crc kubenswrapper[4909]: I0202 12:17:19.513142 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 12:17:19 crc kubenswrapper[4909]: I0202 12:17:19.513231 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0" gracePeriod=600 Feb 02 12:17:19 crc kubenswrapper[4909]: E0202 12:17:19.671034 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:17:20 crc kubenswrapper[4909]: I0202 12:17:20.319517 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"18e799c0-b0ec-4362-8d6a-0d490bb38aab","Type":"ContainerStarted","Data":"90f0ec5d5bb59fa95713ea0c45d5388290051abf5aac06a499757642683b9a8f"} Feb 02 12:17:20 crc kubenswrapper[4909]: I0202 12:17:20.323665 4909 generic.go:334] "Generic (PLEG): container finished" podID="589e6fdd-d3b7-4dba-9a23-4a669e8befab" containerID="1043c963e4c4b5aa2e91fcb0a91fd772ea2c1d426ded7227086f1b2a115d2863" exitCode=0 Feb 02 12:17:20 crc kubenswrapper[4909]: I0202 12:17:20.323724 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" 
event={"ID":"589e6fdd-d3b7-4dba-9a23-4a669e8befab","Type":"ContainerDied","Data":"1043c963e4c4b5aa2e91fcb0a91fd772ea2c1d426ded7227086f1b2a115d2863"} Feb 02 12:17:20 crc kubenswrapper[4909]: I0202 12:17:20.324095 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" event={"ID":"589e6fdd-d3b7-4dba-9a23-4a669e8befab","Type":"ContainerStarted","Data":"c7775d4bb37444fb448eb27a92eced75f6007b6be20d5b3024a6ea8ac1e30dbb"} Feb 02 12:17:20 crc kubenswrapper[4909]: I0202 12:17:20.328096 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0" exitCode=0 Feb 02 12:17:20 crc kubenswrapper[4909]: I0202 12:17:20.328134 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0"} Feb 02 12:17:20 crc kubenswrapper[4909]: I0202 12:17:20.328163 4909 scope.go:117] "RemoveContainer" containerID="7efda102e84c90fb99553b44e1e0fb3c586800eb829f32e3cb14404521e127c2" Feb 02 12:17:20 crc kubenswrapper[4909]: I0202 12:17:20.328956 4909 scope.go:117] "RemoveContainer" containerID="442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0" Feb 02 12:17:20 crc kubenswrapper[4909]: E0202 12:17:20.329210 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:17:21 crc kubenswrapper[4909]: I0202 12:17:21.340069 4909 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"18e799c0-b0ec-4362-8d6a-0d490bb38aab","Type":"ContainerStarted","Data":"b88defd86c6e29782f1562dbc9a3c16333c5750085b5ce8adc1731871c419245"} Feb 02 12:17:21 crc kubenswrapper[4909]: I0202 12:17:21.344590 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" event={"ID":"589e6fdd-d3b7-4dba-9a23-4a669e8befab","Type":"ContainerStarted","Data":"417da3d0d2862effeef38ca6593b7da364185a89d20bedcbaa9d708008bbb387"} Feb 02 12:17:21 crc kubenswrapper[4909]: I0202 12:17:21.345376 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" Feb 02 12:17:21 crc kubenswrapper[4909]: I0202 12:17:21.376099 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.492672599 podStartE2EDuration="5.376075437s" podCreationTimestamp="2026-02-02 12:17:16 +0000 UTC" firstStartedPulling="2026-02-02 12:17:17.499720691 +0000 UTC m=+6363.245821446" lastFinishedPulling="2026-02-02 12:17:20.383123549 +0000 UTC m=+6366.129224284" observedRunningTime="2026-02-02 12:17:21.372613679 +0000 UTC m=+6367.118714414" watchObservedRunningTime="2026-02-02 12:17:21.376075437 +0000 UTC m=+6367.122176172" Feb 02 12:17:21 crc kubenswrapper[4909]: I0202 12:17:21.450708 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" podStartSLOduration=3.450688244 podStartE2EDuration="3.450688244s" podCreationTimestamp="2026-02-02 12:17:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:17:21.430321116 +0000 UTC m=+6367.176421851" watchObservedRunningTime="2026-02-02 12:17:21.450688244 +0000 UTC m=+6367.196788979" Feb 02 12:17:23 crc kubenswrapper[4909]: I0202 12:17:23.467106 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/ceilometer-0" Feb 02 12:17:28 crc kubenswrapper[4909]: I0202 12:17:28.907007 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" Feb 02 12:17:28 crc kubenswrapper[4909]: I0202 12:17:28.989212 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86d4bc6c6f-wd2wq"] Feb 02 12:17:28 crc kubenswrapper[4909]: I0202 12:17:28.989564 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86d4bc6c6f-wd2wq" podUID="02dee8a0-2a8f-471d-b345-56620697930f" containerName="dnsmasq-dns" containerID="cri-o://866cf9644aadcf2c661c4b7660023c223d1995ead8613324dc731173f33f610a" gracePeriod=10 Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.023754 4909 scope.go:117] "RemoveContainer" containerID="769460b0ab5109d1895e253677ff79de6e322d836ca318ae6308a6fbbf4e02b4" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.094678 4909 scope.go:117] "RemoveContainer" containerID="24c9a991d3fd21a9ed5d3b49b487ede40551fd41bdb80efea4b32e9ff8af9291" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.231752 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8455c59dc-cb5zc"] Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.239954 4909 scope.go:117] "RemoveContainer" containerID="35dd6db6f8142a6fa8503586aa43ded6a3fc9638519c7a152f8c877b372bdc13" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.254578 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8455c59dc-cb5zc"] Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.254675 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8455c59dc-cb5zc" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.292290 4909 scope.go:117] "RemoveContainer" containerID="2e6a2733feb147774be39298787e3b3e5b27d1fe88d1ac9eba9603544ababe3d" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.357974 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f5g6\" (UniqueName: \"kubernetes.io/projected/ef32a0af-7943-4b2f-bf8b-a71421a53d26-kube-api-access-8f5g6\") pod \"dnsmasq-dns-8455c59dc-cb5zc\" (UID: \"ef32a0af-7943-4b2f-bf8b-a71421a53d26\") " pod="openstack/dnsmasq-dns-8455c59dc-cb5zc" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.358461 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef32a0af-7943-4b2f-bf8b-a71421a53d26-ovsdbserver-nb\") pod \"dnsmasq-dns-8455c59dc-cb5zc\" (UID: \"ef32a0af-7943-4b2f-bf8b-a71421a53d26\") " pod="openstack/dnsmasq-dns-8455c59dc-cb5zc" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.358494 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef32a0af-7943-4b2f-bf8b-a71421a53d26-ovsdbserver-sb\") pod \"dnsmasq-dns-8455c59dc-cb5zc\" (UID: \"ef32a0af-7943-4b2f-bf8b-a71421a53d26\") " pod="openstack/dnsmasq-dns-8455c59dc-cb5zc" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.358540 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/ef32a0af-7943-4b2f-bf8b-a71421a53d26-openstack-cell1\") pod \"dnsmasq-dns-8455c59dc-cb5zc\" (UID: \"ef32a0af-7943-4b2f-bf8b-a71421a53d26\") " pod="openstack/dnsmasq-dns-8455c59dc-cb5zc" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.358595 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef32a0af-7943-4b2f-bf8b-a71421a53d26-config\") pod \"dnsmasq-dns-8455c59dc-cb5zc\" (UID: \"ef32a0af-7943-4b2f-bf8b-a71421a53d26\") " pod="openstack/dnsmasq-dns-8455c59dc-cb5zc" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.358695 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef32a0af-7943-4b2f-bf8b-a71421a53d26-dns-svc\") pod \"dnsmasq-dns-8455c59dc-cb5zc\" (UID: \"ef32a0af-7943-4b2f-bf8b-a71421a53d26\") " pod="openstack/dnsmasq-dns-8455c59dc-cb5zc" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.364275 4909 scope.go:117] "RemoveContainer" containerID="48081bf6362bf6146ca90dc6fa377e02b8965d15aca0a3b9cd6b4eee0776927f" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.452392 4909 generic.go:334] "Generic (PLEG): container finished" podID="02dee8a0-2a8f-471d-b345-56620697930f" containerID="866cf9644aadcf2c661c4b7660023c223d1995ead8613324dc731173f33f610a" exitCode=0 Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.452453 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d4bc6c6f-wd2wq" event={"ID":"02dee8a0-2a8f-471d-b345-56620697930f","Type":"ContainerDied","Data":"866cf9644aadcf2c661c4b7660023c223d1995ead8613324dc731173f33f610a"} Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.460037 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef32a0af-7943-4b2f-bf8b-a71421a53d26-dns-svc\") pod \"dnsmasq-dns-8455c59dc-cb5zc\" (UID: \"ef32a0af-7943-4b2f-bf8b-a71421a53d26\") " pod="openstack/dnsmasq-dns-8455c59dc-cb5zc" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.460102 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f5g6\" (UniqueName: 
\"kubernetes.io/projected/ef32a0af-7943-4b2f-bf8b-a71421a53d26-kube-api-access-8f5g6\") pod \"dnsmasq-dns-8455c59dc-cb5zc\" (UID: \"ef32a0af-7943-4b2f-bf8b-a71421a53d26\") " pod="openstack/dnsmasq-dns-8455c59dc-cb5zc" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.460220 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef32a0af-7943-4b2f-bf8b-a71421a53d26-ovsdbserver-nb\") pod \"dnsmasq-dns-8455c59dc-cb5zc\" (UID: \"ef32a0af-7943-4b2f-bf8b-a71421a53d26\") " pod="openstack/dnsmasq-dns-8455c59dc-cb5zc" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.460243 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef32a0af-7943-4b2f-bf8b-a71421a53d26-ovsdbserver-sb\") pod \"dnsmasq-dns-8455c59dc-cb5zc\" (UID: \"ef32a0af-7943-4b2f-bf8b-a71421a53d26\") " pod="openstack/dnsmasq-dns-8455c59dc-cb5zc" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.460273 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/ef32a0af-7943-4b2f-bf8b-a71421a53d26-openstack-cell1\") pod \"dnsmasq-dns-8455c59dc-cb5zc\" (UID: \"ef32a0af-7943-4b2f-bf8b-a71421a53d26\") " pod="openstack/dnsmasq-dns-8455c59dc-cb5zc" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.460307 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef32a0af-7943-4b2f-bf8b-a71421a53d26-config\") pod \"dnsmasq-dns-8455c59dc-cb5zc\" (UID: \"ef32a0af-7943-4b2f-bf8b-a71421a53d26\") " pod="openstack/dnsmasq-dns-8455c59dc-cb5zc" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.463385 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef32a0af-7943-4b2f-bf8b-a71421a53d26-dns-svc\") pod 
\"dnsmasq-dns-8455c59dc-cb5zc\" (UID: \"ef32a0af-7943-4b2f-bf8b-a71421a53d26\") " pod="openstack/dnsmasq-dns-8455c59dc-cb5zc" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.463695 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef32a0af-7943-4b2f-bf8b-a71421a53d26-ovsdbserver-nb\") pod \"dnsmasq-dns-8455c59dc-cb5zc\" (UID: \"ef32a0af-7943-4b2f-bf8b-a71421a53d26\") " pod="openstack/dnsmasq-dns-8455c59dc-cb5zc" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.463986 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/ef32a0af-7943-4b2f-bf8b-a71421a53d26-openstack-cell1\") pod \"dnsmasq-dns-8455c59dc-cb5zc\" (UID: \"ef32a0af-7943-4b2f-bf8b-a71421a53d26\") " pod="openstack/dnsmasq-dns-8455c59dc-cb5zc" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.464909 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef32a0af-7943-4b2f-bf8b-a71421a53d26-ovsdbserver-sb\") pod \"dnsmasq-dns-8455c59dc-cb5zc\" (UID: \"ef32a0af-7943-4b2f-bf8b-a71421a53d26\") " pod="openstack/dnsmasq-dns-8455c59dc-cb5zc" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.464925 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef32a0af-7943-4b2f-bf8b-a71421a53d26-config\") pod \"dnsmasq-dns-8455c59dc-cb5zc\" (UID: \"ef32a0af-7943-4b2f-bf8b-a71421a53d26\") " pod="openstack/dnsmasq-dns-8455c59dc-cb5zc" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.476864 4909 scope.go:117] "RemoveContainer" containerID="f509d4206fb48841bfc489248aad52ac9240136f0d669359c7867bdb92d643bc" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.493179 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f5g6\" (UniqueName: 
\"kubernetes.io/projected/ef32a0af-7943-4b2f-bf8b-a71421a53d26-kube-api-access-8f5g6\") pod \"dnsmasq-dns-8455c59dc-cb5zc\" (UID: \"ef32a0af-7943-4b2f-bf8b-a71421a53d26\") " pod="openstack/dnsmasq-dns-8455c59dc-cb5zc" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.534847 4909 scope.go:117] "RemoveContainer" containerID="bb88c2966cceec49ea9bc8516b2c47570c6c0c018f1d7519bb9180de67e69a78" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.671856 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86d4bc6c6f-wd2wq" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.730485 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8455c59dc-cb5zc" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.767100 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02dee8a0-2a8f-471d-b345-56620697930f-ovsdbserver-nb\") pod \"02dee8a0-2a8f-471d-b345-56620697930f\" (UID: \"02dee8a0-2a8f-471d-b345-56620697930f\") " Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.767156 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02dee8a0-2a8f-471d-b345-56620697930f-dns-svc\") pod \"02dee8a0-2a8f-471d-b345-56620697930f\" (UID: \"02dee8a0-2a8f-471d-b345-56620697930f\") " Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.767220 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02dee8a0-2a8f-471d-b345-56620697930f-ovsdbserver-sb\") pod \"02dee8a0-2a8f-471d-b345-56620697930f\" (UID: \"02dee8a0-2a8f-471d-b345-56620697930f\") " Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.767480 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/02dee8a0-2a8f-471d-b345-56620697930f-config\") pod \"02dee8a0-2a8f-471d-b345-56620697930f\" (UID: \"02dee8a0-2a8f-471d-b345-56620697930f\") " Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.767559 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjcmp\" (UniqueName: \"kubernetes.io/projected/02dee8a0-2a8f-471d-b345-56620697930f-kube-api-access-mjcmp\") pod \"02dee8a0-2a8f-471d-b345-56620697930f\" (UID: \"02dee8a0-2a8f-471d-b345-56620697930f\") " Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.778247 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02dee8a0-2a8f-471d-b345-56620697930f-kube-api-access-mjcmp" (OuterVolumeSpecName: "kube-api-access-mjcmp") pod "02dee8a0-2a8f-471d-b345-56620697930f" (UID: "02dee8a0-2a8f-471d-b345-56620697930f"). InnerVolumeSpecName "kube-api-access-mjcmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.847885 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02dee8a0-2a8f-471d-b345-56620697930f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "02dee8a0-2a8f-471d-b345-56620697930f" (UID: "02dee8a0-2a8f-471d-b345-56620697930f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.858850 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02dee8a0-2a8f-471d-b345-56620697930f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "02dee8a0-2a8f-471d-b345-56620697930f" (UID: "02dee8a0-2a8f-471d-b345-56620697930f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.858944 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02dee8a0-2a8f-471d-b345-56620697930f-config" (OuterVolumeSpecName: "config") pod "02dee8a0-2a8f-471d-b345-56620697930f" (UID: "02dee8a0-2a8f-471d-b345-56620697930f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.870172 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjcmp\" (UniqueName: \"kubernetes.io/projected/02dee8a0-2a8f-471d-b345-56620697930f-kube-api-access-mjcmp\") on node \"crc\" DevicePath \"\"" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.870206 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02dee8a0-2a8f-471d-b345-56620697930f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.870219 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02dee8a0-2a8f-471d-b345-56620697930f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.870232 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02dee8a0-2a8f-471d-b345-56620697930f-config\") on node \"crc\" DevicePath \"\"" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.877593 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02dee8a0-2a8f-471d-b345-56620697930f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "02dee8a0-2a8f-471d-b345-56620697930f" (UID: "02dee8a0-2a8f-471d-b345-56620697930f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:17:29 crc kubenswrapper[4909]: I0202 12:17:29.973002 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02dee8a0-2a8f-471d-b345-56620697930f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 12:17:30 crc kubenswrapper[4909]: I0202 12:17:30.349482 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8455c59dc-cb5zc"] Feb 02 12:17:30 crc kubenswrapper[4909]: I0202 12:17:30.488162 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8455c59dc-cb5zc" event={"ID":"ef32a0af-7943-4b2f-bf8b-a71421a53d26","Type":"ContainerStarted","Data":"259de69e6a7a9286605bdd2cd8fa5d1c1bfb801613a7e8ccc7426757f2d0efd1"} Feb 02 12:17:30 crc kubenswrapper[4909]: I0202 12:17:30.489926 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d4bc6c6f-wd2wq" event={"ID":"02dee8a0-2a8f-471d-b345-56620697930f","Type":"ContainerDied","Data":"fed58ec3a28cc1a88d382a3438d42e6a2f7733d1d1c32eca752ff00627517fb5"} Feb 02 12:17:30 crc kubenswrapper[4909]: I0202 12:17:30.489960 4909 scope.go:117] "RemoveContainer" containerID="866cf9644aadcf2c661c4b7660023c223d1995ead8613324dc731173f33f610a" Feb 02 12:17:30 crc kubenswrapper[4909]: I0202 12:17:30.490036 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86d4bc6c6f-wd2wq" Feb 02 12:17:30 crc kubenswrapper[4909]: I0202 12:17:30.534704 4909 scope.go:117] "RemoveContainer" containerID="a036980d6ac355b6fff60685a4b354a3bc947bead551eff24929b8c2ec2143e8" Feb 02 12:17:30 crc kubenswrapper[4909]: I0202 12:17:30.577081 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86d4bc6c6f-wd2wq"] Feb 02 12:17:30 crc kubenswrapper[4909]: I0202 12:17:30.586247 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86d4bc6c6f-wd2wq"] Feb 02 12:17:31 crc kubenswrapper[4909]: I0202 12:17:31.017433 4909 scope.go:117] "RemoveContainer" containerID="442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0" Feb 02 12:17:31 crc kubenswrapper[4909]: E0202 12:17:31.017677 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:17:31 crc kubenswrapper[4909]: I0202 12:17:31.031617 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02dee8a0-2a8f-471d-b345-56620697930f" path="/var/lib/kubelet/pods/02dee8a0-2a8f-471d-b345-56620697930f/volumes" Feb 02 12:17:31 crc kubenswrapper[4909]: I0202 12:17:31.504155 4909 generic.go:334] "Generic (PLEG): container finished" podID="ef32a0af-7943-4b2f-bf8b-a71421a53d26" containerID="f61df5615b25c5aeff31e3888b370de17a306f89d53e5892a277382c3a215041" exitCode=0 Feb 02 12:17:31 crc kubenswrapper[4909]: I0202 12:17:31.504204 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8455c59dc-cb5zc" 
event={"ID":"ef32a0af-7943-4b2f-bf8b-a71421a53d26","Type":"ContainerDied","Data":"f61df5615b25c5aeff31e3888b370de17a306f89d53e5892a277382c3a215041"} Feb 02 12:17:32 crc kubenswrapper[4909]: I0202 12:17:32.066520 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nv42m"] Feb 02 12:17:32 crc kubenswrapper[4909]: I0202 12:17:32.077416 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-nv42m"] Feb 02 12:17:32 crc kubenswrapper[4909]: I0202 12:17:32.525399 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8455c59dc-cb5zc" event={"ID":"ef32a0af-7943-4b2f-bf8b-a71421a53d26","Type":"ContainerStarted","Data":"986b72c106faae8e22bb456d458cf540e7af6400dad31236241578c99b5951ef"} Feb 02 12:17:32 crc kubenswrapper[4909]: I0202 12:17:32.525757 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8455c59dc-cb5zc" Feb 02 12:17:32 crc kubenswrapper[4909]: I0202 12:17:32.545341 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8455c59dc-cb5zc" podStartSLOduration=3.545318981 podStartE2EDuration="3.545318981s" podCreationTimestamp="2026-02-02 12:17:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:17:32.540645748 +0000 UTC m=+6378.286746493" watchObservedRunningTime="2026-02-02 12:17:32.545318981 +0000 UTC m=+6378.291419716" Feb 02 12:17:33 crc kubenswrapper[4909]: I0202 12:17:33.028169 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61a9bf80-53fb-48a1-b6ce-92cb6c3b0736" path="/var/lib/kubelet/pods/61a9bf80-53fb-48a1-b6ce-92cb6c3b0736/volumes" Feb 02 12:17:34 crc kubenswrapper[4909]: I0202 12:17:34.048466 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-gwh2v"] Feb 02 12:17:34 crc 
kubenswrapper[4909]: I0202 12:17:34.061544 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-gwh2v"] Feb 02 12:17:35 crc kubenswrapper[4909]: I0202 12:17:35.029018 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bf1731a-52df-489c-a7f3-8f07774153a1" path="/var/lib/kubelet/pods/2bf1731a-52df-489c-a7f3-8f07774153a1/volumes" Feb 02 12:17:39 crc kubenswrapper[4909]: I0202 12:17:39.733680 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8455c59dc-cb5zc" Feb 02 12:17:39 crc kubenswrapper[4909]: I0202 12:17:39.838508 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bcffb485-mkz9x"] Feb 02 12:17:39 crc kubenswrapper[4909]: I0202 12:17:39.838779 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" podUID="589e6fdd-d3b7-4dba-9a23-4a669e8befab" containerName="dnsmasq-dns" containerID="cri-o://417da3d0d2862effeef38ca6593b7da364185a89d20bedcbaa9d708008bbb387" gracePeriod=10 Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.561624 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.613062 4909 generic.go:334] "Generic (PLEG): container finished" podID="589e6fdd-d3b7-4dba-9a23-4a669e8befab" containerID="417da3d0d2862effeef38ca6593b7da364185a89d20bedcbaa9d708008bbb387" exitCode=0 Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.613116 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" event={"ID":"589e6fdd-d3b7-4dba-9a23-4a669e8befab","Type":"ContainerDied","Data":"417da3d0d2862effeef38ca6593b7da364185a89d20bedcbaa9d708008bbb387"} Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.613145 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" event={"ID":"589e6fdd-d3b7-4dba-9a23-4a669e8befab","Type":"ContainerDied","Data":"c7775d4bb37444fb448eb27a92eced75f6007b6be20d5b3024a6ea8ac1e30dbb"} Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.613165 4909 scope.go:117] "RemoveContainer" containerID="417da3d0d2862effeef38ca6593b7da364185a89d20bedcbaa9d708008bbb387" Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.613362 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84bcffb485-mkz9x" Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.622990 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-openstack-cell1\") pod \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\" (UID: \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\") " Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.623099 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-ovsdbserver-sb\") pod \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\" (UID: \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\") " Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.623129 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-dns-svc\") pod \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\" (UID: \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\") " Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.623166 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsf9h\" (UniqueName: \"kubernetes.io/projected/589e6fdd-d3b7-4dba-9a23-4a669e8befab-kube-api-access-gsf9h\") pod \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\" (UID: \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\") " Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.623474 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-config\") pod \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\" (UID: \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\") " Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.623570 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-ovsdbserver-nb\") pod \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\" (UID: \"589e6fdd-d3b7-4dba-9a23-4a669e8befab\") " Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.633487 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/589e6fdd-d3b7-4dba-9a23-4a669e8befab-kube-api-access-gsf9h" (OuterVolumeSpecName: "kube-api-access-gsf9h") pod "589e6fdd-d3b7-4dba-9a23-4a669e8befab" (UID: "589e6fdd-d3b7-4dba-9a23-4a669e8befab"). InnerVolumeSpecName "kube-api-access-gsf9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.653509 4909 scope.go:117] "RemoveContainer" containerID="1043c963e4c4b5aa2e91fcb0a91fd772ea2c1d426ded7227086f1b2a115d2863" Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.697260 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-config" (OuterVolumeSpecName: "config") pod "589e6fdd-d3b7-4dba-9a23-4a669e8befab" (UID: "589e6fdd-d3b7-4dba-9a23-4a669e8befab"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.698612 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "589e6fdd-d3b7-4dba-9a23-4a669e8befab" (UID: "589e6fdd-d3b7-4dba-9a23-4a669e8befab"). InnerVolumeSpecName "openstack-cell1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.703560 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "589e6fdd-d3b7-4dba-9a23-4a669e8befab" (UID: "589e6fdd-d3b7-4dba-9a23-4a669e8befab"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.706798 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "589e6fdd-d3b7-4dba-9a23-4a669e8befab" (UID: "589e6fdd-d3b7-4dba-9a23-4a669e8befab"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.723879 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "589e6fdd-d3b7-4dba-9a23-4a669e8befab" (UID: "589e6fdd-d3b7-4dba-9a23-4a669e8befab"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.732901 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.732937 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.732955 4909 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.732964 4909 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.732975 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsf9h\" (UniqueName: \"kubernetes.io/projected/589e6fdd-d3b7-4dba-9a23-4a669e8befab-kube-api-access-gsf9h\") on node \"crc\" DevicePath \"\"" Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.732986 4909 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/589e6fdd-d3b7-4dba-9a23-4a669e8befab-config\") on node \"crc\" DevicePath \"\"" Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.775647 4909 scope.go:117] "RemoveContainer" containerID="417da3d0d2862effeef38ca6593b7da364185a89d20bedcbaa9d708008bbb387" Feb 02 12:17:40 crc kubenswrapper[4909]: E0202 12:17:40.777103 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"417da3d0d2862effeef38ca6593b7da364185a89d20bedcbaa9d708008bbb387\": container with ID starting with 417da3d0d2862effeef38ca6593b7da364185a89d20bedcbaa9d708008bbb387 not found: ID does not exist" containerID="417da3d0d2862effeef38ca6593b7da364185a89d20bedcbaa9d708008bbb387" Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.777139 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"417da3d0d2862effeef38ca6593b7da364185a89d20bedcbaa9d708008bbb387"} err="failed to get container status \"417da3d0d2862effeef38ca6593b7da364185a89d20bedcbaa9d708008bbb387\": rpc error: code = NotFound desc = could not find container \"417da3d0d2862effeef38ca6593b7da364185a89d20bedcbaa9d708008bbb387\": container with ID starting with 417da3d0d2862effeef38ca6593b7da364185a89d20bedcbaa9d708008bbb387 not found: ID does not exist" Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.777168 4909 scope.go:117] "RemoveContainer" containerID="1043c963e4c4b5aa2e91fcb0a91fd772ea2c1d426ded7227086f1b2a115d2863" Feb 02 12:17:40 crc kubenswrapper[4909]: E0202 12:17:40.777466 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1043c963e4c4b5aa2e91fcb0a91fd772ea2c1d426ded7227086f1b2a115d2863\": container with ID starting with 1043c963e4c4b5aa2e91fcb0a91fd772ea2c1d426ded7227086f1b2a115d2863 not found: ID does not exist" containerID="1043c963e4c4b5aa2e91fcb0a91fd772ea2c1d426ded7227086f1b2a115d2863" Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.777499 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1043c963e4c4b5aa2e91fcb0a91fd772ea2c1d426ded7227086f1b2a115d2863"} err="failed to get container status \"1043c963e4c4b5aa2e91fcb0a91fd772ea2c1d426ded7227086f1b2a115d2863\": rpc error: code = NotFound desc = could not find container \"1043c963e4c4b5aa2e91fcb0a91fd772ea2c1d426ded7227086f1b2a115d2863\": 
container with ID starting with 1043c963e4c4b5aa2e91fcb0a91fd772ea2c1d426ded7227086f1b2a115d2863 not found: ID does not exist" Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.943062 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bcffb485-mkz9x"] Feb 02 12:17:40 crc kubenswrapper[4909]: I0202 12:17:40.951428 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84bcffb485-mkz9x"] Feb 02 12:17:41 crc kubenswrapper[4909]: I0202 12:17:41.049025 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="589e6fdd-d3b7-4dba-9a23-4a669e8befab" path="/var/lib/kubelet/pods/589e6fdd-d3b7-4dba-9a23-4a669e8befab/volumes" Feb 02 12:17:46 crc kubenswrapper[4909]: I0202 12:17:46.016539 4909 scope.go:117] "RemoveContainer" containerID="442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0" Feb 02 12:17:46 crc kubenswrapper[4909]: E0202 12:17:46.017321 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:17:49 crc kubenswrapper[4909]: I0202 12:17:49.034318 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-p47c9"] Feb 02 12:17:49 crc kubenswrapper[4909]: I0202 12:17:49.044969 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-p47c9"] Feb 02 12:17:49 crc kubenswrapper[4909]: I0202 12:17:49.288074 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8n998"] Feb 02 12:17:49 crc kubenswrapper[4909]: E0202 12:17:49.288540 4909 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="02dee8a0-2a8f-471d-b345-56620697930f" containerName="init" Feb 02 12:17:49 crc kubenswrapper[4909]: I0202 12:17:49.288556 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="02dee8a0-2a8f-471d-b345-56620697930f" containerName="init" Feb 02 12:17:49 crc kubenswrapper[4909]: E0202 12:17:49.288566 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02dee8a0-2a8f-471d-b345-56620697930f" containerName="dnsmasq-dns" Feb 02 12:17:49 crc kubenswrapper[4909]: I0202 12:17:49.288573 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="02dee8a0-2a8f-471d-b345-56620697930f" containerName="dnsmasq-dns" Feb 02 12:17:49 crc kubenswrapper[4909]: E0202 12:17:49.288590 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="589e6fdd-d3b7-4dba-9a23-4a669e8befab" containerName="dnsmasq-dns" Feb 02 12:17:49 crc kubenswrapper[4909]: I0202 12:17:49.288597 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="589e6fdd-d3b7-4dba-9a23-4a669e8befab" containerName="dnsmasq-dns" Feb 02 12:17:49 crc kubenswrapper[4909]: E0202 12:17:49.288616 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="589e6fdd-d3b7-4dba-9a23-4a669e8befab" containerName="init" Feb 02 12:17:49 crc kubenswrapper[4909]: I0202 12:17:49.288621 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="589e6fdd-d3b7-4dba-9a23-4a669e8befab" containerName="init" Feb 02 12:17:49 crc kubenswrapper[4909]: I0202 12:17:49.288795 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="589e6fdd-d3b7-4dba-9a23-4a669e8befab" containerName="dnsmasq-dns" Feb 02 12:17:49 crc kubenswrapper[4909]: I0202 12:17:49.288882 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="02dee8a0-2a8f-471d-b345-56620697930f" containerName="dnsmasq-dns" Feb 02 12:17:49 crc kubenswrapper[4909]: I0202 12:17:49.289598 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8n998" Feb 02 12:17:49 crc kubenswrapper[4909]: I0202 12:17:49.292197 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 02 12:17:49 crc kubenswrapper[4909]: I0202 12:17:49.293018 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-lfs7z" Feb 02 12:17:49 crc kubenswrapper[4909]: I0202 12:17:49.293199 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 02 12:17:49 crc kubenswrapper[4909]: I0202 12:17:49.293439 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 12:17:49 crc kubenswrapper[4909]: I0202 12:17:49.305909 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8n998"] Feb 02 12:17:49 crc kubenswrapper[4909]: I0202 12:17:49.321185 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/568f0ced-c6cd-4f8d-8b47-f8e093042a2f-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c8n998\" (UID: \"568f0ced-c6cd-4f8d-8b47-f8e093042a2f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8n998" Feb 02 12:17:49 crc kubenswrapper[4909]: I0202 12:17:49.321233 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hk52\" (UniqueName: \"kubernetes.io/projected/568f0ced-c6cd-4f8d-8b47-f8e093042a2f-kube-api-access-7hk52\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c8n998\" (UID: \"568f0ced-c6cd-4f8d-8b47-f8e093042a2f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8n998" Feb 02 
12:17:49 crc kubenswrapper[4909]: I0202 12:17:49.321420 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/568f0ced-c6cd-4f8d-8b47-f8e093042a2f-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c8n998\" (UID: \"568f0ced-c6cd-4f8d-8b47-f8e093042a2f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8n998" Feb 02 12:17:49 crc kubenswrapper[4909]: I0202 12:17:49.321515 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/568f0ced-c6cd-4f8d-8b47-f8e093042a2f-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c8n998\" (UID: \"568f0ced-c6cd-4f8d-8b47-f8e093042a2f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8n998" Feb 02 12:17:49 crc kubenswrapper[4909]: I0202 12:17:49.423159 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/568f0ced-c6cd-4f8d-8b47-f8e093042a2f-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c8n998\" (UID: \"568f0ced-c6cd-4f8d-8b47-f8e093042a2f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8n998" Feb 02 12:17:49 crc kubenswrapper[4909]: I0202 12:17:49.423273 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/568f0ced-c6cd-4f8d-8b47-f8e093042a2f-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c8n998\" (UID: \"568f0ced-c6cd-4f8d-8b47-f8e093042a2f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8n998" Feb 02 12:17:49 crc kubenswrapper[4909]: I0202 
12:17:49.423319 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/568f0ced-c6cd-4f8d-8b47-f8e093042a2f-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c8n998\" (UID: \"568f0ced-c6cd-4f8d-8b47-f8e093042a2f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8n998" Feb 02 12:17:49 crc kubenswrapper[4909]: I0202 12:17:49.423345 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hk52\" (UniqueName: \"kubernetes.io/projected/568f0ced-c6cd-4f8d-8b47-f8e093042a2f-kube-api-access-7hk52\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c8n998\" (UID: \"568f0ced-c6cd-4f8d-8b47-f8e093042a2f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8n998" Feb 02 12:17:49 crc kubenswrapper[4909]: I0202 12:17:49.429048 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/568f0ced-c6cd-4f8d-8b47-f8e093042a2f-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c8n998\" (UID: \"568f0ced-c6cd-4f8d-8b47-f8e093042a2f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8n998" Feb 02 12:17:49 crc kubenswrapper[4909]: I0202 12:17:49.429099 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/568f0ced-c6cd-4f8d-8b47-f8e093042a2f-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c8n998\" (UID: \"568f0ced-c6cd-4f8d-8b47-f8e093042a2f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8n998" Feb 02 12:17:49 crc kubenswrapper[4909]: I0202 12:17:49.429558 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/568f0ced-c6cd-4f8d-8b47-f8e093042a2f-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c8n998\" (UID: \"568f0ced-c6cd-4f8d-8b47-f8e093042a2f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8n998" Feb 02 12:17:49 crc kubenswrapper[4909]: I0202 12:17:49.438796 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hk52\" (UniqueName: \"kubernetes.io/projected/568f0ced-c6cd-4f8d-8b47-f8e093042a2f-kube-api-access-7hk52\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c8n998\" (UID: \"568f0ced-c6cd-4f8d-8b47-f8e093042a2f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8n998" Feb 02 12:17:49 crc kubenswrapper[4909]: I0202 12:17:49.609758 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8n998" Feb 02 12:17:50 crc kubenswrapper[4909]: I0202 12:17:50.225088 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8n998"] Feb 02 12:17:50 crc kubenswrapper[4909]: W0202 12:17:50.236839 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod568f0ced_c6cd_4f8d_8b47_f8e093042a2f.slice/crio-042c2c1b060b57b678a6a8f101773961de56abaac5875601e05d4be1dec94ba8 WatchSource:0}: Error finding container 042c2c1b060b57b678a6a8f101773961de56abaac5875601e05d4be1dec94ba8: Status 404 returned error can't find the container with id 042c2c1b060b57b678a6a8f101773961de56abaac5875601e05d4be1dec94ba8 Feb 02 12:17:50 crc kubenswrapper[4909]: I0202 12:17:50.716872 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8n998" 
event={"ID":"568f0ced-c6cd-4f8d-8b47-f8e093042a2f","Type":"ContainerStarted","Data":"042c2c1b060b57b678a6a8f101773961de56abaac5875601e05d4be1dec94ba8"} Feb 02 12:17:51 crc kubenswrapper[4909]: I0202 12:17:51.029195 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c373ff3-6de7-4d89-be4c-b79c826a5d05" path="/var/lib/kubelet/pods/1c373ff3-6de7-4d89-be4c-b79c826a5d05/volumes" Feb 02 12:17:59 crc kubenswrapper[4909]: I0202 12:17:59.817884 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8n998" event={"ID":"568f0ced-c6cd-4f8d-8b47-f8e093042a2f","Type":"ContainerStarted","Data":"5d10ae6ca4c02073bcfb92363e280b5fa4d08595247a2b8a73ee8d3b5298f016"} Feb 02 12:17:59 crc kubenswrapper[4909]: I0202 12:17:59.838592 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8n998" podStartSLOduration=1.5270839349999998 podStartE2EDuration="10.838576628s" podCreationTimestamp="2026-02-02 12:17:49 +0000 UTC" firstStartedPulling="2026-02-02 12:17:50.23959899 +0000 UTC m=+6395.985699725" lastFinishedPulling="2026-02-02 12:17:59.551091683 +0000 UTC m=+6405.297192418" observedRunningTime="2026-02-02 12:17:59.836442478 +0000 UTC m=+6405.582543213" watchObservedRunningTime="2026-02-02 12:17:59.838576628 +0000 UTC m=+6405.584677363" Feb 02 12:18:01 crc kubenswrapper[4909]: I0202 12:18:01.017528 4909 scope.go:117] "RemoveContainer" containerID="442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0" Feb 02 12:18:01 crc kubenswrapper[4909]: E0202 12:18:01.018261 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:18:12 crc kubenswrapper[4909]: I0202 12:18:12.936126 4909 generic.go:334] "Generic (PLEG): container finished" podID="568f0ced-c6cd-4f8d-8b47-f8e093042a2f" containerID="5d10ae6ca4c02073bcfb92363e280b5fa4d08595247a2b8a73ee8d3b5298f016" exitCode=0 Feb 02 12:18:12 crc kubenswrapper[4909]: I0202 12:18:12.936210 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8n998" event={"ID":"568f0ced-c6cd-4f8d-8b47-f8e093042a2f","Type":"ContainerDied","Data":"5d10ae6ca4c02073bcfb92363e280b5fa4d08595247a2b8a73ee8d3b5298f016"} Feb 02 12:18:14 crc kubenswrapper[4909]: I0202 12:18:14.017018 4909 scope.go:117] "RemoveContainer" containerID="442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0" Feb 02 12:18:14 crc kubenswrapper[4909]: E0202 12:18:14.017663 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:18:14 crc kubenswrapper[4909]: I0202 12:18:14.433766 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8n998" Feb 02 12:18:14 crc kubenswrapper[4909]: I0202 12:18:14.513579 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/568f0ced-c6cd-4f8d-8b47-f8e093042a2f-pre-adoption-validation-combined-ca-bundle\") pod \"568f0ced-c6cd-4f8d-8b47-f8e093042a2f\" (UID: \"568f0ced-c6cd-4f8d-8b47-f8e093042a2f\") " Feb 02 12:18:14 crc kubenswrapper[4909]: I0202 12:18:14.513749 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/568f0ced-c6cd-4f8d-8b47-f8e093042a2f-ssh-key-openstack-cell1\") pod \"568f0ced-c6cd-4f8d-8b47-f8e093042a2f\" (UID: \"568f0ced-c6cd-4f8d-8b47-f8e093042a2f\") " Feb 02 12:18:14 crc kubenswrapper[4909]: I0202 12:18:14.513843 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hk52\" (UniqueName: \"kubernetes.io/projected/568f0ced-c6cd-4f8d-8b47-f8e093042a2f-kube-api-access-7hk52\") pod \"568f0ced-c6cd-4f8d-8b47-f8e093042a2f\" (UID: \"568f0ced-c6cd-4f8d-8b47-f8e093042a2f\") " Feb 02 12:18:14 crc kubenswrapper[4909]: I0202 12:18:14.513879 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/568f0ced-c6cd-4f8d-8b47-f8e093042a2f-inventory\") pod \"568f0ced-c6cd-4f8d-8b47-f8e093042a2f\" (UID: \"568f0ced-c6cd-4f8d-8b47-f8e093042a2f\") " Feb 02 12:18:14 crc kubenswrapper[4909]: I0202 12:18:14.520029 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/568f0ced-c6cd-4f8d-8b47-f8e093042a2f-kube-api-access-7hk52" (OuterVolumeSpecName: "kube-api-access-7hk52") pod "568f0ced-c6cd-4f8d-8b47-f8e093042a2f" (UID: "568f0ced-c6cd-4f8d-8b47-f8e093042a2f"). InnerVolumeSpecName "kube-api-access-7hk52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:18:14 crc kubenswrapper[4909]: I0202 12:18:14.520156 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/568f0ced-c6cd-4f8d-8b47-f8e093042a2f-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "568f0ced-c6cd-4f8d-8b47-f8e093042a2f" (UID: "568f0ced-c6cd-4f8d-8b47-f8e093042a2f"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:18:14 crc kubenswrapper[4909]: I0202 12:18:14.543482 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/568f0ced-c6cd-4f8d-8b47-f8e093042a2f-inventory" (OuterVolumeSpecName: "inventory") pod "568f0ced-c6cd-4f8d-8b47-f8e093042a2f" (UID: "568f0ced-c6cd-4f8d-8b47-f8e093042a2f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:18:14 crc kubenswrapper[4909]: I0202 12:18:14.547348 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/568f0ced-c6cd-4f8d-8b47-f8e093042a2f-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "568f0ced-c6cd-4f8d-8b47-f8e093042a2f" (UID: "568f0ced-c6cd-4f8d-8b47-f8e093042a2f"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:18:14 crc kubenswrapper[4909]: I0202 12:18:14.616624 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hk52\" (UniqueName: \"kubernetes.io/projected/568f0ced-c6cd-4f8d-8b47-f8e093042a2f-kube-api-access-7hk52\") on node \"crc\" DevicePath \"\"" Feb 02 12:18:14 crc kubenswrapper[4909]: I0202 12:18:14.616659 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/568f0ced-c6cd-4f8d-8b47-f8e093042a2f-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 12:18:14 crc kubenswrapper[4909]: I0202 12:18:14.616673 4909 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/568f0ced-c6cd-4f8d-8b47-f8e093042a2f-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:18:14 crc kubenswrapper[4909]: I0202 12:18:14.616687 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/568f0ced-c6cd-4f8d-8b47-f8e093042a2f-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 02 12:18:14 crc kubenswrapper[4909]: I0202 12:18:14.964193 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8n998" event={"ID":"568f0ced-c6cd-4f8d-8b47-f8e093042a2f","Type":"ContainerDied","Data":"042c2c1b060b57b678a6a8f101773961de56abaac5875601e05d4be1dec94ba8"} Feb 02 12:18:14 crc kubenswrapper[4909]: I0202 12:18:14.964246 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="042c2c1b060b57b678a6a8f101773961de56abaac5875601e05d4be1dec94ba8" Feb 02 12:18:14 crc kubenswrapper[4909]: I0202 12:18:14.964269 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8n998" Feb 02 12:18:21 crc kubenswrapper[4909]: I0202 12:18:21.699208 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst"] Feb 02 12:18:21 crc kubenswrapper[4909]: E0202 12:18:21.700505 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="568f0ced-c6cd-4f8d-8b47-f8e093042a2f" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 02 12:18:21 crc kubenswrapper[4909]: I0202 12:18:21.700522 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="568f0ced-c6cd-4f8d-8b47-f8e093042a2f" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 02 12:18:21 crc kubenswrapper[4909]: I0202 12:18:21.700747 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="568f0ced-c6cd-4f8d-8b47-f8e093042a2f" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 02 12:18:21 crc kubenswrapper[4909]: I0202 12:18:21.701729 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst" Feb 02 12:18:21 crc kubenswrapper[4909]: I0202 12:18:21.704771 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 12:18:21 crc kubenswrapper[4909]: I0202 12:18:21.704800 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-lfs7z" Feb 02 12:18:21 crc kubenswrapper[4909]: I0202 12:18:21.705117 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 02 12:18:21 crc kubenswrapper[4909]: I0202 12:18:21.708134 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 02 12:18:21 crc kubenswrapper[4909]: I0202 12:18:21.710954 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst"] Feb 02 12:18:21 crc kubenswrapper[4909]: I0202 12:18:21.771908 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8db189d-64bd-4a95-93de-3ddcb680c6b0-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst\" (UID: \"f8db189d-64bd-4a95-93de-3ddcb680c6b0\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst" Feb 02 12:18:21 crc kubenswrapper[4909]: I0202 12:18:21.772226 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvbhr\" (UniqueName: \"kubernetes.io/projected/f8db189d-64bd-4a95-93de-3ddcb680c6b0-kube-api-access-wvbhr\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst\" (UID: \"f8db189d-64bd-4a95-93de-3ddcb680c6b0\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst" Feb 02 12:18:21 crc kubenswrapper[4909]: I0202 12:18:21.772425 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f8db189d-64bd-4a95-93de-3ddcb680c6b0-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst\" (UID: \"f8db189d-64bd-4a95-93de-3ddcb680c6b0\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst" Feb 02 12:18:21 crc kubenswrapper[4909]: I0202 12:18:21.772511 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8db189d-64bd-4a95-93de-3ddcb680c6b0-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst\" (UID: \"f8db189d-64bd-4a95-93de-3ddcb680c6b0\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst" Feb 02 12:18:21 crc kubenswrapper[4909]: I0202 12:18:21.874000 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8db189d-64bd-4a95-93de-3ddcb680c6b0-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst\" (UID: \"f8db189d-64bd-4a95-93de-3ddcb680c6b0\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst" Feb 02 12:18:21 crc kubenswrapper[4909]: I0202 12:18:21.874051 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvbhr\" (UniqueName: \"kubernetes.io/projected/f8db189d-64bd-4a95-93de-3ddcb680c6b0-kube-api-access-wvbhr\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst\" (UID: \"f8db189d-64bd-4a95-93de-3ddcb680c6b0\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst" Feb 02 12:18:21 crc kubenswrapper[4909]: I0202 12:18:21.874135 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/f8db189d-64bd-4a95-93de-3ddcb680c6b0-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst\" (UID: \"f8db189d-64bd-4a95-93de-3ddcb680c6b0\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst" Feb 02 12:18:21 crc kubenswrapper[4909]: I0202 12:18:21.874181 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8db189d-64bd-4a95-93de-3ddcb680c6b0-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst\" (UID: \"f8db189d-64bd-4a95-93de-3ddcb680c6b0\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst" Feb 02 12:18:21 crc kubenswrapper[4909]: I0202 12:18:21.880579 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8db189d-64bd-4a95-93de-3ddcb680c6b0-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst\" (UID: \"f8db189d-64bd-4a95-93de-3ddcb680c6b0\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst" Feb 02 12:18:21 crc kubenswrapper[4909]: I0202 12:18:21.888500 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f8db189d-64bd-4a95-93de-3ddcb680c6b0-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst\" (UID: \"f8db189d-64bd-4a95-93de-3ddcb680c6b0\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst" Feb 02 12:18:21 crc kubenswrapper[4909]: I0202 12:18:21.893643 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8db189d-64bd-4a95-93de-3ddcb680c6b0-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst\" (UID: 
\"f8db189d-64bd-4a95-93de-3ddcb680c6b0\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst" Feb 02 12:18:21 crc kubenswrapper[4909]: I0202 12:18:21.894647 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvbhr\" (UniqueName: \"kubernetes.io/projected/f8db189d-64bd-4a95-93de-3ddcb680c6b0-kube-api-access-wvbhr\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst\" (UID: \"f8db189d-64bd-4a95-93de-3ddcb680c6b0\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst" Feb 02 12:18:22 crc kubenswrapper[4909]: I0202 12:18:22.031429 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst" Feb 02 12:18:22 crc kubenswrapper[4909]: I0202 12:18:22.582333 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst"] Feb 02 12:18:23 crc kubenswrapper[4909]: I0202 12:18:23.040292 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst" event={"ID":"f8db189d-64bd-4a95-93de-3ddcb680c6b0","Type":"ContainerStarted","Data":"a70e1c9fb088b4aba2ea25be060bb664960bb91b84f4e7495d367ec309768f38"} Feb 02 12:18:24 crc kubenswrapper[4909]: I0202 12:18:24.054180 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst" event={"ID":"f8db189d-64bd-4a95-93de-3ddcb680c6b0","Type":"ContainerStarted","Data":"83c8557280e43b9707cd6115dfa90f608011fd0cd1322fc421d9e991f8f70f9b"} Feb 02 12:18:24 crc kubenswrapper[4909]: I0202 12:18:24.080676 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst" podStartSLOduration=2.631738413 podStartE2EDuration="3.080659108s" podCreationTimestamp="2026-02-02 12:18:21 +0000 UTC" firstStartedPulling="2026-02-02 
12:18:22.588971601 +0000 UTC m=+6428.335072336" lastFinishedPulling="2026-02-02 12:18:23.037892286 +0000 UTC m=+6428.783993031" observedRunningTime="2026-02-02 12:18:24.074339119 +0000 UTC m=+6429.820439854" watchObservedRunningTime="2026-02-02 12:18:24.080659108 +0000 UTC m=+6429.826759843" Feb 02 12:18:27 crc kubenswrapper[4909]: I0202 12:18:27.018911 4909 scope.go:117] "RemoveContainer" containerID="442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0" Feb 02 12:18:27 crc kubenswrapper[4909]: E0202 12:18:27.019695 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:18:29 crc kubenswrapper[4909]: I0202 12:18:29.848637 4909 scope.go:117] "RemoveContainer" containerID="f8e95fbdf18d135b654dd615e0aa94162c15ef0a4cc4e312b5c15941e23926bc" Feb 02 12:18:29 crc kubenswrapper[4909]: I0202 12:18:29.890489 4909 scope.go:117] "RemoveContainer" containerID="96e735086c0c6df7af6c8606f7e182a4ab467f3a4db3f6f794b92cca05873bf7" Feb 02 12:18:30 crc kubenswrapper[4909]: I0202 12:18:30.090440 4909 scope.go:117] "RemoveContainer" containerID="be1161d05bde089eede341d85c3b89f4a7814c7514ec7035c999b85395228059" Feb 02 12:18:30 crc kubenswrapper[4909]: I0202 12:18:30.186975 4909 scope.go:117] "RemoveContainer" containerID="bf69fd00c80753a26faf7b00cc5e277be0917380cad15a3769e68b8edc6c98dd" Feb 02 12:18:30 crc kubenswrapper[4909]: I0202 12:18:30.238903 4909 scope.go:117] "RemoveContainer" containerID="15f89a865819b640d7ca1377e3b4971b1562d430744d75a274e1ecfd98268c27" Feb 02 12:18:39 crc kubenswrapper[4909]: I0202 12:18:39.016925 4909 scope.go:117] "RemoveContainer" 
containerID="442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0" Feb 02 12:18:39 crc kubenswrapper[4909]: E0202 12:18:39.017633 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:18:52 crc kubenswrapper[4909]: I0202 12:18:52.018796 4909 scope.go:117] "RemoveContainer" containerID="442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0" Feb 02 12:18:52 crc kubenswrapper[4909]: E0202 12:18:52.020342 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:19:04 crc kubenswrapper[4909]: I0202 12:19:04.016648 4909 scope.go:117] "RemoveContainer" containerID="442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0" Feb 02 12:19:04 crc kubenswrapper[4909]: E0202 12:19:04.017428 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:19:19 crc kubenswrapper[4909]: I0202 12:19:19.017653 4909 scope.go:117] 
"RemoveContainer" containerID="442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0" Feb 02 12:19:19 crc kubenswrapper[4909]: E0202 12:19:19.018698 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:19:24 crc kubenswrapper[4909]: I0202 12:19:24.207597 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bs887"] Feb 02 12:19:24 crc kubenswrapper[4909]: I0202 12:19:24.210247 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bs887" Feb 02 12:19:24 crc kubenswrapper[4909]: I0202 12:19:24.226093 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bs887"] Feb 02 12:19:24 crc kubenswrapper[4909]: I0202 12:19:24.382959 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bc45033-8deb-43b6-bfab-c5ad19d960ec-utilities\") pod \"redhat-operators-bs887\" (UID: \"6bc45033-8deb-43b6-bfab-c5ad19d960ec\") " pod="openshift-marketplace/redhat-operators-bs887" Feb 02 12:19:24 crc kubenswrapper[4909]: I0202 12:19:24.383065 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bc45033-8deb-43b6-bfab-c5ad19d960ec-catalog-content\") pod \"redhat-operators-bs887\" (UID: \"6bc45033-8deb-43b6-bfab-c5ad19d960ec\") " pod="openshift-marketplace/redhat-operators-bs887" Feb 02 12:19:24 crc kubenswrapper[4909]: I0202 12:19:24.383199 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdk8b\" (UniqueName: \"kubernetes.io/projected/6bc45033-8deb-43b6-bfab-c5ad19d960ec-kube-api-access-cdk8b\") pod \"redhat-operators-bs887\" (UID: \"6bc45033-8deb-43b6-bfab-c5ad19d960ec\") " pod="openshift-marketplace/redhat-operators-bs887" Feb 02 12:19:24 crc kubenswrapper[4909]: I0202 12:19:24.485107 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdk8b\" (UniqueName: \"kubernetes.io/projected/6bc45033-8deb-43b6-bfab-c5ad19d960ec-kube-api-access-cdk8b\") pod \"redhat-operators-bs887\" (UID: \"6bc45033-8deb-43b6-bfab-c5ad19d960ec\") " pod="openshift-marketplace/redhat-operators-bs887" Feb 02 12:19:24 crc kubenswrapper[4909]: I0202 12:19:24.485256 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bc45033-8deb-43b6-bfab-c5ad19d960ec-utilities\") pod \"redhat-operators-bs887\" (UID: \"6bc45033-8deb-43b6-bfab-c5ad19d960ec\") " pod="openshift-marketplace/redhat-operators-bs887" Feb 02 12:19:24 crc kubenswrapper[4909]: I0202 12:19:24.485321 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bc45033-8deb-43b6-bfab-c5ad19d960ec-catalog-content\") pod \"redhat-operators-bs887\" (UID: \"6bc45033-8deb-43b6-bfab-c5ad19d960ec\") " pod="openshift-marketplace/redhat-operators-bs887" Feb 02 12:19:24 crc kubenswrapper[4909]: I0202 12:19:24.485778 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bc45033-8deb-43b6-bfab-c5ad19d960ec-catalog-content\") pod \"redhat-operators-bs887\" (UID: \"6bc45033-8deb-43b6-bfab-c5ad19d960ec\") " pod="openshift-marketplace/redhat-operators-bs887" Feb 02 12:19:24 crc kubenswrapper[4909]: I0202 12:19:24.486432 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bc45033-8deb-43b6-bfab-c5ad19d960ec-utilities\") pod \"redhat-operators-bs887\" (UID: \"6bc45033-8deb-43b6-bfab-c5ad19d960ec\") " pod="openshift-marketplace/redhat-operators-bs887" Feb 02 12:19:24 crc kubenswrapper[4909]: I0202 12:19:24.507837 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdk8b\" (UniqueName: \"kubernetes.io/projected/6bc45033-8deb-43b6-bfab-c5ad19d960ec-kube-api-access-cdk8b\") pod \"redhat-operators-bs887\" (UID: \"6bc45033-8deb-43b6-bfab-c5ad19d960ec\") " pod="openshift-marketplace/redhat-operators-bs887" Feb 02 12:19:24 crc kubenswrapper[4909]: I0202 12:19:24.541761 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bs887" Feb 02 12:19:25 crc kubenswrapper[4909]: I0202 12:19:25.077088 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bs887"] Feb 02 12:19:25 crc kubenswrapper[4909]: I0202 12:19:25.616655 4909 generic.go:334] "Generic (PLEG): container finished" podID="6bc45033-8deb-43b6-bfab-c5ad19d960ec" containerID="b0b931173e9907848f43edcab10bc1fd05c6316d73ceee61e02ae0654a71bf74" exitCode=0 Feb 02 12:19:25 crc kubenswrapper[4909]: I0202 12:19:25.616730 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs887" event={"ID":"6bc45033-8deb-43b6-bfab-c5ad19d960ec","Type":"ContainerDied","Data":"b0b931173e9907848f43edcab10bc1fd05c6316d73ceee61e02ae0654a71bf74"} Feb 02 12:19:25 crc kubenswrapper[4909]: I0202 12:19:25.616989 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs887" event={"ID":"6bc45033-8deb-43b6-bfab-c5ad19d960ec","Type":"ContainerStarted","Data":"9c17eb18dbfac732a73d94470dcfb9c5574eafeaed642f32c32d1db5f296748f"} Feb 02 12:19:26 crc kubenswrapper[4909]: I0202 
12:19:26.627961 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs887" event={"ID":"6bc45033-8deb-43b6-bfab-c5ad19d960ec","Type":"ContainerStarted","Data":"b892265441a1de765027fd488335add4274836d04e067f1916f27c64837cc1c7"} Feb 02 12:19:30 crc kubenswrapper[4909]: I0202 12:19:30.016457 4909 scope.go:117] "RemoveContainer" containerID="442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0" Feb 02 12:19:30 crc kubenswrapper[4909]: E0202 12:19:30.017232 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:19:31 crc kubenswrapper[4909]: I0202 12:19:31.674987 4909 generic.go:334] "Generic (PLEG): container finished" podID="6bc45033-8deb-43b6-bfab-c5ad19d960ec" containerID="b892265441a1de765027fd488335add4274836d04e067f1916f27c64837cc1c7" exitCode=0 Feb 02 12:19:31 crc kubenswrapper[4909]: I0202 12:19:31.675095 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs887" event={"ID":"6bc45033-8deb-43b6-bfab-c5ad19d960ec","Type":"ContainerDied","Data":"b892265441a1de765027fd488335add4274836d04e067f1916f27c64837cc1c7"} Feb 02 12:19:32 crc kubenswrapper[4909]: I0202 12:19:32.687367 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs887" event={"ID":"6bc45033-8deb-43b6-bfab-c5ad19d960ec","Type":"ContainerStarted","Data":"1258ad6b2d474fa199d494174fb859bf32fe7f5a600356f624b75f696014faec"} Feb 02 12:19:32 crc kubenswrapper[4909]: I0202 12:19:32.712128 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-bs887" podStartSLOduration=2.184010173 podStartE2EDuration="8.712109935s" podCreationTimestamp="2026-02-02 12:19:24 +0000 UTC" firstStartedPulling="2026-02-02 12:19:25.618265404 +0000 UTC m=+6491.364366139" lastFinishedPulling="2026-02-02 12:19:32.146365156 +0000 UTC m=+6497.892465901" observedRunningTime="2026-02-02 12:19:32.709371397 +0000 UTC m=+6498.455472132" watchObservedRunningTime="2026-02-02 12:19:32.712109935 +0000 UTC m=+6498.458210670" Feb 02 12:19:34 crc kubenswrapper[4909]: I0202 12:19:34.047879 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-8brcl"] Feb 02 12:19:34 crc kubenswrapper[4909]: I0202 12:19:34.057392 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-8brcl"] Feb 02 12:19:34 crc kubenswrapper[4909]: I0202 12:19:34.542944 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bs887" Feb 02 12:19:34 crc kubenswrapper[4909]: I0202 12:19:34.543019 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bs887" Feb 02 12:19:35 crc kubenswrapper[4909]: I0202 12:19:35.030479 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="138ed7b8-dd8c-43f5-a928-092f3d8dd670" path="/var/lib/kubelet/pods/138ed7b8-dd8c-43f5-a928-092f3d8dd670/volumes" Feb 02 12:19:35 crc kubenswrapper[4909]: I0202 12:19:35.038411 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-f673-account-create-update-hn76l"] Feb 02 12:19:35 crc kubenswrapper[4909]: I0202 12:19:35.049904 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-f673-account-create-update-hn76l"] Feb 02 12:19:35 crc kubenswrapper[4909]: I0202 12:19:35.590990 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bs887" 
podUID="6bc45033-8deb-43b6-bfab-c5ad19d960ec" containerName="registry-server" probeResult="failure" output=< Feb 02 12:19:35 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Feb 02 12:19:35 crc kubenswrapper[4909]: > Feb 02 12:19:37 crc kubenswrapper[4909]: I0202 12:19:37.039404 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5683dbc-e166-419a-97a4-6ae32908deff" path="/var/lib/kubelet/pods/e5683dbc-e166-419a-97a4-6ae32908deff/volumes" Feb 02 12:19:40 crc kubenswrapper[4909]: I0202 12:19:40.032790 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-cv7jt"] Feb 02 12:19:40 crc kubenswrapper[4909]: I0202 12:19:40.043332 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-cv7jt"] Feb 02 12:19:41 crc kubenswrapper[4909]: I0202 12:19:41.030660 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d48e572-d205-42b6-87ba-232cea9b3d35" path="/var/lib/kubelet/pods/8d48e572-d205-42b6-87ba-232cea9b3d35/volumes" Feb 02 12:19:41 crc kubenswrapper[4909]: I0202 12:19:41.031530 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-6b5e-account-create-update-dgmlx"] Feb 02 12:19:41 crc kubenswrapper[4909]: I0202 12:19:41.041402 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-6b5e-account-create-update-dgmlx"] Feb 02 12:19:42 crc kubenswrapper[4909]: I0202 12:19:42.016899 4909 scope.go:117] "RemoveContainer" containerID="442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0" Feb 02 12:19:42 crc kubenswrapper[4909]: E0202 12:19:42.017864 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:19:43 crc kubenswrapper[4909]: I0202 12:19:43.028864 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc5c8186-caff-40d7-9273-5cbb5864bf99" path="/var/lib/kubelet/pods/cc5c8186-caff-40d7-9273-5cbb5864bf99/volumes" Feb 02 12:19:45 crc kubenswrapper[4909]: I0202 12:19:45.587107 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bs887" podUID="6bc45033-8deb-43b6-bfab-c5ad19d960ec" containerName="registry-server" probeResult="failure" output=< Feb 02 12:19:45 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Feb 02 12:19:45 crc kubenswrapper[4909]: > Feb 02 12:19:54 crc kubenswrapper[4909]: I0202 12:19:54.016277 4909 scope.go:117] "RemoveContainer" containerID="442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0" Feb 02 12:19:54 crc kubenswrapper[4909]: E0202 12:19:54.018320 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:19:55 crc kubenswrapper[4909]: I0202 12:19:55.590597 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bs887" podUID="6bc45033-8deb-43b6-bfab-c5ad19d960ec" containerName="registry-server" probeResult="failure" output=< Feb 02 12:19:55 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Feb 02 12:19:55 crc kubenswrapper[4909]: > Feb 02 12:20:04 crc kubenswrapper[4909]: I0202 12:20:04.593467 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-bs887" Feb 02 12:20:04 crc kubenswrapper[4909]: I0202 12:20:04.649351 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bs887" Feb 02 12:20:04 crc kubenswrapper[4909]: I0202 12:20:04.833340 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bs887"] Feb 02 12:20:06 crc kubenswrapper[4909]: I0202 12:20:06.000189 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bs887" podUID="6bc45033-8deb-43b6-bfab-c5ad19d960ec" containerName="registry-server" containerID="cri-o://1258ad6b2d474fa199d494174fb859bf32fe7f5a600356f624b75f696014faec" gracePeriod=2 Feb 02 12:20:06 crc kubenswrapper[4909]: I0202 12:20:06.499137 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bs887" Feb 02 12:20:06 crc kubenswrapper[4909]: I0202 12:20:06.522426 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdk8b\" (UniqueName: \"kubernetes.io/projected/6bc45033-8deb-43b6-bfab-c5ad19d960ec-kube-api-access-cdk8b\") pod \"6bc45033-8deb-43b6-bfab-c5ad19d960ec\" (UID: \"6bc45033-8deb-43b6-bfab-c5ad19d960ec\") " Feb 02 12:20:06 crc kubenswrapper[4909]: I0202 12:20:06.522506 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bc45033-8deb-43b6-bfab-c5ad19d960ec-catalog-content\") pod \"6bc45033-8deb-43b6-bfab-c5ad19d960ec\" (UID: \"6bc45033-8deb-43b6-bfab-c5ad19d960ec\") " Feb 02 12:20:06 crc kubenswrapper[4909]: I0202 12:20:06.522547 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bc45033-8deb-43b6-bfab-c5ad19d960ec-utilities\") pod 
\"6bc45033-8deb-43b6-bfab-c5ad19d960ec\" (UID: \"6bc45033-8deb-43b6-bfab-c5ad19d960ec\") " Feb 02 12:20:06 crc kubenswrapper[4909]: I0202 12:20:06.523855 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bc45033-8deb-43b6-bfab-c5ad19d960ec-utilities" (OuterVolumeSpecName: "utilities") pod "6bc45033-8deb-43b6-bfab-c5ad19d960ec" (UID: "6bc45033-8deb-43b6-bfab-c5ad19d960ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:20:06 crc kubenswrapper[4909]: I0202 12:20:06.533004 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bc45033-8deb-43b6-bfab-c5ad19d960ec-kube-api-access-cdk8b" (OuterVolumeSpecName: "kube-api-access-cdk8b") pod "6bc45033-8deb-43b6-bfab-c5ad19d960ec" (UID: "6bc45033-8deb-43b6-bfab-c5ad19d960ec"). InnerVolumeSpecName "kube-api-access-cdk8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:20:06 crc kubenswrapper[4909]: I0202 12:20:06.624556 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdk8b\" (UniqueName: \"kubernetes.io/projected/6bc45033-8deb-43b6-bfab-c5ad19d960ec-kube-api-access-cdk8b\") on node \"crc\" DevicePath \"\"" Feb 02 12:20:06 crc kubenswrapper[4909]: I0202 12:20:06.624599 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6bc45033-8deb-43b6-bfab-c5ad19d960ec-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 12:20:06 crc kubenswrapper[4909]: I0202 12:20:06.648181 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bc45033-8deb-43b6-bfab-c5ad19d960ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6bc45033-8deb-43b6-bfab-c5ad19d960ec" (UID: "6bc45033-8deb-43b6-bfab-c5ad19d960ec"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:20:06 crc kubenswrapper[4909]: I0202 12:20:06.728369 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6bc45033-8deb-43b6-bfab-c5ad19d960ec-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 12:20:07 crc kubenswrapper[4909]: I0202 12:20:07.011275 4909 generic.go:334] "Generic (PLEG): container finished" podID="6bc45033-8deb-43b6-bfab-c5ad19d960ec" containerID="1258ad6b2d474fa199d494174fb859bf32fe7f5a600356f624b75f696014faec" exitCode=0 Feb 02 12:20:07 crc kubenswrapper[4909]: I0202 12:20:07.011352 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bs887" Feb 02 12:20:07 crc kubenswrapper[4909]: I0202 12:20:07.011348 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs887" event={"ID":"6bc45033-8deb-43b6-bfab-c5ad19d960ec","Type":"ContainerDied","Data":"1258ad6b2d474fa199d494174fb859bf32fe7f5a600356f624b75f696014faec"} Feb 02 12:20:07 crc kubenswrapper[4909]: I0202 12:20:07.011762 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs887" event={"ID":"6bc45033-8deb-43b6-bfab-c5ad19d960ec","Type":"ContainerDied","Data":"9c17eb18dbfac732a73d94470dcfb9c5574eafeaed642f32c32d1db5f296748f"} Feb 02 12:20:07 crc kubenswrapper[4909]: I0202 12:20:07.011787 4909 scope.go:117] "RemoveContainer" containerID="1258ad6b2d474fa199d494174fb859bf32fe7f5a600356f624b75f696014faec" Feb 02 12:20:07 crc kubenswrapper[4909]: I0202 12:20:07.016793 4909 scope.go:117] "RemoveContainer" containerID="442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0" Feb 02 12:20:07 crc kubenswrapper[4909]: E0202 12:20:07.017208 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:20:07 crc kubenswrapper[4909]: I0202 12:20:07.043347 4909 scope.go:117] "RemoveContainer" containerID="b892265441a1de765027fd488335add4274836d04e067f1916f27c64837cc1c7" Feb 02 12:20:07 crc kubenswrapper[4909]: I0202 12:20:07.048647 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bs887"] Feb 02 12:20:07 crc kubenswrapper[4909]: I0202 12:20:07.062675 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bs887"] Feb 02 12:20:07 crc kubenswrapper[4909]: I0202 12:20:07.071702 4909 scope.go:117] "RemoveContainer" containerID="b0b931173e9907848f43edcab10bc1fd05c6316d73ceee61e02ae0654a71bf74" Feb 02 12:20:07 crc kubenswrapper[4909]: I0202 12:20:07.114136 4909 scope.go:117] "RemoveContainer" containerID="1258ad6b2d474fa199d494174fb859bf32fe7f5a600356f624b75f696014faec" Feb 02 12:20:07 crc kubenswrapper[4909]: E0202 12:20:07.114446 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1258ad6b2d474fa199d494174fb859bf32fe7f5a600356f624b75f696014faec\": container with ID starting with 1258ad6b2d474fa199d494174fb859bf32fe7f5a600356f624b75f696014faec not found: ID does not exist" containerID="1258ad6b2d474fa199d494174fb859bf32fe7f5a600356f624b75f696014faec" Feb 02 12:20:07 crc kubenswrapper[4909]: I0202 12:20:07.114486 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1258ad6b2d474fa199d494174fb859bf32fe7f5a600356f624b75f696014faec"} err="failed to get container status \"1258ad6b2d474fa199d494174fb859bf32fe7f5a600356f624b75f696014faec\": rpc error: code = NotFound desc = could not find 
container \"1258ad6b2d474fa199d494174fb859bf32fe7f5a600356f624b75f696014faec\": container with ID starting with 1258ad6b2d474fa199d494174fb859bf32fe7f5a600356f624b75f696014faec not found: ID does not exist" Feb 02 12:20:07 crc kubenswrapper[4909]: I0202 12:20:07.114521 4909 scope.go:117] "RemoveContainer" containerID="b892265441a1de765027fd488335add4274836d04e067f1916f27c64837cc1c7" Feb 02 12:20:07 crc kubenswrapper[4909]: E0202 12:20:07.116463 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b892265441a1de765027fd488335add4274836d04e067f1916f27c64837cc1c7\": container with ID starting with b892265441a1de765027fd488335add4274836d04e067f1916f27c64837cc1c7 not found: ID does not exist" containerID="b892265441a1de765027fd488335add4274836d04e067f1916f27c64837cc1c7" Feb 02 12:20:07 crc kubenswrapper[4909]: I0202 12:20:07.116489 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b892265441a1de765027fd488335add4274836d04e067f1916f27c64837cc1c7"} err="failed to get container status \"b892265441a1de765027fd488335add4274836d04e067f1916f27c64837cc1c7\": rpc error: code = NotFound desc = could not find container \"b892265441a1de765027fd488335add4274836d04e067f1916f27c64837cc1c7\": container with ID starting with b892265441a1de765027fd488335add4274836d04e067f1916f27c64837cc1c7 not found: ID does not exist" Feb 02 12:20:07 crc kubenswrapper[4909]: I0202 12:20:07.116506 4909 scope.go:117] "RemoveContainer" containerID="b0b931173e9907848f43edcab10bc1fd05c6316d73ceee61e02ae0654a71bf74" Feb 02 12:20:07 crc kubenswrapper[4909]: E0202 12:20:07.116923 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0b931173e9907848f43edcab10bc1fd05c6316d73ceee61e02ae0654a71bf74\": container with ID starting with b0b931173e9907848f43edcab10bc1fd05c6316d73ceee61e02ae0654a71bf74 not found: ID does 
not exist" containerID="b0b931173e9907848f43edcab10bc1fd05c6316d73ceee61e02ae0654a71bf74" Feb 02 12:20:07 crc kubenswrapper[4909]: I0202 12:20:07.116972 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0b931173e9907848f43edcab10bc1fd05c6316d73ceee61e02ae0654a71bf74"} err="failed to get container status \"b0b931173e9907848f43edcab10bc1fd05c6316d73ceee61e02ae0654a71bf74\": rpc error: code = NotFound desc = could not find container \"b0b931173e9907848f43edcab10bc1fd05c6316d73ceee61e02ae0654a71bf74\": container with ID starting with b0b931173e9907848f43edcab10bc1fd05c6316d73ceee61e02ae0654a71bf74 not found: ID does not exist" Feb 02 12:20:09 crc kubenswrapper[4909]: I0202 12:20:09.036536 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bc45033-8deb-43b6-bfab-c5ad19d960ec" path="/var/lib/kubelet/pods/6bc45033-8deb-43b6-bfab-c5ad19d960ec/volumes" Feb 02 12:20:12 crc kubenswrapper[4909]: I0202 12:20:12.040602 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h7kb5"] Feb 02 12:20:12 crc kubenswrapper[4909]: E0202 12:20:12.041280 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc45033-8deb-43b6-bfab-c5ad19d960ec" containerName="registry-server" Feb 02 12:20:12 crc kubenswrapper[4909]: I0202 12:20:12.041294 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc45033-8deb-43b6-bfab-c5ad19d960ec" containerName="registry-server" Feb 02 12:20:12 crc kubenswrapper[4909]: E0202 12:20:12.041303 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc45033-8deb-43b6-bfab-c5ad19d960ec" containerName="extract-content" Feb 02 12:20:12 crc kubenswrapper[4909]: I0202 12:20:12.041309 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc45033-8deb-43b6-bfab-c5ad19d960ec" containerName="extract-content" Feb 02 12:20:12 crc kubenswrapper[4909]: E0202 12:20:12.041331 4909 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="6bc45033-8deb-43b6-bfab-c5ad19d960ec" containerName="extract-utilities" Feb 02 12:20:12 crc kubenswrapper[4909]: I0202 12:20:12.041337 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc45033-8deb-43b6-bfab-c5ad19d960ec" containerName="extract-utilities" Feb 02 12:20:12 crc kubenswrapper[4909]: I0202 12:20:12.041521 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc45033-8deb-43b6-bfab-c5ad19d960ec" containerName="registry-server" Feb 02 12:20:12 crc kubenswrapper[4909]: I0202 12:20:12.043230 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h7kb5" Feb 02 12:20:12 crc kubenswrapper[4909]: I0202 12:20:12.054684 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h7kb5"] Feb 02 12:20:12 crc kubenswrapper[4909]: I0202 12:20:12.137631 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0-catalog-content\") pod \"redhat-marketplace-h7kb5\" (UID: \"f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0\") " pod="openshift-marketplace/redhat-marketplace-h7kb5" Feb 02 12:20:12 crc kubenswrapper[4909]: I0202 12:20:12.138100 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh2m5\" (UniqueName: \"kubernetes.io/projected/f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0-kube-api-access-jh2m5\") pod \"redhat-marketplace-h7kb5\" (UID: \"f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0\") " pod="openshift-marketplace/redhat-marketplace-h7kb5" Feb 02 12:20:12 crc kubenswrapper[4909]: I0202 12:20:12.138146 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0-utilities\") pod 
\"redhat-marketplace-h7kb5\" (UID: \"f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0\") " pod="openshift-marketplace/redhat-marketplace-h7kb5" Feb 02 12:20:12 crc kubenswrapper[4909]: I0202 12:20:12.240044 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0-catalog-content\") pod \"redhat-marketplace-h7kb5\" (UID: \"f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0\") " pod="openshift-marketplace/redhat-marketplace-h7kb5" Feb 02 12:20:12 crc kubenswrapper[4909]: I0202 12:20:12.240184 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jh2m5\" (UniqueName: \"kubernetes.io/projected/f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0-kube-api-access-jh2m5\") pod \"redhat-marketplace-h7kb5\" (UID: \"f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0\") " pod="openshift-marketplace/redhat-marketplace-h7kb5" Feb 02 12:20:12 crc kubenswrapper[4909]: I0202 12:20:12.240234 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0-utilities\") pod \"redhat-marketplace-h7kb5\" (UID: \"f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0\") " pod="openshift-marketplace/redhat-marketplace-h7kb5" Feb 02 12:20:12 crc kubenswrapper[4909]: I0202 12:20:12.240576 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0-catalog-content\") pod \"redhat-marketplace-h7kb5\" (UID: \"f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0\") " pod="openshift-marketplace/redhat-marketplace-h7kb5" Feb 02 12:20:12 crc kubenswrapper[4909]: I0202 12:20:12.240691 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0-utilities\") pod \"redhat-marketplace-h7kb5\" (UID: 
\"f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0\") " pod="openshift-marketplace/redhat-marketplace-h7kb5" Feb 02 12:20:12 crc kubenswrapper[4909]: I0202 12:20:12.260338 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh2m5\" (UniqueName: \"kubernetes.io/projected/f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0-kube-api-access-jh2m5\") pod \"redhat-marketplace-h7kb5\" (UID: \"f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0\") " pod="openshift-marketplace/redhat-marketplace-h7kb5" Feb 02 12:20:12 crc kubenswrapper[4909]: I0202 12:20:12.366788 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h7kb5" Feb 02 12:20:12 crc kubenswrapper[4909]: I0202 12:20:12.820400 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h7kb5"] Feb 02 12:20:13 crc kubenswrapper[4909]: I0202 12:20:13.084255 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7kb5" event={"ID":"f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0","Type":"ContainerStarted","Data":"953096e536fd42232bc2ff96cb83059d2e066e99451c2322231784cd19aba0bd"} Feb 02 12:20:13 crc kubenswrapper[4909]: I0202 12:20:13.084563 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7kb5" event={"ID":"f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0","Type":"ContainerStarted","Data":"13c1aecbf358e499ba8cc8ae146d9c1f807a16e4af0ebe5dded0e5bc2cd79d55"} Feb 02 12:20:14 crc kubenswrapper[4909]: I0202 12:20:14.096293 4909 generic.go:334] "Generic (PLEG): container finished" podID="f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0" containerID="953096e536fd42232bc2ff96cb83059d2e066e99451c2322231784cd19aba0bd" exitCode=0 Feb 02 12:20:14 crc kubenswrapper[4909]: I0202 12:20:14.096359 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7kb5" 
event={"ID":"f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0","Type":"ContainerDied","Data":"953096e536fd42232bc2ff96cb83059d2e066e99451c2322231784cd19aba0bd"} Feb 02 12:20:14 crc kubenswrapper[4909]: I0202 12:20:14.096676 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7kb5" event={"ID":"f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0","Type":"ContainerStarted","Data":"97be153b9c336efb42d2efe1cd2faa4ef5444c3a96c4882e7448bf0b43abce1d"} Feb 02 12:20:15 crc kubenswrapper[4909]: I0202 12:20:15.105824 4909 generic.go:334] "Generic (PLEG): container finished" podID="f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0" containerID="97be153b9c336efb42d2efe1cd2faa4ef5444c3a96c4882e7448bf0b43abce1d" exitCode=0 Feb 02 12:20:15 crc kubenswrapper[4909]: I0202 12:20:15.105882 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7kb5" event={"ID":"f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0","Type":"ContainerDied","Data":"97be153b9c336efb42d2efe1cd2faa4ef5444c3a96c4882e7448bf0b43abce1d"} Feb 02 12:20:16 crc kubenswrapper[4909]: I0202 12:20:16.116186 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7kb5" event={"ID":"f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0","Type":"ContainerStarted","Data":"278640a2abba3097d1a39e8cfd7ad53d4d1d8f1010f6450db27b380a41a7ed6d"} Feb 02 12:20:16 crc kubenswrapper[4909]: I0202 12:20:16.139088 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h7kb5" podStartSLOduration=1.680995169 podStartE2EDuration="4.13906643s" podCreationTimestamp="2026-02-02 12:20:12 +0000 UTC" firstStartedPulling="2026-02-02 12:20:13.091010432 +0000 UTC m=+6538.837111167" lastFinishedPulling="2026-02-02 12:20:15.549081693 +0000 UTC m=+6541.295182428" observedRunningTime="2026-02-02 12:20:16.134943393 +0000 UTC m=+6541.881044128" watchObservedRunningTime="2026-02-02 12:20:16.13906643 +0000 UTC 
m=+6541.885167165" Feb 02 12:20:21 crc kubenswrapper[4909]: I0202 12:20:21.016847 4909 scope.go:117] "RemoveContainer" containerID="442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0" Feb 02 12:20:21 crc kubenswrapper[4909]: E0202 12:20:21.017584 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:20:22 crc kubenswrapper[4909]: I0202 12:20:22.367223 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h7kb5" Feb 02 12:20:22 crc kubenswrapper[4909]: I0202 12:20:22.367616 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h7kb5" Feb 02 12:20:22 crc kubenswrapper[4909]: I0202 12:20:22.419546 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h7kb5" Feb 02 12:20:23 crc kubenswrapper[4909]: I0202 12:20:23.226536 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h7kb5" Feb 02 12:20:23 crc kubenswrapper[4909]: I0202 12:20:23.279537 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h7kb5"] Feb 02 12:20:25 crc kubenswrapper[4909]: I0202 12:20:25.198106 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h7kb5" podUID="f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0" containerName="registry-server" containerID="cri-o://278640a2abba3097d1a39e8cfd7ad53d4d1d8f1010f6450db27b380a41a7ed6d" gracePeriod=2 Feb 02 
12:20:25 crc kubenswrapper[4909]: I0202 12:20:25.664702 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h7kb5" Feb 02 12:20:25 crc kubenswrapper[4909]: I0202 12:20:25.847478 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jh2m5\" (UniqueName: \"kubernetes.io/projected/f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0-kube-api-access-jh2m5\") pod \"f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0\" (UID: \"f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0\") " Feb 02 12:20:25 crc kubenswrapper[4909]: I0202 12:20:25.848109 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0-catalog-content\") pod \"f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0\" (UID: \"f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0\") " Feb 02 12:20:25 crc kubenswrapper[4909]: I0202 12:20:25.848217 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0-utilities\") pod \"f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0\" (UID: \"f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0\") " Feb 02 12:20:25 crc kubenswrapper[4909]: I0202 12:20:25.849074 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0-utilities" (OuterVolumeSpecName: "utilities") pod "f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0" (UID: "f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:20:25 crc kubenswrapper[4909]: I0202 12:20:25.849390 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 12:20:25 crc kubenswrapper[4909]: I0202 12:20:25.852626 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0-kube-api-access-jh2m5" (OuterVolumeSpecName: "kube-api-access-jh2m5") pod "f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0" (UID: "f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0"). InnerVolumeSpecName "kube-api-access-jh2m5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:20:25 crc kubenswrapper[4909]: I0202 12:20:25.871791 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0" (UID: "f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:20:25 crc kubenswrapper[4909]: I0202 12:20:25.950782 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 12:20:25 crc kubenswrapper[4909]: I0202 12:20:25.950831 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jh2m5\" (UniqueName: \"kubernetes.io/projected/f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0-kube-api-access-jh2m5\") on node \"crc\" DevicePath \"\"" Feb 02 12:20:26 crc kubenswrapper[4909]: I0202 12:20:26.043829 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-85xwk"] Feb 02 12:20:26 crc kubenswrapper[4909]: I0202 12:20:26.054070 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-85xwk"] Feb 02 12:20:26 crc kubenswrapper[4909]: I0202 12:20:26.207005 4909 generic.go:334] "Generic (PLEG): container finished" podID="f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0" containerID="278640a2abba3097d1a39e8cfd7ad53d4d1d8f1010f6450db27b380a41a7ed6d" exitCode=0 Feb 02 12:20:26 crc kubenswrapper[4909]: I0202 12:20:26.207081 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h7kb5" Feb 02 12:20:26 crc kubenswrapper[4909]: I0202 12:20:26.208801 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7kb5" event={"ID":"f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0","Type":"ContainerDied","Data":"278640a2abba3097d1a39e8cfd7ad53d4d1d8f1010f6450db27b380a41a7ed6d"} Feb 02 12:20:26 crc kubenswrapper[4909]: I0202 12:20:26.209074 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h7kb5" event={"ID":"f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0","Type":"ContainerDied","Data":"13c1aecbf358e499ba8cc8ae146d9c1f807a16e4af0ebe5dded0e5bc2cd79d55"} Feb 02 12:20:26 crc kubenswrapper[4909]: I0202 12:20:26.209212 4909 scope.go:117] "RemoveContainer" containerID="278640a2abba3097d1a39e8cfd7ad53d4d1d8f1010f6450db27b380a41a7ed6d" Feb 02 12:20:26 crc kubenswrapper[4909]: I0202 12:20:26.245682 4909 scope.go:117] "RemoveContainer" containerID="97be153b9c336efb42d2efe1cd2faa4ef5444c3a96c4882e7448bf0b43abce1d" Feb 02 12:20:26 crc kubenswrapper[4909]: I0202 12:20:26.250020 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h7kb5"] Feb 02 12:20:26 crc kubenswrapper[4909]: I0202 12:20:26.259650 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h7kb5"] Feb 02 12:20:26 crc kubenswrapper[4909]: I0202 12:20:26.267514 4909 scope.go:117] "RemoveContainer" containerID="953096e536fd42232bc2ff96cb83059d2e066e99451c2322231784cd19aba0bd" Feb 02 12:20:26 crc kubenswrapper[4909]: I0202 12:20:26.310122 4909 scope.go:117] "RemoveContainer" containerID="278640a2abba3097d1a39e8cfd7ad53d4d1d8f1010f6450db27b380a41a7ed6d" Feb 02 12:20:26 crc kubenswrapper[4909]: E0202 12:20:26.310860 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"278640a2abba3097d1a39e8cfd7ad53d4d1d8f1010f6450db27b380a41a7ed6d\": container with ID starting with 278640a2abba3097d1a39e8cfd7ad53d4d1d8f1010f6450db27b380a41a7ed6d not found: ID does not exist" containerID="278640a2abba3097d1a39e8cfd7ad53d4d1d8f1010f6450db27b380a41a7ed6d" Feb 02 12:20:26 crc kubenswrapper[4909]: I0202 12:20:26.310895 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"278640a2abba3097d1a39e8cfd7ad53d4d1d8f1010f6450db27b380a41a7ed6d"} err="failed to get container status \"278640a2abba3097d1a39e8cfd7ad53d4d1d8f1010f6450db27b380a41a7ed6d\": rpc error: code = NotFound desc = could not find container \"278640a2abba3097d1a39e8cfd7ad53d4d1d8f1010f6450db27b380a41a7ed6d\": container with ID starting with 278640a2abba3097d1a39e8cfd7ad53d4d1d8f1010f6450db27b380a41a7ed6d not found: ID does not exist" Feb 02 12:20:26 crc kubenswrapper[4909]: I0202 12:20:26.310918 4909 scope.go:117] "RemoveContainer" containerID="97be153b9c336efb42d2efe1cd2faa4ef5444c3a96c4882e7448bf0b43abce1d" Feb 02 12:20:26 crc kubenswrapper[4909]: E0202 12:20:26.311390 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97be153b9c336efb42d2efe1cd2faa4ef5444c3a96c4882e7448bf0b43abce1d\": container with ID starting with 97be153b9c336efb42d2efe1cd2faa4ef5444c3a96c4882e7448bf0b43abce1d not found: ID does not exist" containerID="97be153b9c336efb42d2efe1cd2faa4ef5444c3a96c4882e7448bf0b43abce1d" Feb 02 12:20:26 crc kubenswrapper[4909]: I0202 12:20:26.311413 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97be153b9c336efb42d2efe1cd2faa4ef5444c3a96c4882e7448bf0b43abce1d"} err="failed to get container status \"97be153b9c336efb42d2efe1cd2faa4ef5444c3a96c4882e7448bf0b43abce1d\": rpc error: code = NotFound desc = could not find container \"97be153b9c336efb42d2efe1cd2faa4ef5444c3a96c4882e7448bf0b43abce1d\": container with ID 
starting with 97be153b9c336efb42d2efe1cd2faa4ef5444c3a96c4882e7448bf0b43abce1d not found: ID does not exist" Feb 02 12:20:26 crc kubenswrapper[4909]: I0202 12:20:26.311427 4909 scope.go:117] "RemoveContainer" containerID="953096e536fd42232bc2ff96cb83059d2e066e99451c2322231784cd19aba0bd" Feb 02 12:20:26 crc kubenswrapper[4909]: E0202 12:20:26.311656 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"953096e536fd42232bc2ff96cb83059d2e066e99451c2322231784cd19aba0bd\": container with ID starting with 953096e536fd42232bc2ff96cb83059d2e066e99451c2322231784cd19aba0bd not found: ID does not exist" containerID="953096e536fd42232bc2ff96cb83059d2e066e99451c2322231784cd19aba0bd" Feb 02 12:20:26 crc kubenswrapper[4909]: I0202 12:20:26.311678 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"953096e536fd42232bc2ff96cb83059d2e066e99451c2322231784cd19aba0bd"} err="failed to get container status \"953096e536fd42232bc2ff96cb83059d2e066e99451c2322231784cd19aba0bd\": rpc error: code = NotFound desc = could not find container \"953096e536fd42232bc2ff96cb83059d2e066e99451c2322231784cd19aba0bd\": container with ID starting with 953096e536fd42232bc2ff96cb83059d2e066e99451c2322231784cd19aba0bd not found: ID does not exist" Feb 02 12:20:27 crc kubenswrapper[4909]: I0202 12:20:27.030664 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45214a2c-8581-4551-b61e-67df0ba0fc95" path="/var/lib/kubelet/pods/45214a2c-8581-4551-b61e-67df0ba0fc95/volumes" Feb 02 12:20:27 crc kubenswrapper[4909]: I0202 12:20:27.032602 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0" path="/var/lib/kubelet/pods/f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0/volumes" Feb 02 12:20:30 crc kubenswrapper[4909]: I0202 12:20:30.423008 4909 scope.go:117] "RemoveContainer" 
containerID="55f98ab1d8456725fac00c3381b3f7e9c27849430d6b36039f807db276b30cf2" Feb 02 12:20:30 crc kubenswrapper[4909]: I0202 12:20:30.456915 4909 scope.go:117] "RemoveContainer" containerID="c1657283ee2b61b2109fee0fa83c07814e58d7ac19beb46ee4632428ef03e0d9" Feb 02 12:20:30 crc kubenswrapper[4909]: I0202 12:20:30.534755 4909 scope.go:117] "RemoveContainer" containerID="f85a23472c8fbff2be548289b891e068417c6e870773040eff8852255fb2a595" Feb 02 12:20:30 crc kubenswrapper[4909]: I0202 12:20:30.594783 4909 scope.go:117] "RemoveContainer" containerID="e7ba5ff8af450cd2ed5d8fd43c29c4180d2f9c17d21310f46bdcfd4f4ac0eaa7" Feb 02 12:20:30 crc kubenswrapper[4909]: I0202 12:20:30.637872 4909 scope.go:117] "RemoveContainer" containerID="495eefbb942e4827063b82e831562cc26299407117ec95bec19891bde00bd1bd" Feb 02 12:20:30 crc kubenswrapper[4909]: I0202 12:20:30.686370 4909 scope.go:117] "RemoveContainer" containerID="72882e55a0328cad1ba98adaf9846bd3856d646e3f71d9de19bd8c97d4bb8199" Feb 02 12:20:30 crc kubenswrapper[4909]: I0202 12:20:30.731500 4909 scope.go:117] "RemoveContainer" containerID="4fd9ab104737c22de985eec737ba2ff364bbbf46839e534c28ca326da1c9eec2" Feb 02 12:20:30 crc kubenswrapper[4909]: I0202 12:20:30.755181 4909 scope.go:117] "RemoveContainer" containerID="7da288cabb0051a161429863fb022fb5d4e4b697754a33459489d2f0e28c7459" Feb 02 12:20:30 crc kubenswrapper[4909]: I0202 12:20:30.774352 4909 scope.go:117] "RemoveContainer" containerID="7186e22863fcb9269053e02643dac50be88c09728944307c150773560cf9bc33" Feb 02 12:20:36 crc kubenswrapper[4909]: I0202 12:20:36.016326 4909 scope.go:117] "RemoveContainer" containerID="442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0" Feb 02 12:20:36 crc kubenswrapper[4909]: E0202 12:20:36.016840 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:20:49 crc kubenswrapper[4909]: I0202 12:20:49.016181 4909 scope.go:117] "RemoveContainer" containerID="442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0" Feb 02 12:20:49 crc kubenswrapper[4909]: E0202 12:20:49.016999 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:21:03 crc kubenswrapper[4909]: I0202 12:21:03.016778 4909 scope.go:117] "RemoveContainer" containerID="442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0" Feb 02 12:21:03 crc kubenswrapper[4909]: E0202 12:21:03.017720 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:21:15 crc kubenswrapper[4909]: I0202 12:21:15.024143 4909 scope.go:117] "RemoveContainer" containerID="442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0" Feb 02 12:21:15 crc kubenswrapper[4909]: E0202 12:21:15.025044 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:21:27 crc kubenswrapper[4909]: I0202 12:21:27.021708 4909 scope.go:117] "RemoveContainer" containerID="442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0" Feb 02 12:21:27 crc kubenswrapper[4909]: E0202 12:21:27.022573 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:21:38 crc kubenswrapper[4909]: I0202 12:21:38.016906 4909 scope.go:117] "RemoveContainer" containerID="442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0" Feb 02 12:21:38 crc kubenswrapper[4909]: E0202 12:21:38.017653 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:21:52 crc kubenswrapper[4909]: I0202 12:21:52.017353 4909 scope.go:117] "RemoveContainer" containerID="442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0" Feb 02 12:21:52 crc kubenswrapper[4909]: E0202 12:21:52.019015 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:22:07 crc kubenswrapper[4909]: I0202 12:22:07.017236 4909 scope.go:117] "RemoveContainer" containerID="442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0" Feb 02 12:22:07 crc kubenswrapper[4909]: E0202 12:22:07.018094 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:22:20 crc kubenswrapper[4909]: I0202 12:22:20.017203 4909 scope.go:117] "RemoveContainer" containerID="442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0" Feb 02 12:22:21 crc kubenswrapper[4909]: I0202 12:22:21.241228 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"1a6781067b416e92ff82faf52056033ec4a8268b572f99a2e337453f09d68263"} Feb 02 12:23:08 crc kubenswrapper[4909]: I0202 12:23:08.046707 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-jfzjt"] Feb 02 12:23:08 crc kubenswrapper[4909]: I0202 12:23:08.057869 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-f0d4-account-create-update-7f2tp"] Feb 02 12:23:08 crc kubenswrapper[4909]: I0202 12:23:08.067840 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-f0d4-account-create-update-7f2tp"] Feb 02 12:23:08 crc 
kubenswrapper[4909]: I0202 12:23:08.076496 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-jfzjt"] Feb 02 12:23:09 crc kubenswrapper[4909]: I0202 12:23:09.037670 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6828af76-a320-4845-9d2f-584f161f5c8d" path="/var/lib/kubelet/pods/6828af76-a320-4845-9d2f-584f161f5c8d/volumes" Feb 02 12:23:09 crc kubenswrapper[4909]: I0202 12:23:09.038894 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c503a762-1f95-4131-8fca-dfbced528ce4" path="/var/lib/kubelet/pods/c503a762-1f95-4131-8fca-dfbced528ce4/volumes" Feb 02 12:23:20 crc kubenswrapper[4909]: I0202 12:23:20.041349 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-cs882"] Feb 02 12:23:20 crc kubenswrapper[4909]: I0202 12:23:20.051111 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-cs882"] Feb 02 12:23:21 crc kubenswrapper[4909]: I0202 12:23:21.028417 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e" path="/var/lib/kubelet/pods/7a1e5bfe-7920-4cd4-84ea-2ff4bf7b8b0e/volumes" Feb 02 12:23:30 crc kubenswrapper[4909]: I0202 12:23:30.960875 4909 scope.go:117] "RemoveContainer" containerID="58411504fe3d8433ec2120550ba87a3e8c64bf94853cfdb54f5b27484ebcf7ee" Feb 02 12:23:30 crc kubenswrapper[4909]: I0202 12:23:30.988993 4909 scope.go:117] "RemoveContainer" containerID="7461a34b33bd3499f3147e377bcd0b7d277edacf26a3da86ea63438385b40f45" Feb 02 12:23:31 crc kubenswrapper[4909]: I0202 12:23:31.037092 4909 scope.go:117] "RemoveContainer" containerID="60b4221d1cbaae54edeeb382625e87dff88e9171d9ef35d73be20c40b6380c2e" Feb 02 12:23:31 crc kubenswrapper[4909]: I0202 12:23:31.061527 4909 scope.go:117] "RemoveContainer" containerID="810f86a4485c966a50ea2804a5d809ff2f855f29e1333d6e8a7636a9e60855b3" Feb 02 12:23:31 crc kubenswrapper[4909]: I0202 12:23:31.089658 4909 scope.go:117] 
"RemoveContainer" containerID="f510893acf9c99f6ba8a298938ae047fd5c30ea95efee1d2e6924af1cd280502" Feb 02 12:23:31 crc kubenswrapper[4909]: I0202 12:23:31.139623 4909 scope.go:117] "RemoveContainer" containerID="8044a9e89bcc0f2f50dcbd6bfcc65e225cba711291d3258c9425f7367a790d75" Feb 02 12:23:31 crc kubenswrapper[4909]: I0202 12:23:31.180149 4909 scope.go:117] "RemoveContainer" containerID="549a66a6fc307e5613198759d5250573531e750b1a453aecbffdd44060893730" Feb 02 12:23:45 crc kubenswrapper[4909]: I0202 12:23:45.898316 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hdvjz"] Feb 02 12:23:45 crc kubenswrapper[4909]: E0202 12:23:45.900214 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0" containerName="extract-content" Feb 02 12:23:45 crc kubenswrapper[4909]: I0202 12:23:45.900235 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0" containerName="extract-content" Feb 02 12:23:45 crc kubenswrapper[4909]: E0202 12:23:45.900248 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0" containerName="registry-server" Feb 02 12:23:45 crc kubenswrapper[4909]: I0202 12:23:45.900255 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0" containerName="registry-server" Feb 02 12:23:45 crc kubenswrapper[4909]: E0202 12:23:45.900273 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0" containerName="extract-utilities" Feb 02 12:23:45 crc kubenswrapper[4909]: I0202 12:23:45.900280 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0" containerName="extract-utilities" Feb 02 12:23:45 crc kubenswrapper[4909]: I0202 12:23:45.901332 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7c6424f-b1f9-4bd4-8776-d2dfaeb4eca0" 
containerName="registry-server" Feb 02 12:23:45 crc kubenswrapper[4909]: I0202 12:23:45.905213 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hdvjz" Feb 02 12:23:45 crc kubenswrapper[4909]: I0202 12:23:45.909449 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hdvjz"] Feb 02 12:23:46 crc kubenswrapper[4909]: I0202 12:23:46.065488 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwvpm\" (UniqueName: \"kubernetes.io/projected/adb80dc0-c89e-4930-88b4-1077e301a731-kube-api-access-zwvpm\") pod \"community-operators-hdvjz\" (UID: \"adb80dc0-c89e-4930-88b4-1077e301a731\") " pod="openshift-marketplace/community-operators-hdvjz" Feb 02 12:23:46 crc kubenswrapper[4909]: I0202 12:23:46.065573 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adb80dc0-c89e-4930-88b4-1077e301a731-utilities\") pod \"community-operators-hdvjz\" (UID: \"adb80dc0-c89e-4930-88b4-1077e301a731\") " pod="openshift-marketplace/community-operators-hdvjz" Feb 02 12:23:46 crc kubenswrapper[4909]: I0202 12:23:46.065720 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adb80dc0-c89e-4930-88b4-1077e301a731-catalog-content\") pod \"community-operators-hdvjz\" (UID: \"adb80dc0-c89e-4930-88b4-1077e301a731\") " pod="openshift-marketplace/community-operators-hdvjz" Feb 02 12:23:46 crc kubenswrapper[4909]: I0202 12:23:46.167504 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwvpm\" (UniqueName: \"kubernetes.io/projected/adb80dc0-c89e-4930-88b4-1077e301a731-kube-api-access-zwvpm\") pod \"community-operators-hdvjz\" (UID: \"adb80dc0-c89e-4930-88b4-1077e301a731\") 
" pod="openshift-marketplace/community-operators-hdvjz" Feb 02 12:23:46 crc kubenswrapper[4909]: I0202 12:23:46.167726 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adb80dc0-c89e-4930-88b4-1077e301a731-utilities\") pod \"community-operators-hdvjz\" (UID: \"adb80dc0-c89e-4930-88b4-1077e301a731\") " pod="openshift-marketplace/community-operators-hdvjz" Feb 02 12:23:46 crc kubenswrapper[4909]: I0202 12:23:46.168141 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adb80dc0-c89e-4930-88b4-1077e301a731-catalog-content\") pod \"community-operators-hdvjz\" (UID: \"adb80dc0-c89e-4930-88b4-1077e301a731\") " pod="openshift-marketplace/community-operators-hdvjz" Feb 02 12:23:46 crc kubenswrapper[4909]: I0202 12:23:46.169570 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adb80dc0-c89e-4930-88b4-1077e301a731-catalog-content\") pod \"community-operators-hdvjz\" (UID: \"adb80dc0-c89e-4930-88b4-1077e301a731\") " pod="openshift-marketplace/community-operators-hdvjz" Feb 02 12:23:46 crc kubenswrapper[4909]: I0202 12:23:46.169783 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adb80dc0-c89e-4930-88b4-1077e301a731-utilities\") pod \"community-operators-hdvjz\" (UID: \"adb80dc0-c89e-4930-88b4-1077e301a731\") " pod="openshift-marketplace/community-operators-hdvjz" Feb 02 12:23:46 crc kubenswrapper[4909]: I0202 12:23:46.189514 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwvpm\" (UniqueName: \"kubernetes.io/projected/adb80dc0-c89e-4930-88b4-1077e301a731-kube-api-access-zwvpm\") pod \"community-operators-hdvjz\" (UID: \"adb80dc0-c89e-4930-88b4-1077e301a731\") " 
pod="openshift-marketplace/community-operators-hdvjz" Feb 02 12:23:46 crc kubenswrapper[4909]: I0202 12:23:46.235016 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hdvjz" Feb 02 12:23:46 crc kubenswrapper[4909]: I0202 12:23:46.575534 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hdvjz"] Feb 02 12:23:47 crc kubenswrapper[4909]: I0202 12:23:47.021400 4909 generic.go:334] "Generic (PLEG): container finished" podID="adb80dc0-c89e-4930-88b4-1077e301a731" containerID="c63fb5cbd41d23c3affed6d29de595e53cfdcaa52e02bb0a8ac90496f3441d07" exitCode=0 Feb 02 12:23:47 crc kubenswrapper[4909]: I0202 12:23:47.023300 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 12:23:47 crc kubenswrapper[4909]: I0202 12:23:47.037584 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hdvjz" event={"ID":"adb80dc0-c89e-4930-88b4-1077e301a731","Type":"ContainerDied","Data":"c63fb5cbd41d23c3affed6d29de595e53cfdcaa52e02bb0a8ac90496f3441d07"} Feb 02 12:23:47 crc kubenswrapper[4909]: I0202 12:23:47.037627 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hdvjz" event={"ID":"adb80dc0-c89e-4930-88b4-1077e301a731","Type":"ContainerStarted","Data":"b1ff3ed13b7a949a5f534d833c3f5eba3b80b7c9a1627754842b5c050809d272"} Feb 02 12:23:49 crc kubenswrapper[4909]: I0202 12:23:49.042404 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hdvjz" event={"ID":"adb80dc0-c89e-4930-88b4-1077e301a731","Type":"ContainerStarted","Data":"69e57cea0a7b0266a9e0331a53a52178313f0b0ac43f2a4477ad8fc7756424d0"} Feb 02 12:23:49 crc kubenswrapper[4909]: I0202 12:23:49.865332 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6m6xb"] Feb 02 12:23:49 crc 
kubenswrapper[4909]: I0202 12:23:49.881734 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6m6xb" Feb 02 12:23:49 crc kubenswrapper[4909]: I0202 12:23:49.934024 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6m6xb"] Feb 02 12:23:50 crc kubenswrapper[4909]: I0202 12:23:50.063257 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2acc149b-7c91-45d1-8be2-ae184eedbb67-utilities\") pod \"certified-operators-6m6xb\" (UID: \"2acc149b-7c91-45d1-8be2-ae184eedbb67\") " pod="openshift-marketplace/certified-operators-6m6xb" Feb 02 12:23:50 crc kubenswrapper[4909]: I0202 12:23:50.065421 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lv89\" (UniqueName: \"kubernetes.io/projected/2acc149b-7c91-45d1-8be2-ae184eedbb67-kube-api-access-8lv89\") pod \"certified-operators-6m6xb\" (UID: \"2acc149b-7c91-45d1-8be2-ae184eedbb67\") " pod="openshift-marketplace/certified-operators-6m6xb" Feb 02 12:23:50 crc kubenswrapper[4909]: I0202 12:23:50.065961 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2acc149b-7c91-45d1-8be2-ae184eedbb67-catalog-content\") pod \"certified-operators-6m6xb\" (UID: \"2acc149b-7c91-45d1-8be2-ae184eedbb67\") " pod="openshift-marketplace/certified-operators-6m6xb" Feb 02 12:23:50 crc kubenswrapper[4909]: I0202 12:23:50.167625 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2acc149b-7c91-45d1-8be2-ae184eedbb67-utilities\") pod \"certified-operators-6m6xb\" (UID: \"2acc149b-7c91-45d1-8be2-ae184eedbb67\") " pod="openshift-marketplace/certified-operators-6m6xb" Feb 02 12:23:50 crc 
kubenswrapper[4909]: I0202 12:23:50.167734 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lv89\" (UniqueName: \"kubernetes.io/projected/2acc149b-7c91-45d1-8be2-ae184eedbb67-kube-api-access-8lv89\") pod \"certified-operators-6m6xb\" (UID: \"2acc149b-7c91-45d1-8be2-ae184eedbb67\") " pod="openshift-marketplace/certified-operators-6m6xb" Feb 02 12:23:50 crc kubenswrapper[4909]: I0202 12:23:50.167851 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2acc149b-7c91-45d1-8be2-ae184eedbb67-catalog-content\") pod \"certified-operators-6m6xb\" (UID: \"2acc149b-7c91-45d1-8be2-ae184eedbb67\") " pod="openshift-marketplace/certified-operators-6m6xb" Feb 02 12:23:50 crc kubenswrapper[4909]: I0202 12:23:50.168107 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2acc149b-7c91-45d1-8be2-ae184eedbb67-utilities\") pod \"certified-operators-6m6xb\" (UID: \"2acc149b-7c91-45d1-8be2-ae184eedbb67\") " pod="openshift-marketplace/certified-operators-6m6xb" Feb 02 12:23:50 crc kubenswrapper[4909]: I0202 12:23:50.168396 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2acc149b-7c91-45d1-8be2-ae184eedbb67-catalog-content\") pod \"certified-operators-6m6xb\" (UID: \"2acc149b-7c91-45d1-8be2-ae184eedbb67\") " pod="openshift-marketplace/certified-operators-6m6xb" Feb 02 12:23:50 crc kubenswrapper[4909]: I0202 12:23:50.187752 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lv89\" (UniqueName: \"kubernetes.io/projected/2acc149b-7c91-45d1-8be2-ae184eedbb67-kube-api-access-8lv89\") pod \"certified-operators-6m6xb\" (UID: \"2acc149b-7c91-45d1-8be2-ae184eedbb67\") " pod="openshift-marketplace/certified-operators-6m6xb" Feb 02 12:23:50 crc kubenswrapper[4909]: I0202 
12:23:50.229887 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6m6xb" Feb 02 12:23:50 crc kubenswrapper[4909]: I0202 12:23:50.824189 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6m6xb"] Feb 02 12:23:51 crc kubenswrapper[4909]: I0202 12:23:51.066766 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6m6xb" event={"ID":"2acc149b-7c91-45d1-8be2-ae184eedbb67","Type":"ContainerStarted","Data":"343b11165bef17e88cc467a37d5ba778fc92e76bfc469d29800619a660149c7e"} Feb 02 12:23:51 crc kubenswrapper[4909]: I0202 12:23:51.066828 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6m6xb" event={"ID":"2acc149b-7c91-45d1-8be2-ae184eedbb67","Type":"ContainerStarted","Data":"b0b72714f9426c757fbe787e434f7aacad67dc11a89ae7b3d739f78fa6bf6345"} Feb 02 12:23:52 crc kubenswrapper[4909]: I0202 12:23:52.077939 4909 generic.go:334] "Generic (PLEG): container finished" podID="2acc149b-7c91-45d1-8be2-ae184eedbb67" containerID="343b11165bef17e88cc467a37d5ba778fc92e76bfc469d29800619a660149c7e" exitCode=0 Feb 02 12:23:52 crc kubenswrapper[4909]: I0202 12:23:52.078086 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6m6xb" event={"ID":"2acc149b-7c91-45d1-8be2-ae184eedbb67","Type":"ContainerDied","Data":"343b11165bef17e88cc467a37d5ba778fc92e76bfc469d29800619a660149c7e"} Feb 02 12:23:53 crc kubenswrapper[4909]: I0202 12:23:53.095529 4909 generic.go:334] "Generic (PLEG): container finished" podID="adb80dc0-c89e-4930-88b4-1077e301a731" containerID="69e57cea0a7b0266a9e0331a53a52178313f0b0ac43f2a4477ad8fc7756424d0" exitCode=0 Feb 02 12:23:53 crc kubenswrapper[4909]: I0202 12:23:53.095655 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hdvjz" 
event={"ID":"adb80dc0-c89e-4930-88b4-1077e301a731","Type":"ContainerDied","Data":"69e57cea0a7b0266a9e0331a53a52178313f0b0ac43f2a4477ad8fc7756424d0"} Feb 02 12:23:54 crc kubenswrapper[4909]: I0202 12:23:54.109509 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6m6xb" event={"ID":"2acc149b-7c91-45d1-8be2-ae184eedbb67","Type":"ContainerStarted","Data":"28ae5d1dccb9a79899c91ace7d73599abbcbb3b0f9062ec61c6bd791813bf435"} Feb 02 12:23:57 crc kubenswrapper[4909]: I0202 12:23:57.143718 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hdvjz" event={"ID":"adb80dc0-c89e-4930-88b4-1077e301a731","Type":"ContainerStarted","Data":"d14f863191763b857299c63eda359649f8c86420605d94e48626ef7618983a94"} Feb 02 12:23:57 crc kubenswrapper[4909]: I0202 12:23:57.169493 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hdvjz" podStartSLOduration=2.605773081 podStartE2EDuration="12.169468237s" podCreationTimestamp="2026-02-02 12:23:45 +0000 UTC" firstStartedPulling="2026-02-02 12:23:47.023083401 +0000 UTC m=+6752.769184136" lastFinishedPulling="2026-02-02 12:23:56.586778557 +0000 UTC m=+6762.332879292" observedRunningTime="2026-02-02 12:23:57.161936673 +0000 UTC m=+6762.908037428" watchObservedRunningTime="2026-02-02 12:23:57.169468237 +0000 UTC m=+6762.915568972" Feb 02 12:24:00 crc kubenswrapper[4909]: I0202 12:24:00.172301 4909 generic.go:334] "Generic (PLEG): container finished" podID="2acc149b-7c91-45d1-8be2-ae184eedbb67" containerID="28ae5d1dccb9a79899c91ace7d73599abbcbb3b0f9062ec61c6bd791813bf435" exitCode=0 Feb 02 12:24:00 crc kubenswrapper[4909]: I0202 12:24:00.172387 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6m6xb" 
event={"ID":"2acc149b-7c91-45d1-8be2-ae184eedbb67","Type":"ContainerDied","Data":"28ae5d1dccb9a79899c91ace7d73599abbcbb3b0f9062ec61c6bd791813bf435"} Feb 02 12:24:01 crc kubenswrapper[4909]: I0202 12:24:01.184238 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6m6xb" event={"ID":"2acc149b-7c91-45d1-8be2-ae184eedbb67","Type":"ContainerStarted","Data":"c2ea8f86adafa7d47a2220d66ecffbff24ea32d44bbf8677aa19f2d849383404"} Feb 02 12:24:01 crc kubenswrapper[4909]: I0202 12:24:01.209502 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6m6xb" podStartSLOduration=3.664273373 podStartE2EDuration="12.209483588s" podCreationTimestamp="2026-02-02 12:23:49 +0000 UTC" firstStartedPulling="2026-02-02 12:23:52.080004687 +0000 UTC m=+6757.826105422" lastFinishedPulling="2026-02-02 12:24:00.625214902 +0000 UTC m=+6766.371315637" observedRunningTime="2026-02-02 12:24:01.20110027 +0000 UTC m=+6766.947201005" watchObservedRunningTime="2026-02-02 12:24:01.209483588 +0000 UTC m=+6766.955584323" Feb 02 12:24:06 crc kubenswrapper[4909]: I0202 12:24:06.235514 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hdvjz" Feb 02 12:24:06 crc kubenswrapper[4909]: I0202 12:24:06.236173 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hdvjz" Feb 02 12:24:06 crc kubenswrapper[4909]: I0202 12:24:06.296963 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hdvjz" Feb 02 12:24:07 crc kubenswrapper[4909]: I0202 12:24:07.280153 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hdvjz" Feb 02 12:24:10 crc kubenswrapper[4909]: I0202 12:24:10.105412 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-hdvjz"] Feb 02 12:24:10 crc kubenswrapper[4909]: I0202 12:24:10.105954 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hdvjz" podUID="adb80dc0-c89e-4930-88b4-1077e301a731" containerName="registry-server" containerID="cri-o://d14f863191763b857299c63eda359649f8c86420605d94e48626ef7618983a94" gracePeriod=2 Feb 02 12:24:10 crc kubenswrapper[4909]: I0202 12:24:10.230906 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6m6xb" Feb 02 12:24:10 crc kubenswrapper[4909]: I0202 12:24:10.231236 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6m6xb" Feb 02 12:24:10 crc kubenswrapper[4909]: I0202 12:24:10.282374 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6m6xb" Feb 02 12:24:11 crc kubenswrapper[4909]: I0202 12:24:11.098536 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hdvjz" Feb 02 12:24:11 crc kubenswrapper[4909]: I0202 12:24:11.232346 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adb80dc0-c89e-4930-88b4-1077e301a731-catalog-content\") pod \"adb80dc0-c89e-4930-88b4-1077e301a731\" (UID: \"adb80dc0-c89e-4930-88b4-1077e301a731\") " Feb 02 12:24:11 crc kubenswrapper[4909]: I0202 12:24:11.233151 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adb80dc0-c89e-4930-88b4-1077e301a731-utilities\") pod \"adb80dc0-c89e-4930-88b4-1077e301a731\" (UID: \"adb80dc0-c89e-4930-88b4-1077e301a731\") " Feb 02 12:24:11 crc kubenswrapper[4909]: I0202 12:24:11.233643 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwvpm\" (UniqueName: \"kubernetes.io/projected/adb80dc0-c89e-4930-88b4-1077e301a731-kube-api-access-zwvpm\") pod \"adb80dc0-c89e-4930-88b4-1077e301a731\" (UID: \"adb80dc0-c89e-4930-88b4-1077e301a731\") " Feb 02 12:24:11 crc kubenswrapper[4909]: I0202 12:24:11.234769 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adb80dc0-c89e-4930-88b4-1077e301a731-utilities" (OuterVolumeSpecName: "utilities") pod "adb80dc0-c89e-4930-88b4-1077e301a731" (UID: "adb80dc0-c89e-4930-88b4-1077e301a731"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:24:11 crc kubenswrapper[4909]: I0202 12:24:11.240406 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adb80dc0-c89e-4930-88b4-1077e301a731-kube-api-access-zwvpm" (OuterVolumeSpecName: "kube-api-access-zwvpm") pod "adb80dc0-c89e-4930-88b4-1077e301a731" (UID: "adb80dc0-c89e-4930-88b4-1077e301a731"). InnerVolumeSpecName "kube-api-access-zwvpm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:24:11 crc kubenswrapper[4909]: I0202 12:24:11.277612 4909 generic.go:334] "Generic (PLEG): container finished" podID="adb80dc0-c89e-4930-88b4-1077e301a731" containerID="d14f863191763b857299c63eda359649f8c86420605d94e48626ef7618983a94" exitCode=0 Feb 02 12:24:11 crc kubenswrapper[4909]: I0202 12:24:11.279110 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hdvjz" Feb 02 12:24:11 crc kubenswrapper[4909]: I0202 12:24:11.280129 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hdvjz" event={"ID":"adb80dc0-c89e-4930-88b4-1077e301a731","Type":"ContainerDied","Data":"d14f863191763b857299c63eda359649f8c86420605d94e48626ef7618983a94"} Feb 02 12:24:11 crc kubenswrapper[4909]: I0202 12:24:11.280240 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hdvjz" event={"ID":"adb80dc0-c89e-4930-88b4-1077e301a731","Type":"ContainerDied","Data":"b1ff3ed13b7a949a5f534d833c3f5eba3b80b7c9a1627754842b5c050809d272"} Feb 02 12:24:11 crc kubenswrapper[4909]: I0202 12:24:11.280271 4909 scope.go:117] "RemoveContainer" containerID="d14f863191763b857299c63eda359649f8c86420605d94e48626ef7618983a94" Feb 02 12:24:11 crc kubenswrapper[4909]: I0202 12:24:11.290944 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adb80dc0-c89e-4930-88b4-1077e301a731-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "adb80dc0-c89e-4930-88b4-1077e301a731" (UID: "adb80dc0-c89e-4930-88b4-1077e301a731"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:24:11 crc kubenswrapper[4909]: I0202 12:24:11.310045 4909 scope.go:117] "RemoveContainer" containerID="69e57cea0a7b0266a9e0331a53a52178313f0b0ac43f2a4477ad8fc7756424d0" Feb 02 12:24:11 crc kubenswrapper[4909]: I0202 12:24:11.335038 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6m6xb" Feb 02 12:24:11 crc kubenswrapper[4909]: I0202 12:24:11.337596 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwvpm\" (UniqueName: \"kubernetes.io/projected/adb80dc0-c89e-4930-88b4-1077e301a731-kube-api-access-zwvpm\") on node \"crc\" DevicePath \"\"" Feb 02 12:24:11 crc kubenswrapper[4909]: I0202 12:24:11.337657 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adb80dc0-c89e-4930-88b4-1077e301a731-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 12:24:11 crc kubenswrapper[4909]: I0202 12:24:11.337669 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adb80dc0-c89e-4930-88b4-1077e301a731-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 12:24:11 crc kubenswrapper[4909]: I0202 12:24:11.348082 4909 scope.go:117] "RemoveContainer" containerID="c63fb5cbd41d23c3affed6d29de595e53cfdcaa52e02bb0a8ac90496f3441d07" Feb 02 12:24:11 crc kubenswrapper[4909]: I0202 12:24:11.394261 4909 scope.go:117] "RemoveContainer" containerID="d14f863191763b857299c63eda359649f8c86420605d94e48626ef7618983a94" Feb 02 12:24:11 crc kubenswrapper[4909]: E0202 12:24:11.395475 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d14f863191763b857299c63eda359649f8c86420605d94e48626ef7618983a94\": container with ID starting with d14f863191763b857299c63eda359649f8c86420605d94e48626ef7618983a94 not found: ID does not exist" 
containerID="d14f863191763b857299c63eda359649f8c86420605d94e48626ef7618983a94" Feb 02 12:24:11 crc kubenswrapper[4909]: I0202 12:24:11.395538 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d14f863191763b857299c63eda359649f8c86420605d94e48626ef7618983a94"} err="failed to get container status \"d14f863191763b857299c63eda359649f8c86420605d94e48626ef7618983a94\": rpc error: code = NotFound desc = could not find container \"d14f863191763b857299c63eda359649f8c86420605d94e48626ef7618983a94\": container with ID starting with d14f863191763b857299c63eda359649f8c86420605d94e48626ef7618983a94 not found: ID does not exist" Feb 02 12:24:11 crc kubenswrapper[4909]: I0202 12:24:11.395571 4909 scope.go:117] "RemoveContainer" containerID="69e57cea0a7b0266a9e0331a53a52178313f0b0ac43f2a4477ad8fc7756424d0" Feb 02 12:24:11 crc kubenswrapper[4909]: E0202 12:24:11.395959 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69e57cea0a7b0266a9e0331a53a52178313f0b0ac43f2a4477ad8fc7756424d0\": container with ID starting with 69e57cea0a7b0266a9e0331a53a52178313f0b0ac43f2a4477ad8fc7756424d0 not found: ID does not exist" containerID="69e57cea0a7b0266a9e0331a53a52178313f0b0ac43f2a4477ad8fc7756424d0" Feb 02 12:24:11 crc kubenswrapper[4909]: I0202 12:24:11.396003 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69e57cea0a7b0266a9e0331a53a52178313f0b0ac43f2a4477ad8fc7756424d0"} err="failed to get container status \"69e57cea0a7b0266a9e0331a53a52178313f0b0ac43f2a4477ad8fc7756424d0\": rpc error: code = NotFound desc = could not find container \"69e57cea0a7b0266a9e0331a53a52178313f0b0ac43f2a4477ad8fc7756424d0\": container with ID starting with 69e57cea0a7b0266a9e0331a53a52178313f0b0ac43f2a4477ad8fc7756424d0 not found: ID does not exist" Feb 02 12:24:11 crc kubenswrapper[4909]: I0202 12:24:11.396034 4909 scope.go:117] 
"RemoveContainer" containerID="c63fb5cbd41d23c3affed6d29de595e53cfdcaa52e02bb0a8ac90496f3441d07" Feb 02 12:24:11 crc kubenswrapper[4909]: E0202 12:24:11.396465 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c63fb5cbd41d23c3affed6d29de595e53cfdcaa52e02bb0a8ac90496f3441d07\": container with ID starting with c63fb5cbd41d23c3affed6d29de595e53cfdcaa52e02bb0a8ac90496f3441d07 not found: ID does not exist" containerID="c63fb5cbd41d23c3affed6d29de595e53cfdcaa52e02bb0a8ac90496f3441d07" Feb 02 12:24:11 crc kubenswrapper[4909]: I0202 12:24:11.396494 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c63fb5cbd41d23c3affed6d29de595e53cfdcaa52e02bb0a8ac90496f3441d07"} err="failed to get container status \"c63fb5cbd41d23c3affed6d29de595e53cfdcaa52e02bb0a8ac90496f3441d07\": rpc error: code = NotFound desc = could not find container \"c63fb5cbd41d23c3affed6d29de595e53cfdcaa52e02bb0a8ac90496f3441d07\": container with ID starting with c63fb5cbd41d23c3affed6d29de595e53cfdcaa52e02bb0a8ac90496f3441d07 not found: ID does not exist" Feb 02 12:24:11 crc kubenswrapper[4909]: I0202 12:24:11.615354 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hdvjz"] Feb 02 12:24:11 crc kubenswrapper[4909]: I0202 12:24:11.623181 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hdvjz"] Feb 02 12:24:13 crc kubenswrapper[4909]: I0202 12:24:13.030572 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adb80dc0-c89e-4930-88b4-1077e301a731" path="/var/lib/kubelet/pods/adb80dc0-c89e-4930-88b4-1077e301a731/volumes" Feb 02 12:24:14 crc kubenswrapper[4909]: I0202 12:24:14.638249 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6m6xb"] Feb 02 12:24:14 crc kubenswrapper[4909]: I0202 12:24:14.638867 4909 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6m6xb" podUID="2acc149b-7c91-45d1-8be2-ae184eedbb67" containerName="registry-server" containerID="cri-o://c2ea8f86adafa7d47a2220d66ecffbff24ea32d44bbf8677aa19f2d849383404" gracePeriod=2 Feb 02 12:24:15 crc kubenswrapper[4909]: I0202 12:24:15.095172 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6m6xb" Feb 02 12:24:15 crc kubenswrapper[4909]: I0202 12:24:15.218541 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lv89\" (UniqueName: \"kubernetes.io/projected/2acc149b-7c91-45d1-8be2-ae184eedbb67-kube-api-access-8lv89\") pod \"2acc149b-7c91-45d1-8be2-ae184eedbb67\" (UID: \"2acc149b-7c91-45d1-8be2-ae184eedbb67\") " Feb 02 12:24:15 crc kubenswrapper[4909]: I0202 12:24:15.218767 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2acc149b-7c91-45d1-8be2-ae184eedbb67-catalog-content\") pod \"2acc149b-7c91-45d1-8be2-ae184eedbb67\" (UID: \"2acc149b-7c91-45d1-8be2-ae184eedbb67\") " Feb 02 12:24:15 crc kubenswrapper[4909]: I0202 12:24:15.218905 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2acc149b-7c91-45d1-8be2-ae184eedbb67-utilities\") pod \"2acc149b-7c91-45d1-8be2-ae184eedbb67\" (UID: \"2acc149b-7c91-45d1-8be2-ae184eedbb67\") " Feb 02 12:24:15 crc kubenswrapper[4909]: I0202 12:24:15.219564 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2acc149b-7c91-45d1-8be2-ae184eedbb67-utilities" (OuterVolumeSpecName: "utilities") pod "2acc149b-7c91-45d1-8be2-ae184eedbb67" (UID: "2acc149b-7c91-45d1-8be2-ae184eedbb67"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:24:15 crc kubenswrapper[4909]: I0202 12:24:15.223755 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2acc149b-7c91-45d1-8be2-ae184eedbb67-kube-api-access-8lv89" (OuterVolumeSpecName: "kube-api-access-8lv89") pod "2acc149b-7c91-45d1-8be2-ae184eedbb67" (UID: "2acc149b-7c91-45d1-8be2-ae184eedbb67"). InnerVolumeSpecName "kube-api-access-8lv89". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:24:15 crc kubenswrapper[4909]: I0202 12:24:15.265250 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2acc149b-7c91-45d1-8be2-ae184eedbb67-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2acc149b-7c91-45d1-8be2-ae184eedbb67" (UID: "2acc149b-7c91-45d1-8be2-ae184eedbb67"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:24:15 crc kubenswrapper[4909]: I0202 12:24:15.315853 4909 generic.go:334] "Generic (PLEG): container finished" podID="2acc149b-7c91-45d1-8be2-ae184eedbb67" containerID="c2ea8f86adafa7d47a2220d66ecffbff24ea32d44bbf8677aa19f2d849383404" exitCode=0 Feb 02 12:24:15 crc kubenswrapper[4909]: I0202 12:24:15.315901 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6m6xb" event={"ID":"2acc149b-7c91-45d1-8be2-ae184eedbb67","Type":"ContainerDied","Data":"c2ea8f86adafa7d47a2220d66ecffbff24ea32d44bbf8677aa19f2d849383404"} Feb 02 12:24:15 crc kubenswrapper[4909]: I0202 12:24:15.315927 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6m6xb" Feb 02 12:24:15 crc kubenswrapper[4909]: I0202 12:24:15.315943 4909 scope.go:117] "RemoveContainer" containerID="c2ea8f86adafa7d47a2220d66ecffbff24ea32d44bbf8677aa19f2d849383404" Feb 02 12:24:15 crc kubenswrapper[4909]: I0202 12:24:15.315932 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6m6xb" event={"ID":"2acc149b-7c91-45d1-8be2-ae184eedbb67","Type":"ContainerDied","Data":"b0b72714f9426c757fbe787e434f7aacad67dc11a89ae7b3d739f78fa6bf6345"} Feb 02 12:24:15 crc kubenswrapper[4909]: I0202 12:24:15.320849 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2acc149b-7c91-45d1-8be2-ae184eedbb67-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 12:24:15 crc kubenswrapper[4909]: I0202 12:24:15.320876 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2acc149b-7c91-45d1-8be2-ae184eedbb67-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 12:24:15 crc kubenswrapper[4909]: I0202 12:24:15.320890 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lv89\" (UniqueName: \"kubernetes.io/projected/2acc149b-7c91-45d1-8be2-ae184eedbb67-kube-api-access-8lv89\") on node \"crc\" DevicePath \"\"" Feb 02 12:24:15 crc kubenswrapper[4909]: I0202 12:24:15.339670 4909 scope.go:117] "RemoveContainer" containerID="28ae5d1dccb9a79899c91ace7d73599abbcbb3b0f9062ec61c6bd791813bf435" Feb 02 12:24:15 crc kubenswrapper[4909]: I0202 12:24:15.349377 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6m6xb"] Feb 02 12:24:15 crc kubenswrapper[4909]: I0202 12:24:15.360140 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6m6xb"] Feb 02 12:24:15 crc kubenswrapper[4909]: I0202 12:24:15.373479 4909 scope.go:117] 
"RemoveContainer" containerID="343b11165bef17e88cc467a37d5ba778fc92e76bfc469d29800619a660149c7e" Feb 02 12:24:15 crc kubenswrapper[4909]: I0202 12:24:15.408500 4909 scope.go:117] "RemoveContainer" containerID="c2ea8f86adafa7d47a2220d66ecffbff24ea32d44bbf8677aa19f2d849383404" Feb 02 12:24:15 crc kubenswrapper[4909]: E0202 12:24:15.409049 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2ea8f86adafa7d47a2220d66ecffbff24ea32d44bbf8677aa19f2d849383404\": container with ID starting with c2ea8f86adafa7d47a2220d66ecffbff24ea32d44bbf8677aa19f2d849383404 not found: ID does not exist" containerID="c2ea8f86adafa7d47a2220d66ecffbff24ea32d44bbf8677aa19f2d849383404" Feb 02 12:24:15 crc kubenswrapper[4909]: I0202 12:24:15.409118 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2ea8f86adafa7d47a2220d66ecffbff24ea32d44bbf8677aa19f2d849383404"} err="failed to get container status \"c2ea8f86adafa7d47a2220d66ecffbff24ea32d44bbf8677aa19f2d849383404\": rpc error: code = NotFound desc = could not find container \"c2ea8f86adafa7d47a2220d66ecffbff24ea32d44bbf8677aa19f2d849383404\": container with ID starting with c2ea8f86adafa7d47a2220d66ecffbff24ea32d44bbf8677aa19f2d849383404 not found: ID does not exist" Feb 02 12:24:15 crc kubenswrapper[4909]: I0202 12:24:15.409151 4909 scope.go:117] "RemoveContainer" containerID="28ae5d1dccb9a79899c91ace7d73599abbcbb3b0f9062ec61c6bd791813bf435" Feb 02 12:24:15 crc kubenswrapper[4909]: E0202 12:24:15.409522 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28ae5d1dccb9a79899c91ace7d73599abbcbb3b0f9062ec61c6bd791813bf435\": container with ID starting with 28ae5d1dccb9a79899c91ace7d73599abbcbb3b0f9062ec61c6bd791813bf435 not found: ID does not exist" containerID="28ae5d1dccb9a79899c91ace7d73599abbcbb3b0f9062ec61c6bd791813bf435" Feb 02 12:24:15 crc 
kubenswrapper[4909]: I0202 12:24:15.409565 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28ae5d1dccb9a79899c91ace7d73599abbcbb3b0f9062ec61c6bd791813bf435"} err="failed to get container status \"28ae5d1dccb9a79899c91ace7d73599abbcbb3b0f9062ec61c6bd791813bf435\": rpc error: code = NotFound desc = could not find container \"28ae5d1dccb9a79899c91ace7d73599abbcbb3b0f9062ec61c6bd791813bf435\": container with ID starting with 28ae5d1dccb9a79899c91ace7d73599abbcbb3b0f9062ec61c6bd791813bf435 not found: ID does not exist" Feb 02 12:24:15 crc kubenswrapper[4909]: I0202 12:24:15.409603 4909 scope.go:117] "RemoveContainer" containerID="343b11165bef17e88cc467a37d5ba778fc92e76bfc469d29800619a660149c7e" Feb 02 12:24:15 crc kubenswrapper[4909]: E0202 12:24:15.410152 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"343b11165bef17e88cc467a37d5ba778fc92e76bfc469d29800619a660149c7e\": container with ID starting with 343b11165bef17e88cc467a37d5ba778fc92e76bfc469d29800619a660149c7e not found: ID does not exist" containerID="343b11165bef17e88cc467a37d5ba778fc92e76bfc469d29800619a660149c7e" Feb 02 12:24:15 crc kubenswrapper[4909]: I0202 12:24:15.410184 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"343b11165bef17e88cc467a37d5ba778fc92e76bfc469d29800619a660149c7e"} err="failed to get container status \"343b11165bef17e88cc467a37d5ba778fc92e76bfc469d29800619a660149c7e\": rpc error: code = NotFound desc = could not find container \"343b11165bef17e88cc467a37d5ba778fc92e76bfc469d29800619a660149c7e\": container with ID starting with 343b11165bef17e88cc467a37d5ba778fc92e76bfc469d29800619a660149c7e not found: ID does not exist" Feb 02 12:24:17 crc kubenswrapper[4909]: I0202 12:24:17.030527 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2acc149b-7c91-45d1-8be2-ae184eedbb67" 
path="/var/lib/kubelet/pods/2acc149b-7c91-45d1-8be2-ae184eedbb67/volumes" Feb 02 12:24:49 crc kubenswrapper[4909]: I0202 12:24:49.511630 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:24:49 crc kubenswrapper[4909]: I0202 12:24:49.512203 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:25:19 crc kubenswrapper[4909]: I0202 12:25:19.510912 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:25:19 crc kubenswrapper[4909]: I0202 12:25:19.512110 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:25:49 crc kubenswrapper[4909]: I0202 12:25:49.511392 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:25:49 crc kubenswrapper[4909]: I0202 12:25:49.512041 4909 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:25:49 crc kubenswrapper[4909]: I0202 12:25:49.512083 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 12:25:49 crc kubenswrapper[4909]: I0202 12:25:49.512914 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1a6781067b416e92ff82faf52056033ec4a8268b572f99a2e337453f09d68263"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 12:25:49 crc kubenswrapper[4909]: I0202 12:25:49.512976 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://1a6781067b416e92ff82faf52056033ec4a8268b572f99a2e337453f09d68263" gracePeriod=600 Feb 02 12:25:50 crc kubenswrapper[4909]: I0202 12:25:50.312595 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="1a6781067b416e92ff82faf52056033ec4a8268b572f99a2e337453f09d68263" exitCode=0 Feb 02 12:25:50 crc kubenswrapper[4909]: I0202 12:25:50.312704 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"1a6781067b416e92ff82faf52056033ec4a8268b572f99a2e337453f09d68263"} Feb 02 12:25:50 crc kubenswrapper[4909]: I0202 
12:25:50.312982 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba"} Feb 02 12:25:50 crc kubenswrapper[4909]: I0202 12:25:50.313019 4909 scope.go:117] "RemoveContainer" containerID="442d50396a285abe1a6d9ec05956a98d2d1b18720018b06bf8a5c5acc33f72d0" Feb 02 12:26:20 crc kubenswrapper[4909]: I0202 12:26:20.041560 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-2ac2-account-create-update-944cv"] Feb 02 12:26:20 crc kubenswrapper[4909]: I0202 12:26:20.051156 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-2ac2-account-create-update-944cv"] Feb 02 12:26:21 crc kubenswrapper[4909]: I0202 12:26:21.032027 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55c55dfe-a573-4cfd-8d73-cce2d262e7ff" path="/var/lib/kubelet/pods/55c55dfe-a573-4cfd-8d73-cce2d262e7ff/volumes" Feb 02 12:26:21 crc kubenswrapper[4909]: I0202 12:26:21.033122 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-f2xgf"] Feb 02 12:26:21 crc kubenswrapper[4909]: I0202 12:26:21.040886 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-f2xgf"] Feb 02 12:26:23 crc kubenswrapper[4909]: I0202 12:26:23.037228 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8eb4f3a-633d-49d9-bb32-6be5b029f5cf" path="/var/lib/kubelet/pods/a8eb4f3a-633d-49d9-bb32-6be5b029f5cf/volumes" Feb 02 12:26:31 crc kubenswrapper[4909]: I0202 12:26:31.464629 4909 scope.go:117] "RemoveContainer" containerID="4960d871dac71e24fd970d12f598741a6ccef16d2599222031692a97570b19bf" Feb 02 12:26:31 crc kubenswrapper[4909]: I0202 12:26:31.499074 4909 scope.go:117] "RemoveContainer" containerID="9cc8f86b911fd76be3c2301f273a7400c5629a8d2d974d69ad961da19709b1ab" Feb 02 
12:26:33 crc kubenswrapper[4909]: I0202 12:26:33.031576 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-j6khp"] Feb 02 12:26:33 crc kubenswrapper[4909]: I0202 12:26:33.043201 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-j6khp"] Feb 02 12:26:35 crc kubenswrapper[4909]: I0202 12:26:35.028319 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b642a8d-95a4-4129-8a23-81739ace58d8" path="/var/lib/kubelet/pods/9b642a8d-95a4-4129-8a23-81739ace58d8/volumes" Feb 02 12:27:31 crc kubenswrapper[4909]: I0202 12:27:31.607142 4909 scope.go:117] "RemoveContainer" containerID="73c688761d96589b2d874c1b8568bea569e4f43517e6fe0e6776372ade0924a1" Feb 02 12:27:49 crc kubenswrapper[4909]: I0202 12:27:49.511269 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:27:49 crc kubenswrapper[4909]: I0202 12:27:49.511844 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:28:19 crc kubenswrapper[4909]: I0202 12:28:19.511547 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:28:19 crc kubenswrapper[4909]: I0202 12:28:19.512484 4909 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:28:49 crc kubenswrapper[4909]: I0202 12:28:49.511676 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:28:49 crc kubenswrapper[4909]: I0202 12:28:49.512155 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:28:49 crc kubenswrapper[4909]: I0202 12:28:49.512202 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 12:28:49 crc kubenswrapper[4909]: I0202 12:28:49.512964 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 12:28:49 crc kubenswrapper[4909]: I0202 12:28:49.513024 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" 
containerID="cri-o://627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba" gracePeriod=600 Feb 02 12:28:49 crc kubenswrapper[4909]: E0202 12:28:49.653877 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:28:49 crc kubenswrapper[4909]: I0202 12:28:49.969016 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba" exitCode=0 Feb 02 12:28:49 crc kubenswrapper[4909]: I0202 12:28:49.969093 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba"} Feb 02 12:28:49 crc kubenswrapper[4909]: I0202 12:28:49.969368 4909 scope.go:117] "RemoveContainer" containerID="1a6781067b416e92ff82faf52056033ec4a8268b572f99a2e337453f09d68263" Feb 02 12:28:49 crc kubenswrapper[4909]: I0202 12:28:49.970094 4909 scope.go:117] "RemoveContainer" containerID="627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba" Feb 02 12:28:49 crc kubenswrapper[4909]: E0202 12:28:49.970454 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" 
podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:29:03 crc kubenswrapper[4909]: I0202 12:29:03.016579 4909 scope.go:117] "RemoveContainer" containerID="627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba" Feb 02 12:29:03 crc kubenswrapper[4909]: E0202 12:29:03.017261 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:29:18 crc kubenswrapper[4909]: I0202 12:29:18.016416 4909 scope.go:117] "RemoveContainer" containerID="627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba" Feb 02 12:29:18 crc kubenswrapper[4909]: E0202 12:29:18.017114 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:29:32 crc kubenswrapper[4909]: I0202 12:29:32.016495 4909 scope.go:117] "RemoveContainer" containerID="627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba" Feb 02 12:29:32 crc kubenswrapper[4909]: E0202 12:29:32.017375 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:29:45 crc kubenswrapper[4909]: I0202 12:29:45.022449 4909 scope.go:117] "RemoveContainer" containerID="627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba" Feb 02 12:29:45 crc kubenswrapper[4909]: E0202 12:29:45.023310 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:30:00 crc kubenswrapper[4909]: I0202 12:30:00.017199 4909 scope.go:117] "RemoveContainer" containerID="627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba" Feb 02 12:30:00 crc kubenswrapper[4909]: E0202 12:30:00.017969 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:30:00 crc kubenswrapper[4909]: I0202 12:30:00.160555 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500590-76fp5"] Feb 02 12:30:00 crc kubenswrapper[4909]: E0202 12:30:00.161603 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2acc149b-7c91-45d1-8be2-ae184eedbb67" containerName="extract-utilities" Feb 02 12:30:00 crc kubenswrapper[4909]: I0202 12:30:00.161723 4909 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2acc149b-7c91-45d1-8be2-ae184eedbb67" containerName="extract-utilities" Feb 02 12:30:00 crc kubenswrapper[4909]: E0202 12:30:00.161799 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2acc149b-7c91-45d1-8be2-ae184eedbb67" containerName="extract-content" Feb 02 12:30:00 crc kubenswrapper[4909]: I0202 12:30:00.161887 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2acc149b-7c91-45d1-8be2-ae184eedbb67" containerName="extract-content" Feb 02 12:30:00 crc kubenswrapper[4909]: E0202 12:30:00.162013 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adb80dc0-c89e-4930-88b4-1077e301a731" containerName="registry-server" Feb 02 12:30:00 crc kubenswrapper[4909]: I0202 12:30:00.162086 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="adb80dc0-c89e-4930-88b4-1077e301a731" containerName="registry-server" Feb 02 12:30:00 crc kubenswrapper[4909]: E0202 12:30:00.162158 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2acc149b-7c91-45d1-8be2-ae184eedbb67" containerName="registry-server" Feb 02 12:30:00 crc kubenswrapper[4909]: I0202 12:30:00.162217 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2acc149b-7c91-45d1-8be2-ae184eedbb67" containerName="registry-server" Feb 02 12:30:00 crc kubenswrapper[4909]: E0202 12:30:00.162371 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adb80dc0-c89e-4930-88b4-1077e301a731" containerName="extract-content" Feb 02 12:30:00 crc kubenswrapper[4909]: I0202 12:30:00.162469 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="adb80dc0-c89e-4930-88b4-1077e301a731" containerName="extract-content" Feb 02 12:30:00 crc kubenswrapper[4909]: E0202 12:30:00.162546 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adb80dc0-c89e-4930-88b4-1077e301a731" containerName="extract-utilities" Feb 02 12:30:00 crc kubenswrapper[4909]: I0202 12:30:00.162613 4909 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="adb80dc0-c89e-4930-88b4-1077e301a731" containerName="extract-utilities" Feb 02 12:30:00 crc kubenswrapper[4909]: I0202 12:30:00.162928 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2acc149b-7c91-45d1-8be2-ae184eedbb67" containerName="registry-server" Feb 02 12:30:00 crc kubenswrapper[4909]: I0202 12:30:00.163099 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="adb80dc0-c89e-4930-88b4-1077e301a731" containerName="registry-server" Feb 02 12:30:00 crc kubenswrapper[4909]: I0202 12:30:00.163990 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500590-76fp5" Feb 02 12:30:00 crc kubenswrapper[4909]: I0202 12:30:00.167223 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 02 12:30:00 crc kubenswrapper[4909]: I0202 12:30:00.174437 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 02 12:30:00 crc kubenswrapper[4909]: I0202 12:30:00.195884 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500590-76fp5"] Feb 02 12:30:00 crc kubenswrapper[4909]: I0202 12:30:00.255771 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7eeeb7b6-cf1b-4786-acba-8aa056a0c195-secret-volume\") pod \"collect-profiles-29500590-76fp5\" (UID: \"7eeeb7b6-cf1b-4786-acba-8aa056a0c195\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500590-76fp5" Feb 02 12:30:00 crc kubenswrapper[4909]: I0202 12:30:00.255847 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/7eeeb7b6-cf1b-4786-acba-8aa056a0c195-config-volume\") pod \"collect-profiles-29500590-76fp5\" (UID: \"7eeeb7b6-cf1b-4786-acba-8aa056a0c195\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500590-76fp5" Feb 02 12:30:00 crc kubenswrapper[4909]: I0202 12:30:00.256179 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hlrz\" (UniqueName: \"kubernetes.io/projected/7eeeb7b6-cf1b-4786-acba-8aa056a0c195-kube-api-access-7hlrz\") pod \"collect-profiles-29500590-76fp5\" (UID: \"7eeeb7b6-cf1b-4786-acba-8aa056a0c195\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500590-76fp5" Feb 02 12:30:00 crc kubenswrapper[4909]: I0202 12:30:00.359083 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7eeeb7b6-cf1b-4786-acba-8aa056a0c195-secret-volume\") pod \"collect-profiles-29500590-76fp5\" (UID: \"7eeeb7b6-cf1b-4786-acba-8aa056a0c195\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500590-76fp5" Feb 02 12:30:00 crc kubenswrapper[4909]: I0202 12:30:00.359157 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7eeeb7b6-cf1b-4786-acba-8aa056a0c195-config-volume\") pod \"collect-profiles-29500590-76fp5\" (UID: \"7eeeb7b6-cf1b-4786-acba-8aa056a0c195\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500590-76fp5" Feb 02 12:30:00 crc kubenswrapper[4909]: I0202 12:30:00.359270 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hlrz\" (UniqueName: \"kubernetes.io/projected/7eeeb7b6-cf1b-4786-acba-8aa056a0c195-kube-api-access-7hlrz\") pod \"collect-profiles-29500590-76fp5\" (UID: \"7eeeb7b6-cf1b-4786-acba-8aa056a0c195\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500590-76fp5" Feb 02 12:30:00 crc 
kubenswrapper[4909]: I0202 12:30:00.360201 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7eeeb7b6-cf1b-4786-acba-8aa056a0c195-config-volume\") pod \"collect-profiles-29500590-76fp5\" (UID: \"7eeeb7b6-cf1b-4786-acba-8aa056a0c195\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500590-76fp5" Feb 02 12:30:00 crc kubenswrapper[4909]: I0202 12:30:00.365464 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7eeeb7b6-cf1b-4786-acba-8aa056a0c195-secret-volume\") pod \"collect-profiles-29500590-76fp5\" (UID: \"7eeeb7b6-cf1b-4786-acba-8aa056a0c195\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500590-76fp5" Feb 02 12:30:00 crc kubenswrapper[4909]: I0202 12:30:00.376353 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hlrz\" (UniqueName: \"kubernetes.io/projected/7eeeb7b6-cf1b-4786-acba-8aa056a0c195-kube-api-access-7hlrz\") pod \"collect-profiles-29500590-76fp5\" (UID: \"7eeeb7b6-cf1b-4786-acba-8aa056a0c195\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500590-76fp5" Feb 02 12:30:00 crc kubenswrapper[4909]: I0202 12:30:00.489563 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500590-76fp5" Feb 02 12:30:01 crc kubenswrapper[4909]: I0202 12:30:01.029940 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500590-76fp5"] Feb 02 12:30:01 crc kubenswrapper[4909]: I0202 12:30:01.650130 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500590-76fp5" event={"ID":"7eeeb7b6-cf1b-4786-acba-8aa056a0c195","Type":"ContainerStarted","Data":"8134a912773b43482bb43d8388b015c9559f5f59c3aadd8a86108c743d0011b5"} Feb 02 12:30:01 crc kubenswrapper[4909]: I0202 12:30:01.650685 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500590-76fp5" event={"ID":"7eeeb7b6-cf1b-4786-acba-8aa056a0c195","Type":"ContainerStarted","Data":"204c3b69f1c8fa7d35a45f852f55b01a9962225e6611c5fcb92e7dd28c6bbdb8"} Feb 02 12:30:01 crc kubenswrapper[4909]: I0202 12:30:01.673001 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29500590-76fp5" podStartSLOduration=1.672979458 podStartE2EDuration="1.672979458s" podCreationTimestamp="2026-02-02 12:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:30:01.664349523 +0000 UTC m=+7127.410450258" watchObservedRunningTime="2026-02-02 12:30:01.672979458 +0000 UTC m=+7127.419080193" Feb 02 12:30:02 crc kubenswrapper[4909]: I0202 12:30:02.662338 4909 generic.go:334] "Generic (PLEG): container finished" podID="7eeeb7b6-cf1b-4786-acba-8aa056a0c195" containerID="8134a912773b43482bb43d8388b015c9559f5f59c3aadd8a86108c743d0011b5" exitCode=0 Feb 02 12:30:02 crc kubenswrapper[4909]: I0202 12:30:02.662610 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29500590-76fp5" event={"ID":"7eeeb7b6-cf1b-4786-acba-8aa056a0c195","Type":"ContainerDied","Data":"8134a912773b43482bb43d8388b015c9559f5f59c3aadd8a86108c743d0011b5"} Feb 02 12:30:04 crc kubenswrapper[4909]: I0202 12:30:04.023364 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500590-76fp5" Feb 02 12:30:04 crc kubenswrapper[4909]: I0202 12:30:04.139228 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7eeeb7b6-cf1b-4786-acba-8aa056a0c195-secret-volume\") pod \"7eeeb7b6-cf1b-4786-acba-8aa056a0c195\" (UID: \"7eeeb7b6-cf1b-4786-acba-8aa056a0c195\") " Feb 02 12:30:04 crc kubenswrapper[4909]: I0202 12:30:04.139399 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7eeeb7b6-cf1b-4786-acba-8aa056a0c195-config-volume\") pod \"7eeeb7b6-cf1b-4786-acba-8aa056a0c195\" (UID: \"7eeeb7b6-cf1b-4786-acba-8aa056a0c195\") " Feb 02 12:30:04 crc kubenswrapper[4909]: I0202 12:30:04.139483 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hlrz\" (UniqueName: \"kubernetes.io/projected/7eeeb7b6-cf1b-4786-acba-8aa056a0c195-kube-api-access-7hlrz\") pod \"7eeeb7b6-cf1b-4786-acba-8aa056a0c195\" (UID: \"7eeeb7b6-cf1b-4786-acba-8aa056a0c195\") " Feb 02 12:30:04 crc kubenswrapper[4909]: I0202 12:30:04.139879 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7eeeb7b6-cf1b-4786-acba-8aa056a0c195-config-volume" (OuterVolumeSpecName: "config-volume") pod "7eeeb7b6-cf1b-4786-acba-8aa056a0c195" (UID: "7eeeb7b6-cf1b-4786-acba-8aa056a0c195"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:30:04 crc kubenswrapper[4909]: I0202 12:30:04.140177 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7eeeb7b6-cf1b-4786-acba-8aa056a0c195-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 12:30:04 crc kubenswrapper[4909]: I0202 12:30:04.145679 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eeeb7b6-cf1b-4786-acba-8aa056a0c195-kube-api-access-7hlrz" (OuterVolumeSpecName: "kube-api-access-7hlrz") pod "7eeeb7b6-cf1b-4786-acba-8aa056a0c195" (UID: "7eeeb7b6-cf1b-4786-acba-8aa056a0c195"). InnerVolumeSpecName "kube-api-access-7hlrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:30:04 crc kubenswrapper[4909]: I0202 12:30:04.146419 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eeeb7b6-cf1b-4786-acba-8aa056a0c195-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7eeeb7b6-cf1b-4786-acba-8aa056a0c195" (UID: "7eeeb7b6-cf1b-4786-acba-8aa056a0c195"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:30:04 crc kubenswrapper[4909]: I0202 12:30:04.242072 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hlrz\" (UniqueName: \"kubernetes.io/projected/7eeeb7b6-cf1b-4786-acba-8aa056a0c195-kube-api-access-7hlrz\") on node \"crc\" DevicePath \"\"" Feb 02 12:30:04 crc kubenswrapper[4909]: I0202 12:30:04.242101 4909 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7eeeb7b6-cf1b-4786-acba-8aa056a0c195-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 02 12:30:04 crc kubenswrapper[4909]: I0202 12:30:04.683489 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500590-76fp5" event={"ID":"7eeeb7b6-cf1b-4786-acba-8aa056a0c195","Type":"ContainerDied","Data":"204c3b69f1c8fa7d35a45f852f55b01a9962225e6611c5fcb92e7dd28c6bbdb8"} Feb 02 12:30:04 crc kubenswrapper[4909]: I0202 12:30:04.683924 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="204c3b69f1c8fa7d35a45f852f55b01a9962225e6611c5fcb92e7dd28c6bbdb8" Feb 02 12:30:04 crc kubenswrapper[4909]: I0202 12:30:04.683589 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500590-76fp5" Feb 02 12:30:04 crc kubenswrapper[4909]: I0202 12:30:04.758658 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500545-8tm2p"] Feb 02 12:30:04 crc kubenswrapper[4909]: I0202 12:30:04.767441 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500545-8tm2p"] Feb 02 12:30:05 crc kubenswrapper[4909]: I0202 12:30:05.036834 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bae9c104-1064-429b-b67a-2b6fca33d38c" path="/var/lib/kubelet/pods/bae9c104-1064-429b-b67a-2b6fca33d38c/volumes" Feb 02 12:30:15 crc kubenswrapper[4909]: I0202 12:30:15.026890 4909 scope.go:117] "RemoveContainer" containerID="627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba" Feb 02 12:30:15 crc kubenswrapper[4909]: E0202 12:30:15.027887 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:30:26 crc kubenswrapper[4909]: I0202 12:30:26.344499 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-566vs"] Feb 02 12:30:26 crc kubenswrapper[4909]: E0202 12:30:26.346086 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eeeb7b6-cf1b-4786-acba-8aa056a0c195" containerName="collect-profiles" Feb 02 12:30:26 crc kubenswrapper[4909]: I0202 12:30:26.346106 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eeeb7b6-cf1b-4786-acba-8aa056a0c195" containerName="collect-profiles" Feb 02 12:30:26 crc 
kubenswrapper[4909]: I0202 12:30:26.346362 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eeeb7b6-cf1b-4786-acba-8aa056a0c195" containerName="collect-profiles" Feb 02 12:30:26 crc kubenswrapper[4909]: I0202 12:30:26.348731 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-566vs" Feb 02 12:30:26 crc kubenswrapper[4909]: I0202 12:30:26.361049 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-566vs"] Feb 02 12:30:26 crc kubenswrapper[4909]: I0202 12:30:26.422660 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ca2bba8-6828-4435-b2ee-09063f169026-catalog-content\") pod \"redhat-operators-566vs\" (UID: \"5ca2bba8-6828-4435-b2ee-09063f169026\") " pod="openshift-marketplace/redhat-operators-566vs" Feb 02 12:30:26 crc kubenswrapper[4909]: I0202 12:30:26.423046 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c2lg\" (UniqueName: \"kubernetes.io/projected/5ca2bba8-6828-4435-b2ee-09063f169026-kube-api-access-8c2lg\") pod \"redhat-operators-566vs\" (UID: \"5ca2bba8-6828-4435-b2ee-09063f169026\") " pod="openshift-marketplace/redhat-operators-566vs" Feb 02 12:30:26 crc kubenswrapper[4909]: I0202 12:30:26.423091 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ca2bba8-6828-4435-b2ee-09063f169026-utilities\") pod \"redhat-operators-566vs\" (UID: \"5ca2bba8-6828-4435-b2ee-09063f169026\") " pod="openshift-marketplace/redhat-operators-566vs" Feb 02 12:30:26 crc kubenswrapper[4909]: I0202 12:30:26.526461 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5ca2bba8-6828-4435-b2ee-09063f169026-catalog-content\") pod \"redhat-operators-566vs\" (UID: \"5ca2bba8-6828-4435-b2ee-09063f169026\") " pod="openshift-marketplace/redhat-operators-566vs" Feb 02 12:30:26 crc kubenswrapper[4909]: I0202 12:30:26.526527 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c2lg\" (UniqueName: \"kubernetes.io/projected/5ca2bba8-6828-4435-b2ee-09063f169026-kube-api-access-8c2lg\") pod \"redhat-operators-566vs\" (UID: \"5ca2bba8-6828-4435-b2ee-09063f169026\") " pod="openshift-marketplace/redhat-operators-566vs" Feb 02 12:30:26 crc kubenswrapper[4909]: I0202 12:30:26.526569 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ca2bba8-6828-4435-b2ee-09063f169026-utilities\") pod \"redhat-operators-566vs\" (UID: \"5ca2bba8-6828-4435-b2ee-09063f169026\") " pod="openshift-marketplace/redhat-operators-566vs" Feb 02 12:30:26 crc kubenswrapper[4909]: I0202 12:30:26.527002 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ca2bba8-6828-4435-b2ee-09063f169026-catalog-content\") pod \"redhat-operators-566vs\" (UID: \"5ca2bba8-6828-4435-b2ee-09063f169026\") " pod="openshift-marketplace/redhat-operators-566vs" Feb 02 12:30:26 crc kubenswrapper[4909]: I0202 12:30:26.527228 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ca2bba8-6828-4435-b2ee-09063f169026-utilities\") pod \"redhat-operators-566vs\" (UID: \"5ca2bba8-6828-4435-b2ee-09063f169026\") " pod="openshift-marketplace/redhat-operators-566vs" Feb 02 12:30:26 crc kubenswrapper[4909]: I0202 12:30:26.554563 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c2lg\" (UniqueName: 
\"kubernetes.io/projected/5ca2bba8-6828-4435-b2ee-09063f169026-kube-api-access-8c2lg\") pod \"redhat-operators-566vs\" (UID: \"5ca2bba8-6828-4435-b2ee-09063f169026\") " pod="openshift-marketplace/redhat-operators-566vs" Feb 02 12:30:26 crc kubenswrapper[4909]: I0202 12:30:26.681704 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-566vs" Feb 02 12:30:27 crc kubenswrapper[4909]: I0202 12:30:27.150447 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-566vs"] Feb 02 12:30:27 crc kubenswrapper[4909]: I0202 12:30:27.919237 4909 generic.go:334] "Generic (PLEG): container finished" podID="5ca2bba8-6828-4435-b2ee-09063f169026" containerID="37168baf527239e0745446b4ce62479e97b2428a070e88d7d9008ceadb23bfec" exitCode=0 Feb 02 12:30:27 crc kubenswrapper[4909]: I0202 12:30:27.919290 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-566vs" event={"ID":"5ca2bba8-6828-4435-b2ee-09063f169026","Type":"ContainerDied","Data":"37168baf527239e0745446b4ce62479e97b2428a070e88d7d9008ceadb23bfec"} Feb 02 12:30:27 crc kubenswrapper[4909]: I0202 12:30:27.919531 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-566vs" event={"ID":"5ca2bba8-6828-4435-b2ee-09063f169026","Type":"ContainerStarted","Data":"20c94a5e6ff6447196df892207dbf142edd8fd4c602981266de77883f69ec14d"} Feb 02 12:30:27 crc kubenswrapper[4909]: I0202 12:30:27.921375 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 12:30:28 crc kubenswrapper[4909]: I0202 12:30:28.930208 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-566vs" event={"ID":"5ca2bba8-6828-4435-b2ee-09063f169026","Type":"ContainerStarted","Data":"086b5fcc58dad6efe3422036112ff0fd201ad792ace0b8f6feb6bd7307d931f3"} Feb 02 12:30:29 crc 
kubenswrapper[4909]: I0202 12:30:29.016735 4909 scope.go:117] "RemoveContainer" containerID="627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba" Feb 02 12:30:29 crc kubenswrapper[4909]: E0202 12:30:29.017050 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:30:31 crc kubenswrapper[4909]: I0202 12:30:31.710952 4909 scope.go:117] "RemoveContainer" containerID="dcc5795e21b5e3c7234f9ac6df8bd69227be7ed5bf67eb183dda28cde0d54e14" Feb 02 12:30:33 crc kubenswrapper[4909]: I0202 12:30:33.978444 4909 generic.go:334] "Generic (PLEG): container finished" podID="5ca2bba8-6828-4435-b2ee-09063f169026" containerID="086b5fcc58dad6efe3422036112ff0fd201ad792ace0b8f6feb6bd7307d931f3" exitCode=0 Feb 02 12:30:33 crc kubenswrapper[4909]: I0202 12:30:33.978533 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-566vs" event={"ID":"5ca2bba8-6828-4435-b2ee-09063f169026","Type":"ContainerDied","Data":"086b5fcc58dad6efe3422036112ff0fd201ad792ace0b8f6feb6bd7307d931f3"} Feb 02 12:30:34 crc kubenswrapper[4909]: I0202 12:30:34.992131 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-566vs" event={"ID":"5ca2bba8-6828-4435-b2ee-09063f169026","Type":"ContainerStarted","Data":"e6d41d65eb6d94c075bd4ec0640279ccf31452551c874033075717b7d8226341"} Feb 02 12:30:35 crc kubenswrapper[4909]: I0202 12:30:35.032650 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-566vs" podStartSLOduration=2.597719506 podStartE2EDuration="9.032620118s" 
podCreationTimestamp="2026-02-02 12:30:26 +0000 UTC" firstStartedPulling="2026-02-02 12:30:27.920966426 +0000 UTC m=+7153.667067161" lastFinishedPulling="2026-02-02 12:30:34.355867038 +0000 UTC m=+7160.101967773" observedRunningTime="2026-02-02 12:30:35.023641143 +0000 UTC m=+7160.769741878" watchObservedRunningTime="2026-02-02 12:30:35.032620118 +0000 UTC m=+7160.778720853" Feb 02 12:30:36 crc kubenswrapper[4909]: I0202 12:30:36.682707 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-566vs" Feb 02 12:30:36 crc kubenswrapper[4909]: I0202 12:30:36.683432 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-566vs" Feb 02 12:30:37 crc kubenswrapper[4909]: I0202 12:30:37.756398 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-566vs" podUID="5ca2bba8-6828-4435-b2ee-09063f169026" containerName="registry-server" probeResult="failure" output=< Feb 02 12:30:37 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Feb 02 12:30:37 crc kubenswrapper[4909]: > Feb 02 12:30:41 crc kubenswrapper[4909]: I0202 12:30:41.017121 4909 scope.go:117] "RemoveContainer" containerID="627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba" Feb 02 12:30:41 crc kubenswrapper[4909]: E0202 12:30:41.017772 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:30:44 crc kubenswrapper[4909]: I0202 12:30:44.082368 4909 generic.go:334] "Generic (PLEG): container finished" 
podID="f8db189d-64bd-4a95-93de-3ddcb680c6b0" containerID="83c8557280e43b9707cd6115dfa90f608011fd0cd1322fc421d9e991f8f70f9b" exitCode=0 Feb 02 12:30:44 crc kubenswrapper[4909]: I0202 12:30:44.082423 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst" event={"ID":"f8db189d-64bd-4a95-93de-3ddcb680c6b0","Type":"ContainerDied","Data":"83c8557280e43b9707cd6115dfa90f608011fd0cd1322fc421d9e991f8f70f9b"} Feb 02 12:30:45 crc kubenswrapper[4909]: I0202 12:30:45.522188 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst" Feb 02 12:30:45 crc kubenswrapper[4909]: I0202 12:30:45.650695 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvbhr\" (UniqueName: \"kubernetes.io/projected/f8db189d-64bd-4a95-93de-3ddcb680c6b0-kube-api-access-wvbhr\") pod \"f8db189d-64bd-4a95-93de-3ddcb680c6b0\" (UID: \"f8db189d-64bd-4a95-93de-3ddcb680c6b0\") " Feb 02 12:30:45 crc kubenswrapper[4909]: I0202 12:30:45.651020 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8db189d-64bd-4a95-93de-3ddcb680c6b0-tripleo-cleanup-combined-ca-bundle\") pod \"f8db189d-64bd-4a95-93de-3ddcb680c6b0\" (UID: \"f8db189d-64bd-4a95-93de-3ddcb680c6b0\") " Feb 02 12:30:45 crc kubenswrapper[4909]: I0202 12:30:45.651083 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8db189d-64bd-4a95-93de-3ddcb680c6b0-inventory\") pod \"f8db189d-64bd-4a95-93de-3ddcb680c6b0\" (UID: \"f8db189d-64bd-4a95-93de-3ddcb680c6b0\") " Feb 02 12:30:45 crc kubenswrapper[4909]: I0202 12:30:45.651112 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/f8db189d-64bd-4a95-93de-3ddcb680c6b0-ssh-key-openstack-cell1\") pod \"f8db189d-64bd-4a95-93de-3ddcb680c6b0\" (UID: \"f8db189d-64bd-4a95-93de-3ddcb680c6b0\") " Feb 02 12:30:45 crc kubenswrapper[4909]: I0202 12:30:45.657433 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8db189d-64bd-4a95-93de-3ddcb680c6b0-kube-api-access-wvbhr" (OuterVolumeSpecName: "kube-api-access-wvbhr") pod "f8db189d-64bd-4a95-93de-3ddcb680c6b0" (UID: "f8db189d-64bd-4a95-93de-3ddcb680c6b0"). InnerVolumeSpecName "kube-api-access-wvbhr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:30:45 crc kubenswrapper[4909]: I0202 12:30:45.657493 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8db189d-64bd-4a95-93de-3ddcb680c6b0-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "f8db189d-64bd-4a95-93de-3ddcb680c6b0" (UID: "f8db189d-64bd-4a95-93de-3ddcb680c6b0"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:30:45 crc kubenswrapper[4909]: I0202 12:30:45.689689 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8db189d-64bd-4a95-93de-3ddcb680c6b0-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "f8db189d-64bd-4a95-93de-3ddcb680c6b0" (UID: "f8db189d-64bd-4a95-93de-3ddcb680c6b0"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:30:45 crc kubenswrapper[4909]: I0202 12:30:45.697682 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8db189d-64bd-4a95-93de-3ddcb680c6b0-inventory" (OuterVolumeSpecName: "inventory") pod "f8db189d-64bd-4a95-93de-3ddcb680c6b0" (UID: "f8db189d-64bd-4a95-93de-3ddcb680c6b0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:30:45 crc kubenswrapper[4909]: I0202 12:30:45.753861 4909 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8db189d-64bd-4a95-93de-3ddcb680c6b0-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:30:45 crc kubenswrapper[4909]: I0202 12:30:45.753894 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8db189d-64bd-4a95-93de-3ddcb680c6b0-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 12:30:45 crc kubenswrapper[4909]: I0202 12:30:45.753905 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f8db189d-64bd-4a95-93de-3ddcb680c6b0-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 02 12:30:45 crc kubenswrapper[4909]: I0202 12:30:45.753914 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvbhr\" (UniqueName: \"kubernetes.io/projected/f8db189d-64bd-4a95-93de-3ddcb680c6b0-kube-api-access-wvbhr\") on node \"crc\" DevicePath \"\"" Feb 02 12:30:46 crc kubenswrapper[4909]: I0202 12:30:46.101595 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst" event={"ID":"f8db189d-64bd-4a95-93de-3ddcb680c6b0","Type":"ContainerDied","Data":"a70e1c9fb088b4aba2ea25be060bb664960bb91b84f4e7495d367ec309768f38"} Feb 02 12:30:46 crc kubenswrapper[4909]: I0202 12:30:46.101656 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a70e1c9fb088b4aba2ea25be060bb664960bb91b84f4e7495d367ec309768f38" Feb 02 12:30:46 crc kubenswrapper[4909]: I0202 12:30:46.101678 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst" Feb 02 12:30:47 crc kubenswrapper[4909]: I0202 12:30:47.735271 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-566vs" podUID="5ca2bba8-6828-4435-b2ee-09063f169026" containerName="registry-server" probeResult="failure" output=< Feb 02 12:30:47 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Feb 02 12:30:47 crc kubenswrapper[4909]: > Feb 02 12:30:53 crc kubenswrapper[4909]: I0202 12:30:53.233552 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-bcjvg"] Feb 02 12:30:53 crc kubenswrapper[4909]: E0202 12:30:53.235890 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8db189d-64bd-4a95-93de-3ddcb680c6b0" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Feb 02 12:30:53 crc kubenswrapper[4909]: I0202 12:30:53.235943 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8db189d-64bd-4a95-93de-3ddcb680c6b0" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Feb 02 12:30:53 crc kubenswrapper[4909]: I0202 12:30:53.236840 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8db189d-64bd-4a95-93de-3ddcb680c6b0" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Feb 02 12:30:53 crc kubenswrapper[4909]: I0202 12:30:53.252652 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-bcjvg" Feb 02 12:30:53 crc kubenswrapper[4909]: I0202 12:30:53.258852 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 12:30:53 crc kubenswrapper[4909]: I0202 12:30:53.260743 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 02 12:30:53 crc kubenswrapper[4909]: I0202 12:30:53.261061 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-lfs7z" Feb 02 12:30:53 crc kubenswrapper[4909]: I0202 12:30:53.268653 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-bcjvg"] Feb 02 12:30:53 crc kubenswrapper[4909]: I0202 12:30:53.269251 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 02 12:30:53 crc kubenswrapper[4909]: I0202 12:30:53.324770 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5aaf8cb2-7efd-4f8b-8348-ee983dbe284d-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-bcjvg\" (UID: \"5aaf8cb2-7efd-4f8b-8348-ee983dbe284d\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcjvg" Feb 02 12:30:53 crc kubenswrapper[4909]: I0202 12:30:53.325323 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cndgv\" (UniqueName: \"kubernetes.io/projected/5aaf8cb2-7efd-4f8b-8348-ee983dbe284d-kube-api-access-cndgv\") pod \"bootstrap-openstack-openstack-cell1-bcjvg\" (UID: \"5aaf8cb2-7efd-4f8b-8348-ee983dbe284d\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcjvg" Feb 02 12:30:53 crc kubenswrapper[4909]: I0202 12:30:53.325375 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5aaf8cb2-7efd-4f8b-8348-ee983dbe284d-inventory\") pod \"bootstrap-openstack-openstack-cell1-bcjvg\" (UID: \"5aaf8cb2-7efd-4f8b-8348-ee983dbe284d\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcjvg" Feb 02 12:30:53 crc kubenswrapper[4909]: I0202 12:30:53.325456 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aaf8cb2-7efd-4f8b-8348-ee983dbe284d-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-bcjvg\" (UID: \"5aaf8cb2-7efd-4f8b-8348-ee983dbe284d\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcjvg" Feb 02 12:30:53 crc kubenswrapper[4909]: I0202 12:30:53.427964 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cndgv\" (UniqueName: \"kubernetes.io/projected/5aaf8cb2-7efd-4f8b-8348-ee983dbe284d-kube-api-access-cndgv\") pod \"bootstrap-openstack-openstack-cell1-bcjvg\" (UID: \"5aaf8cb2-7efd-4f8b-8348-ee983dbe284d\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcjvg" Feb 02 12:30:53 crc kubenswrapper[4909]: I0202 12:30:53.428046 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5aaf8cb2-7efd-4f8b-8348-ee983dbe284d-inventory\") pod \"bootstrap-openstack-openstack-cell1-bcjvg\" (UID: \"5aaf8cb2-7efd-4f8b-8348-ee983dbe284d\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcjvg" Feb 02 12:30:53 crc kubenswrapper[4909]: I0202 12:30:53.428157 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aaf8cb2-7efd-4f8b-8348-ee983dbe284d-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-bcjvg\" (UID: \"5aaf8cb2-7efd-4f8b-8348-ee983dbe284d\") " 
pod="openstack/bootstrap-openstack-openstack-cell1-bcjvg" Feb 02 12:30:53 crc kubenswrapper[4909]: I0202 12:30:53.428234 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5aaf8cb2-7efd-4f8b-8348-ee983dbe284d-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-bcjvg\" (UID: \"5aaf8cb2-7efd-4f8b-8348-ee983dbe284d\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcjvg" Feb 02 12:30:53 crc kubenswrapper[4909]: I0202 12:30:53.433797 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5aaf8cb2-7efd-4f8b-8348-ee983dbe284d-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-bcjvg\" (UID: \"5aaf8cb2-7efd-4f8b-8348-ee983dbe284d\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcjvg" Feb 02 12:30:53 crc kubenswrapper[4909]: I0202 12:30:53.434151 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5aaf8cb2-7efd-4f8b-8348-ee983dbe284d-inventory\") pod \"bootstrap-openstack-openstack-cell1-bcjvg\" (UID: \"5aaf8cb2-7efd-4f8b-8348-ee983dbe284d\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcjvg" Feb 02 12:30:53 crc kubenswrapper[4909]: I0202 12:30:53.450477 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aaf8cb2-7efd-4f8b-8348-ee983dbe284d-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-bcjvg\" (UID: \"5aaf8cb2-7efd-4f8b-8348-ee983dbe284d\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcjvg" Feb 02 12:30:53 crc kubenswrapper[4909]: I0202 12:30:53.457045 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cndgv\" (UniqueName: \"kubernetes.io/projected/5aaf8cb2-7efd-4f8b-8348-ee983dbe284d-kube-api-access-cndgv\") 
pod \"bootstrap-openstack-openstack-cell1-bcjvg\" (UID: \"5aaf8cb2-7efd-4f8b-8348-ee983dbe284d\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcjvg" Feb 02 12:30:53 crc kubenswrapper[4909]: I0202 12:30:53.602825 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-bcjvg" Feb 02 12:30:54 crc kubenswrapper[4909]: I0202 12:30:54.192507 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-bcjvg"] Feb 02 12:30:55 crc kubenswrapper[4909]: I0202 12:30:55.204945 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-bcjvg" event={"ID":"5aaf8cb2-7efd-4f8b-8348-ee983dbe284d","Type":"ContainerStarted","Data":"e71bf601c5fab7cf40db9fc036654afe2d518d7e320d0079ee79227a621bb0f2"} Feb 02 12:30:55 crc kubenswrapper[4909]: I0202 12:30:55.205244 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-bcjvg" event={"ID":"5aaf8cb2-7efd-4f8b-8348-ee983dbe284d","Type":"ContainerStarted","Data":"b91026122bd7cdc5b17f844d19a3e3df57c5d5e854e96750c8d32b9e5b17de16"} Feb 02 12:30:55 crc kubenswrapper[4909]: I0202 12:30:55.228100 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-bcjvg" podStartSLOduration=1.8228381599999999 podStartE2EDuration="2.228083234s" podCreationTimestamp="2026-02-02 12:30:53 +0000 UTC" firstStartedPulling="2026-02-02 12:30:54.210981266 +0000 UTC m=+7179.957082001" lastFinishedPulling="2026-02-02 12:30:54.61622634 +0000 UTC m=+7180.362327075" observedRunningTime="2026-02-02 12:30:55.221028873 +0000 UTC m=+7180.967129608" watchObservedRunningTime="2026-02-02 12:30:55.228083234 +0000 UTC m=+7180.974183969" Feb 02 12:30:56 crc kubenswrapper[4909]: I0202 12:30:56.016954 4909 scope.go:117] "RemoveContainer" 
containerID="627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba" Feb 02 12:30:56 crc kubenswrapper[4909]: E0202 12:30:56.017509 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:30:56 crc kubenswrapper[4909]: I0202 12:30:56.743646 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-566vs" Feb 02 12:30:56 crc kubenswrapper[4909]: I0202 12:30:56.796916 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-566vs" Feb 02 12:30:57 crc kubenswrapper[4909]: I0202 12:30:57.545537 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-566vs"] Feb 02 12:30:58 crc kubenswrapper[4909]: I0202 12:30:58.236324 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-566vs" podUID="5ca2bba8-6828-4435-b2ee-09063f169026" containerName="registry-server" containerID="cri-o://e6d41d65eb6d94c075bd4ec0640279ccf31452551c874033075717b7d8226341" gracePeriod=2 Feb 02 12:30:58 crc kubenswrapper[4909]: I0202 12:30:58.779096 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-566vs" Feb 02 12:30:58 crc kubenswrapper[4909]: I0202 12:30:58.841773 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ca2bba8-6828-4435-b2ee-09063f169026-utilities\") pod \"5ca2bba8-6828-4435-b2ee-09063f169026\" (UID: \"5ca2bba8-6828-4435-b2ee-09063f169026\") " Feb 02 12:30:58 crc kubenswrapper[4909]: I0202 12:30:58.841943 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c2lg\" (UniqueName: \"kubernetes.io/projected/5ca2bba8-6828-4435-b2ee-09063f169026-kube-api-access-8c2lg\") pod \"5ca2bba8-6828-4435-b2ee-09063f169026\" (UID: \"5ca2bba8-6828-4435-b2ee-09063f169026\") " Feb 02 12:30:58 crc kubenswrapper[4909]: I0202 12:30:58.841985 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ca2bba8-6828-4435-b2ee-09063f169026-catalog-content\") pod \"5ca2bba8-6828-4435-b2ee-09063f169026\" (UID: \"5ca2bba8-6828-4435-b2ee-09063f169026\") " Feb 02 12:30:58 crc kubenswrapper[4909]: I0202 12:30:58.842535 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ca2bba8-6828-4435-b2ee-09063f169026-utilities" (OuterVolumeSpecName: "utilities") pod "5ca2bba8-6828-4435-b2ee-09063f169026" (UID: "5ca2bba8-6828-4435-b2ee-09063f169026"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:30:58 crc kubenswrapper[4909]: I0202 12:30:58.851101 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ca2bba8-6828-4435-b2ee-09063f169026-kube-api-access-8c2lg" (OuterVolumeSpecName: "kube-api-access-8c2lg") pod "5ca2bba8-6828-4435-b2ee-09063f169026" (UID: "5ca2bba8-6828-4435-b2ee-09063f169026"). InnerVolumeSpecName "kube-api-access-8c2lg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:30:58 crc kubenswrapper[4909]: I0202 12:30:58.945137 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ca2bba8-6828-4435-b2ee-09063f169026-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 12:30:58 crc kubenswrapper[4909]: I0202 12:30:58.945169 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c2lg\" (UniqueName: \"kubernetes.io/projected/5ca2bba8-6828-4435-b2ee-09063f169026-kube-api-access-8c2lg\") on node \"crc\" DevicePath \"\"" Feb 02 12:30:58 crc kubenswrapper[4909]: I0202 12:30:58.966718 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ca2bba8-6828-4435-b2ee-09063f169026-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ca2bba8-6828-4435-b2ee-09063f169026" (UID: "5ca2bba8-6828-4435-b2ee-09063f169026"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:30:59 crc kubenswrapper[4909]: I0202 12:30:59.047754 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ca2bba8-6828-4435-b2ee-09063f169026-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 12:30:59 crc kubenswrapper[4909]: I0202 12:30:59.246617 4909 generic.go:334] "Generic (PLEG): container finished" podID="5ca2bba8-6828-4435-b2ee-09063f169026" containerID="e6d41d65eb6d94c075bd4ec0640279ccf31452551c874033075717b7d8226341" exitCode=0 Feb 02 12:30:59 crc kubenswrapper[4909]: I0202 12:30:59.246660 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-566vs" event={"ID":"5ca2bba8-6828-4435-b2ee-09063f169026","Type":"ContainerDied","Data":"e6d41d65eb6d94c075bd4ec0640279ccf31452551c874033075717b7d8226341"} Feb 02 12:30:59 crc kubenswrapper[4909]: I0202 12:30:59.246687 4909 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-566vs" event={"ID":"5ca2bba8-6828-4435-b2ee-09063f169026","Type":"ContainerDied","Data":"20c94a5e6ff6447196df892207dbf142edd8fd4c602981266de77883f69ec14d"} Feb 02 12:30:59 crc kubenswrapper[4909]: I0202 12:30:59.246696 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-566vs" Feb 02 12:30:59 crc kubenswrapper[4909]: I0202 12:30:59.246706 4909 scope.go:117] "RemoveContainer" containerID="e6d41d65eb6d94c075bd4ec0640279ccf31452551c874033075717b7d8226341" Feb 02 12:30:59 crc kubenswrapper[4909]: I0202 12:30:59.275394 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-566vs"] Feb 02 12:30:59 crc kubenswrapper[4909]: I0202 12:30:59.275878 4909 scope.go:117] "RemoveContainer" containerID="086b5fcc58dad6efe3422036112ff0fd201ad792ace0b8f6feb6bd7307d931f3" Feb 02 12:30:59 crc kubenswrapper[4909]: I0202 12:30:59.286802 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-566vs"] Feb 02 12:30:59 crc kubenswrapper[4909]: I0202 12:30:59.298074 4909 scope.go:117] "RemoveContainer" containerID="37168baf527239e0745446b4ce62479e97b2428a070e88d7d9008ceadb23bfec" Feb 02 12:30:59 crc kubenswrapper[4909]: I0202 12:30:59.351598 4909 scope.go:117] "RemoveContainer" containerID="e6d41d65eb6d94c075bd4ec0640279ccf31452551c874033075717b7d8226341" Feb 02 12:30:59 crc kubenswrapper[4909]: E0202 12:30:59.352169 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6d41d65eb6d94c075bd4ec0640279ccf31452551c874033075717b7d8226341\": container with ID starting with e6d41d65eb6d94c075bd4ec0640279ccf31452551c874033075717b7d8226341 not found: ID does not exist" containerID="e6d41d65eb6d94c075bd4ec0640279ccf31452551c874033075717b7d8226341" Feb 02 12:30:59 crc kubenswrapper[4909]: I0202 12:30:59.352212 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6d41d65eb6d94c075bd4ec0640279ccf31452551c874033075717b7d8226341"} err="failed to get container status \"e6d41d65eb6d94c075bd4ec0640279ccf31452551c874033075717b7d8226341\": rpc error: code = NotFound desc = could not find container \"e6d41d65eb6d94c075bd4ec0640279ccf31452551c874033075717b7d8226341\": container with ID starting with e6d41d65eb6d94c075bd4ec0640279ccf31452551c874033075717b7d8226341 not found: ID does not exist" Feb 02 12:30:59 crc kubenswrapper[4909]: I0202 12:30:59.352241 4909 scope.go:117] "RemoveContainer" containerID="086b5fcc58dad6efe3422036112ff0fd201ad792ace0b8f6feb6bd7307d931f3" Feb 02 12:30:59 crc kubenswrapper[4909]: E0202 12:30:59.352605 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"086b5fcc58dad6efe3422036112ff0fd201ad792ace0b8f6feb6bd7307d931f3\": container with ID starting with 086b5fcc58dad6efe3422036112ff0fd201ad792ace0b8f6feb6bd7307d931f3 not found: ID does not exist" containerID="086b5fcc58dad6efe3422036112ff0fd201ad792ace0b8f6feb6bd7307d931f3" Feb 02 12:30:59 crc kubenswrapper[4909]: I0202 12:30:59.352646 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"086b5fcc58dad6efe3422036112ff0fd201ad792ace0b8f6feb6bd7307d931f3"} err="failed to get container status \"086b5fcc58dad6efe3422036112ff0fd201ad792ace0b8f6feb6bd7307d931f3\": rpc error: code = NotFound desc = could not find container \"086b5fcc58dad6efe3422036112ff0fd201ad792ace0b8f6feb6bd7307d931f3\": container with ID starting with 086b5fcc58dad6efe3422036112ff0fd201ad792ace0b8f6feb6bd7307d931f3 not found: ID does not exist" Feb 02 12:30:59 crc kubenswrapper[4909]: I0202 12:30:59.352666 4909 scope.go:117] "RemoveContainer" containerID="37168baf527239e0745446b4ce62479e97b2428a070e88d7d9008ceadb23bfec" Feb 02 12:30:59 crc kubenswrapper[4909]: E0202 
12:30:59.352964 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37168baf527239e0745446b4ce62479e97b2428a070e88d7d9008ceadb23bfec\": container with ID starting with 37168baf527239e0745446b4ce62479e97b2428a070e88d7d9008ceadb23bfec not found: ID does not exist" containerID="37168baf527239e0745446b4ce62479e97b2428a070e88d7d9008ceadb23bfec" Feb 02 12:30:59 crc kubenswrapper[4909]: I0202 12:30:59.352997 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37168baf527239e0745446b4ce62479e97b2428a070e88d7d9008ceadb23bfec"} err="failed to get container status \"37168baf527239e0745446b4ce62479e97b2428a070e88d7d9008ceadb23bfec\": rpc error: code = NotFound desc = could not find container \"37168baf527239e0745446b4ce62479e97b2428a070e88d7d9008ceadb23bfec\": container with ID starting with 37168baf527239e0745446b4ce62479e97b2428a070e88d7d9008ceadb23bfec not found: ID does not exist" Feb 02 12:31:01 crc kubenswrapper[4909]: I0202 12:31:01.027619 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ca2bba8-6828-4435-b2ee-09063f169026" path="/var/lib/kubelet/pods/5ca2bba8-6828-4435-b2ee-09063f169026/volumes" Feb 02 12:31:07 crc kubenswrapper[4909]: I0202 12:31:07.016885 4909 scope.go:117] "RemoveContainer" containerID="627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba" Feb 02 12:31:07 crc kubenswrapper[4909]: E0202 12:31:07.017674 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:31:18 crc kubenswrapper[4909]: I0202 12:31:18.016909 
4909 scope.go:117] "RemoveContainer" containerID="627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba" Feb 02 12:31:18 crc kubenswrapper[4909]: E0202 12:31:18.019109 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:31:33 crc kubenswrapper[4909]: I0202 12:31:33.018167 4909 scope.go:117] "RemoveContainer" containerID="627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba" Feb 02 12:31:33 crc kubenswrapper[4909]: E0202 12:31:33.019962 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:31:45 crc kubenswrapper[4909]: I0202 12:31:45.024422 4909 scope.go:117] "RemoveContainer" containerID="627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba" Feb 02 12:31:45 crc kubenswrapper[4909]: E0202 12:31:45.025186 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:32:00 crc kubenswrapper[4909]: I0202 
12:32:00.016778 4909 scope.go:117] "RemoveContainer" containerID="627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba" Feb 02 12:32:00 crc kubenswrapper[4909]: E0202 12:32:00.017563 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:32:10 crc kubenswrapper[4909]: I0202 12:32:10.904895 4909 scope.go:117] "RemoveContainer" containerID="627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba" Feb 02 12:32:10 crc kubenswrapper[4909]: E0202 12:32:10.905544 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:32:23 crc kubenswrapper[4909]: I0202 12:32:23.017044 4909 scope.go:117] "RemoveContainer" containerID="627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba" Feb 02 12:32:23 crc kubenswrapper[4909]: E0202 12:32:23.018899 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:32:36 crc 
kubenswrapper[4909]: I0202 12:32:36.016307 4909 scope.go:117] "RemoveContainer" containerID="627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba" Feb 02 12:32:36 crc kubenswrapper[4909]: E0202 12:32:36.017137 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:32:49 crc kubenswrapper[4909]: I0202 12:32:49.016521 4909 scope.go:117] "RemoveContainer" containerID="627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba" Feb 02 12:32:49 crc kubenswrapper[4909]: E0202 12:32:49.017406 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:33:02 crc kubenswrapper[4909]: I0202 12:33:02.017716 4909 scope.go:117] "RemoveContainer" containerID="627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba" Feb 02 12:33:02 crc kubenswrapper[4909]: E0202 12:33:02.018524 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 
02 12:33:15 crc kubenswrapper[4909]: I0202 12:33:15.026680 4909 scope.go:117] "RemoveContainer" containerID="627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba" Feb 02 12:33:15 crc kubenswrapper[4909]: E0202 12:33:15.028268 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:33:30 crc kubenswrapper[4909]: I0202 12:33:30.017139 4909 scope.go:117] "RemoveContainer" containerID="627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba" Feb 02 12:33:30 crc kubenswrapper[4909]: E0202 12:33:30.017872 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:33:41 crc kubenswrapper[4909]: I0202 12:33:41.016510 4909 scope.go:117] "RemoveContainer" containerID="627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba" Feb 02 12:33:41 crc kubenswrapper[4909]: E0202 12:33:41.017430 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" 
podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:33:52 crc kubenswrapper[4909]: I0202 12:33:52.017116 4909 scope.go:117] "RemoveContainer" containerID="627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba" Feb 02 12:33:52 crc kubenswrapper[4909]: I0202 12:33:52.839304 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"3c721b1aabb3d968c8ddbe1edde1cd3f18f6d49e271134714d86a9cde069c5c4"} Feb 02 12:33:56 crc kubenswrapper[4909]: I0202 12:33:56.886214 4909 generic.go:334] "Generic (PLEG): container finished" podID="5aaf8cb2-7efd-4f8b-8348-ee983dbe284d" containerID="e71bf601c5fab7cf40db9fc036654afe2d518d7e320d0079ee79227a621bb0f2" exitCode=0 Feb 02 12:33:56 crc kubenswrapper[4909]: I0202 12:33:56.886372 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-bcjvg" event={"ID":"5aaf8cb2-7efd-4f8b-8348-ee983dbe284d","Type":"ContainerDied","Data":"e71bf601c5fab7cf40db9fc036654afe2d518d7e320d0079ee79227a621bb0f2"} Feb 02 12:33:58 crc kubenswrapper[4909]: I0202 12:33:58.291861 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-bcjvg" Feb 02 12:33:58 crc kubenswrapper[4909]: I0202 12:33:58.485230 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5aaf8cb2-7efd-4f8b-8348-ee983dbe284d-ssh-key-openstack-cell1\") pod \"5aaf8cb2-7efd-4f8b-8348-ee983dbe284d\" (UID: \"5aaf8cb2-7efd-4f8b-8348-ee983dbe284d\") " Feb 02 12:33:58 crc kubenswrapper[4909]: I0202 12:33:58.485651 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aaf8cb2-7efd-4f8b-8348-ee983dbe284d-bootstrap-combined-ca-bundle\") pod \"5aaf8cb2-7efd-4f8b-8348-ee983dbe284d\" (UID: \"5aaf8cb2-7efd-4f8b-8348-ee983dbe284d\") " Feb 02 12:33:58 crc kubenswrapper[4909]: I0202 12:33:58.485706 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5aaf8cb2-7efd-4f8b-8348-ee983dbe284d-inventory\") pod \"5aaf8cb2-7efd-4f8b-8348-ee983dbe284d\" (UID: \"5aaf8cb2-7efd-4f8b-8348-ee983dbe284d\") " Feb 02 12:33:58 crc kubenswrapper[4909]: I0202 12:33:58.485751 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cndgv\" (UniqueName: \"kubernetes.io/projected/5aaf8cb2-7efd-4f8b-8348-ee983dbe284d-kube-api-access-cndgv\") pod \"5aaf8cb2-7efd-4f8b-8348-ee983dbe284d\" (UID: \"5aaf8cb2-7efd-4f8b-8348-ee983dbe284d\") " Feb 02 12:33:58 crc kubenswrapper[4909]: I0202 12:33:58.491688 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aaf8cb2-7efd-4f8b-8348-ee983dbe284d-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "5aaf8cb2-7efd-4f8b-8348-ee983dbe284d" (UID: "5aaf8cb2-7efd-4f8b-8348-ee983dbe284d"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:33:58 crc kubenswrapper[4909]: I0202 12:33:58.491764 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aaf8cb2-7efd-4f8b-8348-ee983dbe284d-kube-api-access-cndgv" (OuterVolumeSpecName: "kube-api-access-cndgv") pod "5aaf8cb2-7efd-4f8b-8348-ee983dbe284d" (UID: "5aaf8cb2-7efd-4f8b-8348-ee983dbe284d"). InnerVolumeSpecName "kube-api-access-cndgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:33:58 crc kubenswrapper[4909]: I0202 12:33:58.515542 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aaf8cb2-7efd-4f8b-8348-ee983dbe284d-inventory" (OuterVolumeSpecName: "inventory") pod "5aaf8cb2-7efd-4f8b-8348-ee983dbe284d" (UID: "5aaf8cb2-7efd-4f8b-8348-ee983dbe284d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:33:58 crc kubenswrapper[4909]: I0202 12:33:58.518770 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5aaf8cb2-7efd-4f8b-8348-ee983dbe284d-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "5aaf8cb2-7efd-4f8b-8348-ee983dbe284d" (UID: "5aaf8cb2-7efd-4f8b-8348-ee983dbe284d"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:33:58 crc kubenswrapper[4909]: I0202 12:33:58.588879 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5aaf8cb2-7efd-4f8b-8348-ee983dbe284d-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 02 12:33:58 crc kubenswrapper[4909]: I0202 12:33:58.588916 4909 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5aaf8cb2-7efd-4f8b-8348-ee983dbe284d-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:33:58 crc kubenswrapper[4909]: I0202 12:33:58.588931 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5aaf8cb2-7efd-4f8b-8348-ee983dbe284d-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 12:33:58 crc kubenswrapper[4909]: I0202 12:33:58.588943 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cndgv\" (UniqueName: \"kubernetes.io/projected/5aaf8cb2-7efd-4f8b-8348-ee983dbe284d-kube-api-access-cndgv\") on node \"crc\" DevicePath \"\"" Feb 02 12:33:58 crc kubenswrapper[4909]: I0202 12:33:58.906349 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-bcjvg" event={"ID":"5aaf8cb2-7efd-4f8b-8348-ee983dbe284d","Type":"ContainerDied","Data":"b91026122bd7cdc5b17f844d19a3e3df57c5d5e854e96750c8d32b9e5b17de16"} Feb 02 12:33:58 crc kubenswrapper[4909]: I0202 12:33:58.906395 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b91026122bd7cdc5b17f844d19a3e3df57c5d5e854e96750c8d32b9e5b17de16" Feb 02 12:33:58 crc kubenswrapper[4909]: I0202 12:33:58.906452 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-bcjvg" Feb 02 12:33:59 crc kubenswrapper[4909]: I0202 12:33:59.063662 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-7pdh7"] Feb 02 12:33:59 crc kubenswrapper[4909]: E0202 12:33:59.064114 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ca2bba8-6828-4435-b2ee-09063f169026" containerName="registry-server" Feb 02 12:33:59 crc kubenswrapper[4909]: I0202 12:33:59.064142 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ca2bba8-6828-4435-b2ee-09063f169026" containerName="registry-server" Feb 02 12:33:59 crc kubenswrapper[4909]: E0202 12:33:59.064165 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aaf8cb2-7efd-4f8b-8348-ee983dbe284d" containerName="bootstrap-openstack-openstack-cell1" Feb 02 12:33:59 crc kubenswrapper[4909]: I0202 12:33:59.064174 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aaf8cb2-7efd-4f8b-8348-ee983dbe284d" containerName="bootstrap-openstack-openstack-cell1" Feb 02 12:33:59 crc kubenswrapper[4909]: E0202 12:33:59.064186 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ca2bba8-6828-4435-b2ee-09063f169026" containerName="extract-utilities" Feb 02 12:33:59 crc kubenswrapper[4909]: I0202 12:33:59.064193 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ca2bba8-6828-4435-b2ee-09063f169026" containerName="extract-utilities" Feb 02 12:33:59 crc kubenswrapper[4909]: E0202 12:33:59.064212 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ca2bba8-6828-4435-b2ee-09063f169026" containerName="extract-content" Feb 02 12:33:59 crc kubenswrapper[4909]: I0202 12:33:59.064221 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ca2bba8-6828-4435-b2ee-09063f169026" containerName="extract-content" Feb 02 12:33:59 crc kubenswrapper[4909]: I0202 12:33:59.064470 4909 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="5ca2bba8-6828-4435-b2ee-09063f169026" containerName="registry-server" Feb 02 12:33:59 crc kubenswrapper[4909]: I0202 12:33:59.064489 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aaf8cb2-7efd-4f8b-8348-ee983dbe284d" containerName="bootstrap-openstack-openstack-cell1" Feb 02 12:33:59 crc kubenswrapper[4909]: I0202 12:33:59.065330 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-7pdh7" Feb 02 12:33:59 crc kubenswrapper[4909]: I0202 12:33:59.071585 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 02 12:33:59 crc kubenswrapper[4909]: I0202 12:33:59.072150 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 02 12:33:59 crc kubenswrapper[4909]: I0202 12:33:59.073400 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 12:33:59 crc kubenswrapper[4909]: I0202 12:33:59.073601 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-lfs7z" Feb 02 12:33:59 crc kubenswrapper[4909]: I0202 12:33:59.084014 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-7pdh7"] Feb 02 12:33:59 crc kubenswrapper[4909]: I0202 12:33:59.106183 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/44aa8cf1-5580-47f1-b457-c1afd10ffa00-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-7pdh7\" (UID: \"44aa8cf1-5580-47f1-b457-c1afd10ffa00\") " pod="openstack/download-cache-openstack-openstack-cell1-7pdh7" Feb 02 12:33:59 crc kubenswrapper[4909]: I0202 12:33:59.106283 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nklv\" (UniqueName: \"kubernetes.io/projected/44aa8cf1-5580-47f1-b457-c1afd10ffa00-kube-api-access-6nklv\") pod \"download-cache-openstack-openstack-cell1-7pdh7\" (UID: \"44aa8cf1-5580-47f1-b457-c1afd10ffa00\") " pod="openstack/download-cache-openstack-openstack-cell1-7pdh7" Feb 02 12:33:59 crc kubenswrapper[4909]: I0202 12:33:59.106307 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44aa8cf1-5580-47f1-b457-c1afd10ffa00-inventory\") pod \"download-cache-openstack-openstack-cell1-7pdh7\" (UID: \"44aa8cf1-5580-47f1-b457-c1afd10ffa00\") " pod="openstack/download-cache-openstack-openstack-cell1-7pdh7" Feb 02 12:33:59 crc kubenswrapper[4909]: I0202 12:33:59.208219 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/44aa8cf1-5580-47f1-b457-c1afd10ffa00-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-7pdh7\" (UID: \"44aa8cf1-5580-47f1-b457-c1afd10ffa00\") " pod="openstack/download-cache-openstack-openstack-cell1-7pdh7" Feb 02 12:33:59 crc kubenswrapper[4909]: I0202 12:33:59.208335 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nklv\" (UniqueName: \"kubernetes.io/projected/44aa8cf1-5580-47f1-b457-c1afd10ffa00-kube-api-access-6nklv\") pod \"download-cache-openstack-openstack-cell1-7pdh7\" (UID: \"44aa8cf1-5580-47f1-b457-c1afd10ffa00\") " pod="openstack/download-cache-openstack-openstack-cell1-7pdh7" Feb 02 12:33:59 crc kubenswrapper[4909]: I0202 12:33:59.208385 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44aa8cf1-5580-47f1-b457-c1afd10ffa00-inventory\") pod \"download-cache-openstack-openstack-cell1-7pdh7\" (UID: 
\"44aa8cf1-5580-47f1-b457-c1afd10ffa00\") " pod="openstack/download-cache-openstack-openstack-cell1-7pdh7" Feb 02 12:33:59 crc kubenswrapper[4909]: I0202 12:33:59.214411 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/44aa8cf1-5580-47f1-b457-c1afd10ffa00-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-7pdh7\" (UID: \"44aa8cf1-5580-47f1-b457-c1afd10ffa00\") " pod="openstack/download-cache-openstack-openstack-cell1-7pdh7" Feb 02 12:33:59 crc kubenswrapper[4909]: I0202 12:33:59.225411 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44aa8cf1-5580-47f1-b457-c1afd10ffa00-inventory\") pod \"download-cache-openstack-openstack-cell1-7pdh7\" (UID: \"44aa8cf1-5580-47f1-b457-c1afd10ffa00\") " pod="openstack/download-cache-openstack-openstack-cell1-7pdh7" Feb 02 12:33:59 crc kubenswrapper[4909]: I0202 12:33:59.226568 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nklv\" (UniqueName: \"kubernetes.io/projected/44aa8cf1-5580-47f1-b457-c1afd10ffa00-kube-api-access-6nklv\") pod \"download-cache-openstack-openstack-cell1-7pdh7\" (UID: \"44aa8cf1-5580-47f1-b457-c1afd10ffa00\") " pod="openstack/download-cache-openstack-openstack-cell1-7pdh7" Feb 02 12:33:59 crc kubenswrapper[4909]: I0202 12:33:59.392636 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-7pdh7" Feb 02 12:33:59 crc kubenswrapper[4909]: I0202 12:33:59.905486 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-7pdh7"] Feb 02 12:33:59 crc kubenswrapper[4909]: I0202 12:33:59.918054 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-7pdh7" event={"ID":"44aa8cf1-5580-47f1-b457-c1afd10ffa00","Type":"ContainerStarted","Data":"bf6c470ab9419a3e99bc8716d408af29506a569e66fffb6fab0a89d35fb45bac"} Feb 02 12:34:00 crc kubenswrapper[4909]: I0202 12:34:00.939765 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-7pdh7" event={"ID":"44aa8cf1-5580-47f1-b457-c1afd10ffa00","Type":"ContainerStarted","Data":"1d9726de0c5d5754669cdc9f6e5e51207fa5247001f05f68f2f39ebde35576f6"} Feb 02 12:34:00 crc kubenswrapper[4909]: I0202 12:34:00.962019 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-7pdh7" podStartSLOduration=2.5113006369999997 podStartE2EDuration="2.961996423s" podCreationTimestamp="2026-02-02 12:33:58 +0000 UTC" firstStartedPulling="2026-02-02 12:33:59.90456795 +0000 UTC m=+7365.650668685" lastFinishedPulling="2026-02-02 12:34:00.355263736 +0000 UTC m=+7366.101364471" observedRunningTime="2026-02-02 12:34:00.961941202 +0000 UTC m=+7366.708041937" watchObservedRunningTime="2026-02-02 12:34:00.961996423 +0000 UTC m=+7366.708097158" Feb 02 12:34:18 crc kubenswrapper[4909]: I0202 12:34:18.281122 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9qnq2"] Feb 02 12:34:18 crc kubenswrapper[4909]: I0202 12:34:18.285252 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9qnq2" Feb 02 12:34:18 crc kubenswrapper[4909]: I0202 12:34:18.301569 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9qnq2"] Feb 02 12:34:18 crc kubenswrapper[4909]: I0202 12:34:18.422302 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p989d\" (UniqueName: \"kubernetes.io/projected/d933f00a-fad5-465f-ac15-52de107de878-kube-api-access-p989d\") pod \"community-operators-9qnq2\" (UID: \"d933f00a-fad5-465f-ac15-52de107de878\") " pod="openshift-marketplace/community-operators-9qnq2" Feb 02 12:34:18 crc kubenswrapper[4909]: I0202 12:34:18.422457 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d933f00a-fad5-465f-ac15-52de107de878-utilities\") pod \"community-operators-9qnq2\" (UID: \"d933f00a-fad5-465f-ac15-52de107de878\") " pod="openshift-marketplace/community-operators-9qnq2" Feb 02 12:34:18 crc kubenswrapper[4909]: I0202 12:34:18.422488 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d933f00a-fad5-465f-ac15-52de107de878-catalog-content\") pod \"community-operators-9qnq2\" (UID: \"d933f00a-fad5-465f-ac15-52de107de878\") " pod="openshift-marketplace/community-operators-9qnq2" Feb 02 12:34:18 crc kubenswrapper[4909]: I0202 12:34:18.525040 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p989d\" (UniqueName: \"kubernetes.io/projected/d933f00a-fad5-465f-ac15-52de107de878-kube-api-access-p989d\") pod \"community-operators-9qnq2\" (UID: \"d933f00a-fad5-465f-ac15-52de107de878\") " pod="openshift-marketplace/community-operators-9qnq2" Feb 02 12:34:18 crc kubenswrapper[4909]: I0202 12:34:18.525535 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d933f00a-fad5-465f-ac15-52de107de878-utilities\") pod \"community-operators-9qnq2\" (UID: \"d933f00a-fad5-465f-ac15-52de107de878\") " pod="openshift-marketplace/community-operators-9qnq2" Feb 02 12:34:18 crc kubenswrapper[4909]: I0202 12:34:18.525665 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d933f00a-fad5-465f-ac15-52de107de878-catalog-content\") pod \"community-operators-9qnq2\" (UID: \"d933f00a-fad5-465f-ac15-52de107de878\") " pod="openshift-marketplace/community-operators-9qnq2" Feb 02 12:34:18 crc kubenswrapper[4909]: I0202 12:34:18.526064 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d933f00a-fad5-465f-ac15-52de107de878-utilities\") pod \"community-operators-9qnq2\" (UID: \"d933f00a-fad5-465f-ac15-52de107de878\") " pod="openshift-marketplace/community-operators-9qnq2" Feb 02 12:34:18 crc kubenswrapper[4909]: I0202 12:34:18.526274 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d933f00a-fad5-465f-ac15-52de107de878-catalog-content\") pod \"community-operators-9qnq2\" (UID: \"d933f00a-fad5-465f-ac15-52de107de878\") " pod="openshift-marketplace/community-operators-9qnq2" Feb 02 12:34:18 crc kubenswrapper[4909]: I0202 12:34:18.561960 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p989d\" (UniqueName: \"kubernetes.io/projected/d933f00a-fad5-465f-ac15-52de107de878-kube-api-access-p989d\") pod \"community-operators-9qnq2\" (UID: \"d933f00a-fad5-465f-ac15-52de107de878\") " pod="openshift-marketplace/community-operators-9qnq2" Feb 02 12:34:18 crc kubenswrapper[4909]: I0202 12:34:18.612183 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9qnq2" Feb 02 12:34:19 crc kubenswrapper[4909]: I0202 12:34:19.291321 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9qnq2"] Feb 02 12:34:19 crc kubenswrapper[4909]: W0202 12:34:19.293771 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd933f00a_fad5_465f_ac15_52de107de878.slice/crio-624d882c31d72907e51c29e02dd79aff21d19dbfb32989d2cc71058269262773 WatchSource:0}: Error finding container 624d882c31d72907e51c29e02dd79aff21d19dbfb32989d2cc71058269262773: Status 404 returned error can't find the container with id 624d882c31d72907e51c29e02dd79aff21d19dbfb32989d2cc71058269262773 Feb 02 12:34:20 crc kubenswrapper[4909]: I0202 12:34:20.119127 4909 generic.go:334] "Generic (PLEG): container finished" podID="d933f00a-fad5-465f-ac15-52de107de878" containerID="cd7d4264003a40a258fd6901ada7e5695aca24c6e917c6a7d49e876052e61e8c" exitCode=0 Feb 02 12:34:20 crc kubenswrapper[4909]: I0202 12:34:20.119357 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9qnq2" event={"ID":"d933f00a-fad5-465f-ac15-52de107de878","Type":"ContainerDied","Data":"cd7d4264003a40a258fd6901ada7e5695aca24c6e917c6a7d49e876052e61e8c"} Feb 02 12:34:20 crc kubenswrapper[4909]: I0202 12:34:20.119481 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9qnq2" event={"ID":"d933f00a-fad5-465f-ac15-52de107de878","Type":"ContainerStarted","Data":"624d882c31d72907e51c29e02dd79aff21d19dbfb32989d2cc71058269262773"} Feb 02 12:34:21 crc kubenswrapper[4909]: I0202 12:34:21.132429 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9qnq2" 
event={"ID":"d933f00a-fad5-465f-ac15-52de107de878","Type":"ContainerStarted","Data":"5e85131732ab9532f14049a9c5e63af7d07f8bf8cdc944fcd367416232d69e15"} Feb 02 12:34:23 crc kubenswrapper[4909]: I0202 12:34:23.162389 4909 generic.go:334] "Generic (PLEG): container finished" podID="d933f00a-fad5-465f-ac15-52de107de878" containerID="5e85131732ab9532f14049a9c5e63af7d07f8bf8cdc944fcd367416232d69e15" exitCode=0 Feb 02 12:34:23 crc kubenswrapper[4909]: I0202 12:34:23.162471 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9qnq2" event={"ID":"d933f00a-fad5-465f-ac15-52de107de878","Type":"ContainerDied","Data":"5e85131732ab9532f14049a9c5e63af7d07f8bf8cdc944fcd367416232d69e15"} Feb 02 12:34:24 crc kubenswrapper[4909]: I0202 12:34:24.175640 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9qnq2" event={"ID":"d933f00a-fad5-465f-ac15-52de107de878","Type":"ContainerStarted","Data":"cb462b440f6b5f4d5582713f40de53c8688bf4f2fbb2e07f46059b4eb5f3bbc2"} Feb 02 12:34:24 crc kubenswrapper[4909]: I0202 12:34:24.206269 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9qnq2" podStartSLOduration=2.621963074 podStartE2EDuration="6.206247926s" podCreationTimestamp="2026-02-02 12:34:18 +0000 UTC" firstStartedPulling="2026-02-02 12:34:20.122725819 +0000 UTC m=+7385.868826554" lastFinishedPulling="2026-02-02 12:34:23.707010661 +0000 UTC m=+7389.453111406" observedRunningTime="2026-02-02 12:34:24.199947247 +0000 UTC m=+7389.946047992" watchObservedRunningTime="2026-02-02 12:34:24.206247926 +0000 UTC m=+7389.952348661" Feb 02 12:34:28 crc kubenswrapper[4909]: I0202 12:34:28.612345 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9qnq2" Feb 02 12:34:28 crc kubenswrapper[4909]: I0202 12:34:28.612942 4909 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-9qnq2" Feb 02 12:34:28 crc kubenswrapper[4909]: I0202 12:34:28.661952 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9qnq2" Feb 02 12:34:29 crc kubenswrapper[4909]: I0202 12:34:29.267653 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9qnq2" Feb 02 12:34:29 crc kubenswrapper[4909]: I0202 12:34:29.318248 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9qnq2"] Feb 02 12:34:31 crc kubenswrapper[4909]: I0202 12:34:31.233599 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9qnq2" podUID="d933f00a-fad5-465f-ac15-52de107de878" containerName="registry-server" containerID="cri-o://cb462b440f6b5f4d5582713f40de53c8688bf4f2fbb2e07f46059b4eb5f3bbc2" gracePeriod=2 Feb 02 12:34:31 crc kubenswrapper[4909]: I0202 12:34:31.715230 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9qnq2" Feb 02 12:34:31 crc kubenswrapper[4909]: I0202 12:34:31.845123 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p989d\" (UniqueName: \"kubernetes.io/projected/d933f00a-fad5-465f-ac15-52de107de878-kube-api-access-p989d\") pod \"d933f00a-fad5-465f-ac15-52de107de878\" (UID: \"d933f00a-fad5-465f-ac15-52de107de878\") " Feb 02 12:34:31 crc kubenswrapper[4909]: I0202 12:34:31.845252 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d933f00a-fad5-465f-ac15-52de107de878-utilities\") pod \"d933f00a-fad5-465f-ac15-52de107de878\" (UID: \"d933f00a-fad5-465f-ac15-52de107de878\") " Feb 02 12:34:31 crc kubenswrapper[4909]: I0202 12:34:31.845345 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d933f00a-fad5-465f-ac15-52de107de878-catalog-content\") pod \"d933f00a-fad5-465f-ac15-52de107de878\" (UID: \"d933f00a-fad5-465f-ac15-52de107de878\") " Feb 02 12:34:31 crc kubenswrapper[4909]: I0202 12:34:31.846886 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d933f00a-fad5-465f-ac15-52de107de878-utilities" (OuterVolumeSpecName: "utilities") pod "d933f00a-fad5-465f-ac15-52de107de878" (UID: "d933f00a-fad5-465f-ac15-52de107de878"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:34:31 crc kubenswrapper[4909]: I0202 12:34:31.851266 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d933f00a-fad5-465f-ac15-52de107de878-kube-api-access-p989d" (OuterVolumeSpecName: "kube-api-access-p989d") pod "d933f00a-fad5-465f-ac15-52de107de878" (UID: "d933f00a-fad5-465f-ac15-52de107de878"). InnerVolumeSpecName "kube-api-access-p989d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:34:31 crc kubenswrapper[4909]: I0202 12:34:31.915621 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d933f00a-fad5-465f-ac15-52de107de878-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d933f00a-fad5-465f-ac15-52de107de878" (UID: "d933f00a-fad5-465f-ac15-52de107de878"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:34:31 crc kubenswrapper[4909]: I0202 12:34:31.948198 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p989d\" (UniqueName: \"kubernetes.io/projected/d933f00a-fad5-465f-ac15-52de107de878-kube-api-access-p989d\") on node \"crc\" DevicePath \"\"" Feb 02 12:34:31 crc kubenswrapper[4909]: I0202 12:34:31.948240 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d933f00a-fad5-465f-ac15-52de107de878-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 12:34:31 crc kubenswrapper[4909]: I0202 12:34:31.948252 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d933f00a-fad5-465f-ac15-52de107de878-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 12:34:32 crc kubenswrapper[4909]: I0202 12:34:32.272509 4909 generic.go:334] "Generic (PLEG): container finished" podID="d933f00a-fad5-465f-ac15-52de107de878" containerID="cb462b440f6b5f4d5582713f40de53c8688bf4f2fbb2e07f46059b4eb5f3bbc2" exitCode=0 Feb 02 12:34:32 crc kubenswrapper[4909]: I0202 12:34:32.272558 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9qnq2" event={"ID":"d933f00a-fad5-465f-ac15-52de107de878","Type":"ContainerDied","Data":"cb462b440f6b5f4d5582713f40de53c8688bf4f2fbb2e07f46059b4eb5f3bbc2"} Feb 02 12:34:32 crc kubenswrapper[4909]: I0202 12:34:32.272590 4909 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-9qnq2" event={"ID":"d933f00a-fad5-465f-ac15-52de107de878","Type":"ContainerDied","Data":"624d882c31d72907e51c29e02dd79aff21d19dbfb32989d2cc71058269262773"} Feb 02 12:34:32 crc kubenswrapper[4909]: I0202 12:34:32.272606 4909 scope.go:117] "RemoveContainer" containerID="cb462b440f6b5f4d5582713f40de53c8688bf4f2fbb2e07f46059b4eb5f3bbc2" Feb 02 12:34:32 crc kubenswrapper[4909]: I0202 12:34:32.272762 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9qnq2" Feb 02 12:34:32 crc kubenswrapper[4909]: I0202 12:34:32.311945 4909 scope.go:117] "RemoveContainer" containerID="5e85131732ab9532f14049a9c5e63af7d07f8bf8cdc944fcd367416232d69e15" Feb 02 12:34:32 crc kubenswrapper[4909]: I0202 12:34:32.326707 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9qnq2"] Feb 02 12:34:32 crc kubenswrapper[4909]: I0202 12:34:32.337260 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9qnq2"] Feb 02 12:34:32 crc kubenswrapper[4909]: I0202 12:34:32.341428 4909 scope.go:117] "RemoveContainer" containerID="cd7d4264003a40a258fd6901ada7e5695aca24c6e917c6a7d49e876052e61e8c" Feb 02 12:34:32 crc kubenswrapper[4909]: I0202 12:34:32.391017 4909 scope.go:117] "RemoveContainer" containerID="cb462b440f6b5f4d5582713f40de53c8688bf4f2fbb2e07f46059b4eb5f3bbc2" Feb 02 12:34:32 crc kubenswrapper[4909]: E0202 12:34:32.391666 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb462b440f6b5f4d5582713f40de53c8688bf4f2fbb2e07f46059b4eb5f3bbc2\": container with ID starting with cb462b440f6b5f4d5582713f40de53c8688bf4f2fbb2e07f46059b4eb5f3bbc2 not found: ID does not exist" containerID="cb462b440f6b5f4d5582713f40de53c8688bf4f2fbb2e07f46059b4eb5f3bbc2" Feb 02 12:34:32 crc kubenswrapper[4909]: I0202 
12:34:32.391708 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb462b440f6b5f4d5582713f40de53c8688bf4f2fbb2e07f46059b4eb5f3bbc2"} err="failed to get container status \"cb462b440f6b5f4d5582713f40de53c8688bf4f2fbb2e07f46059b4eb5f3bbc2\": rpc error: code = NotFound desc = could not find container \"cb462b440f6b5f4d5582713f40de53c8688bf4f2fbb2e07f46059b4eb5f3bbc2\": container with ID starting with cb462b440f6b5f4d5582713f40de53c8688bf4f2fbb2e07f46059b4eb5f3bbc2 not found: ID does not exist" Feb 02 12:34:32 crc kubenswrapper[4909]: I0202 12:34:32.391735 4909 scope.go:117] "RemoveContainer" containerID="5e85131732ab9532f14049a9c5e63af7d07f8bf8cdc944fcd367416232d69e15" Feb 02 12:34:32 crc kubenswrapper[4909]: E0202 12:34:32.392062 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e85131732ab9532f14049a9c5e63af7d07f8bf8cdc944fcd367416232d69e15\": container with ID starting with 5e85131732ab9532f14049a9c5e63af7d07f8bf8cdc944fcd367416232d69e15 not found: ID does not exist" containerID="5e85131732ab9532f14049a9c5e63af7d07f8bf8cdc944fcd367416232d69e15" Feb 02 12:34:32 crc kubenswrapper[4909]: I0202 12:34:32.392114 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e85131732ab9532f14049a9c5e63af7d07f8bf8cdc944fcd367416232d69e15"} err="failed to get container status \"5e85131732ab9532f14049a9c5e63af7d07f8bf8cdc944fcd367416232d69e15\": rpc error: code = NotFound desc = could not find container \"5e85131732ab9532f14049a9c5e63af7d07f8bf8cdc944fcd367416232d69e15\": container with ID starting with 5e85131732ab9532f14049a9c5e63af7d07f8bf8cdc944fcd367416232d69e15 not found: ID does not exist" Feb 02 12:34:32 crc kubenswrapper[4909]: I0202 12:34:32.392146 4909 scope.go:117] "RemoveContainer" containerID="cd7d4264003a40a258fd6901ada7e5695aca24c6e917c6a7d49e876052e61e8c" Feb 02 12:34:32 crc 
kubenswrapper[4909]: E0202 12:34:32.392672 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd7d4264003a40a258fd6901ada7e5695aca24c6e917c6a7d49e876052e61e8c\": container with ID starting with cd7d4264003a40a258fd6901ada7e5695aca24c6e917c6a7d49e876052e61e8c not found: ID does not exist" containerID="cd7d4264003a40a258fd6901ada7e5695aca24c6e917c6a7d49e876052e61e8c" Feb 02 12:34:32 crc kubenswrapper[4909]: I0202 12:34:32.392702 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd7d4264003a40a258fd6901ada7e5695aca24c6e917c6a7d49e876052e61e8c"} err="failed to get container status \"cd7d4264003a40a258fd6901ada7e5695aca24c6e917c6a7d49e876052e61e8c\": rpc error: code = NotFound desc = could not find container \"cd7d4264003a40a258fd6901ada7e5695aca24c6e917c6a7d49e876052e61e8c\": container with ID starting with cd7d4264003a40a258fd6901ada7e5695aca24c6e917c6a7d49e876052e61e8c not found: ID does not exist" Feb 02 12:34:33 crc kubenswrapper[4909]: I0202 12:34:33.028668 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d933f00a-fad5-465f-ac15-52de107de878" path="/var/lib/kubelet/pods/d933f00a-fad5-465f-ac15-52de107de878/volumes" Feb 02 12:34:39 crc kubenswrapper[4909]: I0202 12:34:39.869656 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bqpxr"] Feb 02 12:34:39 crc kubenswrapper[4909]: E0202 12:34:39.871522 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d933f00a-fad5-465f-ac15-52de107de878" containerName="extract-content" Feb 02 12:34:39 crc kubenswrapper[4909]: I0202 12:34:39.871546 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d933f00a-fad5-465f-ac15-52de107de878" containerName="extract-content" Feb 02 12:34:39 crc kubenswrapper[4909]: E0202 12:34:39.871598 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d933f00a-fad5-465f-ac15-52de107de878" containerName="extract-utilities" Feb 02 12:34:39 crc kubenswrapper[4909]: I0202 12:34:39.871608 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d933f00a-fad5-465f-ac15-52de107de878" containerName="extract-utilities" Feb 02 12:34:39 crc kubenswrapper[4909]: E0202 12:34:39.871628 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d933f00a-fad5-465f-ac15-52de107de878" containerName="registry-server" Feb 02 12:34:39 crc kubenswrapper[4909]: I0202 12:34:39.871638 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d933f00a-fad5-465f-ac15-52de107de878" containerName="registry-server" Feb 02 12:34:39 crc kubenswrapper[4909]: I0202 12:34:39.871991 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d933f00a-fad5-465f-ac15-52de107de878" containerName="registry-server" Feb 02 12:34:39 crc kubenswrapper[4909]: I0202 12:34:39.874480 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bqpxr" Feb 02 12:34:39 crc kubenswrapper[4909]: I0202 12:34:39.885289 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bqpxr"] Feb 02 12:34:40 crc kubenswrapper[4909]: I0202 12:34:40.022837 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4134e849-bc83-4bba-a580-de513b2ef76b-utilities\") pod \"certified-operators-bqpxr\" (UID: \"4134e849-bc83-4bba-a580-de513b2ef76b\") " pod="openshift-marketplace/certified-operators-bqpxr" Feb 02 12:34:40 crc kubenswrapper[4909]: I0202 12:34:40.022899 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drjqj\" (UniqueName: \"kubernetes.io/projected/4134e849-bc83-4bba-a580-de513b2ef76b-kube-api-access-drjqj\") pod \"certified-operators-bqpxr\" (UID: 
\"4134e849-bc83-4bba-a580-de513b2ef76b\") " pod="openshift-marketplace/certified-operators-bqpxr" Feb 02 12:34:40 crc kubenswrapper[4909]: I0202 12:34:40.023372 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4134e849-bc83-4bba-a580-de513b2ef76b-catalog-content\") pod \"certified-operators-bqpxr\" (UID: \"4134e849-bc83-4bba-a580-de513b2ef76b\") " pod="openshift-marketplace/certified-operators-bqpxr" Feb 02 12:34:40 crc kubenswrapper[4909]: I0202 12:34:40.127396 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4134e849-bc83-4bba-a580-de513b2ef76b-utilities\") pod \"certified-operators-bqpxr\" (UID: \"4134e849-bc83-4bba-a580-de513b2ef76b\") " pod="openshift-marketplace/certified-operators-bqpxr" Feb 02 12:34:40 crc kubenswrapper[4909]: I0202 12:34:40.127506 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drjqj\" (UniqueName: \"kubernetes.io/projected/4134e849-bc83-4bba-a580-de513b2ef76b-kube-api-access-drjqj\") pod \"certified-operators-bqpxr\" (UID: \"4134e849-bc83-4bba-a580-de513b2ef76b\") " pod="openshift-marketplace/certified-operators-bqpxr" Feb 02 12:34:40 crc kubenswrapper[4909]: I0202 12:34:40.127673 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4134e849-bc83-4bba-a580-de513b2ef76b-catalog-content\") pod \"certified-operators-bqpxr\" (UID: \"4134e849-bc83-4bba-a580-de513b2ef76b\") " pod="openshift-marketplace/certified-operators-bqpxr" Feb 02 12:34:40 crc kubenswrapper[4909]: I0202 12:34:40.128041 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4134e849-bc83-4bba-a580-de513b2ef76b-utilities\") pod \"certified-operators-bqpxr\" (UID: 
\"4134e849-bc83-4bba-a580-de513b2ef76b\") " pod="openshift-marketplace/certified-operators-bqpxr" Feb 02 12:34:40 crc kubenswrapper[4909]: I0202 12:34:40.128298 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4134e849-bc83-4bba-a580-de513b2ef76b-catalog-content\") pod \"certified-operators-bqpxr\" (UID: \"4134e849-bc83-4bba-a580-de513b2ef76b\") " pod="openshift-marketplace/certified-operators-bqpxr" Feb 02 12:34:40 crc kubenswrapper[4909]: I0202 12:34:40.156623 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drjqj\" (UniqueName: \"kubernetes.io/projected/4134e849-bc83-4bba-a580-de513b2ef76b-kube-api-access-drjqj\") pod \"certified-operators-bqpxr\" (UID: \"4134e849-bc83-4bba-a580-de513b2ef76b\") " pod="openshift-marketplace/certified-operators-bqpxr" Feb 02 12:34:40 crc kubenswrapper[4909]: I0202 12:34:40.204336 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bqpxr" Feb 02 12:34:40 crc kubenswrapper[4909]: I0202 12:34:40.737314 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bqpxr"] Feb 02 12:34:41 crc kubenswrapper[4909]: I0202 12:34:41.368361 4909 generic.go:334] "Generic (PLEG): container finished" podID="4134e849-bc83-4bba-a580-de513b2ef76b" containerID="e6f57748923e08ef7351ea5a8b53d81608d495a80ba0cef46e5242691a802a9f" exitCode=0 Feb 02 12:34:41 crc kubenswrapper[4909]: I0202 12:34:41.368438 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bqpxr" event={"ID":"4134e849-bc83-4bba-a580-de513b2ef76b","Type":"ContainerDied","Data":"e6f57748923e08ef7351ea5a8b53d81608d495a80ba0cef46e5242691a802a9f"} Feb 02 12:34:41 crc kubenswrapper[4909]: I0202 12:34:41.368873 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bqpxr" event={"ID":"4134e849-bc83-4bba-a580-de513b2ef76b","Type":"ContainerStarted","Data":"f08d549b4c32d17d03bc4f4fb7fb4897166f9eeaba0d79918f9d8b9522654fcb"} Feb 02 12:34:42 crc kubenswrapper[4909]: I0202 12:34:42.379200 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bqpxr" event={"ID":"4134e849-bc83-4bba-a580-de513b2ef76b","Type":"ContainerStarted","Data":"c22dc6c65d42e5a6580206883e8a5ff1307ae9230358d70fab283c1ab257d16d"} Feb 02 12:34:43 crc kubenswrapper[4909]: I0202 12:34:43.396372 4909 generic.go:334] "Generic (PLEG): container finished" podID="4134e849-bc83-4bba-a580-de513b2ef76b" containerID="c22dc6c65d42e5a6580206883e8a5ff1307ae9230358d70fab283c1ab257d16d" exitCode=0 Feb 02 12:34:43 crc kubenswrapper[4909]: I0202 12:34:43.396459 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bqpxr" 
event={"ID":"4134e849-bc83-4bba-a580-de513b2ef76b","Type":"ContainerDied","Data":"c22dc6c65d42e5a6580206883e8a5ff1307ae9230358d70fab283c1ab257d16d"} Feb 02 12:34:44 crc kubenswrapper[4909]: I0202 12:34:44.412106 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bqpxr" event={"ID":"4134e849-bc83-4bba-a580-de513b2ef76b","Type":"ContainerStarted","Data":"fa732a5a82e9633d8264bd213e603af1acd37b5d850387604d37ab86ec66791e"} Feb 02 12:34:44 crc kubenswrapper[4909]: I0202 12:34:44.462526 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bqpxr" podStartSLOduration=3.03886832 podStartE2EDuration="5.462501057s" podCreationTimestamp="2026-02-02 12:34:39 +0000 UTC" firstStartedPulling="2026-02-02 12:34:41.370795541 +0000 UTC m=+7407.116896286" lastFinishedPulling="2026-02-02 12:34:43.794428278 +0000 UTC m=+7409.540529023" observedRunningTime="2026-02-02 12:34:44.452439911 +0000 UTC m=+7410.198540646" watchObservedRunningTime="2026-02-02 12:34:44.462501057 +0000 UTC m=+7410.208601792" Feb 02 12:34:50 crc kubenswrapper[4909]: I0202 12:34:50.204750 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bqpxr" Feb 02 12:34:50 crc kubenswrapper[4909]: I0202 12:34:50.205518 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bqpxr" Feb 02 12:34:50 crc kubenswrapper[4909]: I0202 12:34:50.256199 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bqpxr" Feb 02 12:34:50 crc kubenswrapper[4909]: I0202 12:34:50.507429 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bqpxr" Feb 02 12:34:50 crc kubenswrapper[4909]: I0202 12:34:50.598250 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-bqpxr"] Feb 02 12:34:52 crc kubenswrapper[4909]: I0202 12:34:52.471155 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bqpxr" podUID="4134e849-bc83-4bba-a580-de513b2ef76b" containerName="registry-server" containerID="cri-o://fa732a5a82e9633d8264bd213e603af1acd37b5d850387604d37ab86ec66791e" gracePeriod=2 Feb 02 12:34:53 crc kubenswrapper[4909]: I0202 12:34:53.000571 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bqpxr" Feb 02 12:34:53 crc kubenswrapper[4909]: I0202 12:34:53.131787 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4134e849-bc83-4bba-a580-de513b2ef76b-utilities\") pod \"4134e849-bc83-4bba-a580-de513b2ef76b\" (UID: \"4134e849-bc83-4bba-a580-de513b2ef76b\") " Feb 02 12:34:53 crc kubenswrapper[4909]: I0202 12:34:53.131869 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4134e849-bc83-4bba-a580-de513b2ef76b-catalog-content\") pod \"4134e849-bc83-4bba-a580-de513b2ef76b\" (UID: \"4134e849-bc83-4bba-a580-de513b2ef76b\") " Feb 02 12:34:53 crc kubenswrapper[4909]: I0202 12:34:53.131902 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drjqj\" (UniqueName: \"kubernetes.io/projected/4134e849-bc83-4bba-a580-de513b2ef76b-kube-api-access-drjqj\") pod \"4134e849-bc83-4bba-a580-de513b2ef76b\" (UID: \"4134e849-bc83-4bba-a580-de513b2ef76b\") " Feb 02 12:34:53 crc kubenswrapper[4909]: I0202 12:34:53.132969 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4134e849-bc83-4bba-a580-de513b2ef76b-utilities" (OuterVolumeSpecName: "utilities") pod "4134e849-bc83-4bba-a580-de513b2ef76b" (UID: 
"4134e849-bc83-4bba-a580-de513b2ef76b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:34:53 crc kubenswrapper[4909]: I0202 12:34:53.145447 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4134e849-bc83-4bba-a580-de513b2ef76b-kube-api-access-drjqj" (OuterVolumeSpecName: "kube-api-access-drjqj") pod "4134e849-bc83-4bba-a580-de513b2ef76b" (UID: "4134e849-bc83-4bba-a580-de513b2ef76b"). InnerVolumeSpecName "kube-api-access-drjqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:34:53 crc kubenswrapper[4909]: I0202 12:34:53.181026 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4134e849-bc83-4bba-a580-de513b2ef76b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4134e849-bc83-4bba-a580-de513b2ef76b" (UID: "4134e849-bc83-4bba-a580-de513b2ef76b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:34:53 crc kubenswrapper[4909]: I0202 12:34:53.234901 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4134e849-bc83-4bba-a580-de513b2ef76b-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 12:34:53 crc kubenswrapper[4909]: I0202 12:34:53.235144 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4134e849-bc83-4bba-a580-de513b2ef76b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 12:34:53 crc kubenswrapper[4909]: I0202 12:34:53.235211 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drjqj\" (UniqueName: \"kubernetes.io/projected/4134e849-bc83-4bba-a580-de513b2ef76b-kube-api-access-drjqj\") on node \"crc\" DevicePath \"\"" Feb 02 12:34:53 crc kubenswrapper[4909]: I0202 12:34:53.486429 4909 generic.go:334] "Generic (PLEG): container finished" 
podID="4134e849-bc83-4bba-a580-de513b2ef76b" containerID="fa732a5a82e9633d8264bd213e603af1acd37b5d850387604d37ab86ec66791e" exitCode=0 Feb 02 12:34:53 crc kubenswrapper[4909]: I0202 12:34:53.486475 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bqpxr" event={"ID":"4134e849-bc83-4bba-a580-de513b2ef76b","Type":"ContainerDied","Data":"fa732a5a82e9633d8264bd213e603af1acd37b5d850387604d37ab86ec66791e"} Feb 02 12:34:53 crc kubenswrapper[4909]: I0202 12:34:53.486509 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bqpxr" event={"ID":"4134e849-bc83-4bba-a580-de513b2ef76b","Type":"ContainerDied","Data":"f08d549b4c32d17d03bc4f4fb7fb4897166f9eeaba0d79918f9d8b9522654fcb"} Feb 02 12:34:53 crc kubenswrapper[4909]: I0202 12:34:53.486529 4909 scope.go:117] "RemoveContainer" containerID="fa732a5a82e9633d8264bd213e603af1acd37b5d850387604d37ab86ec66791e" Feb 02 12:34:53 crc kubenswrapper[4909]: I0202 12:34:53.486566 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bqpxr" Feb 02 12:34:53 crc kubenswrapper[4909]: I0202 12:34:53.517172 4909 scope.go:117] "RemoveContainer" containerID="c22dc6c65d42e5a6580206883e8a5ff1307ae9230358d70fab283c1ab257d16d" Feb 02 12:34:53 crc kubenswrapper[4909]: I0202 12:34:53.544957 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bqpxr"] Feb 02 12:34:53 crc kubenswrapper[4909]: I0202 12:34:53.558479 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bqpxr"] Feb 02 12:34:53 crc kubenswrapper[4909]: I0202 12:34:53.566451 4909 scope.go:117] "RemoveContainer" containerID="e6f57748923e08ef7351ea5a8b53d81608d495a80ba0cef46e5242691a802a9f" Feb 02 12:34:53 crc kubenswrapper[4909]: I0202 12:34:53.607071 4909 scope.go:117] "RemoveContainer" containerID="fa732a5a82e9633d8264bd213e603af1acd37b5d850387604d37ab86ec66791e" Feb 02 12:34:53 crc kubenswrapper[4909]: E0202 12:34:53.607943 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa732a5a82e9633d8264bd213e603af1acd37b5d850387604d37ab86ec66791e\": container with ID starting with fa732a5a82e9633d8264bd213e603af1acd37b5d850387604d37ab86ec66791e not found: ID does not exist" containerID="fa732a5a82e9633d8264bd213e603af1acd37b5d850387604d37ab86ec66791e" Feb 02 12:34:53 crc kubenswrapper[4909]: I0202 12:34:53.608023 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa732a5a82e9633d8264bd213e603af1acd37b5d850387604d37ab86ec66791e"} err="failed to get container status \"fa732a5a82e9633d8264bd213e603af1acd37b5d850387604d37ab86ec66791e\": rpc error: code = NotFound desc = could not find container \"fa732a5a82e9633d8264bd213e603af1acd37b5d850387604d37ab86ec66791e\": container with ID starting with fa732a5a82e9633d8264bd213e603af1acd37b5d850387604d37ab86ec66791e not 
found: ID does not exist" Feb 02 12:34:53 crc kubenswrapper[4909]: I0202 12:34:53.608093 4909 scope.go:117] "RemoveContainer" containerID="c22dc6c65d42e5a6580206883e8a5ff1307ae9230358d70fab283c1ab257d16d" Feb 02 12:34:53 crc kubenswrapper[4909]: E0202 12:34:53.608731 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c22dc6c65d42e5a6580206883e8a5ff1307ae9230358d70fab283c1ab257d16d\": container with ID starting with c22dc6c65d42e5a6580206883e8a5ff1307ae9230358d70fab283c1ab257d16d not found: ID does not exist" containerID="c22dc6c65d42e5a6580206883e8a5ff1307ae9230358d70fab283c1ab257d16d" Feb 02 12:34:53 crc kubenswrapper[4909]: I0202 12:34:53.608796 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c22dc6c65d42e5a6580206883e8a5ff1307ae9230358d70fab283c1ab257d16d"} err="failed to get container status \"c22dc6c65d42e5a6580206883e8a5ff1307ae9230358d70fab283c1ab257d16d\": rpc error: code = NotFound desc = could not find container \"c22dc6c65d42e5a6580206883e8a5ff1307ae9230358d70fab283c1ab257d16d\": container with ID starting with c22dc6c65d42e5a6580206883e8a5ff1307ae9230358d70fab283c1ab257d16d not found: ID does not exist" Feb 02 12:34:53 crc kubenswrapper[4909]: I0202 12:34:53.608873 4909 scope.go:117] "RemoveContainer" containerID="e6f57748923e08ef7351ea5a8b53d81608d495a80ba0cef46e5242691a802a9f" Feb 02 12:34:53 crc kubenswrapper[4909]: E0202 12:34:53.610416 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6f57748923e08ef7351ea5a8b53d81608d495a80ba0cef46e5242691a802a9f\": container with ID starting with e6f57748923e08ef7351ea5a8b53d81608d495a80ba0cef46e5242691a802a9f not found: ID does not exist" containerID="e6f57748923e08ef7351ea5a8b53d81608d495a80ba0cef46e5242691a802a9f" Feb 02 12:34:53 crc kubenswrapper[4909]: I0202 12:34:53.610451 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6f57748923e08ef7351ea5a8b53d81608d495a80ba0cef46e5242691a802a9f"} err="failed to get container status \"e6f57748923e08ef7351ea5a8b53d81608d495a80ba0cef46e5242691a802a9f\": rpc error: code = NotFound desc = could not find container \"e6f57748923e08ef7351ea5a8b53d81608d495a80ba0cef46e5242691a802a9f\": container with ID starting with e6f57748923e08ef7351ea5a8b53d81608d495a80ba0cef46e5242691a802a9f not found: ID does not exist" Feb 02 12:34:55 crc kubenswrapper[4909]: I0202 12:34:55.033759 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4134e849-bc83-4bba-a580-de513b2ef76b" path="/var/lib/kubelet/pods/4134e849-bc83-4bba-a580-de513b2ef76b/volumes" Feb 02 12:35:33 crc kubenswrapper[4909]: I0202 12:35:33.870990 4909 generic.go:334] "Generic (PLEG): container finished" podID="44aa8cf1-5580-47f1-b457-c1afd10ffa00" containerID="1d9726de0c5d5754669cdc9f6e5e51207fa5247001f05f68f2f39ebde35576f6" exitCode=0 Feb 02 12:35:33 crc kubenswrapper[4909]: I0202 12:35:33.871069 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-7pdh7" event={"ID":"44aa8cf1-5580-47f1-b457-c1afd10ffa00","Type":"ContainerDied","Data":"1d9726de0c5d5754669cdc9f6e5e51207fa5247001f05f68f2f39ebde35576f6"} Feb 02 12:35:35 crc kubenswrapper[4909]: I0202 12:35:35.422991 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-7pdh7" Feb 02 12:35:35 crc kubenswrapper[4909]: I0202 12:35:35.607159 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/44aa8cf1-5580-47f1-b457-c1afd10ffa00-ssh-key-openstack-cell1\") pod \"44aa8cf1-5580-47f1-b457-c1afd10ffa00\" (UID: \"44aa8cf1-5580-47f1-b457-c1afd10ffa00\") " Feb 02 12:35:35 crc kubenswrapper[4909]: I0202 12:35:35.607478 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44aa8cf1-5580-47f1-b457-c1afd10ffa00-inventory\") pod \"44aa8cf1-5580-47f1-b457-c1afd10ffa00\" (UID: \"44aa8cf1-5580-47f1-b457-c1afd10ffa00\") " Feb 02 12:35:35 crc kubenswrapper[4909]: I0202 12:35:35.607513 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nklv\" (UniqueName: \"kubernetes.io/projected/44aa8cf1-5580-47f1-b457-c1afd10ffa00-kube-api-access-6nklv\") pod \"44aa8cf1-5580-47f1-b457-c1afd10ffa00\" (UID: \"44aa8cf1-5580-47f1-b457-c1afd10ffa00\") " Feb 02 12:35:35 crc kubenswrapper[4909]: I0202 12:35:35.612898 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44aa8cf1-5580-47f1-b457-c1afd10ffa00-kube-api-access-6nklv" (OuterVolumeSpecName: "kube-api-access-6nklv") pod "44aa8cf1-5580-47f1-b457-c1afd10ffa00" (UID: "44aa8cf1-5580-47f1-b457-c1afd10ffa00"). InnerVolumeSpecName "kube-api-access-6nklv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:35:35 crc kubenswrapper[4909]: I0202 12:35:35.635547 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44aa8cf1-5580-47f1-b457-c1afd10ffa00-inventory" (OuterVolumeSpecName: "inventory") pod "44aa8cf1-5580-47f1-b457-c1afd10ffa00" (UID: "44aa8cf1-5580-47f1-b457-c1afd10ffa00"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:35:35 crc kubenswrapper[4909]: I0202 12:35:35.636693 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44aa8cf1-5580-47f1-b457-c1afd10ffa00-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "44aa8cf1-5580-47f1-b457-c1afd10ffa00" (UID: "44aa8cf1-5580-47f1-b457-c1afd10ffa00"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:35:35 crc kubenswrapper[4909]: I0202 12:35:35.710215 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/44aa8cf1-5580-47f1-b457-c1afd10ffa00-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 02 12:35:35 crc kubenswrapper[4909]: I0202 12:35:35.710249 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44aa8cf1-5580-47f1-b457-c1afd10ffa00-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 12:35:35 crc kubenswrapper[4909]: I0202 12:35:35.710260 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nklv\" (UniqueName: \"kubernetes.io/projected/44aa8cf1-5580-47f1-b457-c1afd10ffa00-kube-api-access-6nklv\") on node \"crc\" DevicePath \"\"" Feb 02 12:35:35 crc kubenswrapper[4909]: I0202 12:35:35.894450 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-7pdh7" event={"ID":"44aa8cf1-5580-47f1-b457-c1afd10ffa00","Type":"ContainerDied","Data":"bf6c470ab9419a3e99bc8716d408af29506a569e66fffb6fab0a89d35fb45bac"} Feb 02 12:35:35 crc kubenswrapper[4909]: I0202 12:35:35.894505 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf6c470ab9419a3e99bc8716d408af29506a569e66fffb6fab0a89d35fb45bac" Feb 02 12:35:35 crc kubenswrapper[4909]: I0202 12:35:35.894516 4909 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-7pdh7" Feb 02 12:35:35 crc kubenswrapper[4909]: I0202 12:35:35.977045 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-n2j66"] Feb 02 12:35:35 crc kubenswrapper[4909]: E0202 12:35:35.977562 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4134e849-bc83-4bba-a580-de513b2ef76b" containerName="registry-server" Feb 02 12:35:35 crc kubenswrapper[4909]: I0202 12:35:35.977587 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4134e849-bc83-4bba-a580-de513b2ef76b" containerName="registry-server" Feb 02 12:35:35 crc kubenswrapper[4909]: E0202 12:35:35.977614 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4134e849-bc83-4bba-a580-de513b2ef76b" containerName="extract-content" Feb 02 12:35:35 crc kubenswrapper[4909]: I0202 12:35:35.977622 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4134e849-bc83-4bba-a580-de513b2ef76b" containerName="extract-content" Feb 02 12:35:35 crc kubenswrapper[4909]: E0202 12:35:35.977637 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4134e849-bc83-4bba-a580-de513b2ef76b" containerName="extract-utilities" Feb 02 12:35:35 crc kubenswrapper[4909]: I0202 12:35:35.977645 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4134e849-bc83-4bba-a580-de513b2ef76b" containerName="extract-utilities" Feb 02 12:35:35 crc kubenswrapper[4909]: E0202 12:35:35.977660 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44aa8cf1-5580-47f1-b457-c1afd10ffa00" containerName="download-cache-openstack-openstack-cell1" Feb 02 12:35:35 crc kubenswrapper[4909]: I0202 12:35:35.977667 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="44aa8cf1-5580-47f1-b457-c1afd10ffa00" containerName="download-cache-openstack-openstack-cell1" Feb 02 12:35:35 crc kubenswrapper[4909]: I0202 12:35:35.977962 4909 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4134e849-bc83-4bba-a580-de513b2ef76b" containerName="registry-server" Feb 02 12:35:35 crc kubenswrapper[4909]: I0202 12:35:35.977982 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="44aa8cf1-5580-47f1-b457-c1afd10ffa00" containerName="download-cache-openstack-openstack-cell1" Feb 02 12:35:35 crc kubenswrapper[4909]: I0202 12:35:35.978884 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-n2j66" Feb 02 12:35:35 crc kubenswrapper[4909]: I0202 12:35:35.981138 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-lfs7z" Feb 02 12:35:35 crc kubenswrapper[4909]: I0202 12:35:35.981334 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 12:35:35 crc kubenswrapper[4909]: I0202 12:35:35.981461 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 02 12:35:35 crc kubenswrapper[4909]: I0202 12:35:35.983419 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 02 12:35:36 crc kubenswrapper[4909]: I0202 12:35:36.010341 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-n2j66"] Feb 02 12:35:36 crc kubenswrapper[4909]: I0202 12:35:36.017355 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px4qk\" (UniqueName: \"kubernetes.io/projected/e174bd5f-7fab-4440-aeb4-a5bcf55273b1-kube-api-access-px4qk\") pod \"configure-network-openstack-openstack-cell1-n2j66\" (UID: \"e174bd5f-7fab-4440-aeb4-a5bcf55273b1\") " pod="openstack/configure-network-openstack-openstack-cell1-n2j66" Feb 02 12:35:36 crc kubenswrapper[4909]: I0202 12:35:36.017525 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e174bd5f-7fab-4440-aeb4-a5bcf55273b1-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-n2j66\" (UID: \"e174bd5f-7fab-4440-aeb4-a5bcf55273b1\") " pod="openstack/configure-network-openstack-openstack-cell1-n2j66" Feb 02 12:35:36 crc kubenswrapper[4909]: I0202 12:35:36.017596 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e174bd5f-7fab-4440-aeb4-a5bcf55273b1-inventory\") pod \"configure-network-openstack-openstack-cell1-n2j66\" (UID: \"e174bd5f-7fab-4440-aeb4-a5bcf55273b1\") " pod="openstack/configure-network-openstack-openstack-cell1-n2j66" Feb 02 12:35:36 crc kubenswrapper[4909]: I0202 12:35:36.119521 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e174bd5f-7fab-4440-aeb4-a5bcf55273b1-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-n2j66\" (UID: \"e174bd5f-7fab-4440-aeb4-a5bcf55273b1\") " pod="openstack/configure-network-openstack-openstack-cell1-n2j66" Feb 02 12:35:36 crc kubenswrapper[4909]: I0202 12:35:36.119607 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e174bd5f-7fab-4440-aeb4-a5bcf55273b1-inventory\") pod \"configure-network-openstack-openstack-cell1-n2j66\" (UID: \"e174bd5f-7fab-4440-aeb4-a5bcf55273b1\") " pod="openstack/configure-network-openstack-openstack-cell1-n2j66" Feb 02 12:35:36 crc kubenswrapper[4909]: I0202 12:35:36.119734 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px4qk\" (UniqueName: \"kubernetes.io/projected/e174bd5f-7fab-4440-aeb4-a5bcf55273b1-kube-api-access-px4qk\") pod 
\"configure-network-openstack-openstack-cell1-n2j66\" (UID: \"e174bd5f-7fab-4440-aeb4-a5bcf55273b1\") " pod="openstack/configure-network-openstack-openstack-cell1-n2j66" Feb 02 12:35:36 crc kubenswrapper[4909]: I0202 12:35:36.122862 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e174bd5f-7fab-4440-aeb4-a5bcf55273b1-inventory\") pod \"configure-network-openstack-openstack-cell1-n2j66\" (UID: \"e174bd5f-7fab-4440-aeb4-a5bcf55273b1\") " pod="openstack/configure-network-openstack-openstack-cell1-n2j66" Feb 02 12:35:36 crc kubenswrapper[4909]: I0202 12:35:36.123018 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e174bd5f-7fab-4440-aeb4-a5bcf55273b1-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-n2j66\" (UID: \"e174bd5f-7fab-4440-aeb4-a5bcf55273b1\") " pod="openstack/configure-network-openstack-openstack-cell1-n2j66" Feb 02 12:35:36 crc kubenswrapper[4909]: I0202 12:35:36.135052 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px4qk\" (UniqueName: \"kubernetes.io/projected/e174bd5f-7fab-4440-aeb4-a5bcf55273b1-kube-api-access-px4qk\") pod \"configure-network-openstack-openstack-cell1-n2j66\" (UID: \"e174bd5f-7fab-4440-aeb4-a5bcf55273b1\") " pod="openstack/configure-network-openstack-openstack-cell1-n2j66" Feb 02 12:35:36 crc kubenswrapper[4909]: I0202 12:35:36.300376 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-n2j66" Feb 02 12:35:36 crc kubenswrapper[4909]: I0202 12:35:36.862954 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-n2j66"] Feb 02 12:35:36 crc kubenswrapper[4909]: I0202 12:35:36.873405 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 12:35:36 crc kubenswrapper[4909]: I0202 12:35:36.905921 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-n2j66" event={"ID":"e174bd5f-7fab-4440-aeb4-a5bcf55273b1","Type":"ContainerStarted","Data":"6a31d7d89060fbc58c70234154d3c2bd155db15c1fc272d82dad221cbd01d193"} Feb 02 12:35:37 crc kubenswrapper[4909]: I0202 12:35:37.916655 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-n2j66" event={"ID":"e174bd5f-7fab-4440-aeb4-a5bcf55273b1","Type":"ContainerStarted","Data":"0705f0f2fbcfcf873f630060826641f2a9368b2d808c71c41115e302debf451a"} Feb 02 12:35:37 crc kubenswrapper[4909]: I0202 12:35:37.937305 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-n2j66" podStartSLOduration=2.5470322469999997 podStartE2EDuration="2.937284978s" podCreationTimestamp="2026-02-02 12:35:35 +0000 UTC" firstStartedPulling="2026-02-02 12:35:36.873147123 +0000 UTC m=+7462.619247858" lastFinishedPulling="2026-02-02 12:35:37.263399854 +0000 UTC m=+7463.009500589" observedRunningTime="2026-02-02 12:35:37.9334764 +0000 UTC m=+7463.679577145" watchObservedRunningTime="2026-02-02 12:35:37.937284978 +0000 UTC m=+7463.683385713" Feb 02 12:36:19 crc kubenswrapper[4909]: I0202 12:36:19.510716 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:36:19 crc kubenswrapper[4909]: I0202 12:36:19.511275 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:36:49 crc kubenswrapper[4909]: I0202 12:36:49.510681 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:36:49 crc kubenswrapper[4909]: I0202 12:36:49.511212 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:36:55 crc kubenswrapper[4909]: I0202 12:36:55.621633 4909 generic.go:334] "Generic (PLEG): container finished" podID="e174bd5f-7fab-4440-aeb4-a5bcf55273b1" containerID="0705f0f2fbcfcf873f630060826641f2a9368b2d808c71c41115e302debf451a" exitCode=0 Feb 02 12:36:55 crc kubenswrapper[4909]: I0202 12:36:55.621799 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-n2j66" event={"ID":"e174bd5f-7fab-4440-aeb4-a5bcf55273b1","Type":"ContainerDied","Data":"0705f0f2fbcfcf873f630060826641f2a9368b2d808c71c41115e302debf451a"} Feb 02 12:36:57 crc kubenswrapper[4909]: I0202 12:36:57.094925 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-n2j66" Feb 02 12:36:57 crc kubenswrapper[4909]: I0202 12:36:57.214382 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e174bd5f-7fab-4440-aeb4-a5bcf55273b1-ssh-key-openstack-cell1\") pod \"e174bd5f-7fab-4440-aeb4-a5bcf55273b1\" (UID: \"e174bd5f-7fab-4440-aeb4-a5bcf55273b1\") " Feb 02 12:36:57 crc kubenswrapper[4909]: I0202 12:36:57.214439 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px4qk\" (UniqueName: \"kubernetes.io/projected/e174bd5f-7fab-4440-aeb4-a5bcf55273b1-kube-api-access-px4qk\") pod \"e174bd5f-7fab-4440-aeb4-a5bcf55273b1\" (UID: \"e174bd5f-7fab-4440-aeb4-a5bcf55273b1\") " Feb 02 12:36:57 crc kubenswrapper[4909]: I0202 12:36:57.214480 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e174bd5f-7fab-4440-aeb4-a5bcf55273b1-inventory\") pod \"e174bd5f-7fab-4440-aeb4-a5bcf55273b1\" (UID: \"e174bd5f-7fab-4440-aeb4-a5bcf55273b1\") " Feb 02 12:36:57 crc kubenswrapper[4909]: I0202 12:36:57.220017 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e174bd5f-7fab-4440-aeb4-a5bcf55273b1-kube-api-access-px4qk" (OuterVolumeSpecName: "kube-api-access-px4qk") pod "e174bd5f-7fab-4440-aeb4-a5bcf55273b1" (UID: "e174bd5f-7fab-4440-aeb4-a5bcf55273b1"). InnerVolumeSpecName "kube-api-access-px4qk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:36:57 crc kubenswrapper[4909]: I0202 12:36:57.243618 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e174bd5f-7fab-4440-aeb4-a5bcf55273b1-inventory" (OuterVolumeSpecName: "inventory") pod "e174bd5f-7fab-4440-aeb4-a5bcf55273b1" (UID: "e174bd5f-7fab-4440-aeb4-a5bcf55273b1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:36:57 crc kubenswrapper[4909]: I0202 12:36:57.245299 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e174bd5f-7fab-4440-aeb4-a5bcf55273b1-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "e174bd5f-7fab-4440-aeb4-a5bcf55273b1" (UID: "e174bd5f-7fab-4440-aeb4-a5bcf55273b1"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:36:57 crc kubenswrapper[4909]: I0202 12:36:57.317137 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e174bd5f-7fab-4440-aeb4-a5bcf55273b1-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 02 12:36:57 crc kubenswrapper[4909]: I0202 12:36:57.317178 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px4qk\" (UniqueName: \"kubernetes.io/projected/e174bd5f-7fab-4440-aeb4-a5bcf55273b1-kube-api-access-px4qk\") on node \"crc\" DevicePath \"\"" Feb 02 12:36:57 crc kubenswrapper[4909]: I0202 12:36:57.317189 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e174bd5f-7fab-4440-aeb4-a5bcf55273b1-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 12:36:57 crc kubenswrapper[4909]: I0202 12:36:57.639930 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-n2j66" event={"ID":"e174bd5f-7fab-4440-aeb4-a5bcf55273b1","Type":"ContainerDied","Data":"6a31d7d89060fbc58c70234154d3c2bd155db15c1fc272d82dad221cbd01d193"} Feb 02 12:36:57 crc kubenswrapper[4909]: I0202 12:36:57.639970 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-n2j66" Feb 02 12:36:57 crc kubenswrapper[4909]: I0202 12:36:57.639978 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a31d7d89060fbc58c70234154d3c2bd155db15c1fc272d82dad221cbd01d193" Feb 02 12:36:57 crc kubenswrapper[4909]: I0202 12:36:57.730393 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-pnnsn"] Feb 02 12:36:57 crc kubenswrapper[4909]: E0202 12:36:57.730838 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e174bd5f-7fab-4440-aeb4-a5bcf55273b1" containerName="configure-network-openstack-openstack-cell1" Feb 02 12:36:57 crc kubenswrapper[4909]: I0202 12:36:57.730856 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e174bd5f-7fab-4440-aeb4-a5bcf55273b1" containerName="configure-network-openstack-openstack-cell1" Feb 02 12:36:57 crc kubenswrapper[4909]: I0202 12:36:57.731078 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e174bd5f-7fab-4440-aeb4-a5bcf55273b1" containerName="configure-network-openstack-openstack-cell1" Feb 02 12:36:57 crc kubenswrapper[4909]: I0202 12:36:57.731791 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-pnnsn" Feb 02 12:36:57 crc kubenswrapper[4909]: I0202 12:36:57.734244 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 02 12:36:57 crc kubenswrapper[4909]: I0202 12:36:57.734734 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-lfs7z" Feb 02 12:36:57 crc kubenswrapper[4909]: I0202 12:36:57.735029 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 12:36:57 crc kubenswrapper[4909]: I0202 12:36:57.735353 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 02 12:36:57 crc kubenswrapper[4909]: I0202 12:36:57.746179 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-pnnsn"] Feb 02 12:36:57 crc kubenswrapper[4909]: I0202 12:36:57.927943 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ad64480d-8473-47ba-9879-94a19f802dbf-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-pnnsn\" (UID: \"ad64480d-8473-47ba-9879-94a19f802dbf\") " pod="openstack/validate-network-openstack-openstack-cell1-pnnsn" Feb 02 12:36:57 crc kubenswrapper[4909]: I0202 12:36:57.928090 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgwkn\" (UniqueName: \"kubernetes.io/projected/ad64480d-8473-47ba-9879-94a19f802dbf-kube-api-access-xgwkn\") pod \"validate-network-openstack-openstack-cell1-pnnsn\" (UID: \"ad64480d-8473-47ba-9879-94a19f802dbf\") " pod="openstack/validate-network-openstack-openstack-cell1-pnnsn" Feb 02 12:36:57 crc kubenswrapper[4909]: I0202 12:36:57.928134 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad64480d-8473-47ba-9879-94a19f802dbf-inventory\") pod \"validate-network-openstack-openstack-cell1-pnnsn\" (UID: \"ad64480d-8473-47ba-9879-94a19f802dbf\") " pod="openstack/validate-network-openstack-openstack-cell1-pnnsn" Feb 02 12:36:58 crc kubenswrapper[4909]: I0202 12:36:58.030591 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgwkn\" (UniqueName: \"kubernetes.io/projected/ad64480d-8473-47ba-9879-94a19f802dbf-kube-api-access-xgwkn\") pod \"validate-network-openstack-openstack-cell1-pnnsn\" (UID: \"ad64480d-8473-47ba-9879-94a19f802dbf\") " pod="openstack/validate-network-openstack-openstack-cell1-pnnsn" Feb 02 12:36:58 crc kubenswrapper[4909]: I0202 12:36:58.030694 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad64480d-8473-47ba-9879-94a19f802dbf-inventory\") pod \"validate-network-openstack-openstack-cell1-pnnsn\" (UID: \"ad64480d-8473-47ba-9879-94a19f802dbf\") " pod="openstack/validate-network-openstack-openstack-cell1-pnnsn" Feb 02 12:36:58 crc kubenswrapper[4909]: I0202 12:36:58.030884 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ad64480d-8473-47ba-9879-94a19f802dbf-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-pnnsn\" (UID: \"ad64480d-8473-47ba-9879-94a19f802dbf\") " pod="openstack/validate-network-openstack-openstack-cell1-pnnsn" Feb 02 12:36:58 crc kubenswrapper[4909]: I0202 12:36:58.039304 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ad64480d-8473-47ba-9879-94a19f802dbf-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-pnnsn\" (UID: 
\"ad64480d-8473-47ba-9879-94a19f802dbf\") " pod="openstack/validate-network-openstack-openstack-cell1-pnnsn" Feb 02 12:36:58 crc kubenswrapper[4909]: I0202 12:36:58.040147 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad64480d-8473-47ba-9879-94a19f802dbf-inventory\") pod \"validate-network-openstack-openstack-cell1-pnnsn\" (UID: \"ad64480d-8473-47ba-9879-94a19f802dbf\") " pod="openstack/validate-network-openstack-openstack-cell1-pnnsn" Feb 02 12:36:58 crc kubenswrapper[4909]: I0202 12:36:58.055049 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgwkn\" (UniqueName: \"kubernetes.io/projected/ad64480d-8473-47ba-9879-94a19f802dbf-kube-api-access-xgwkn\") pod \"validate-network-openstack-openstack-cell1-pnnsn\" (UID: \"ad64480d-8473-47ba-9879-94a19f802dbf\") " pod="openstack/validate-network-openstack-openstack-cell1-pnnsn" Feb 02 12:36:58 crc kubenswrapper[4909]: I0202 12:36:58.085830 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-pnnsn" Feb 02 12:36:58 crc kubenswrapper[4909]: I0202 12:36:58.623164 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-pnnsn"] Feb 02 12:36:58 crc kubenswrapper[4909]: I0202 12:36:58.657515 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-pnnsn" event={"ID":"ad64480d-8473-47ba-9879-94a19f802dbf","Type":"ContainerStarted","Data":"37da32c6a32199a61beaac78d2125164b6a5355d139cf8cc37d53fe6fd178a22"} Feb 02 12:36:59 crc kubenswrapper[4909]: I0202 12:36:59.667771 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-pnnsn" event={"ID":"ad64480d-8473-47ba-9879-94a19f802dbf","Type":"ContainerStarted","Data":"f18d991c9ac4b0e3597940acf801ab81dbd8f2b5836e6517f4a7e95dd6af3785"} Feb 02 12:36:59 crc kubenswrapper[4909]: I0202 12:36:59.688797 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-pnnsn" podStartSLOduration=2.283491626 podStartE2EDuration="2.688778464s" podCreationTimestamp="2026-02-02 12:36:57 +0000 UTC" firstStartedPulling="2026-02-02 12:36:58.629627631 +0000 UTC m=+7544.375728366" lastFinishedPulling="2026-02-02 12:36:59.034914469 +0000 UTC m=+7544.781015204" observedRunningTime="2026-02-02 12:36:59.681128367 +0000 UTC m=+7545.427229102" watchObservedRunningTime="2026-02-02 12:36:59.688778464 +0000 UTC m=+7545.434879199" Feb 02 12:37:04 crc kubenswrapper[4909]: I0202 12:37:04.710200 4909 generic.go:334] "Generic (PLEG): container finished" podID="ad64480d-8473-47ba-9879-94a19f802dbf" containerID="f18d991c9ac4b0e3597940acf801ab81dbd8f2b5836e6517f4a7e95dd6af3785" exitCode=0 Feb 02 12:37:04 crc kubenswrapper[4909]: I0202 12:37:04.710287 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/validate-network-openstack-openstack-cell1-pnnsn" event={"ID":"ad64480d-8473-47ba-9879-94a19f802dbf","Type":"ContainerDied","Data":"f18d991c9ac4b0e3597940acf801ab81dbd8f2b5836e6517f4a7e95dd6af3785"} Feb 02 12:37:06 crc kubenswrapper[4909]: I0202 12:37:06.179559 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-pnnsn" Feb 02 12:37:06 crc kubenswrapper[4909]: I0202 12:37:06.302548 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad64480d-8473-47ba-9879-94a19f802dbf-inventory\") pod \"ad64480d-8473-47ba-9879-94a19f802dbf\" (UID: \"ad64480d-8473-47ba-9879-94a19f802dbf\") " Feb 02 12:37:06 crc kubenswrapper[4909]: I0202 12:37:06.302739 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgwkn\" (UniqueName: \"kubernetes.io/projected/ad64480d-8473-47ba-9879-94a19f802dbf-kube-api-access-xgwkn\") pod \"ad64480d-8473-47ba-9879-94a19f802dbf\" (UID: \"ad64480d-8473-47ba-9879-94a19f802dbf\") " Feb 02 12:37:06 crc kubenswrapper[4909]: I0202 12:37:06.303032 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ad64480d-8473-47ba-9879-94a19f802dbf-ssh-key-openstack-cell1\") pod \"ad64480d-8473-47ba-9879-94a19f802dbf\" (UID: \"ad64480d-8473-47ba-9879-94a19f802dbf\") " Feb 02 12:37:06 crc kubenswrapper[4909]: I0202 12:37:06.309011 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad64480d-8473-47ba-9879-94a19f802dbf-kube-api-access-xgwkn" (OuterVolumeSpecName: "kube-api-access-xgwkn") pod "ad64480d-8473-47ba-9879-94a19f802dbf" (UID: "ad64480d-8473-47ba-9879-94a19f802dbf"). InnerVolumeSpecName "kube-api-access-xgwkn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:37:06 crc kubenswrapper[4909]: I0202 12:37:06.333397 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad64480d-8473-47ba-9879-94a19f802dbf-inventory" (OuterVolumeSpecName: "inventory") pod "ad64480d-8473-47ba-9879-94a19f802dbf" (UID: "ad64480d-8473-47ba-9879-94a19f802dbf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:37:06 crc kubenswrapper[4909]: I0202 12:37:06.338220 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad64480d-8473-47ba-9879-94a19f802dbf-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "ad64480d-8473-47ba-9879-94a19f802dbf" (UID: "ad64480d-8473-47ba-9879-94a19f802dbf"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:37:06 crc kubenswrapper[4909]: I0202 12:37:06.405396 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ad64480d-8473-47ba-9879-94a19f802dbf-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 02 12:37:06 crc kubenswrapper[4909]: I0202 12:37:06.405432 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad64480d-8473-47ba-9879-94a19f802dbf-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 12:37:06 crc kubenswrapper[4909]: I0202 12:37:06.405443 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgwkn\" (UniqueName: \"kubernetes.io/projected/ad64480d-8473-47ba-9879-94a19f802dbf-kube-api-access-xgwkn\") on node \"crc\" DevicePath \"\"" Feb 02 12:37:06 crc kubenswrapper[4909]: I0202 12:37:06.739208 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-pnnsn" 
event={"ID":"ad64480d-8473-47ba-9879-94a19f802dbf","Type":"ContainerDied","Data":"37da32c6a32199a61beaac78d2125164b6a5355d139cf8cc37d53fe6fd178a22"} Feb 02 12:37:06 crc kubenswrapper[4909]: I0202 12:37:06.739523 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37da32c6a32199a61beaac78d2125164b6a5355d139cf8cc37d53fe6fd178a22" Feb 02 12:37:06 crc kubenswrapper[4909]: I0202 12:37:06.739307 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-pnnsn" Feb 02 12:37:06 crc kubenswrapper[4909]: I0202 12:37:06.804258 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-2cwfg"] Feb 02 12:37:06 crc kubenswrapper[4909]: E0202 12:37:06.804846 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad64480d-8473-47ba-9879-94a19f802dbf" containerName="validate-network-openstack-openstack-cell1" Feb 02 12:37:06 crc kubenswrapper[4909]: I0202 12:37:06.804862 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad64480d-8473-47ba-9879-94a19f802dbf" containerName="validate-network-openstack-openstack-cell1" Feb 02 12:37:06 crc kubenswrapper[4909]: I0202 12:37:06.805139 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad64480d-8473-47ba-9879-94a19f802dbf" containerName="validate-network-openstack-openstack-cell1" Feb 02 12:37:06 crc kubenswrapper[4909]: I0202 12:37:06.806128 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-2cwfg" Feb 02 12:37:06 crc kubenswrapper[4909]: I0202 12:37:06.809190 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 12:37:06 crc kubenswrapper[4909]: I0202 12:37:06.809324 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-lfs7z" Feb 02 12:37:06 crc kubenswrapper[4909]: I0202 12:37:06.809449 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 02 12:37:06 crc kubenswrapper[4909]: I0202 12:37:06.809463 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 02 12:37:06 crc kubenswrapper[4909]: I0202 12:37:06.828284 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-2cwfg"] Feb 02 12:37:06 crc kubenswrapper[4909]: I0202 12:37:06.920633 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7057aa50-a591-411f-8c66-1abd54d955b6-inventory\") pod \"install-os-openstack-openstack-cell1-2cwfg\" (UID: \"7057aa50-a591-411f-8c66-1abd54d955b6\") " pod="openstack/install-os-openstack-openstack-cell1-2cwfg" Feb 02 12:37:06 crc kubenswrapper[4909]: I0202 12:37:06.920867 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh64d\" (UniqueName: \"kubernetes.io/projected/7057aa50-a591-411f-8c66-1abd54d955b6-kube-api-access-qh64d\") pod \"install-os-openstack-openstack-cell1-2cwfg\" (UID: \"7057aa50-a591-411f-8c66-1abd54d955b6\") " pod="openstack/install-os-openstack-openstack-cell1-2cwfg" Feb 02 12:37:06 crc kubenswrapper[4909]: I0202 12:37:06.920906 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7057aa50-a591-411f-8c66-1abd54d955b6-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-2cwfg\" (UID: \"7057aa50-a591-411f-8c66-1abd54d955b6\") " pod="openstack/install-os-openstack-openstack-cell1-2cwfg" Feb 02 12:37:07 crc kubenswrapper[4909]: I0202 12:37:07.022492 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7057aa50-a591-411f-8c66-1abd54d955b6-inventory\") pod \"install-os-openstack-openstack-cell1-2cwfg\" (UID: \"7057aa50-a591-411f-8c66-1abd54d955b6\") " pod="openstack/install-os-openstack-openstack-cell1-2cwfg" Feb 02 12:37:07 crc kubenswrapper[4909]: I0202 12:37:07.022552 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh64d\" (UniqueName: \"kubernetes.io/projected/7057aa50-a591-411f-8c66-1abd54d955b6-kube-api-access-qh64d\") pod \"install-os-openstack-openstack-cell1-2cwfg\" (UID: \"7057aa50-a591-411f-8c66-1abd54d955b6\") " pod="openstack/install-os-openstack-openstack-cell1-2cwfg" Feb 02 12:37:07 crc kubenswrapper[4909]: I0202 12:37:07.022575 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7057aa50-a591-411f-8c66-1abd54d955b6-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-2cwfg\" (UID: \"7057aa50-a591-411f-8c66-1abd54d955b6\") " pod="openstack/install-os-openstack-openstack-cell1-2cwfg" Feb 02 12:37:07 crc kubenswrapper[4909]: I0202 12:37:07.027152 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7057aa50-a591-411f-8c66-1abd54d955b6-inventory\") pod \"install-os-openstack-openstack-cell1-2cwfg\" (UID: \"7057aa50-a591-411f-8c66-1abd54d955b6\") " pod="openstack/install-os-openstack-openstack-cell1-2cwfg" Feb 02 12:37:07 crc kubenswrapper[4909]: I0202 
12:37:07.028663 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7057aa50-a591-411f-8c66-1abd54d955b6-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-2cwfg\" (UID: \"7057aa50-a591-411f-8c66-1abd54d955b6\") " pod="openstack/install-os-openstack-openstack-cell1-2cwfg" Feb 02 12:37:07 crc kubenswrapper[4909]: I0202 12:37:07.042030 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh64d\" (UniqueName: \"kubernetes.io/projected/7057aa50-a591-411f-8c66-1abd54d955b6-kube-api-access-qh64d\") pod \"install-os-openstack-openstack-cell1-2cwfg\" (UID: \"7057aa50-a591-411f-8c66-1abd54d955b6\") " pod="openstack/install-os-openstack-openstack-cell1-2cwfg" Feb 02 12:37:07 crc kubenswrapper[4909]: I0202 12:37:07.136760 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-2cwfg" Feb 02 12:37:07 crc kubenswrapper[4909]: I0202 12:37:07.670318 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-2cwfg"] Feb 02 12:37:07 crc kubenswrapper[4909]: I0202 12:37:07.749768 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-2cwfg" event={"ID":"7057aa50-a591-411f-8c66-1abd54d955b6","Type":"ContainerStarted","Data":"fb2293b14000e2b914de3efe522f5365549638787cca9f951e7dda5ffe1131d9"} Feb 02 12:37:08 crc kubenswrapper[4909]: I0202 12:37:08.765829 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-2cwfg" event={"ID":"7057aa50-a591-411f-8c66-1abd54d955b6","Type":"ContainerStarted","Data":"714b454326e060d2b2ed821459fa5fdee9ff3791e2a043a376de6e4bad7d4b1a"} Feb 02 12:37:08 crc kubenswrapper[4909]: I0202 12:37:08.788118 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/install-os-openstack-openstack-cell1-2cwfg" podStartSLOduration=2.208929885 podStartE2EDuration="2.78809342s" podCreationTimestamp="2026-02-02 12:37:06 +0000 UTC" firstStartedPulling="2026-02-02 12:37:07.693892261 +0000 UTC m=+7553.439992986" lastFinishedPulling="2026-02-02 12:37:08.273055786 +0000 UTC m=+7554.019156521" observedRunningTime="2026-02-02 12:37:08.784377694 +0000 UTC m=+7554.530478439" watchObservedRunningTime="2026-02-02 12:37:08.78809342 +0000 UTC m=+7554.534194155" Feb 02 12:37:19 crc kubenswrapper[4909]: I0202 12:37:19.511350 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:37:19 crc kubenswrapper[4909]: I0202 12:37:19.512028 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:37:19 crc kubenswrapper[4909]: I0202 12:37:19.512088 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 12:37:19 crc kubenswrapper[4909]: I0202 12:37:19.512981 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3c721b1aabb3d968c8ddbe1edde1cd3f18f6d49e271134714d86a9cde069c5c4"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 12:37:19 crc kubenswrapper[4909]: I0202 12:37:19.513038 4909 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://3c721b1aabb3d968c8ddbe1edde1cd3f18f6d49e271134714d86a9cde069c5c4" gracePeriod=600 Feb 02 12:37:19 crc kubenswrapper[4909]: I0202 12:37:19.897432 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="3c721b1aabb3d968c8ddbe1edde1cd3f18f6d49e271134714d86a9cde069c5c4" exitCode=0 Feb 02 12:37:19 crc kubenswrapper[4909]: I0202 12:37:19.897583 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"3c721b1aabb3d968c8ddbe1edde1cd3f18f6d49e271134714d86a9cde069c5c4"} Feb 02 12:37:19 crc kubenswrapper[4909]: I0202 12:37:19.897955 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d"} Feb 02 12:37:19 crc kubenswrapper[4909]: I0202 12:37:19.897984 4909 scope.go:117] "RemoveContainer" containerID="627d39d809eb616e52bb26085c119708331174493bf8a00574ddd468243881ba" Feb 02 12:37:24 crc kubenswrapper[4909]: I0202 12:37:24.915739 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nhnr8"] Feb 02 12:37:24 crc kubenswrapper[4909]: I0202 12:37:24.918353 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nhnr8" Feb 02 12:37:24 crc kubenswrapper[4909]: I0202 12:37:24.929853 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nhnr8"] Feb 02 12:37:24 crc kubenswrapper[4909]: I0202 12:37:24.969690 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4226h\" (UniqueName: \"kubernetes.io/projected/91e43732-7cd1-4d4b-995e-21d0e8eb5b78-kube-api-access-4226h\") pod \"redhat-marketplace-nhnr8\" (UID: \"91e43732-7cd1-4d4b-995e-21d0e8eb5b78\") " pod="openshift-marketplace/redhat-marketplace-nhnr8" Feb 02 12:37:24 crc kubenswrapper[4909]: I0202 12:37:24.969777 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91e43732-7cd1-4d4b-995e-21d0e8eb5b78-catalog-content\") pod \"redhat-marketplace-nhnr8\" (UID: \"91e43732-7cd1-4d4b-995e-21d0e8eb5b78\") " pod="openshift-marketplace/redhat-marketplace-nhnr8" Feb 02 12:37:24 crc kubenswrapper[4909]: I0202 12:37:24.970416 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91e43732-7cd1-4d4b-995e-21d0e8eb5b78-utilities\") pod \"redhat-marketplace-nhnr8\" (UID: \"91e43732-7cd1-4d4b-995e-21d0e8eb5b78\") " pod="openshift-marketplace/redhat-marketplace-nhnr8" Feb 02 12:37:25 crc kubenswrapper[4909]: I0202 12:37:25.072986 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91e43732-7cd1-4d4b-995e-21d0e8eb5b78-utilities\") pod \"redhat-marketplace-nhnr8\" (UID: \"91e43732-7cd1-4d4b-995e-21d0e8eb5b78\") " pod="openshift-marketplace/redhat-marketplace-nhnr8" Feb 02 12:37:25 crc kubenswrapper[4909]: I0202 12:37:25.073074 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-4226h\" (UniqueName: \"kubernetes.io/projected/91e43732-7cd1-4d4b-995e-21d0e8eb5b78-kube-api-access-4226h\") pod \"redhat-marketplace-nhnr8\" (UID: \"91e43732-7cd1-4d4b-995e-21d0e8eb5b78\") " pod="openshift-marketplace/redhat-marketplace-nhnr8" Feb 02 12:37:25 crc kubenswrapper[4909]: I0202 12:37:25.073204 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91e43732-7cd1-4d4b-995e-21d0e8eb5b78-catalog-content\") pod \"redhat-marketplace-nhnr8\" (UID: \"91e43732-7cd1-4d4b-995e-21d0e8eb5b78\") " pod="openshift-marketplace/redhat-marketplace-nhnr8" Feb 02 12:37:25 crc kubenswrapper[4909]: I0202 12:37:25.073553 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91e43732-7cd1-4d4b-995e-21d0e8eb5b78-utilities\") pod \"redhat-marketplace-nhnr8\" (UID: \"91e43732-7cd1-4d4b-995e-21d0e8eb5b78\") " pod="openshift-marketplace/redhat-marketplace-nhnr8" Feb 02 12:37:25 crc kubenswrapper[4909]: I0202 12:37:25.074238 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91e43732-7cd1-4d4b-995e-21d0e8eb5b78-catalog-content\") pod \"redhat-marketplace-nhnr8\" (UID: \"91e43732-7cd1-4d4b-995e-21d0e8eb5b78\") " pod="openshift-marketplace/redhat-marketplace-nhnr8" Feb 02 12:37:25 crc kubenswrapper[4909]: I0202 12:37:25.103163 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4226h\" (UniqueName: \"kubernetes.io/projected/91e43732-7cd1-4d4b-995e-21d0e8eb5b78-kube-api-access-4226h\") pod \"redhat-marketplace-nhnr8\" (UID: \"91e43732-7cd1-4d4b-995e-21d0e8eb5b78\") " pod="openshift-marketplace/redhat-marketplace-nhnr8" Feb 02 12:37:25 crc kubenswrapper[4909]: I0202 12:37:25.237787 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nhnr8" Feb 02 12:37:25 crc kubenswrapper[4909]: I0202 12:37:25.814444 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nhnr8"] Feb 02 12:37:25 crc kubenswrapper[4909]: I0202 12:37:25.957539 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nhnr8" event={"ID":"91e43732-7cd1-4d4b-995e-21d0e8eb5b78","Type":"ContainerStarted","Data":"2799103f8f823d8a47a35f7be6fcb59659aabf1ee55c0d10a21a96acc185dab9"} Feb 02 12:37:26 crc kubenswrapper[4909]: I0202 12:37:26.968367 4909 generic.go:334] "Generic (PLEG): container finished" podID="91e43732-7cd1-4d4b-995e-21d0e8eb5b78" containerID="2d9f7504bfb666d111d0b5fdefb328dc093956b2f01563d3b7a7ee8eca169603" exitCode=0 Feb 02 12:37:26 crc kubenswrapper[4909]: I0202 12:37:26.968429 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nhnr8" event={"ID":"91e43732-7cd1-4d4b-995e-21d0e8eb5b78","Type":"ContainerDied","Data":"2d9f7504bfb666d111d0b5fdefb328dc093956b2f01563d3b7a7ee8eca169603"} Feb 02 12:37:27 crc kubenswrapper[4909]: I0202 12:37:27.980281 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nhnr8" event={"ID":"91e43732-7cd1-4d4b-995e-21d0e8eb5b78","Type":"ContainerStarted","Data":"7a606f9f109dbae5bc7b154e5758c973155bf28fe3d9f37e9d088f83c872884c"} Feb 02 12:37:28 crc kubenswrapper[4909]: I0202 12:37:28.991800 4909 generic.go:334] "Generic (PLEG): container finished" podID="91e43732-7cd1-4d4b-995e-21d0e8eb5b78" containerID="7a606f9f109dbae5bc7b154e5758c973155bf28fe3d9f37e9d088f83c872884c" exitCode=0 Feb 02 12:37:28 crc kubenswrapper[4909]: I0202 12:37:28.991871 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nhnr8" 
event={"ID":"91e43732-7cd1-4d4b-995e-21d0e8eb5b78","Type":"ContainerDied","Data":"7a606f9f109dbae5bc7b154e5758c973155bf28fe3d9f37e9d088f83c872884c"} Feb 02 12:37:30 crc kubenswrapper[4909]: I0202 12:37:30.002762 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nhnr8" event={"ID":"91e43732-7cd1-4d4b-995e-21d0e8eb5b78","Type":"ContainerStarted","Data":"b9f04f53d9791569472f5efccc0a97feaf46e3c08b98b11fa5b94269265af2b2"} Feb 02 12:37:30 crc kubenswrapper[4909]: I0202 12:37:30.026473 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nhnr8" podStartSLOduration=3.581076423 podStartE2EDuration="6.026454486s" podCreationTimestamp="2026-02-02 12:37:24 +0000 UTC" firstStartedPulling="2026-02-02 12:37:26.970928629 +0000 UTC m=+7572.717029374" lastFinishedPulling="2026-02-02 12:37:29.416306702 +0000 UTC m=+7575.162407437" observedRunningTime="2026-02-02 12:37:30.024972604 +0000 UTC m=+7575.771073339" watchObservedRunningTime="2026-02-02 12:37:30.026454486 +0000 UTC m=+7575.772555221" Feb 02 12:37:35 crc kubenswrapper[4909]: I0202 12:37:35.238631 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nhnr8" Feb 02 12:37:35 crc kubenswrapper[4909]: I0202 12:37:35.240529 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nhnr8" Feb 02 12:37:35 crc kubenswrapper[4909]: I0202 12:37:35.282918 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nhnr8" Feb 02 12:37:36 crc kubenswrapper[4909]: I0202 12:37:36.102676 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nhnr8" Feb 02 12:37:36 crc kubenswrapper[4909]: I0202 12:37:36.155304 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-nhnr8"] Feb 02 12:37:38 crc kubenswrapper[4909]: I0202 12:37:38.071565 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nhnr8" podUID="91e43732-7cd1-4d4b-995e-21d0e8eb5b78" containerName="registry-server" containerID="cri-o://b9f04f53d9791569472f5efccc0a97feaf46e3c08b98b11fa5b94269265af2b2" gracePeriod=2 Feb 02 12:37:38 crc kubenswrapper[4909]: I0202 12:37:38.598857 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nhnr8" Feb 02 12:37:38 crc kubenswrapper[4909]: I0202 12:37:38.699689 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91e43732-7cd1-4d4b-995e-21d0e8eb5b78-catalog-content\") pod \"91e43732-7cd1-4d4b-995e-21d0e8eb5b78\" (UID: \"91e43732-7cd1-4d4b-995e-21d0e8eb5b78\") " Feb 02 12:37:38 crc kubenswrapper[4909]: I0202 12:37:38.699795 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4226h\" (UniqueName: \"kubernetes.io/projected/91e43732-7cd1-4d4b-995e-21d0e8eb5b78-kube-api-access-4226h\") pod \"91e43732-7cd1-4d4b-995e-21d0e8eb5b78\" (UID: \"91e43732-7cd1-4d4b-995e-21d0e8eb5b78\") " Feb 02 12:37:38 crc kubenswrapper[4909]: I0202 12:37:38.699834 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91e43732-7cd1-4d4b-995e-21d0e8eb5b78-utilities\") pod \"91e43732-7cd1-4d4b-995e-21d0e8eb5b78\" (UID: \"91e43732-7cd1-4d4b-995e-21d0e8eb5b78\") " Feb 02 12:37:38 crc kubenswrapper[4909]: I0202 12:37:38.701276 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91e43732-7cd1-4d4b-995e-21d0e8eb5b78-utilities" (OuterVolumeSpecName: "utilities") pod "91e43732-7cd1-4d4b-995e-21d0e8eb5b78" (UID: 
"91e43732-7cd1-4d4b-995e-21d0e8eb5b78"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:37:38 crc kubenswrapper[4909]: I0202 12:37:38.707237 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91e43732-7cd1-4d4b-995e-21d0e8eb5b78-kube-api-access-4226h" (OuterVolumeSpecName: "kube-api-access-4226h") pod "91e43732-7cd1-4d4b-995e-21d0e8eb5b78" (UID: "91e43732-7cd1-4d4b-995e-21d0e8eb5b78"). InnerVolumeSpecName "kube-api-access-4226h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:37:38 crc kubenswrapper[4909]: I0202 12:37:38.725699 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91e43732-7cd1-4d4b-995e-21d0e8eb5b78-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "91e43732-7cd1-4d4b-995e-21d0e8eb5b78" (UID: "91e43732-7cd1-4d4b-995e-21d0e8eb5b78"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:37:38 crc kubenswrapper[4909]: I0202 12:37:38.801357 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91e43732-7cd1-4d4b-995e-21d0e8eb5b78-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 12:37:38 crc kubenswrapper[4909]: I0202 12:37:38.801413 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4226h\" (UniqueName: \"kubernetes.io/projected/91e43732-7cd1-4d4b-995e-21d0e8eb5b78-kube-api-access-4226h\") on node \"crc\" DevicePath \"\"" Feb 02 12:37:38 crc kubenswrapper[4909]: I0202 12:37:38.801428 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91e43732-7cd1-4d4b-995e-21d0e8eb5b78-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 12:37:39 crc kubenswrapper[4909]: I0202 12:37:39.083883 4909 generic.go:334] "Generic (PLEG): container finished" 
podID="91e43732-7cd1-4d4b-995e-21d0e8eb5b78" containerID="b9f04f53d9791569472f5efccc0a97feaf46e3c08b98b11fa5b94269265af2b2" exitCode=0 Feb 02 12:37:39 crc kubenswrapper[4909]: I0202 12:37:39.083931 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nhnr8" event={"ID":"91e43732-7cd1-4d4b-995e-21d0e8eb5b78","Type":"ContainerDied","Data":"b9f04f53d9791569472f5efccc0a97feaf46e3c08b98b11fa5b94269265af2b2"} Feb 02 12:37:39 crc kubenswrapper[4909]: I0202 12:37:39.083958 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nhnr8" Feb 02 12:37:39 crc kubenswrapper[4909]: I0202 12:37:39.083985 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nhnr8" event={"ID":"91e43732-7cd1-4d4b-995e-21d0e8eb5b78","Type":"ContainerDied","Data":"2799103f8f823d8a47a35f7be6fcb59659aabf1ee55c0d10a21a96acc185dab9"} Feb 02 12:37:39 crc kubenswrapper[4909]: I0202 12:37:39.084011 4909 scope.go:117] "RemoveContainer" containerID="b9f04f53d9791569472f5efccc0a97feaf46e3c08b98b11fa5b94269265af2b2" Feb 02 12:37:39 crc kubenswrapper[4909]: I0202 12:37:39.120797 4909 scope.go:117] "RemoveContainer" containerID="7a606f9f109dbae5bc7b154e5758c973155bf28fe3d9f37e9d088f83c872884c" Feb 02 12:37:39 crc kubenswrapper[4909]: I0202 12:37:39.123733 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nhnr8"] Feb 02 12:37:39 crc kubenswrapper[4909]: I0202 12:37:39.136392 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nhnr8"] Feb 02 12:37:39 crc kubenswrapper[4909]: I0202 12:37:39.139939 4909 scope.go:117] "RemoveContainer" containerID="2d9f7504bfb666d111d0b5fdefb328dc093956b2f01563d3b7a7ee8eca169603" Feb 02 12:37:39 crc kubenswrapper[4909]: I0202 12:37:39.195568 4909 scope.go:117] "RemoveContainer" 
containerID="b9f04f53d9791569472f5efccc0a97feaf46e3c08b98b11fa5b94269265af2b2" Feb 02 12:37:39 crc kubenswrapper[4909]: E0202 12:37:39.196695 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9f04f53d9791569472f5efccc0a97feaf46e3c08b98b11fa5b94269265af2b2\": container with ID starting with b9f04f53d9791569472f5efccc0a97feaf46e3c08b98b11fa5b94269265af2b2 not found: ID does not exist" containerID="b9f04f53d9791569472f5efccc0a97feaf46e3c08b98b11fa5b94269265af2b2" Feb 02 12:37:39 crc kubenswrapper[4909]: I0202 12:37:39.196737 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9f04f53d9791569472f5efccc0a97feaf46e3c08b98b11fa5b94269265af2b2"} err="failed to get container status \"b9f04f53d9791569472f5efccc0a97feaf46e3c08b98b11fa5b94269265af2b2\": rpc error: code = NotFound desc = could not find container \"b9f04f53d9791569472f5efccc0a97feaf46e3c08b98b11fa5b94269265af2b2\": container with ID starting with b9f04f53d9791569472f5efccc0a97feaf46e3c08b98b11fa5b94269265af2b2 not found: ID does not exist" Feb 02 12:37:39 crc kubenswrapper[4909]: I0202 12:37:39.196763 4909 scope.go:117] "RemoveContainer" containerID="7a606f9f109dbae5bc7b154e5758c973155bf28fe3d9f37e9d088f83c872884c" Feb 02 12:37:39 crc kubenswrapper[4909]: E0202 12:37:39.197092 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a606f9f109dbae5bc7b154e5758c973155bf28fe3d9f37e9d088f83c872884c\": container with ID starting with 7a606f9f109dbae5bc7b154e5758c973155bf28fe3d9f37e9d088f83c872884c not found: ID does not exist" containerID="7a606f9f109dbae5bc7b154e5758c973155bf28fe3d9f37e9d088f83c872884c" Feb 02 12:37:39 crc kubenswrapper[4909]: I0202 12:37:39.197147 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7a606f9f109dbae5bc7b154e5758c973155bf28fe3d9f37e9d088f83c872884c"} err="failed to get container status \"7a606f9f109dbae5bc7b154e5758c973155bf28fe3d9f37e9d088f83c872884c\": rpc error: code = NotFound desc = could not find container \"7a606f9f109dbae5bc7b154e5758c973155bf28fe3d9f37e9d088f83c872884c\": container with ID starting with 7a606f9f109dbae5bc7b154e5758c973155bf28fe3d9f37e9d088f83c872884c not found: ID does not exist" Feb 02 12:37:39 crc kubenswrapper[4909]: I0202 12:37:39.197190 4909 scope.go:117] "RemoveContainer" containerID="2d9f7504bfb666d111d0b5fdefb328dc093956b2f01563d3b7a7ee8eca169603" Feb 02 12:37:39 crc kubenswrapper[4909]: E0202 12:37:39.197714 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d9f7504bfb666d111d0b5fdefb328dc093956b2f01563d3b7a7ee8eca169603\": container with ID starting with 2d9f7504bfb666d111d0b5fdefb328dc093956b2f01563d3b7a7ee8eca169603 not found: ID does not exist" containerID="2d9f7504bfb666d111d0b5fdefb328dc093956b2f01563d3b7a7ee8eca169603" Feb 02 12:37:39 crc kubenswrapper[4909]: I0202 12:37:39.197747 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d9f7504bfb666d111d0b5fdefb328dc093956b2f01563d3b7a7ee8eca169603"} err="failed to get container status \"2d9f7504bfb666d111d0b5fdefb328dc093956b2f01563d3b7a7ee8eca169603\": rpc error: code = NotFound desc = could not find container \"2d9f7504bfb666d111d0b5fdefb328dc093956b2f01563d3b7a7ee8eca169603\": container with ID starting with 2d9f7504bfb666d111d0b5fdefb328dc093956b2f01563d3b7a7ee8eca169603 not found: ID does not exist" Feb 02 12:37:41 crc kubenswrapper[4909]: I0202 12:37:41.027891 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91e43732-7cd1-4d4b-995e-21d0e8eb5b78" path="/var/lib/kubelet/pods/91e43732-7cd1-4d4b-995e-21d0e8eb5b78/volumes" Feb 02 12:37:57 crc kubenswrapper[4909]: I0202 
12:37:57.269845 4909 generic.go:334] "Generic (PLEG): container finished" podID="7057aa50-a591-411f-8c66-1abd54d955b6" containerID="714b454326e060d2b2ed821459fa5fdee9ff3791e2a043a376de6e4bad7d4b1a" exitCode=0 Feb 02 12:37:57 crc kubenswrapper[4909]: I0202 12:37:57.269963 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-2cwfg" event={"ID":"7057aa50-a591-411f-8c66-1abd54d955b6","Type":"ContainerDied","Data":"714b454326e060d2b2ed821459fa5fdee9ff3791e2a043a376de6e4bad7d4b1a"} Feb 02 12:37:58 crc kubenswrapper[4909]: I0202 12:37:58.728228 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-2cwfg" Feb 02 12:37:58 crc kubenswrapper[4909]: I0202 12:37:58.833190 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh64d\" (UniqueName: \"kubernetes.io/projected/7057aa50-a591-411f-8c66-1abd54d955b6-kube-api-access-qh64d\") pod \"7057aa50-a591-411f-8c66-1abd54d955b6\" (UID: \"7057aa50-a591-411f-8c66-1abd54d955b6\") " Feb 02 12:37:58 crc kubenswrapper[4909]: I0202 12:37:58.833283 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7057aa50-a591-411f-8c66-1abd54d955b6-ssh-key-openstack-cell1\") pod \"7057aa50-a591-411f-8c66-1abd54d955b6\" (UID: \"7057aa50-a591-411f-8c66-1abd54d955b6\") " Feb 02 12:37:58 crc kubenswrapper[4909]: I0202 12:37:58.833320 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7057aa50-a591-411f-8c66-1abd54d955b6-inventory\") pod \"7057aa50-a591-411f-8c66-1abd54d955b6\" (UID: \"7057aa50-a591-411f-8c66-1abd54d955b6\") " Feb 02 12:37:58 crc kubenswrapper[4909]: I0202 12:37:58.839718 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7057aa50-a591-411f-8c66-1abd54d955b6-kube-api-access-qh64d" (OuterVolumeSpecName: "kube-api-access-qh64d") pod "7057aa50-a591-411f-8c66-1abd54d955b6" (UID: "7057aa50-a591-411f-8c66-1abd54d955b6"). InnerVolumeSpecName "kube-api-access-qh64d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:37:58 crc kubenswrapper[4909]: I0202 12:37:58.865049 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7057aa50-a591-411f-8c66-1abd54d955b6-inventory" (OuterVolumeSpecName: "inventory") pod "7057aa50-a591-411f-8c66-1abd54d955b6" (UID: "7057aa50-a591-411f-8c66-1abd54d955b6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:37:58 crc kubenswrapper[4909]: I0202 12:37:58.866848 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7057aa50-a591-411f-8c66-1abd54d955b6-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "7057aa50-a591-411f-8c66-1abd54d955b6" (UID: "7057aa50-a591-411f-8c66-1abd54d955b6"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:37:58 crc kubenswrapper[4909]: I0202 12:37:58.936095 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh64d\" (UniqueName: \"kubernetes.io/projected/7057aa50-a591-411f-8c66-1abd54d955b6-kube-api-access-qh64d\") on node \"crc\" DevicePath \"\"" Feb 02 12:37:58 crc kubenswrapper[4909]: I0202 12:37:58.936125 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/7057aa50-a591-411f-8c66-1abd54d955b6-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 02 12:37:58 crc kubenswrapper[4909]: I0202 12:37:58.936136 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7057aa50-a591-411f-8c66-1abd54d955b6-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 12:37:59 crc kubenswrapper[4909]: I0202 12:37:59.290480 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-2cwfg" event={"ID":"7057aa50-a591-411f-8c66-1abd54d955b6","Type":"ContainerDied","Data":"fb2293b14000e2b914de3efe522f5365549638787cca9f951e7dda5ffe1131d9"} Feb 02 12:37:59 crc kubenswrapper[4909]: I0202 12:37:59.290518 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb2293b14000e2b914de3efe522f5365549638787cca9f951e7dda5ffe1131d9" Feb 02 12:37:59 crc kubenswrapper[4909]: I0202 12:37:59.290519 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-2cwfg" Feb 02 12:37:59 crc kubenswrapper[4909]: I0202 12:37:59.447598 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-2mxpm"] Feb 02 12:37:59 crc kubenswrapper[4909]: E0202 12:37:59.448908 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91e43732-7cd1-4d4b-995e-21d0e8eb5b78" containerName="extract-content" Feb 02 12:37:59 crc kubenswrapper[4909]: I0202 12:37:59.449030 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e43732-7cd1-4d4b-995e-21d0e8eb5b78" containerName="extract-content" Feb 02 12:37:59 crc kubenswrapper[4909]: E0202 12:37:59.449143 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91e43732-7cd1-4d4b-995e-21d0e8eb5b78" containerName="registry-server" Feb 02 12:37:59 crc kubenswrapper[4909]: I0202 12:37:59.449208 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e43732-7cd1-4d4b-995e-21d0e8eb5b78" containerName="registry-server" Feb 02 12:37:59 crc kubenswrapper[4909]: E0202 12:37:59.449294 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91e43732-7cd1-4d4b-995e-21d0e8eb5b78" containerName="extract-utilities" Feb 02 12:37:59 crc kubenswrapper[4909]: I0202 12:37:59.449371 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e43732-7cd1-4d4b-995e-21d0e8eb5b78" containerName="extract-utilities" Feb 02 12:37:59 crc kubenswrapper[4909]: E0202 12:37:59.449454 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7057aa50-a591-411f-8c66-1abd54d955b6" containerName="install-os-openstack-openstack-cell1" Feb 02 12:37:59 crc kubenswrapper[4909]: I0202 12:37:59.449531 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="7057aa50-a591-411f-8c66-1abd54d955b6" containerName="install-os-openstack-openstack-cell1" Feb 02 12:37:59 crc kubenswrapper[4909]: I0202 12:37:59.449928 4909 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="7057aa50-a591-411f-8c66-1abd54d955b6" containerName="install-os-openstack-openstack-cell1" Feb 02 12:37:59 crc kubenswrapper[4909]: I0202 12:37:59.450029 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="91e43732-7cd1-4d4b-995e-21d0e8eb5b78" containerName="registry-server" Feb 02 12:37:59 crc kubenswrapper[4909]: I0202 12:37:59.451150 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-2mxpm" Feb 02 12:37:59 crc kubenswrapper[4909]: I0202 12:37:59.453750 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 02 12:37:59 crc kubenswrapper[4909]: I0202 12:37:59.454205 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 12:37:59 crc kubenswrapper[4909]: I0202 12:37:59.455722 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 02 12:37:59 crc kubenswrapper[4909]: I0202 12:37:59.456244 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-lfs7z" Feb 02 12:37:59 crc kubenswrapper[4909]: I0202 12:37:59.463389 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-2mxpm"] Feb 02 12:37:59 crc kubenswrapper[4909]: I0202 12:37:59.553448 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msvb6\" (UniqueName: \"kubernetes.io/projected/d090f64e-e6da-45b6-9f4a-3ee8106a132c-kube-api-access-msvb6\") pod \"configure-os-openstack-openstack-cell1-2mxpm\" (UID: \"d090f64e-e6da-45b6-9f4a-3ee8106a132c\") " pod="openstack/configure-os-openstack-openstack-cell1-2mxpm" Feb 02 12:37:59 crc kubenswrapper[4909]: I0202 12:37:59.553838 4909 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d090f64e-e6da-45b6-9f4a-3ee8106a132c-inventory\") pod \"configure-os-openstack-openstack-cell1-2mxpm\" (UID: \"d090f64e-e6da-45b6-9f4a-3ee8106a132c\") " pod="openstack/configure-os-openstack-openstack-cell1-2mxpm" Feb 02 12:37:59 crc kubenswrapper[4909]: I0202 12:37:59.553892 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d090f64e-e6da-45b6-9f4a-3ee8106a132c-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-2mxpm\" (UID: \"d090f64e-e6da-45b6-9f4a-3ee8106a132c\") " pod="openstack/configure-os-openstack-openstack-cell1-2mxpm" Feb 02 12:37:59 crc kubenswrapper[4909]: I0202 12:37:59.655485 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msvb6\" (UniqueName: \"kubernetes.io/projected/d090f64e-e6da-45b6-9f4a-3ee8106a132c-kube-api-access-msvb6\") pod \"configure-os-openstack-openstack-cell1-2mxpm\" (UID: \"d090f64e-e6da-45b6-9f4a-3ee8106a132c\") " pod="openstack/configure-os-openstack-openstack-cell1-2mxpm" Feb 02 12:37:59 crc kubenswrapper[4909]: I0202 12:37:59.655566 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d090f64e-e6da-45b6-9f4a-3ee8106a132c-inventory\") pod \"configure-os-openstack-openstack-cell1-2mxpm\" (UID: \"d090f64e-e6da-45b6-9f4a-3ee8106a132c\") " pod="openstack/configure-os-openstack-openstack-cell1-2mxpm" Feb 02 12:37:59 crc kubenswrapper[4909]: I0202 12:37:59.655611 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d090f64e-e6da-45b6-9f4a-3ee8106a132c-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-2mxpm\" (UID: 
\"d090f64e-e6da-45b6-9f4a-3ee8106a132c\") " pod="openstack/configure-os-openstack-openstack-cell1-2mxpm" Feb 02 12:37:59 crc kubenswrapper[4909]: I0202 12:37:59.659138 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d090f64e-e6da-45b6-9f4a-3ee8106a132c-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-2mxpm\" (UID: \"d090f64e-e6da-45b6-9f4a-3ee8106a132c\") " pod="openstack/configure-os-openstack-openstack-cell1-2mxpm" Feb 02 12:37:59 crc kubenswrapper[4909]: I0202 12:37:59.659415 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d090f64e-e6da-45b6-9f4a-3ee8106a132c-inventory\") pod \"configure-os-openstack-openstack-cell1-2mxpm\" (UID: \"d090f64e-e6da-45b6-9f4a-3ee8106a132c\") " pod="openstack/configure-os-openstack-openstack-cell1-2mxpm" Feb 02 12:37:59 crc kubenswrapper[4909]: I0202 12:37:59.676537 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msvb6\" (UniqueName: \"kubernetes.io/projected/d090f64e-e6da-45b6-9f4a-3ee8106a132c-kube-api-access-msvb6\") pod \"configure-os-openstack-openstack-cell1-2mxpm\" (UID: \"d090f64e-e6da-45b6-9f4a-3ee8106a132c\") " pod="openstack/configure-os-openstack-openstack-cell1-2mxpm" Feb 02 12:37:59 crc kubenswrapper[4909]: I0202 12:37:59.768150 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-2mxpm" Feb 02 12:38:00 crc kubenswrapper[4909]: I0202 12:38:00.413353 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-2mxpm"] Feb 02 12:38:01 crc kubenswrapper[4909]: I0202 12:38:01.314126 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-2mxpm" event={"ID":"d090f64e-e6da-45b6-9f4a-3ee8106a132c","Type":"ContainerStarted","Data":"66e2c08f1f7720a6abd70972468a24b71996d9cefd51aa5bf655a1eebb626e82"} Feb 02 12:38:01 crc kubenswrapper[4909]: I0202 12:38:01.315597 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-2mxpm" event={"ID":"d090f64e-e6da-45b6-9f4a-3ee8106a132c","Type":"ContainerStarted","Data":"0ab28558aceca4c8bbba10c441af69532445b5c74eae1177df5a7f8282b1c9b1"} Feb 02 12:38:01 crc kubenswrapper[4909]: I0202 12:38:01.341013 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-2mxpm" podStartSLOduration=1.819541769 podStartE2EDuration="2.340992015s" podCreationTimestamp="2026-02-02 12:37:59 +0000 UTC" firstStartedPulling="2026-02-02 12:38:00.403475235 +0000 UTC m=+7606.149575970" lastFinishedPulling="2026-02-02 12:38:00.924925481 +0000 UTC m=+7606.671026216" observedRunningTime="2026-02-02 12:38:01.334340716 +0000 UTC m=+7607.080441451" watchObservedRunningTime="2026-02-02 12:38:01.340992015 +0000 UTC m=+7607.087092750" Feb 02 12:38:46 crc kubenswrapper[4909]: I0202 12:38:46.706370 4909 generic.go:334] "Generic (PLEG): container finished" podID="d090f64e-e6da-45b6-9f4a-3ee8106a132c" containerID="66e2c08f1f7720a6abd70972468a24b71996d9cefd51aa5bf655a1eebb626e82" exitCode=0 Feb 02 12:38:46 crc kubenswrapper[4909]: I0202 12:38:46.707052 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-os-openstack-openstack-cell1-2mxpm" event={"ID":"d090f64e-e6da-45b6-9f4a-3ee8106a132c","Type":"ContainerDied","Data":"66e2c08f1f7720a6abd70972468a24b71996d9cefd51aa5bf655a1eebb626e82"} Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.172966 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-2mxpm" Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.337020 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msvb6\" (UniqueName: \"kubernetes.io/projected/d090f64e-e6da-45b6-9f4a-3ee8106a132c-kube-api-access-msvb6\") pod \"d090f64e-e6da-45b6-9f4a-3ee8106a132c\" (UID: \"d090f64e-e6da-45b6-9f4a-3ee8106a132c\") " Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.337367 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d090f64e-e6da-45b6-9f4a-3ee8106a132c-inventory\") pod \"d090f64e-e6da-45b6-9f4a-3ee8106a132c\" (UID: \"d090f64e-e6da-45b6-9f4a-3ee8106a132c\") " Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.337421 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d090f64e-e6da-45b6-9f4a-3ee8106a132c-ssh-key-openstack-cell1\") pod \"d090f64e-e6da-45b6-9f4a-3ee8106a132c\" (UID: \"d090f64e-e6da-45b6-9f4a-3ee8106a132c\") " Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.343399 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d090f64e-e6da-45b6-9f4a-3ee8106a132c-kube-api-access-msvb6" (OuterVolumeSpecName: "kube-api-access-msvb6") pod "d090f64e-e6da-45b6-9f4a-3ee8106a132c" (UID: "d090f64e-e6da-45b6-9f4a-3ee8106a132c"). InnerVolumeSpecName "kube-api-access-msvb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.372256 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d090f64e-e6da-45b6-9f4a-3ee8106a132c-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "d090f64e-e6da-45b6-9f4a-3ee8106a132c" (UID: "d090f64e-e6da-45b6-9f4a-3ee8106a132c"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.373534 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d090f64e-e6da-45b6-9f4a-3ee8106a132c-inventory" (OuterVolumeSpecName: "inventory") pod "d090f64e-e6da-45b6-9f4a-3ee8106a132c" (UID: "d090f64e-e6da-45b6-9f4a-3ee8106a132c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.441201 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d090f64e-e6da-45b6-9f4a-3ee8106a132c-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.441274 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d090f64e-e6da-45b6-9f4a-3ee8106a132c-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.441294 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msvb6\" (UniqueName: \"kubernetes.io/projected/d090f64e-e6da-45b6-9f4a-3ee8106a132c-kube-api-access-msvb6\") on node \"crc\" DevicePath \"\"" Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.723740 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-2mxpm" 
event={"ID":"d090f64e-e6da-45b6-9f4a-3ee8106a132c","Type":"ContainerDied","Data":"0ab28558aceca4c8bbba10c441af69532445b5c74eae1177df5a7f8282b1c9b1"} Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.723795 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ab28558aceca4c8bbba10c441af69532445b5c74eae1177df5a7f8282b1c9b1" Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.723846 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-2mxpm" Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.825867 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-mlt78"] Feb 02 12:38:48 crc kubenswrapper[4909]: E0202 12:38:48.828889 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d090f64e-e6da-45b6-9f4a-3ee8106a132c" containerName="configure-os-openstack-openstack-cell1" Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.828923 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d090f64e-e6da-45b6-9f4a-3ee8106a132c" containerName="configure-os-openstack-openstack-cell1" Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.829152 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d090f64e-e6da-45b6-9f4a-3ee8106a132c" containerName="configure-os-openstack-openstack-cell1" Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.830188 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-mlt78" Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.832965 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.833225 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.833374 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-lfs7z" Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.837124 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.847619 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-mlt78"] Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.851680 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a6987177-0c44-4124-bb94-c6883fa3ed07-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-mlt78\" (UID: \"a6987177-0c44-4124-bb94-c6883fa3ed07\") " pod="openstack/ssh-known-hosts-openstack-mlt78" Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.851731 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtmd5\" (UniqueName: \"kubernetes.io/projected/a6987177-0c44-4124-bb94-c6883fa3ed07-kube-api-access-qtmd5\") pod \"ssh-known-hosts-openstack-mlt78\" (UID: \"a6987177-0c44-4124-bb94-c6883fa3ed07\") " pod="openstack/ssh-known-hosts-openstack-mlt78" Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.852963 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: 
\"kubernetes.io/secret/a6987177-0c44-4124-bb94-c6883fa3ed07-inventory-0\") pod \"ssh-known-hosts-openstack-mlt78\" (UID: \"a6987177-0c44-4124-bb94-c6883fa3ed07\") " pod="openstack/ssh-known-hosts-openstack-mlt78" Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.953771 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a6987177-0c44-4124-bb94-c6883fa3ed07-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-mlt78\" (UID: \"a6987177-0c44-4124-bb94-c6883fa3ed07\") " pod="openstack/ssh-known-hosts-openstack-mlt78" Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.953829 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtmd5\" (UniqueName: \"kubernetes.io/projected/a6987177-0c44-4124-bb94-c6883fa3ed07-kube-api-access-qtmd5\") pod \"ssh-known-hosts-openstack-mlt78\" (UID: \"a6987177-0c44-4124-bb94-c6883fa3ed07\") " pod="openstack/ssh-known-hosts-openstack-mlt78" Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.953936 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a6987177-0c44-4124-bb94-c6883fa3ed07-inventory-0\") pod \"ssh-known-hosts-openstack-mlt78\" (UID: \"a6987177-0c44-4124-bb94-c6883fa3ed07\") " pod="openstack/ssh-known-hosts-openstack-mlt78" Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.962616 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a6987177-0c44-4124-bb94-c6883fa3ed07-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-mlt78\" (UID: \"a6987177-0c44-4124-bb94-c6883fa3ed07\") " pod="openstack/ssh-known-hosts-openstack-mlt78" Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.964596 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: 
\"kubernetes.io/secret/a6987177-0c44-4124-bb94-c6883fa3ed07-inventory-0\") pod \"ssh-known-hosts-openstack-mlt78\" (UID: \"a6987177-0c44-4124-bb94-c6883fa3ed07\") " pod="openstack/ssh-known-hosts-openstack-mlt78" Feb 02 12:38:48 crc kubenswrapper[4909]: I0202 12:38:48.970912 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtmd5\" (UniqueName: \"kubernetes.io/projected/a6987177-0c44-4124-bb94-c6883fa3ed07-kube-api-access-qtmd5\") pod \"ssh-known-hosts-openstack-mlt78\" (UID: \"a6987177-0c44-4124-bb94-c6883fa3ed07\") " pod="openstack/ssh-known-hosts-openstack-mlt78" Feb 02 12:38:49 crc kubenswrapper[4909]: I0202 12:38:49.151887 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-mlt78" Feb 02 12:38:49 crc kubenswrapper[4909]: I0202 12:38:49.727942 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-mlt78"] Feb 02 12:38:50 crc kubenswrapper[4909]: I0202 12:38:50.750785 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-mlt78" event={"ID":"a6987177-0c44-4124-bb94-c6883fa3ed07","Type":"ContainerStarted","Data":"f7b6cb2410c881728dd5f984cdf40a44bbf6b691fee9f148a422836ebb312196"} Feb 02 12:38:50 crc kubenswrapper[4909]: I0202 12:38:50.751250 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-mlt78" event={"ID":"a6987177-0c44-4124-bb94-c6883fa3ed07","Type":"ContainerStarted","Data":"15c907611a31b2abb97a540fced264e100b4bf24d54c37a4e94dbb4f724aa09b"} Feb 02 12:38:50 crc kubenswrapper[4909]: I0202 12:38:50.773947 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-mlt78" podStartSLOduration=2.372381941 podStartE2EDuration="2.773932092s" podCreationTimestamp="2026-02-02 12:38:48 +0000 UTC" firstStartedPulling="2026-02-02 12:38:49.730198007 +0000 UTC m=+7655.476298732" 
lastFinishedPulling="2026-02-02 12:38:50.131748148 +0000 UTC m=+7655.877848883" observedRunningTime="2026-02-02 12:38:50.769441105 +0000 UTC m=+7656.515541840" watchObservedRunningTime="2026-02-02 12:38:50.773932092 +0000 UTC m=+7656.520032817" Feb 02 12:38:59 crc kubenswrapper[4909]: I0202 12:38:59.832402 4909 generic.go:334] "Generic (PLEG): container finished" podID="a6987177-0c44-4124-bb94-c6883fa3ed07" containerID="f7b6cb2410c881728dd5f984cdf40a44bbf6b691fee9f148a422836ebb312196" exitCode=0 Feb 02 12:38:59 crc kubenswrapper[4909]: I0202 12:38:59.832482 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-mlt78" event={"ID":"a6987177-0c44-4124-bb94-c6883fa3ed07","Type":"ContainerDied","Data":"f7b6cb2410c881728dd5f984cdf40a44bbf6b691fee9f148a422836ebb312196"} Feb 02 12:39:01 crc kubenswrapper[4909]: I0202 12:39:01.295300 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-mlt78" Feb 02 12:39:01 crc kubenswrapper[4909]: I0202 12:39:01.353308 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a6987177-0c44-4124-bb94-c6883fa3ed07-ssh-key-openstack-cell1\") pod \"a6987177-0c44-4124-bb94-c6883fa3ed07\" (UID: \"a6987177-0c44-4124-bb94-c6883fa3ed07\") " Feb 02 12:39:01 crc kubenswrapper[4909]: I0202 12:39:01.353489 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtmd5\" (UniqueName: \"kubernetes.io/projected/a6987177-0c44-4124-bb94-c6883fa3ed07-kube-api-access-qtmd5\") pod \"a6987177-0c44-4124-bb94-c6883fa3ed07\" (UID: \"a6987177-0c44-4124-bb94-c6883fa3ed07\") " Feb 02 12:39:01 crc kubenswrapper[4909]: I0202 12:39:01.353530 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a6987177-0c44-4124-bb94-c6883fa3ed07-inventory-0\") pod 
\"a6987177-0c44-4124-bb94-c6883fa3ed07\" (UID: \"a6987177-0c44-4124-bb94-c6883fa3ed07\") " Feb 02 12:39:01 crc kubenswrapper[4909]: I0202 12:39:01.359456 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6987177-0c44-4124-bb94-c6883fa3ed07-kube-api-access-qtmd5" (OuterVolumeSpecName: "kube-api-access-qtmd5") pod "a6987177-0c44-4124-bb94-c6883fa3ed07" (UID: "a6987177-0c44-4124-bb94-c6883fa3ed07"). InnerVolumeSpecName "kube-api-access-qtmd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:39:01 crc kubenswrapper[4909]: I0202 12:39:01.382095 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6987177-0c44-4124-bb94-c6883fa3ed07-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "a6987177-0c44-4124-bb94-c6883fa3ed07" (UID: "a6987177-0c44-4124-bb94-c6883fa3ed07"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:39:01 crc kubenswrapper[4909]: I0202 12:39:01.393186 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6987177-0c44-4124-bb94-c6883fa3ed07-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "a6987177-0c44-4124-bb94-c6883fa3ed07" (UID: "a6987177-0c44-4124-bb94-c6883fa3ed07"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:39:01 crc kubenswrapper[4909]: I0202 12:39:01.456793 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a6987177-0c44-4124-bb94-c6883fa3ed07-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 02 12:39:01 crc kubenswrapper[4909]: I0202 12:39:01.456848 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtmd5\" (UniqueName: \"kubernetes.io/projected/a6987177-0c44-4124-bb94-c6883fa3ed07-kube-api-access-qtmd5\") on node \"crc\" DevicePath \"\"" Feb 02 12:39:01 crc kubenswrapper[4909]: I0202 12:39:01.456862 4909 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/a6987177-0c44-4124-bb94-c6883fa3ed07-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 02 12:39:01 crc kubenswrapper[4909]: I0202 12:39:01.851963 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-mlt78" event={"ID":"a6987177-0c44-4124-bb94-c6883fa3ed07","Type":"ContainerDied","Data":"15c907611a31b2abb97a540fced264e100b4bf24d54c37a4e94dbb4f724aa09b"} Feb 02 12:39:01 crc kubenswrapper[4909]: I0202 12:39:01.852359 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15c907611a31b2abb97a540fced264e100b4bf24d54c37a4e94dbb4f724aa09b" Feb 02 12:39:01 crc kubenswrapper[4909]: I0202 12:39:01.852036 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-mlt78" Feb 02 12:39:01 crc kubenswrapper[4909]: I0202 12:39:01.923223 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-dxxzz"] Feb 02 12:39:01 crc kubenswrapper[4909]: E0202 12:39:01.923785 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6987177-0c44-4124-bb94-c6883fa3ed07" containerName="ssh-known-hosts-openstack" Feb 02 12:39:01 crc kubenswrapper[4909]: I0202 12:39:01.923824 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6987177-0c44-4124-bb94-c6883fa3ed07" containerName="ssh-known-hosts-openstack" Feb 02 12:39:01 crc kubenswrapper[4909]: I0202 12:39:01.924016 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6987177-0c44-4124-bb94-c6883fa3ed07" containerName="ssh-known-hosts-openstack" Feb 02 12:39:01 crc kubenswrapper[4909]: I0202 12:39:01.924779 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-dxxzz" Feb 02 12:39:01 crc kubenswrapper[4909]: I0202 12:39:01.927832 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 02 12:39:01 crc kubenswrapper[4909]: I0202 12:39:01.928047 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 02 12:39:01 crc kubenswrapper[4909]: I0202 12:39:01.929106 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 12:39:01 crc kubenswrapper[4909]: I0202 12:39:01.929847 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-lfs7z" Feb 02 12:39:01 crc kubenswrapper[4909]: I0202 12:39:01.933000 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-dxxzz"] Feb 02 12:39:01 crc kubenswrapper[4909]: I0202 
12:39:01.967522 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsqdp\" (UniqueName: \"kubernetes.io/projected/f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c-kube-api-access-vsqdp\") pod \"run-os-openstack-openstack-cell1-dxxzz\" (UID: \"f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c\") " pod="openstack/run-os-openstack-openstack-cell1-dxxzz" Feb 02 12:39:01 crc kubenswrapper[4909]: I0202 12:39:01.967600 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-dxxzz\" (UID: \"f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c\") " pod="openstack/run-os-openstack-openstack-cell1-dxxzz" Feb 02 12:39:01 crc kubenswrapper[4909]: I0202 12:39:01.967753 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c-inventory\") pod \"run-os-openstack-openstack-cell1-dxxzz\" (UID: \"f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c\") " pod="openstack/run-os-openstack-openstack-cell1-dxxzz" Feb 02 12:39:02 crc kubenswrapper[4909]: I0202 12:39:02.069637 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c-inventory\") pod \"run-os-openstack-openstack-cell1-dxxzz\" (UID: \"f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c\") " pod="openstack/run-os-openstack-openstack-cell1-dxxzz" Feb 02 12:39:02 crc kubenswrapper[4909]: I0202 12:39:02.069849 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsqdp\" (UniqueName: \"kubernetes.io/projected/f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c-kube-api-access-vsqdp\") pod \"run-os-openstack-openstack-cell1-dxxzz\" (UID: 
\"f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c\") " pod="openstack/run-os-openstack-openstack-cell1-dxxzz" Feb 02 12:39:02 crc kubenswrapper[4909]: I0202 12:39:02.069935 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-dxxzz\" (UID: \"f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c\") " pod="openstack/run-os-openstack-openstack-cell1-dxxzz" Feb 02 12:39:02 crc kubenswrapper[4909]: I0202 12:39:02.075142 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-dxxzz\" (UID: \"f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c\") " pod="openstack/run-os-openstack-openstack-cell1-dxxzz" Feb 02 12:39:02 crc kubenswrapper[4909]: I0202 12:39:02.075366 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c-inventory\") pod \"run-os-openstack-openstack-cell1-dxxzz\" (UID: \"f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c\") " pod="openstack/run-os-openstack-openstack-cell1-dxxzz" Feb 02 12:39:02 crc kubenswrapper[4909]: I0202 12:39:02.086522 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsqdp\" (UniqueName: \"kubernetes.io/projected/f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c-kube-api-access-vsqdp\") pod \"run-os-openstack-openstack-cell1-dxxzz\" (UID: \"f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c\") " pod="openstack/run-os-openstack-openstack-cell1-dxxzz" Feb 02 12:39:02 crc kubenswrapper[4909]: I0202 12:39:02.239709 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-dxxzz" Feb 02 12:39:02 crc kubenswrapper[4909]: I0202 12:39:02.784510 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-dxxzz"] Feb 02 12:39:02 crc kubenswrapper[4909]: I0202 12:39:02.860705 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-dxxzz" event={"ID":"f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c","Type":"ContainerStarted","Data":"8d893200da0714986f9b587a58fd0ea191136db3ae4e754f3c5d91a8fb7efc0d"} Feb 02 12:39:03 crc kubenswrapper[4909]: I0202 12:39:03.869998 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-dxxzz" event={"ID":"f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c","Type":"ContainerStarted","Data":"296dbb94c410fedcbfc37528d009c723cfcb5614d3f3eb9cd155cc334b7ab6d2"} Feb 02 12:39:03 crc kubenswrapper[4909]: I0202 12:39:03.895951 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-dxxzz" podStartSLOduration=2.277335442 podStartE2EDuration="2.895933746s" podCreationTimestamp="2026-02-02 12:39:01 +0000 UTC" firstStartedPulling="2026-02-02 12:39:02.788221454 +0000 UTC m=+7668.534322189" lastFinishedPulling="2026-02-02 12:39:03.406819758 +0000 UTC m=+7669.152920493" observedRunningTime="2026-02-02 12:39:03.889165924 +0000 UTC m=+7669.635266659" watchObservedRunningTime="2026-02-02 12:39:03.895933746 +0000 UTC m=+7669.642034481" Feb 02 12:39:11 crc kubenswrapper[4909]: I0202 12:39:11.937855 4909 generic.go:334] "Generic (PLEG): container finished" podID="f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c" containerID="296dbb94c410fedcbfc37528d009c723cfcb5614d3f3eb9cd155cc334b7ab6d2" exitCode=0 Feb 02 12:39:11 crc kubenswrapper[4909]: I0202 12:39:11.937933 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-dxxzz" 
event={"ID":"f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c","Type":"ContainerDied","Data":"296dbb94c410fedcbfc37528d009c723cfcb5614d3f3eb9cd155cc334b7ab6d2"} Feb 02 12:39:13 crc kubenswrapper[4909]: I0202 12:39:13.469159 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-dxxzz" Feb 02 12:39:13 crc kubenswrapper[4909]: I0202 12:39:13.623439 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsqdp\" (UniqueName: \"kubernetes.io/projected/f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c-kube-api-access-vsqdp\") pod \"f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c\" (UID: \"f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c\") " Feb 02 12:39:13 crc kubenswrapper[4909]: I0202 12:39:13.623648 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c-inventory\") pod \"f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c\" (UID: \"f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c\") " Feb 02 12:39:13 crc kubenswrapper[4909]: I0202 12:39:13.623759 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c-ssh-key-openstack-cell1\") pod \"f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c\" (UID: \"f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c\") " Feb 02 12:39:13 crc kubenswrapper[4909]: I0202 12:39:13.636000 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c-kube-api-access-vsqdp" (OuterVolumeSpecName: "kube-api-access-vsqdp") pod "f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c" (UID: "f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c"). InnerVolumeSpecName "kube-api-access-vsqdp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:39:13 crc kubenswrapper[4909]: I0202 12:39:13.658754 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c" (UID: "f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:39:13 crc kubenswrapper[4909]: I0202 12:39:13.661837 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c-inventory" (OuterVolumeSpecName: "inventory") pod "f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c" (UID: "f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:39:13 crc kubenswrapper[4909]: I0202 12:39:13.726562 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsqdp\" (UniqueName: \"kubernetes.io/projected/f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c-kube-api-access-vsqdp\") on node \"crc\" DevicePath \"\"" Feb 02 12:39:13 crc kubenswrapper[4909]: I0202 12:39:13.726608 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 12:39:13 crc kubenswrapper[4909]: I0202 12:39:13.726621 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 02 12:39:13 crc kubenswrapper[4909]: I0202 12:39:13.959511 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-dxxzz" 
event={"ID":"f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c","Type":"ContainerDied","Data":"8d893200da0714986f9b587a58fd0ea191136db3ae4e754f3c5d91a8fb7efc0d"} Feb 02 12:39:13 crc kubenswrapper[4909]: I0202 12:39:13.959551 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d893200da0714986f9b587a58fd0ea191136db3ae4e754f3c5d91a8fb7efc0d" Feb 02 12:39:13 crc kubenswrapper[4909]: I0202 12:39:13.959653 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-dxxzz" Feb 02 12:39:14 crc kubenswrapper[4909]: I0202 12:39:14.048994 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-vks94"] Feb 02 12:39:14 crc kubenswrapper[4909]: E0202 12:39:14.049452 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c" containerName="run-os-openstack-openstack-cell1" Feb 02 12:39:14 crc kubenswrapper[4909]: I0202 12:39:14.049469 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c" containerName="run-os-openstack-openstack-cell1" Feb 02 12:39:14 crc kubenswrapper[4909]: I0202 12:39:14.049654 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c" containerName="run-os-openstack-openstack-cell1" Feb 02 12:39:14 crc kubenswrapper[4909]: I0202 12:39:14.050443 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-vks94" Feb 02 12:39:14 crc kubenswrapper[4909]: I0202 12:39:14.064797 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 12:39:14 crc kubenswrapper[4909]: I0202 12:39:14.065022 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 02 12:39:14 crc kubenswrapper[4909]: I0202 12:39:14.065086 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-lfs7z" Feb 02 12:39:14 crc kubenswrapper[4909]: I0202 12:39:14.065726 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 02 12:39:14 crc kubenswrapper[4909]: I0202 12:39:14.068383 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-vks94"] Feb 02 12:39:14 crc kubenswrapper[4909]: I0202 12:39:14.134556 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/609b4bdb-129e-4a27-841e-a83e453dfd79-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-vks94\" (UID: \"609b4bdb-129e-4a27-841e-a83e453dfd79\") " pod="openstack/reboot-os-openstack-openstack-cell1-vks94" Feb 02 12:39:14 crc kubenswrapper[4909]: I0202 12:39:14.134680 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/609b4bdb-129e-4a27-841e-a83e453dfd79-inventory\") pod \"reboot-os-openstack-openstack-cell1-vks94\" (UID: \"609b4bdb-129e-4a27-841e-a83e453dfd79\") " pod="openstack/reboot-os-openstack-openstack-cell1-vks94" Feb 02 12:39:14 crc kubenswrapper[4909]: I0202 12:39:14.134709 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-nn6vl\" (UniqueName: \"kubernetes.io/projected/609b4bdb-129e-4a27-841e-a83e453dfd79-kube-api-access-nn6vl\") pod \"reboot-os-openstack-openstack-cell1-vks94\" (UID: \"609b4bdb-129e-4a27-841e-a83e453dfd79\") " pod="openstack/reboot-os-openstack-openstack-cell1-vks94" Feb 02 12:39:14 crc kubenswrapper[4909]: I0202 12:39:14.236193 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/609b4bdb-129e-4a27-841e-a83e453dfd79-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-vks94\" (UID: \"609b4bdb-129e-4a27-841e-a83e453dfd79\") " pod="openstack/reboot-os-openstack-openstack-cell1-vks94" Feb 02 12:39:14 crc kubenswrapper[4909]: I0202 12:39:14.236317 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/609b4bdb-129e-4a27-841e-a83e453dfd79-inventory\") pod \"reboot-os-openstack-openstack-cell1-vks94\" (UID: \"609b4bdb-129e-4a27-841e-a83e453dfd79\") " pod="openstack/reboot-os-openstack-openstack-cell1-vks94" Feb 02 12:39:14 crc kubenswrapper[4909]: I0202 12:39:14.236353 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn6vl\" (UniqueName: \"kubernetes.io/projected/609b4bdb-129e-4a27-841e-a83e453dfd79-kube-api-access-nn6vl\") pod \"reboot-os-openstack-openstack-cell1-vks94\" (UID: \"609b4bdb-129e-4a27-841e-a83e453dfd79\") " pod="openstack/reboot-os-openstack-openstack-cell1-vks94" Feb 02 12:39:14 crc kubenswrapper[4909]: I0202 12:39:14.241059 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/609b4bdb-129e-4a27-841e-a83e453dfd79-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-vks94\" (UID: \"609b4bdb-129e-4a27-841e-a83e453dfd79\") " pod="openstack/reboot-os-openstack-openstack-cell1-vks94" Feb 02 12:39:14 crc 
kubenswrapper[4909]: I0202 12:39:14.241315 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/609b4bdb-129e-4a27-841e-a83e453dfd79-inventory\") pod \"reboot-os-openstack-openstack-cell1-vks94\" (UID: \"609b4bdb-129e-4a27-841e-a83e453dfd79\") " pod="openstack/reboot-os-openstack-openstack-cell1-vks94" Feb 02 12:39:14 crc kubenswrapper[4909]: I0202 12:39:14.264768 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn6vl\" (UniqueName: \"kubernetes.io/projected/609b4bdb-129e-4a27-841e-a83e453dfd79-kube-api-access-nn6vl\") pod \"reboot-os-openstack-openstack-cell1-vks94\" (UID: \"609b4bdb-129e-4a27-841e-a83e453dfd79\") " pod="openstack/reboot-os-openstack-openstack-cell1-vks94" Feb 02 12:39:14 crc kubenswrapper[4909]: I0202 12:39:14.366651 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-vks94" Feb 02 12:39:14 crc kubenswrapper[4909]: I0202 12:39:14.991931 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-vks94"] Feb 02 12:39:15 crc kubenswrapper[4909]: I0202 12:39:15.461622 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 12:39:15 crc kubenswrapper[4909]: I0202 12:39:15.983025 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-vks94" event={"ID":"609b4bdb-129e-4a27-841e-a83e453dfd79","Type":"ContainerStarted","Data":"a24112831a5b01ca54e5207485110e9f008f466b7b78728abaf3f41574b4ce21"} Feb 02 12:39:15 crc kubenswrapper[4909]: I0202 12:39:15.983429 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-vks94" event={"ID":"609b4bdb-129e-4a27-841e-a83e453dfd79","Type":"ContainerStarted","Data":"737bbe708a41c912c0c5755cd69101f61b19302a55c3edbbb5be8c62ceee37b6"} 
Feb 02 12:39:16 crc kubenswrapper[4909]: I0202 12:39:16.007227 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-vks94" podStartSLOduration=1.54544259 podStartE2EDuration="2.00721023s" podCreationTimestamp="2026-02-02 12:39:14 +0000 UTC" firstStartedPulling="2026-02-02 12:39:14.994175057 +0000 UTC m=+7680.740275792" lastFinishedPulling="2026-02-02 12:39:15.455942697 +0000 UTC m=+7681.202043432" observedRunningTime="2026-02-02 12:39:16.001964011 +0000 UTC m=+7681.748064746" watchObservedRunningTime="2026-02-02 12:39:16.00721023 +0000 UTC m=+7681.753310965" Feb 02 12:39:19 crc kubenswrapper[4909]: I0202 12:39:19.510974 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:39:19 crc kubenswrapper[4909]: I0202 12:39:19.511319 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:39:32 crc kubenswrapper[4909]: I0202 12:39:32.129496 4909 generic.go:334] "Generic (PLEG): container finished" podID="609b4bdb-129e-4a27-841e-a83e453dfd79" containerID="a24112831a5b01ca54e5207485110e9f008f466b7b78728abaf3f41574b4ce21" exitCode=0 Feb 02 12:39:32 crc kubenswrapper[4909]: I0202 12:39:32.129593 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-vks94" event={"ID":"609b4bdb-129e-4a27-841e-a83e453dfd79","Type":"ContainerDied","Data":"a24112831a5b01ca54e5207485110e9f008f466b7b78728abaf3f41574b4ce21"} Feb 02 12:39:33 crc 
kubenswrapper[4909]: I0202 12:39:33.651546 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-vks94" Feb 02 12:39:33 crc kubenswrapper[4909]: I0202 12:39:33.794296 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/609b4bdb-129e-4a27-841e-a83e453dfd79-ssh-key-openstack-cell1\") pod \"609b4bdb-129e-4a27-841e-a83e453dfd79\" (UID: \"609b4bdb-129e-4a27-841e-a83e453dfd79\") " Feb 02 12:39:33 crc kubenswrapper[4909]: I0202 12:39:33.794525 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/609b4bdb-129e-4a27-841e-a83e453dfd79-inventory\") pod \"609b4bdb-129e-4a27-841e-a83e453dfd79\" (UID: \"609b4bdb-129e-4a27-841e-a83e453dfd79\") " Feb 02 12:39:33 crc kubenswrapper[4909]: I0202 12:39:33.794584 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn6vl\" (UniqueName: \"kubernetes.io/projected/609b4bdb-129e-4a27-841e-a83e453dfd79-kube-api-access-nn6vl\") pod \"609b4bdb-129e-4a27-841e-a83e453dfd79\" (UID: \"609b4bdb-129e-4a27-841e-a83e453dfd79\") " Feb 02 12:39:33 crc kubenswrapper[4909]: I0202 12:39:33.800938 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/609b4bdb-129e-4a27-841e-a83e453dfd79-kube-api-access-nn6vl" (OuterVolumeSpecName: "kube-api-access-nn6vl") pod "609b4bdb-129e-4a27-841e-a83e453dfd79" (UID: "609b4bdb-129e-4a27-841e-a83e453dfd79"). InnerVolumeSpecName "kube-api-access-nn6vl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:39:33 crc kubenswrapper[4909]: I0202 12:39:33.834085 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/609b4bdb-129e-4a27-841e-a83e453dfd79-inventory" (OuterVolumeSpecName: "inventory") pod "609b4bdb-129e-4a27-841e-a83e453dfd79" (UID: "609b4bdb-129e-4a27-841e-a83e453dfd79"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:39:33 crc kubenswrapper[4909]: I0202 12:39:33.841327 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/609b4bdb-129e-4a27-841e-a83e453dfd79-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "609b4bdb-129e-4a27-841e-a83e453dfd79" (UID: "609b4bdb-129e-4a27-841e-a83e453dfd79"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:39:33 crc kubenswrapper[4909]: I0202 12:39:33.897914 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/609b4bdb-129e-4a27-841e-a83e453dfd79-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 02 12:39:33 crc kubenswrapper[4909]: I0202 12:39:33.897964 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/609b4bdb-129e-4a27-841e-a83e453dfd79-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 12:39:33 crc kubenswrapper[4909]: I0202 12:39:33.897977 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn6vl\" (UniqueName: \"kubernetes.io/projected/609b4bdb-129e-4a27-841e-a83e453dfd79-kube-api-access-nn6vl\") on node \"crc\" DevicePath \"\"" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.148836 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-vks94" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.148828 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-vks94" event={"ID":"609b4bdb-129e-4a27-841e-a83e453dfd79","Type":"ContainerDied","Data":"737bbe708a41c912c0c5755cd69101f61b19302a55c3edbbb5be8c62ceee37b6"} Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.148973 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="737bbe708a41c912c0c5755cd69101f61b19302a55c3edbbb5be8c62ceee37b6" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.246772 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-rdn67"] Feb 02 12:39:34 crc kubenswrapper[4909]: E0202 12:39:34.247266 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="609b4bdb-129e-4a27-841e-a83e453dfd79" containerName="reboot-os-openstack-openstack-cell1" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.247284 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="609b4bdb-129e-4a27-841e-a83e453dfd79" containerName="reboot-os-openstack-openstack-cell1" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.247545 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="609b4bdb-129e-4a27-841e-a83e453dfd79" containerName="reboot-os-openstack-openstack-cell1" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.248437 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.251164 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-lfs7z" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.251358 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-telemetry-default-certs-0" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.251479 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.251579 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-neutron-metadata-default-certs-0" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.251890 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-ovn-default-certs-0" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.252029 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.252144 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-libvirt-default-certs-0" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.254418 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.265375 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-rdn67"] Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.306848 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-ssh-key-openstack-cell1\") 
pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.306916 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.306947 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.306966 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.307300 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: 
\"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.307426 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.307483 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.307573 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.307619 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 
12:39:34.307649 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.307679 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcn7m\" (UniqueName: \"kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-kube-api-access-tcn7m\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.307758 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-inventory\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.307799 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.307882 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.307957 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-openstack-cell1-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.409614 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.409665 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.409688 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-ovn-combined-ca-bundle\") pod 
\"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.409715 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.409740 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcn7m\" (UniqueName: \"kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-kube-api-access-tcn7m\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.409760 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.409783 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc 
kubenswrapper[4909]: I0202 12:39:34.409859 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-inventory\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.409897 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.409928 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.409975 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-openstack-cell1-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.410037 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.410080 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.410108 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.410126 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.414535 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: 
\"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.414723 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.415747 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-openstack-cell1-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.416103 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-inventory\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.416554 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.416732 4909 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.416900 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.417215 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.417446 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.417875 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: 
\"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.417375 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.418591 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.419322 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.420898 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.429241 4909 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-tcn7m\" (UniqueName: \"kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-kube-api-access-tcn7m\") pod \"install-certs-openstack-openstack-cell1-rdn67\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:34 crc kubenswrapper[4909]: I0202 12:39:34.568518 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:39:35 crc kubenswrapper[4909]: I0202 12:39:35.116639 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-rdn67"] Feb 02 12:39:35 crc kubenswrapper[4909]: I0202 12:39:35.182238 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-rdn67" event={"ID":"46ce9be6-c203-4da8-a82c-e81e6e48ef99","Type":"ContainerStarted","Data":"2ce997dbd7022b06cc939ea577927fd58860d13e4008faa7c484696f750ac3ca"} Feb 02 12:39:36 crc kubenswrapper[4909]: I0202 12:39:36.192351 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-rdn67" event={"ID":"46ce9be6-c203-4da8-a82c-e81e6e48ef99","Type":"ContainerStarted","Data":"4abc97b580f91115b2e1a0f9459fe344476148987f0db7ce1dbb76d23fe054cb"} Feb 02 12:39:36 crc kubenswrapper[4909]: I0202 12:39:36.221665 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-rdn67" podStartSLOduration=1.783311471 podStartE2EDuration="2.221645561s" podCreationTimestamp="2026-02-02 12:39:34 +0000 UTC" firstStartedPulling="2026-02-02 12:39:35.131368344 +0000 UTC m=+7700.877469079" lastFinishedPulling="2026-02-02 12:39:35.569702434 +0000 UTC m=+7701.315803169" observedRunningTime="2026-02-02 12:39:36.212346777 +0000 UTC m=+7701.958447522" watchObservedRunningTime="2026-02-02 12:39:36.221645561 +0000 UTC 
m=+7701.967746296" Feb 02 12:39:49 crc kubenswrapper[4909]: I0202 12:39:49.510933 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:39:49 crc kubenswrapper[4909]: I0202 12:39:49.512671 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:40:11 crc kubenswrapper[4909]: I0202 12:40:11.504746 4909 generic.go:334] "Generic (PLEG): container finished" podID="46ce9be6-c203-4da8-a82c-e81e6e48ef99" containerID="4abc97b580f91115b2e1a0f9459fe344476148987f0db7ce1dbb76d23fe054cb" exitCode=0 Feb 02 12:40:11 crc kubenswrapper[4909]: I0202 12:40:11.504929 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-rdn67" event={"ID":"46ce9be6-c203-4da8-a82c-e81e6e48ef99","Type":"ContainerDied","Data":"4abc97b580f91115b2e1a0f9459fe344476148987f0db7ce1dbb76d23fe054cb"} Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.130218 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.136728 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-telemetry-combined-ca-bundle\") pod \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.136878 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-neutron-metadata-combined-ca-bundle\") pod \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.136965 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-openstack-cell1-neutron-metadata-default-certs-0\") pod \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.137041 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-openstack-cell1-ovn-default-certs-0\") pod \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.137102 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcn7m\" (UniqueName: \"kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-kube-api-access-tcn7m\") pod \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\" 
(UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.137196 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-neutron-sriov-combined-ca-bundle\") pod \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.137228 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-ovn-combined-ca-bundle\") pod \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.137308 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-openstack-cell1-libvirt-default-certs-0\") pod \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.137401 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-ssh-key-openstack-cell1\") pod \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.137429 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-openstack-cell1-telemetry-default-certs-0\") pod \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " Feb 02 
12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.137474 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-bootstrap-combined-ca-bundle\") pod \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.137524 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-inventory\") pod \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.137573 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-libvirt-combined-ca-bundle\") pod \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.137656 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-nova-combined-ca-bundle\") pod \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.137697 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-neutron-dhcp-combined-ca-bundle\") pod \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\" (UID: \"46ce9be6-c203-4da8-a82c-e81e6e48ef99\") " Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.145486 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "46ce9be6-c203-4da8-a82c-e81e6e48ef99" (UID: "46ce9be6-c203-4da8-a82c-e81e6e48ef99"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.145643 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-openstack-cell1-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-neutron-metadata-default-certs-0") pod "46ce9be6-c203-4da8-a82c-e81e6e48ef99" (UID: "46ce9be6-c203-4da8-a82c-e81e6e48ef99"). InnerVolumeSpecName "openstack-cell1-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.147457 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-openstack-cell1-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-telemetry-default-certs-0") pod "46ce9be6-c203-4da8-a82c-e81e6e48ef99" (UID: "46ce9be6-c203-4da8-a82c-e81e6e48ef99"). InnerVolumeSpecName "openstack-cell1-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.147825 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "46ce9be6-c203-4da8-a82c-e81e6e48ef99" (UID: "46ce9be6-c203-4da8-a82c-e81e6e48ef99"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.150001 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-kube-api-access-tcn7m" (OuterVolumeSpecName: "kube-api-access-tcn7m") pod "46ce9be6-c203-4da8-a82c-e81e6e48ef99" (UID: "46ce9be6-c203-4da8-a82c-e81e6e48ef99"). InnerVolumeSpecName "kube-api-access-tcn7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.151319 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "46ce9be6-c203-4da8-a82c-e81e6e48ef99" (UID: "46ce9be6-c203-4da8-a82c-e81e6e48ef99"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.153101 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "46ce9be6-c203-4da8-a82c-e81e6e48ef99" (UID: "46ce9be6-c203-4da8-a82c-e81e6e48ef99"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.156620 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-openstack-cell1-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-libvirt-default-certs-0") pod "46ce9be6-c203-4da8-a82c-e81e6e48ef99" (UID: "46ce9be6-c203-4da8-a82c-e81e6e48ef99"). InnerVolumeSpecName "openstack-cell1-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.156913 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-openstack-cell1-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-ovn-default-certs-0") pod "46ce9be6-c203-4da8-a82c-e81e6e48ef99" (UID: "46ce9be6-c203-4da8-a82c-e81e6e48ef99"). InnerVolumeSpecName "openstack-cell1-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.158114 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "46ce9be6-c203-4da8-a82c-e81e6e48ef99" (UID: "46ce9be6-c203-4da8-a82c-e81e6e48ef99"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.158699 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "46ce9be6-c203-4da8-a82c-e81e6e48ef99" (UID: "46ce9be6-c203-4da8-a82c-e81e6e48ef99"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.167199 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "46ce9be6-c203-4da8-a82c-e81e6e48ef99" (UID: "46ce9be6-c203-4da8-a82c-e81e6e48ef99"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.218665 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "46ce9be6-c203-4da8-a82c-e81e6e48ef99" (UID: "46ce9be6-c203-4da8-a82c-e81e6e48ef99"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.221312 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-inventory" (OuterVolumeSpecName: "inventory") pod "46ce9be6-c203-4da8-a82c-e81e6e48ef99" (UID: "46ce9be6-c203-4da8-a82c-e81e6e48ef99"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.223777 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "46ce9be6-c203-4da8-a82c-e81e6e48ef99" (UID: "46ce9be6-c203-4da8-a82c-e81e6e48ef99"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.240619 4909 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.240668 4909 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.240680 4909 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.240691 4909 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.240704 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-openstack-cell1-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.240722 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-openstack-cell1-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.240732 4909 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-tcn7m\" (UniqueName: \"kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-kube-api-access-tcn7m\") on node \"crc\" DevicePath \"\"" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.240742 4909 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.240751 4909 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.240761 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-openstack-cell1-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.240770 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.240780 4909 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/46ce9be6-c203-4da8-a82c-e81e6e48ef99-openstack-cell1-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.240790 4909 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:40:13 crc 
kubenswrapper[4909]: I0202 12:40:13.240808 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.241012 4909 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46ce9be6-c203-4da8-a82c-e81e6e48ef99-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.543958 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-rdn67" event={"ID":"46ce9be6-c203-4da8-a82c-e81e6e48ef99","Type":"ContainerDied","Data":"2ce997dbd7022b06cc939ea577927fd58860d13e4008faa7c484696f750ac3ca"} Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.544003 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ce997dbd7022b06cc939ea577927fd58860d13e4008faa7c484696f750ac3ca" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.544065 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-rdn67" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.637170 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-pc7fn"] Feb 02 12:40:13 crc kubenswrapper[4909]: E0202 12:40:13.638093 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46ce9be6-c203-4da8-a82c-e81e6e48ef99" containerName="install-certs-openstack-openstack-cell1" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.638121 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="46ce9be6-c203-4da8-a82c-e81e6e48ef99" containerName="install-certs-openstack-openstack-cell1" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.638542 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="46ce9be6-c203-4da8-a82c-e81e6e48ef99" containerName="install-certs-openstack-openstack-cell1" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.639708 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-pc7fn" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.642504 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.642504 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.642784 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-lfs7z" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.643068 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.645129 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.648350 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-pc7fn"] Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.650307 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-inventory\") pod \"ovn-openstack-openstack-cell1-pc7fn\" (UID: \"f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5\") " pod="openstack/ovn-openstack-openstack-cell1-pc7fn" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.650449 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-pc7fn\" (UID: \"f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5\") " pod="openstack/ovn-openstack-openstack-cell1-pc7fn" Feb 02 12:40:13 crc kubenswrapper[4909]: 
I0202 12:40:13.650551 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-pc7fn\" (UID: \"f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5\") " pod="openstack/ovn-openstack-openstack-cell1-pc7fn" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.650576 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h27pw\" (UniqueName: \"kubernetes.io/projected/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-kube-api-access-h27pw\") pod \"ovn-openstack-openstack-cell1-pc7fn\" (UID: \"f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5\") " pod="openstack/ovn-openstack-openstack-cell1-pc7fn" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.650748 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-pc7fn\" (UID: \"f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5\") " pod="openstack/ovn-openstack-openstack-cell1-pc7fn" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.752731 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-inventory\") pod \"ovn-openstack-openstack-cell1-pc7fn\" (UID: \"f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5\") " pod="openstack/ovn-openstack-openstack-cell1-pc7fn" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.752831 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-pc7fn\" (UID: 
\"f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5\") " pod="openstack/ovn-openstack-openstack-cell1-pc7fn" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.752886 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-pc7fn\" (UID: \"f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5\") " pod="openstack/ovn-openstack-openstack-cell1-pc7fn" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.752910 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h27pw\" (UniqueName: \"kubernetes.io/projected/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-kube-api-access-h27pw\") pod \"ovn-openstack-openstack-cell1-pc7fn\" (UID: \"f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5\") " pod="openstack/ovn-openstack-openstack-cell1-pc7fn" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.752989 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-pc7fn\" (UID: \"f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5\") " pod="openstack/ovn-openstack-openstack-cell1-pc7fn" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.754141 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-pc7fn\" (UID: \"f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5\") " pod="openstack/ovn-openstack-openstack-cell1-pc7fn" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.757130 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-pc7fn\" (UID: \"f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5\") " pod="openstack/ovn-openstack-openstack-cell1-pc7fn" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.757515 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-pc7fn\" (UID: \"f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5\") " pod="openstack/ovn-openstack-openstack-cell1-pc7fn" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.758395 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-inventory\") pod \"ovn-openstack-openstack-cell1-pc7fn\" (UID: \"f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5\") " pod="openstack/ovn-openstack-openstack-cell1-pc7fn" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.769681 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h27pw\" (UniqueName: \"kubernetes.io/projected/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-kube-api-access-h27pw\") pod \"ovn-openstack-openstack-cell1-pc7fn\" (UID: \"f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5\") " pod="openstack/ovn-openstack-openstack-cell1-pc7fn" Feb 02 12:40:13 crc kubenswrapper[4909]: I0202 12:40:13.968762 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-pc7fn" Feb 02 12:40:14 crc kubenswrapper[4909]: I0202 12:40:14.543606 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-pc7fn"] Feb 02 12:40:14 crc kubenswrapper[4909]: I0202 12:40:14.555845 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-pc7fn" event={"ID":"f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5","Type":"ContainerStarted","Data":"08a62ad6df1a8cc6359b8de5c6e82fd07f2a9b39a93c1c58925a1b408a5c3bc3"} Feb 02 12:40:15 crc kubenswrapper[4909]: I0202 12:40:15.565881 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-pc7fn" event={"ID":"f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5","Type":"ContainerStarted","Data":"39f82693b3aea84bb7bdd7e7f029cbc6494e501bea0641c4ffa70ff817e63b5f"} Feb 02 12:40:15 crc kubenswrapper[4909]: I0202 12:40:15.590439 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-pc7fn" podStartSLOduration=2.158469829 podStartE2EDuration="2.590421007s" podCreationTimestamp="2026-02-02 12:40:13 +0000 UTC" firstStartedPulling="2026-02-02 12:40:14.548516444 +0000 UTC m=+7740.294617179" lastFinishedPulling="2026-02-02 12:40:14.980467622 +0000 UTC m=+7740.726568357" observedRunningTime="2026-02-02 12:40:15.583936603 +0000 UTC m=+7741.330037348" watchObservedRunningTime="2026-02-02 12:40:15.590421007 +0000 UTC m=+7741.336521732" Feb 02 12:40:19 crc kubenswrapper[4909]: I0202 12:40:19.510836 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:40:19 crc kubenswrapper[4909]: I0202 12:40:19.511371 4909 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:40:19 crc kubenswrapper[4909]: I0202 12:40:19.511417 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 12:40:19 crc kubenswrapper[4909]: I0202 12:40:19.512256 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 12:40:19 crc kubenswrapper[4909]: I0202 12:40:19.512310 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d" gracePeriod=600 Feb 02 12:40:19 crc kubenswrapper[4909]: E0202 12:40:19.639131 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:40:20 crc kubenswrapper[4909]: I0202 12:40:20.644064 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" 
containerID="e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d" exitCode=0 Feb 02 12:40:20 crc kubenswrapper[4909]: I0202 12:40:20.644162 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d"} Feb 02 12:40:20 crc kubenswrapper[4909]: I0202 12:40:20.644440 4909 scope.go:117] "RemoveContainer" containerID="3c721b1aabb3d968c8ddbe1edde1cd3f18f6d49e271134714d86a9cde069c5c4" Feb 02 12:40:20 crc kubenswrapper[4909]: I0202 12:40:20.645589 4909 scope.go:117] "RemoveContainer" containerID="e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d" Feb 02 12:40:20 crc kubenswrapper[4909]: E0202 12:40:20.646267 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:40:31 crc kubenswrapper[4909]: I0202 12:40:31.157758 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b8lc9"] Feb 02 12:40:31 crc kubenswrapper[4909]: I0202 12:40:31.160662 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b8lc9" Feb 02 12:40:31 crc kubenswrapper[4909]: I0202 12:40:31.175560 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b8lc9"] Feb 02 12:40:31 crc kubenswrapper[4909]: I0202 12:40:31.261970 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f868edb6-1382-464a-8ad0-360cd833d2a7-utilities\") pod \"redhat-operators-b8lc9\" (UID: \"f868edb6-1382-464a-8ad0-360cd833d2a7\") " pod="openshift-marketplace/redhat-operators-b8lc9" Feb 02 12:40:31 crc kubenswrapper[4909]: I0202 12:40:31.262025 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vhmh\" (UniqueName: \"kubernetes.io/projected/f868edb6-1382-464a-8ad0-360cd833d2a7-kube-api-access-2vhmh\") pod \"redhat-operators-b8lc9\" (UID: \"f868edb6-1382-464a-8ad0-360cd833d2a7\") " pod="openshift-marketplace/redhat-operators-b8lc9" Feb 02 12:40:31 crc kubenswrapper[4909]: I0202 12:40:31.262338 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f868edb6-1382-464a-8ad0-360cd833d2a7-catalog-content\") pod \"redhat-operators-b8lc9\" (UID: \"f868edb6-1382-464a-8ad0-360cd833d2a7\") " pod="openshift-marketplace/redhat-operators-b8lc9" Feb 02 12:40:31 crc kubenswrapper[4909]: I0202 12:40:31.364203 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f868edb6-1382-464a-8ad0-360cd833d2a7-catalog-content\") pod \"redhat-operators-b8lc9\" (UID: \"f868edb6-1382-464a-8ad0-360cd833d2a7\") " pod="openshift-marketplace/redhat-operators-b8lc9" Feb 02 12:40:31 crc kubenswrapper[4909]: I0202 12:40:31.364485 4909 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f868edb6-1382-464a-8ad0-360cd833d2a7-utilities\") pod \"redhat-operators-b8lc9\" (UID: \"f868edb6-1382-464a-8ad0-360cd833d2a7\") " pod="openshift-marketplace/redhat-operators-b8lc9" Feb 02 12:40:31 crc kubenswrapper[4909]: I0202 12:40:31.364537 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vhmh\" (UniqueName: \"kubernetes.io/projected/f868edb6-1382-464a-8ad0-360cd833d2a7-kube-api-access-2vhmh\") pod \"redhat-operators-b8lc9\" (UID: \"f868edb6-1382-464a-8ad0-360cd833d2a7\") " pod="openshift-marketplace/redhat-operators-b8lc9" Feb 02 12:40:31 crc kubenswrapper[4909]: I0202 12:40:31.364739 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f868edb6-1382-464a-8ad0-360cd833d2a7-catalog-content\") pod \"redhat-operators-b8lc9\" (UID: \"f868edb6-1382-464a-8ad0-360cd833d2a7\") " pod="openshift-marketplace/redhat-operators-b8lc9" Feb 02 12:40:31 crc kubenswrapper[4909]: I0202 12:40:31.365088 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f868edb6-1382-464a-8ad0-360cd833d2a7-utilities\") pod \"redhat-operators-b8lc9\" (UID: \"f868edb6-1382-464a-8ad0-360cd833d2a7\") " pod="openshift-marketplace/redhat-operators-b8lc9" Feb 02 12:40:31 crc kubenswrapper[4909]: I0202 12:40:31.386587 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vhmh\" (UniqueName: \"kubernetes.io/projected/f868edb6-1382-464a-8ad0-360cd833d2a7-kube-api-access-2vhmh\") pod \"redhat-operators-b8lc9\" (UID: \"f868edb6-1382-464a-8ad0-360cd833d2a7\") " pod="openshift-marketplace/redhat-operators-b8lc9" Feb 02 12:40:31 crc kubenswrapper[4909]: I0202 12:40:31.486397 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b8lc9" Feb 02 12:40:31 crc kubenswrapper[4909]: I0202 12:40:31.955508 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b8lc9"] Feb 02 12:40:31 crc kubenswrapper[4909]: W0202 12:40:31.960920 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf868edb6_1382_464a_8ad0_360cd833d2a7.slice/crio-a69a31f9fc38b14dc55181fdcefa48bfb4a60be7bcc811a5217d0ca743a89ffc WatchSource:0}: Error finding container a69a31f9fc38b14dc55181fdcefa48bfb4a60be7bcc811a5217d0ca743a89ffc: Status 404 returned error can't find the container with id a69a31f9fc38b14dc55181fdcefa48bfb4a60be7bcc811a5217d0ca743a89ffc Feb 02 12:40:32 crc kubenswrapper[4909]: I0202 12:40:32.780407 4909 generic.go:334] "Generic (PLEG): container finished" podID="f868edb6-1382-464a-8ad0-360cd833d2a7" containerID="5c53d01fb848ecda8bc002e16997a02fef748711f6537df54e45da6fa7d46702" exitCode=0 Feb 02 12:40:32 crc kubenswrapper[4909]: I0202 12:40:32.780481 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b8lc9" event={"ID":"f868edb6-1382-464a-8ad0-360cd833d2a7","Type":"ContainerDied","Data":"5c53d01fb848ecda8bc002e16997a02fef748711f6537df54e45da6fa7d46702"} Feb 02 12:40:32 crc kubenswrapper[4909]: I0202 12:40:32.780707 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b8lc9" event={"ID":"f868edb6-1382-464a-8ad0-360cd833d2a7","Type":"ContainerStarted","Data":"a69a31f9fc38b14dc55181fdcefa48bfb4a60be7bcc811a5217d0ca743a89ffc"} Feb 02 12:40:34 crc kubenswrapper[4909]: I0202 12:40:34.803244 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b8lc9" 
event={"ID":"f868edb6-1382-464a-8ad0-360cd833d2a7","Type":"ContainerStarted","Data":"202046b9cf3c03ca9f24bc5d380b22d379390b5ba1a8bed17494be47681e39b5"} Feb 02 12:40:35 crc kubenswrapper[4909]: I0202 12:40:35.023195 4909 scope.go:117] "RemoveContainer" containerID="e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d" Feb 02 12:40:35 crc kubenswrapper[4909]: E0202 12:40:35.023789 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:40:38 crc kubenswrapper[4909]: I0202 12:40:38.841683 4909 generic.go:334] "Generic (PLEG): container finished" podID="f868edb6-1382-464a-8ad0-360cd833d2a7" containerID="202046b9cf3c03ca9f24bc5d380b22d379390b5ba1a8bed17494be47681e39b5" exitCode=0 Feb 02 12:40:38 crc kubenswrapper[4909]: I0202 12:40:38.841771 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b8lc9" event={"ID":"f868edb6-1382-464a-8ad0-360cd833d2a7","Type":"ContainerDied","Data":"202046b9cf3c03ca9f24bc5d380b22d379390b5ba1a8bed17494be47681e39b5"} Feb 02 12:40:38 crc kubenswrapper[4909]: I0202 12:40:38.845261 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 12:40:39 crc kubenswrapper[4909]: I0202 12:40:39.854650 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b8lc9" event={"ID":"f868edb6-1382-464a-8ad0-360cd833d2a7","Type":"ContainerStarted","Data":"0387ba14ede6db61f3691b5197a6bf8f07aaec20361959ec57736cafecdf379f"} Feb 02 12:40:39 crc kubenswrapper[4909]: I0202 12:40:39.879304 4909 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-marketplace/redhat-operators-b8lc9" podStartSLOduration=2.375166314 podStartE2EDuration="8.87928151s" podCreationTimestamp="2026-02-02 12:40:31 +0000 UTC" firstStartedPulling="2026-02-02 12:40:32.784555609 +0000 UTC m=+7758.530656344" lastFinishedPulling="2026-02-02 12:40:39.288670805 +0000 UTC m=+7765.034771540" observedRunningTime="2026-02-02 12:40:39.869502423 +0000 UTC m=+7765.615603168" watchObservedRunningTime="2026-02-02 12:40:39.87928151 +0000 UTC m=+7765.625382245" Feb 02 12:40:41 crc kubenswrapper[4909]: I0202 12:40:41.486968 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b8lc9" Feb 02 12:40:41 crc kubenswrapper[4909]: I0202 12:40:41.487231 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b8lc9" Feb 02 12:40:42 crc kubenswrapper[4909]: I0202 12:40:42.541318 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b8lc9" podUID="f868edb6-1382-464a-8ad0-360cd833d2a7" containerName="registry-server" probeResult="failure" output=< Feb 02 12:40:42 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Feb 02 12:40:42 crc kubenswrapper[4909]: > Feb 02 12:40:48 crc kubenswrapper[4909]: I0202 12:40:48.016348 4909 scope.go:117] "RemoveContainer" containerID="e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d" Feb 02 12:40:48 crc kubenswrapper[4909]: E0202 12:40:48.017206 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:40:52 crc 
kubenswrapper[4909]: I0202 12:40:52.528507 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b8lc9" podUID="f868edb6-1382-464a-8ad0-360cd833d2a7" containerName="registry-server" probeResult="failure" output=< Feb 02 12:40:52 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Feb 02 12:40:52 crc kubenswrapper[4909]: > Feb 02 12:40:59 crc kubenswrapper[4909]: I0202 12:40:59.016845 4909 scope.go:117] "RemoveContainer" containerID="e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d" Feb 02 12:40:59 crc kubenswrapper[4909]: E0202 12:40:59.017691 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:41:01 crc kubenswrapper[4909]: I0202 12:41:01.539723 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b8lc9" Feb 02 12:41:01 crc kubenswrapper[4909]: I0202 12:41:01.588391 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b8lc9" Feb 02 12:41:02 crc kubenswrapper[4909]: I0202 12:41:02.350069 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b8lc9"] Feb 02 12:41:03 crc kubenswrapper[4909]: I0202 12:41:03.069989 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b8lc9" podUID="f868edb6-1382-464a-8ad0-360cd833d2a7" containerName="registry-server" containerID="cri-o://0387ba14ede6db61f3691b5197a6bf8f07aaec20361959ec57736cafecdf379f" gracePeriod=2 Feb 02 12:41:03 
crc kubenswrapper[4909]: I0202 12:41:03.642611 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b8lc9" Feb 02 12:41:03 crc kubenswrapper[4909]: I0202 12:41:03.762504 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vhmh\" (UniqueName: \"kubernetes.io/projected/f868edb6-1382-464a-8ad0-360cd833d2a7-kube-api-access-2vhmh\") pod \"f868edb6-1382-464a-8ad0-360cd833d2a7\" (UID: \"f868edb6-1382-464a-8ad0-360cd833d2a7\") " Feb 02 12:41:03 crc kubenswrapper[4909]: I0202 12:41:03.762605 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f868edb6-1382-464a-8ad0-360cd833d2a7-utilities\") pod \"f868edb6-1382-464a-8ad0-360cd833d2a7\" (UID: \"f868edb6-1382-464a-8ad0-360cd833d2a7\") " Feb 02 12:41:03 crc kubenswrapper[4909]: I0202 12:41:03.762652 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f868edb6-1382-464a-8ad0-360cd833d2a7-catalog-content\") pod \"f868edb6-1382-464a-8ad0-360cd833d2a7\" (UID: \"f868edb6-1382-464a-8ad0-360cd833d2a7\") " Feb 02 12:41:03 crc kubenswrapper[4909]: I0202 12:41:03.763631 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f868edb6-1382-464a-8ad0-360cd833d2a7-utilities" (OuterVolumeSpecName: "utilities") pod "f868edb6-1382-464a-8ad0-360cd833d2a7" (UID: "f868edb6-1382-464a-8ad0-360cd833d2a7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:41:03 crc kubenswrapper[4909]: I0202 12:41:03.772102 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f868edb6-1382-464a-8ad0-360cd833d2a7-kube-api-access-2vhmh" (OuterVolumeSpecName: "kube-api-access-2vhmh") pod "f868edb6-1382-464a-8ad0-360cd833d2a7" (UID: "f868edb6-1382-464a-8ad0-360cd833d2a7"). InnerVolumeSpecName "kube-api-access-2vhmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:41:03 crc kubenswrapper[4909]: I0202 12:41:03.865159 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vhmh\" (UniqueName: \"kubernetes.io/projected/f868edb6-1382-464a-8ad0-360cd833d2a7-kube-api-access-2vhmh\") on node \"crc\" DevicePath \"\"" Feb 02 12:41:03 crc kubenswrapper[4909]: I0202 12:41:03.865490 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f868edb6-1382-464a-8ad0-360cd833d2a7-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 12:41:03 crc kubenswrapper[4909]: I0202 12:41:03.884911 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f868edb6-1382-464a-8ad0-360cd833d2a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f868edb6-1382-464a-8ad0-360cd833d2a7" (UID: "f868edb6-1382-464a-8ad0-360cd833d2a7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:41:03 crc kubenswrapper[4909]: I0202 12:41:03.967258 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f868edb6-1382-464a-8ad0-360cd833d2a7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 12:41:04 crc kubenswrapper[4909]: I0202 12:41:04.106422 4909 generic.go:334] "Generic (PLEG): container finished" podID="f868edb6-1382-464a-8ad0-360cd833d2a7" containerID="0387ba14ede6db61f3691b5197a6bf8f07aaec20361959ec57736cafecdf379f" exitCode=0 Feb 02 12:41:04 crc kubenswrapper[4909]: I0202 12:41:04.106477 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b8lc9" event={"ID":"f868edb6-1382-464a-8ad0-360cd833d2a7","Type":"ContainerDied","Data":"0387ba14ede6db61f3691b5197a6bf8f07aaec20361959ec57736cafecdf379f"} Feb 02 12:41:04 crc kubenswrapper[4909]: I0202 12:41:04.106514 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b8lc9" event={"ID":"f868edb6-1382-464a-8ad0-360cd833d2a7","Type":"ContainerDied","Data":"a69a31f9fc38b14dc55181fdcefa48bfb4a60be7bcc811a5217d0ca743a89ffc"} Feb 02 12:41:04 crc kubenswrapper[4909]: I0202 12:41:04.107044 4909 scope.go:117] "RemoveContainer" containerID="0387ba14ede6db61f3691b5197a6bf8f07aaec20361959ec57736cafecdf379f" Feb 02 12:41:04 crc kubenswrapper[4909]: I0202 12:41:04.107111 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b8lc9" Feb 02 12:41:04 crc kubenswrapper[4909]: I0202 12:41:04.134574 4909 scope.go:117] "RemoveContainer" containerID="202046b9cf3c03ca9f24bc5d380b22d379390b5ba1a8bed17494be47681e39b5" Feb 02 12:41:04 crc kubenswrapper[4909]: I0202 12:41:04.155085 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b8lc9"] Feb 02 12:41:04 crc kubenswrapper[4909]: I0202 12:41:04.162071 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b8lc9"] Feb 02 12:41:04 crc kubenswrapper[4909]: I0202 12:41:04.175233 4909 scope.go:117] "RemoveContainer" containerID="5c53d01fb848ecda8bc002e16997a02fef748711f6537df54e45da6fa7d46702" Feb 02 12:41:04 crc kubenswrapper[4909]: I0202 12:41:04.231504 4909 scope.go:117] "RemoveContainer" containerID="0387ba14ede6db61f3691b5197a6bf8f07aaec20361959ec57736cafecdf379f" Feb 02 12:41:04 crc kubenswrapper[4909]: E0202 12:41:04.232112 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0387ba14ede6db61f3691b5197a6bf8f07aaec20361959ec57736cafecdf379f\": container with ID starting with 0387ba14ede6db61f3691b5197a6bf8f07aaec20361959ec57736cafecdf379f not found: ID does not exist" containerID="0387ba14ede6db61f3691b5197a6bf8f07aaec20361959ec57736cafecdf379f" Feb 02 12:41:04 crc kubenswrapper[4909]: I0202 12:41:04.232193 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0387ba14ede6db61f3691b5197a6bf8f07aaec20361959ec57736cafecdf379f"} err="failed to get container status \"0387ba14ede6db61f3691b5197a6bf8f07aaec20361959ec57736cafecdf379f\": rpc error: code = NotFound desc = could not find container \"0387ba14ede6db61f3691b5197a6bf8f07aaec20361959ec57736cafecdf379f\": container with ID starting with 0387ba14ede6db61f3691b5197a6bf8f07aaec20361959ec57736cafecdf379f not found: ID does 
not exist" Feb 02 12:41:04 crc kubenswrapper[4909]: I0202 12:41:04.232229 4909 scope.go:117] "RemoveContainer" containerID="202046b9cf3c03ca9f24bc5d380b22d379390b5ba1a8bed17494be47681e39b5" Feb 02 12:41:04 crc kubenswrapper[4909]: E0202 12:41:04.232797 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"202046b9cf3c03ca9f24bc5d380b22d379390b5ba1a8bed17494be47681e39b5\": container with ID starting with 202046b9cf3c03ca9f24bc5d380b22d379390b5ba1a8bed17494be47681e39b5 not found: ID does not exist" containerID="202046b9cf3c03ca9f24bc5d380b22d379390b5ba1a8bed17494be47681e39b5" Feb 02 12:41:04 crc kubenswrapper[4909]: I0202 12:41:04.233862 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"202046b9cf3c03ca9f24bc5d380b22d379390b5ba1a8bed17494be47681e39b5"} err="failed to get container status \"202046b9cf3c03ca9f24bc5d380b22d379390b5ba1a8bed17494be47681e39b5\": rpc error: code = NotFound desc = could not find container \"202046b9cf3c03ca9f24bc5d380b22d379390b5ba1a8bed17494be47681e39b5\": container with ID starting with 202046b9cf3c03ca9f24bc5d380b22d379390b5ba1a8bed17494be47681e39b5 not found: ID does not exist" Feb 02 12:41:04 crc kubenswrapper[4909]: I0202 12:41:04.233907 4909 scope.go:117] "RemoveContainer" containerID="5c53d01fb848ecda8bc002e16997a02fef748711f6537df54e45da6fa7d46702" Feb 02 12:41:04 crc kubenswrapper[4909]: E0202 12:41:04.234946 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c53d01fb848ecda8bc002e16997a02fef748711f6537df54e45da6fa7d46702\": container with ID starting with 5c53d01fb848ecda8bc002e16997a02fef748711f6537df54e45da6fa7d46702 not found: ID does not exist" containerID="5c53d01fb848ecda8bc002e16997a02fef748711f6537df54e45da6fa7d46702" Feb 02 12:41:04 crc kubenswrapper[4909]: I0202 12:41:04.235001 4909 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c53d01fb848ecda8bc002e16997a02fef748711f6537df54e45da6fa7d46702"} err="failed to get container status \"5c53d01fb848ecda8bc002e16997a02fef748711f6537df54e45da6fa7d46702\": rpc error: code = NotFound desc = could not find container \"5c53d01fb848ecda8bc002e16997a02fef748711f6537df54e45da6fa7d46702\": container with ID starting with 5c53d01fb848ecda8bc002e16997a02fef748711f6537df54e45da6fa7d46702 not found: ID does not exist" Feb 02 12:41:05 crc kubenswrapper[4909]: I0202 12:41:05.029717 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f868edb6-1382-464a-8ad0-360cd833d2a7" path="/var/lib/kubelet/pods/f868edb6-1382-464a-8ad0-360cd833d2a7/volumes" Feb 02 12:41:11 crc kubenswrapper[4909]: I0202 12:41:11.017209 4909 scope.go:117] "RemoveContainer" containerID="e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d" Feb 02 12:41:11 crc kubenswrapper[4909]: E0202 12:41:11.018178 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:41:16 crc kubenswrapper[4909]: I0202 12:41:16.231139 4909 generic.go:334] "Generic (PLEG): container finished" podID="f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5" containerID="39f82693b3aea84bb7bdd7e7f029cbc6494e501bea0641c4ffa70ff817e63b5f" exitCode=0 Feb 02 12:41:16 crc kubenswrapper[4909]: I0202 12:41:16.231221 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-pc7fn" event={"ID":"f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5","Type":"ContainerDied","Data":"39f82693b3aea84bb7bdd7e7f029cbc6494e501bea0641c4ffa70ff817e63b5f"} Feb 02 
12:41:17 crc kubenswrapper[4909]: I0202 12:41:17.762120 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-pc7fn" Feb 02 12:41:17 crc kubenswrapper[4909]: I0202 12:41:17.885502 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-inventory\") pod \"f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5\" (UID: \"f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5\") " Feb 02 12:41:17 crc kubenswrapper[4909]: I0202 12:41:17.885717 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-ovncontroller-config-0\") pod \"f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5\" (UID: \"f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5\") " Feb 02 12:41:17 crc kubenswrapper[4909]: I0202 12:41:17.885879 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-ssh-key-openstack-cell1\") pod \"f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5\" (UID: \"f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5\") " Feb 02 12:41:17 crc kubenswrapper[4909]: I0202 12:41:17.885934 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-ovn-combined-ca-bundle\") pod \"f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5\" (UID: \"f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5\") " Feb 02 12:41:17 crc kubenswrapper[4909]: I0202 12:41:17.885966 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h27pw\" (UniqueName: \"kubernetes.io/projected/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-kube-api-access-h27pw\") pod \"f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5\" (UID: 
\"f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5\") " Feb 02 12:41:17 crc kubenswrapper[4909]: I0202 12:41:17.893126 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5" (UID: "f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:41:17 crc kubenswrapper[4909]: I0202 12:41:17.897336 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-kube-api-access-h27pw" (OuterVolumeSpecName: "kube-api-access-h27pw") pod "f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5" (UID: "f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5"). InnerVolumeSpecName "kube-api-access-h27pw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:41:17 crc kubenswrapper[4909]: I0202 12:41:17.916453 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5" (UID: "f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:41:17 crc kubenswrapper[4909]: I0202 12:41:17.920173 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5" (UID: "f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:41:17 crc kubenswrapper[4909]: I0202 12:41:17.922606 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-inventory" (OuterVolumeSpecName: "inventory") pod "f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5" (UID: "f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:41:17 crc kubenswrapper[4909]: I0202 12:41:17.988906 4909 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:41:17 crc kubenswrapper[4909]: I0202 12:41:17.988938 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h27pw\" (UniqueName: \"kubernetes.io/projected/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-kube-api-access-h27pw\") on node \"crc\" DevicePath \"\"" Feb 02 12:41:17 crc kubenswrapper[4909]: I0202 12:41:17.988947 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 12:41:17 crc kubenswrapper[4909]: I0202 12:41:17.988957 4909 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 02 12:41:17 crc kubenswrapper[4909]: I0202 12:41:17.988965 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.251545 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-openstack-openstack-cell1-pc7fn" event={"ID":"f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5","Type":"ContainerDied","Data":"08a62ad6df1a8cc6359b8de5c6e82fd07f2a9b39a93c1c58925a1b408a5c3bc3"} Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.251607 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08a62ad6df1a8cc6359b8de5c6e82fd07f2a9b39a93c1c58925a1b408a5c3bc3" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.251636 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-pc7fn" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.352384 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-cmt4b"] Feb 02 12:41:18 crc kubenswrapper[4909]: E0202 12:41:18.353031 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5" containerName="ovn-openstack-openstack-cell1" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.353053 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5" containerName="ovn-openstack-openstack-cell1" Feb 02 12:41:18 crc kubenswrapper[4909]: E0202 12:41:18.353081 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f868edb6-1382-464a-8ad0-360cd833d2a7" containerName="extract-content" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.353089 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f868edb6-1382-464a-8ad0-360cd833d2a7" containerName="extract-content" Feb 02 12:41:18 crc kubenswrapper[4909]: E0202 12:41:18.353105 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f868edb6-1382-464a-8ad0-360cd833d2a7" containerName="registry-server" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.353111 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f868edb6-1382-464a-8ad0-360cd833d2a7" 
containerName="registry-server" Feb 02 12:41:18 crc kubenswrapper[4909]: E0202 12:41:18.353136 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f868edb6-1382-464a-8ad0-360cd833d2a7" containerName="extract-utilities" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.353142 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="f868edb6-1382-464a-8ad0-360cd833d2a7" containerName="extract-utilities" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.353363 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5" containerName="ovn-openstack-openstack-cell1" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.353388 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="f868edb6-1382-464a-8ad0-360cd833d2a7" containerName="registry-server" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.354320 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-cmt4b" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.361252 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.362125 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.362451 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.362699 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.365561 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.365739 
4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-lfs7z" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.368789 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-cmt4b"] Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.500615 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-cmt4b\" (UID: \"5588b6ad-43e1-489d-8157-b9ed3a7da9de\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-cmt4b" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.500673 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-cmt4b\" (UID: \"5588b6ad-43e1-489d-8157-b9ed3a7da9de\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-cmt4b" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.500845 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-cmt4b\" (UID: \"5588b6ad-43e1-489d-8157-b9ed3a7da9de\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-cmt4b" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.501359 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-cmt4b\" (UID: \"5588b6ad-43e1-489d-8157-b9ed3a7da9de\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-cmt4b" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.501420 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzp5h\" (UniqueName: \"kubernetes.io/projected/5588b6ad-43e1-489d-8157-b9ed3a7da9de-kube-api-access-vzp5h\") pod \"neutron-metadata-openstack-openstack-cell1-cmt4b\" (UID: \"5588b6ad-43e1-489d-8157-b9ed3a7da9de\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-cmt4b" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.501475 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-cmt4b\" (UID: \"5588b6ad-43e1-489d-8157-b9ed3a7da9de\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-cmt4b" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.603056 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-cmt4b\" (UID: \"5588b6ad-43e1-489d-8157-b9ed3a7da9de\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-cmt4b" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.603117 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzp5h\" (UniqueName: \"kubernetes.io/projected/5588b6ad-43e1-489d-8157-b9ed3a7da9de-kube-api-access-vzp5h\") pod \"neutron-metadata-openstack-openstack-cell1-cmt4b\" (UID: 
\"5588b6ad-43e1-489d-8157-b9ed3a7da9de\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-cmt4b" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.603153 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-cmt4b\" (UID: \"5588b6ad-43e1-489d-8157-b9ed3a7da9de\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-cmt4b" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.603244 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-cmt4b\" (UID: \"5588b6ad-43e1-489d-8157-b9ed3a7da9de\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-cmt4b" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.603292 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-cmt4b\" (UID: \"5588b6ad-43e1-489d-8157-b9ed3a7da9de\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-cmt4b" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.603322 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-cmt4b\" (UID: \"5588b6ad-43e1-489d-8157-b9ed3a7da9de\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-cmt4b" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.607000 4909 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-cmt4b\" (UID: \"5588b6ad-43e1-489d-8157-b9ed3a7da9de\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-cmt4b" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.607066 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-cmt4b\" (UID: \"5588b6ad-43e1-489d-8157-b9ed3a7da9de\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-cmt4b" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.607283 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-cmt4b\" (UID: \"5588b6ad-43e1-489d-8157-b9ed3a7da9de\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-cmt4b" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.608085 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-cmt4b\" (UID: \"5588b6ad-43e1-489d-8157-b9ed3a7da9de\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-cmt4b" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.618331 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-inventory\") pod 
\"neutron-metadata-openstack-openstack-cell1-cmt4b\" (UID: \"5588b6ad-43e1-489d-8157-b9ed3a7da9de\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-cmt4b" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.621952 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzp5h\" (UniqueName: \"kubernetes.io/projected/5588b6ad-43e1-489d-8157-b9ed3a7da9de-kube-api-access-vzp5h\") pod \"neutron-metadata-openstack-openstack-cell1-cmt4b\" (UID: \"5588b6ad-43e1-489d-8157-b9ed3a7da9de\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-cmt4b" Feb 02 12:41:18 crc kubenswrapper[4909]: I0202 12:41:18.674753 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-cmt4b" Feb 02 12:41:19 crc kubenswrapper[4909]: I0202 12:41:19.250447 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-cmt4b"] Feb 02 12:41:20 crc kubenswrapper[4909]: I0202 12:41:20.272127 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-cmt4b" event={"ID":"5588b6ad-43e1-489d-8157-b9ed3a7da9de","Type":"ContainerStarted","Data":"3a6762e2c15ebdc3d2d32a26c8694d555de0a9ebced279bfc02ea96ffa9a4f05"} Feb 02 12:41:20 crc kubenswrapper[4909]: I0202 12:41:20.272698 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-cmt4b" event={"ID":"5588b6ad-43e1-489d-8157-b9ed3a7da9de","Type":"ContainerStarted","Data":"38bf5c6f82433775d758d5ee2e84a07f63241aaeed20e2546aa5caf434d0397e"} Feb 02 12:41:20 crc kubenswrapper[4909]: I0202 12:41:20.295998 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-cmt4b" podStartSLOduration=1.874080111 podStartE2EDuration="2.295977344s" podCreationTimestamp="2026-02-02 12:41:18 +0000 UTC" 
firstStartedPulling="2026-02-02 12:41:19.258431014 +0000 UTC m=+7805.004531749" lastFinishedPulling="2026-02-02 12:41:19.680328247 +0000 UTC m=+7805.426428982" observedRunningTime="2026-02-02 12:41:20.288496201 +0000 UTC m=+7806.034596946" watchObservedRunningTime="2026-02-02 12:41:20.295977344 +0000 UTC m=+7806.042078079" Feb 02 12:41:24 crc kubenswrapper[4909]: I0202 12:41:24.017086 4909 scope.go:117] "RemoveContainer" containerID="e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d" Feb 02 12:41:24 crc kubenswrapper[4909]: E0202 12:41:24.017876 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:41:38 crc kubenswrapper[4909]: I0202 12:41:38.017212 4909 scope.go:117] "RemoveContainer" containerID="e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d" Feb 02 12:41:38 crc kubenswrapper[4909]: E0202 12:41:38.018079 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:41:52 crc kubenswrapper[4909]: I0202 12:41:52.016924 4909 scope.go:117] "RemoveContainer" containerID="e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d" Feb 02 12:41:52 crc kubenswrapper[4909]: E0202 12:41:52.017679 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:42:05 crc kubenswrapper[4909]: I0202 12:42:05.035112 4909 scope.go:117] "RemoveContainer" containerID="e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d" Feb 02 12:42:05 crc kubenswrapper[4909]: E0202 12:42:05.037024 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:42:08 crc kubenswrapper[4909]: I0202 12:42:08.729150 4909 generic.go:334] "Generic (PLEG): container finished" podID="5588b6ad-43e1-489d-8157-b9ed3a7da9de" containerID="3a6762e2c15ebdc3d2d32a26c8694d555de0a9ebced279bfc02ea96ffa9a4f05" exitCode=0 Feb 02 12:42:08 crc kubenswrapper[4909]: I0202 12:42:08.729218 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-cmt4b" event={"ID":"5588b6ad-43e1-489d-8157-b9ed3a7da9de","Type":"ContainerDied","Data":"3a6762e2c15ebdc3d2d32a26c8694d555de0a9ebced279bfc02ea96ffa9a4f05"} Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:10.802359 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-cmt4b" event={"ID":"5588b6ad-43e1-489d-8157-b9ed3a7da9de","Type":"ContainerDied","Data":"38bf5c6f82433775d758d5ee2e84a07f63241aaeed20e2546aa5caf434d0397e"} Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 
12:42:10.802983 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38bf5c6f82433775d758d5ee2e84a07f63241aaeed20e2546aa5caf434d0397e" Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:10.832082 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-cmt4b" Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:10.861357 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-neutron-metadata-combined-ca-bundle\") pod \"5588b6ad-43e1-489d-8157-b9ed3a7da9de\" (UID: \"5588b6ad-43e1-489d-8157-b9ed3a7da9de\") " Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:10.861398 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-neutron-ovn-metadata-agent-neutron-config-0\") pod \"5588b6ad-43e1-489d-8157-b9ed3a7da9de\" (UID: \"5588b6ad-43e1-489d-8157-b9ed3a7da9de\") " Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:10.861536 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzp5h\" (UniqueName: \"kubernetes.io/projected/5588b6ad-43e1-489d-8157-b9ed3a7da9de-kube-api-access-vzp5h\") pod \"5588b6ad-43e1-489d-8157-b9ed3a7da9de\" (UID: \"5588b6ad-43e1-489d-8157-b9ed3a7da9de\") " Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:10.861713 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-inventory\") pod \"5588b6ad-43e1-489d-8157-b9ed3a7da9de\" (UID: \"5588b6ad-43e1-489d-8157-b9ed3a7da9de\") " Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:10.861749 4909 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-nova-metadata-neutron-config-0\") pod \"5588b6ad-43e1-489d-8157-b9ed3a7da9de\" (UID: \"5588b6ad-43e1-489d-8157-b9ed3a7da9de\") " Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:10.861769 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-ssh-key-openstack-cell1\") pod \"5588b6ad-43e1-489d-8157-b9ed3a7da9de\" (UID: \"5588b6ad-43e1-489d-8157-b9ed3a7da9de\") " Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:10.893068 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "5588b6ad-43e1-489d-8157-b9ed3a7da9de" (UID: "5588b6ad-43e1-489d-8157-b9ed3a7da9de"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:10.910177 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5588b6ad-43e1-489d-8157-b9ed3a7da9de-kube-api-access-vzp5h" (OuterVolumeSpecName: "kube-api-access-vzp5h") pod "5588b6ad-43e1-489d-8157-b9ed3a7da9de" (UID: "5588b6ad-43e1-489d-8157-b9ed3a7da9de"). InnerVolumeSpecName "kube-api-access-vzp5h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:10.912562 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "5588b6ad-43e1-489d-8157-b9ed3a7da9de" (UID: "5588b6ad-43e1-489d-8157-b9ed3a7da9de"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:10.914539 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "5588b6ad-43e1-489d-8157-b9ed3a7da9de" (UID: "5588b6ad-43e1-489d-8157-b9ed3a7da9de"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:10.935725 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-inventory" (OuterVolumeSpecName: "inventory") pod "5588b6ad-43e1-489d-8157-b9ed3a7da9de" (UID: "5588b6ad-43e1-489d-8157-b9ed3a7da9de"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:10.941174 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "5588b6ad-43e1-489d-8157-b9ed3a7da9de" (UID: "5588b6ad-43e1-489d-8157-b9ed3a7da9de"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:10.964795 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzp5h\" (UniqueName: \"kubernetes.io/projected/5588b6ad-43e1-489d-8157-b9ed3a7da9de-kube-api-access-vzp5h\") on node \"crc\" DevicePath \"\"" Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:10.965061 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:10.965078 4909 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:10.965090 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:10.965102 4909 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:10.965118 4909 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5588b6ad-43e1-489d-8157-b9ed3a7da9de-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:11.810622 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-cmt4b" Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:11.951459 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-hpv94"] Feb 02 12:42:11 crc kubenswrapper[4909]: E0202 12:42:11.952170 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5588b6ad-43e1-489d-8157-b9ed3a7da9de" containerName="neutron-metadata-openstack-openstack-cell1" Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:11.952200 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5588b6ad-43e1-489d-8157-b9ed3a7da9de" containerName="neutron-metadata-openstack-openstack-cell1" Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:11.952482 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5588b6ad-43e1-489d-8157-b9ed3a7da9de" containerName="neutron-metadata-openstack-openstack-cell1" Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:11.953395 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-hpv94" Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:11.955396 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:11.957963 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:11.958159 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-lfs7z" Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:11.958206 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:11.958375 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:11.968283 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-hpv94"] Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:11.983467 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-hpv94\" (UID: \"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c\") " pod="openstack/libvirt-openstack-openstack-cell1-hpv94" Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:11.983508 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-hpv94\" (UID: \"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c\") " pod="openstack/libvirt-openstack-openstack-cell1-hpv94" Feb 02 
12:42:11 crc kubenswrapper[4909]: I0202 12:42:11.983545 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-inventory\") pod \"libvirt-openstack-openstack-cell1-hpv94\" (UID: \"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c\") " pod="openstack/libvirt-openstack-openstack-cell1-hpv94" Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:11.983586 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb489\" (UniqueName: \"kubernetes.io/projected/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-kube-api-access-sb489\") pod \"libvirt-openstack-openstack-cell1-hpv94\" (UID: \"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c\") " pod="openstack/libvirt-openstack-openstack-cell1-hpv94" Feb 02 12:42:11 crc kubenswrapper[4909]: I0202 12:42:11.983763 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-hpv94\" (UID: \"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c\") " pod="openstack/libvirt-openstack-openstack-cell1-hpv94" Feb 02 12:42:12 crc kubenswrapper[4909]: I0202 12:42:12.084948 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-hpv94\" (UID: \"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c\") " pod="openstack/libvirt-openstack-openstack-cell1-hpv94" Feb 02 12:42:12 crc kubenswrapper[4909]: I0202 12:42:12.085025 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-libvirt-secret-0\") pod 
\"libvirt-openstack-openstack-cell1-hpv94\" (UID: \"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c\") " pod="openstack/libvirt-openstack-openstack-cell1-hpv94" Feb 02 12:42:12 crc kubenswrapper[4909]: I0202 12:42:12.085054 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-hpv94\" (UID: \"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c\") " pod="openstack/libvirt-openstack-openstack-cell1-hpv94" Feb 02 12:42:12 crc kubenswrapper[4909]: I0202 12:42:12.085095 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-inventory\") pod \"libvirt-openstack-openstack-cell1-hpv94\" (UID: \"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c\") " pod="openstack/libvirt-openstack-openstack-cell1-hpv94" Feb 02 12:42:12 crc kubenswrapper[4909]: I0202 12:42:12.085131 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb489\" (UniqueName: \"kubernetes.io/projected/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-kube-api-access-sb489\") pod \"libvirt-openstack-openstack-cell1-hpv94\" (UID: \"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c\") " pod="openstack/libvirt-openstack-openstack-cell1-hpv94" Feb 02 12:42:12 crc kubenswrapper[4909]: I0202 12:42:12.091493 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-hpv94\" (UID: \"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c\") " pod="openstack/libvirt-openstack-openstack-cell1-hpv94" Feb 02 12:42:12 crc kubenswrapper[4909]: I0202 12:42:12.094181 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-inventory\") pod \"libvirt-openstack-openstack-cell1-hpv94\" (UID: \"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c\") " pod="openstack/libvirt-openstack-openstack-cell1-hpv94" Feb 02 12:42:12 crc kubenswrapper[4909]: I0202 12:42:12.094218 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-hpv94\" (UID: \"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c\") " pod="openstack/libvirt-openstack-openstack-cell1-hpv94" Feb 02 12:42:12 crc kubenswrapper[4909]: I0202 12:42:12.095682 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-hpv94\" (UID: \"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c\") " pod="openstack/libvirt-openstack-openstack-cell1-hpv94" Feb 02 12:42:12 crc kubenswrapper[4909]: I0202 12:42:12.104372 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb489\" (UniqueName: \"kubernetes.io/projected/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-kube-api-access-sb489\") pod \"libvirt-openstack-openstack-cell1-hpv94\" (UID: \"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c\") " pod="openstack/libvirt-openstack-openstack-cell1-hpv94" Feb 02 12:42:12 crc kubenswrapper[4909]: I0202 12:42:12.277429 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-hpv94" Feb 02 12:42:12 crc kubenswrapper[4909]: I0202 12:42:12.793437 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-hpv94"] Feb 02 12:42:12 crc kubenswrapper[4909]: I0202 12:42:12.828754 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-hpv94" event={"ID":"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c","Type":"ContainerStarted","Data":"841fc09cc4e0b2e00a77e885df092056b05e0f369f5e512dceacc53e854b7e4f"} Feb 02 12:42:13 crc kubenswrapper[4909]: I0202 12:42:13.839702 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-hpv94" event={"ID":"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c","Type":"ContainerStarted","Data":"e1111120727949ad69b2664f1188cf98fb24f94536787a0bbd47c53777250a50"} Feb 02 12:42:13 crc kubenswrapper[4909]: I0202 12:42:13.862039 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-hpv94" podStartSLOduration=2.363860276 podStartE2EDuration="2.862018095s" podCreationTimestamp="2026-02-02 12:42:11 +0000 UTC" firstStartedPulling="2026-02-02 12:42:12.798520578 +0000 UTC m=+7858.544621313" lastFinishedPulling="2026-02-02 12:42:13.296678397 +0000 UTC m=+7859.042779132" observedRunningTime="2026-02-02 12:42:13.858770533 +0000 UTC m=+7859.604871298" watchObservedRunningTime="2026-02-02 12:42:13.862018095 +0000 UTC m=+7859.608118820" Feb 02 12:42:17 crc kubenswrapper[4909]: I0202 12:42:17.017444 4909 scope.go:117] "RemoveContainer" containerID="e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d" Feb 02 12:42:17 crc kubenswrapper[4909]: E0202 12:42:17.018653 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:42:31 crc kubenswrapper[4909]: I0202 12:42:31.017464 4909 scope.go:117] "RemoveContainer" containerID="e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d" Feb 02 12:42:31 crc kubenswrapper[4909]: E0202 12:42:31.018475 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:42:42 crc kubenswrapper[4909]: I0202 12:42:42.017330 4909 scope.go:117] "RemoveContainer" containerID="e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d" Feb 02 12:42:42 crc kubenswrapper[4909]: E0202 12:42:42.018275 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:42:53 crc kubenswrapper[4909]: I0202 12:42:53.017301 4909 scope.go:117] "RemoveContainer" containerID="e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d" Feb 02 12:42:53 crc kubenswrapper[4909]: E0202 12:42:53.019171 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:43:08 crc kubenswrapper[4909]: I0202 12:43:08.016721 4909 scope.go:117] "RemoveContainer" containerID="e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d" Feb 02 12:43:08 crc kubenswrapper[4909]: E0202 12:43:08.017768 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:43:21 crc kubenswrapper[4909]: I0202 12:43:21.016615 4909 scope.go:117] "RemoveContainer" containerID="e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d" Feb 02 12:43:21 crc kubenswrapper[4909]: E0202 12:43:21.017461 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:43:36 crc kubenswrapper[4909]: I0202 12:43:36.017164 4909 scope.go:117] "RemoveContainer" containerID="e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d" Feb 02 12:43:36 crc kubenswrapper[4909]: E0202 12:43:36.018043 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:43:48 crc kubenswrapper[4909]: I0202 12:43:48.017791 4909 scope.go:117] "RemoveContainer" containerID="e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d" Feb 02 12:43:48 crc kubenswrapper[4909]: E0202 12:43:48.018682 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:43:59 crc kubenswrapper[4909]: I0202 12:43:59.017248 4909 scope.go:117] "RemoveContainer" containerID="e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d" Feb 02 12:43:59 crc kubenswrapper[4909]: E0202 12:43:59.018141 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:44:10 crc kubenswrapper[4909]: I0202 12:44:10.016691 4909 scope.go:117] "RemoveContainer" containerID="e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d" Feb 02 12:44:10 crc kubenswrapper[4909]: E0202 12:44:10.017494 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:44:23 crc kubenswrapper[4909]: I0202 12:44:23.017285 4909 scope.go:117] "RemoveContainer" containerID="e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d" Feb 02 12:44:23 crc kubenswrapper[4909]: E0202 12:44:23.032309 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:44:34 crc kubenswrapper[4909]: I0202 12:44:34.017071 4909 scope.go:117] "RemoveContainer" containerID="e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d" Feb 02 12:44:34 crc kubenswrapper[4909]: E0202 12:44:34.018346 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:44:49 crc kubenswrapper[4909]: I0202 12:44:49.017092 4909 scope.go:117] "RemoveContainer" containerID="e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d" Feb 02 12:44:49 crc kubenswrapper[4909]: E0202 12:44:49.017949 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:45:00 crc kubenswrapper[4909]: I0202 12:45:00.173603 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500605-nmwjb"] Feb 02 12:45:00 crc kubenswrapper[4909]: I0202 12:45:00.176773 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500605-nmwjb" Feb 02 12:45:00 crc kubenswrapper[4909]: I0202 12:45:00.179996 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 02 12:45:00 crc kubenswrapper[4909]: I0202 12:45:00.180692 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 02 12:45:00 crc kubenswrapper[4909]: I0202 12:45:00.184255 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500605-nmwjb"] Feb 02 12:45:00 crc kubenswrapper[4909]: I0202 12:45:00.352239 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9f30aa8-8e80-41c5-97c5-983382a8d996-secret-volume\") pod \"collect-profiles-29500605-nmwjb\" (UID: \"b9f30aa8-8e80-41c5-97c5-983382a8d996\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500605-nmwjb" Feb 02 12:45:00 crc kubenswrapper[4909]: I0202 12:45:00.352888 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/b9f30aa8-8e80-41c5-97c5-983382a8d996-config-volume\") pod \"collect-profiles-29500605-nmwjb\" (UID: \"b9f30aa8-8e80-41c5-97c5-983382a8d996\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500605-nmwjb" Feb 02 12:45:00 crc kubenswrapper[4909]: I0202 12:45:00.352929 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmmr2\" (UniqueName: \"kubernetes.io/projected/b9f30aa8-8e80-41c5-97c5-983382a8d996-kube-api-access-xmmr2\") pod \"collect-profiles-29500605-nmwjb\" (UID: \"b9f30aa8-8e80-41c5-97c5-983382a8d996\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500605-nmwjb" Feb 02 12:45:00 crc kubenswrapper[4909]: I0202 12:45:00.454630 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9f30aa8-8e80-41c5-97c5-983382a8d996-config-volume\") pod \"collect-profiles-29500605-nmwjb\" (UID: \"b9f30aa8-8e80-41c5-97c5-983382a8d996\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500605-nmwjb" Feb 02 12:45:00 crc kubenswrapper[4909]: I0202 12:45:00.454684 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmmr2\" (UniqueName: \"kubernetes.io/projected/b9f30aa8-8e80-41c5-97c5-983382a8d996-kube-api-access-xmmr2\") pod \"collect-profiles-29500605-nmwjb\" (UID: \"b9f30aa8-8e80-41c5-97c5-983382a8d996\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500605-nmwjb" Feb 02 12:45:00 crc kubenswrapper[4909]: I0202 12:45:00.454782 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9f30aa8-8e80-41c5-97c5-983382a8d996-secret-volume\") pod \"collect-profiles-29500605-nmwjb\" (UID: \"b9f30aa8-8e80-41c5-97c5-983382a8d996\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500605-nmwjb" Feb 02 12:45:00 crc 
kubenswrapper[4909]: I0202 12:45:00.455570 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9f30aa8-8e80-41c5-97c5-983382a8d996-config-volume\") pod \"collect-profiles-29500605-nmwjb\" (UID: \"b9f30aa8-8e80-41c5-97c5-983382a8d996\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500605-nmwjb" Feb 02 12:45:00 crc kubenswrapper[4909]: I0202 12:45:00.470693 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9f30aa8-8e80-41c5-97c5-983382a8d996-secret-volume\") pod \"collect-profiles-29500605-nmwjb\" (UID: \"b9f30aa8-8e80-41c5-97c5-983382a8d996\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500605-nmwjb" Feb 02 12:45:00 crc kubenswrapper[4909]: I0202 12:45:00.471005 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmmr2\" (UniqueName: \"kubernetes.io/projected/b9f30aa8-8e80-41c5-97c5-983382a8d996-kube-api-access-xmmr2\") pod \"collect-profiles-29500605-nmwjb\" (UID: \"b9f30aa8-8e80-41c5-97c5-983382a8d996\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500605-nmwjb" Feb 02 12:45:00 crc kubenswrapper[4909]: I0202 12:45:00.504193 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500605-nmwjb" Feb 02 12:45:00 crc kubenswrapper[4909]: I0202 12:45:00.944980 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500605-nmwjb"] Feb 02 12:45:01 crc kubenswrapper[4909]: I0202 12:45:01.398631 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500605-nmwjb" event={"ID":"b9f30aa8-8e80-41c5-97c5-983382a8d996","Type":"ContainerStarted","Data":"b342a1fd91e7d754e92b6175a85a6e8ddd6f66aaa529d5447ff6170e1c0f5db3"} Feb 02 12:45:01 crc kubenswrapper[4909]: I0202 12:45:01.400309 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500605-nmwjb" event={"ID":"b9f30aa8-8e80-41c5-97c5-983382a8d996","Type":"ContainerStarted","Data":"40169fc797f905ecfa31888a57d0cb1c9d67642cf77d2d3b545a0b5f1a222cc0"} Feb 02 12:45:01 crc kubenswrapper[4909]: I0202 12:45:01.417371 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29500605-nmwjb" podStartSLOduration=1.417353064 podStartE2EDuration="1.417353064s" podCreationTimestamp="2026-02-02 12:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:45:01.415966655 +0000 UTC m=+8027.162067380" watchObservedRunningTime="2026-02-02 12:45:01.417353064 +0000 UTC m=+8027.163453799" Feb 02 12:45:02 crc kubenswrapper[4909]: I0202 12:45:02.408972 4909 generic.go:334] "Generic (PLEG): container finished" podID="b9f30aa8-8e80-41c5-97c5-983382a8d996" containerID="b342a1fd91e7d754e92b6175a85a6e8ddd6f66aaa529d5447ff6170e1c0f5db3" exitCode=0 Feb 02 12:45:02 crc kubenswrapper[4909]: I0202 12:45:02.409036 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29500605-nmwjb" event={"ID":"b9f30aa8-8e80-41c5-97c5-983382a8d996","Type":"ContainerDied","Data":"b342a1fd91e7d754e92b6175a85a6e8ddd6f66aaa529d5447ff6170e1c0f5db3"} Feb 02 12:45:03 crc kubenswrapper[4909]: I0202 12:45:03.016759 4909 scope.go:117] "RemoveContainer" containerID="e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d" Feb 02 12:45:03 crc kubenswrapper[4909]: E0202 12:45:03.017373 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:45:03 crc kubenswrapper[4909]: I0202 12:45:03.811361 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500605-nmwjb" Feb 02 12:45:03 crc kubenswrapper[4909]: I0202 12:45:03.943754 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9f30aa8-8e80-41c5-97c5-983382a8d996-secret-volume\") pod \"b9f30aa8-8e80-41c5-97c5-983382a8d996\" (UID: \"b9f30aa8-8e80-41c5-97c5-983382a8d996\") " Feb 02 12:45:03 crc kubenswrapper[4909]: I0202 12:45:03.943965 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmmr2\" (UniqueName: \"kubernetes.io/projected/b9f30aa8-8e80-41c5-97c5-983382a8d996-kube-api-access-xmmr2\") pod \"b9f30aa8-8e80-41c5-97c5-983382a8d996\" (UID: \"b9f30aa8-8e80-41c5-97c5-983382a8d996\") " Feb 02 12:45:03 crc kubenswrapper[4909]: I0202 12:45:03.944022 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9f30aa8-8e80-41c5-97c5-983382a8d996-config-volume\") pod \"b9f30aa8-8e80-41c5-97c5-983382a8d996\" (UID: \"b9f30aa8-8e80-41c5-97c5-983382a8d996\") " Feb 02 12:45:03 crc kubenswrapper[4909]: I0202 12:45:03.944900 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9f30aa8-8e80-41c5-97c5-983382a8d996-config-volume" (OuterVolumeSpecName: "config-volume") pod "b9f30aa8-8e80-41c5-97c5-983382a8d996" (UID: "b9f30aa8-8e80-41c5-97c5-983382a8d996"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:45:03 crc kubenswrapper[4909]: I0202 12:45:03.951599 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9f30aa8-8e80-41c5-97c5-983382a8d996-kube-api-access-xmmr2" (OuterVolumeSpecName: "kube-api-access-xmmr2") pod "b9f30aa8-8e80-41c5-97c5-983382a8d996" (UID: "b9f30aa8-8e80-41c5-97c5-983382a8d996"). 
InnerVolumeSpecName "kube-api-access-xmmr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:45:03 crc kubenswrapper[4909]: I0202 12:45:03.951917 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9f30aa8-8e80-41c5-97c5-983382a8d996-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b9f30aa8-8e80-41c5-97c5-983382a8d996" (UID: "b9f30aa8-8e80-41c5-97c5-983382a8d996"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:45:04 crc kubenswrapper[4909]: I0202 12:45:04.048979 4909 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9f30aa8-8e80-41c5-97c5-983382a8d996-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 02 12:45:04 crc kubenswrapper[4909]: I0202 12:45:04.049242 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmmr2\" (UniqueName: \"kubernetes.io/projected/b9f30aa8-8e80-41c5-97c5-983382a8d996-kube-api-access-xmmr2\") on node \"crc\" DevicePath \"\"" Feb 02 12:45:04 crc kubenswrapper[4909]: I0202 12:45:04.049253 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9f30aa8-8e80-41c5-97c5-983382a8d996-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 12:45:04 crc kubenswrapper[4909]: I0202 12:45:04.433366 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500605-nmwjb" event={"ID":"b9f30aa8-8e80-41c5-97c5-983382a8d996","Type":"ContainerDied","Data":"40169fc797f905ecfa31888a57d0cb1c9d67642cf77d2d3b545a0b5f1a222cc0"} Feb 02 12:45:04 crc kubenswrapper[4909]: I0202 12:45:04.433426 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40169fc797f905ecfa31888a57d0cb1c9d67642cf77d2d3b545a0b5f1a222cc0" Feb 02 12:45:04 crc kubenswrapper[4909]: I0202 12:45:04.433423 4909 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500605-nmwjb" Feb 02 12:45:04 crc kubenswrapper[4909]: I0202 12:45:04.500824 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500560-vx5lq"] Feb 02 12:45:04 crc kubenswrapper[4909]: I0202 12:45:04.510480 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500560-vx5lq"] Feb 02 12:45:05 crc kubenswrapper[4909]: I0202 12:45:05.028444 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1adc3405-27a3-4632-bc44-bb039b4764f7" path="/var/lib/kubelet/pods/1adc3405-27a3-4632-bc44-bb039b4764f7/volumes" Feb 02 12:45:09 crc kubenswrapper[4909]: I0202 12:45:09.391615 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v72xv"] Feb 02 12:45:09 crc kubenswrapper[4909]: E0202 12:45:09.393757 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9f30aa8-8e80-41c5-97c5-983382a8d996" containerName="collect-profiles" Feb 02 12:45:09 crc kubenswrapper[4909]: I0202 12:45:09.393776 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9f30aa8-8e80-41c5-97c5-983382a8d996" containerName="collect-profiles" Feb 02 12:45:09 crc kubenswrapper[4909]: I0202 12:45:09.393987 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9f30aa8-8e80-41c5-97c5-983382a8d996" containerName="collect-profiles" Feb 02 12:45:09 crc kubenswrapper[4909]: I0202 12:45:09.395714 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v72xv" Feb 02 12:45:09 crc kubenswrapper[4909]: I0202 12:45:09.407210 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v72xv"] Feb 02 12:45:09 crc kubenswrapper[4909]: I0202 12:45:09.482433 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fbae060-49a5-4365-9e66-2148fd0d7db8-catalog-content\") pod \"community-operators-v72xv\" (UID: \"6fbae060-49a5-4365-9e66-2148fd0d7db8\") " pod="openshift-marketplace/community-operators-v72xv" Feb 02 12:45:09 crc kubenswrapper[4909]: I0202 12:45:09.482494 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb7ck\" (UniqueName: \"kubernetes.io/projected/6fbae060-49a5-4365-9e66-2148fd0d7db8-kube-api-access-kb7ck\") pod \"community-operators-v72xv\" (UID: \"6fbae060-49a5-4365-9e66-2148fd0d7db8\") " pod="openshift-marketplace/community-operators-v72xv" Feb 02 12:45:09 crc kubenswrapper[4909]: I0202 12:45:09.482619 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fbae060-49a5-4365-9e66-2148fd0d7db8-utilities\") pod \"community-operators-v72xv\" (UID: \"6fbae060-49a5-4365-9e66-2148fd0d7db8\") " pod="openshift-marketplace/community-operators-v72xv" Feb 02 12:45:09 crc kubenswrapper[4909]: I0202 12:45:09.585080 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fbae060-49a5-4365-9e66-2148fd0d7db8-utilities\") pod \"community-operators-v72xv\" (UID: \"6fbae060-49a5-4365-9e66-2148fd0d7db8\") " pod="openshift-marketplace/community-operators-v72xv" Feb 02 12:45:09 crc kubenswrapper[4909]: I0202 12:45:09.585218 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fbae060-49a5-4365-9e66-2148fd0d7db8-catalog-content\") pod \"community-operators-v72xv\" (UID: \"6fbae060-49a5-4365-9e66-2148fd0d7db8\") " pod="openshift-marketplace/community-operators-v72xv" Feb 02 12:45:09 crc kubenswrapper[4909]: I0202 12:45:09.585269 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb7ck\" (UniqueName: \"kubernetes.io/projected/6fbae060-49a5-4365-9e66-2148fd0d7db8-kube-api-access-kb7ck\") pod \"community-operators-v72xv\" (UID: \"6fbae060-49a5-4365-9e66-2148fd0d7db8\") " pod="openshift-marketplace/community-operators-v72xv" Feb 02 12:45:09 crc kubenswrapper[4909]: I0202 12:45:09.586172 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fbae060-49a5-4365-9e66-2148fd0d7db8-utilities\") pod \"community-operators-v72xv\" (UID: \"6fbae060-49a5-4365-9e66-2148fd0d7db8\") " pod="openshift-marketplace/community-operators-v72xv" Feb 02 12:45:09 crc kubenswrapper[4909]: I0202 12:45:09.586489 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fbae060-49a5-4365-9e66-2148fd0d7db8-catalog-content\") pod \"community-operators-v72xv\" (UID: \"6fbae060-49a5-4365-9e66-2148fd0d7db8\") " pod="openshift-marketplace/community-operators-v72xv" Feb 02 12:45:09 crc kubenswrapper[4909]: I0202 12:45:09.613708 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb7ck\" (UniqueName: \"kubernetes.io/projected/6fbae060-49a5-4365-9e66-2148fd0d7db8-kube-api-access-kb7ck\") pod \"community-operators-v72xv\" (UID: \"6fbae060-49a5-4365-9e66-2148fd0d7db8\") " pod="openshift-marketplace/community-operators-v72xv" Feb 02 12:45:09 crc kubenswrapper[4909]: I0202 12:45:09.717560 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v72xv" Feb 02 12:45:10 crc kubenswrapper[4909]: I0202 12:45:10.456029 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v72xv"] Feb 02 12:45:10 crc kubenswrapper[4909]: I0202 12:45:10.497894 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v72xv" event={"ID":"6fbae060-49a5-4365-9e66-2148fd0d7db8","Type":"ContainerStarted","Data":"ea4e8f5dd971d00d3a299c04df54e4ff1eba9b968dc4dd54c786922e7a730e15"} Feb 02 12:45:11 crc kubenswrapper[4909]: I0202 12:45:11.507365 4909 generic.go:334] "Generic (PLEG): container finished" podID="6fbae060-49a5-4365-9e66-2148fd0d7db8" containerID="d1129b5c27aa06cb39a98d6e50e355c1751a7fc7442b6266414ca959a392560f" exitCode=0 Feb 02 12:45:11 crc kubenswrapper[4909]: I0202 12:45:11.507467 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v72xv" event={"ID":"6fbae060-49a5-4365-9e66-2148fd0d7db8","Type":"ContainerDied","Data":"d1129b5c27aa06cb39a98d6e50e355c1751a7fc7442b6266414ca959a392560f"} Feb 02 12:45:12 crc kubenswrapper[4909]: I0202 12:45:12.518410 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v72xv" event={"ID":"6fbae060-49a5-4365-9e66-2148fd0d7db8","Type":"ContainerStarted","Data":"ae7bf723940205af2df4439b97adbe81394174e8cf5a46984db7c5a385735f8d"} Feb 02 12:45:13 crc kubenswrapper[4909]: I0202 12:45:13.529544 4909 generic.go:334] "Generic (PLEG): container finished" podID="6fbae060-49a5-4365-9e66-2148fd0d7db8" containerID="ae7bf723940205af2df4439b97adbe81394174e8cf5a46984db7c5a385735f8d" exitCode=0 Feb 02 12:45:13 crc kubenswrapper[4909]: I0202 12:45:13.529604 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v72xv" 
event={"ID":"6fbae060-49a5-4365-9e66-2148fd0d7db8","Type":"ContainerDied","Data":"ae7bf723940205af2df4439b97adbe81394174e8cf5a46984db7c5a385735f8d"} Feb 02 12:45:14 crc kubenswrapper[4909]: I0202 12:45:14.549104 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v72xv" event={"ID":"6fbae060-49a5-4365-9e66-2148fd0d7db8","Type":"ContainerStarted","Data":"965d162bde603766474c44c3e7ec4aaf2572c39569a4406ecae625748affcb14"} Feb 02 12:45:14 crc kubenswrapper[4909]: I0202 12:45:14.573543 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v72xv" podStartSLOduration=3.091631495 podStartE2EDuration="5.573523417s" podCreationTimestamp="2026-02-02 12:45:09 +0000 UTC" firstStartedPulling="2026-02-02 12:45:11.510262772 +0000 UTC m=+8037.256363517" lastFinishedPulling="2026-02-02 12:45:13.992154704 +0000 UTC m=+8039.738255439" observedRunningTime="2026-02-02 12:45:14.56694684 +0000 UTC m=+8040.313047585" watchObservedRunningTime="2026-02-02 12:45:14.573523417 +0000 UTC m=+8040.319624152" Feb 02 12:45:18 crc kubenswrapper[4909]: I0202 12:45:18.016788 4909 scope.go:117] "RemoveContainer" containerID="e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d" Feb 02 12:45:18 crc kubenswrapper[4909]: E0202 12:45:18.017690 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:45:19 crc kubenswrapper[4909]: I0202 12:45:19.718313 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-v72xv" Feb 02 12:45:19 crc 
kubenswrapper[4909]: I0202 12:45:19.718671 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v72xv" Feb 02 12:45:19 crc kubenswrapper[4909]: I0202 12:45:19.771581 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v72xv" Feb 02 12:45:20 crc kubenswrapper[4909]: I0202 12:45:20.644934 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v72xv" Feb 02 12:45:20 crc kubenswrapper[4909]: I0202 12:45:20.743799 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v72xv"] Feb 02 12:45:22 crc kubenswrapper[4909]: I0202 12:45:22.626902 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-v72xv" podUID="6fbae060-49a5-4365-9e66-2148fd0d7db8" containerName="registry-server" containerID="cri-o://965d162bde603766474c44c3e7ec4aaf2572c39569a4406ecae625748affcb14" gracePeriod=2 Feb 02 12:45:23 crc kubenswrapper[4909]: I0202 12:45:23.092299 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v72xv" Feb 02 12:45:23 crc kubenswrapper[4909]: I0202 12:45:23.282695 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fbae060-49a5-4365-9e66-2148fd0d7db8-utilities\") pod \"6fbae060-49a5-4365-9e66-2148fd0d7db8\" (UID: \"6fbae060-49a5-4365-9e66-2148fd0d7db8\") " Feb 02 12:45:23 crc kubenswrapper[4909]: I0202 12:45:23.283090 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb7ck\" (UniqueName: \"kubernetes.io/projected/6fbae060-49a5-4365-9e66-2148fd0d7db8-kube-api-access-kb7ck\") pod \"6fbae060-49a5-4365-9e66-2148fd0d7db8\" (UID: \"6fbae060-49a5-4365-9e66-2148fd0d7db8\") " Feb 02 12:45:23 crc kubenswrapper[4909]: I0202 12:45:23.283257 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fbae060-49a5-4365-9e66-2148fd0d7db8-catalog-content\") pod \"6fbae060-49a5-4365-9e66-2148fd0d7db8\" (UID: \"6fbae060-49a5-4365-9e66-2148fd0d7db8\") " Feb 02 12:45:23 crc kubenswrapper[4909]: I0202 12:45:23.283914 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fbae060-49a5-4365-9e66-2148fd0d7db8-utilities" (OuterVolumeSpecName: "utilities") pod "6fbae060-49a5-4365-9e66-2148fd0d7db8" (UID: "6fbae060-49a5-4365-9e66-2148fd0d7db8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:45:23 crc kubenswrapper[4909]: I0202 12:45:23.284202 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6fbae060-49a5-4365-9e66-2148fd0d7db8-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 12:45:23 crc kubenswrapper[4909]: I0202 12:45:23.290288 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fbae060-49a5-4365-9e66-2148fd0d7db8-kube-api-access-kb7ck" (OuterVolumeSpecName: "kube-api-access-kb7ck") pod "6fbae060-49a5-4365-9e66-2148fd0d7db8" (UID: "6fbae060-49a5-4365-9e66-2148fd0d7db8"). InnerVolumeSpecName "kube-api-access-kb7ck". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:45:23 crc kubenswrapper[4909]: I0202 12:45:23.339797 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fbae060-49a5-4365-9e66-2148fd0d7db8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6fbae060-49a5-4365-9e66-2148fd0d7db8" (UID: "6fbae060-49a5-4365-9e66-2148fd0d7db8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:45:23 crc kubenswrapper[4909]: I0202 12:45:23.386755 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb7ck\" (UniqueName: \"kubernetes.io/projected/6fbae060-49a5-4365-9e66-2148fd0d7db8-kube-api-access-kb7ck\") on node \"crc\" DevicePath \"\"" Feb 02 12:45:23 crc kubenswrapper[4909]: I0202 12:45:23.386792 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6fbae060-49a5-4365-9e66-2148fd0d7db8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 12:45:23 crc kubenswrapper[4909]: I0202 12:45:23.639954 4909 generic.go:334] "Generic (PLEG): container finished" podID="6fbae060-49a5-4365-9e66-2148fd0d7db8" containerID="965d162bde603766474c44c3e7ec4aaf2572c39569a4406ecae625748affcb14" exitCode=0 Feb 02 12:45:23 crc kubenswrapper[4909]: I0202 12:45:23.640005 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v72xv" event={"ID":"6fbae060-49a5-4365-9e66-2148fd0d7db8","Type":"ContainerDied","Data":"965d162bde603766474c44c3e7ec4aaf2572c39569a4406ecae625748affcb14"} Feb 02 12:45:23 crc kubenswrapper[4909]: I0202 12:45:23.640016 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v72xv" Feb 02 12:45:23 crc kubenswrapper[4909]: I0202 12:45:23.640032 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v72xv" event={"ID":"6fbae060-49a5-4365-9e66-2148fd0d7db8","Type":"ContainerDied","Data":"ea4e8f5dd971d00d3a299c04df54e4ff1eba9b968dc4dd54c786922e7a730e15"} Feb 02 12:45:23 crc kubenswrapper[4909]: I0202 12:45:23.640048 4909 scope.go:117] "RemoveContainer" containerID="965d162bde603766474c44c3e7ec4aaf2572c39569a4406ecae625748affcb14" Feb 02 12:45:23 crc kubenswrapper[4909]: I0202 12:45:23.664563 4909 scope.go:117] "RemoveContainer" containerID="ae7bf723940205af2df4439b97adbe81394174e8cf5a46984db7c5a385735f8d" Feb 02 12:45:23 crc kubenswrapper[4909]: I0202 12:45:23.678005 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v72xv"] Feb 02 12:45:23 crc kubenswrapper[4909]: I0202 12:45:23.691471 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-v72xv"] Feb 02 12:45:23 crc kubenswrapper[4909]: I0202 12:45:23.708651 4909 scope.go:117] "RemoveContainer" containerID="d1129b5c27aa06cb39a98d6e50e355c1751a7fc7442b6266414ca959a392560f" Feb 02 12:45:23 crc kubenswrapper[4909]: I0202 12:45:23.768099 4909 scope.go:117] "RemoveContainer" containerID="965d162bde603766474c44c3e7ec4aaf2572c39569a4406ecae625748affcb14" Feb 02 12:45:23 crc kubenswrapper[4909]: E0202 12:45:23.768526 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"965d162bde603766474c44c3e7ec4aaf2572c39569a4406ecae625748affcb14\": container with ID starting with 965d162bde603766474c44c3e7ec4aaf2572c39569a4406ecae625748affcb14 not found: ID does not exist" containerID="965d162bde603766474c44c3e7ec4aaf2572c39569a4406ecae625748affcb14" Feb 02 12:45:23 crc kubenswrapper[4909]: I0202 12:45:23.768563 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"965d162bde603766474c44c3e7ec4aaf2572c39569a4406ecae625748affcb14"} err="failed to get container status \"965d162bde603766474c44c3e7ec4aaf2572c39569a4406ecae625748affcb14\": rpc error: code = NotFound desc = could not find container \"965d162bde603766474c44c3e7ec4aaf2572c39569a4406ecae625748affcb14\": container with ID starting with 965d162bde603766474c44c3e7ec4aaf2572c39569a4406ecae625748affcb14 not found: ID does not exist" Feb 02 12:45:23 crc kubenswrapper[4909]: I0202 12:45:23.768589 4909 scope.go:117] "RemoveContainer" containerID="ae7bf723940205af2df4439b97adbe81394174e8cf5a46984db7c5a385735f8d" Feb 02 12:45:23 crc kubenswrapper[4909]: E0202 12:45:23.768773 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae7bf723940205af2df4439b97adbe81394174e8cf5a46984db7c5a385735f8d\": container with ID starting with ae7bf723940205af2df4439b97adbe81394174e8cf5a46984db7c5a385735f8d not found: ID does not exist" containerID="ae7bf723940205af2df4439b97adbe81394174e8cf5a46984db7c5a385735f8d" Feb 02 12:45:23 crc kubenswrapper[4909]: I0202 12:45:23.768793 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae7bf723940205af2df4439b97adbe81394174e8cf5a46984db7c5a385735f8d"} err="failed to get container status \"ae7bf723940205af2df4439b97adbe81394174e8cf5a46984db7c5a385735f8d\": rpc error: code = NotFound desc = could not find container \"ae7bf723940205af2df4439b97adbe81394174e8cf5a46984db7c5a385735f8d\": container with ID starting with ae7bf723940205af2df4439b97adbe81394174e8cf5a46984db7c5a385735f8d not found: ID does not exist" Feb 02 12:45:23 crc kubenswrapper[4909]: I0202 12:45:23.768827 4909 scope.go:117] "RemoveContainer" containerID="d1129b5c27aa06cb39a98d6e50e355c1751a7fc7442b6266414ca959a392560f" Feb 02 12:45:23 crc kubenswrapper[4909]: E0202 
12:45:23.769000 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1129b5c27aa06cb39a98d6e50e355c1751a7fc7442b6266414ca959a392560f\": container with ID starting with d1129b5c27aa06cb39a98d6e50e355c1751a7fc7442b6266414ca959a392560f not found: ID does not exist" containerID="d1129b5c27aa06cb39a98d6e50e355c1751a7fc7442b6266414ca959a392560f" Feb 02 12:45:23 crc kubenswrapper[4909]: I0202 12:45:23.769019 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1129b5c27aa06cb39a98d6e50e355c1751a7fc7442b6266414ca959a392560f"} err="failed to get container status \"d1129b5c27aa06cb39a98d6e50e355c1751a7fc7442b6266414ca959a392560f\": rpc error: code = NotFound desc = could not find container \"d1129b5c27aa06cb39a98d6e50e355c1751a7fc7442b6266414ca959a392560f\": container with ID starting with d1129b5c27aa06cb39a98d6e50e355c1751a7fc7442b6266414ca959a392560f not found: ID does not exist" Feb 02 12:45:23 crc kubenswrapper[4909]: E0202 12:45:23.866563 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fbae060_49a5_4365_9e66_2148fd0d7db8.slice/crio-ea4e8f5dd971d00d3a299c04df54e4ff1eba9b968dc4dd54c786922e7a730e15\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fbae060_49a5_4365_9e66_2148fd0d7db8.slice\": RecentStats: unable to find data in memory cache]" Feb 02 12:45:25 crc kubenswrapper[4909]: I0202 12:45:25.027935 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fbae060-49a5-4365-9e66-2148fd0d7db8" path="/var/lib/kubelet/pods/6fbae060-49a5-4365-9e66-2148fd0d7db8/volumes" Feb 02 12:45:27 crc kubenswrapper[4909]: I0202 12:45:27.469424 4909 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-x6fvm"] Feb 02 12:45:27 crc kubenswrapper[4909]: E0202 12:45:27.480143 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fbae060-49a5-4365-9e66-2148fd0d7db8" containerName="extract-utilities" Feb 02 12:45:27 crc kubenswrapper[4909]: I0202 12:45:27.480168 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fbae060-49a5-4365-9e66-2148fd0d7db8" containerName="extract-utilities" Feb 02 12:45:27 crc kubenswrapper[4909]: E0202 12:45:27.480209 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fbae060-49a5-4365-9e66-2148fd0d7db8" containerName="registry-server" Feb 02 12:45:27 crc kubenswrapper[4909]: I0202 12:45:27.480217 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fbae060-49a5-4365-9e66-2148fd0d7db8" containerName="registry-server" Feb 02 12:45:27 crc kubenswrapper[4909]: E0202 12:45:27.480235 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fbae060-49a5-4365-9e66-2148fd0d7db8" containerName="extract-content" Feb 02 12:45:27 crc kubenswrapper[4909]: I0202 12:45:27.480243 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fbae060-49a5-4365-9e66-2148fd0d7db8" containerName="extract-content" Feb 02 12:45:27 crc kubenswrapper[4909]: I0202 12:45:27.480483 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fbae060-49a5-4365-9e66-2148fd0d7db8" containerName="registry-server" Feb 02 12:45:27 crc kubenswrapper[4909]: I0202 12:45:27.483254 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x6fvm" Feb 02 12:45:27 crc kubenswrapper[4909]: I0202 12:45:27.486084 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x6fvm"] Feb 02 12:45:27 crc kubenswrapper[4909]: I0202 12:45:27.670822 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cf98484-153c-477b-a528-deb7ff9290ad-catalog-content\") pod \"certified-operators-x6fvm\" (UID: \"3cf98484-153c-477b-a528-deb7ff9290ad\") " pod="openshift-marketplace/certified-operators-x6fvm" Feb 02 12:45:27 crc kubenswrapper[4909]: I0202 12:45:27.670914 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cf98484-153c-477b-a528-deb7ff9290ad-utilities\") pod \"certified-operators-x6fvm\" (UID: \"3cf98484-153c-477b-a528-deb7ff9290ad\") " pod="openshift-marketplace/certified-operators-x6fvm" Feb 02 12:45:27 crc kubenswrapper[4909]: I0202 12:45:27.670989 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4txjk\" (UniqueName: \"kubernetes.io/projected/3cf98484-153c-477b-a528-deb7ff9290ad-kube-api-access-4txjk\") pod \"certified-operators-x6fvm\" (UID: \"3cf98484-153c-477b-a528-deb7ff9290ad\") " pod="openshift-marketplace/certified-operators-x6fvm" Feb 02 12:45:27 crc kubenswrapper[4909]: I0202 12:45:27.773650 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cf98484-153c-477b-a528-deb7ff9290ad-utilities\") pod \"certified-operators-x6fvm\" (UID: \"3cf98484-153c-477b-a528-deb7ff9290ad\") " pod="openshift-marketplace/certified-operators-x6fvm" Feb 02 12:45:27 crc kubenswrapper[4909]: I0202 12:45:27.774019 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4txjk\" (UniqueName: \"kubernetes.io/projected/3cf98484-153c-477b-a528-deb7ff9290ad-kube-api-access-4txjk\") pod \"certified-operators-x6fvm\" (UID: \"3cf98484-153c-477b-a528-deb7ff9290ad\") " pod="openshift-marketplace/certified-operators-x6fvm" Feb 02 12:45:27 crc kubenswrapper[4909]: I0202 12:45:27.774303 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cf98484-153c-477b-a528-deb7ff9290ad-catalog-content\") pod \"certified-operators-x6fvm\" (UID: \"3cf98484-153c-477b-a528-deb7ff9290ad\") " pod="openshift-marketplace/certified-operators-x6fvm" Feb 02 12:45:27 crc kubenswrapper[4909]: I0202 12:45:27.774454 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cf98484-153c-477b-a528-deb7ff9290ad-utilities\") pod \"certified-operators-x6fvm\" (UID: \"3cf98484-153c-477b-a528-deb7ff9290ad\") " pod="openshift-marketplace/certified-operators-x6fvm" Feb 02 12:45:27 crc kubenswrapper[4909]: I0202 12:45:27.775277 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cf98484-153c-477b-a528-deb7ff9290ad-catalog-content\") pod \"certified-operators-x6fvm\" (UID: \"3cf98484-153c-477b-a528-deb7ff9290ad\") " pod="openshift-marketplace/certified-operators-x6fvm" Feb 02 12:45:27 crc kubenswrapper[4909]: I0202 12:45:27.804743 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4txjk\" (UniqueName: \"kubernetes.io/projected/3cf98484-153c-477b-a528-deb7ff9290ad-kube-api-access-4txjk\") pod \"certified-operators-x6fvm\" (UID: \"3cf98484-153c-477b-a528-deb7ff9290ad\") " pod="openshift-marketplace/certified-operators-x6fvm" Feb 02 12:45:27 crc kubenswrapper[4909]: I0202 12:45:27.818379 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x6fvm" Feb 02 12:45:28 crc kubenswrapper[4909]: I0202 12:45:28.405495 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x6fvm"] Feb 02 12:45:28 crc kubenswrapper[4909]: I0202 12:45:28.691003 4909 generic.go:334] "Generic (PLEG): container finished" podID="3cf98484-153c-477b-a528-deb7ff9290ad" containerID="b5490d0879ca445ae3d25955867b4f68117e8e27bc3ccbc960ffbaae5687ad85" exitCode=0 Feb 02 12:45:28 crc kubenswrapper[4909]: I0202 12:45:28.691048 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6fvm" event={"ID":"3cf98484-153c-477b-a528-deb7ff9290ad","Type":"ContainerDied","Data":"b5490d0879ca445ae3d25955867b4f68117e8e27bc3ccbc960ffbaae5687ad85"} Feb 02 12:45:28 crc kubenswrapper[4909]: I0202 12:45:28.691079 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6fvm" event={"ID":"3cf98484-153c-477b-a528-deb7ff9290ad","Type":"ContainerStarted","Data":"b8d1f16bbdc975ec85b71fc78ad1ad73e1b9a2c43a80f87e8daffdb28c55c7c3"} Feb 02 12:45:29 crc kubenswrapper[4909]: I0202 12:45:29.022176 4909 scope.go:117] "RemoveContainer" containerID="e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d" Feb 02 12:45:29 crc kubenswrapper[4909]: I0202 12:45:29.730901 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6fvm" event={"ID":"3cf98484-153c-477b-a528-deb7ff9290ad","Type":"ContainerStarted","Data":"7d2da5b8d1ceb4bd73515c6cf6e4ce4b77ea3c31846d12bf4604e8b409318b68"} Feb 02 12:45:29 crc kubenswrapper[4909]: I0202 12:45:29.743640 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"ebf8edfb73e1f5eb8eed7f92f5322fadefb9d0e39dcda7f77c1d9a574544f0e8"} Feb 
02 12:45:31 crc kubenswrapper[4909]: I0202 12:45:31.771648 4909 generic.go:334] "Generic (PLEG): container finished" podID="3cf98484-153c-477b-a528-deb7ff9290ad" containerID="7d2da5b8d1ceb4bd73515c6cf6e4ce4b77ea3c31846d12bf4604e8b409318b68" exitCode=0 Feb 02 12:45:31 crc kubenswrapper[4909]: I0202 12:45:31.771716 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6fvm" event={"ID":"3cf98484-153c-477b-a528-deb7ff9290ad","Type":"ContainerDied","Data":"7d2da5b8d1ceb4bd73515c6cf6e4ce4b77ea3c31846d12bf4604e8b409318b68"} Feb 02 12:45:32 crc kubenswrapper[4909]: I0202 12:45:32.119158 4909 scope.go:117] "RemoveContainer" containerID="5f7f8e5dfb27c1152235d5f36d78d1bb515b4f28f5ea150c9413f7c280ba8024" Feb 02 12:45:32 crc kubenswrapper[4909]: I0202 12:45:32.782423 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6fvm" event={"ID":"3cf98484-153c-477b-a528-deb7ff9290ad","Type":"ContainerStarted","Data":"3a90bb3fe19bd15976a023e2b2906fed1cf20e2d4a50e36d4c05a9784ded2ec8"} Feb 02 12:45:32 crc kubenswrapper[4909]: I0202 12:45:32.812638 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x6fvm" podStartSLOduration=2.310808831 podStartE2EDuration="5.812613792s" podCreationTimestamp="2026-02-02 12:45:27 +0000 UTC" firstStartedPulling="2026-02-02 12:45:28.692643303 +0000 UTC m=+8054.438744028" lastFinishedPulling="2026-02-02 12:45:32.194448254 +0000 UTC m=+8057.940548989" observedRunningTime="2026-02-02 12:45:32.805775538 +0000 UTC m=+8058.551876273" watchObservedRunningTime="2026-02-02 12:45:32.812613792 +0000 UTC m=+8058.558714517" Feb 02 12:45:37 crc kubenswrapper[4909]: I0202 12:45:37.819518 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x6fvm" Feb 02 12:45:37 crc kubenswrapper[4909]: I0202 12:45:37.820266 4909 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x6fvm" Feb 02 12:45:37 crc kubenswrapper[4909]: I0202 12:45:37.872274 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x6fvm" Feb 02 12:45:37 crc kubenswrapper[4909]: I0202 12:45:37.921308 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x6fvm" Feb 02 12:45:38 crc kubenswrapper[4909]: I0202 12:45:38.116728 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x6fvm"] Feb 02 12:45:39 crc kubenswrapper[4909]: I0202 12:45:39.883774 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x6fvm" podUID="3cf98484-153c-477b-a528-deb7ff9290ad" containerName="registry-server" containerID="cri-o://3a90bb3fe19bd15976a023e2b2906fed1cf20e2d4a50e36d4c05a9784ded2ec8" gracePeriod=2 Feb 02 12:45:40 crc kubenswrapper[4909]: I0202 12:45:40.482994 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x6fvm" Feb 02 12:45:40 crc kubenswrapper[4909]: I0202 12:45:40.569142 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4txjk\" (UniqueName: \"kubernetes.io/projected/3cf98484-153c-477b-a528-deb7ff9290ad-kube-api-access-4txjk\") pod \"3cf98484-153c-477b-a528-deb7ff9290ad\" (UID: \"3cf98484-153c-477b-a528-deb7ff9290ad\") " Feb 02 12:45:40 crc kubenswrapper[4909]: I0202 12:45:40.569192 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cf98484-153c-477b-a528-deb7ff9290ad-utilities\") pod \"3cf98484-153c-477b-a528-deb7ff9290ad\" (UID: \"3cf98484-153c-477b-a528-deb7ff9290ad\") " Feb 02 12:45:40 crc kubenswrapper[4909]: I0202 12:45:40.569243 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cf98484-153c-477b-a528-deb7ff9290ad-catalog-content\") pod \"3cf98484-153c-477b-a528-deb7ff9290ad\" (UID: \"3cf98484-153c-477b-a528-deb7ff9290ad\") " Feb 02 12:45:40 crc kubenswrapper[4909]: I0202 12:45:40.570493 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cf98484-153c-477b-a528-deb7ff9290ad-utilities" (OuterVolumeSpecName: "utilities") pod "3cf98484-153c-477b-a528-deb7ff9290ad" (UID: "3cf98484-153c-477b-a528-deb7ff9290ad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:45:40 crc kubenswrapper[4909]: I0202 12:45:40.575510 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cf98484-153c-477b-a528-deb7ff9290ad-kube-api-access-4txjk" (OuterVolumeSpecName: "kube-api-access-4txjk") pod "3cf98484-153c-477b-a528-deb7ff9290ad" (UID: "3cf98484-153c-477b-a528-deb7ff9290ad"). InnerVolumeSpecName "kube-api-access-4txjk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:45:40 crc kubenswrapper[4909]: I0202 12:45:40.619589 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cf98484-153c-477b-a528-deb7ff9290ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3cf98484-153c-477b-a528-deb7ff9290ad" (UID: "3cf98484-153c-477b-a528-deb7ff9290ad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:45:40 crc kubenswrapper[4909]: I0202 12:45:40.671796 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4txjk\" (UniqueName: \"kubernetes.io/projected/3cf98484-153c-477b-a528-deb7ff9290ad-kube-api-access-4txjk\") on node \"crc\" DevicePath \"\"" Feb 02 12:45:40 crc kubenswrapper[4909]: I0202 12:45:40.672131 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cf98484-153c-477b-a528-deb7ff9290ad-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 12:45:40 crc kubenswrapper[4909]: I0202 12:45:40.672141 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cf98484-153c-477b-a528-deb7ff9290ad-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 12:45:40 crc kubenswrapper[4909]: I0202 12:45:40.894995 4909 generic.go:334] "Generic (PLEG): container finished" podID="3cf98484-153c-477b-a528-deb7ff9290ad" containerID="3a90bb3fe19bd15976a023e2b2906fed1cf20e2d4a50e36d4c05a9784ded2ec8" exitCode=0 Feb 02 12:45:40 crc kubenswrapper[4909]: I0202 12:45:40.895059 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x6fvm" event={"ID":"3cf98484-153c-477b-a528-deb7ff9290ad","Type":"ContainerDied","Data":"3a90bb3fe19bd15976a023e2b2906fed1cf20e2d4a50e36d4c05a9784ded2ec8"} Feb 02 12:45:40 crc kubenswrapper[4909]: I0202 12:45:40.895128 4909 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-x6fvm" event={"ID":"3cf98484-153c-477b-a528-deb7ff9290ad","Type":"ContainerDied","Data":"b8d1f16bbdc975ec85b71fc78ad1ad73e1b9a2c43a80f87e8daffdb28c55c7c3"} Feb 02 12:45:40 crc kubenswrapper[4909]: I0202 12:45:40.895124 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x6fvm" Feb 02 12:45:40 crc kubenswrapper[4909]: I0202 12:45:40.895142 4909 scope.go:117] "RemoveContainer" containerID="3a90bb3fe19bd15976a023e2b2906fed1cf20e2d4a50e36d4c05a9784ded2ec8" Feb 02 12:45:40 crc kubenswrapper[4909]: I0202 12:45:40.920657 4909 scope.go:117] "RemoveContainer" containerID="7d2da5b8d1ceb4bd73515c6cf6e4ce4b77ea3c31846d12bf4604e8b409318b68" Feb 02 12:45:40 crc kubenswrapper[4909]: I0202 12:45:40.935021 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x6fvm"] Feb 02 12:45:40 crc kubenswrapper[4909]: I0202 12:45:40.944784 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x6fvm"] Feb 02 12:45:40 crc kubenswrapper[4909]: I0202 12:45:40.960571 4909 scope.go:117] "RemoveContainer" containerID="b5490d0879ca445ae3d25955867b4f68117e8e27bc3ccbc960ffbaae5687ad85" Feb 02 12:45:41 crc kubenswrapper[4909]: I0202 12:45:41.013924 4909 scope.go:117] "RemoveContainer" containerID="3a90bb3fe19bd15976a023e2b2906fed1cf20e2d4a50e36d4c05a9784ded2ec8" Feb 02 12:45:41 crc kubenswrapper[4909]: E0202 12:45:41.014598 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a90bb3fe19bd15976a023e2b2906fed1cf20e2d4a50e36d4c05a9784ded2ec8\": container with ID starting with 3a90bb3fe19bd15976a023e2b2906fed1cf20e2d4a50e36d4c05a9784ded2ec8 not found: ID does not exist" containerID="3a90bb3fe19bd15976a023e2b2906fed1cf20e2d4a50e36d4c05a9784ded2ec8" Feb 02 12:45:41 crc kubenswrapper[4909]: I0202 
12:45:41.014629 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a90bb3fe19bd15976a023e2b2906fed1cf20e2d4a50e36d4c05a9784ded2ec8"} err="failed to get container status \"3a90bb3fe19bd15976a023e2b2906fed1cf20e2d4a50e36d4c05a9784ded2ec8\": rpc error: code = NotFound desc = could not find container \"3a90bb3fe19bd15976a023e2b2906fed1cf20e2d4a50e36d4c05a9784ded2ec8\": container with ID starting with 3a90bb3fe19bd15976a023e2b2906fed1cf20e2d4a50e36d4c05a9784ded2ec8 not found: ID does not exist" Feb 02 12:45:41 crc kubenswrapper[4909]: I0202 12:45:41.014667 4909 scope.go:117] "RemoveContainer" containerID="7d2da5b8d1ceb4bd73515c6cf6e4ce4b77ea3c31846d12bf4604e8b409318b68" Feb 02 12:45:41 crc kubenswrapper[4909]: E0202 12:45:41.015147 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d2da5b8d1ceb4bd73515c6cf6e4ce4b77ea3c31846d12bf4604e8b409318b68\": container with ID starting with 7d2da5b8d1ceb4bd73515c6cf6e4ce4b77ea3c31846d12bf4604e8b409318b68 not found: ID does not exist" containerID="7d2da5b8d1ceb4bd73515c6cf6e4ce4b77ea3c31846d12bf4604e8b409318b68" Feb 02 12:45:41 crc kubenswrapper[4909]: I0202 12:45:41.015212 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d2da5b8d1ceb4bd73515c6cf6e4ce4b77ea3c31846d12bf4604e8b409318b68"} err="failed to get container status \"7d2da5b8d1ceb4bd73515c6cf6e4ce4b77ea3c31846d12bf4604e8b409318b68\": rpc error: code = NotFound desc = could not find container \"7d2da5b8d1ceb4bd73515c6cf6e4ce4b77ea3c31846d12bf4604e8b409318b68\": container with ID starting with 7d2da5b8d1ceb4bd73515c6cf6e4ce4b77ea3c31846d12bf4604e8b409318b68 not found: ID does not exist" Feb 02 12:45:41 crc kubenswrapper[4909]: I0202 12:45:41.015259 4909 scope.go:117] "RemoveContainer" containerID="b5490d0879ca445ae3d25955867b4f68117e8e27bc3ccbc960ffbaae5687ad85" Feb 02 12:45:41 crc 
kubenswrapper[4909]: E0202 12:45:41.016891 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5490d0879ca445ae3d25955867b4f68117e8e27bc3ccbc960ffbaae5687ad85\": container with ID starting with b5490d0879ca445ae3d25955867b4f68117e8e27bc3ccbc960ffbaae5687ad85 not found: ID does not exist" containerID="b5490d0879ca445ae3d25955867b4f68117e8e27bc3ccbc960ffbaae5687ad85" Feb 02 12:45:41 crc kubenswrapper[4909]: I0202 12:45:41.016949 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5490d0879ca445ae3d25955867b4f68117e8e27bc3ccbc960ffbaae5687ad85"} err="failed to get container status \"b5490d0879ca445ae3d25955867b4f68117e8e27bc3ccbc960ffbaae5687ad85\": rpc error: code = NotFound desc = could not find container \"b5490d0879ca445ae3d25955867b4f68117e8e27bc3ccbc960ffbaae5687ad85\": container with ID starting with b5490d0879ca445ae3d25955867b4f68117e8e27bc3ccbc960ffbaae5687ad85 not found: ID does not exist" Feb 02 12:45:41 crc kubenswrapper[4909]: I0202 12:45:41.030401 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cf98484-153c-477b-a528-deb7ff9290ad" path="/var/lib/kubelet/pods/3cf98484-153c-477b-a528-deb7ff9290ad/volumes" Feb 02 12:46:38 crc kubenswrapper[4909]: I0202 12:46:38.399833 4909 generic.go:334] "Generic (PLEG): container finished" podID="fda9b4fc-6fb1-46c0-8bc4-90b443735e5c" containerID="e1111120727949ad69b2664f1188cf98fb24f94536787a0bbd47c53777250a50" exitCode=0 Feb 02 12:46:38 crc kubenswrapper[4909]: I0202 12:46:38.399912 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-hpv94" event={"ID":"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c","Type":"ContainerDied","Data":"e1111120727949ad69b2664f1188cf98fb24f94536787a0bbd47c53777250a50"} Feb 02 12:46:39 crc kubenswrapper[4909]: I0202 12:46:39.923891 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-hpv94" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.037743 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-inventory\") pod \"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c\" (UID: \"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c\") " Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.037802 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-libvirt-combined-ca-bundle\") pod \"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c\" (UID: \"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c\") " Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.037878 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-libvirt-secret-0\") pod \"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c\" (UID: \"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c\") " Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.037946 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-ssh-key-openstack-cell1\") pod \"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c\" (UID: \"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c\") " Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.037981 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb489\" (UniqueName: \"kubernetes.io/projected/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-kube-api-access-sb489\") pod \"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c\" (UID: \"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c\") " Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.043791 4909 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "fda9b4fc-6fb1-46c0-8bc4-90b443735e5c" (UID: "fda9b4fc-6fb1-46c0-8bc4-90b443735e5c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.044385 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-kube-api-access-sb489" (OuterVolumeSpecName: "kube-api-access-sb489") pod "fda9b4fc-6fb1-46c0-8bc4-90b443735e5c" (UID: "fda9b4fc-6fb1-46c0-8bc4-90b443735e5c"). InnerVolumeSpecName "kube-api-access-sb489". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:46:40 crc kubenswrapper[4909]: E0202 12:46:40.068954 4909 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-inventory podName:fda9b4fc-6fb1-46c0-8bc4-90b443735e5c nodeName:}" failed. No retries permitted until 2026-02-02 12:46:40.568929175 +0000 UTC m=+8126.315029910 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-inventory") pod "fda9b4fc-6fb1-46c0-8bc4-90b443735e5c" (UID: "fda9b4fc-6fb1-46c0-8bc4-90b443735e5c") : error deleting /var/lib/kubelet/pods/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c/volume-subpaths: remove /var/lib/kubelet/pods/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c/volume-subpaths: no such file or directory Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.072218 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "fda9b4fc-6fb1-46c0-8bc4-90b443735e5c" (UID: "fda9b4fc-6fb1-46c0-8bc4-90b443735e5c"). 
InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.072862 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "fda9b4fc-6fb1-46c0-8bc4-90b443735e5c" (UID: "fda9b4fc-6fb1-46c0-8bc4-90b443735e5c"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.143538 4909 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.143593 4909 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.143606 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.143618 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb489\" (UniqueName: \"kubernetes.io/projected/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-kube-api-access-sb489\") on node \"crc\" DevicePath \"\"" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.420604 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-hpv94" 
event={"ID":"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c","Type":"ContainerDied","Data":"841fc09cc4e0b2e00a77e885df092056b05e0f369f5e512dceacc53e854b7e4f"} Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.420988 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="841fc09cc4e0b2e00a77e885df092056b05e0f369f5e512dceacc53e854b7e4f" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.420647 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-hpv94" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.512138 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-sjz6r"] Feb 02 12:46:40 crc kubenswrapper[4909]: E0202 12:46:40.512560 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cf98484-153c-477b-a528-deb7ff9290ad" containerName="extract-utilities" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.512578 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cf98484-153c-477b-a528-deb7ff9290ad" containerName="extract-utilities" Feb 02 12:46:40 crc kubenswrapper[4909]: E0202 12:46:40.512595 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cf98484-153c-477b-a528-deb7ff9290ad" containerName="extract-content" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.512602 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cf98484-153c-477b-a528-deb7ff9290ad" containerName="extract-content" Feb 02 12:46:40 crc kubenswrapper[4909]: E0202 12:46:40.512621 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cf98484-153c-477b-a528-deb7ff9290ad" containerName="registry-server" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.512627 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cf98484-153c-477b-a528-deb7ff9290ad" containerName="registry-server" Feb 02 12:46:40 crc kubenswrapper[4909]: E0202 
12:46:40.512645 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fda9b4fc-6fb1-46c0-8bc4-90b443735e5c" containerName="libvirt-openstack-openstack-cell1" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.512651 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="fda9b4fc-6fb1-46c0-8bc4-90b443735e5c" containerName="libvirt-openstack-openstack-cell1" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.512861 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="fda9b4fc-6fb1-46c0-8bc4-90b443735e5c" containerName="libvirt-openstack-openstack-cell1" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.512874 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cf98484-153c-477b-a528-deb7ff9290ad" containerName="registry-server" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.513607 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.517571 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.517823 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.517936 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.532835 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-sjz6r"] Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.555444 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-migration-ssh-key-1\") pod 
\"nova-cell1-openstack-openstack-cell1-sjz6r\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.555506 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-sjz6r\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.555558 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-sjz6r\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.555591 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-sjz6r\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.555683 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-sjz6r\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:46:40 crc 
kubenswrapper[4909]: I0202 12:46:40.555723 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-inventory\") pod \"nova-cell1-openstack-openstack-cell1-sjz6r\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.555756 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-sjz6r\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.555899 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcq7m\" (UniqueName: \"kubernetes.io/projected/900e031b-9fe3-4b77-b7e1-91bfde689d44-kube-api-access-gcq7m\") pod \"nova-cell1-openstack-openstack-cell1-sjz6r\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.555969 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-sjz6r\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.657317 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-inventory\") pod \"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c\" (UID: \"fda9b4fc-6fb1-46c0-8bc4-90b443735e5c\") " Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.657651 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-sjz6r\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.657709 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-inventory\") pod \"nova-cell1-openstack-openstack-cell1-sjz6r\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.657732 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-sjz6r\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.657882 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcq7m\" (UniqueName: \"kubernetes.io/projected/900e031b-9fe3-4b77-b7e1-91bfde689d44-kube-api-access-gcq7m\") pod \"nova-cell1-openstack-openstack-cell1-sjz6r\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.657951 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-sjz6r\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.658557 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-sjz6r\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.658605 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-sjz6r\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.658643 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-sjz6r\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.658682 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-sjz6r\" (UID: 
\"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.658795 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-sjz6r\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.662571 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-inventory" (OuterVolumeSpecName: "inventory") pod "fda9b4fc-6fb1-46c0-8bc4-90b443735e5c" (UID: "fda9b4fc-6fb1-46c0-8bc4-90b443735e5c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.662662 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-sjz6r\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.663875 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-sjz6r\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.664036 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-sjz6r\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.664489 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-sjz6r\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.665729 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-sjz6r\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.667250 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-inventory\") pod \"nova-cell1-openstack-openstack-cell1-sjz6r\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.668335 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-sjz6r\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.676124 
4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcq7m\" (UniqueName: \"kubernetes.io/projected/900e031b-9fe3-4b77-b7e1-91bfde689d44-kube-api-access-gcq7m\") pod \"nova-cell1-openstack-openstack-cell1-sjz6r\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.760305 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fda9b4fc-6fb1-46c0-8bc4-90b443735e5c-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 12:46:40 crc kubenswrapper[4909]: I0202 12:46:40.853397 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:46:41 crc kubenswrapper[4909]: I0202 12:46:41.366068 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-sjz6r"] Feb 02 12:46:41 crc kubenswrapper[4909]: I0202 12:46:41.369108 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 12:46:41 crc kubenswrapper[4909]: I0202 12:46:41.433094 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" event={"ID":"900e031b-9fe3-4b77-b7e1-91bfde689d44","Type":"ContainerStarted","Data":"d6088124af1f69f6e061a9914110085eda9bd640e68cdd923a56ba72b3a60ec7"} Feb 02 12:46:42 crc kubenswrapper[4909]: I0202 12:46:42.443605 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" event={"ID":"900e031b-9fe3-4b77-b7e1-91bfde689d44","Type":"ContainerStarted","Data":"789a523905adc2e8d603249baf44f44d85db41abc2525cbc22e3a4ce31f7eff8"} Feb 02 12:46:42 crc kubenswrapper[4909]: I0202 12:46:42.465341 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" podStartSLOduration=2.009306517 podStartE2EDuration="2.465306689s" podCreationTimestamp="2026-02-02 12:46:40 +0000 UTC" firstStartedPulling="2026-02-02 12:46:41.368734203 +0000 UTC m=+8127.114834938" lastFinishedPulling="2026-02-02 12:46:41.824734375 +0000 UTC m=+8127.570835110" observedRunningTime="2026-02-02 12:46:42.460173264 +0000 UTC m=+8128.206273999" watchObservedRunningTime="2026-02-02 12:46:42.465306689 +0000 UTC m=+8128.211407424" Feb 02 12:47:49 crc kubenswrapper[4909]: I0202 12:47:49.511133 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:47:49 crc kubenswrapper[4909]: I0202 12:47:49.511943 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:48:19 crc kubenswrapper[4909]: I0202 12:48:19.511747 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:48:19 crc kubenswrapper[4909]: I0202 12:48:19.512417 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
Feb 02 12:48:47 crc kubenswrapper[4909]: I0202 12:48:47.240059 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wtx26"] Feb 02 12:48:47 crc kubenswrapper[4909]: I0202 12:48:47.242692 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wtx26" Feb 02 12:48:47 crc kubenswrapper[4909]: I0202 12:48:47.260626 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wtx26"] Feb 02 12:48:47 crc kubenswrapper[4909]: I0202 12:48:47.308411 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgcgx\" (UniqueName: \"kubernetes.io/projected/43adb095-3607-401c-a4ae-dcf3c0d1fcc4-kube-api-access-wgcgx\") pod \"redhat-marketplace-wtx26\" (UID: \"43adb095-3607-401c-a4ae-dcf3c0d1fcc4\") " pod="openshift-marketplace/redhat-marketplace-wtx26" Feb 02 12:48:47 crc kubenswrapper[4909]: I0202 12:48:47.308749 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43adb095-3607-401c-a4ae-dcf3c0d1fcc4-utilities\") pod \"redhat-marketplace-wtx26\" (UID: \"43adb095-3607-401c-a4ae-dcf3c0d1fcc4\") " pod="openshift-marketplace/redhat-marketplace-wtx26" Feb 02 12:48:47 crc kubenswrapper[4909]: I0202 12:48:47.308845 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43adb095-3607-401c-a4ae-dcf3c0d1fcc4-catalog-content\") pod \"redhat-marketplace-wtx26\" (UID: \"43adb095-3607-401c-a4ae-dcf3c0d1fcc4\") " pod="openshift-marketplace/redhat-marketplace-wtx26" Feb 02 12:48:47 crc kubenswrapper[4909]: I0202 12:48:47.411701 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgcgx\" (UniqueName: 
\"kubernetes.io/projected/43adb095-3607-401c-a4ae-dcf3c0d1fcc4-kube-api-access-wgcgx\") pod \"redhat-marketplace-wtx26\" (UID: \"43adb095-3607-401c-a4ae-dcf3c0d1fcc4\") " pod="openshift-marketplace/redhat-marketplace-wtx26" Feb 02 12:48:47 crc kubenswrapper[4909]: I0202 12:48:47.411891 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43adb095-3607-401c-a4ae-dcf3c0d1fcc4-utilities\") pod \"redhat-marketplace-wtx26\" (UID: \"43adb095-3607-401c-a4ae-dcf3c0d1fcc4\") " pod="openshift-marketplace/redhat-marketplace-wtx26" Feb 02 12:48:47 crc kubenswrapper[4909]: I0202 12:48:47.411930 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43adb095-3607-401c-a4ae-dcf3c0d1fcc4-catalog-content\") pod \"redhat-marketplace-wtx26\" (UID: \"43adb095-3607-401c-a4ae-dcf3c0d1fcc4\") " pod="openshift-marketplace/redhat-marketplace-wtx26" Feb 02 12:48:47 crc kubenswrapper[4909]: I0202 12:48:47.412508 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43adb095-3607-401c-a4ae-dcf3c0d1fcc4-catalog-content\") pod \"redhat-marketplace-wtx26\" (UID: \"43adb095-3607-401c-a4ae-dcf3c0d1fcc4\") " pod="openshift-marketplace/redhat-marketplace-wtx26" Feb 02 12:48:47 crc kubenswrapper[4909]: I0202 12:48:47.413138 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43adb095-3607-401c-a4ae-dcf3c0d1fcc4-utilities\") pod \"redhat-marketplace-wtx26\" (UID: \"43adb095-3607-401c-a4ae-dcf3c0d1fcc4\") " pod="openshift-marketplace/redhat-marketplace-wtx26" Feb 02 12:48:47 crc kubenswrapper[4909]: I0202 12:48:47.432259 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgcgx\" (UniqueName: 
\"kubernetes.io/projected/43adb095-3607-401c-a4ae-dcf3c0d1fcc4-kube-api-access-wgcgx\") pod \"redhat-marketplace-wtx26\" (UID: \"43adb095-3607-401c-a4ae-dcf3c0d1fcc4\") " pod="openshift-marketplace/redhat-marketplace-wtx26" Feb 02 12:48:47 crc kubenswrapper[4909]: I0202 12:48:47.569004 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wtx26" Feb 02 12:48:48 crc kubenswrapper[4909]: I0202 12:48:48.101332 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wtx26"] Feb 02 12:48:48 crc kubenswrapper[4909]: W0202 12:48:48.109919 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43adb095_3607_401c_a4ae_dcf3c0d1fcc4.slice/crio-ce490397736f8373b8070828786152c1ddfb2e05a9928ae5517b5a3a9f87b6ef WatchSource:0}: Error finding container ce490397736f8373b8070828786152c1ddfb2e05a9928ae5517b5a3a9f87b6ef: Status 404 returned error can't find the container with id ce490397736f8373b8070828786152c1ddfb2e05a9928ae5517b5a3a9f87b6ef Feb 02 12:48:48 crc kubenswrapper[4909]: I0202 12:48:48.687506 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wtx26" event={"ID":"43adb095-3607-401c-a4ae-dcf3c0d1fcc4","Type":"ContainerDied","Data":"532d16f1da32a7915e272ffb8a0d8aa8eeac78c14e3440198e7e423af0f9e98c"} Feb 02 12:48:48 crc kubenswrapper[4909]: I0202 12:48:48.687414 4909 generic.go:334] "Generic (PLEG): container finished" podID="43adb095-3607-401c-a4ae-dcf3c0d1fcc4" containerID="532d16f1da32a7915e272ffb8a0d8aa8eeac78c14e3440198e7e423af0f9e98c" exitCode=0 Feb 02 12:48:48 crc kubenswrapper[4909]: I0202 12:48:48.692968 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wtx26" 
event={"ID":"43adb095-3607-401c-a4ae-dcf3c0d1fcc4","Type":"ContainerStarted","Data":"ce490397736f8373b8070828786152c1ddfb2e05a9928ae5517b5a3a9f87b6ef"} Feb 02 12:48:49 crc kubenswrapper[4909]: I0202 12:48:49.511095 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:48:49 crc kubenswrapper[4909]: I0202 12:48:49.511572 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:48:49 crc kubenswrapper[4909]: I0202 12:48:49.511697 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 12:48:49 crc kubenswrapper[4909]: I0202 12:48:49.512483 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ebf8edfb73e1f5eb8eed7f92f5322fadefb9d0e39dcda7f77c1d9a574544f0e8"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 12:48:49 crc kubenswrapper[4909]: I0202 12:48:49.512608 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://ebf8edfb73e1f5eb8eed7f92f5322fadefb9d0e39dcda7f77c1d9a574544f0e8" gracePeriod=600 Feb 02 12:48:49 crc kubenswrapper[4909]: I0202 12:48:49.706684 
4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wtx26" event={"ID":"43adb095-3607-401c-a4ae-dcf3c0d1fcc4","Type":"ContainerStarted","Data":"c16234065af76624fff9718b0c3201b04ac8a676dd4b8120c71c9c437867da9d"} Feb 02 12:48:49 crc kubenswrapper[4909]: I0202 12:48:49.710369 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="ebf8edfb73e1f5eb8eed7f92f5322fadefb9d0e39dcda7f77c1d9a574544f0e8" exitCode=0 Feb 02 12:48:49 crc kubenswrapper[4909]: I0202 12:48:49.710414 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"ebf8edfb73e1f5eb8eed7f92f5322fadefb9d0e39dcda7f77c1d9a574544f0e8"} Feb 02 12:48:49 crc kubenswrapper[4909]: I0202 12:48:49.710446 4909 scope.go:117] "RemoveContainer" containerID="e730e1874a2f1e1761d2d0c1e4b37e7b807b7e4c8960067c1b766aa7b537ed2d" Feb 02 12:48:50 crc kubenswrapper[4909]: I0202 12:48:50.720927 4909 generic.go:334] "Generic (PLEG): container finished" podID="43adb095-3607-401c-a4ae-dcf3c0d1fcc4" containerID="c16234065af76624fff9718b0c3201b04ac8a676dd4b8120c71c9c437867da9d" exitCode=0 Feb 02 12:48:50 crc kubenswrapper[4909]: I0202 12:48:50.721034 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wtx26" event={"ID":"43adb095-3607-401c-a4ae-dcf3c0d1fcc4","Type":"ContainerDied","Data":"c16234065af76624fff9718b0c3201b04ac8a676dd4b8120c71c9c437867da9d"} Feb 02 12:48:50 crc kubenswrapper[4909]: I0202 12:48:50.725344 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b"} Feb 02 12:48:51 crc kubenswrapper[4909]: I0202 
12:48:51.760316 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wtx26" event={"ID":"43adb095-3607-401c-a4ae-dcf3c0d1fcc4","Type":"ContainerStarted","Data":"a5261d56bdd72acca668d7d8b737a9c44859045305ed8fd8d9716bcaa5e8a609"} Feb 02 12:48:51 crc kubenswrapper[4909]: I0202 12:48:51.810897 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wtx26" podStartSLOduration=2.244886238 podStartE2EDuration="4.810871509s" podCreationTimestamp="2026-02-02 12:48:47 +0000 UTC" firstStartedPulling="2026-02-02 12:48:48.689844987 +0000 UTC m=+8254.435945722" lastFinishedPulling="2026-02-02 12:48:51.255830258 +0000 UTC m=+8257.001930993" observedRunningTime="2026-02-02 12:48:51.796851461 +0000 UTC m=+8257.542952196" watchObservedRunningTime="2026-02-02 12:48:51.810871509 +0000 UTC m=+8257.556972244" Feb 02 12:48:57 crc kubenswrapper[4909]: I0202 12:48:57.569881 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wtx26" Feb 02 12:48:57 crc kubenswrapper[4909]: I0202 12:48:57.570385 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wtx26" Feb 02 12:48:57 crc kubenswrapper[4909]: I0202 12:48:57.622499 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wtx26" Feb 02 12:48:57 crc kubenswrapper[4909]: I0202 12:48:57.863231 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wtx26" Feb 02 12:48:57 crc kubenswrapper[4909]: I0202 12:48:57.911664 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wtx26"] Feb 02 12:48:59 crc kubenswrapper[4909]: I0202 12:48:59.838600 4909 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-wtx26" podUID="43adb095-3607-401c-a4ae-dcf3c0d1fcc4" containerName="registry-server" containerID="cri-o://a5261d56bdd72acca668d7d8b737a9c44859045305ed8fd8d9716bcaa5e8a609" gracePeriod=2 Feb 02 12:49:00 crc kubenswrapper[4909]: I0202 12:49:00.322412 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wtx26" Feb 02 12:49:00 crc kubenswrapper[4909]: I0202 12:49:00.431584 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43adb095-3607-401c-a4ae-dcf3c0d1fcc4-utilities\") pod \"43adb095-3607-401c-a4ae-dcf3c0d1fcc4\" (UID: \"43adb095-3607-401c-a4ae-dcf3c0d1fcc4\") " Feb 02 12:49:00 crc kubenswrapper[4909]: I0202 12:49:00.431683 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43adb095-3607-401c-a4ae-dcf3c0d1fcc4-catalog-content\") pod \"43adb095-3607-401c-a4ae-dcf3c0d1fcc4\" (UID: \"43adb095-3607-401c-a4ae-dcf3c0d1fcc4\") " Feb 02 12:49:00 crc kubenswrapper[4909]: I0202 12:49:00.431970 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgcgx\" (UniqueName: \"kubernetes.io/projected/43adb095-3607-401c-a4ae-dcf3c0d1fcc4-kube-api-access-wgcgx\") pod \"43adb095-3607-401c-a4ae-dcf3c0d1fcc4\" (UID: \"43adb095-3607-401c-a4ae-dcf3c0d1fcc4\") " Feb 02 12:49:00 crc kubenswrapper[4909]: I0202 12:49:00.432859 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43adb095-3607-401c-a4ae-dcf3c0d1fcc4-utilities" (OuterVolumeSpecName: "utilities") pod "43adb095-3607-401c-a4ae-dcf3c0d1fcc4" (UID: "43adb095-3607-401c-a4ae-dcf3c0d1fcc4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:49:00 crc kubenswrapper[4909]: I0202 12:49:00.437206 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43adb095-3607-401c-a4ae-dcf3c0d1fcc4-kube-api-access-wgcgx" (OuterVolumeSpecName: "kube-api-access-wgcgx") pod "43adb095-3607-401c-a4ae-dcf3c0d1fcc4" (UID: "43adb095-3607-401c-a4ae-dcf3c0d1fcc4"). InnerVolumeSpecName "kube-api-access-wgcgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:49:00 crc kubenswrapper[4909]: I0202 12:49:00.454067 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43adb095-3607-401c-a4ae-dcf3c0d1fcc4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "43adb095-3607-401c-a4ae-dcf3c0d1fcc4" (UID: "43adb095-3607-401c-a4ae-dcf3c0d1fcc4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:49:00 crc kubenswrapper[4909]: I0202 12:49:00.534501 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgcgx\" (UniqueName: \"kubernetes.io/projected/43adb095-3607-401c-a4ae-dcf3c0d1fcc4-kube-api-access-wgcgx\") on node \"crc\" DevicePath \"\"" Feb 02 12:49:00 crc kubenswrapper[4909]: I0202 12:49:00.534546 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43adb095-3607-401c-a4ae-dcf3c0d1fcc4-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 12:49:00 crc kubenswrapper[4909]: I0202 12:49:00.534558 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43adb095-3607-401c-a4ae-dcf3c0d1fcc4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 12:49:00 crc kubenswrapper[4909]: I0202 12:49:00.869963 4909 generic.go:334] "Generic (PLEG): container finished" podID="43adb095-3607-401c-a4ae-dcf3c0d1fcc4" 
containerID="a5261d56bdd72acca668d7d8b737a9c44859045305ed8fd8d9716bcaa5e8a609" exitCode=0 Feb 02 12:49:00 crc kubenswrapper[4909]: I0202 12:49:00.870068 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wtx26" Feb 02 12:49:00 crc kubenswrapper[4909]: I0202 12:49:00.870068 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wtx26" event={"ID":"43adb095-3607-401c-a4ae-dcf3c0d1fcc4","Type":"ContainerDied","Data":"a5261d56bdd72acca668d7d8b737a9c44859045305ed8fd8d9716bcaa5e8a609"} Feb 02 12:49:00 crc kubenswrapper[4909]: I0202 12:49:00.871907 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wtx26" event={"ID":"43adb095-3607-401c-a4ae-dcf3c0d1fcc4","Type":"ContainerDied","Data":"ce490397736f8373b8070828786152c1ddfb2e05a9928ae5517b5a3a9f87b6ef"} Feb 02 12:49:00 crc kubenswrapper[4909]: I0202 12:49:00.871967 4909 scope.go:117] "RemoveContainer" containerID="a5261d56bdd72acca668d7d8b737a9c44859045305ed8fd8d9716bcaa5e8a609" Feb 02 12:49:00 crc kubenswrapper[4909]: I0202 12:49:00.896901 4909 scope.go:117] "RemoveContainer" containerID="c16234065af76624fff9718b0c3201b04ac8a676dd4b8120c71c9c437867da9d" Feb 02 12:49:00 crc kubenswrapper[4909]: I0202 12:49:00.910759 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wtx26"] Feb 02 12:49:00 crc kubenswrapper[4909]: I0202 12:49:00.919237 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wtx26"] Feb 02 12:49:00 crc kubenswrapper[4909]: I0202 12:49:00.921636 4909 scope.go:117] "RemoveContainer" containerID="532d16f1da32a7915e272ffb8a0d8aa8eeac78c14e3440198e7e423af0f9e98c" Feb 02 12:49:00 crc kubenswrapper[4909]: I0202 12:49:00.975874 4909 scope.go:117] "RemoveContainer" containerID="a5261d56bdd72acca668d7d8b737a9c44859045305ed8fd8d9716bcaa5e8a609" Feb 02 
12:49:00 crc kubenswrapper[4909]: E0202 12:49:00.976335 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5261d56bdd72acca668d7d8b737a9c44859045305ed8fd8d9716bcaa5e8a609\": container with ID starting with a5261d56bdd72acca668d7d8b737a9c44859045305ed8fd8d9716bcaa5e8a609 not found: ID does not exist" containerID="a5261d56bdd72acca668d7d8b737a9c44859045305ed8fd8d9716bcaa5e8a609" Feb 02 12:49:00 crc kubenswrapper[4909]: I0202 12:49:00.976383 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5261d56bdd72acca668d7d8b737a9c44859045305ed8fd8d9716bcaa5e8a609"} err="failed to get container status \"a5261d56bdd72acca668d7d8b737a9c44859045305ed8fd8d9716bcaa5e8a609\": rpc error: code = NotFound desc = could not find container \"a5261d56bdd72acca668d7d8b737a9c44859045305ed8fd8d9716bcaa5e8a609\": container with ID starting with a5261d56bdd72acca668d7d8b737a9c44859045305ed8fd8d9716bcaa5e8a609 not found: ID does not exist" Feb 02 12:49:00 crc kubenswrapper[4909]: I0202 12:49:00.976412 4909 scope.go:117] "RemoveContainer" containerID="c16234065af76624fff9718b0c3201b04ac8a676dd4b8120c71c9c437867da9d" Feb 02 12:49:00 crc kubenswrapper[4909]: E0202 12:49:00.976714 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c16234065af76624fff9718b0c3201b04ac8a676dd4b8120c71c9c437867da9d\": container with ID starting with c16234065af76624fff9718b0c3201b04ac8a676dd4b8120c71c9c437867da9d not found: ID does not exist" containerID="c16234065af76624fff9718b0c3201b04ac8a676dd4b8120c71c9c437867da9d" Feb 02 12:49:00 crc kubenswrapper[4909]: I0202 12:49:00.976903 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c16234065af76624fff9718b0c3201b04ac8a676dd4b8120c71c9c437867da9d"} err="failed to get container status 
\"c16234065af76624fff9718b0c3201b04ac8a676dd4b8120c71c9c437867da9d\": rpc error: code = NotFound desc = could not find container \"c16234065af76624fff9718b0c3201b04ac8a676dd4b8120c71c9c437867da9d\": container with ID starting with c16234065af76624fff9718b0c3201b04ac8a676dd4b8120c71c9c437867da9d not found: ID does not exist" Feb 02 12:49:00 crc kubenswrapper[4909]: I0202 12:49:00.976945 4909 scope.go:117] "RemoveContainer" containerID="532d16f1da32a7915e272ffb8a0d8aa8eeac78c14e3440198e7e423af0f9e98c" Feb 02 12:49:00 crc kubenswrapper[4909]: E0202 12:49:00.977307 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"532d16f1da32a7915e272ffb8a0d8aa8eeac78c14e3440198e7e423af0f9e98c\": container with ID starting with 532d16f1da32a7915e272ffb8a0d8aa8eeac78c14e3440198e7e423af0f9e98c not found: ID does not exist" containerID="532d16f1da32a7915e272ffb8a0d8aa8eeac78c14e3440198e7e423af0f9e98c" Feb 02 12:49:00 crc kubenswrapper[4909]: I0202 12:49:00.977336 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"532d16f1da32a7915e272ffb8a0d8aa8eeac78c14e3440198e7e423af0f9e98c"} err="failed to get container status \"532d16f1da32a7915e272ffb8a0d8aa8eeac78c14e3440198e7e423af0f9e98c\": rpc error: code = NotFound desc = could not find container \"532d16f1da32a7915e272ffb8a0d8aa8eeac78c14e3440198e7e423af0f9e98c\": container with ID starting with 532d16f1da32a7915e272ffb8a0d8aa8eeac78c14e3440198e7e423af0f9e98c not found: ID does not exist" Feb 02 12:49:01 crc kubenswrapper[4909]: I0202 12:49:01.030029 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43adb095-3607-401c-a4ae-dcf3c0d1fcc4" path="/var/lib/kubelet/pods/43adb095-3607-401c-a4ae-dcf3c0d1fcc4/volumes" Feb 02 12:49:22 crc kubenswrapper[4909]: I0202 12:49:22.050221 4909 generic.go:334] "Generic (PLEG): container finished" podID="900e031b-9fe3-4b77-b7e1-91bfde689d44" 
containerID="789a523905adc2e8d603249baf44f44d85db41abc2525cbc22e3a4ce31f7eff8" exitCode=0 Feb 02 12:49:22 crc kubenswrapper[4909]: I0202 12:49:22.050306 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" event={"ID":"900e031b-9fe3-4b77-b7e1-91bfde689d44","Type":"ContainerDied","Data":"789a523905adc2e8d603249baf44f44d85db41abc2525cbc22e3a4ce31f7eff8"} Feb 02 12:49:23 crc kubenswrapper[4909]: I0202 12:49:23.540103 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:49:23 crc kubenswrapper[4909]: I0202 12:49:23.654079 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-migration-ssh-key-1\") pod \"900e031b-9fe3-4b77-b7e1-91bfde689d44\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " Feb 02 12:49:23 crc kubenswrapper[4909]: I0202 12:49:23.654172 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-ssh-key-openstack-cell1\") pod \"900e031b-9fe3-4b77-b7e1-91bfde689d44\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " Feb 02 12:49:23 crc kubenswrapper[4909]: I0202 12:49:23.654202 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-migration-ssh-key-0\") pod \"900e031b-9fe3-4b77-b7e1-91bfde689d44\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " Feb 02 12:49:23 crc kubenswrapper[4909]: I0202 12:49:23.654259 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: 
\"kubernetes.io/configmap/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-cells-global-config-0\") pod \"900e031b-9fe3-4b77-b7e1-91bfde689d44\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " Feb 02 12:49:23 crc kubenswrapper[4909]: I0202 12:49:23.654334 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-cell1-compute-config-0\") pod \"900e031b-9fe3-4b77-b7e1-91bfde689d44\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " Feb 02 12:49:23 crc kubenswrapper[4909]: I0202 12:49:23.654366 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-inventory\") pod \"900e031b-9fe3-4b77-b7e1-91bfde689d44\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " Feb 02 12:49:23 crc kubenswrapper[4909]: I0202 12:49:23.654388 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-cell1-combined-ca-bundle\") pod \"900e031b-9fe3-4b77-b7e1-91bfde689d44\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " Feb 02 12:49:23 crc kubenswrapper[4909]: I0202 12:49:23.654410 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcq7m\" (UniqueName: \"kubernetes.io/projected/900e031b-9fe3-4b77-b7e1-91bfde689d44-kube-api-access-gcq7m\") pod \"900e031b-9fe3-4b77-b7e1-91bfde689d44\" (UID: \"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " Feb 02 12:49:23 crc kubenswrapper[4909]: I0202 12:49:23.654472 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-cell1-compute-config-1\") pod \"900e031b-9fe3-4b77-b7e1-91bfde689d44\" (UID: 
\"900e031b-9fe3-4b77-b7e1-91bfde689d44\") " Feb 02 12:49:23 crc kubenswrapper[4909]: I0202 12:49:23.660463 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "900e031b-9fe3-4b77-b7e1-91bfde689d44" (UID: "900e031b-9fe3-4b77-b7e1-91bfde689d44"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:49:23 crc kubenswrapper[4909]: I0202 12:49:23.674329 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/900e031b-9fe3-4b77-b7e1-91bfde689d44-kube-api-access-gcq7m" (OuterVolumeSpecName: "kube-api-access-gcq7m") pod "900e031b-9fe3-4b77-b7e1-91bfde689d44" (UID: "900e031b-9fe3-4b77-b7e1-91bfde689d44"). InnerVolumeSpecName "kube-api-access-gcq7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:49:23 crc kubenswrapper[4909]: I0202 12:49:23.685619 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "900e031b-9fe3-4b77-b7e1-91bfde689d44" (UID: "900e031b-9fe3-4b77-b7e1-91bfde689d44"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:49:23 crc kubenswrapper[4909]: I0202 12:49:23.687205 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "900e031b-9fe3-4b77-b7e1-91bfde689d44" (UID: "900e031b-9fe3-4b77-b7e1-91bfde689d44"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:49:23 crc kubenswrapper[4909]: I0202 12:49:23.692847 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "900e031b-9fe3-4b77-b7e1-91bfde689d44" (UID: "900e031b-9fe3-4b77-b7e1-91bfde689d44"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:49:23 crc kubenswrapper[4909]: I0202 12:49:23.694241 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "900e031b-9fe3-4b77-b7e1-91bfde689d44" (UID: "900e031b-9fe3-4b77-b7e1-91bfde689d44"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:49:23 crc kubenswrapper[4909]: I0202 12:49:23.696339 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "900e031b-9fe3-4b77-b7e1-91bfde689d44" (UID: "900e031b-9fe3-4b77-b7e1-91bfde689d44"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:49:23 crc kubenswrapper[4909]: I0202 12:49:23.705109 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "900e031b-9fe3-4b77-b7e1-91bfde689d44" (UID: "900e031b-9fe3-4b77-b7e1-91bfde689d44"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:49:23 crc kubenswrapper[4909]: I0202 12:49:23.707941 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-inventory" (OuterVolumeSpecName: "inventory") pod "900e031b-9fe3-4b77-b7e1-91bfde689d44" (UID: "900e031b-9fe3-4b77-b7e1-91bfde689d44"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:49:23 crc kubenswrapper[4909]: I0202 12:49:23.756716 4909 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 02 12:49:23 crc kubenswrapper[4909]: I0202 12:49:23.756757 4909 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 02 12:49:23 crc kubenswrapper[4909]: I0202 12:49:23.756768 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 02 12:49:23 crc kubenswrapper[4909]: I0202 12:49:23.756777 4909 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 02 12:49:23 crc kubenswrapper[4909]: I0202 12:49:23.756788 4909 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Feb 02 12:49:23 crc kubenswrapper[4909]: I0202 12:49:23.756800 4909 
reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 02 12:49:23 crc kubenswrapper[4909]: I0202 12:49:23.756827 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 12:49:23 crc kubenswrapper[4909]: I0202 12:49:23.756838 4909 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/900e031b-9fe3-4b77-b7e1-91bfde689d44-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:49:23 crc kubenswrapper[4909]: I0202 12:49:23.756847 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcq7m\" (UniqueName: \"kubernetes.io/projected/900e031b-9fe3-4b77-b7e1-91bfde689d44-kube-api-access-gcq7m\") on node \"crc\" DevicePath \"\"" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.071520 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" event={"ID":"900e031b-9fe3-4b77-b7e1-91bfde689d44","Type":"ContainerDied","Data":"d6088124af1f69f6e061a9914110085eda9bd640e68cdd923a56ba72b3a60ec7"} Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.071555 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-sjz6r" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.071593 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6088124af1f69f6e061a9914110085eda9bd640e68cdd923a56ba72b3a60ec7" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.194506 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-mnd88"] Feb 02 12:49:24 crc kubenswrapper[4909]: E0202 12:49:24.195083 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43adb095-3607-401c-a4ae-dcf3c0d1fcc4" containerName="registry-server" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.195101 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="43adb095-3607-401c-a4ae-dcf3c0d1fcc4" containerName="registry-server" Feb 02 12:49:24 crc kubenswrapper[4909]: E0202 12:49:24.195133 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43adb095-3607-401c-a4ae-dcf3c0d1fcc4" containerName="extract-content" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.195142 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="43adb095-3607-401c-a4ae-dcf3c0d1fcc4" containerName="extract-content" Feb 02 12:49:24 crc kubenswrapper[4909]: E0202 12:49:24.195159 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43adb095-3607-401c-a4ae-dcf3c0d1fcc4" containerName="extract-utilities" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.195168 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="43adb095-3607-401c-a4ae-dcf3c0d1fcc4" containerName="extract-utilities" Feb 02 12:49:24 crc kubenswrapper[4909]: E0202 12:49:24.195194 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="900e031b-9fe3-4b77-b7e1-91bfde689d44" containerName="nova-cell1-openstack-openstack-cell1" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.195202 4909 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="900e031b-9fe3-4b77-b7e1-91bfde689d44" containerName="nova-cell1-openstack-openstack-cell1" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.195457 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="43adb095-3607-401c-a4ae-dcf3c0d1fcc4" containerName="registry-server" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.195498 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="900e031b-9fe3-4b77-b7e1-91bfde689d44" containerName="nova-cell1-openstack-openstack-cell1" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.196474 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-mnd88" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.200160 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.200563 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.200863 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-lfs7z" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.200982 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.201086 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.212919 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-mnd88"] Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.368085 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-mnd88\" (UID: \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\") " pod="openstack/telemetry-openstack-openstack-cell1-mnd88" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.368165 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-mnd88\" (UID: \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\") " pod="openstack/telemetry-openstack-openstack-cell1-mnd88" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.368214 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl9wx\" (UniqueName: \"kubernetes.io/projected/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-kube-api-access-tl9wx\") pod \"telemetry-openstack-openstack-cell1-mnd88\" (UID: \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\") " pod="openstack/telemetry-openstack-openstack-cell1-mnd88" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.368256 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-inventory\") pod \"telemetry-openstack-openstack-cell1-mnd88\" (UID: \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\") " pod="openstack/telemetry-openstack-openstack-cell1-mnd88" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.368296 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-mnd88\" (UID: \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\") " 
pod="openstack/telemetry-openstack-openstack-cell1-mnd88" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.368321 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-mnd88\" (UID: \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\") " pod="openstack/telemetry-openstack-openstack-cell1-mnd88" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.368343 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-mnd88\" (UID: \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\") " pod="openstack/telemetry-openstack-openstack-cell1-mnd88" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.469923 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-mnd88\" (UID: \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\") " pod="openstack/telemetry-openstack-openstack-cell1-mnd88" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.470028 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-mnd88\" (UID: \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\") " pod="openstack/telemetry-openstack-openstack-cell1-mnd88" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.470090 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-tl9wx\" (UniqueName: \"kubernetes.io/projected/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-kube-api-access-tl9wx\") pod \"telemetry-openstack-openstack-cell1-mnd88\" (UID: \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\") " pod="openstack/telemetry-openstack-openstack-cell1-mnd88" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.470146 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-inventory\") pod \"telemetry-openstack-openstack-cell1-mnd88\" (UID: \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\") " pod="openstack/telemetry-openstack-openstack-cell1-mnd88" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.470202 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-mnd88\" (UID: \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\") " pod="openstack/telemetry-openstack-openstack-cell1-mnd88" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.470241 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-mnd88\" (UID: \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\") " pod="openstack/telemetry-openstack-openstack-cell1-mnd88" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.470273 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-mnd88\" (UID: \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\") " 
pod="openstack/telemetry-openstack-openstack-cell1-mnd88" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.475561 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-mnd88\" (UID: \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\") " pod="openstack/telemetry-openstack-openstack-cell1-mnd88" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.475774 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-mnd88\" (UID: \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\") " pod="openstack/telemetry-openstack-openstack-cell1-mnd88" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.476571 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-mnd88\" (UID: \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\") " pod="openstack/telemetry-openstack-openstack-cell1-mnd88" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.486410 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-inventory\") pod \"telemetry-openstack-openstack-cell1-mnd88\" (UID: \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\") " pod="openstack/telemetry-openstack-openstack-cell1-mnd88" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.486531 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-mnd88\" (UID: \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\") " pod="openstack/telemetry-openstack-openstack-cell1-mnd88" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.486628 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-mnd88\" (UID: \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\") " pod="openstack/telemetry-openstack-openstack-cell1-mnd88" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.490785 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl9wx\" (UniqueName: \"kubernetes.io/projected/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-kube-api-access-tl9wx\") pod \"telemetry-openstack-openstack-cell1-mnd88\" (UID: \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\") " pod="openstack/telemetry-openstack-openstack-cell1-mnd88" Feb 02 12:49:24 crc kubenswrapper[4909]: I0202 12:49:24.519791 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-mnd88" Feb 02 12:49:25 crc kubenswrapper[4909]: I0202 12:49:25.051166 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-mnd88"] Feb 02 12:49:25 crc kubenswrapper[4909]: I0202 12:49:25.082954 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-mnd88" event={"ID":"ff72a01e-3127-4bd2-bcc8-abddc9be70fc","Type":"ContainerStarted","Data":"02413457b2592d4c6a9a474d486709371f8a697b8e144606a3c631a86c967205"} Feb 02 12:49:26 crc kubenswrapper[4909]: I0202 12:49:26.094451 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-mnd88" event={"ID":"ff72a01e-3127-4bd2-bcc8-abddc9be70fc","Type":"ContainerStarted","Data":"cd154f972e11686ef1744d04cce88976a68e4da0c2ae11345418c2726e9445b3"} Feb 02 12:49:26 crc kubenswrapper[4909]: I0202 12:49:26.127033 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-mnd88" podStartSLOduration=1.611582751 podStartE2EDuration="2.127011869s" podCreationTimestamp="2026-02-02 12:49:24 +0000 UTC" firstStartedPulling="2026-02-02 12:49:25.062265032 +0000 UTC m=+8290.808365767" lastFinishedPulling="2026-02-02 12:49:25.57769415 +0000 UTC m=+8291.323794885" observedRunningTime="2026-02-02 12:49:26.117046426 +0000 UTC m=+8291.863147171" watchObservedRunningTime="2026-02-02 12:49:26.127011869 +0000 UTC m=+8291.873112604" Feb 02 12:50:49 crc kubenswrapper[4909]: I0202 12:50:49.511388 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:50:49 crc kubenswrapper[4909]: I0202 12:50:49.512020 4909 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:51:19 crc kubenswrapper[4909]: I0202 12:51:19.511005 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:51:19 crc kubenswrapper[4909]: I0202 12:51:19.511642 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:51:45 crc kubenswrapper[4909]: I0202 12:51:45.014473 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vjbgn"] Feb 02 12:51:45 crc kubenswrapper[4909]: I0202 12:51:45.017480 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vjbgn" Feb 02 12:51:45 crc kubenswrapper[4909]: I0202 12:51:45.036920 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vjbgn"] Feb 02 12:51:45 crc kubenswrapper[4909]: I0202 12:51:45.158666 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39d848c-cd94-4cdd-a594-da3779eb3509-catalog-content\") pod \"redhat-operators-vjbgn\" (UID: \"b39d848c-cd94-4cdd-a594-da3779eb3509\") " pod="openshift-marketplace/redhat-operators-vjbgn" Feb 02 12:51:45 crc kubenswrapper[4909]: I0202 12:51:45.158784 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsx4r\" (UniqueName: \"kubernetes.io/projected/b39d848c-cd94-4cdd-a594-da3779eb3509-kube-api-access-nsx4r\") pod \"redhat-operators-vjbgn\" (UID: \"b39d848c-cd94-4cdd-a594-da3779eb3509\") " pod="openshift-marketplace/redhat-operators-vjbgn" Feb 02 12:51:45 crc kubenswrapper[4909]: I0202 12:51:45.158871 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39d848c-cd94-4cdd-a594-da3779eb3509-utilities\") pod \"redhat-operators-vjbgn\" (UID: \"b39d848c-cd94-4cdd-a594-da3779eb3509\") " pod="openshift-marketplace/redhat-operators-vjbgn" Feb 02 12:51:45 crc kubenswrapper[4909]: I0202 12:51:45.260529 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39d848c-cd94-4cdd-a594-da3779eb3509-catalog-content\") pod \"redhat-operators-vjbgn\" (UID: \"b39d848c-cd94-4cdd-a594-da3779eb3509\") " pod="openshift-marketplace/redhat-operators-vjbgn" Feb 02 12:51:45 crc kubenswrapper[4909]: I0202 12:51:45.260645 4909 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-nsx4r\" (UniqueName: \"kubernetes.io/projected/b39d848c-cd94-4cdd-a594-da3779eb3509-kube-api-access-nsx4r\") pod \"redhat-operators-vjbgn\" (UID: \"b39d848c-cd94-4cdd-a594-da3779eb3509\") " pod="openshift-marketplace/redhat-operators-vjbgn" Feb 02 12:51:45 crc kubenswrapper[4909]: I0202 12:51:45.260679 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39d848c-cd94-4cdd-a594-da3779eb3509-utilities\") pod \"redhat-operators-vjbgn\" (UID: \"b39d848c-cd94-4cdd-a594-da3779eb3509\") " pod="openshift-marketplace/redhat-operators-vjbgn" Feb 02 12:51:45 crc kubenswrapper[4909]: I0202 12:51:45.261134 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39d848c-cd94-4cdd-a594-da3779eb3509-catalog-content\") pod \"redhat-operators-vjbgn\" (UID: \"b39d848c-cd94-4cdd-a594-da3779eb3509\") " pod="openshift-marketplace/redhat-operators-vjbgn" Feb 02 12:51:45 crc kubenswrapper[4909]: I0202 12:51:45.261163 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39d848c-cd94-4cdd-a594-da3779eb3509-utilities\") pod \"redhat-operators-vjbgn\" (UID: \"b39d848c-cd94-4cdd-a594-da3779eb3509\") " pod="openshift-marketplace/redhat-operators-vjbgn" Feb 02 12:51:45 crc kubenswrapper[4909]: I0202 12:51:45.279552 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsx4r\" (UniqueName: \"kubernetes.io/projected/b39d848c-cd94-4cdd-a594-da3779eb3509-kube-api-access-nsx4r\") pod \"redhat-operators-vjbgn\" (UID: \"b39d848c-cd94-4cdd-a594-da3779eb3509\") " pod="openshift-marketplace/redhat-operators-vjbgn" Feb 02 12:51:45 crc kubenswrapper[4909]: I0202 12:51:45.346339 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vjbgn"
Feb 02 12:51:45 crc kubenswrapper[4909]: I0202 12:51:45.871988 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vjbgn"]
Feb 02 12:51:46 crc kubenswrapper[4909]: I0202 12:51:46.458248 4909 generic.go:334] "Generic (PLEG): container finished" podID="b39d848c-cd94-4cdd-a594-da3779eb3509" containerID="790d11dabf6db142bbe6b70ad5bd7fbe4b88034159fe3e14f0b01cb46d8d6ceb" exitCode=0
Feb 02 12:51:46 crc kubenswrapper[4909]: I0202 12:51:46.458297 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vjbgn" event={"ID":"b39d848c-cd94-4cdd-a594-da3779eb3509","Type":"ContainerDied","Data":"790d11dabf6db142bbe6b70ad5bd7fbe4b88034159fe3e14f0b01cb46d8d6ceb"}
Feb 02 12:51:46 crc kubenswrapper[4909]: I0202 12:51:46.458323 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vjbgn" event={"ID":"b39d848c-cd94-4cdd-a594-da3779eb3509","Type":"ContainerStarted","Data":"5d859c247a830755f38a206f21208384346d59e27806081b4f08156019bd0db6"}
Feb 02 12:51:46 crc kubenswrapper[4909]: I0202 12:51:46.460095 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 02 12:51:48 crc kubenswrapper[4909]: I0202 12:51:48.478060 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vjbgn" event={"ID":"b39d848c-cd94-4cdd-a594-da3779eb3509","Type":"ContainerStarted","Data":"7a0442e4fe22af92fe56a83d63c6898ad4392139f390076d7808e1c85182d770"}
Feb 02 12:51:49 crc kubenswrapper[4909]: I0202 12:51:49.511179 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 02 12:51:49 crc kubenswrapper[4909]: I0202 12:51:49.511455 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 02 12:51:49 crc kubenswrapper[4909]: I0202 12:51:49.511496 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z"
Feb 02 12:51:49 crc kubenswrapper[4909]: I0202 12:51:49.512307 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 02 12:51:49 crc kubenswrapper[4909]: I0202 12:51:49.512359 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b" gracePeriod=600
Feb 02 12:51:49 crc kubenswrapper[4909]: E0202 12:51:49.637576 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 12:51:50 crc kubenswrapper[4909]: I0202 12:51:50.494431 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b" exitCode=0
Feb 02 12:51:50 crc kubenswrapper[4909]: I0202 12:51:50.494478 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b"}
Feb 02 12:51:50 crc kubenswrapper[4909]: I0202 12:51:50.495081 4909 scope.go:117] "RemoveContainer" containerID="ebf8edfb73e1f5eb8eed7f92f5322fadefb9d0e39dcda7f77c1d9a574544f0e8"
Feb 02 12:51:50 crc kubenswrapper[4909]: I0202 12:51:50.495973 4909 scope.go:117] "RemoveContainer" containerID="3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b"
Feb 02 12:51:50 crc kubenswrapper[4909]: E0202 12:51:50.496287 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 12:51:52 crc kubenswrapper[4909]: I0202 12:51:52.518792 4909 generic.go:334] "Generic (PLEG): container finished" podID="b39d848c-cd94-4cdd-a594-da3779eb3509" containerID="7a0442e4fe22af92fe56a83d63c6898ad4392139f390076d7808e1c85182d770" exitCode=0
Feb 02 12:51:52 crc kubenswrapper[4909]: I0202 12:51:52.518935 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vjbgn" event={"ID":"b39d848c-cd94-4cdd-a594-da3779eb3509","Type":"ContainerDied","Data":"7a0442e4fe22af92fe56a83d63c6898ad4392139f390076d7808e1c85182d770"}
Feb 02 12:51:53 crc kubenswrapper[4909]: I0202 12:51:53.542308 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vjbgn" event={"ID":"b39d848c-cd94-4cdd-a594-da3779eb3509","Type":"ContainerStarted","Data":"a78d112ee660a271aa1685019936d573441e8157e10ef49cf7aaf5928b0f434d"}
Feb 02 12:51:53 crc kubenswrapper[4909]: I0202 12:51:53.566970 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vjbgn" podStartSLOduration=3.128208395 podStartE2EDuration="9.566951675s" podCreationTimestamp="2026-02-02 12:51:44 +0000 UTC" firstStartedPulling="2026-02-02 12:51:46.459881286 +0000 UTC m=+8432.205982021" lastFinishedPulling="2026-02-02 12:51:52.898624566 +0000 UTC m=+8438.644725301" observedRunningTime="2026-02-02 12:51:53.560257645 +0000 UTC m=+8439.306358380" watchObservedRunningTime="2026-02-02 12:51:53.566951675 +0000 UTC m=+8439.313052410"
Feb 02 12:51:55 crc kubenswrapper[4909]: I0202 12:51:55.347273 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vjbgn"
Feb 02 12:51:55 crc kubenswrapper[4909]: I0202 12:51:55.347648 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vjbgn"
Feb 02 12:51:56 crc kubenswrapper[4909]: I0202 12:51:56.389896 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vjbgn" podUID="b39d848c-cd94-4cdd-a594-da3779eb3509" containerName="registry-server" probeResult="failure" output=<
Feb 02 12:51:56 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s
Feb 02 12:51:56 crc kubenswrapper[4909]: >
Feb 02 12:52:04 crc kubenswrapper[4909]: I0202 12:52:04.017708 4909 scope.go:117] "RemoveContainer" containerID="3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b"
Feb 02 12:52:04 crc kubenswrapper[4909]: E0202 12:52:04.018926 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 12:52:06 crc kubenswrapper[4909]: I0202 12:52:06.390075 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vjbgn" podUID="b39d848c-cd94-4cdd-a594-da3779eb3509" containerName="registry-server" probeResult="failure" output=<
Feb 02 12:52:06 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s
Feb 02 12:52:06 crc kubenswrapper[4909]: >
Feb 02 12:52:15 crc kubenswrapper[4909]: I0202 12:52:15.390301 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vjbgn"
Feb 02 12:52:15 crc kubenswrapper[4909]: I0202 12:52:15.444681 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vjbgn"
Feb 02 12:52:16 crc kubenswrapper[4909]: I0202 12:52:16.218930 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vjbgn"]
Feb 02 12:52:16 crc kubenswrapper[4909]: I0202 12:52:16.743405 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vjbgn" podUID="b39d848c-cd94-4cdd-a594-da3779eb3509" containerName="registry-server" containerID="cri-o://a78d112ee660a271aa1685019936d573441e8157e10ef49cf7aaf5928b0f434d" gracePeriod=2
Feb 02 12:52:17 crc kubenswrapper[4909]: I0202 12:52:17.220998 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vjbgn"
Feb 02 12:52:17 crc kubenswrapper[4909]: I0202 12:52:17.380429 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39d848c-cd94-4cdd-a594-da3779eb3509-utilities\") pod \"b39d848c-cd94-4cdd-a594-da3779eb3509\" (UID: \"b39d848c-cd94-4cdd-a594-da3779eb3509\") "
Feb 02 12:52:17 crc kubenswrapper[4909]: I0202 12:52:17.380586 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39d848c-cd94-4cdd-a594-da3779eb3509-catalog-content\") pod \"b39d848c-cd94-4cdd-a594-da3779eb3509\" (UID: \"b39d848c-cd94-4cdd-a594-da3779eb3509\") "
Feb 02 12:52:17 crc kubenswrapper[4909]: I0202 12:52:17.380633 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsx4r\" (UniqueName: \"kubernetes.io/projected/b39d848c-cd94-4cdd-a594-da3779eb3509-kube-api-access-nsx4r\") pod \"b39d848c-cd94-4cdd-a594-da3779eb3509\" (UID: \"b39d848c-cd94-4cdd-a594-da3779eb3509\") "
Feb 02 12:52:17 crc kubenswrapper[4909]: I0202 12:52:17.381125 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b39d848c-cd94-4cdd-a594-da3779eb3509-utilities" (OuterVolumeSpecName: "utilities") pod "b39d848c-cd94-4cdd-a594-da3779eb3509" (UID: "b39d848c-cd94-4cdd-a594-da3779eb3509"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 12:52:17 crc kubenswrapper[4909]: I0202 12:52:17.381410 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39d848c-cd94-4cdd-a594-da3779eb3509-utilities\") on node \"crc\" DevicePath \"\""
Feb 02 12:52:17 crc kubenswrapper[4909]: I0202 12:52:17.385727 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b39d848c-cd94-4cdd-a594-da3779eb3509-kube-api-access-nsx4r" (OuterVolumeSpecName: "kube-api-access-nsx4r") pod "b39d848c-cd94-4cdd-a594-da3779eb3509" (UID: "b39d848c-cd94-4cdd-a594-da3779eb3509"). InnerVolumeSpecName "kube-api-access-nsx4r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 12:52:17 crc kubenswrapper[4909]: I0202 12:52:17.483208 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsx4r\" (UniqueName: \"kubernetes.io/projected/b39d848c-cd94-4cdd-a594-da3779eb3509-kube-api-access-nsx4r\") on node \"crc\" DevicePath \"\""
Feb 02 12:52:17 crc kubenswrapper[4909]: I0202 12:52:17.492747 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b39d848c-cd94-4cdd-a594-da3779eb3509-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b39d848c-cd94-4cdd-a594-da3779eb3509" (UID: "b39d848c-cd94-4cdd-a594-da3779eb3509"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 02 12:52:17 crc kubenswrapper[4909]: I0202 12:52:17.584925 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39d848c-cd94-4cdd-a594-da3779eb3509-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 02 12:52:17 crc kubenswrapper[4909]: I0202 12:52:17.753673 4909 generic.go:334] "Generic (PLEG): container finished" podID="b39d848c-cd94-4cdd-a594-da3779eb3509" containerID="a78d112ee660a271aa1685019936d573441e8157e10ef49cf7aaf5928b0f434d" exitCode=0
Feb 02 12:52:17 crc kubenswrapper[4909]: I0202 12:52:17.753710 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vjbgn" event={"ID":"b39d848c-cd94-4cdd-a594-da3779eb3509","Type":"ContainerDied","Data":"a78d112ee660a271aa1685019936d573441e8157e10ef49cf7aaf5928b0f434d"}
Feb 02 12:52:17 crc kubenswrapper[4909]: I0202 12:52:17.753735 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vjbgn" event={"ID":"b39d848c-cd94-4cdd-a594-da3779eb3509","Type":"ContainerDied","Data":"5d859c247a830755f38a206f21208384346d59e27806081b4f08156019bd0db6"}
Feb 02 12:52:17 crc kubenswrapper[4909]: I0202 12:52:17.753751 4909 scope.go:117] "RemoveContainer" containerID="a78d112ee660a271aa1685019936d573441e8157e10ef49cf7aaf5928b0f434d"
Feb 02 12:52:17 crc kubenswrapper[4909]: I0202 12:52:17.753794 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vjbgn"
Feb 02 12:52:17 crc kubenswrapper[4909]: I0202 12:52:17.774056 4909 scope.go:117] "RemoveContainer" containerID="7a0442e4fe22af92fe56a83d63c6898ad4392139f390076d7808e1c85182d770"
Feb 02 12:52:17 crc kubenswrapper[4909]: I0202 12:52:17.789925 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vjbgn"]
Feb 02 12:52:17 crc kubenswrapper[4909]: I0202 12:52:17.800023 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vjbgn"]
Feb 02 12:52:17 crc kubenswrapper[4909]: I0202 12:52:17.817993 4909 scope.go:117] "RemoveContainer" containerID="790d11dabf6db142bbe6b70ad5bd7fbe4b88034159fe3e14f0b01cb46d8d6ceb"
Feb 02 12:52:17 crc kubenswrapper[4909]: I0202 12:52:17.851228 4909 scope.go:117] "RemoveContainer" containerID="a78d112ee660a271aa1685019936d573441e8157e10ef49cf7aaf5928b0f434d"
Feb 02 12:52:17 crc kubenswrapper[4909]: E0202 12:52:17.851788 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a78d112ee660a271aa1685019936d573441e8157e10ef49cf7aaf5928b0f434d\": container with ID starting with a78d112ee660a271aa1685019936d573441e8157e10ef49cf7aaf5928b0f434d not found: ID does not exist" containerID="a78d112ee660a271aa1685019936d573441e8157e10ef49cf7aaf5928b0f434d"
Feb 02 12:52:17 crc kubenswrapper[4909]: I0202 12:52:17.851846 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a78d112ee660a271aa1685019936d573441e8157e10ef49cf7aaf5928b0f434d"} err="failed to get container status \"a78d112ee660a271aa1685019936d573441e8157e10ef49cf7aaf5928b0f434d\": rpc error: code = NotFound desc = could not find container \"a78d112ee660a271aa1685019936d573441e8157e10ef49cf7aaf5928b0f434d\": container with ID starting with a78d112ee660a271aa1685019936d573441e8157e10ef49cf7aaf5928b0f434d not found: ID does not exist"
Feb 02 12:52:17 crc kubenswrapper[4909]: I0202 12:52:17.851876 4909 scope.go:117] "RemoveContainer" containerID="7a0442e4fe22af92fe56a83d63c6898ad4392139f390076d7808e1c85182d770"
Feb 02 12:52:17 crc kubenswrapper[4909]: E0202 12:52:17.852262 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a0442e4fe22af92fe56a83d63c6898ad4392139f390076d7808e1c85182d770\": container with ID starting with 7a0442e4fe22af92fe56a83d63c6898ad4392139f390076d7808e1c85182d770 not found: ID does not exist" containerID="7a0442e4fe22af92fe56a83d63c6898ad4392139f390076d7808e1c85182d770"
Feb 02 12:52:17 crc kubenswrapper[4909]: I0202 12:52:17.852288 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a0442e4fe22af92fe56a83d63c6898ad4392139f390076d7808e1c85182d770"} err="failed to get container status \"7a0442e4fe22af92fe56a83d63c6898ad4392139f390076d7808e1c85182d770\": rpc error: code = NotFound desc = could not find container \"7a0442e4fe22af92fe56a83d63c6898ad4392139f390076d7808e1c85182d770\": container with ID starting with 7a0442e4fe22af92fe56a83d63c6898ad4392139f390076d7808e1c85182d770 not found: ID does not exist"
Feb 02 12:52:17 crc kubenswrapper[4909]: I0202 12:52:17.852305 4909 scope.go:117] "RemoveContainer" containerID="790d11dabf6db142bbe6b70ad5bd7fbe4b88034159fe3e14f0b01cb46d8d6ceb"
Feb 02 12:52:17 crc kubenswrapper[4909]: E0202 12:52:17.852567 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"790d11dabf6db142bbe6b70ad5bd7fbe4b88034159fe3e14f0b01cb46d8d6ceb\": container with ID starting with 790d11dabf6db142bbe6b70ad5bd7fbe4b88034159fe3e14f0b01cb46d8d6ceb not found: ID does not exist" containerID="790d11dabf6db142bbe6b70ad5bd7fbe4b88034159fe3e14f0b01cb46d8d6ceb"
Feb 02 12:52:17 crc kubenswrapper[4909]: I0202 12:52:17.852592 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"790d11dabf6db142bbe6b70ad5bd7fbe4b88034159fe3e14f0b01cb46d8d6ceb"} err="failed to get container status \"790d11dabf6db142bbe6b70ad5bd7fbe4b88034159fe3e14f0b01cb46d8d6ceb\": rpc error: code = NotFound desc = could not find container \"790d11dabf6db142bbe6b70ad5bd7fbe4b88034159fe3e14f0b01cb46d8d6ceb\": container with ID starting with 790d11dabf6db142bbe6b70ad5bd7fbe4b88034159fe3e14f0b01cb46d8d6ceb not found: ID does not exist"
Feb 02 12:52:18 crc kubenswrapper[4909]: I0202 12:52:18.016877 4909 scope.go:117] "RemoveContainer" containerID="3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b"
Feb 02 12:52:18 crc kubenswrapper[4909]: E0202 12:52:18.017187 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 12:52:19 crc kubenswrapper[4909]: I0202 12:52:19.028478 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b39d848c-cd94-4cdd-a594-da3779eb3509" path="/var/lib/kubelet/pods/b39d848c-cd94-4cdd-a594-da3779eb3509/volumes"
Feb 02 12:52:33 crc kubenswrapper[4909]: I0202 12:52:33.017086 4909 scope.go:117] "RemoveContainer" containerID="3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b"
Feb 02 12:52:33 crc kubenswrapper[4909]: E0202 12:52:33.020180 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 12:52:44 crc kubenswrapper[4909]: I0202 12:52:44.017823 4909 scope.go:117] "RemoveContainer" containerID="3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b"
Feb 02 12:52:44 crc kubenswrapper[4909]: E0202 12:52:44.018652 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f"
Feb 02 12:52:54 crc kubenswrapper[4909]: I0202 12:52:54.067667 4909 generic.go:334] "Generic (PLEG): container finished" podID="ff72a01e-3127-4bd2-bcc8-abddc9be70fc" containerID="cd154f972e11686ef1744d04cce88976a68e4da0c2ae11345418c2726e9445b3" exitCode=0
Feb 02 12:52:54 crc kubenswrapper[4909]: I0202 12:52:54.067749 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-mnd88" event={"ID":"ff72a01e-3127-4bd2-bcc8-abddc9be70fc","Type":"ContainerDied","Data":"cd154f972e11686ef1744d04cce88976a68e4da0c2ae11345418c2726e9445b3"}
Feb 02 12:52:55 crc kubenswrapper[4909]: I0202 12:52:55.579357 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-mnd88"
Feb 02 12:52:55 crc kubenswrapper[4909]: I0202 12:52:55.697610 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-ssh-key-openstack-cell1\") pod \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\" (UID: \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\") "
Feb 02 12:52:55 crc kubenswrapper[4909]: I0202 12:52:55.697666 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-ceilometer-compute-config-data-1\") pod \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\" (UID: \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\") "
Feb 02 12:52:55 crc kubenswrapper[4909]: I0202 12:52:55.697725 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl9wx\" (UniqueName: \"kubernetes.io/projected/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-kube-api-access-tl9wx\") pod \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\" (UID: \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\") "
Feb 02 12:52:55 crc kubenswrapper[4909]: I0202 12:52:55.697817 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-telemetry-combined-ca-bundle\") pod \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\" (UID: \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\") "
Feb 02 12:52:55 crc kubenswrapper[4909]: I0202 12:52:55.697954 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-ceilometer-compute-config-data-2\") pod \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\" (UID: \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\") "
Feb 02 12:52:55 crc kubenswrapper[4909]: I0202 12:52:55.698003 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-inventory\") pod \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\" (UID: \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\") "
Feb 02 12:52:55 crc kubenswrapper[4909]: I0202 12:52:55.698046 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-ceilometer-compute-config-data-0\") pod \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\" (UID: \"ff72a01e-3127-4bd2-bcc8-abddc9be70fc\") "
Feb 02 12:52:55 crc kubenswrapper[4909]: I0202 12:52:55.703789 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "ff72a01e-3127-4bd2-bcc8-abddc9be70fc" (UID: "ff72a01e-3127-4bd2-bcc8-abddc9be70fc"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 12:52:55 crc kubenswrapper[4909]: I0202 12:52:55.716080 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-kube-api-access-tl9wx" (OuterVolumeSpecName: "kube-api-access-tl9wx") pod "ff72a01e-3127-4bd2-bcc8-abddc9be70fc" (UID: "ff72a01e-3127-4bd2-bcc8-abddc9be70fc"). InnerVolumeSpecName "kube-api-access-tl9wx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 02 12:52:55 crc kubenswrapper[4909]: I0202 12:52:55.732044 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "ff72a01e-3127-4bd2-bcc8-abddc9be70fc" (UID: "ff72a01e-3127-4bd2-bcc8-abddc9be70fc"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 12:52:55 crc kubenswrapper[4909]: I0202 12:52:55.732102 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-inventory" (OuterVolumeSpecName: "inventory") pod "ff72a01e-3127-4bd2-bcc8-abddc9be70fc" (UID: "ff72a01e-3127-4bd2-bcc8-abddc9be70fc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 12:52:55 crc kubenswrapper[4909]: I0202 12:52:55.732792 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "ff72a01e-3127-4bd2-bcc8-abddc9be70fc" (UID: "ff72a01e-3127-4bd2-bcc8-abddc9be70fc"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 12:52:55 crc kubenswrapper[4909]: I0202 12:52:55.733964 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "ff72a01e-3127-4bd2-bcc8-abddc9be70fc" (UID: "ff72a01e-3127-4bd2-bcc8-abddc9be70fc"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 12:52:55 crc kubenswrapper[4909]: I0202 12:52:55.743011 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "ff72a01e-3127-4bd2-bcc8-abddc9be70fc" (UID: "ff72a01e-3127-4bd2-bcc8-abddc9be70fc"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 02 12:52:55 crc kubenswrapper[4909]: I0202 12:52:55.800683 4909 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Feb 02 12:52:55 crc kubenswrapper[4909]: I0202 12:52:55.801046 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-inventory\") on node \"crc\" DevicePath \"\""
Feb 02 12:52:55 crc kubenswrapper[4909]: I0202 12:52:55.801059 4909 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Feb 02 12:52:55 crc kubenswrapper[4909]: I0202 12:52:55.801073 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 02 12:52:55 crc kubenswrapper[4909]: I0202 12:52:55.801084 4909 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Feb 02 12:52:55 crc kubenswrapper[4909]: I0202 12:52:55.801099 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl9wx\" (UniqueName: \"kubernetes.io/projected/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-kube-api-access-tl9wx\") on node \"crc\" DevicePath \"\""
Feb 02 12:52:55 crc kubenswrapper[4909]: I0202 12:52:55.801113 4909 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff72a01e-3127-4bd2-bcc8-abddc9be70fc-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.093017 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-mnd88" event={"ID":"ff72a01e-3127-4bd2-bcc8-abddc9be70fc","Type":"ContainerDied","Data":"02413457b2592d4c6a9a474d486709371f8a697b8e144606a3c631a86c967205"}
Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.093045 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-mnd88"
Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.093053 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02413457b2592d4c6a9a474d486709371f8a697b8e144606a3c631a86c967205"
Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.189349 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-h4k8f"]
Feb 02 12:52:56 crc kubenswrapper[4909]: E0202 12:52:56.189883 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39d848c-cd94-4cdd-a594-da3779eb3509" containerName="extract-utilities"
Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.189904 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39d848c-cd94-4cdd-a594-da3779eb3509" containerName="extract-utilities"
Feb 02 12:52:56 crc kubenswrapper[4909]: E0202 12:52:56.189918 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff72a01e-3127-4bd2-bcc8-abddc9be70fc" containerName="telemetry-openstack-openstack-cell1"
Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.189926 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff72a01e-3127-4bd2-bcc8-abddc9be70fc" containerName="telemetry-openstack-openstack-cell1"
Feb 02 12:52:56 crc kubenswrapper[4909]: E0202 12:52:56.189938 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39d848c-cd94-4cdd-a594-da3779eb3509" containerName="registry-server"
Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.189946 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39d848c-cd94-4cdd-a594-da3779eb3509" containerName="registry-server"
Feb 02 12:52:56 crc kubenswrapper[4909]: E0202 12:52:56.189959 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39d848c-cd94-4cdd-a594-da3779eb3509" containerName="extract-content"
Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.189966 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39d848c-cd94-4cdd-a594-da3779eb3509" containerName="extract-content"
Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.190186 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff72a01e-3127-4bd2-bcc8-abddc9be70fc" containerName="telemetry-openstack-openstack-cell1"
Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.190202 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="b39d848c-cd94-4cdd-a594-da3779eb3509" containerName="registry-server"
Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.191026 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-h4k8f"
Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.203190 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-lfs7z"
Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.204459 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.204574 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.205996 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config"
Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.222722 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.237456 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-h4k8f"]
Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.318802 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/223a1c81-ef51-4efa-ae1d-e511e66719b4-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-h4k8f\" (UID: \"223a1c81-ef51-4efa-ae1d-e511e66719b4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-h4k8f"
Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.319028 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/223a1c81-ef51-4efa-ae1d-e511e66719b4-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-h4k8f\" (UID: \"223a1c81-ef51-4efa-ae1d-e511e66719b4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-h4k8f"
Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.319179 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/223a1c81-ef51-4efa-ae1d-e511e66719b4-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-h4k8f\" (UID: \"223a1c81-ef51-4efa-ae1d-e511e66719b4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-h4k8f"
Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.319350 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7dmz\" (UniqueName: \"kubernetes.io/projected/223a1c81-ef51-4efa-ae1d-e511e66719b4-kube-api-access-f7dmz\") pod \"neutron-sriov-openstack-openstack-cell1-h4k8f\" (UID: \"223a1c81-ef51-4efa-ae1d-e511e66719b4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-h4k8f"
Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.319436 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/223a1c81-ef51-4efa-ae1d-e511e66719b4-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-h4k8f\" (UID: \"223a1c81-ef51-4efa-ae1d-e511e66719b4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-h4k8f"
Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.421126 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/223a1c81-ef51-4efa-ae1d-e511e66719b4-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-h4k8f\" (UID: \"223a1c81-ef51-4efa-ae1d-e511e66719b4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-h4k8f"
Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.421230 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7dmz\" (UniqueName: \"kubernetes.io/projected/223a1c81-ef51-4efa-ae1d-e511e66719b4-kube-api-access-f7dmz\") pod \"neutron-sriov-openstack-openstack-cell1-h4k8f\" (UID: \"223a1c81-ef51-4efa-ae1d-e511e66719b4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-h4k8f"
Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.421254 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/223a1c81-ef51-4efa-ae1d-e511e66719b4-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-h4k8f\" (UID: \"223a1c81-ef51-4efa-ae1d-e511e66719b4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-h4k8f"
Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.421277 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/223a1c81-ef51-4efa-ae1d-e511e66719b4-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-h4k8f\" (UID: \"223a1c81-ef51-4efa-ae1d-e511e66719b4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-h4k8f"
Feb 02 12:52:56 crc kubenswrapper[4909]: 
I0202 12:52:56.421342 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/223a1c81-ef51-4efa-ae1d-e511e66719b4-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-h4k8f\" (UID: \"223a1c81-ef51-4efa-ae1d-e511e66719b4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-h4k8f" Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.427239 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/223a1c81-ef51-4efa-ae1d-e511e66719b4-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-h4k8f\" (UID: \"223a1c81-ef51-4efa-ae1d-e511e66719b4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-h4k8f" Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.428393 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/223a1c81-ef51-4efa-ae1d-e511e66719b4-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-h4k8f\" (UID: \"223a1c81-ef51-4efa-ae1d-e511e66719b4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-h4k8f" Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.429911 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/223a1c81-ef51-4efa-ae1d-e511e66719b4-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-h4k8f\" (UID: \"223a1c81-ef51-4efa-ae1d-e511e66719b4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-h4k8f" Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.430641 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/223a1c81-ef51-4efa-ae1d-e511e66719b4-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-h4k8f\" (UID: 
\"223a1c81-ef51-4efa-ae1d-e511e66719b4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-h4k8f" Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.440174 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7dmz\" (UniqueName: \"kubernetes.io/projected/223a1c81-ef51-4efa-ae1d-e511e66719b4-kube-api-access-f7dmz\") pod \"neutron-sriov-openstack-openstack-cell1-h4k8f\" (UID: \"223a1c81-ef51-4efa-ae1d-e511e66719b4\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-h4k8f" Feb 02 12:52:56 crc kubenswrapper[4909]: I0202 12:52:56.526745 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-h4k8f" Feb 02 12:52:57 crc kubenswrapper[4909]: I0202 12:52:57.017768 4909 scope.go:117] "RemoveContainer" containerID="3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b" Feb 02 12:52:57 crc kubenswrapper[4909]: E0202 12:52:57.018336 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:52:57 crc kubenswrapper[4909]: I0202 12:52:57.066670 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-h4k8f"] Feb 02 12:52:57 crc kubenswrapper[4909]: I0202 12:52:57.103456 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-h4k8f" event={"ID":"223a1c81-ef51-4efa-ae1d-e511e66719b4","Type":"ContainerStarted","Data":"41711f1fdc414d8261d0c2cfc4241ec82747c5e908bc8293e6181d0000133ae0"} Feb 02 12:52:58 crc kubenswrapper[4909]: I0202 12:52:58.119158 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-h4k8f" event={"ID":"223a1c81-ef51-4efa-ae1d-e511e66719b4","Type":"ContainerStarted","Data":"8609a42913afee267754ff6cda7f9a4ee84366f38ea8a3650aa0cc8cf2b8a568"} Feb 02 12:52:58 crc kubenswrapper[4909]: I0202 12:52:58.140479 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-h4k8f" podStartSLOduration=1.543666355 podStartE2EDuration="2.140455693s" podCreationTimestamp="2026-02-02 12:52:56 +0000 UTC" firstStartedPulling="2026-02-02 12:52:57.07763236 +0000 UTC m=+8502.823733095" lastFinishedPulling="2026-02-02 12:52:57.674421698 +0000 UTC m=+8503.420522433" observedRunningTime="2026-02-02 12:52:58.135037519 +0000 UTC m=+8503.881138274" watchObservedRunningTime="2026-02-02 12:52:58.140455693 +0000 UTC m=+8503.886556448" Feb 02 12:53:10 crc kubenswrapper[4909]: I0202 12:53:10.017530 4909 scope.go:117] "RemoveContainer" containerID="3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b" Feb 02 12:53:10 crc kubenswrapper[4909]: E0202 12:53:10.018473 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:53:23 crc kubenswrapper[4909]: I0202 12:53:23.016720 4909 scope.go:117] "RemoveContainer" containerID="3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b" Feb 02 12:53:23 crc kubenswrapper[4909]: E0202 12:53:23.017519 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:53:36 crc kubenswrapper[4909]: I0202 12:53:36.016677 4909 scope.go:117] "RemoveContainer" containerID="3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b" Feb 02 12:53:36 crc kubenswrapper[4909]: E0202 12:53:36.017589 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:53:49 crc kubenswrapper[4909]: I0202 12:53:49.017730 4909 scope.go:117] "RemoveContainer" containerID="3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b" Feb 02 12:53:49 crc kubenswrapper[4909]: E0202 12:53:49.018499 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:53:53 crc kubenswrapper[4909]: I0202 12:53:53.669375 4909 generic.go:334] "Generic (PLEG): container finished" podID="223a1c81-ef51-4efa-ae1d-e511e66719b4" containerID="8609a42913afee267754ff6cda7f9a4ee84366f38ea8a3650aa0cc8cf2b8a568" exitCode=0 Feb 02 12:53:53 crc kubenswrapper[4909]: I0202 12:53:53.669466 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-sriov-openstack-openstack-cell1-h4k8f" event={"ID":"223a1c81-ef51-4efa-ae1d-e511e66719b4","Type":"ContainerDied","Data":"8609a42913afee267754ff6cda7f9a4ee84366f38ea8a3650aa0cc8cf2b8a568"} Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.218442 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-h4k8f" Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.371748 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/223a1c81-ef51-4efa-ae1d-e511e66719b4-neutron-sriov-agent-neutron-config-0\") pod \"223a1c81-ef51-4efa-ae1d-e511e66719b4\" (UID: \"223a1c81-ef51-4efa-ae1d-e511e66719b4\") " Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.372040 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/223a1c81-ef51-4efa-ae1d-e511e66719b4-neutron-sriov-combined-ca-bundle\") pod \"223a1c81-ef51-4efa-ae1d-e511e66719b4\" (UID: \"223a1c81-ef51-4efa-ae1d-e511e66719b4\") " Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.372111 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/223a1c81-ef51-4efa-ae1d-e511e66719b4-ssh-key-openstack-cell1\") pod \"223a1c81-ef51-4efa-ae1d-e511e66719b4\" (UID: \"223a1c81-ef51-4efa-ae1d-e511e66719b4\") " Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.372224 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/223a1c81-ef51-4efa-ae1d-e511e66719b4-inventory\") pod \"223a1c81-ef51-4efa-ae1d-e511e66719b4\" (UID: \"223a1c81-ef51-4efa-ae1d-e511e66719b4\") " Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.372303 4909 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7dmz\" (UniqueName: \"kubernetes.io/projected/223a1c81-ef51-4efa-ae1d-e511e66719b4-kube-api-access-f7dmz\") pod \"223a1c81-ef51-4efa-ae1d-e511e66719b4\" (UID: \"223a1c81-ef51-4efa-ae1d-e511e66719b4\") " Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.377939 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/223a1c81-ef51-4efa-ae1d-e511e66719b4-kube-api-access-f7dmz" (OuterVolumeSpecName: "kube-api-access-f7dmz") pod "223a1c81-ef51-4efa-ae1d-e511e66719b4" (UID: "223a1c81-ef51-4efa-ae1d-e511e66719b4"). InnerVolumeSpecName "kube-api-access-f7dmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.398420 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/223a1c81-ef51-4efa-ae1d-e511e66719b4-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "223a1c81-ef51-4efa-ae1d-e511e66719b4" (UID: "223a1c81-ef51-4efa-ae1d-e511e66719b4"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.425569 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/223a1c81-ef51-4efa-ae1d-e511e66719b4-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "223a1c81-ef51-4efa-ae1d-e511e66719b4" (UID: "223a1c81-ef51-4efa-ae1d-e511e66719b4"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.429325 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/223a1c81-ef51-4efa-ae1d-e511e66719b4-inventory" (OuterVolumeSpecName: "inventory") pod "223a1c81-ef51-4efa-ae1d-e511e66719b4" (UID: "223a1c81-ef51-4efa-ae1d-e511e66719b4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.430162 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/223a1c81-ef51-4efa-ae1d-e511e66719b4-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "223a1c81-ef51-4efa-ae1d-e511e66719b4" (UID: "223a1c81-ef51-4efa-ae1d-e511e66719b4"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.475343 4909 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/223a1c81-ef51-4efa-ae1d-e511e66719b4-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.475389 4909 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/223a1c81-ef51-4efa-ae1d-e511e66719b4-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.475409 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/223a1c81-ef51-4efa-ae1d-e511e66719b4-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.475422 4909 reconciler_common.go:293] "Volume detached for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/223a1c81-ef51-4efa-ae1d-e511e66719b4-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.475435 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7dmz\" (UniqueName: \"kubernetes.io/projected/223a1c81-ef51-4efa-ae1d-e511e66719b4-kube-api-access-f7dmz\") on node \"crc\" DevicePath \"\"" Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.688601 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-h4k8f" event={"ID":"223a1c81-ef51-4efa-ae1d-e511e66719b4","Type":"ContainerDied","Data":"41711f1fdc414d8261d0c2cfc4241ec82747c5e908bc8293e6181d0000133ae0"} Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.688955 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41711f1fdc414d8261d0c2cfc4241ec82747c5e908bc8293e6181d0000133ae0" Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.688658 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-h4k8f" Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.849307 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-xn5dg"] Feb 02 12:53:55 crc kubenswrapper[4909]: E0202 12:53:55.849783 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="223a1c81-ef51-4efa-ae1d-e511e66719b4" containerName="neutron-sriov-openstack-openstack-cell1" Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.849799 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="223a1c81-ef51-4efa-ae1d-e511e66719b4" containerName="neutron-sriov-openstack-openstack-cell1" Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.850035 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="223a1c81-ef51-4efa-ae1d-e511e66719b4" containerName="neutron-sriov-openstack-openstack-cell1" Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.851119 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-xn5dg" Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.855082 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.855416 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.855730 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-lfs7z" Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.855727 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.856201 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.862600 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-xn5dg"] Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.985012 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxmnb\" (UniqueName: \"kubernetes.io/projected/5ee147f3-fc77-4226-a4c8-50ecc12fe936-kube-api-access-kxmnb\") pod \"neutron-dhcp-openstack-openstack-cell1-xn5dg\" (UID: \"5ee147f3-fc77-4226-a4c8-50ecc12fe936\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-xn5dg" Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.985417 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ee147f3-fc77-4226-a4c8-50ecc12fe936-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-xn5dg\" (UID: 
\"5ee147f3-fc77-4226-a4c8-50ecc12fe936\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-xn5dg" Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.985594 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5ee147f3-fc77-4226-a4c8-50ecc12fe936-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-xn5dg\" (UID: \"5ee147f3-fc77-4226-a4c8-50ecc12fe936\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-xn5dg" Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.985749 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ee147f3-fc77-4226-a4c8-50ecc12fe936-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-xn5dg\" (UID: \"5ee147f3-fc77-4226-a4c8-50ecc12fe936\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-xn5dg" Feb 02 12:53:55 crc kubenswrapper[4909]: I0202 12:53:55.985893 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5ee147f3-fc77-4226-a4c8-50ecc12fe936-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-xn5dg\" (UID: \"5ee147f3-fc77-4226-a4c8-50ecc12fe936\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-xn5dg" Feb 02 12:53:56 crc kubenswrapper[4909]: I0202 12:53:56.087620 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ee147f3-fc77-4226-a4c8-50ecc12fe936-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-xn5dg\" (UID: \"5ee147f3-fc77-4226-a4c8-50ecc12fe936\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-xn5dg" Feb 02 12:53:56 crc kubenswrapper[4909]: I0202 12:53:56.087691 4909 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5ee147f3-fc77-4226-a4c8-50ecc12fe936-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-xn5dg\" (UID: \"5ee147f3-fc77-4226-a4c8-50ecc12fe936\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-xn5dg" Feb 02 12:53:56 crc kubenswrapper[4909]: I0202 12:53:56.087824 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxmnb\" (UniqueName: \"kubernetes.io/projected/5ee147f3-fc77-4226-a4c8-50ecc12fe936-kube-api-access-kxmnb\") pod \"neutron-dhcp-openstack-openstack-cell1-xn5dg\" (UID: \"5ee147f3-fc77-4226-a4c8-50ecc12fe936\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-xn5dg" Feb 02 12:53:56 crc kubenswrapper[4909]: I0202 12:53:56.087899 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ee147f3-fc77-4226-a4c8-50ecc12fe936-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-xn5dg\" (UID: \"5ee147f3-fc77-4226-a4c8-50ecc12fe936\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-xn5dg" Feb 02 12:53:56 crc kubenswrapper[4909]: I0202 12:53:56.087986 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5ee147f3-fc77-4226-a4c8-50ecc12fe936-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-xn5dg\" (UID: \"5ee147f3-fc77-4226-a4c8-50ecc12fe936\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-xn5dg" Feb 02 12:53:56 crc kubenswrapper[4909]: I0202 12:53:56.091606 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ee147f3-fc77-4226-a4c8-50ecc12fe936-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-xn5dg\" (UID: \"5ee147f3-fc77-4226-a4c8-50ecc12fe936\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-xn5dg" Feb 02 12:53:56 crc kubenswrapper[4909]: I0202 12:53:56.091606 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5ee147f3-fc77-4226-a4c8-50ecc12fe936-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-xn5dg\" (UID: \"5ee147f3-fc77-4226-a4c8-50ecc12fe936\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-xn5dg" Feb 02 12:53:56 crc kubenswrapper[4909]: I0202 12:53:56.092320 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ee147f3-fc77-4226-a4c8-50ecc12fe936-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-xn5dg\" (UID: \"5ee147f3-fc77-4226-a4c8-50ecc12fe936\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-xn5dg" Feb 02 12:53:56 crc kubenswrapper[4909]: I0202 12:53:56.092443 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5ee147f3-fc77-4226-a4c8-50ecc12fe936-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-xn5dg\" (UID: \"5ee147f3-fc77-4226-a4c8-50ecc12fe936\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-xn5dg" Feb 02 12:53:56 crc kubenswrapper[4909]: I0202 12:53:56.107392 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxmnb\" (UniqueName: \"kubernetes.io/projected/5ee147f3-fc77-4226-a4c8-50ecc12fe936-kube-api-access-kxmnb\") pod \"neutron-dhcp-openstack-openstack-cell1-xn5dg\" (UID: \"5ee147f3-fc77-4226-a4c8-50ecc12fe936\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-xn5dg" Feb 02 12:53:56 crc kubenswrapper[4909]: I0202 12:53:56.177985 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-xn5dg" Feb 02 12:53:56 crc kubenswrapper[4909]: I0202 12:53:56.738440 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-xn5dg"] Feb 02 12:53:57 crc kubenswrapper[4909]: W0202 12:53:57.108740 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ee147f3_fc77_4226_a4c8_50ecc12fe936.slice/crio-0976ca1340aa6df6f4a6d4394d1178fb83df64ea133cf4f4b79980d333d031b2 WatchSource:0}: Error finding container 0976ca1340aa6df6f4a6d4394d1178fb83df64ea133cf4f4b79980d333d031b2: Status 404 returned error can't find the container with id 0976ca1340aa6df6f4a6d4394d1178fb83df64ea133cf4f4b79980d333d031b2 Feb 02 12:53:57 crc kubenswrapper[4909]: I0202 12:53:57.708510 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-xn5dg" event={"ID":"5ee147f3-fc77-4226-a4c8-50ecc12fe936","Type":"ContainerStarted","Data":"0976ca1340aa6df6f4a6d4394d1178fb83df64ea133cf4f4b79980d333d031b2"} Feb 02 12:53:58 crc kubenswrapper[4909]: I0202 12:53:58.717465 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-xn5dg" event={"ID":"5ee147f3-fc77-4226-a4c8-50ecc12fe936","Type":"ContainerStarted","Data":"781a2e83ef9999b118a4e8d53b9517b8f26721fb575296a35e90108873b85890"} Feb 02 12:53:58 crc kubenswrapper[4909]: I0202 12:53:58.738951 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-xn5dg" podStartSLOduration=3.193978299 podStartE2EDuration="3.738933665s" podCreationTimestamp="2026-02-02 12:53:55 +0000 UTC" firstStartedPulling="2026-02-02 12:53:57.113789963 +0000 UTC m=+8562.859890708" lastFinishedPulling="2026-02-02 12:53:57.658745339 +0000 UTC m=+8563.404846074" observedRunningTime="2026-02-02 12:53:58.732009348 
+0000 UTC m=+8564.478110103" watchObservedRunningTime="2026-02-02 12:53:58.738933665 +0000 UTC m=+8564.485034400" Feb 02 12:54:01 crc kubenswrapper[4909]: I0202 12:54:01.018248 4909 scope.go:117] "RemoveContainer" containerID="3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b" Feb 02 12:54:01 crc kubenswrapper[4909]: E0202 12:54:01.019220 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:54:13 crc kubenswrapper[4909]: I0202 12:54:13.017522 4909 scope.go:117] "RemoveContainer" containerID="3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b" Feb 02 12:54:13 crc kubenswrapper[4909]: E0202 12:54:13.018358 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:54:26 crc kubenswrapper[4909]: I0202 12:54:26.016206 4909 scope.go:117] "RemoveContainer" containerID="3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b" Feb 02 12:54:26 crc kubenswrapper[4909]: E0202 12:54:26.018134 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:54:41 crc kubenswrapper[4909]: I0202 12:54:41.016778 4909 scope.go:117] "RemoveContainer" containerID="3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b" Feb 02 12:54:41 crc kubenswrapper[4909]: E0202 12:54:41.017562 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:54:55 crc kubenswrapper[4909]: I0202 12:54:55.023615 4909 scope.go:117] "RemoveContainer" containerID="3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b" Feb 02 12:54:55 crc kubenswrapper[4909]: E0202 12:54:55.024417 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:55:08 crc kubenswrapper[4909]: I0202 12:55:08.017024 4909 scope.go:117] "RemoveContainer" containerID="3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b" Feb 02 12:55:08 crc kubenswrapper[4909]: E0202 12:55:08.017757 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:55:09 crc kubenswrapper[4909]: I0202 12:55:09.836098 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k7rbz"] Feb 02 12:55:09 crc kubenswrapper[4909]: I0202 12:55:09.839235 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k7rbz" Feb 02 12:55:09 crc kubenswrapper[4909]: I0202 12:55:09.855839 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k7rbz"] Feb 02 12:55:09 crc kubenswrapper[4909]: I0202 12:55:09.882007 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbz4r\" (UniqueName: \"kubernetes.io/projected/ccbc8f32-0825-4e6d-99b1-bb8088898d5a-kube-api-access-jbz4r\") pod \"community-operators-k7rbz\" (UID: \"ccbc8f32-0825-4e6d-99b1-bb8088898d5a\") " pod="openshift-marketplace/community-operators-k7rbz" Feb 02 12:55:09 crc kubenswrapper[4909]: I0202 12:55:09.882171 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccbc8f32-0825-4e6d-99b1-bb8088898d5a-utilities\") pod \"community-operators-k7rbz\" (UID: \"ccbc8f32-0825-4e6d-99b1-bb8088898d5a\") " pod="openshift-marketplace/community-operators-k7rbz" Feb 02 12:55:09 crc kubenswrapper[4909]: I0202 12:55:09.882322 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccbc8f32-0825-4e6d-99b1-bb8088898d5a-catalog-content\") pod \"community-operators-k7rbz\" (UID: \"ccbc8f32-0825-4e6d-99b1-bb8088898d5a\") " 
pod="openshift-marketplace/community-operators-k7rbz" Feb 02 12:55:09 crc kubenswrapper[4909]: I0202 12:55:09.984317 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbz4r\" (UniqueName: \"kubernetes.io/projected/ccbc8f32-0825-4e6d-99b1-bb8088898d5a-kube-api-access-jbz4r\") pod \"community-operators-k7rbz\" (UID: \"ccbc8f32-0825-4e6d-99b1-bb8088898d5a\") " pod="openshift-marketplace/community-operators-k7rbz" Feb 02 12:55:09 crc kubenswrapper[4909]: I0202 12:55:09.984415 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccbc8f32-0825-4e6d-99b1-bb8088898d5a-utilities\") pod \"community-operators-k7rbz\" (UID: \"ccbc8f32-0825-4e6d-99b1-bb8088898d5a\") " pod="openshift-marketplace/community-operators-k7rbz" Feb 02 12:55:09 crc kubenswrapper[4909]: I0202 12:55:09.984448 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccbc8f32-0825-4e6d-99b1-bb8088898d5a-catalog-content\") pod \"community-operators-k7rbz\" (UID: \"ccbc8f32-0825-4e6d-99b1-bb8088898d5a\") " pod="openshift-marketplace/community-operators-k7rbz" Feb 02 12:55:09 crc kubenswrapper[4909]: I0202 12:55:09.984830 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccbc8f32-0825-4e6d-99b1-bb8088898d5a-catalog-content\") pod \"community-operators-k7rbz\" (UID: \"ccbc8f32-0825-4e6d-99b1-bb8088898d5a\") " pod="openshift-marketplace/community-operators-k7rbz" Feb 02 12:55:09 crc kubenswrapper[4909]: I0202 12:55:09.985058 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccbc8f32-0825-4e6d-99b1-bb8088898d5a-utilities\") pod \"community-operators-k7rbz\" (UID: \"ccbc8f32-0825-4e6d-99b1-bb8088898d5a\") " 
pod="openshift-marketplace/community-operators-k7rbz" Feb 02 12:55:10 crc kubenswrapper[4909]: I0202 12:55:10.007896 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbz4r\" (UniqueName: \"kubernetes.io/projected/ccbc8f32-0825-4e6d-99b1-bb8088898d5a-kube-api-access-jbz4r\") pod \"community-operators-k7rbz\" (UID: \"ccbc8f32-0825-4e6d-99b1-bb8088898d5a\") " pod="openshift-marketplace/community-operators-k7rbz" Feb 02 12:55:10 crc kubenswrapper[4909]: I0202 12:55:10.170671 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k7rbz" Feb 02 12:55:10 crc kubenswrapper[4909]: I0202 12:55:10.355054 4909 generic.go:334] "Generic (PLEG): container finished" podID="5ee147f3-fc77-4226-a4c8-50ecc12fe936" containerID="781a2e83ef9999b118a4e8d53b9517b8f26721fb575296a35e90108873b85890" exitCode=0 Feb 02 12:55:10 crc kubenswrapper[4909]: I0202 12:55:10.355092 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-xn5dg" event={"ID":"5ee147f3-fc77-4226-a4c8-50ecc12fe936","Type":"ContainerDied","Data":"781a2e83ef9999b118a4e8d53b9517b8f26721fb575296a35e90108873b85890"} Feb 02 12:55:10 crc kubenswrapper[4909]: I0202 12:55:10.738279 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k7rbz"] Feb 02 12:55:11 crc kubenswrapper[4909]: I0202 12:55:11.370146 4909 generic.go:334] "Generic (PLEG): container finished" podID="ccbc8f32-0825-4e6d-99b1-bb8088898d5a" containerID="93089e447dd3b98eb6cac8c52c76cab065d3c5e9dc5da4401f4239e0b349f6fb" exitCode=0 Feb 02 12:55:11 crc kubenswrapper[4909]: I0202 12:55:11.370329 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7rbz" event={"ID":"ccbc8f32-0825-4e6d-99b1-bb8088898d5a","Type":"ContainerDied","Data":"93089e447dd3b98eb6cac8c52c76cab065d3c5e9dc5da4401f4239e0b349f6fb"} Feb 02 12:55:11 
crc kubenswrapper[4909]: I0202 12:55:11.370474 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7rbz" event={"ID":"ccbc8f32-0825-4e6d-99b1-bb8088898d5a","Type":"ContainerStarted","Data":"0e1576ea334dc99c72118334e499c967e7049789ba4ea6a74914d156df25102f"} Feb 02 12:55:11 crc kubenswrapper[4909]: I0202 12:55:11.897867 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-xn5dg" Feb 02 12:55:11 crc kubenswrapper[4909]: I0202 12:55:11.928313 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5ee147f3-fc77-4226-a4c8-50ecc12fe936-neutron-dhcp-agent-neutron-config-0\") pod \"5ee147f3-fc77-4226-a4c8-50ecc12fe936\" (UID: \"5ee147f3-fc77-4226-a4c8-50ecc12fe936\") " Feb 02 12:55:11 crc kubenswrapper[4909]: I0202 12:55:11.928387 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5ee147f3-fc77-4226-a4c8-50ecc12fe936-ssh-key-openstack-cell1\") pod \"5ee147f3-fc77-4226-a4c8-50ecc12fe936\" (UID: \"5ee147f3-fc77-4226-a4c8-50ecc12fe936\") " Feb 02 12:55:11 crc kubenswrapper[4909]: I0202 12:55:11.928482 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ee147f3-fc77-4226-a4c8-50ecc12fe936-neutron-dhcp-combined-ca-bundle\") pod \"5ee147f3-fc77-4226-a4c8-50ecc12fe936\" (UID: \"5ee147f3-fc77-4226-a4c8-50ecc12fe936\") " Feb 02 12:55:11 crc kubenswrapper[4909]: I0202 12:55:11.928569 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxmnb\" (UniqueName: \"kubernetes.io/projected/5ee147f3-fc77-4226-a4c8-50ecc12fe936-kube-api-access-kxmnb\") pod \"5ee147f3-fc77-4226-a4c8-50ecc12fe936\" (UID: 
\"5ee147f3-fc77-4226-a4c8-50ecc12fe936\") " Feb 02 12:55:11 crc kubenswrapper[4909]: I0202 12:55:11.928738 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ee147f3-fc77-4226-a4c8-50ecc12fe936-inventory\") pod \"5ee147f3-fc77-4226-a4c8-50ecc12fe936\" (UID: \"5ee147f3-fc77-4226-a4c8-50ecc12fe936\") " Feb 02 12:55:11 crc kubenswrapper[4909]: I0202 12:55:11.937421 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ee147f3-fc77-4226-a4c8-50ecc12fe936-kube-api-access-kxmnb" (OuterVolumeSpecName: "kube-api-access-kxmnb") pod "5ee147f3-fc77-4226-a4c8-50ecc12fe936" (UID: "5ee147f3-fc77-4226-a4c8-50ecc12fe936"). InnerVolumeSpecName "kube-api-access-kxmnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:55:11 crc kubenswrapper[4909]: I0202 12:55:11.953055 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ee147f3-fc77-4226-a4c8-50ecc12fe936-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "5ee147f3-fc77-4226-a4c8-50ecc12fe936" (UID: "5ee147f3-fc77-4226-a4c8-50ecc12fe936"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:55:11 crc kubenswrapper[4909]: I0202 12:55:11.976606 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ee147f3-fc77-4226-a4c8-50ecc12fe936-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "5ee147f3-fc77-4226-a4c8-50ecc12fe936" (UID: "5ee147f3-fc77-4226-a4c8-50ecc12fe936"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:55:11 crc kubenswrapper[4909]: I0202 12:55:11.987112 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ee147f3-fc77-4226-a4c8-50ecc12fe936-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "5ee147f3-fc77-4226-a4c8-50ecc12fe936" (UID: "5ee147f3-fc77-4226-a4c8-50ecc12fe936"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:55:11 crc kubenswrapper[4909]: I0202 12:55:11.999641 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ee147f3-fc77-4226-a4c8-50ecc12fe936-inventory" (OuterVolumeSpecName: "inventory") pod "5ee147f3-fc77-4226-a4c8-50ecc12fe936" (UID: "5ee147f3-fc77-4226-a4c8-50ecc12fe936"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:55:12 crc kubenswrapper[4909]: I0202 12:55:12.032852 4909 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5ee147f3-fc77-4226-a4c8-50ecc12fe936-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:12 crc kubenswrapper[4909]: I0202 12:55:12.033165 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5ee147f3-fc77-4226-a4c8-50ecc12fe936-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:12 crc kubenswrapper[4909]: I0202 12:55:12.033184 4909 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ee147f3-fc77-4226-a4c8-50ecc12fe936-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:12 crc kubenswrapper[4909]: I0202 12:55:12.033194 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxmnb\" (UniqueName: 
\"kubernetes.io/projected/5ee147f3-fc77-4226-a4c8-50ecc12fe936-kube-api-access-kxmnb\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:12 crc kubenswrapper[4909]: I0202 12:55:12.033224 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ee147f3-fc77-4226-a4c8-50ecc12fe936-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:12 crc kubenswrapper[4909]: I0202 12:55:12.401582 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-xn5dg" event={"ID":"5ee147f3-fc77-4226-a4c8-50ecc12fe936","Type":"ContainerDied","Data":"0976ca1340aa6df6f4a6d4394d1178fb83df64ea133cf4f4b79980d333d031b2"} Feb 02 12:55:12 crc kubenswrapper[4909]: I0202 12:55:12.402152 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0976ca1340aa6df6f4a6d4394d1178fb83df64ea133cf4f4b79980d333d031b2" Feb 02 12:55:12 crc kubenswrapper[4909]: I0202 12:55:12.402419 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-xn5dg" Feb 02 12:55:13 crc kubenswrapper[4909]: I0202 12:55:13.413004 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7rbz" event={"ID":"ccbc8f32-0825-4e6d-99b1-bb8088898d5a","Type":"ContainerStarted","Data":"5c7d2938a7e09f2b79b03b52cfbeeee482ae3bb78030ae7b2a9b5e16f53e4ad8"} Feb 02 12:55:14 crc kubenswrapper[4909]: I0202 12:55:14.422833 4909 generic.go:334] "Generic (PLEG): container finished" podID="ccbc8f32-0825-4e6d-99b1-bb8088898d5a" containerID="5c7d2938a7e09f2b79b03b52cfbeeee482ae3bb78030ae7b2a9b5e16f53e4ad8" exitCode=0 Feb 02 12:55:14 crc kubenswrapper[4909]: I0202 12:55:14.422884 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7rbz" event={"ID":"ccbc8f32-0825-4e6d-99b1-bb8088898d5a","Type":"ContainerDied","Data":"5c7d2938a7e09f2b79b03b52cfbeeee482ae3bb78030ae7b2a9b5e16f53e4ad8"} Feb 02 12:55:15 crc kubenswrapper[4909]: I0202 12:55:15.434413 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7rbz" event={"ID":"ccbc8f32-0825-4e6d-99b1-bb8088898d5a","Type":"ContainerStarted","Data":"c939a0bbae29a9d2cdd2ba4be66356695d29b43d1ba03217d134b4c7933341a7"} Feb 02 12:55:15 crc kubenswrapper[4909]: I0202 12:55:15.457626 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k7rbz" podStartSLOduration=3.017913305 podStartE2EDuration="6.457605046s" podCreationTimestamp="2026-02-02 12:55:09 +0000 UTC" firstStartedPulling="2026-02-02 12:55:11.372351531 +0000 UTC m=+8637.118452256" lastFinishedPulling="2026-02-02 12:55:14.812043272 +0000 UTC m=+8640.558143997" observedRunningTime="2026-02-02 12:55:15.455455065 +0000 UTC m=+8641.201555800" watchObservedRunningTime="2026-02-02 12:55:15.457605046 +0000 UTC m=+8641.203705781" Feb 02 12:55:20 crc kubenswrapper[4909]: I0202 
12:55:20.016987 4909 scope.go:117] "RemoveContainer" containerID="3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b" Feb 02 12:55:20 crc kubenswrapper[4909]: E0202 12:55:20.017697 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:55:20 crc kubenswrapper[4909]: I0202 12:55:20.171785 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k7rbz" Feb 02 12:55:20 crc kubenswrapper[4909]: I0202 12:55:20.173081 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k7rbz" Feb 02 12:55:20 crc kubenswrapper[4909]: I0202 12:55:20.222136 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k7rbz" Feb 02 12:55:20 crc kubenswrapper[4909]: I0202 12:55:20.543139 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k7rbz" Feb 02 12:55:27 crc kubenswrapper[4909]: I0202 12:55:27.587444 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k7rbz"] Feb 02 12:55:27 crc kubenswrapper[4909]: I0202 12:55:27.588185 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k7rbz" podUID="ccbc8f32-0825-4e6d-99b1-bb8088898d5a" containerName="registry-server" containerID="cri-o://c939a0bbae29a9d2cdd2ba4be66356695d29b43d1ba03217d134b4c7933341a7" gracePeriod=2 Feb 02 12:55:28 crc kubenswrapper[4909]: I0202 12:55:28.147679 4909 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k7rbz" Feb 02 12:55:28 crc kubenswrapper[4909]: I0202 12:55:28.287517 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccbc8f32-0825-4e6d-99b1-bb8088898d5a-catalog-content\") pod \"ccbc8f32-0825-4e6d-99b1-bb8088898d5a\" (UID: \"ccbc8f32-0825-4e6d-99b1-bb8088898d5a\") " Feb 02 12:55:28 crc kubenswrapper[4909]: I0202 12:55:28.287940 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccbc8f32-0825-4e6d-99b1-bb8088898d5a-utilities\") pod \"ccbc8f32-0825-4e6d-99b1-bb8088898d5a\" (UID: \"ccbc8f32-0825-4e6d-99b1-bb8088898d5a\") " Feb 02 12:55:28 crc kubenswrapper[4909]: I0202 12:55:28.288152 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbz4r\" (UniqueName: \"kubernetes.io/projected/ccbc8f32-0825-4e6d-99b1-bb8088898d5a-kube-api-access-jbz4r\") pod \"ccbc8f32-0825-4e6d-99b1-bb8088898d5a\" (UID: \"ccbc8f32-0825-4e6d-99b1-bb8088898d5a\") " Feb 02 12:55:28 crc kubenswrapper[4909]: I0202 12:55:28.288763 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccbc8f32-0825-4e6d-99b1-bb8088898d5a-utilities" (OuterVolumeSpecName: "utilities") pod "ccbc8f32-0825-4e6d-99b1-bb8088898d5a" (UID: "ccbc8f32-0825-4e6d-99b1-bb8088898d5a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:55:28 crc kubenswrapper[4909]: I0202 12:55:28.289142 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccbc8f32-0825-4e6d-99b1-bb8088898d5a-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:28 crc kubenswrapper[4909]: I0202 12:55:28.297528 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccbc8f32-0825-4e6d-99b1-bb8088898d5a-kube-api-access-jbz4r" (OuterVolumeSpecName: "kube-api-access-jbz4r") pod "ccbc8f32-0825-4e6d-99b1-bb8088898d5a" (UID: "ccbc8f32-0825-4e6d-99b1-bb8088898d5a"). InnerVolumeSpecName "kube-api-access-jbz4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:55:28 crc kubenswrapper[4909]: I0202 12:55:28.348656 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccbc8f32-0825-4e6d-99b1-bb8088898d5a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ccbc8f32-0825-4e6d-99b1-bb8088898d5a" (UID: "ccbc8f32-0825-4e6d-99b1-bb8088898d5a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:55:28 crc kubenswrapper[4909]: I0202 12:55:28.391342 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbz4r\" (UniqueName: \"kubernetes.io/projected/ccbc8f32-0825-4e6d-99b1-bb8088898d5a-kube-api-access-jbz4r\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:28 crc kubenswrapper[4909]: I0202 12:55:28.391380 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccbc8f32-0825-4e6d-99b1-bb8088898d5a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:28 crc kubenswrapper[4909]: I0202 12:55:28.555036 4909 generic.go:334] "Generic (PLEG): container finished" podID="ccbc8f32-0825-4e6d-99b1-bb8088898d5a" containerID="c939a0bbae29a9d2cdd2ba4be66356695d29b43d1ba03217d134b4c7933341a7" exitCode=0 Feb 02 12:55:28 crc kubenswrapper[4909]: I0202 12:55:28.555089 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k7rbz" Feb 02 12:55:28 crc kubenswrapper[4909]: I0202 12:55:28.555087 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7rbz" event={"ID":"ccbc8f32-0825-4e6d-99b1-bb8088898d5a","Type":"ContainerDied","Data":"c939a0bbae29a9d2cdd2ba4be66356695d29b43d1ba03217d134b4c7933341a7"} Feb 02 12:55:28 crc kubenswrapper[4909]: I0202 12:55:28.555214 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7rbz" event={"ID":"ccbc8f32-0825-4e6d-99b1-bb8088898d5a","Type":"ContainerDied","Data":"0e1576ea334dc99c72118334e499c967e7049789ba4ea6a74914d156df25102f"} Feb 02 12:55:28 crc kubenswrapper[4909]: I0202 12:55:28.555244 4909 scope.go:117] "RemoveContainer" containerID="c939a0bbae29a9d2cdd2ba4be66356695d29b43d1ba03217d134b4c7933341a7" Feb 02 12:55:28 crc kubenswrapper[4909]: I0202 12:55:28.577453 4909 scope.go:117] "RemoveContainer" 
containerID="5c7d2938a7e09f2b79b03b52cfbeeee482ae3bb78030ae7b2a9b5e16f53e4ad8" Feb 02 12:55:28 crc kubenswrapper[4909]: I0202 12:55:28.589080 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k7rbz"] Feb 02 12:55:28 crc kubenswrapper[4909]: I0202 12:55:28.599237 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k7rbz"] Feb 02 12:55:28 crc kubenswrapper[4909]: I0202 12:55:28.612552 4909 scope.go:117] "RemoveContainer" containerID="93089e447dd3b98eb6cac8c52c76cab065d3c5e9dc5da4401f4239e0b349f6fb" Feb 02 12:55:28 crc kubenswrapper[4909]: I0202 12:55:28.661364 4909 scope.go:117] "RemoveContainer" containerID="c939a0bbae29a9d2cdd2ba4be66356695d29b43d1ba03217d134b4c7933341a7" Feb 02 12:55:28 crc kubenswrapper[4909]: E0202 12:55:28.661886 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c939a0bbae29a9d2cdd2ba4be66356695d29b43d1ba03217d134b4c7933341a7\": container with ID starting with c939a0bbae29a9d2cdd2ba4be66356695d29b43d1ba03217d134b4c7933341a7 not found: ID does not exist" containerID="c939a0bbae29a9d2cdd2ba4be66356695d29b43d1ba03217d134b4c7933341a7" Feb 02 12:55:28 crc kubenswrapper[4909]: I0202 12:55:28.661932 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c939a0bbae29a9d2cdd2ba4be66356695d29b43d1ba03217d134b4c7933341a7"} err="failed to get container status \"c939a0bbae29a9d2cdd2ba4be66356695d29b43d1ba03217d134b4c7933341a7\": rpc error: code = NotFound desc = could not find container \"c939a0bbae29a9d2cdd2ba4be66356695d29b43d1ba03217d134b4c7933341a7\": container with ID starting with c939a0bbae29a9d2cdd2ba4be66356695d29b43d1ba03217d134b4c7933341a7 not found: ID does not exist" Feb 02 12:55:28 crc kubenswrapper[4909]: I0202 12:55:28.661960 4909 scope.go:117] "RemoveContainer" 
containerID="5c7d2938a7e09f2b79b03b52cfbeeee482ae3bb78030ae7b2a9b5e16f53e4ad8" Feb 02 12:55:28 crc kubenswrapper[4909]: E0202 12:55:28.662223 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c7d2938a7e09f2b79b03b52cfbeeee482ae3bb78030ae7b2a9b5e16f53e4ad8\": container with ID starting with 5c7d2938a7e09f2b79b03b52cfbeeee482ae3bb78030ae7b2a9b5e16f53e4ad8 not found: ID does not exist" containerID="5c7d2938a7e09f2b79b03b52cfbeeee482ae3bb78030ae7b2a9b5e16f53e4ad8" Feb 02 12:55:28 crc kubenswrapper[4909]: I0202 12:55:28.662262 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c7d2938a7e09f2b79b03b52cfbeeee482ae3bb78030ae7b2a9b5e16f53e4ad8"} err="failed to get container status \"5c7d2938a7e09f2b79b03b52cfbeeee482ae3bb78030ae7b2a9b5e16f53e4ad8\": rpc error: code = NotFound desc = could not find container \"5c7d2938a7e09f2b79b03b52cfbeeee482ae3bb78030ae7b2a9b5e16f53e4ad8\": container with ID starting with 5c7d2938a7e09f2b79b03b52cfbeeee482ae3bb78030ae7b2a9b5e16f53e4ad8 not found: ID does not exist" Feb 02 12:55:28 crc kubenswrapper[4909]: I0202 12:55:28.662283 4909 scope.go:117] "RemoveContainer" containerID="93089e447dd3b98eb6cac8c52c76cab065d3c5e9dc5da4401f4239e0b349f6fb" Feb 02 12:55:28 crc kubenswrapper[4909]: E0202 12:55:28.662608 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93089e447dd3b98eb6cac8c52c76cab065d3c5e9dc5da4401f4239e0b349f6fb\": container with ID starting with 93089e447dd3b98eb6cac8c52c76cab065d3c5e9dc5da4401f4239e0b349f6fb not found: ID does not exist" containerID="93089e447dd3b98eb6cac8c52c76cab065d3c5e9dc5da4401f4239e0b349f6fb" Feb 02 12:55:28 crc kubenswrapper[4909]: I0202 12:55:28.662645 4909 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"93089e447dd3b98eb6cac8c52c76cab065d3c5e9dc5da4401f4239e0b349f6fb"} err="failed to get container status \"93089e447dd3b98eb6cac8c52c76cab065d3c5e9dc5da4401f4239e0b349f6fb\": rpc error: code = NotFound desc = could not find container \"93089e447dd3b98eb6cac8c52c76cab065d3c5e9dc5da4401f4239e0b349f6fb\": container with ID starting with 93089e447dd3b98eb6cac8c52c76cab065d3c5e9dc5da4401f4239e0b349f6fb not found: ID does not exist" Feb 02 12:55:29 crc kubenswrapper[4909]: I0202 12:55:29.027092 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccbc8f32-0825-4e6d-99b1-bb8088898d5a" path="/var/lib/kubelet/pods/ccbc8f32-0825-4e6d-99b1-bb8088898d5a/volumes" Feb 02 12:55:30 crc kubenswrapper[4909]: I0202 12:55:30.391075 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4xsr9"] Feb 02 12:55:30 crc kubenswrapper[4909]: E0202 12:55:30.391919 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ee147f3-fc77-4226-a4c8-50ecc12fe936" containerName="neutron-dhcp-openstack-openstack-cell1" Feb 02 12:55:30 crc kubenswrapper[4909]: I0202 12:55:30.391938 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ee147f3-fc77-4226-a4c8-50ecc12fe936" containerName="neutron-dhcp-openstack-openstack-cell1" Feb 02 12:55:30 crc kubenswrapper[4909]: E0202 12:55:30.391963 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccbc8f32-0825-4e6d-99b1-bb8088898d5a" containerName="extract-content" Feb 02 12:55:30 crc kubenswrapper[4909]: I0202 12:55:30.391971 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccbc8f32-0825-4e6d-99b1-bb8088898d5a" containerName="extract-content" Feb 02 12:55:30 crc kubenswrapper[4909]: E0202 12:55:30.391985 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccbc8f32-0825-4e6d-99b1-bb8088898d5a" containerName="extract-utilities" Feb 02 12:55:30 crc kubenswrapper[4909]: I0202 12:55:30.391993 4909 
state_mem.go:107] "Deleted CPUSet assignment" podUID="ccbc8f32-0825-4e6d-99b1-bb8088898d5a" containerName="extract-utilities" Feb 02 12:55:30 crc kubenswrapper[4909]: E0202 12:55:30.392007 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccbc8f32-0825-4e6d-99b1-bb8088898d5a" containerName="registry-server" Feb 02 12:55:30 crc kubenswrapper[4909]: I0202 12:55:30.392014 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccbc8f32-0825-4e6d-99b1-bb8088898d5a" containerName="registry-server" Feb 02 12:55:30 crc kubenswrapper[4909]: I0202 12:55:30.392270 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccbc8f32-0825-4e6d-99b1-bb8088898d5a" containerName="registry-server" Feb 02 12:55:30 crc kubenswrapper[4909]: I0202 12:55:30.392300 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ee147f3-fc77-4226-a4c8-50ecc12fe936" containerName="neutron-dhcp-openstack-openstack-cell1" Feb 02 12:55:30 crc kubenswrapper[4909]: I0202 12:55:30.394212 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4xsr9" Feb 02 12:55:30 crc kubenswrapper[4909]: I0202 12:55:30.403748 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4xsr9"] Feb 02 12:55:30 crc kubenswrapper[4909]: I0202 12:55:30.538098 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srmqb\" (UniqueName: \"kubernetes.io/projected/2e162b98-0b57-4fd9-ace0-8cc598218f33-kube-api-access-srmqb\") pod \"certified-operators-4xsr9\" (UID: \"2e162b98-0b57-4fd9-ace0-8cc598218f33\") " pod="openshift-marketplace/certified-operators-4xsr9" Feb 02 12:55:30 crc kubenswrapper[4909]: I0202 12:55:30.538185 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e162b98-0b57-4fd9-ace0-8cc598218f33-utilities\") pod \"certified-operators-4xsr9\" (UID: \"2e162b98-0b57-4fd9-ace0-8cc598218f33\") " pod="openshift-marketplace/certified-operators-4xsr9" Feb 02 12:55:30 crc kubenswrapper[4909]: I0202 12:55:30.538242 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e162b98-0b57-4fd9-ace0-8cc598218f33-catalog-content\") pod \"certified-operators-4xsr9\" (UID: \"2e162b98-0b57-4fd9-ace0-8cc598218f33\") " pod="openshift-marketplace/certified-operators-4xsr9" Feb 02 12:55:30 crc kubenswrapper[4909]: I0202 12:55:30.640411 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srmqb\" (UniqueName: \"kubernetes.io/projected/2e162b98-0b57-4fd9-ace0-8cc598218f33-kube-api-access-srmqb\") pod \"certified-operators-4xsr9\" (UID: \"2e162b98-0b57-4fd9-ace0-8cc598218f33\") " pod="openshift-marketplace/certified-operators-4xsr9" Feb 02 12:55:30 crc kubenswrapper[4909]: I0202 12:55:30.640472 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e162b98-0b57-4fd9-ace0-8cc598218f33-utilities\") pod \"certified-operators-4xsr9\" (UID: \"2e162b98-0b57-4fd9-ace0-8cc598218f33\") " pod="openshift-marketplace/certified-operators-4xsr9" Feb 02 12:55:30 crc kubenswrapper[4909]: I0202 12:55:30.640519 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e162b98-0b57-4fd9-ace0-8cc598218f33-catalog-content\") pod \"certified-operators-4xsr9\" (UID: \"2e162b98-0b57-4fd9-ace0-8cc598218f33\") " pod="openshift-marketplace/certified-operators-4xsr9" Feb 02 12:55:30 crc kubenswrapper[4909]: I0202 12:55:30.641124 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e162b98-0b57-4fd9-ace0-8cc598218f33-utilities\") pod \"certified-operators-4xsr9\" (UID: \"2e162b98-0b57-4fd9-ace0-8cc598218f33\") " pod="openshift-marketplace/certified-operators-4xsr9" Feb 02 12:55:30 crc kubenswrapper[4909]: I0202 12:55:30.641132 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e162b98-0b57-4fd9-ace0-8cc598218f33-catalog-content\") pod \"certified-operators-4xsr9\" (UID: \"2e162b98-0b57-4fd9-ace0-8cc598218f33\") " pod="openshift-marketplace/certified-operators-4xsr9" Feb 02 12:55:30 crc kubenswrapper[4909]: I0202 12:55:30.660152 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srmqb\" (UniqueName: \"kubernetes.io/projected/2e162b98-0b57-4fd9-ace0-8cc598218f33-kube-api-access-srmqb\") pod \"certified-operators-4xsr9\" (UID: \"2e162b98-0b57-4fd9-ace0-8cc598218f33\") " pod="openshift-marketplace/certified-operators-4xsr9" Feb 02 12:55:30 crc kubenswrapper[4909]: I0202 12:55:30.746952 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4xsr9" Feb 02 12:55:31 crc kubenswrapper[4909]: I0202 12:55:31.331532 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4xsr9"] Feb 02 12:55:31 crc kubenswrapper[4909]: I0202 12:55:31.602579 4909 generic.go:334] "Generic (PLEG): container finished" podID="2e162b98-0b57-4fd9-ace0-8cc598218f33" containerID="817e88ca7798c58e61f0af1d27149cee815fb16fe5a33e0c83aa0d2e653f5c01" exitCode=0 Feb 02 12:55:31 crc kubenswrapper[4909]: I0202 12:55:31.602641 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xsr9" event={"ID":"2e162b98-0b57-4fd9-ace0-8cc598218f33","Type":"ContainerDied","Data":"817e88ca7798c58e61f0af1d27149cee815fb16fe5a33e0c83aa0d2e653f5c01"} Feb 02 12:55:31 crc kubenswrapper[4909]: I0202 12:55:31.602708 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xsr9" event={"ID":"2e162b98-0b57-4fd9-ace0-8cc598218f33","Type":"ContainerStarted","Data":"4e1b1f28d3ead6aa13d3c00965526be53be35433aad68ec7774795091a699fd3"} Feb 02 12:55:32 crc kubenswrapper[4909]: I0202 12:55:32.618123 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xsr9" event={"ID":"2e162b98-0b57-4fd9-ace0-8cc598218f33","Type":"ContainerStarted","Data":"833acedab632fca0207b96ff6717d273b19b899f49b5df02933e978de67f1f1a"} Feb 02 12:55:33 crc kubenswrapper[4909]: I0202 12:55:33.016781 4909 scope.go:117] "RemoveContainer" containerID="3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b" Feb 02 12:55:33 crc kubenswrapper[4909]: E0202 12:55:33.017089 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:55:33 crc kubenswrapper[4909]: I0202 12:55:33.630405 4909 generic.go:334] "Generic (PLEG): container finished" podID="2e162b98-0b57-4fd9-ace0-8cc598218f33" containerID="833acedab632fca0207b96ff6717d273b19b899f49b5df02933e978de67f1f1a" exitCode=0 Feb 02 12:55:33 crc kubenswrapper[4909]: I0202 12:55:33.630503 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xsr9" event={"ID":"2e162b98-0b57-4fd9-ace0-8cc598218f33","Type":"ContainerDied","Data":"833acedab632fca0207b96ff6717d273b19b899f49b5df02933e978de67f1f1a"} Feb 02 12:55:34 crc kubenswrapper[4909]: I0202 12:55:34.647041 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xsr9" event={"ID":"2e162b98-0b57-4fd9-ace0-8cc598218f33","Type":"ContainerStarted","Data":"3548718f97443d1c4a1bb4bf20cc6650b7f9107019689aed104b527f94a6051a"} Feb 02 12:55:34 crc kubenswrapper[4909]: I0202 12:55:34.673113 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4xsr9" podStartSLOduration=2.1234638 podStartE2EDuration="4.673095235s" podCreationTimestamp="2026-02-02 12:55:30 +0000 UTC" firstStartedPulling="2026-02-02 12:55:31.604195003 +0000 UTC m=+8657.350295738" lastFinishedPulling="2026-02-02 12:55:34.153826428 +0000 UTC m=+8659.899927173" observedRunningTime="2026-02-02 12:55:34.671712245 +0000 UTC m=+8660.417812980" watchObservedRunningTime="2026-02-02 12:55:34.673095235 +0000 UTC m=+8660.419195970" Feb 02 12:55:34 crc kubenswrapper[4909]: I0202 12:55:34.936408 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 02 12:55:34 crc kubenswrapper[4909]: I0202 12:55:34.936613 4909 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="d686e2d0-26e4-43fc-9e98-36226276b450" containerName="nova-cell0-conductor-conductor" containerID="cri-o://5d6a983ac213af33ce5f76d9939df082c1493a1a5e3016bfcd43a001501e8bff" gracePeriod=30 Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.466201 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.466398 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="41dd6b28-b792-4d80-9cd2-3ab8f738be53" containerName="nova-cell1-conductor-conductor" containerID="cri-o://0d1ec12c1908b0801c016824cd56b0a335d90c6309de5877b5eb35631b10f47f" gracePeriod=30 Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.633849 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.634334 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244" containerName="nova-api-log" containerID="cri-o://62cbb60193b939a1be8159f1f192aad858644094fda7698c64377c032d79a9b5" gracePeriod=30 Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.634430 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244" containerName="nova-api-api" containerID="cri-o://f40794129432336f08a314a7f97963bb5b13e8594e67bc11cc45f18e1bb9f82e" gracePeriod=30 Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.653612 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.653878 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" 
podUID="01d1231b-15a9-4876-9a03-1c8963164da0" containerName="nova-scheduler-scheduler" containerID="cri-o://a931152f73f5539610a817207753eee16dbb6c9eccc24e6035cbf87272d91bb3" gracePeriod=30 Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.688499 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.688736 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64" containerName="nova-metadata-log" containerID="cri-o://367124b20b45d240fd18ce0f053f4b7619daca5fc2fde669f570bda3d8950cad" gracePeriod=30 Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.688833 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64" containerName="nova-metadata-metadata" containerID="cri-o://65783f58d87d0a0e4ff9166650bb11a84b934388a454d3a42bc6b55f6bd48f51" gracePeriod=30 Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.760594 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm"] Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.762264 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.765924 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-lfs7z" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.769929 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.770068 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.770233 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.770363 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.770454 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.771001 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.785118 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm"] Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.864960 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.864996 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.865033 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.865055 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.865094 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmx2l\" (UniqueName: \"kubernetes.io/projected/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-kube-api-access-xmx2l\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.865120 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.865152 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.865196 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.865299 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.967700 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.967801 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.967840 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.967875 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" 
Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.967896 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.967935 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmx2l\" (UniqueName: \"kubernetes.io/projected/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-kube-api-access-xmx2l\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.967965 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.968001 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.968044 4909 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.969375 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.976014 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.976167 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.976179 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.976524 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.976947 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.990218 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.991405 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmx2l\" (UniqueName: \"kubernetes.io/projected/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-kube-api-access-xmx2l\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:55:35 crc kubenswrapper[4909]: I0202 12:55:35.991497 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:55:36 crc kubenswrapper[4909]: I0202 12:55:36.162169 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:55:36 crc kubenswrapper[4909]: E0202 12:55:36.492546 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a931152f73f5539610a817207753eee16dbb6c9eccc24e6035cbf87272d91bb3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 02 12:55:36 crc kubenswrapper[4909]: E0202 12:55:36.495722 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a931152f73f5539610a817207753eee16dbb6c9eccc24e6035cbf87272d91bb3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 02 12:55:36 crc kubenswrapper[4909]: E0202 12:55:36.497982 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="a931152f73f5539610a817207753eee16dbb6c9eccc24e6035cbf87272d91bb3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 02 12:55:36 crc kubenswrapper[4909]: E0202 12:55:36.498053 4909 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="01d1231b-15a9-4876-9a03-1c8963164da0" containerName="nova-scheduler-scheduler" Feb 02 12:55:36 crc kubenswrapper[4909]: I0202 12:55:36.668321 4909 generic.go:334] "Generic (PLEG): container finished" podID="4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244" containerID="62cbb60193b939a1be8159f1f192aad858644094fda7698c64377c032d79a9b5" exitCode=143 Feb 02 12:55:36 crc kubenswrapper[4909]: I0202 12:55:36.668408 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244","Type":"ContainerDied","Data":"62cbb60193b939a1be8159f1f192aad858644094fda7698c64377c032d79a9b5"} Feb 02 12:55:36 crc kubenswrapper[4909]: I0202 12:55:36.671882 4909 generic.go:334] "Generic (PLEG): container finished" podID="8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64" containerID="367124b20b45d240fd18ce0f053f4b7619daca5fc2fde669f570bda3d8950cad" exitCode=143 Feb 02 12:55:36 crc kubenswrapper[4909]: I0202 12:55:36.671921 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64","Type":"ContainerDied","Data":"367124b20b45d240fd18ce0f053f4b7619daca5fc2fde669f570bda3d8950cad"} Feb 02 12:55:36 crc kubenswrapper[4909]: I0202 12:55:36.814959 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm"] Feb 02 12:55:37 crc kubenswrapper[4909]: I0202 12:55:37.699022 4909 generic.go:334] "Generic (PLEG): container finished" podID="d686e2d0-26e4-43fc-9e98-36226276b450" 
containerID="5d6a983ac213af33ce5f76d9939df082c1493a1a5e3016bfcd43a001501e8bff" exitCode=0 Feb 02 12:55:37 crc kubenswrapper[4909]: I0202 12:55:37.699399 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d686e2d0-26e4-43fc-9e98-36226276b450","Type":"ContainerDied","Data":"5d6a983ac213af33ce5f76d9939df082c1493a1a5e3016bfcd43a001501e8bff"} Feb 02 12:55:37 crc kubenswrapper[4909]: I0202 12:55:37.706971 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" event={"ID":"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926","Type":"ContainerStarted","Data":"86432de1e2a5a002c8402d1513bd8d305de05fc2b935a455aabcc34994df48fa"} Feb 02 12:55:38 crc kubenswrapper[4909]: E0202 12:55:38.077759 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0d1ec12c1908b0801c016824cd56b0a335d90c6309de5877b5eb35631b10f47f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 02 12:55:38 crc kubenswrapper[4909]: E0202 12:55:38.079393 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0d1ec12c1908b0801c016824cd56b0a335d90c6309de5877b5eb35631b10f47f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 02 12:55:38 crc kubenswrapper[4909]: E0202 12:55:38.080719 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0d1ec12c1908b0801c016824cd56b0a335d90c6309de5877b5eb35631b10f47f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 02 12:55:38 crc kubenswrapper[4909]: E0202 12:55:38.080770 4909 
prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="41dd6b28-b792-4d80-9cd2-3ab8f738be53" containerName="nova-cell1-conductor-conductor" Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.499881 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.524715 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d686e2d0-26e4-43fc-9e98-36226276b450-config-data\") pod \"d686e2d0-26e4-43fc-9e98-36226276b450\" (UID: \"d686e2d0-26e4-43fc-9e98-36226276b450\") " Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.524841 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dljv9\" (UniqueName: \"kubernetes.io/projected/d686e2d0-26e4-43fc-9e98-36226276b450-kube-api-access-dljv9\") pod \"d686e2d0-26e4-43fc-9e98-36226276b450\" (UID: \"d686e2d0-26e4-43fc-9e98-36226276b450\") " Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.524980 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d686e2d0-26e4-43fc-9e98-36226276b450-combined-ca-bundle\") pod \"d686e2d0-26e4-43fc-9e98-36226276b450\" (UID: \"d686e2d0-26e4-43fc-9e98-36226276b450\") " Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.533098 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d686e2d0-26e4-43fc-9e98-36226276b450-kube-api-access-dljv9" (OuterVolumeSpecName: "kube-api-access-dljv9") pod "d686e2d0-26e4-43fc-9e98-36226276b450" (UID: "d686e2d0-26e4-43fc-9e98-36226276b450"). InnerVolumeSpecName "kube-api-access-dljv9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.572548 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d686e2d0-26e4-43fc-9e98-36226276b450-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d686e2d0-26e4-43fc-9e98-36226276b450" (UID: "d686e2d0-26e4-43fc-9e98-36226276b450"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.575651 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d686e2d0-26e4-43fc-9e98-36226276b450-config-data" (OuterVolumeSpecName: "config-data") pod "d686e2d0-26e4-43fc-9e98-36226276b450" (UID: "d686e2d0-26e4-43fc-9e98-36226276b450"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.627666 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d686e2d0-26e4-43fc-9e98-36226276b450-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.627699 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dljv9\" (UniqueName: \"kubernetes.io/projected/d686e2d0-26e4-43fc-9e98-36226276b450-kube-api-access-dljv9\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.627730 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d686e2d0-26e4-43fc-9e98-36226276b450-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.717092 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" 
event={"ID":"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926","Type":"ContainerStarted","Data":"4dc4cfbedb16ef5ef5002c0803a869aaab80ef02b4e2a755ccdeb82bca2f9609"} Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.719769 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d686e2d0-26e4-43fc-9e98-36226276b450","Type":"ContainerDied","Data":"1f0b7de235ecef1b5a04ec9d79b3e40d7da93417b36fff7fd0b7c9fde3c57bf7"} Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.719826 4909 scope.go:117] "RemoveContainer" containerID="5d6a983ac213af33ce5f76d9939df082c1493a1a5e3016bfcd43a001501e8bff" Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.719842 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.753276 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" podStartSLOduration=3.349600581 podStartE2EDuration="3.753255044s" podCreationTimestamp="2026-02-02 12:55:35 +0000 UTC" firstStartedPulling="2026-02-02 12:55:36.828246847 +0000 UTC m=+8662.574347582" lastFinishedPulling="2026-02-02 12:55:37.23190131 +0000 UTC m=+8662.978002045" observedRunningTime="2026-02-02 12:55:38.739505474 +0000 UTC m=+8664.485606229" watchObservedRunningTime="2026-02-02 12:55:38.753255044 +0000 UTC m=+8664.499355779" Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.771850 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.785125 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.811884 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 02 12:55:38 crc kubenswrapper[4909]: E0202 
12:55:38.812396 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d686e2d0-26e4-43fc-9e98-36226276b450" containerName="nova-cell0-conductor-conductor" Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.812414 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d686e2d0-26e4-43fc-9e98-36226276b450" containerName="nova-cell0-conductor-conductor" Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.812623 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d686e2d0-26e4-43fc-9e98-36226276b450" containerName="nova-cell0-conductor-conductor" Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.813561 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.815737 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.816428 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.841392 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a7f0b1-11b1-405e-97ba-0bacc251ef8e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"17a7f0b1-11b1-405e-97ba-0bacc251ef8e\") " pod="openstack/nova-cell0-conductor-0" Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.841474 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6k7b\" (UniqueName: \"kubernetes.io/projected/17a7f0b1-11b1-405e-97ba-0bacc251ef8e-kube-api-access-m6k7b\") pod \"nova-cell0-conductor-0\" (UID: \"17a7f0b1-11b1-405e-97ba-0bacc251ef8e\") " pod="openstack/nova-cell0-conductor-0" Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.841602 4909 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a7f0b1-11b1-405e-97ba-0bacc251ef8e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"17a7f0b1-11b1-405e-97ba-0bacc251ef8e\") " pod="openstack/nova-cell0-conductor-0" Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.943100 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17a7f0b1-11b1-405e-97ba-0bacc251ef8e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"17a7f0b1-11b1-405e-97ba-0bacc251ef8e\") " pod="openstack/nova-cell0-conductor-0" Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.943507 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a7f0b1-11b1-405e-97ba-0bacc251ef8e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"17a7f0b1-11b1-405e-97ba-0bacc251ef8e\") " pod="openstack/nova-cell0-conductor-0" Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.943573 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6k7b\" (UniqueName: \"kubernetes.io/projected/17a7f0b1-11b1-405e-97ba-0bacc251ef8e-kube-api-access-m6k7b\") pod \"nova-cell0-conductor-0\" (UID: \"17a7f0b1-11b1-405e-97ba-0bacc251ef8e\") " pod="openstack/nova-cell0-conductor-0" Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.947100 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17a7f0b1-11b1-405e-97ba-0bacc251ef8e-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"17a7f0b1-11b1-405e-97ba-0bacc251ef8e\") " pod="openstack/nova-cell0-conductor-0" Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.948196 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/17a7f0b1-11b1-405e-97ba-0bacc251ef8e-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"17a7f0b1-11b1-405e-97ba-0bacc251ef8e\") " pod="openstack/nova-cell0-conductor-0" Feb 02 12:55:38 crc kubenswrapper[4909]: I0202 12:55:38.959825 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6k7b\" (UniqueName: \"kubernetes.io/projected/17a7f0b1-11b1-405e-97ba-0bacc251ef8e-kube-api-access-m6k7b\") pod \"nova-cell0-conductor-0\" (UID: \"17a7f0b1-11b1-405e-97ba-0bacc251ef8e\") " pod="openstack/nova-cell0-conductor-0" Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.040498 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d686e2d0-26e4-43fc-9e98-36226276b450" path="/var/lib/kubelet/pods/d686e2d0-26e4-43fc-9e98-36226276b450/volumes" Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.129721 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.260619 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.352045 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-logs\") pod \"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64\" (UID: \"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64\") " Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.352652 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbhfs\" (UniqueName: \"kubernetes.io/projected/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-kube-api-access-pbhfs\") pod \"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64\" (UID: \"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64\") " Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.352739 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-combined-ca-bundle\") pod \"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64\" (UID: \"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64\") " Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.352893 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-nova-metadata-tls-certs\") pod \"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64\" (UID: \"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64\") " Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.353022 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-config-data\") pod \"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64\" (UID: \"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64\") " Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.354347 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-logs" (OuterVolumeSpecName: "logs") pod "8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64" (UID: "8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.370098 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-kube-api-access-pbhfs" (OuterVolumeSpecName: "kube-api-access-pbhfs") pod "8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64" (UID: "8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64"). InnerVolumeSpecName "kube-api-access-pbhfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.391489 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64" (UID: "8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.421326 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-config-data" (OuterVolumeSpecName: "config-data") pod "8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64" (UID: "8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.467384 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.467425 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-logs\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.467440 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbhfs\" (UniqueName: \"kubernetes.io/projected/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-kube-api-access-pbhfs\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.467452 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.469140 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64" (UID: "8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.568993 4909 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.730714 4909 generic.go:334] "Generic (PLEG): container finished" podID="4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244" containerID="f40794129432336f08a314a7f97963bb5b13e8594e67bc11cc45f18e1bb9f82e" exitCode=0 Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.730931 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244","Type":"ContainerDied","Data":"f40794129432336f08a314a7f97963bb5b13e8594e67bc11cc45f18e1bb9f82e"} Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.736744 4909 generic.go:334] "Generic (PLEG): container finished" podID="8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64" containerID="65783f58d87d0a0e4ff9166650bb11a84b934388a454d3a42bc6b55f6bd48f51" exitCode=0 Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.738504 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.759980 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64","Type":"ContainerDied","Data":"65783f58d87d0a0e4ff9166650bb11a84b934388a454d3a42bc6b55f6bd48f51"} Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.760077 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64","Type":"ContainerDied","Data":"266c8e5a1e11e509a08531cb9cb8ac85e7c4083d16ce6c004b1a327bcbf5fddb"} Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.760137 4909 scope.go:117] "RemoveContainer" containerID="65783f58d87d0a0e4ff9166650bb11a84b934388a454d3a42bc6b55f6bd48f51" Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.766578 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.839870 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.867455 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.877302 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 02 12:55:39 crc kubenswrapper[4909]: E0202 12:55:39.877979 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64" containerName="nova-metadata-log" Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.878058 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64" containerName="nova-metadata-log" Feb 02 12:55:39 crc kubenswrapper[4909]: E0202 12:55:39.878150 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64" containerName="nova-metadata-metadata" Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.878202 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64" containerName="nova-metadata-metadata" Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.878489 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64" containerName="nova-metadata-metadata" Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.878554 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64" containerName="nova-metadata-log" Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.879827 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.883343 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.883630 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.886561 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 12:55:39 crc kubenswrapper[4909]: E0202 12:55:39.890951 4909 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8565eb0a_14b7_4f9f_b9ba_9837f3c9ab64.slice/crio-266c8e5a1e11e509a08531cb9cb8ac85e7c4083d16ce6c004b1a327bcbf5fddb\": RecentStats: unable to find data in memory cache]" Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.979064 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2e59f904-7475-4136-bb41-a907e4855430-config-data\") pod \"nova-metadata-0\" (UID: \"2e59f904-7475-4136-bb41-a907e4855430\") " pod="openstack/nova-metadata-0" Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.979174 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc5h4\" (UniqueName: \"kubernetes.io/projected/2e59f904-7475-4136-bb41-a907e4855430-kube-api-access-hc5h4\") pod \"nova-metadata-0\" (UID: \"2e59f904-7475-4136-bb41-a907e4855430\") " pod="openstack/nova-metadata-0" Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.979222 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e59f904-7475-4136-bb41-a907e4855430-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2e59f904-7475-4136-bb41-a907e4855430\") " pod="openstack/nova-metadata-0" Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.979289 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e59f904-7475-4136-bb41-a907e4855430-logs\") pod \"nova-metadata-0\" (UID: \"2e59f904-7475-4136-bb41-a907e4855430\") " pod="openstack/nova-metadata-0" Feb 02 12:55:39 crc kubenswrapper[4909]: I0202 12:55:39.979320 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e59f904-7475-4136-bb41-a907e4855430-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2e59f904-7475-4136-bb41-a907e4855430\") " pod="openstack/nova-metadata-0" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.081088 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e59f904-7475-4136-bb41-a907e4855430-nova-metadata-tls-certs\") pod 
\"nova-metadata-0\" (UID: \"2e59f904-7475-4136-bb41-a907e4855430\") " pod="openstack/nova-metadata-0" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.081245 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e59f904-7475-4136-bb41-a907e4855430-config-data\") pod \"nova-metadata-0\" (UID: \"2e59f904-7475-4136-bb41-a907e4855430\") " pod="openstack/nova-metadata-0" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.081304 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc5h4\" (UniqueName: \"kubernetes.io/projected/2e59f904-7475-4136-bb41-a907e4855430-kube-api-access-hc5h4\") pod \"nova-metadata-0\" (UID: \"2e59f904-7475-4136-bb41-a907e4855430\") " pod="openstack/nova-metadata-0" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.081328 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e59f904-7475-4136-bb41-a907e4855430-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2e59f904-7475-4136-bb41-a907e4855430\") " pod="openstack/nova-metadata-0" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.081377 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e59f904-7475-4136-bb41-a907e4855430-logs\") pod \"nova-metadata-0\" (UID: \"2e59f904-7475-4136-bb41-a907e4855430\") " pod="openstack/nova-metadata-0" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.081873 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e59f904-7475-4136-bb41-a907e4855430-logs\") pod \"nova-metadata-0\" (UID: \"2e59f904-7475-4136-bb41-a907e4855430\") " pod="openstack/nova-metadata-0" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.098328 4909 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e59f904-7475-4136-bb41-a907e4855430-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2e59f904-7475-4136-bb41-a907e4855430\") " pod="openstack/nova-metadata-0" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.098395 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e59f904-7475-4136-bb41-a907e4855430-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2e59f904-7475-4136-bb41-a907e4855430\") " pod="openstack/nova-metadata-0" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.100595 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e59f904-7475-4136-bb41-a907e4855430-config-data\") pod \"nova-metadata-0\" (UID: \"2e59f904-7475-4136-bb41-a907e4855430\") " pod="openstack/nova-metadata-0" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.108983 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc5h4\" (UniqueName: \"kubernetes.io/projected/2e59f904-7475-4136-bb41-a907e4855430-kube-api-access-hc5h4\") pod \"nova-metadata-0\" (UID: \"2e59f904-7475-4136-bb41-a907e4855430\") " pod="openstack/nova-metadata-0" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.148261 4909 scope.go:117] "RemoveContainer" containerID="367124b20b45d240fd18ce0f053f4b7619daca5fc2fde669f570bda3d8950cad" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.206290 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.431159 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.433114 4909 scope.go:117] "RemoveContainer" containerID="65783f58d87d0a0e4ff9166650bb11a84b934388a454d3a42bc6b55f6bd48f51" Feb 02 12:55:40 crc kubenswrapper[4909]: E0202 12:55:40.433455 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65783f58d87d0a0e4ff9166650bb11a84b934388a454d3a42bc6b55f6bd48f51\": container with ID starting with 65783f58d87d0a0e4ff9166650bb11a84b934388a454d3a42bc6b55f6bd48f51 not found: ID does not exist" containerID="65783f58d87d0a0e4ff9166650bb11a84b934388a454d3a42bc6b55f6bd48f51" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.433480 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65783f58d87d0a0e4ff9166650bb11a84b934388a454d3a42bc6b55f6bd48f51"} err="failed to get container status \"65783f58d87d0a0e4ff9166650bb11a84b934388a454d3a42bc6b55f6bd48f51\": rpc error: code = NotFound desc = could not find container \"65783f58d87d0a0e4ff9166650bb11a84b934388a454d3a42bc6b55f6bd48f51\": container with ID starting with 65783f58d87d0a0e4ff9166650bb11a84b934388a454d3a42bc6b55f6bd48f51 not found: ID does not exist" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.433501 4909 scope.go:117] "RemoveContainer" containerID="367124b20b45d240fd18ce0f053f4b7619daca5fc2fde669f570bda3d8950cad" Feb 02 12:55:40 crc kubenswrapper[4909]: E0202 12:55:40.433800 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"367124b20b45d240fd18ce0f053f4b7619daca5fc2fde669f570bda3d8950cad\": container with ID starting with 367124b20b45d240fd18ce0f053f4b7619daca5fc2fde669f570bda3d8950cad not found: ID does not exist" containerID="367124b20b45d240fd18ce0f053f4b7619daca5fc2fde669f570bda3d8950cad" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.433889 
4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"367124b20b45d240fd18ce0f053f4b7619daca5fc2fde669f570bda3d8950cad"} err="failed to get container status \"367124b20b45d240fd18ce0f053f4b7619daca5fc2fde669f570bda3d8950cad\": rpc error: code = NotFound desc = could not find container \"367124b20b45d240fd18ce0f053f4b7619daca5fc2fde669f570bda3d8950cad\": container with ID starting with 367124b20b45d240fd18ce0f053f4b7619daca5fc2fde669f570bda3d8950cad not found: ID does not exist" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.622492 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvq2w\" (UniqueName: \"kubernetes.io/projected/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-kube-api-access-tvq2w\") pod \"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\" (UID: \"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\") " Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.622890 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-internal-tls-certs\") pod \"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\" (UID: \"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\") " Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.623178 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-combined-ca-bundle\") pod \"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\" (UID: \"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\") " Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.623252 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-logs\") pod \"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\" (UID: \"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\") " Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 
12:55:40.623279 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-config-data\") pod \"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\" (UID: \"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\") " Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.623312 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-public-tls-certs\") pod \"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\" (UID: \"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244\") " Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.634556 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-logs" (OuterVolumeSpecName: "logs") pod "4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244" (UID: "4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.643954 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-kube-api-access-tvq2w" (OuterVolumeSpecName: "kube-api-access-tvq2w") pod "4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244" (UID: "4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244"). InnerVolumeSpecName "kube-api-access-tvq2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.652673 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.685200 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-config-data" (OuterVolumeSpecName: "config-data") pod "4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244" (UID: "4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.685256 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244" (UID: "4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.696354 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244" (UID: "4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.728146 4909 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-logs\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.728197 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.728210 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvq2w\" (UniqueName: \"kubernetes.io/projected/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-kube-api-access-tvq2w\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.728223 4909 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.728234 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.739212 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244" (UID: "4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.747205 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4xsr9" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.747253 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4xsr9" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.749933 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244","Type":"ContainerDied","Data":"2d2294367c378b4b1e1c258ac4dfa2fd669a4b3ae4096190e3135975076c8fbb"} Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.749991 4909 scope.go:117] "RemoveContainer" containerID="f40794129432336f08a314a7f97963bb5b13e8594e67bc11cc45f18e1bb9f82e" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.750120 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.754466 4909 generic.go:334] "Generic (PLEG): container finished" podID="41dd6b28-b792-4d80-9cd2-3ab8f738be53" containerID="0d1ec12c1908b0801c016824cd56b0a335d90c6309de5877b5eb35631b10f47f" exitCode=0 Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.754604 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.754613 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"41dd6b28-b792-4d80-9cd2-3ab8f738be53","Type":"ContainerDied","Data":"0d1ec12c1908b0801c016824cd56b0a335d90c6309de5877b5eb35631b10f47f"} Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.754658 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"41dd6b28-b792-4d80-9cd2-3ab8f738be53","Type":"ContainerDied","Data":"5326420884cca1e66181c396a42c94bb9e82bc331939135d827e46fc78875aae"} Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.759315 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"17a7f0b1-11b1-405e-97ba-0bacc251ef8e","Type":"ContainerStarted","Data":"bf1bb1858e5e12d8f346db58725f34ece84273c42341bed5f9542f23a25b1863"} Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.759362 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"17a7f0b1-11b1-405e-97ba-0bacc251ef8e","Type":"ContainerStarted","Data":"148065b14acf7160e161c7552b01819939e3ca0c9206a04ac1ea86394e671cfa"} Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.759962 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.795652 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.795626723 podStartE2EDuration="2.795626723s" podCreationTimestamp="2026-02-02 12:55:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:55:40.77966331 +0000 UTC m=+8666.525764055" watchObservedRunningTime="2026-02-02 12:55:40.795626723 
+0000 UTC m=+8666.541727458" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.808770 4909 scope.go:117] "RemoveContainer" containerID="62cbb60193b939a1be8159f1f192aad858644094fda7698c64377c032d79a9b5" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.830098 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41dd6b28-b792-4d80-9cd2-3ab8f738be53-config-data\") pod \"41dd6b28-b792-4d80-9cd2-3ab8f738be53\" (UID: \"41dd6b28-b792-4d80-9cd2-3ab8f738be53\") " Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.830383 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wv8d\" (UniqueName: \"kubernetes.io/projected/41dd6b28-b792-4d80-9cd2-3ab8f738be53-kube-api-access-7wv8d\") pod \"41dd6b28-b792-4d80-9cd2-3ab8f738be53\" (UID: \"41dd6b28-b792-4d80-9cd2-3ab8f738be53\") " Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.830402 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41dd6b28-b792-4d80-9cd2-3ab8f738be53-combined-ca-bundle\") pod \"41dd6b28-b792-4d80-9cd2-3ab8f738be53\" (UID: \"41dd6b28-b792-4d80-9cd2-3ab8f738be53\") " Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.830991 4909 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.833006 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4xsr9" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.835873 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.844131 4909 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41dd6b28-b792-4d80-9cd2-3ab8f738be53-kube-api-access-7wv8d" (OuterVolumeSpecName: "kube-api-access-7wv8d") pod "41dd6b28-b792-4d80-9cd2-3ab8f738be53" (UID: "41dd6b28-b792-4d80-9cd2-3ab8f738be53"). InnerVolumeSpecName "kube-api-access-7wv8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.863915 4909 scope.go:117] "RemoveContainer" containerID="0d1ec12c1908b0801c016824cd56b0a335d90c6309de5877b5eb35631b10f47f" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.877415 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.894147 4909 scope.go:117] "RemoveContainer" containerID="0d1ec12c1908b0801c016824cd56b0a335d90c6309de5877b5eb35631b10f47f" Feb 02 12:55:40 crc kubenswrapper[4909]: E0202 12:55:40.906736 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d1ec12c1908b0801c016824cd56b0a335d90c6309de5877b5eb35631b10f47f\": container with ID starting with 0d1ec12c1908b0801c016824cd56b0a335d90c6309de5877b5eb35631b10f47f not found: ID does not exist" containerID="0d1ec12c1908b0801c016824cd56b0a335d90c6309de5877b5eb35631b10f47f" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.906783 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d1ec12c1908b0801c016824cd56b0a335d90c6309de5877b5eb35631b10f47f"} err="failed to get container status \"0d1ec12c1908b0801c016824cd56b0a335d90c6309de5877b5eb35631b10f47f\": rpc error: code = NotFound desc = could not find container \"0d1ec12c1908b0801c016824cd56b0a335d90c6309de5877b5eb35631b10f47f\": container with ID starting with 0d1ec12c1908b0801c016824cd56b0a335d90c6309de5877b5eb35631b10f47f not found: ID does not exist" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.912731 4909 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 02 12:55:40 crc kubenswrapper[4909]: E0202 12:55:40.913341 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244" containerName="nova-api-api" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.913360 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244" containerName="nova-api-api" Feb 02 12:55:40 crc kubenswrapper[4909]: E0202 12:55:40.913382 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244" containerName="nova-api-log" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.913388 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244" containerName="nova-api-log" Feb 02 12:55:40 crc kubenswrapper[4909]: E0202 12:55:40.913412 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41dd6b28-b792-4d80-9cd2-3ab8f738be53" containerName="nova-cell1-conductor-conductor" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.913418 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="41dd6b28-b792-4d80-9cd2-3ab8f738be53" containerName="nova-cell1-conductor-conductor" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.913616 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244" containerName="nova-api-api" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.913633 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="41dd6b28-b792-4d80-9cd2-3ab8f738be53" containerName="nova-cell1-conductor-conductor" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.913646 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244" containerName="nova-api-log" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.915970 4909 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41dd6b28-b792-4d80-9cd2-3ab8f738be53-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41dd6b28-b792-4d80-9cd2-3ab8f738be53" (UID: "41dd6b28-b792-4d80-9cd2-3ab8f738be53"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.925327 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.929164 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.929686 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.930255 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.930406 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.935392 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wv8d\" (UniqueName: \"kubernetes.io/projected/41dd6b28-b792-4d80-9cd2-3ab8f738be53-kube-api-access-7wv8d\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.935420 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41dd6b28-b792-4d80-9cd2-3ab8f738be53-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.943768 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41dd6b28-b792-4d80-9cd2-3ab8f738be53-config-data" (OuterVolumeSpecName: "config-data") pod 
"41dd6b28-b792-4d80-9cd2-3ab8f738be53" (UID: "41dd6b28-b792-4d80-9cd2-3ab8f738be53"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:55:40 crc kubenswrapper[4909]: I0202 12:55:40.960579 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.035164 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244" path="/var/lib/kubelet/pods/4f2c8e74-60bb-4f98-8bc4-a3b3adaf2244/volumes" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.036474 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64" path="/var/lib/kubelet/pods/8565eb0a-14b7-4f9f-b9ba-9837f3c9ab64/volumes" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.037397 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b22e2928-59ab-47d5-84f8-4ad233ffd449-logs\") pod \"nova-api-0\" (UID: \"b22e2928-59ab-47d5-84f8-4ad233ffd449\") " pod="openstack/nova-api-0" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.037469 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b22e2928-59ab-47d5-84f8-4ad233ffd449-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b22e2928-59ab-47d5-84f8-4ad233ffd449\") " pod="openstack/nova-api-0" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.037507 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqjdt\" (UniqueName: \"kubernetes.io/projected/b22e2928-59ab-47d5-84f8-4ad233ffd449-kube-api-access-nqjdt\") pod \"nova-api-0\" (UID: \"b22e2928-59ab-47d5-84f8-4ad233ffd449\") " pod="openstack/nova-api-0" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 
12:55:41.037545 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b22e2928-59ab-47d5-84f8-4ad233ffd449-config-data\") pod \"nova-api-0\" (UID: \"b22e2928-59ab-47d5-84f8-4ad233ffd449\") " pod="openstack/nova-api-0" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.037575 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b22e2928-59ab-47d5-84f8-4ad233ffd449-public-tls-certs\") pod \"nova-api-0\" (UID: \"b22e2928-59ab-47d5-84f8-4ad233ffd449\") " pod="openstack/nova-api-0" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.038079 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b22e2928-59ab-47d5-84f8-4ad233ffd449-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b22e2928-59ab-47d5-84f8-4ad233ffd449\") " pod="openstack/nova-api-0" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.038248 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41dd6b28-b792-4d80-9cd2-3ab8f738be53-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.139743 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b22e2928-59ab-47d5-84f8-4ad233ffd449-logs\") pod \"nova-api-0\" (UID: \"b22e2928-59ab-47d5-84f8-4ad233ffd449\") " pod="openstack/nova-api-0" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.139853 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b22e2928-59ab-47d5-84f8-4ad233ffd449-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b22e2928-59ab-47d5-84f8-4ad233ffd449\") " 
pod="openstack/nova-api-0" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.139902 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqjdt\" (UniqueName: \"kubernetes.io/projected/b22e2928-59ab-47d5-84f8-4ad233ffd449-kube-api-access-nqjdt\") pod \"nova-api-0\" (UID: \"b22e2928-59ab-47d5-84f8-4ad233ffd449\") " pod="openstack/nova-api-0" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.139952 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b22e2928-59ab-47d5-84f8-4ad233ffd449-config-data\") pod \"nova-api-0\" (UID: \"b22e2928-59ab-47d5-84f8-4ad233ffd449\") " pod="openstack/nova-api-0" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.139989 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b22e2928-59ab-47d5-84f8-4ad233ffd449-public-tls-certs\") pod \"nova-api-0\" (UID: \"b22e2928-59ab-47d5-84f8-4ad233ffd449\") " pod="openstack/nova-api-0" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.140158 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b22e2928-59ab-47d5-84f8-4ad233ffd449-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b22e2928-59ab-47d5-84f8-4ad233ffd449\") " pod="openstack/nova-api-0" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.140934 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b22e2928-59ab-47d5-84f8-4ad233ffd449-logs\") pod \"nova-api-0\" (UID: \"b22e2928-59ab-47d5-84f8-4ad233ffd449\") " pod="openstack/nova-api-0" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.145424 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b22e2928-59ab-47d5-84f8-4ad233ffd449-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b22e2928-59ab-47d5-84f8-4ad233ffd449\") " pod="openstack/nova-api-0" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.150343 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b22e2928-59ab-47d5-84f8-4ad233ffd449-public-tls-certs\") pod \"nova-api-0\" (UID: \"b22e2928-59ab-47d5-84f8-4ad233ffd449\") " pod="openstack/nova-api-0" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.151516 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b22e2928-59ab-47d5-84f8-4ad233ffd449-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b22e2928-59ab-47d5-84f8-4ad233ffd449\") " pod="openstack/nova-api-0" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.152226 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b22e2928-59ab-47d5-84f8-4ad233ffd449-config-data\") pod \"nova-api-0\" (UID: \"b22e2928-59ab-47d5-84f8-4ad233ffd449\") " pod="openstack/nova-api-0" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.166174 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqjdt\" (UniqueName: \"kubernetes.io/projected/b22e2928-59ab-47d5-84f8-4ad233ffd449-kube-api-access-nqjdt\") pod \"nova-api-0\" (UID: \"b22e2928-59ab-47d5-84f8-4ad233ffd449\") " pod="openstack/nova-api-0" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.299359 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.322319 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.347069 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.356800 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.358243 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.360756 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.367787 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 02 12:55:41 crc kubenswrapper[4909]: E0202 12:55:41.492427 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a931152f73f5539610a817207753eee16dbb6c9eccc24e6035cbf87272d91bb3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 02 12:55:41 crc kubenswrapper[4909]: E0202 12:55:41.494945 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a931152f73f5539610a817207753eee16dbb6c9eccc24e6035cbf87272d91bb3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 02 12:55:41 crc kubenswrapper[4909]: E0202 12:55:41.496291 4909 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: 
cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a931152f73f5539610a817207753eee16dbb6c9eccc24e6035cbf87272d91bb3" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 02 12:55:41 crc kubenswrapper[4909]: E0202 12:55:41.496336 4909 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="01d1231b-15a9-4876-9a03-1c8963164da0" containerName="nova-scheduler-scheduler" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.551239 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd610f58-bafd-4ba2-bc8e-cfc79d94cad5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"fd610f58-bafd-4ba2-bc8e-cfc79d94cad5\") " pod="openstack/nova-cell1-conductor-0" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.551314 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd610f58-bafd-4ba2-bc8e-cfc79d94cad5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"fd610f58-bafd-4ba2-bc8e-cfc79d94cad5\") " pod="openstack/nova-cell1-conductor-0" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.551402 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb9hd\" (UniqueName: \"kubernetes.io/projected/fd610f58-bafd-4ba2-bc8e-cfc79d94cad5-kube-api-access-lb9hd\") pod \"nova-cell1-conductor-0\" (UID: \"fd610f58-bafd-4ba2-bc8e-cfc79d94cad5\") " pod="openstack/nova-cell1-conductor-0" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.653409 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fd610f58-bafd-4ba2-bc8e-cfc79d94cad5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"fd610f58-bafd-4ba2-bc8e-cfc79d94cad5\") " pod="openstack/nova-cell1-conductor-0" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.653468 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd610f58-bafd-4ba2-bc8e-cfc79d94cad5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"fd610f58-bafd-4ba2-bc8e-cfc79d94cad5\") " pod="openstack/nova-cell1-conductor-0" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.653533 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb9hd\" (UniqueName: \"kubernetes.io/projected/fd610f58-bafd-4ba2-bc8e-cfc79d94cad5-kube-api-access-lb9hd\") pod \"nova-cell1-conductor-0\" (UID: \"fd610f58-bafd-4ba2-bc8e-cfc79d94cad5\") " pod="openstack/nova-cell1-conductor-0" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.665038 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd610f58-bafd-4ba2-bc8e-cfc79d94cad5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"fd610f58-bafd-4ba2-bc8e-cfc79d94cad5\") " pod="openstack/nova-cell1-conductor-0" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.666371 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd610f58-bafd-4ba2-bc8e-cfc79d94cad5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"fd610f58-bafd-4ba2-bc8e-cfc79d94cad5\") " pod="openstack/nova-cell1-conductor-0" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.691977 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb9hd\" (UniqueName: \"kubernetes.io/projected/fd610f58-bafd-4ba2-bc8e-cfc79d94cad5-kube-api-access-lb9hd\") pod \"nova-cell1-conductor-0\" 
(UID: \"fd610f58-bafd-4ba2-bc8e-cfc79d94cad5\") " pod="openstack/nova-cell1-conductor-0" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.823488 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2e59f904-7475-4136-bb41-a907e4855430","Type":"ContainerStarted","Data":"2fe1ecc51b7f628e58ae9c5741d5148d2ca7a9b540610e7e8f39cecf96156710"} Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.823823 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2e59f904-7475-4136-bb41-a907e4855430","Type":"ContainerStarted","Data":"1f6553fa29ac01c594d3005e00b121d88dc516e3d7e7d55e37139ae972b6063f"} Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.823834 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2e59f904-7475-4136-bb41-a907e4855430","Type":"ContainerStarted","Data":"d8c66ef0b3513cf71ecf4627c2a99637dbfe7cbfbe81c2312c23af370f0c92fa"} Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.826498 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.897524 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.897503675 podStartE2EDuration="2.897503675s" podCreationTimestamp="2026-02-02 12:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:55:41.878553977 +0000 UTC m=+8667.624654712" watchObservedRunningTime="2026-02-02 12:55:41.897503675 +0000 UTC m=+8667.643604410" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.939833 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4xsr9" Feb 02 12:55:41 crc kubenswrapper[4909]: I0202 12:55:41.987235 4909 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 02 12:55:42 crc kubenswrapper[4909]: I0202 12:55:42.007526 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4xsr9"] Feb 02 12:55:42 crc kubenswrapper[4909]: W0202 12:55:42.574387 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd610f58_bafd_4ba2_bc8e_cfc79d94cad5.slice/crio-b7fa231e0fa65233b15cf55b9e8d98e83e9285099944581f5bf6965850c07dc0 WatchSource:0}: Error finding container b7fa231e0fa65233b15cf55b9e8d98e83e9285099944581f5bf6965850c07dc0: Status 404 returned error can't find the container with id b7fa231e0fa65233b15cf55b9e8d98e83e9285099944581f5bf6965850c07dc0 Feb 02 12:55:42 crc kubenswrapper[4909]: I0202 12:55:42.578485 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 02 12:55:42 crc kubenswrapper[4909]: I0202 12:55:42.877330 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"fd610f58-bafd-4ba2-bc8e-cfc79d94cad5","Type":"ContainerStarted","Data":"a7d1a83ed91355cfd434f9af2244668c834c5cf7459b80ae829c6b419f202b56"} Feb 02 12:55:42 crc kubenswrapper[4909]: I0202 12:55:42.877400 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"fd610f58-bafd-4ba2-bc8e-cfc79d94cad5","Type":"ContainerStarted","Data":"b7fa231e0fa65233b15cf55b9e8d98e83e9285099944581f5bf6965850c07dc0"} Feb 02 12:55:42 crc kubenswrapper[4909]: I0202 12:55:42.879158 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 02 12:55:42 crc kubenswrapper[4909]: I0202 12:55:42.882649 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"b22e2928-59ab-47d5-84f8-4ad233ffd449","Type":"ContainerStarted","Data":"9ab330cf62ca9e747799bb1973103d1ffc8bebf07781eb6b09b0a914dbc1d634"} Feb 02 12:55:42 crc kubenswrapper[4909]: I0202 12:55:42.882703 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b22e2928-59ab-47d5-84f8-4ad233ffd449","Type":"ContainerStarted","Data":"4b00192c564d0b55bfb411ea287312eb0de2a24eb68b8b344c70171d2b976236"} Feb 02 12:55:42 crc kubenswrapper[4909]: I0202 12:55:42.882715 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b22e2928-59ab-47d5-84f8-4ad233ffd449","Type":"ContainerStarted","Data":"49aae440e790a4d78c5753aae4215aa8e280a8f12af6c4e76465217bbe8d1935"} Feb 02 12:55:42 crc kubenswrapper[4909]: I0202 12:55:42.894541 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.894523089 podStartE2EDuration="1.894523089s" podCreationTimestamp="2026-02-02 12:55:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:55:42.892217494 +0000 UTC m=+8668.638318219" watchObservedRunningTime="2026-02-02 12:55:42.894523089 +0000 UTC m=+8668.640623824" Feb 02 12:55:42 crc kubenswrapper[4909]: I0202 12:55:42.923173 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.923152762 podStartE2EDuration="2.923152762s" podCreationTimestamp="2026-02-02 12:55:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 12:55:42.910454662 +0000 UTC m=+8668.656555407" watchObservedRunningTime="2026-02-02 12:55:42.923152762 +0000 UTC m=+8668.669253497" Feb 02 12:55:43 crc kubenswrapper[4909]: I0202 12:55:43.031421 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="41dd6b28-b792-4d80-9cd2-3ab8f738be53" path="/var/lib/kubelet/pods/41dd6b28-b792-4d80-9cd2-3ab8f738be53/volumes" Feb 02 12:55:43 crc kubenswrapper[4909]: I0202 12:55:43.891885 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4xsr9" podUID="2e162b98-0b57-4fd9-ace0-8cc598218f33" containerName="registry-server" containerID="cri-o://3548718f97443d1c4a1bb4bf20cc6650b7f9107019689aed104b527f94a6051a" gracePeriod=2 Feb 02 12:55:44 crc kubenswrapper[4909]: I0202 12:55:44.395405 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4xsr9" Feb 02 12:55:44 crc kubenswrapper[4909]: I0202 12:55:44.527215 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srmqb\" (UniqueName: \"kubernetes.io/projected/2e162b98-0b57-4fd9-ace0-8cc598218f33-kube-api-access-srmqb\") pod \"2e162b98-0b57-4fd9-ace0-8cc598218f33\" (UID: \"2e162b98-0b57-4fd9-ace0-8cc598218f33\") " Feb 02 12:55:44 crc kubenswrapper[4909]: I0202 12:55:44.527443 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e162b98-0b57-4fd9-ace0-8cc598218f33-utilities\") pod \"2e162b98-0b57-4fd9-ace0-8cc598218f33\" (UID: \"2e162b98-0b57-4fd9-ace0-8cc598218f33\") " Feb 02 12:55:44 crc kubenswrapper[4909]: I0202 12:55:44.527476 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e162b98-0b57-4fd9-ace0-8cc598218f33-catalog-content\") pod \"2e162b98-0b57-4fd9-ace0-8cc598218f33\" (UID: \"2e162b98-0b57-4fd9-ace0-8cc598218f33\") " Feb 02 12:55:44 crc kubenswrapper[4909]: I0202 12:55:44.528282 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e162b98-0b57-4fd9-ace0-8cc598218f33-utilities" (OuterVolumeSpecName: 
"utilities") pod "2e162b98-0b57-4fd9-ace0-8cc598218f33" (UID: "2e162b98-0b57-4fd9-ace0-8cc598218f33"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:55:44 crc kubenswrapper[4909]: I0202 12:55:44.534067 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e162b98-0b57-4fd9-ace0-8cc598218f33-kube-api-access-srmqb" (OuterVolumeSpecName: "kube-api-access-srmqb") pod "2e162b98-0b57-4fd9-ace0-8cc598218f33" (UID: "2e162b98-0b57-4fd9-ace0-8cc598218f33"). InnerVolumeSpecName "kube-api-access-srmqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:55:44 crc kubenswrapper[4909]: I0202 12:55:44.582399 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e162b98-0b57-4fd9-ace0-8cc598218f33-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2e162b98-0b57-4fd9-ace0-8cc598218f33" (UID: "2e162b98-0b57-4fd9-ace0-8cc598218f33"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 12:55:44 crc kubenswrapper[4909]: I0202 12:55:44.630613 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srmqb\" (UniqueName: \"kubernetes.io/projected/2e162b98-0b57-4fd9-ace0-8cc598218f33-kube-api-access-srmqb\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:44 crc kubenswrapper[4909]: I0202 12:55:44.630656 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e162b98-0b57-4fd9-ace0-8cc598218f33-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:44 crc kubenswrapper[4909]: I0202 12:55:44.630667 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e162b98-0b57-4fd9-ace0-8cc598218f33-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:44 crc kubenswrapper[4909]: I0202 12:55:44.901905 4909 generic.go:334] "Generic (PLEG): container finished" podID="2e162b98-0b57-4fd9-ace0-8cc598218f33" containerID="3548718f97443d1c4a1bb4bf20cc6650b7f9107019689aed104b527f94a6051a" exitCode=0 Feb 02 12:55:44 crc kubenswrapper[4909]: I0202 12:55:44.901971 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xsr9" event={"ID":"2e162b98-0b57-4fd9-ace0-8cc598218f33","Type":"ContainerDied","Data":"3548718f97443d1c4a1bb4bf20cc6650b7f9107019689aed104b527f94a6051a"} Feb 02 12:55:44 crc kubenswrapper[4909]: I0202 12:55:44.902029 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4xsr9" event={"ID":"2e162b98-0b57-4fd9-ace0-8cc598218f33","Type":"ContainerDied","Data":"4e1b1f28d3ead6aa13d3c00965526be53be35433aad68ec7774795091a699fd3"} Feb 02 12:55:44 crc kubenswrapper[4909]: I0202 12:55:44.902053 4909 scope.go:117] "RemoveContainer" containerID="3548718f97443d1c4a1bb4bf20cc6650b7f9107019689aed104b527f94a6051a" Feb 02 12:55:44 crc kubenswrapper[4909]: I0202 
12:55:44.901989 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4xsr9" Feb 02 12:55:44 crc kubenswrapper[4909]: I0202 12:55:44.924209 4909 scope.go:117] "RemoveContainer" containerID="833acedab632fca0207b96ff6717d273b19b899f49b5df02933e978de67f1f1a" Feb 02 12:55:44 crc kubenswrapper[4909]: I0202 12:55:44.947739 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4xsr9"] Feb 02 12:55:44 crc kubenswrapper[4909]: I0202 12:55:44.957446 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4xsr9"] Feb 02 12:55:44 crc kubenswrapper[4909]: I0202 12:55:44.960043 4909 scope.go:117] "RemoveContainer" containerID="817e88ca7798c58e61f0af1d27149cee815fb16fe5a33e0c83aa0d2e653f5c01" Feb 02 12:55:45 crc kubenswrapper[4909]: I0202 12:55:45.003220 4909 scope.go:117] "RemoveContainer" containerID="3548718f97443d1c4a1bb4bf20cc6650b7f9107019689aed104b527f94a6051a" Feb 02 12:55:45 crc kubenswrapper[4909]: E0202 12:55:45.003591 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3548718f97443d1c4a1bb4bf20cc6650b7f9107019689aed104b527f94a6051a\": container with ID starting with 3548718f97443d1c4a1bb4bf20cc6650b7f9107019689aed104b527f94a6051a not found: ID does not exist" containerID="3548718f97443d1c4a1bb4bf20cc6650b7f9107019689aed104b527f94a6051a" Feb 02 12:55:45 crc kubenswrapper[4909]: I0202 12:55:45.003619 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3548718f97443d1c4a1bb4bf20cc6650b7f9107019689aed104b527f94a6051a"} err="failed to get container status \"3548718f97443d1c4a1bb4bf20cc6650b7f9107019689aed104b527f94a6051a\": rpc error: code = NotFound desc = could not find container \"3548718f97443d1c4a1bb4bf20cc6650b7f9107019689aed104b527f94a6051a\": container with ID starting with 
3548718f97443d1c4a1bb4bf20cc6650b7f9107019689aed104b527f94a6051a not found: ID does not exist" Feb 02 12:55:45 crc kubenswrapper[4909]: I0202 12:55:45.003640 4909 scope.go:117] "RemoveContainer" containerID="833acedab632fca0207b96ff6717d273b19b899f49b5df02933e978de67f1f1a" Feb 02 12:55:45 crc kubenswrapper[4909]: E0202 12:55:45.003864 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"833acedab632fca0207b96ff6717d273b19b899f49b5df02933e978de67f1f1a\": container with ID starting with 833acedab632fca0207b96ff6717d273b19b899f49b5df02933e978de67f1f1a not found: ID does not exist" containerID="833acedab632fca0207b96ff6717d273b19b899f49b5df02933e978de67f1f1a" Feb 02 12:55:45 crc kubenswrapper[4909]: I0202 12:55:45.003892 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"833acedab632fca0207b96ff6717d273b19b899f49b5df02933e978de67f1f1a"} err="failed to get container status \"833acedab632fca0207b96ff6717d273b19b899f49b5df02933e978de67f1f1a\": rpc error: code = NotFound desc = could not find container \"833acedab632fca0207b96ff6717d273b19b899f49b5df02933e978de67f1f1a\": container with ID starting with 833acedab632fca0207b96ff6717d273b19b899f49b5df02933e978de67f1f1a not found: ID does not exist" Feb 02 12:55:45 crc kubenswrapper[4909]: I0202 12:55:45.003908 4909 scope.go:117] "RemoveContainer" containerID="817e88ca7798c58e61f0af1d27149cee815fb16fe5a33e0c83aa0d2e653f5c01" Feb 02 12:55:45 crc kubenswrapper[4909]: E0202 12:55:45.004120 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"817e88ca7798c58e61f0af1d27149cee815fb16fe5a33e0c83aa0d2e653f5c01\": container with ID starting with 817e88ca7798c58e61f0af1d27149cee815fb16fe5a33e0c83aa0d2e653f5c01 not found: ID does not exist" containerID="817e88ca7798c58e61f0af1d27149cee815fb16fe5a33e0c83aa0d2e653f5c01" Feb 02 12:55:45 crc 
kubenswrapper[4909]: I0202 12:55:45.004140 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"817e88ca7798c58e61f0af1d27149cee815fb16fe5a33e0c83aa0d2e653f5c01"} err="failed to get container status \"817e88ca7798c58e61f0af1d27149cee815fb16fe5a33e0c83aa0d2e653f5c01\": rpc error: code = NotFound desc = could not find container \"817e88ca7798c58e61f0af1d27149cee815fb16fe5a33e0c83aa0d2e653f5c01\": container with ID starting with 817e88ca7798c58e61f0af1d27149cee815fb16fe5a33e0c83aa0d2e653f5c01 not found: ID does not exist" Feb 02 12:55:45 crc kubenswrapper[4909]: I0202 12:55:45.030034 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e162b98-0b57-4fd9-ace0-8cc598218f33" path="/var/lib/kubelet/pods/2e162b98-0b57-4fd9-ace0-8cc598218f33/volumes" Feb 02 12:55:45 crc kubenswrapper[4909]: I0202 12:55:45.207267 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 02 12:55:45 crc kubenswrapper[4909]: I0202 12:55:45.207359 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 02 12:55:45 crc kubenswrapper[4909]: I0202 12:55:45.513582 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 12:55:45 crc kubenswrapper[4909]: I0202 12:55:45.654158 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d1231b-15a9-4876-9a03-1c8963164da0-config-data\") pod \"01d1231b-15a9-4876-9a03-1c8963164da0\" (UID: \"01d1231b-15a9-4876-9a03-1c8963164da0\") " Feb 02 12:55:45 crc kubenswrapper[4909]: I0202 12:55:45.654421 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gfv2\" (UniqueName: \"kubernetes.io/projected/01d1231b-15a9-4876-9a03-1c8963164da0-kube-api-access-6gfv2\") pod \"01d1231b-15a9-4876-9a03-1c8963164da0\" (UID: \"01d1231b-15a9-4876-9a03-1c8963164da0\") " Feb 02 12:55:45 crc kubenswrapper[4909]: I0202 12:55:45.654515 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d1231b-15a9-4876-9a03-1c8963164da0-combined-ca-bundle\") pod \"01d1231b-15a9-4876-9a03-1c8963164da0\" (UID: \"01d1231b-15a9-4876-9a03-1c8963164da0\") " Feb 02 12:55:45 crc kubenswrapper[4909]: I0202 12:55:45.659224 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01d1231b-15a9-4876-9a03-1c8963164da0-kube-api-access-6gfv2" (OuterVolumeSpecName: "kube-api-access-6gfv2") pod "01d1231b-15a9-4876-9a03-1c8963164da0" (UID: "01d1231b-15a9-4876-9a03-1c8963164da0"). InnerVolumeSpecName "kube-api-access-6gfv2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:55:45 crc kubenswrapper[4909]: I0202 12:55:45.685955 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01d1231b-15a9-4876-9a03-1c8963164da0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01d1231b-15a9-4876-9a03-1c8963164da0" (UID: "01d1231b-15a9-4876-9a03-1c8963164da0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:55:45 crc kubenswrapper[4909]: I0202 12:55:45.685976 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01d1231b-15a9-4876-9a03-1c8963164da0-config-data" (OuterVolumeSpecName: "config-data") pod "01d1231b-15a9-4876-9a03-1c8963164da0" (UID: "01d1231b-15a9-4876-9a03-1c8963164da0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:55:45 crc kubenswrapper[4909]: I0202 12:55:45.756464 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01d1231b-15a9-4876-9a03-1c8963164da0-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:45 crc kubenswrapper[4909]: I0202 12:55:45.756497 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gfv2\" (UniqueName: \"kubernetes.io/projected/01d1231b-15a9-4876-9a03-1c8963164da0-kube-api-access-6gfv2\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:45 crc kubenswrapper[4909]: I0202 12:55:45.756508 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d1231b-15a9-4876-9a03-1c8963164da0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:55:45 crc kubenswrapper[4909]: I0202 12:55:45.912180 4909 generic.go:334] "Generic (PLEG): container finished" podID="01d1231b-15a9-4876-9a03-1c8963164da0" containerID="a931152f73f5539610a817207753eee16dbb6c9eccc24e6035cbf87272d91bb3" exitCode=0 Feb 02 12:55:45 crc kubenswrapper[4909]: I0202 12:55:45.912228 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 12:55:45 crc kubenswrapper[4909]: I0202 12:55:45.912261 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"01d1231b-15a9-4876-9a03-1c8963164da0","Type":"ContainerDied","Data":"a931152f73f5539610a817207753eee16dbb6c9eccc24e6035cbf87272d91bb3"} Feb 02 12:55:45 crc kubenswrapper[4909]: I0202 12:55:45.912291 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"01d1231b-15a9-4876-9a03-1c8963164da0","Type":"ContainerDied","Data":"67e2a2f3f7098dfc760ca19e70e801c10b93ded7314b7d53f46e0b4dabe229d8"} Feb 02 12:55:45 crc kubenswrapper[4909]: I0202 12:55:45.912311 4909 scope.go:117] "RemoveContainer" containerID="a931152f73f5539610a817207753eee16dbb6c9eccc24e6035cbf87272d91bb3" Feb 02 12:55:45 crc kubenswrapper[4909]: I0202 12:55:45.967058 4909 scope.go:117] "RemoveContainer" containerID="a931152f73f5539610a817207753eee16dbb6c9eccc24e6035cbf87272d91bb3" Feb 02 12:55:45 crc kubenswrapper[4909]: E0202 12:55:45.968934 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a931152f73f5539610a817207753eee16dbb6c9eccc24e6035cbf87272d91bb3\": container with ID starting with a931152f73f5539610a817207753eee16dbb6c9eccc24e6035cbf87272d91bb3 not found: ID does not exist" containerID="a931152f73f5539610a817207753eee16dbb6c9eccc24e6035cbf87272d91bb3" Feb 02 12:55:45 crc kubenswrapper[4909]: I0202 12:55:45.969045 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a931152f73f5539610a817207753eee16dbb6c9eccc24e6035cbf87272d91bb3"} err="failed to get container status \"a931152f73f5539610a817207753eee16dbb6c9eccc24e6035cbf87272d91bb3\": rpc error: code = NotFound desc = could not find container \"a931152f73f5539610a817207753eee16dbb6c9eccc24e6035cbf87272d91bb3\": container with ID starting with 
a931152f73f5539610a817207753eee16dbb6c9eccc24e6035cbf87272d91bb3 not found: ID does not exist" Feb 02 12:55:45 crc kubenswrapper[4909]: I0202 12:55:45.974939 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 12:55:45 crc kubenswrapper[4909]: I0202 12:55:45.989853 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 12:55:46 crc kubenswrapper[4909]: I0202 12:55:46.000727 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 12:55:46 crc kubenswrapper[4909]: E0202 12:55:46.001221 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e162b98-0b57-4fd9-ace0-8cc598218f33" containerName="extract-content" Feb 02 12:55:46 crc kubenswrapper[4909]: I0202 12:55:46.001248 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e162b98-0b57-4fd9-ace0-8cc598218f33" containerName="extract-content" Feb 02 12:55:46 crc kubenswrapper[4909]: E0202 12:55:46.001265 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01d1231b-15a9-4876-9a03-1c8963164da0" containerName="nova-scheduler-scheduler" Feb 02 12:55:46 crc kubenswrapper[4909]: I0202 12:55:46.001273 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="01d1231b-15a9-4876-9a03-1c8963164da0" containerName="nova-scheduler-scheduler" Feb 02 12:55:46 crc kubenswrapper[4909]: E0202 12:55:46.001282 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e162b98-0b57-4fd9-ace0-8cc598218f33" containerName="registry-server" Feb 02 12:55:46 crc kubenswrapper[4909]: I0202 12:55:46.001291 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e162b98-0b57-4fd9-ace0-8cc598218f33" containerName="registry-server" Feb 02 12:55:46 crc kubenswrapper[4909]: E0202 12:55:46.001315 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e162b98-0b57-4fd9-ace0-8cc598218f33" containerName="extract-utilities" Feb 02 12:55:46 crc 
kubenswrapper[4909]: I0202 12:55:46.001323 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e162b98-0b57-4fd9-ace0-8cc598218f33" containerName="extract-utilities" Feb 02 12:55:46 crc kubenswrapper[4909]: I0202 12:55:46.001565 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="01d1231b-15a9-4876-9a03-1c8963164da0" containerName="nova-scheduler-scheduler" Feb 02 12:55:46 crc kubenswrapper[4909]: I0202 12:55:46.001592 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e162b98-0b57-4fd9-ace0-8cc598218f33" containerName="registry-server" Feb 02 12:55:46 crc kubenswrapper[4909]: I0202 12:55:46.002630 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 12:55:46 crc kubenswrapper[4909]: I0202 12:55:46.011440 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 12:55:46 crc kubenswrapper[4909]: I0202 12:55:46.012479 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 02 12:55:46 crc kubenswrapper[4909]: I0202 12:55:46.016592 4909 scope.go:117] "RemoveContainer" containerID="3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b" Feb 02 12:55:46 crc kubenswrapper[4909]: E0202 12:55:46.016937 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:55:46 crc kubenswrapper[4909]: I0202 12:55:46.171005 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/588767b4-05f7-468e-991c-eb0a131fff90-config-data\") pod \"nova-scheduler-0\" (UID: \"588767b4-05f7-468e-991c-eb0a131fff90\") " pod="openstack/nova-scheduler-0" Feb 02 12:55:46 crc kubenswrapper[4909]: I0202 12:55:46.171640 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9sq5\" (UniqueName: \"kubernetes.io/projected/588767b4-05f7-468e-991c-eb0a131fff90-kube-api-access-v9sq5\") pod \"nova-scheduler-0\" (UID: \"588767b4-05f7-468e-991c-eb0a131fff90\") " pod="openstack/nova-scheduler-0" Feb 02 12:55:46 crc kubenswrapper[4909]: I0202 12:55:46.172765 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/588767b4-05f7-468e-991c-eb0a131fff90-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"588767b4-05f7-468e-991c-eb0a131fff90\") " pod="openstack/nova-scheduler-0" Feb 02 12:55:46 crc kubenswrapper[4909]: I0202 12:55:46.274621 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/588767b4-05f7-468e-991c-eb0a131fff90-config-data\") pod \"nova-scheduler-0\" (UID: \"588767b4-05f7-468e-991c-eb0a131fff90\") " pod="openstack/nova-scheduler-0" Feb 02 12:55:46 crc kubenswrapper[4909]: I0202 12:55:46.274721 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9sq5\" (UniqueName: \"kubernetes.io/projected/588767b4-05f7-468e-991c-eb0a131fff90-kube-api-access-v9sq5\") pod \"nova-scheduler-0\" (UID: \"588767b4-05f7-468e-991c-eb0a131fff90\") " pod="openstack/nova-scheduler-0" Feb 02 12:55:46 crc kubenswrapper[4909]: I0202 12:55:46.274789 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/588767b4-05f7-468e-991c-eb0a131fff90-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"588767b4-05f7-468e-991c-eb0a131fff90\") " pod="openstack/nova-scheduler-0" Feb 02 12:55:46 crc kubenswrapper[4909]: I0202 12:55:46.279730 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/588767b4-05f7-468e-991c-eb0a131fff90-config-data\") pod \"nova-scheduler-0\" (UID: \"588767b4-05f7-468e-991c-eb0a131fff90\") " pod="openstack/nova-scheduler-0" Feb 02 12:55:46 crc kubenswrapper[4909]: I0202 12:55:46.280629 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/588767b4-05f7-468e-991c-eb0a131fff90-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"588767b4-05f7-468e-991c-eb0a131fff90\") " pod="openstack/nova-scheduler-0" Feb 02 12:55:46 crc kubenswrapper[4909]: I0202 12:55:46.293532 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9sq5\" (UniqueName: \"kubernetes.io/projected/588767b4-05f7-468e-991c-eb0a131fff90-kube-api-access-v9sq5\") pod \"nova-scheduler-0\" (UID: \"588767b4-05f7-468e-991c-eb0a131fff90\") " pod="openstack/nova-scheduler-0" Feb 02 12:55:46 crc kubenswrapper[4909]: I0202 12:55:46.371146 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 02 12:55:46 crc kubenswrapper[4909]: I0202 12:55:46.851959 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 02 12:55:46 crc kubenswrapper[4909]: W0202 12:55:46.856923 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod588767b4_05f7_468e_991c_eb0a131fff90.slice/crio-9099982ee05794025b8fa3c1b4d0978b6e60dd1250651b7f741c91d0bc7d067d WatchSource:0}: Error finding container 9099982ee05794025b8fa3c1b4d0978b6e60dd1250651b7f741c91d0bc7d067d: Status 404 returned error can't find the container with id 9099982ee05794025b8fa3c1b4d0978b6e60dd1250651b7f741c91d0bc7d067d Feb 02 12:55:46 crc kubenswrapper[4909]: I0202 12:55:46.929321 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"588767b4-05f7-468e-991c-eb0a131fff90","Type":"ContainerStarted","Data":"9099982ee05794025b8fa3c1b4d0978b6e60dd1250651b7f741c91d0bc7d067d"} Feb 02 12:55:47 crc kubenswrapper[4909]: I0202 12:55:47.027497 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01d1231b-15a9-4876-9a03-1c8963164da0" path="/var/lib/kubelet/pods/01d1231b-15a9-4876-9a03-1c8963164da0/volumes" Feb 02 12:55:47 crc kubenswrapper[4909]: I0202 12:55:47.938940 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"588767b4-05f7-468e-991c-eb0a131fff90","Type":"ContainerStarted","Data":"65da7a21d0c55c2b8461c1cda7a6c1cedd328bfef6552c58abe9c3827ac5cb1a"} Feb 02 12:55:47 crc kubenswrapper[4909]: I0202 12:55:47.965115 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.965092994 podStartE2EDuration="2.965092994s" podCreationTimestamp="2026-02-02 12:55:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-02 12:55:47.95648782 +0000 UTC m=+8673.702588565" watchObservedRunningTime="2026-02-02 12:55:47.965092994 +0000 UTC m=+8673.711193749" Feb 02 12:55:49 crc kubenswrapper[4909]: I0202 12:55:49.160612 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 02 12:55:50 crc kubenswrapper[4909]: I0202 12:55:50.207498 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 02 12:55:50 crc kubenswrapper[4909]: I0202 12:55:50.207770 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 02 12:55:51 crc kubenswrapper[4909]: I0202 12:55:51.222090 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2e59f904-7475-4136-bb41-a907e4855430" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.194:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 02 12:55:51 crc kubenswrapper[4909]: I0202 12:55:51.222169 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2e59f904-7475-4136-bb41-a907e4855430" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.194:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 02 12:55:51 crc kubenswrapper[4909]: I0202 12:55:51.299874 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 12:55:51 crc kubenswrapper[4909]: I0202 12:55:51.299932 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 02 12:55:51 crc kubenswrapper[4909]: I0202 12:55:51.371749 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 02 12:55:52 crc kubenswrapper[4909]: I0202 12:55:52.016612 4909 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 02 12:55:52 crc kubenswrapper[4909]: I0202 12:55:52.313975 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b22e2928-59ab-47d5-84f8-4ad233ffd449" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.195:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 02 12:55:52 crc kubenswrapper[4909]: I0202 12:55:52.314909 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b22e2928-59ab-47d5-84f8-4ad233ffd449" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.195:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 02 12:55:56 crc kubenswrapper[4909]: I0202 12:55:56.372356 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 02 12:55:56 crc kubenswrapper[4909]: I0202 12:55:56.405402 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 02 12:55:57 crc kubenswrapper[4909]: I0202 12:55:57.052437 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 02 12:56:00 crc kubenswrapper[4909]: I0202 12:56:00.017101 4909 scope.go:117] "RemoveContainer" containerID="3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b" Feb 02 12:56:00 crc kubenswrapper[4909]: E0202 12:56:00.018200 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" 
podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:56:00 crc kubenswrapper[4909]: I0202 12:56:00.214168 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 02 12:56:00 crc kubenswrapper[4909]: I0202 12:56:00.215301 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 02 12:56:00 crc kubenswrapper[4909]: I0202 12:56:00.219202 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 02 12:56:01 crc kubenswrapper[4909]: I0202 12:56:01.064692 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 02 12:56:01 crc kubenswrapper[4909]: I0202 12:56:01.306950 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 02 12:56:01 crc kubenswrapper[4909]: I0202 12:56:01.307047 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 02 12:56:01 crc kubenswrapper[4909]: I0202 12:56:01.307586 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 02 12:56:01 crc kubenswrapper[4909]: I0202 12:56:01.307619 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 02 12:56:01 crc kubenswrapper[4909]: I0202 12:56:01.312568 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 02 12:56:01 crc kubenswrapper[4909]: I0202 12:56:01.312974 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 02 12:56:11 crc kubenswrapper[4909]: I0202 12:56:11.123442 4909 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod41dd6b28-b792-4d80-9cd2-3ab8f738be53"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort 
pod41dd6b28-b792-4d80-9cd2-3ab8f738be53] : Timed out while waiting for systemd to remove kubepods-besteffort-pod41dd6b28_b792_4d80_9cd2_3ab8f738be53.slice" Feb 02 12:56:14 crc kubenswrapper[4909]: I0202 12:56:14.016868 4909 scope.go:117] "RemoveContainer" containerID="3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b" Feb 02 12:56:14 crc kubenswrapper[4909]: E0202 12:56:14.017599 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:56:26 crc kubenswrapper[4909]: I0202 12:56:26.018156 4909 scope.go:117] "RemoveContainer" containerID="3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b" Feb 02 12:56:26 crc kubenswrapper[4909]: E0202 12:56:26.018974 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:56:40 crc kubenswrapper[4909]: I0202 12:56:40.017540 4909 scope.go:117] "RemoveContainer" containerID="3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b" Feb 02 12:56:40 crc kubenswrapper[4909]: E0202 12:56:40.018275 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 12:56:51 crc kubenswrapper[4909]: I0202 12:56:51.017056 4909 scope.go:117] "RemoveContainer" containerID="3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b" Feb 02 12:56:51 crc kubenswrapper[4909]: I0202 12:56:51.587966 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"1d50cad22b608ac40c91d192efa1414f2b059dc44f5d2055a429824879c85ebf"} Feb 02 12:58:35 crc kubenswrapper[4909]: I0202 12:58:35.596136 4909 generic.go:334] "Generic (PLEG): container finished" podID="ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926" containerID="4dc4cfbedb16ef5ef5002c0803a869aaab80ef02b4e2a755ccdeb82bca2f9609" exitCode=0 Feb 02 12:58:35 crc kubenswrapper[4909]: I0202 12:58:35.596226 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" event={"ID":"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926","Type":"ContainerDied","Data":"4dc4cfbedb16ef5ef5002c0803a869aaab80ef02b4e2a755ccdeb82bca2f9609"} Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.516343 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.617179 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" event={"ID":"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926","Type":"ContainerDied","Data":"86432de1e2a5a002c8402d1513bd8d305de05fc2b935a455aabcc34994df48fa"} Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.617225 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86432de1e2a5a002c8402d1513bd8d305de05fc2b935a455aabcc34994df48fa" Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.617258 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm" Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.627975 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-migration-ssh-key-0\") pod \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.628104 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-cell1-compute-config-1\") pod \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.628149 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-cells-global-config-0\") pod \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\" (UID: 
\"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.628221 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-migration-ssh-key-1\") pod \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.628247 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-ssh-key-openstack-cell1\") pod \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.628325 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-cell1-compute-config-0\") pod \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.628413 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-cell1-combined-ca-bundle\") pod \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.628519 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmx2l\" (UniqueName: \"kubernetes.io/projected/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-kube-api-access-xmx2l\") pod \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.628588 4909 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-inventory\") pod \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\" (UID: \"ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926\") " Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.635105 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-kube-api-access-xmx2l" (OuterVolumeSpecName: "kube-api-access-xmx2l") pod "ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926" (UID: "ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926"). InnerVolumeSpecName "kube-api-access-xmx2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.636012 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926" (UID: "ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.666092 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926" (UID: "ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.666129 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-inventory" (OuterVolumeSpecName: "inventory") pod "ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926" (UID: "ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.670824 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926" (UID: "ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.674326 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926" (UID: "ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.680048 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926" (UID: "ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926"). InnerVolumeSpecName "nova-cells-global-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.680047 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926" (UID: "ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.687073 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926" (UID: "ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.731540 4909 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.731587 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmx2l\" (UniqueName: \"kubernetes.io/projected/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-kube-api-access-xmx2l\") on node \"crc\" DevicePath \"\"" Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.731598 4909 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-inventory\") on node \"crc\" DevicePath \"\"" Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.731607 4909 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.731616 4909 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.731624 4909 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.731633 4909 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.731643 4909 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 02 12:58:37 crc kubenswrapper[4909]: I0202 12:58:37.731652 4909 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 02 12:59:19 crc kubenswrapper[4909]: I0202 12:59:19.511132 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:59:19 crc kubenswrapper[4909]: I0202 12:59:19.511735 4909 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 12:59:49 crc kubenswrapper[4909]: I0202 12:59:49.511406 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 12:59:49 crc kubenswrapper[4909]: I0202 12:59:49.513416 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:00:00 crc kubenswrapper[4909]: I0202 13:00:00.154373 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500620-8pd2s"] Feb 02 13:00:00 crc kubenswrapper[4909]: E0202 13:00:00.155571 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Feb 02 13:00:00 crc kubenswrapper[4909]: I0202 13:00:00.155590 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Feb 02 13:00:00 crc kubenswrapper[4909]: I0202 13:00:00.155876 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Feb 02 13:00:00 crc 
kubenswrapper[4909]: I0202 13:00:00.156872 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-8pd2s" Feb 02 13:00:00 crc kubenswrapper[4909]: I0202 13:00:00.159144 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 02 13:00:00 crc kubenswrapper[4909]: I0202 13:00:00.159265 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 02 13:00:00 crc kubenswrapper[4909]: I0202 13:00:00.166634 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500620-8pd2s"] Feb 02 13:00:00 crc kubenswrapper[4909]: I0202 13:00:00.205417 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2398070c-69f9-43c3-b98d-632e4068a923-secret-volume\") pod \"collect-profiles-29500620-8pd2s\" (UID: \"2398070c-69f9-43c3-b98d-632e4068a923\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-8pd2s" Feb 02 13:00:00 crc kubenswrapper[4909]: I0202 13:00:00.205844 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwbqb\" (UniqueName: \"kubernetes.io/projected/2398070c-69f9-43c3-b98d-632e4068a923-kube-api-access-dwbqb\") pod \"collect-profiles-29500620-8pd2s\" (UID: \"2398070c-69f9-43c3-b98d-632e4068a923\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-8pd2s" Feb 02 13:00:00 crc kubenswrapper[4909]: I0202 13:00:00.205972 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2398070c-69f9-43c3-b98d-632e4068a923-config-volume\") pod \"collect-profiles-29500620-8pd2s\" 
(UID: \"2398070c-69f9-43c3-b98d-632e4068a923\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-8pd2s" Feb 02 13:00:00 crc kubenswrapper[4909]: I0202 13:00:00.308956 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2398070c-69f9-43c3-b98d-632e4068a923-secret-volume\") pod \"collect-profiles-29500620-8pd2s\" (UID: \"2398070c-69f9-43c3-b98d-632e4068a923\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-8pd2s" Feb 02 13:00:00 crc kubenswrapper[4909]: I0202 13:00:00.309033 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwbqb\" (UniqueName: \"kubernetes.io/projected/2398070c-69f9-43c3-b98d-632e4068a923-kube-api-access-dwbqb\") pod \"collect-profiles-29500620-8pd2s\" (UID: \"2398070c-69f9-43c3-b98d-632e4068a923\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-8pd2s" Feb 02 13:00:00 crc kubenswrapper[4909]: I0202 13:00:00.309067 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2398070c-69f9-43c3-b98d-632e4068a923-config-volume\") pod \"collect-profiles-29500620-8pd2s\" (UID: \"2398070c-69f9-43c3-b98d-632e4068a923\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-8pd2s" Feb 02 13:00:00 crc kubenswrapper[4909]: I0202 13:00:00.310114 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2398070c-69f9-43c3-b98d-632e4068a923-config-volume\") pod \"collect-profiles-29500620-8pd2s\" (UID: \"2398070c-69f9-43c3-b98d-632e4068a923\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-8pd2s" Feb 02 13:00:00 crc kubenswrapper[4909]: I0202 13:00:00.318387 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2398070c-69f9-43c3-b98d-632e4068a923-secret-volume\") pod \"collect-profiles-29500620-8pd2s\" (UID: \"2398070c-69f9-43c3-b98d-632e4068a923\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-8pd2s" Feb 02 13:00:00 crc kubenswrapper[4909]: I0202 13:00:00.336731 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwbqb\" (UniqueName: \"kubernetes.io/projected/2398070c-69f9-43c3-b98d-632e4068a923-kube-api-access-dwbqb\") pod \"collect-profiles-29500620-8pd2s\" (UID: \"2398070c-69f9-43c3-b98d-632e4068a923\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-8pd2s" Feb 02 13:00:00 crc kubenswrapper[4909]: I0202 13:00:00.481431 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-8pd2s" Feb 02 13:00:00 crc kubenswrapper[4909]: I0202 13:00:00.934861 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500620-8pd2s"] Feb 02 13:00:01 crc kubenswrapper[4909]: I0202 13:00:01.372414 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-8pd2s" event={"ID":"2398070c-69f9-43c3-b98d-632e4068a923","Type":"ContainerStarted","Data":"93cf12d53bb46a8a4a823c0e92e750a132c2056e3e86a84e3aa60c9af320c35c"} Feb 02 13:00:01 crc kubenswrapper[4909]: I0202 13:00:01.373599 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-8pd2s" event={"ID":"2398070c-69f9-43c3-b98d-632e4068a923","Type":"ContainerStarted","Data":"97941bd1857bd628583a777d0a008d1587d7b31840a3e7d057dd27532665fc0b"} Feb 02 13:00:01 crc kubenswrapper[4909]: I0202 13:00:01.398713 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-8pd2s" 
podStartSLOduration=1.3986883030000001 podStartE2EDuration="1.398688303s" podCreationTimestamp="2026-02-02 13:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:00:01.385574001 +0000 UTC m=+8927.131674736" watchObservedRunningTime="2026-02-02 13:00:01.398688303 +0000 UTC m=+8927.144789038" Feb 02 13:00:02 crc kubenswrapper[4909]: I0202 13:00:02.386442 4909 generic.go:334] "Generic (PLEG): container finished" podID="2398070c-69f9-43c3-b98d-632e4068a923" containerID="93cf12d53bb46a8a4a823c0e92e750a132c2056e3e86a84e3aa60c9af320c35c" exitCode=0 Feb 02 13:00:02 crc kubenswrapper[4909]: I0202 13:00:02.386477 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-8pd2s" event={"ID":"2398070c-69f9-43c3-b98d-632e4068a923","Type":"ContainerDied","Data":"93cf12d53bb46a8a4a823c0e92e750a132c2056e3e86a84e3aa60c9af320c35c"} Feb 02 13:00:03 crc kubenswrapper[4909]: I0202 13:00:03.825895 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-8pd2s" Feb 02 13:00:03 crc kubenswrapper[4909]: I0202 13:00:03.883628 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2398070c-69f9-43c3-b98d-632e4068a923-config-volume\") pod \"2398070c-69f9-43c3-b98d-632e4068a923\" (UID: \"2398070c-69f9-43c3-b98d-632e4068a923\") " Feb 02 13:00:03 crc kubenswrapper[4909]: I0202 13:00:03.883748 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2398070c-69f9-43c3-b98d-632e4068a923-secret-volume\") pod \"2398070c-69f9-43c3-b98d-632e4068a923\" (UID: \"2398070c-69f9-43c3-b98d-632e4068a923\") " Feb 02 13:00:03 crc kubenswrapper[4909]: I0202 13:00:03.883839 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwbqb\" (UniqueName: \"kubernetes.io/projected/2398070c-69f9-43c3-b98d-632e4068a923-kube-api-access-dwbqb\") pod \"2398070c-69f9-43c3-b98d-632e4068a923\" (UID: \"2398070c-69f9-43c3-b98d-632e4068a923\") " Feb 02 13:00:03 crc kubenswrapper[4909]: I0202 13:00:03.885059 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2398070c-69f9-43c3-b98d-632e4068a923-config-volume" (OuterVolumeSpecName: "config-volume") pod "2398070c-69f9-43c3-b98d-632e4068a923" (UID: "2398070c-69f9-43c3-b98d-632e4068a923"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:00:03 crc kubenswrapper[4909]: I0202 13:00:03.986661 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2398070c-69f9-43c3-b98d-632e4068a923-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 13:00:04 crc kubenswrapper[4909]: I0202 13:00:04.396537 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2398070c-69f9-43c3-b98d-632e4068a923-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2398070c-69f9-43c3-b98d-632e4068a923" (UID: "2398070c-69f9-43c3-b98d-632e4068a923"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:00:04 crc kubenswrapper[4909]: I0202 13:00:04.400135 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2398070c-69f9-43c3-b98d-632e4068a923-kube-api-access-dwbqb" (OuterVolumeSpecName: "kube-api-access-dwbqb") pod "2398070c-69f9-43c3-b98d-632e4068a923" (UID: "2398070c-69f9-43c3-b98d-632e4068a923"). InnerVolumeSpecName "kube-api-access-dwbqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:00:04 crc kubenswrapper[4909]: I0202 13:00:04.427012 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-8pd2s" event={"ID":"2398070c-69f9-43c3-b98d-632e4068a923","Type":"ContainerDied","Data":"97941bd1857bd628583a777d0a008d1587d7b31840a3e7d057dd27532665fc0b"} Feb 02 13:00:04 crc kubenswrapper[4909]: I0202 13:00:04.427108 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97941bd1857bd628583a777d0a008d1587d7b31840a3e7d057dd27532665fc0b" Feb 02 13:00:04 crc kubenswrapper[4909]: I0202 13:00:04.427178 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500620-8pd2s" Feb 02 13:00:04 crc kubenswrapper[4909]: I0202 13:00:04.469278 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500575-f8dqf"] Feb 02 13:00:04 crc kubenswrapper[4909]: I0202 13:00:04.479697 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500575-f8dqf"] Feb 02 13:00:04 crc kubenswrapper[4909]: I0202 13:00:04.496758 4909 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2398070c-69f9-43c3-b98d-632e4068a923-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 02 13:00:04 crc kubenswrapper[4909]: I0202 13:00:04.496789 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwbqb\" (UniqueName: \"kubernetes.io/projected/2398070c-69f9-43c3-b98d-632e4068a923-kube-api-access-dwbqb\") on node \"crc\" DevicePath \"\"" Feb 02 13:00:05 crc kubenswrapper[4909]: I0202 13:00:05.028567 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87fc55fc-05b5-44b1-9af2-d9ff019235fc" path="/var/lib/kubelet/pods/87fc55fc-05b5-44b1-9af2-d9ff019235fc/volumes" Feb 02 13:00:19 crc kubenswrapper[4909]: I0202 13:00:19.511107 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:00:19 crc kubenswrapper[4909]: I0202 13:00:19.511601 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 02 13:00:19 crc kubenswrapper[4909]: I0202 13:00:19.511681 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 13:00:19 crc kubenswrapper[4909]: I0202 13:00:19.512515 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1d50cad22b608ac40c91d192efa1414f2b059dc44f5d2055a429824879c85ebf"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 13:00:19 crc kubenswrapper[4909]: I0202 13:00:19.512571 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://1d50cad22b608ac40c91d192efa1414f2b059dc44f5d2055a429824879c85ebf" gracePeriod=600 Feb 02 13:00:20 crc kubenswrapper[4909]: I0202 13:00:20.580141 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="1d50cad22b608ac40c91d192efa1414f2b059dc44f5d2055a429824879c85ebf" exitCode=0 Feb 02 13:00:20 crc kubenswrapper[4909]: I0202 13:00:20.580362 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"1d50cad22b608ac40c91d192efa1414f2b059dc44f5d2055a429824879c85ebf"} Feb 02 13:00:20 crc kubenswrapper[4909]: I0202 13:00:20.580713 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" 
event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0"} Feb 02 13:00:20 crc kubenswrapper[4909]: I0202 13:00:20.580740 4909 scope.go:117] "RemoveContainer" containerID="3bfc21a82b06091afe02cc6557eb6312e52c8b02693181f6d0641646bca9504b" Feb 02 13:00:32 crc kubenswrapper[4909]: I0202 13:00:32.605865 4909 scope.go:117] "RemoveContainer" containerID="09524936d3d2a4a3a0cfe95a26364adc09352cfd88260c68d9a4b10ab84a8ac5" Feb 02 13:01:00 crc kubenswrapper[4909]: I0202 13:01:00.173027 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29500621-ptltd"] Feb 02 13:01:00 crc kubenswrapper[4909]: E0202 13:01:00.174168 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2398070c-69f9-43c3-b98d-632e4068a923" containerName="collect-profiles" Feb 02 13:01:00 crc kubenswrapper[4909]: I0202 13:01:00.174189 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2398070c-69f9-43c3-b98d-632e4068a923" containerName="collect-profiles" Feb 02 13:01:00 crc kubenswrapper[4909]: I0202 13:01:00.174435 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2398070c-69f9-43c3-b98d-632e4068a923" containerName="collect-profiles" Feb 02 13:01:00 crc kubenswrapper[4909]: I0202 13:01:00.175422 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29500621-ptltd" Feb 02 13:01:00 crc kubenswrapper[4909]: I0202 13:01:00.183190 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29500621-ptltd"] Feb 02 13:01:00 crc kubenswrapper[4909]: I0202 13:01:00.205511 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bfd2650-fa06-4608-8a79-60211d353a92-config-data\") pod \"keystone-cron-29500621-ptltd\" (UID: \"5bfd2650-fa06-4608-8a79-60211d353a92\") " pod="openstack/keystone-cron-29500621-ptltd" Feb 02 13:01:00 crc kubenswrapper[4909]: I0202 13:01:00.205549 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2cdb\" (UniqueName: \"kubernetes.io/projected/5bfd2650-fa06-4608-8a79-60211d353a92-kube-api-access-b2cdb\") pod \"keystone-cron-29500621-ptltd\" (UID: \"5bfd2650-fa06-4608-8a79-60211d353a92\") " pod="openstack/keystone-cron-29500621-ptltd" Feb 02 13:01:00 crc kubenswrapper[4909]: I0202 13:01:00.205601 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bfd2650-fa06-4608-8a79-60211d353a92-combined-ca-bundle\") pod \"keystone-cron-29500621-ptltd\" (UID: \"5bfd2650-fa06-4608-8a79-60211d353a92\") " pod="openstack/keystone-cron-29500621-ptltd" Feb 02 13:01:00 crc kubenswrapper[4909]: I0202 13:01:00.205773 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5bfd2650-fa06-4608-8a79-60211d353a92-fernet-keys\") pod \"keystone-cron-29500621-ptltd\" (UID: \"5bfd2650-fa06-4608-8a79-60211d353a92\") " pod="openstack/keystone-cron-29500621-ptltd" Feb 02 13:01:00 crc kubenswrapper[4909]: I0202 13:01:00.307481 4909 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5bfd2650-fa06-4608-8a79-60211d353a92-fernet-keys\") pod \"keystone-cron-29500621-ptltd\" (UID: \"5bfd2650-fa06-4608-8a79-60211d353a92\") " pod="openstack/keystone-cron-29500621-ptltd" Feb 02 13:01:00 crc kubenswrapper[4909]: I0202 13:01:00.307540 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bfd2650-fa06-4608-8a79-60211d353a92-config-data\") pod \"keystone-cron-29500621-ptltd\" (UID: \"5bfd2650-fa06-4608-8a79-60211d353a92\") " pod="openstack/keystone-cron-29500621-ptltd" Feb 02 13:01:00 crc kubenswrapper[4909]: I0202 13:01:00.307593 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2cdb\" (UniqueName: \"kubernetes.io/projected/5bfd2650-fa06-4608-8a79-60211d353a92-kube-api-access-b2cdb\") pod \"keystone-cron-29500621-ptltd\" (UID: \"5bfd2650-fa06-4608-8a79-60211d353a92\") " pod="openstack/keystone-cron-29500621-ptltd" Feb 02 13:01:00 crc kubenswrapper[4909]: I0202 13:01:00.307671 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bfd2650-fa06-4608-8a79-60211d353a92-combined-ca-bundle\") pod \"keystone-cron-29500621-ptltd\" (UID: \"5bfd2650-fa06-4608-8a79-60211d353a92\") " pod="openstack/keystone-cron-29500621-ptltd" Feb 02 13:01:00 crc kubenswrapper[4909]: I0202 13:01:00.314607 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bfd2650-fa06-4608-8a79-60211d353a92-combined-ca-bundle\") pod \"keystone-cron-29500621-ptltd\" (UID: \"5bfd2650-fa06-4608-8a79-60211d353a92\") " pod="openstack/keystone-cron-29500621-ptltd" Feb 02 13:01:00 crc kubenswrapper[4909]: I0202 13:01:00.314740 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5bfd2650-fa06-4608-8a79-60211d353a92-config-data\") pod \"keystone-cron-29500621-ptltd\" (UID: \"5bfd2650-fa06-4608-8a79-60211d353a92\") " pod="openstack/keystone-cron-29500621-ptltd" Feb 02 13:01:00 crc kubenswrapper[4909]: I0202 13:01:00.315407 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5bfd2650-fa06-4608-8a79-60211d353a92-fernet-keys\") pod \"keystone-cron-29500621-ptltd\" (UID: \"5bfd2650-fa06-4608-8a79-60211d353a92\") " pod="openstack/keystone-cron-29500621-ptltd" Feb 02 13:01:00 crc kubenswrapper[4909]: I0202 13:01:00.335654 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2cdb\" (UniqueName: \"kubernetes.io/projected/5bfd2650-fa06-4608-8a79-60211d353a92-kube-api-access-b2cdb\") pod \"keystone-cron-29500621-ptltd\" (UID: \"5bfd2650-fa06-4608-8a79-60211d353a92\") " pod="openstack/keystone-cron-29500621-ptltd" Feb 02 13:01:00 crc kubenswrapper[4909]: I0202 13:01:00.516328 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29500621-ptltd" Feb 02 13:01:01 crc kubenswrapper[4909]: I0202 13:01:01.076535 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29500621-ptltd"] Feb 02 13:01:01 crc kubenswrapper[4909]: I0202 13:01:01.974079 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29500621-ptltd" event={"ID":"5bfd2650-fa06-4608-8a79-60211d353a92","Type":"ContainerStarted","Data":"ca716e002fe53fa8dbee36bb2b05a46ec707f95c74ef304f5fe32a352765967d"} Feb 02 13:01:01 crc kubenswrapper[4909]: I0202 13:01:01.974431 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29500621-ptltd" event={"ID":"5bfd2650-fa06-4608-8a79-60211d353a92","Type":"ContainerStarted","Data":"bbfdb2da6ea2a26bcb2bf5ccef6c9ab7ed82a1c7b39974d401a35bd79d96542d"} Feb 02 13:01:02 crc kubenswrapper[4909]: I0202 13:01:02.006986 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29500621-ptltd" podStartSLOduration=2.00696093 podStartE2EDuration="2.00696093s" podCreationTimestamp="2026-02-02 13:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:01:01.996460552 +0000 UTC m=+8987.742561297" watchObservedRunningTime="2026-02-02 13:01:02.00696093 +0000 UTC m=+8987.753061665" Feb 02 13:01:03 crc kubenswrapper[4909]: I0202 13:01:03.997150 4909 generic.go:334] "Generic (PLEG): container finished" podID="5bfd2650-fa06-4608-8a79-60211d353a92" containerID="ca716e002fe53fa8dbee36bb2b05a46ec707f95c74ef304f5fe32a352765967d" exitCode=0 Feb 02 13:01:03 crc kubenswrapper[4909]: I0202 13:01:03.997219 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29500621-ptltd" event={"ID":"5bfd2650-fa06-4608-8a79-60211d353a92","Type":"ContainerDied","Data":"ca716e002fe53fa8dbee36bb2b05a46ec707f95c74ef304f5fe32a352765967d"} 
Feb 02 13:01:04 crc kubenswrapper[4909]: I0202 13:01:04.894045 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Feb 02 13:01:04 crc kubenswrapper[4909]: I0202 13:01:04.894286 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="e2846250-fddd-45b4-9dd2-f432fca93762" containerName="adoption" containerID="cri-o://9c27f823792342dbc1595f839bc948799613166a05cd2289d49cee190b5fcc0d" gracePeriod=30 Feb 02 13:01:05 crc kubenswrapper[4909]: I0202 13:01:05.352488 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29500621-ptltd" Feb 02 13:01:05 crc kubenswrapper[4909]: I0202 13:01:05.439662 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bfd2650-fa06-4608-8a79-60211d353a92-combined-ca-bundle\") pod \"5bfd2650-fa06-4608-8a79-60211d353a92\" (UID: \"5bfd2650-fa06-4608-8a79-60211d353a92\") " Feb 02 13:01:05 crc kubenswrapper[4909]: I0202 13:01:05.439759 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5bfd2650-fa06-4608-8a79-60211d353a92-fernet-keys\") pod \"5bfd2650-fa06-4608-8a79-60211d353a92\" (UID: \"5bfd2650-fa06-4608-8a79-60211d353a92\") " Feb 02 13:01:05 crc kubenswrapper[4909]: I0202 13:01:05.439797 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2cdb\" (UniqueName: \"kubernetes.io/projected/5bfd2650-fa06-4608-8a79-60211d353a92-kube-api-access-b2cdb\") pod \"5bfd2650-fa06-4608-8a79-60211d353a92\" (UID: \"5bfd2650-fa06-4608-8a79-60211d353a92\") " Feb 02 13:01:05 crc kubenswrapper[4909]: I0202 13:01:05.439921 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5bfd2650-fa06-4608-8a79-60211d353a92-config-data\") pod \"5bfd2650-fa06-4608-8a79-60211d353a92\" (UID: \"5bfd2650-fa06-4608-8a79-60211d353a92\") " Feb 02 13:01:05 crc kubenswrapper[4909]: I0202 13:01:05.444434 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bfd2650-fa06-4608-8a79-60211d353a92-kube-api-access-b2cdb" (OuterVolumeSpecName: "kube-api-access-b2cdb") pod "5bfd2650-fa06-4608-8a79-60211d353a92" (UID: "5bfd2650-fa06-4608-8a79-60211d353a92"). InnerVolumeSpecName "kube-api-access-b2cdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:05 crc kubenswrapper[4909]: I0202 13:01:05.453679 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bfd2650-fa06-4608-8a79-60211d353a92-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5bfd2650-fa06-4608-8a79-60211d353a92" (UID: "5bfd2650-fa06-4608-8a79-60211d353a92"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:05 crc kubenswrapper[4909]: I0202 13:01:05.478726 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bfd2650-fa06-4608-8a79-60211d353a92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5bfd2650-fa06-4608-8a79-60211d353a92" (UID: "5bfd2650-fa06-4608-8a79-60211d353a92"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:05 crc kubenswrapper[4909]: I0202 13:01:05.505024 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bfd2650-fa06-4608-8a79-60211d353a92-config-data" (OuterVolumeSpecName: "config-data") pod "5bfd2650-fa06-4608-8a79-60211d353a92" (UID: "5bfd2650-fa06-4608-8a79-60211d353a92"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:01:05 crc kubenswrapper[4909]: I0202 13:01:05.542165 4909 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5bfd2650-fa06-4608-8a79-60211d353a92-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:05 crc kubenswrapper[4909]: I0202 13:01:05.542210 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2cdb\" (UniqueName: \"kubernetes.io/projected/5bfd2650-fa06-4608-8a79-60211d353a92-kube-api-access-b2cdb\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:05 crc kubenswrapper[4909]: I0202 13:01:05.542223 4909 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bfd2650-fa06-4608-8a79-60211d353a92-config-data\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:05 crc kubenswrapper[4909]: I0202 13:01:05.542235 4909 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bfd2650-fa06-4608-8a79-60211d353a92-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:06 crc kubenswrapper[4909]: I0202 13:01:06.018689 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29500621-ptltd" event={"ID":"5bfd2650-fa06-4608-8a79-60211d353a92","Type":"ContainerDied","Data":"bbfdb2da6ea2a26bcb2bf5ccef6c9ab7ed82a1c7b39974d401a35bd79d96542d"} Feb 02 13:01:06 crc kubenswrapper[4909]: I0202 13:01:06.019079 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbfdb2da6ea2a26bcb2bf5ccef6c9ab7ed82a1c7b39974d401a35bd79d96542d" Feb 02 13:01:06 crc kubenswrapper[4909]: I0202 13:01:06.018789 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29500621-ptltd" Feb 02 13:01:35 crc kubenswrapper[4909]: I0202 13:01:35.283212 4909 generic.go:334] "Generic (PLEG): container finished" podID="e2846250-fddd-45b4-9dd2-f432fca93762" containerID="9c27f823792342dbc1595f839bc948799613166a05cd2289d49cee190b5fcc0d" exitCode=137 Feb 02 13:01:35 crc kubenswrapper[4909]: I0202 13:01:35.283274 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"e2846250-fddd-45b4-9dd2-f432fca93762","Type":"ContainerDied","Data":"9c27f823792342dbc1595f839bc948799613166a05cd2289d49cee190b5fcc0d"} Feb 02 13:01:35 crc kubenswrapper[4909]: I0202 13:01:35.284071 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"e2846250-fddd-45b4-9dd2-f432fca93762","Type":"ContainerDied","Data":"c8410835049d3a4e81d4e06fbb1386477685edeb67be6ae4abcce306ffcdfe5d"} Feb 02 13:01:35 crc kubenswrapper[4909]: I0202 13:01:35.284090 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8410835049d3a4e81d4e06fbb1386477685edeb67be6ae4abcce306ffcdfe5d" Feb 02 13:01:35 crc kubenswrapper[4909]: I0202 13:01:35.339392 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Feb 02 13:01:35 crc kubenswrapper[4909]: I0202 13:01:35.530931 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfcck\" (UniqueName: \"kubernetes.io/projected/e2846250-fddd-45b4-9dd2-f432fca93762-kube-api-access-lfcck\") pod \"e2846250-fddd-45b4-9dd2-f432fca93762\" (UID: \"e2846250-fddd-45b4-9dd2-f432fca93762\") " Feb 02 13:01:35 crc kubenswrapper[4909]: I0202 13:01:35.531770 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1b73b7ec-1c59-4e90-9a12-6aa3228363e1\") pod \"e2846250-fddd-45b4-9dd2-f432fca93762\" (UID: \"e2846250-fddd-45b4-9dd2-f432fca93762\") " Feb 02 13:01:35 crc kubenswrapper[4909]: I0202 13:01:35.540727 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2846250-fddd-45b4-9dd2-f432fca93762-kube-api-access-lfcck" (OuterVolumeSpecName: "kube-api-access-lfcck") pod "e2846250-fddd-45b4-9dd2-f432fca93762" (UID: "e2846250-fddd-45b4-9dd2-f432fca93762"). InnerVolumeSpecName "kube-api-access-lfcck". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:01:35 crc kubenswrapper[4909]: I0202 13:01:35.559920 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1b73b7ec-1c59-4e90-9a12-6aa3228363e1" (OuterVolumeSpecName: "mariadb-data") pod "e2846250-fddd-45b4-9dd2-f432fca93762" (UID: "e2846250-fddd-45b4-9dd2-f432fca93762"). InnerVolumeSpecName "pvc-1b73b7ec-1c59-4e90-9a12-6aa3228363e1". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 02 13:01:35 crc kubenswrapper[4909]: I0202 13:01:35.635147 4909 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-1b73b7ec-1c59-4e90-9a12-6aa3228363e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1b73b7ec-1c59-4e90-9a12-6aa3228363e1\") on node \"crc\" " Feb 02 13:01:35 crc kubenswrapper[4909]: I0202 13:01:35.635197 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfcck\" (UniqueName: \"kubernetes.io/projected/e2846250-fddd-45b4-9dd2-f432fca93762-kube-api-access-lfcck\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:35 crc kubenswrapper[4909]: I0202 13:01:35.661581 4909 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 02 13:01:35 crc kubenswrapper[4909]: I0202 13:01:35.662203 4909 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-1b73b7ec-1c59-4e90-9a12-6aa3228363e1" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1b73b7ec-1c59-4e90-9a12-6aa3228363e1") on node "crc" Feb 02 13:01:35 crc kubenswrapper[4909]: I0202 13:01:35.737394 4909 reconciler_common.go:293] "Volume detached for volume \"pvc-1b73b7ec-1c59-4e90-9a12-6aa3228363e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1b73b7ec-1c59-4e90-9a12-6aa3228363e1\") on node \"crc\" DevicePath \"\"" Feb 02 13:01:36 crc kubenswrapper[4909]: I0202 13:01:36.291700 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Feb 02 13:01:36 crc kubenswrapper[4909]: I0202 13:01:36.324178 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Feb 02 13:01:36 crc kubenswrapper[4909]: I0202 13:01:36.333120 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Feb 02 13:01:36 crc kubenswrapper[4909]: I0202 13:01:36.891012 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Feb 02 13:01:36 crc kubenswrapper[4909]: I0202 13:01:36.891524 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="799e15d2-1ed2-4524-8b75-582f98eb003e" containerName="adoption" containerID="cri-o://70c1210df332139f3621467af18fc5eaa85b9b4ecb2c86f0c3418757926dcb95" gracePeriod=30 Feb 02 13:01:37 crc kubenswrapper[4909]: I0202 13:01:37.026843 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2846250-fddd-45b4-9dd2-f432fca93762" path="/var/lib/kubelet/pods/e2846250-fddd-45b4-9dd2-f432fca93762/volumes" Feb 02 13:02:07 crc kubenswrapper[4909]: I0202 13:02:07.401151 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Feb 02 13:02:07 crc kubenswrapper[4909]: I0202 13:02:07.463238 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eda687f-6113-4809-8889-105099e40afa\") pod \"799e15d2-1ed2-4524-8b75-582f98eb003e\" (UID: \"799e15d2-1ed2-4524-8b75-582f98eb003e\") " Feb 02 13:02:07 crc kubenswrapper[4909]: I0202 13:02:07.463292 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/799e15d2-1ed2-4524-8b75-582f98eb003e-ovn-data-cert\") pod \"799e15d2-1ed2-4524-8b75-582f98eb003e\" (UID: \"799e15d2-1ed2-4524-8b75-582f98eb003e\") " Feb 02 13:02:07 crc kubenswrapper[4909]: I0202 13:02:07.463550 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7xvw\" (UniqueName: \"kubernetes.io/projected/799e15d2-1ed2-4524-8b75-582f98eb003e-kube-api-access-l7xvw\") pod \"799e15d2-1ed2-4524-8b75-582f98eb003e\" (UID: \"799e15d2-1ed2-4524-8b75-582f98eb003e\") " Feb 02 13:02:07 crc kubenswrapper[4909]: I0202 13:02:07.469046 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/799e15d2-1ed2-4524-8b75-582f98eb003e-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "799e15d2-1ed2-4524-8b75-582f98eb003e" (UID: "799e15d2-1ed2-4524-8b75-582f98eb003e"). InnerVolumeSpecName "ovn-data-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:02:07 crc kubenswrapper[4909]: I0202 13:02:07.476675 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/799e15d2-1ed2-4524-8b75-582f98eb003e-kube-api-access-l7xvw" (OuterVolumeSpecName: "kube-api-access-l7xvw") pod "799e15d2-1ed2-4524-8b75-582f98eb003e" (UID: "799e15d2-1ed2-4524-8b75-582f98eb003e"). InnerVolumeSpecName "kube-api-access-l7xvw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:02:07 crc kubenswrapper[4909]: I0202 13:02:07.483293 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eda687f-6113-4809-8889-105099e40afa" (OuterVolumeSpecName: "ovn-data") pod "799e15d2-1ed2-4524-8b75-582f98eb003e" (UID: "799e15d2-1ed2-4524-8b75-582f98eb003e"). InnerVolumeSpecName "pvc-6eda687f-6113-4809-8889-105099e40afa". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 02 13:02:07 crc kubenswrapper[4909]: I0202 13:02:07.566598 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7xvw\" (UniqueName: \"kubernetes.io/projected/799e15d2-1ed2-4524-8b75-582f98eb003e-kube-api-access-l7xvw\") on node \"crc\" DevicePath \"\"" Feb 02 13:02:07 crc kubenswrapper[4909]: I0202 13:02:07.566669 4909 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-6eda687f-6113-4809-8889-105099e40afa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eda687f-6113-4809-8889-105099e40afa\") on node \"crc\" " Feb 02 13:02:07 crc kubenswrapper[4909]: I0202 13:02:07.566688 4909 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/799e15d2-1ed2-4524-8b75-582f98eb003e-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Feb 02 13:02:07 crc kubenswrapper[4909]: I0202 13:02:07.590915 4909 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 02 13:02:07 crc kubenswrapper[4909]: I0202 13:02:07.591075 4909 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-6eda687f-6113-4809-8889-105099e40afa" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eda687f-6113-4809-8889-105099e40afa") on node "crc" Feb 02 13:02:07 crc kubenswrapper[4909]: I0202 13:02:07.604146 4909 generic.go:334] "Generic (PLEG): container finished" podID="799e15d2-1ed2-4524-8b75-582f98eb003e" containerID="70c1210df332139f3621467af18fc5eaa85b9b4ecb2c86f0c3418757926dcb95" exitCode=137 Feb 02 13:02:07 crc kubenswrapper[4909]: I0202 13:02:07.604194 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"799e15d2-1ed2-4524-8b75-582f98eb003e","Type":"ContainerDied","Data":"70c1210df332139f3621467af18fc5eaa85b9b4ecb2c86f0c3418757926dcb95"} Feb 02 13:02:07 crc kubenswrapper[4909]: I0202 13:02:07.604225 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"799e15d2-1ed2-4524-8b75-582f98eb003e","Type":"ContainerDied","Data":"c239e02f47721c51b53842592422846ddaa2f40b82d346837a9ccff31a493823"} Feb 02 13:02:07 crc kubenswrapper[4909]: I0202 13:02:07.604244 4909 scope.go:117] "RemoveContainer" containerID="70c1210df332139f3621467af18fc5eaa85b9b4ecb2c86f0c3418757926dcb95" Feb 02 13:02:07 crc kubenswrapper[4909]: I0202 13:02:07.604374 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Feb 02 13:02:07 crc kubenswrapper[4909]: I0202 13:02:07.643659 4909 scope.go:117] "RemoveContainer" containerID="70c1210df332139f3621467af18fc5eaa85b9b4ecb2c86f0c3418757926dcb95" Feb 02 13:02:07 crc kubenswrapper[4909]: E0202 13:02:07.644204 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70c1210df332139f3621467af18fc5eaa85b9b4ecb2c86f0c3418757926dcb95\": container with ID starting with 70c1210df332139f3621467af18fc5eaa85b9b4ecb2c86f0c3418757926dcb95 not found: ID does not exist" containerID="70c1210df332139f3621467af18fc5eaa85b9b4ecb2c86f0c3418757926dcb95" Feb 02 13:02:07 crc kubenswrapper[4909]: I0202 13:02:07.644232 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70c1210df332139f3621467af18fc5eaa85b9b4ecb2c86f0c3418757926dcb95"} err="failed to get container status \"70c1210df332139f3621467af18fc5eaa85b9b4ecb2c86f0c3418757926dcb95\": rpc error: code = NotFound desc = could not find container \"70c1210df332139f3621467af18fc5eaa85b9b4ecb2c86f0c3418757926dcb95\": container with ID starting with 70c1210df332139f3621467af18fc5eaa85b9b4ecb2c86f0c3418757926dcb95 not found: ID does not exist" Feb 02 13:02:07 crc kubenswrapper[4909]: I0202 13:02:07.648512 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Feb 02 13:02:07 crc kubenswrapper[4909]: I0202 13:02:07.657948 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Feb 02 13:02:07 crc kubenswrapper[4909]: I0202 13:02:07.668173 4909 reconciler_common.go:293] "Volume detached for volume \"pvc-6eda687f-6113-4809-8889-105099e40afa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eda687f-6113-4809-8889-105099e40afa\") on node \"crc\" DevicePath \"\"" Feb 02 13:02:09 crc kubenswrapper[4909]: I0202 13:02:09.027946 4909 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="799e15d2-1ed2-4524-8b75-582f98eb003e" path="/var/lib/kubelet/pods/799e15d2-1ed2-4524-8b75-582f98eb003e/volumes" Feb 02 13:02:19 crc kubenswrapper[4909]: I0202 13:02:19.510747 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:02:19 crc kubenswrapper[4909]: I0202 13:02:19.511833 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:02:32 crc kubenswrapper[4909]: I0202 13:02:32.682093 4909 scope.go:117] "RemoveContainer" containerID="9c27f823792342dbc1595f839bc948799613166a05cd2289d49cee190b5fcc0d" Feb 02 13:02:49 crc kubenswrapper[4909]: I0202 13:02:49.510433 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:02:49 crc kubenswrapper[4909]: I0202 13:02:49.511204 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:02:58 crc kubenswrapper[4909]: I0202 13:02:58.773479 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tjhlg"] 
Feb 02 13:02:58 crc kubenswrapper[4909]: E0202 13:02:58.774498 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bfd2650-fa06-4608-8a79-60211d353a92" containerName="keystone-cron" Feb 02 13:02:58 crc kubenswrapper[4909]: I0202 13:02:58.774512 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bfd2650-fa06-4608-8a79-60211d353a92" containerName="keystone-cron" Feb 02 13:02:58 crc kubenswrapper[4909]: E0202 13:02:58.774540 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="799e15d2-1ed2-4524-8b75-582f98eb003e" containerName="adoption" Feb 02 13:02:58 crc kubenswrapper[4909]: I0202 13:02:58.774546 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="799e15d2-1ed2-4524-8b75-582f98eb003e" containerName="adoption" Feb 02 13:02:58 crc kubenswrapper[4909]: E0202 13:02:58.774566 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2846250-fddd-45b4-9dd2-f432fca93762" containerName="adoption" Feb 02 13:02:58 crc kubenswrapper[4909]: I0202 13:02:58.774572 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2846250-fddd-45b4-9dd2-f432fca93762" containerName="adoption" Feb 02 13:02:58 crc kubenswrapper[4909]: I0202 13:02:58.774778 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="799e15d2-1ed2-4524-8b75-582f98eb003e" containerName="adoption" Feb 02 13:02:58 crc kubenswrapper[4909]: I0202 13:02:58.774792 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bfd2650-fa06-4608-8a79-60211d353a92" containerName="keystone-cron" Feb 02 13:02:58 crc kubenswrapper[4909]: I0202 13:02:58.774801 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2846250-fddd-45b4-9dd2-f432fca93762" containerName="adoption" Feb 02 13:02:58 crc kubenswrapper[4909]: I0202 13:02:58.776219 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tjhlg" Feb 02 13:02:58 crc kubenswrapper[4909]: I0202 13:02:58.785088 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tjhlg"] Feb 02 13:02:58 crc kubenswrapper[4909]: I0202 13:02:58.917145 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f411a95-3656-4bfa-97b3-8cfc2f46b7d7-utilities\") pod \"redhat-operators-tjhlg\" (UID: \"2f411a95-3656-4bfa-97b3-8cfc2f46b7d7\") " pod="openshift-marketplace/redhat-operators-tjhlg" Feb 02 13:02:58 crc kubenswrapper[4909]: I0202 13:02:58.917297 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f411a95-3656-4bfa-97b3-8cfc2f46b7d7-catalog-content\") pod \"redhat-operators-tjhlg\" (UID: \"2f411a95-3656-4bfa-97b3-8cfc2f46b7d7\") " pod="openshift-marketplace/redhat-operators-tjhlg" Feb 02 13:02:58 crc kubenswrapper[4909]: I0202 13:02:58.917343 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhjx5\" (UniqueName: \"kubernetes.io/projected/2f411a95-3656-4bfa-97b3-8cfc2f46b7d7-kube-api-access-dhjx5\") pod \"redhat-operators-tjhlg\" (UID: \"2f411a95-3656-4bfa-97b3-8cfc2f46b7d7\") " pod="openshift-marketplace/redhat-operators-tjhlg" Feb 02 13:02:59 crc kubenswrapper[4909]: I0202 13:02:59.019063 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f411a95-3656-4bfa-97b3-8cfc2f46b7d7-catalog-content\") pod \"redhat-operators-tjhlg\" (UID: \"2f411a95-3656-4bfa-97b3-8cfc2f46b7d7\") " pod="openshift-marketplace/redhat-operators-tjhlg" Feb 02 13:02:59 crc kubenswrapper[4909]: I0202 13:02:59.019158 4909 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-dhjx5\" (UniqueName: \"kubernetes.io/projected/2f411a95-3656-4bfa-97b3-8cfc2f46b7d7-kube-api-access-dhjx5\") pod \"redhat-operators-tjhlg\" (UID: \"2f411a95-3656-4bfa-97b3-8cfc2f46b7d7\") " pod="openshift-marketplace/redhat-operators-tjhlg" Feb 02 13:02:59 crc kubenswrapper[4909]: I0202 13:02:59.019399 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f411a95-3656-4bfa-97b3-8cfc2f46b7d7-utilities\") pod \"redhat-operators-tjhlg\" (UID: \"2f411a95-3656-4bfa-97b3-8cfc2f46b7d7\") " pod="openshift-marketplace/redhat-operators-tjhlg" Feb 02 13:02:59 crc kubenswrapper[4909]: I0202 13:02:59.020071 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f411a95-3656-4bfa-97b3-8cfc2f46b7d7-utilities\") pod \"redhat-operators-tjhlg\" (UID: \"2f411a95-3656-4bfa-97b3-8cfc2f46b7d7\") " pod="openshift-marketplace/redhat-operators-tjhlg" Feb 02 13:02:59 crc kubenswrapper[4909]: I0202 13:02:59.020196 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f411a95-3656-4bfa-97b3-8cfc2f46b7d7-catalog-content\") pod \"redhat-operators-tjhlg\" (UID: \"2f411a95-3656-4bfa-97b3-8cfc2f46b7d7\") " pod="openshift-marketplace/redhat-operators-tjhlg" Feb 02 13:02:59 crc kubenswrapper[4909]: I0202 13:02:59.497992 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhjx5\" (UniqueName: \"kubernetes.io/projected/2f411a95-3656-4bfa-97b3-8cfc2f46b7d7-kube-api-access-dhjx5\") pod \"redhat-operators-tjhlg\" (UID: \"2f411a95-3656-4bfa-97b3-8cfc2f46b7d7\") " pod="openshift-marketplace/redhat-operators-tjhlg" Feb 02 13:02:59 crc kubenswrapper[4909]: I0202 13:02:59.727367 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tjhlg" Feb 02 13:03:00 crc kubenswrapper[4909]: I0202 13:03:00.283909 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tjhlg"] Feb 02 13:03:01 crc kubenswrapper[4909]: I0202 13:03:01.108153 4909 generic.go:334] "Generic (PLEG): container finished" podID="2f411a95-3656-4bfa-97b3-8cfc2f46b7d7" containerID="5c6ec45400c11f2f1caeaacd028cd5b1e34c2d3e459a4541be62381b14c48224" exitCode=0 Feb 02 13:03:01 crc kubenswrapper[4909]: I0202 13:03:01.108416 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tjhlg" event={"ID":"2f411a95-3656-4bfa-97b3-8cfc2f46b7d7","Type":"ContainerDied","Data":"5c6ec45400c11f2f1caeaacd028cd5b1e34c2d3e459a4541be62381b14c48224"} Feb 02 13:03:01 crc kubenswrapper[4909]: I0202 13:03:01.108441 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tjhlg" event={"ID":"2f411a95-3656-4bfa-97b3-8cfc2f46b7d7","Type":"ContainerStarted","Data":"6aee8ff022d0bc5ca32643f94a50ec1f63b36a0d031c07d44692b6a72a6a5ee9"} Feb 02 13:03:01 crc kubenswrapper[4909]: I0202 13:03:01.110272 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 13:03:02 crc kubenswrapper[4909]: I0202 13:03:02.422974 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fhvg7/must-gather-r8h2t"] Feb 02 13:03:02 crc kubenswrapper[4909]: I0202 13:03:02.425424 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fhvg7/must-gather-r8h2t" Feb 02 13:03:02 crc kubenswrapper[4909]: I0202 13:03:02.428618 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fhvg7"/"kube-root-ca.crt" Feb 02 13:03:02 crc kubenswrapper[4909]: I0202 13:03:02.430801 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fhvg7"/"openshift-service-ca.crt" Feb 02 13:03:02 crc kubenswrapper[4909]: I0202 13:03:02.432330 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-fhvg7"/"default-dockercfg-58kfb" Feb 02 13:03:02 crc kubenswrapper[4909]: I0202 13:03:02.434064 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fhvg7/must-gather-r8h2t"] Feb 02 13:03:02 crc kubenswrapper[4909]: I0202 13:03:02.600038 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/98d90e35-6535-4632-a1ac-594af1cae16e-must-gather-output\") pod \"must-gather-r8h2t\" (UID: \"98d90e35-6535-4632-a1ac-594af1cae16e\") " pod="openshift-must-gather-fhvg7/must-gather-r8h2t" Feb 02 13:03:02 crc kubenswrapper[4909]: I0202 13:03:02.600437 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqvfv\" (UniqueName: \"kubernetes.io/projected/98d90e35-6535-4632-a1ac-594af1cae16e-kube-api-access-cqvfv\") pod \"must-gather-r8h2t\" (UID: \"98d90e35-6535-4632-a1ac-594af1cae16e\") " pod="openshift-must-gather-fhvg7/must-gather-r8h2t" Feb 02 13:03:02 crc kubenswrapper[4909]: I0202 13:03:02.702392 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/98d90e35-6535-4632-a1ac-594af1cae16e-must-gather-output\") pod \"must-gather-r8h2t\" (UID: \"98d90e35-6535-4632-a1ac-594af1cae16e\") " 
pod="openshift-must-gather-fhvg7/must-gather-r8h2t" Feb 02 13:03:02 crc kubenswrapper[4909]: I0202 13:03:02.702458 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqvfv\" (UniqueName: \"kubernetes.io/projected/98d90e35-6535-4632-a1ac-594af1cae16e-kube-api-access-cqvfv\") pod \"must-gather-r8h2t\" (UID: \"98d90e35-6535-4632-a1ac-594af1cae16e\") " pod="openshift-must-gather-fhvg7/must-gather-r8h2t" Feb 02 13:03:02 crc kubenswrapper[4909]: I0202 13:03:02.702843 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/98d90e35-6535-4632-a1ac-594af1cae16e-must-gather-output\") pod \"must-gather-r8h2t\" (UID: \"98d90e35-6535-4632-a1ac-594af1cae16e\") " pod="openshift-must-gather-fhvg7/must-gather-r8h2t" Feb 02 13:03:02 crc kubenswrapper[4909]: I0202 13:03:02.731618 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqvfv\" (UniqueName: \"kubernetes.io/projected/98d90e35-6535-4632-a1ac-594af1cae16e-kube-api-access-cqvfv\") pod \"must-gather-r8h2t\" (UID: \"98d90e35-6535-4632-a1ac-594af1cae16e\") " pod="openshift-must-gather-fhvg7/must-gather-r8h2t" Feb 02 13:03:02 crc kubenswrapper[4909]: I0202 13:03:02.747725 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fhvg7/must-gather-r8h2t" Feb 02 13:03:03 crc kubenswrapper[4909]: I0202 13:03:03.133007 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tjhlg" event={"ID":"2f411a95-3656-4bfa-97b3-8cfc2f46b7d7","Type":"ContainerStarted","Data":"2d619e59a49ebaa13fb23f797eb7d32d8fe8d5705ccfd97aa0e9554672516bb1"} Feb 02 13:03:03 crc kubenswrapper[4909]: W0202 13:03:03.285221 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98d90e35_6535_4632_a1ac_594af1cae16e.slice/crio-e34f6aba480fde9ef41e75a48cd2b0ea5813ca359e972af43102849c5b675e3f WatchSource:0}: Error finding container e34f6aba480fde9ef41e75a48cd2b0ea5813ca359e972af43102849c5b675e3f: Status 404 returned error can't find the container with id e34f6aba480fde9ef41e75a48cd2b0ea5813ca359e972af43102849c5b675e3f Feb 02 13:03:03 crc kubenswrapper[4909]: I0202 13:03:03.290340 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fhvg7/must-gather-r8h2t"] Feb 02 13:03:04 crc kubenswrapper[4909]: I0202 13:03:04.156017 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fhvg7/must-gather-r8h2t" event={"ID":"98d90e35-6535-4632-a1ac-594af1cae16e","Type":"ContainerStarted","Data":"e34f6aba480fde9ef41e75a48cd2b0ea5813ca359e972af43102849c5b675e3f"} Feb 02 13:03:08 crc kubenswrapper[4909]: I0202 13:03:08.197311 4909 generic.go:334] "Generic (PLEG): container finished" podID="2f411a95-3656-4bfa-97b3-8cfc2f46b7d7" containerID="2d619e59a49ebaa13fb23f797eb7d32d8fe8d5705ccfd97aa0e9554672516bb1" exitCode=0 Feb 02 13:03:08 crc kubenswrapper[4909]: I0202 13:03:08.197455 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tjhlg" 
event={"ID":"2f411a95-3656-4bfa-97b3-8cfc2f46b7d7","Type":"ContainerDied","Data":"2d619e59a49ebaa13fb23f797eb7d32d8fe8d5705ccfd97aa0e9554672516bb1"} Feb 02 13:03:09 crc kubenswrapper[4909]: I0202 13:03:09.210983 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tjhlg" event={"ID":"2f411a95-3656-4bfa-97b3-8cfc2f46b7d7","Type":"ContainerStarted","Data":"b62c5b2977e9a29c216ddef9df8b8bbaf1b1cf40375cfdbd0325e406f5797e3b"} Feb 02 13:03:09 crc kubenswrapper[4909]: I0202 13:03:09.212542 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fhvg7/must-gather-r8h2t" event={"ID":"98d90e35-6535-4632-a1ac-594af1cae16e","Type":"ContainerStarted","Data":"184a0998b6c032682180ea2cff00141c58c145f82b956a39952817df1216ef9b"} Feb 02 13:03:09 crc kubenswrapper[4909]: I0202 13:03:09.212575 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fhvg7/must-gather-r8h2t" event={"ID":"98d90e35-6535-4632-a1ac-594af1cae16e","Type":"ContainerStarted","Data":"35bda5a09429f91f245b82f4294fc01560eaadfd0b8321433ce89ce81b3a3706"} Feb 02 13:03:09 crc kubenswrapper[4909]: I0202 13:03:09.240428 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tjhlg" podStartSLOduration=3.73133428 podStartE2EDuration="11.240403558s" podCreationTimestamp="2026-02-02 13:02:58 +0000 UTC" firstStartedPulling="2026-02-02 13:03:01.110011455 +0000 UTC m=+9106.856112190" lastFinishedPulling="2026-02-02 13:03:08.619080733 +0000 UTC m=+9114.365181468" observedRunningTime="2026-02-02 13:03:09.229455457 +0000 UTC m=+9114.975556202" watchObservedRunningTime="2026-02-02 13:03:09.240403558 +0000 UTC m=+9114.986504293" Feb 02 13:03:09 crc kubenswrapper[4909]: I0202 13:03:09.258762 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fhvg7/must-gather-r8h2t" podStartSLOduration=2.869485509 
podStartE2EDuration="7.258743008s" podCreationTimestamp="2026-02-02 13:03:02 +0000 UTC" firstStartedPulling="2026-02-02 13:03:03.287411321 +0000 UTC m=+9109.033512056" lastFinishedPulling="2026-02-02 13:03:07.67666882 +0000 UTC m=+9113.422769555" observedRunningTime="2026-02-02 13:03:09.243232848 +0000 UTC m=+9114.989333593" watchObservedRunningTime="2026-02-02 13:03:09.258743008 +0000 UTC m=+9115.004843743" Feb 02 13:03:09 crc kubenswrapper[4909]: I0202 13:03:09.728063 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tjhlg" Feb 02 13:03:09 crc kubenswrapper[4909]: I0202 13:03:09.728112 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tjhlg" Feb 02 13:03:10 crc kubenswrapper[4909]: I0202 13:03:10.886917 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tjhlg" podUID="2f411a95-3656-4bfa-97b3-8cfc2f46b7d7" containerName="registry-server" probeResult="failure" output=< Feb 02 13:03:10 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Feb 02 13:03:10 crc kubenswrapper[4909]: > Feb 02 13:03:12 crc kubenswrapper[4909]: I0202 13:03:12.927595 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fhvg7/crc-debug-49z9s"] Feb 02 13:03:12 crc kubenswrapper[4909]: I0202 13:03:12.929714 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fhvg7/crc-debug-49z9s" Feb 02 13:03:13 crc kubenswrapper[4909]: I0202 13:03:13.058162 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a380bed9-d687-45d8-8436-fac103efe7d5-host\") pod \"crc-debug-49z9s\" (UID: \"a380bed9-d687-45d8-8436-fac103efe7d5\") " pod="openshift-must-gather-fhvg7/crc-debug-49z9s" Feb 02 13:03:13 crc kubenswrapper[4909]: I0202 13:03:13.058306 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q8cb\" (UniqueName: \"kubernetes.io/projected/a380bed9-d687-45d8-8436-fac103efe7d5-kube-api-access-6q8cb\") pod \"crc-debug-49z9s\" (UID: \"a380bed9-d687-45d8-8436-fac103efe7d5\") " pod="openshift-must-gather-fhvg7/crc-debug-49z9s" Feb 02 13:03:13 crc kubenswrapper[4909]: I0202 13:03:13.160055 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a380bed9-d687-45d8-8436-fac103efe7d5-host\") pod \"crc-debug-49z9s\" (UID: \"a380bed9-d687-45d8-8436-fac103efe7d5\") " pod="openshift-must-gather-fhvg7/crc-debug-49z9s" Feb 02 13:03:13 crc kubenswrapper[4909]: I0202 13:03:13.160187 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q8cb\" (UniqueName: \"kubernetes.io/projected/a380bed9-d687-45d8-8436-fac103efe7d5-kube-api-access-6q8cb\") pod \"crc-debug-49z9s\" (UID: \"a380bed9-d687-45d8-8436-fac103efe7d5\") " pod="openshift-must-gather-fhvg7/crc-debug-49z9s" Feb 02 13:03:13 crc kubenswrapper[4909]: I0202 13:03:13.160195 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a380bed9-d687-45d8-8436-fac103efe7d5-host\") pod \"crc-debug-49z9s\" (UID: \"a380bed9-d687-45d8-8436-fac103efe7d5\") " pod="openshift-must-gather-fhvg7/crc-debug-49z9s" Feb 02 13:03:13 crc 
kubenswrapper[4909]: I0202 13:03:13.181098 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q8cb\" (UniqueName: \"kubernetes.io/projected/a380bed9-d687-45d8-8436-fac103efe7d5-kube-api-access-6q8cb\") pod \"crc-debug-49z9s\" (UID: \"a380bed9-d687-45d8-8436-fac103efe7d5\") " pod="openshift-must-gather-fhvg7/crc-debug-49z9s" Feb 02 13:03:13 crc kubenswrapper[4909]: I0202 13:03:13.269665 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fhvg7/crc-debug-49z9s" Feb 02 13:03:13 crc kubenswrapper[4909]: W0202 13:03:13.315221 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda380bed9_d687_45d8_8436_fac103efe7d5.slice/crio-53796bb93e1039f8b8dd1f0bc2816d1e40cfd4a81de3e85227358da659faacf9 WatchSource:0}: Error finding container 53796bb93e1039f8b8dd1f0bc2816d1e40cfd4a81de3e85227358da659faacf9: Status 404 returned error can't find the container with id 53796bb93e1039f8b8dd1f0bc2816d1e40cfd4a81de3e85227358da659faacf9 Feb 02 13:03:14 crc kubenswrapper[4909]: I0202 13:03:14.258842 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fhvg7/crc-debug-49z9s" event={"ID":"a380bed9-d687-45d8-8436-fac103efe7d5","Type":"ContainerStarted","Data":"53796bb93e1039f8b8dd1f0bc2816d1e40cfd4a81de3e85227358da659faacf9"} Feb 02 13:03:19 crc kubenswrapper[4909]: I0202 13:03:19.510498 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:03:19 crc kubenswrapper[4909]: I0202 13:03:19.511030 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" 
podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:03:19 crc kubenswrapper[4909]: I0202 13:03:19.511079 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 13:03:19 crc kubenswrapper[4909]: I0202 13:03:19.511848 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 13:03:19 crc kubenswrapper[4909]: I0202 13:03:19.511904 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0" gracePeriod=600 Feb 02 13:03:20 crc kubenswrapper[4909]: I0202 13:03:20.318292 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0" exitCode=0 Feb 02 13:03:20 crc kubenswrapper[4909]: I0202 13:03:20.318636 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0"} Feb 02 13:03:20 crc kubenswrapper[4909]: I0202 13:03:20.318666 4909 scope.go:117] "RemoveContainer" 
containerID="1d50cad22b608ac40c91d192efa1414f2b059dc44f5d2055a429824879c85ebf" Feb 02 13:03:20 crc kubenswrapper[4909]: I0202 13:03:20.781919 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tjhlg" podUID="2f411a95-3656-4bfa-97b3-8cfc2f46b7d7" containerName="registry-server" probeResult="failure" output=< Feb 02 13:03:20 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Feb 02 13:03:20 crc kubenswrapper[4909]: > Feb 02 13:03:25 crc kubenswrapper[4909]: E0202 13:03:25.272654 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:03:25 crc kubenswrapper[4909]: I0202 13:03:25.378606 4909 scope.go:117] "RemoveContainer" containerID="b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0" Feb 02 13:03:25 crc kubenswrapper[4909]: E0202 13:03:25.379320 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:03:26 crc kubenswrapper[4909]: I0202 13:03:26.388621 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fhvg7/crc-debug-49z9s" event={"ID":"a380bed9-d687-45d8-8436-fac103efe7d5","Type":"ContainerStarted","Data":"31797127a8c10e945e8a5c622a0b732618c08855d7bf0dc5b391e5495f529d0b"} Feb 02 13:03:26 crc 
kubenswrapper[4909]: I0202 13:03:26.415402 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fhvg7/crc-debug-49z9s" podStartSLOduration=2.368414723 podStartE2EDuration="14.415384125s" podCreationTimestamp="2026-02-02 13:03:12 +0000 UTC" firstStartedPulling="2026-02-02 13:03:13.317549413 +0000 UTC m=+9119.063650158" lastFinishedPulling="2026-02-02 13:03:25.364518825 +0000 UTC m=+9131.110619560" observedRunningTime="2026-02-02 13:03:26.40249943 +0000 UTC m=+9132.148600165" watchObservedRunningTime="2026-02-02 13:03:26.415384125 +0000 UTC m=+9132.161484860" Feb 02 13:03:30 crc kubenswrapper[4909]: I0202 13:03:30.780698 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tjhlg" podUID="2f411a95-3656-4bfa-97b3-8cfc2f46b7d7" containerName="registry-server" probeResult="failure" output=< Feb 02 13:03:30 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Feb 02 13:03:30 crc kubenswrapper[4909]: > Feb 02 13:03:37 crc kubenswrapper[4909]: I0202 13:03:37.017098 4909 scope.go:117] "RemoveContainer" containerID="b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0" Feb 02 13:03:37 crc kubenswrapper[4909]: E0202 13:03:37.018092 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:03:40 crc kubenswrapper[4909]: I0202 13:03:40.548155 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tjhlg" Feb 02 13:03:40 crc kubenswrapper[4909]: I0202 13:03:40.606499 4909 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tjhlg" Feb 02 13:03:40 crc kubenswrapper[4909]: I0202 13:03:40.793263 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tjhlg"] Feb 02 13:03:42 crc kubenswrapper[4909]: I0202 13:03:42.554000 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tjhlg" podUID="2f411a95-3656-4bfa-97b3-8cfc2f46b7d7" containerName="registry-server" containerID="cri-o://b62c5b2977e9a29c216ddef9df8b8bbaf1b1cf40375cfdbd0325e406f5797e3b" gracePeriod=2 Feb 02 13:03:43 crc kubenswrapper[4909]: I0202 13:03:43.511301 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tjhlg" Feb 02 13:03:43 crc kubenswrapper[4909]: I0202 13:03:43.572979 4909 generic.go:334] "Generic (PLEG): container finished" podID="2f411a95-3656-4bfa-97b3-8cfc2f46b7d7" containerID="b62c5b2977e9a29c216ddef9df8b8bbaf1b1cf40375cfdbd0325e406f5797e3b" exitCode=0 Feb 02 13:03:43 crc kubenswrapper[4909]: I0202 13:03:43.573026 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tjhlg" event={"ID":"2f411a95-3656-4bfa-97b3-8cfc2f46b7d7","Type":"ContainerDied","Data":"b62c5b2977e9a29c216ddef9df8b8bbaf1b1cf40375cfdbd0325e406f5797e3b"} Feb 02 13:03:43 crc kubenswrapper[4909]: I0202 13:03:43.573058 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tjhlg" event={"ID":"2f411a95-3656-4bfa-97b3-8cfc2f46b7d7","Type":"ContainerDied","Data":"6aee8ff022d0bc5ca32643f94a50ec1f63b36a0d031c07d44692b6a72a6a5ee9"} Feb 02 13:03:43 crc kubenswrapper[4909]: I0202 13:03:43.573080 4909 scope.go:117] "RemoveContainer" containerID="b62c5b2977e9a29c216ddef9df8b8bbaf1b1cf40375cfdbd0325e406f5797e3b" Feb 02 13:03:43 crc kubenswrapper[4909]: I0202 13:03:43.573210 4909 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tjhlg" Feb 02 13:03:43 crc kubenswrapper[4909]: I0202 13:03:43.614413 4909 scope.go:117] "RemoveContainer" containerID="2d619e59a49ebaa13fb23f797eb7d32d8fe8d5705ccfd97aa0e9554672516bb1" Feb 02 13:03:43 crc kubenswrapper[4909]: I0202 13:03:43.637077 4909 scope.go:117] "RemoveContainer" containerID="5c6ec45400c11f2f1caeaacd028cd5b1e34c2d3e459a4541be62381b14c48224" Feb 02 13:03:43 crc kubenswrapper[4909]: I0202 13:03:43.648545 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhjx5\" (UniqueName: \"kubernetes.io/projected/2f411a95-3656-4bfa-97b3-8cfc2f46b7d7-kube-api-access-dhjx5\") pod \"2f411a95-3656-4bfa-97b3-8cfc2f46b7d7\" (UID: \"2f411a95-3656-4bfa-97b3-8cfc2f46b7d7\") " Feb 02 13:03:43 crc kubenswrapper[4909]: I0202 13:03:43.648924 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f411a95-3656-4bfa-97b3-8cfc2f46b7d7-catalog-content\") pod \"2f411a95-3656-4bfa-97b3-8cfc2f46b7d7\" (UID: \"2f411a95-3656-4bfa-97b3-8cfc2f46b7d7\") " Feb 02 13:03:43 crc kubenswrapper[4909]: I0202 13:03:43.648979 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f411a95-3656-4bfa-97b3-8cfc2f46b7d7-utilities\") pod \"2f411a95-3656-4bfa-97b3-8cfc2f46b7d7\" (UID: \"2f411a95-3656-4bfa-97b3-8cfc2f46b7d7\") " Feb 02 13:03:43 crc kubenswrapper[4909]: I0202 13:03:43.649602 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f411a95-3656-4bfa-97b3-8cfc2f46b7d7-utilities" (OuterVolumeSpecName: "utilities") pod "2f411a95-3656-4bfa-97b3-8cfc2f46b7d7" (UID: "2f411a95-3656-4bfa-97b3-8cfc2f46b7d7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:03:43 crc kubenswrapper[4909]: I0202 13:03:43.654729 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f411a95-3656-4bfa-97b3-8cfc2f46b7d7-kube-api-access-dhjx5" (OuterVolumeSpecName: "kube-api-access-dhjx5") pod "2f411a95-3656-4bfa-97b3-8cfc2f46b7d7" (UID: "2f411a95-3656-4bfa-97b3-8cfc2f46b7d7"). InnerVolumeSpecName "kube-api-access-dhjx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:03:43 crc kubenswrapper[4909]: I0202 13:03:43.738141 4909 scope.go:117] "RemoveContainer" containerID="b62c5b2977e9a29c216ddef9df8b8bbaf1b1cf40375cfdbd0325e406f5797e3b" Feb 02 13:03:43 crc kubenswrapper[4909]: E0202 13:03:43.738517 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b62c5b2977e9a29c216ddef9df8b8bbaf1b1cf40375cfdbd0325e406f5797e3b\": container with ID starting with b62c5b2977e9a29c216ddef9df8b8bbaf1b1cf40375cfdbd0325e406f5797e3b not found: ID does not exist" containerID="b62c5b2977e9a29c216ddef9df8b8bbaf1b1cf40375cfdbd0325e406f5797e3b" Feb 02 13:03:43 crc kubenswrapper[4909]: I0202 13:03:43.738562 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b62c5b2977e9a29c216ddef9df8b8bbaf1b1cf40375cfdbd0325e406f5797e3b"} err="failed to get container status \"b62c5b2977e9a29c216ddef9df8b8bbaf1b1cf40375cfdbd0325e406f5797e3b\": rpc error: code = NotFound desc = could not find container \"b62c5b2977e9a29c216ddef9df8b8bbaf1b1cf40375cfdbd0325e406f5797e3b\": container with ID starting with b62c5b2977e9a29c216ddef9df8b8bbaf1b1cf40375cfdbd0325e406f5797e3b not found: ID does not exist" Feb 02 13:03:43 crc kubenswrapper[4909]: I0202 13:03:43.738587 4909 scope.go:117] "RemoveContainer" containerID="2d619e59a49ebaa13fb23f797eb7d32d8fe8d5705ccfd97aa0e9554672516bb1" Feb 02 13:03:43 crc kubenswrapper[4909]: E0202 13:03:43.738844 
4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d619e59a49ebaa13fb23f797eb7d32d8fe8d5705ccfd97aa0e9554672516bb1\": container with ID starting with 2d619e59a49ebaa13fb23f797eb7d32d8fe8d5705ccfd97aa0e9554672516bb1 not found: ID does not exist" containerID="2d619e59a49ebaa13fb23f797eb7d32d8fe8d5705ccfd97aa0e9554672516bb1" Feb 02 13:03:43 crc kubenswrapper[4909]: I0202 13:03:43.738874 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d619e59a49ebaa13fb23f797eb7d32d8fe8d5705ccfd97aa0e9554672516bb1"} err="failed to get container status \"2d619e59a49ebaa13fb23f797eb7d32d8fe8d5705ccfd97aa0e9554672516bb1\": rpc error: code = NotFound desc = could not find container \"2d619e59a49ebaa13fb23f797eb7d32d8fe8d5705ccfd97aa0e9554672516bb1\": container with ID starting with 2d619e59a49ebaa13fb23f797eb7d32d8fe8d5705ccfd97aa0e9554672516bb1 not found: ID does not exist" Feb 02 13:03:43 crc kubenswrapper[4909]: I0202 13:03:43.738892 4909 scope.go:117] "RemoveContainer" containerID="5c6ec45400c11f2f1caeaacd028cd5b1e34c2d3e459a4541be62381b14c48224" Feb 02 13:03:43 crc kubenswrapper[4909]: E0202 13:03:43.739184 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c6ec45400c11f2f1caeaacd028cd5b1e34c2d3e459a4541be62381b14c48224\": container with ID starting with 5c6ec45400c11f2f1caeaacd028cd5b1e34c2d3e459a4541be62381b14c48224 not found: ID does not exist" containerID="5c6ec45400c11f2f1caeaacd028cd5b1e34c2d3e459a4541be62381b14c48224" Feb 02 13:03:43 crc kubenswrapper[4909]: I0202 13:03:43.739226 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c6ec45400c11f2f1caeaacd028cd5b1e34c2d3e459a4541be62381b14c48224"} err="failed to get container status \"5c6ec45400c11f2f1caeaacd028cd5b1e34c2d3e459a4541be62381b14c48224\": rpc error: code = 
NotFound desc = could not find container \"5c6ec45400c11f2f1caeaacd028cd5b1e34c2d3e459a4541be62381b14c48224\": container with ID starting with 5c6ec45400c11f2f1caeaacd028cd5b1e34c2d3e459a4541be62381b14c48224 not found: ID does not exist" Feb 02 13:03:43 crc kubenswrapper[4909]: I0202 13:03:43.751525 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f411a95-3656-4bfa-97b3-8cfc2f46b7d7-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:03:43 crc kubenswrapper[4909]: I0202 13:03:43.751569 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhjx5\" (UniqueName: \"kubernetes.io/projected/2f411a95-3656-4bfa-97b3-8cfc2f46b7d7-kube-api-access-dhjx5\") on node \"crc\" DevicePath \"\"" Feb 02 13:03:43 crc kubenswrapper[4909]: I0202 13:03:43.774479 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f411a95-3656-4bfa-97b3-8cfc2f46b7d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f411a95-3656-4bfa-97b3-8cfc2f46b7d7" (UID: "2f411a95-3656-4bfa-97b3-8cfc2f46b7d7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:03:43 crc kubenswrapper[4909]: I0202 13:03:43.853461 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f411a95-3656-4bfa-97b3-8cfc2f46b7d7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:03:43 crc kubenswrapper[4909]: I0202 13:03:43.930855 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tjhlg"] Feb 02 13:03:43 crc kubenswrapper[4909]: I0202 13:03:43.940296 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tjhlg"] Feb 02 13:03:45 crc kubenswrapper[4909]: I0202 13:03:45.049972 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f411a95-3656-4bfa-97b3-8cfc2f46b7d7" path="/var/lib/kubelet/pods/2f411a95-3656-4bfa-97b3-8cfc2f46b7d7/volumes" Feb 02 13:03:50 crc kubenswrapper[4909]: I0202 13:03:50.017087 4909 scope.go:117] "RemoveContainer" containerID="b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0" Feb 02 13:03:50 crc kubenswrapper[4909]: E0202 13:03:50.018708 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:04:01 crc kubenswrapper[4909]: I0202 13:04:01.016511 4909 scope.go:117] "RemoveContainer" containerID="b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0" Feb 02 13:04:01 crc kubenswrapper[4909]: E0202 13:04:01.017314 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:04:08 crc kubenswrapper[4909]: I0202 13:04:08.823453 4909 generic.go:334] "Generic (PLEG): container finished" podID="a380bed9-d687-45d8-8436-fac103efe7d5" containerID="31797127a8c10e945e8a5c622a0b732618c08855d7bf0dc5b391e5495f529d0b" exitCode=0 Feb 02 13:04:08 crc kubenswrapper[4909]: I0202 13:04:08.823536 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fhvg7/crc-debug-49z9s" event={"ID":"a380bed9-d687-45d8-8436-fac103efe7d5","Type":"ContainerDied","Data":"31797127a8c10e945e8a5c622a0b732618c08855d7bf0dc5b391e5495f529d0b"} Feb 02 13:04:09 crc kubenswrapper[4909]: I0202 13:04:09.975079 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fhvg7/crc-debug-49z9s" Feb 02 13:04:10 crc kubenswrapper[4909]: I0202 13:04:10.010690 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fhvg7/crc-debug-49z9s"] Feb 02 13:04:10 crc kubenswrapper[4909]: I0202 13:04:10.023598 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fhvg7/crc-debug-49z9s"] Feb 02 13:04:10 crc kubenswrapper[4909]: I0202 13:04:10.135782 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a380bed9-d687-45d8-8436-fac103efe7d5-host\") pod \"a380bed9-d687-45d8-8436-fac103efe7d5\" (UID: \"a380bed9-d687-45d8-8436-fac103efe7d5\") " Feb 02 13:04:10 crc kubenswrapper[4909]: I0202 13:04:10.135915 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6q8cb\" (UniqueName: \"kubernetes.io/projected/a380bed9-d687-45d8-8436-fac103efe7d5-kube-api-access-6q8cb\") pod 
\"a380bed9-d687-45d8-8436-fac103efe7d5\" (UID: \"a380bed9-d687-45d8-8436-fac103efe7d5\") " Feb 02 13:04:10 crc kubenswrapper[4909]: I0202 13:04:10.135913 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a380bed9-d687-45d8-8436-fac103efe7d5-host" (OuterVolumeSpecName: "host") pod "a380bed9-d687-45d8-8436-fac103efe7d5" (UID: "a380bed9-d687-45d8-8436-fac103efe7d5"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 13:04:10 crc kubenswrapper[4909]: I0202 13:04:10.136624 4909 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a380bed9-d687-45d8-8436-fac103efe7d5-host\") on node \"crc\" DevicePath \"\"" Feb 02 13:04:10 crc kubenswrapper[4909]: I0202 13:04:10.155018 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a380bed9-d687-45d8-8436-fac103efe7d5-kube-api-access-6q8cb" (OuterVolumeSpecName: "kube-api-access-6q8cb") pod "a380bed9-d687-45d8-8436-fac103efe7d5" (UID: "a380bed9-d687-45d8-8436-fac103efe7d5"). InnerVolumeSpecName "kube-api-access-6q8cb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:04:10 crc kubenswrapper[4909]: I0202 13:04:10.238831 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6q8cb\" (UniqueName: \"kubernetes.io/projected/a380bed9-d687-45d8-8436-fac103efe7d5-kube-api-access-6q8cb\") on node \"crc\" DevicePath \"\"" Feb 02 13:04:10 crc kubenswrapper[4909]: I0202 13:04:10.846175 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53796bb93e1039f8b8dd1f0bc2816d1e40cfd4a81de3e85227358da659faacf9" Feb 02 13:04:10 crc kubenswrapper[4909]: I0202 13:04:10.846252 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fhvg7/crc-debug-49z9s" Feb 02 13:04:11 crc kubenswrapper[4909]: I0202 13:04:11.028465 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a380bed9-d687-45d8-8436-fac103efe7d5" path="/var/lib/kubelet/pods/a380bed9-d687-45d8-8436-fac103efe7d5/volumes" Feb 02 13:04:11 crc kubenswrapper[4909]: I0202 13:04:11.245264 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fhvg7/crc-debug-d589x"] Feb 02 13:04:11 crc kubenswrapper[4909]: E0202 13:04:11.245744 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f411a95-3656-4bfa-97b3-8cfc2f46b7d7" containerName="extract-content" Feb 02 13:04:11 crc kubenswrapper[4909]: I0202 13:04:11.245765 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f411a95-3656-4bfa-97b3-8cfc2f46b7d7" containerName="extract-content" Feb 02 13:04:11 crc kubenswrapper[4909]: E0202 13:04:11.245780 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a380bed9-d687-45d8-8436-fac103efe7d5" containerName="container-00" Feb 02 13:04:11 crc kubenswrapper[4909]: I0202 13:04:11.245787 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="a380bed9-d687-45d8-8436-fac103efe7d5" containerName="container-00" Feb 02 13:04:11 crc kubenswrapper[4909]: E0202 13:04:11.245796 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f411a95-3656-4bfa-97b3-8cfc2f46b7d7" containerName="extract-utilities" Feb 02 13:04:11 crc kubenswrapper[4909]: I0202 13:04:11.245802 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f411a95-3656-4bfa-97b3-8cfc2f46b7d7" containerName="extract-utilities" Feb 02 13:04:11 crc kubenswrapper[4909]: E0202 13:04:11.245853 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f411a95-3656-4bfa-97b3-8cfc2f46b7d7" containerName="registry-server" Feb 02 13:04:11 crc kubenswrapper[4909]: I0202 13:04:11.245861 4909 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2f411a95-3656-4bfa-97b3-8cfc2f46b7d7" containerName="registry-server" Feb 02 13:04:11 crc kubenswrapper[4909]: I0202 13:04:11.246095 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f411a95-3656-4bfa-97b3-8cfc2f46b7d7" containerName="registry-server" Feb 02 13:04:11 crc kubenswrapper[4909]: I0202 13:04:11.246139 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="a380bed9-d687-45d8-8436-fac103efe7d5" containerName="container-00" Feb 02 13:04:11 crc kubenswrapper[4909]: I0202 13:04:11.246857 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fhvg7/crc-debug-d589x" Feb 02 13:04:11 crc kubenswrapper[4909]: I0202 13:04:11.361423 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/28b2e40f-1994-439e-ae9e-2e56ab3c903b-host\") pod \"crc-debug-d589x\" (UID: \"28b2e40f-1994-439e-ae9e-2e56ab3c903b\") " pod="openshift-must-gather-fhvg7/crc-debug-d589x" Feb 02 13:04:11 crc kubenswrapper[4909]: I0202 13:04:11.361644 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls6gn\" (UniqueName: \"kubernetes.io/projected/28b2e40f-1994-439e-ae9e-2e56ab3c903b-kube-api-access-ls6gn\") pod \"crc-debug-d589x\" (UID: \"28b2e40f-1994-439e-ae9e-2e56ab3c903b\") " pod="openshift-must-gather-fhvg7/crc-debug-d589x" Feb 02 13:04:11 crc kubenswrapper[4909]: I0202 13:04:11.465120 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/28b2e40f-1994-439e-ae9e-2e56ab3c903b-host\") pod \"crc-debug-d589x\" (UID: \"28b2e40f-1994-439e-ae9e-2e56ab3c903b\") " pod="openshift-must-gather-fhvg7/crc-debug-d589x" Feb 02 13:04:11 crc kubenswrapper[4909]: I0202 13:04:11.465313 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/28b2e40f-1994-439e-ae9e-2e56ab3c903b-host\") pod \"crc-debug-d589x\" (UID: \"28b2e40f-1994-439e-ae9e-2e56ab3c903b\") " pod="openshift-must-gather-fhvg7/crc-debug-d589x" Feb 02 13:04:11 crc kubenswrapper[4909]: I0202 13:04:11.465364 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls6gn\" (UniqueName: \"kubernetes.io/projected/28b2e40f-1994-439e-ae9e-2e56ab3c903b-kube-api-access-ls6gn\") pod \"crc-debug-d589x\" (UID: \"28b2e40f-1994-439e-ae9e-2e56ab3c903b\") " pod="openshift-must-gather-fhvg7/crc-debug-d589x" Feb 02 13:04:11 crc kubenswrapper[4909]: I0202 13:04:11.495941 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls6gn\" (UniqueName: \"kubernetes.io/projected/28b2e40f-1994-439e-ae9e-2e56ab3c903b-kube-api-access-ls6gn\") pod \"crc-debug-d589x\" (UID: \"28b2e40f-1994-439e-ae9e-2e56ab3c903b\") " pod="openshift-must-gather-fhvg7/crc-debug-d589x" Feb 02 13:04:11 crc kubenswrapper[4909]: I0202 13:04:11.566539 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fhvg7/crc-debug-d589x" Feb 02 13:04:11 crc kubenswrapper[4909]: I0202 13:04:11.856433 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fhvg7/crc-debug-d589x" event={"ID":"28b2e40f-1994-439e-ae9e-2e56ab3c903b","Type":"ContainerStarted","Data":"1840715a603be266a59299c44e3e59896642e3bf477318f32b7b0b964642a3b6"} Feb 02 13:04:12 crc kubenswrapper[4909]: I0202 13:04:12.866791 4909 generic.go:334] "Generic (PLEG): container finished" podID="28b2e40f-1994-439e-ae9e-2e56ab3c903b" containerID="059da121c59100c7f026831849cdd84a4b34563273b90e26ee1833d6539e417a" exitCode=0 Feb 02 13:04:12 crc kubenswrapper[4909]: I0202 13:04:12.866864 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fhvg7/crc-debug-d589x" event={"ID":"28b2e40f-1994-439e-ae9e-2e56ab3c903b","Type":"ContainerDied","Data":"059da121c59100c7f026831849cdd84a4b34563273b90e26ee1833d6539e417a"} Feb 02 13:04:13 crc kubenswrapper[4909]: I0202 13:04:13.314143 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fhvg7/crc-debug-d589x"] Feb 02 13:04:13 crc kubenswrapper[4909]: I0202 13:04:13.325014 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fhvg7/crc-debug-d589x"] Feb 02 13:04:14 crc kubenswrapper[4909]: I0202 13:04:14.056930 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fhvg7/crc-debug-d589x" Feb 02 13:04:14 crc kubenswrapper[4909]: I0202 13:04:14.224858 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/28b2e40f-1994-439e-ae9e-2e56ab3c903b-host\") pod \"28b2e40f-1994-439e-ae9e-2e56ab3c903b\" (UID: \"28b2e40f-1994-439e-ae9e-2e56ab3c903b\") " Feb 02 13:04:14 crc kubenswrapper[4909]: I0202 13:04:14.224994 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/28b2e40f-1994-439e-ae9e-2e56ab3c903b-host" (OuterVolumeSpecName: "host") pod "28b2e40f-1994-439e-ae9e-2e56ab3c903b" (UID: "28b2e40f-1994-439e-ae9e-2e56ab3c903b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 13:04:14 crc kubenswrapper[4909]: I0202 13:04:14.225155 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls6gn\" (UniqueName: \"kubernetes.io/projected/28b2e40f-1994-439e-ae9e-2e56ab3c903b-kube-api-access-ls6gn\") pod \"28b2e40f-1994-439e-ae9e-2e56ab3c903b\" (UID: \"28b2e40f-1994-439e-ae9e-2e56ab3c903b\") " Feb 02 13:04:14 crc kubenswrapper[4909]: I0202 13:04:14.225754 4909 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/28b2e40f-1994-439e-ae9e-2e56ab3c903b-host\") on node \"crc\" DevicePath \"\"" Feb 02 13:04:14 crc kubenswrapper[4909]: I0202 13:04:14.231156 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28b2e40f-1994-439e-ae9e-2e56ab3c903b-kube-api-access-ls6gn" (OuterVolumeSpecName: "kube-api-access-ls6gn") pod "28b2e40f-1994-439e-ae9e-2e56ab3c903b" (UID: "28b2e40f-1994-439e-ae9e-2e56ab3c903b"). InnerVolumeSpecName "kube-api-access-ls6gn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:04:14 crc kubenswrapper[4909]: I0202 13:04:14.327243 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls6gn\" (UniqueName: \"kubernetes.io/projected/28b2e40f-1994-439e-ae9e-2e56ab3c903b-kube-api-access-ls6gn\") on node \"crc\" DevicePath \"\"" Feb 02 13:04:14 crc kubenswrapper[4909]: I0202 13:04:14.885378 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1840715a603be266a59299c44e3e59896642e3bf477318f32b7b0b964642a3b6" Feb 02 13:04:14 crc kubenswrapper[4909]: I0202 13:04:14.885428 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fhvg7/crc-debug-d589x" Feb 02 13:04:15 crc kubenswrapper[4909]: I0202 13:04:15.026907 4909 scope.go:117] "RemoveContainer" containerID="b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0" Feb 02 13:04:15 crc kubenswrapper[4909]: E0202 13:04:15.027487 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:04:15 crc kubenswrapper[4909]: I0202 13:04:15.033236 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28b2e40f-1994-439e-ae9e-2e56ab3c903b" path="/var/lib/kubelet/pods/28b2e40f-1994-439e-ae9e-2e56ab3c903b/volumes" Feb 02 13:04:15 crc kubenswrapper[4909]: I0202 13:04:15.062772 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fhvg7/crc-debug-4l9lj"] Feb 02 13:04:15 crc kubenswrapper[4909]: E0202 13:04:15.063251 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="28b2e40f-1994-439e-ae9e-2e56ab3c903b" containerName="container-00" Feb 02 13:04:15 crc kubenswrapper[4909]: I0202 13:04:15.063267 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="28b2e40f-1994-439e-ae9e-2e56ab3c903b" containerName="container-00" Feb 02 13:04:15 crc kubenswrapper[4909]: I0202 13:04:15.063458 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="28b2e40f-1994-439e-ae9e-2e56ab3c903b" containerName="container-00" Feb 02 13:04:15 crc kubenswrapper[4909]: I0202 13:04:15.064132 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fhvg7/crc-debug-4l9lj" Feb 02 13:04:15 crc kubenswrapper[4909]: I0202 13:04:15.142874 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg7qp\" (UniqueName: \"kubernetes.io/projected/85b0c415-5158-4089-a4d6-a27ac9e840f9-kube-api-access-xg7qp\") pod \"crc-debug-4l9lj\" (UID: \"85b0c415-5158-4089-a4d6-a27ac9e840f9\") " pod="openshift-must-gather-fhvg7/crc-debug-4l9lj" Feb 02 13:04:15 crc kubenswrapper[4909]: I0202 13:04:15.143065 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/85b0c415-5158-4089-a4d6-a27ac9e840f9-host\") pod \"crc-debug-4l9lj\" (UID: \"85b0c415-5158-4089-a4d6-a27ac9e840f9\") " pod="openshift-must-gather-fhvg7/crc-debug-4l9lj" Feb 02 13:04:15 crc kubenswrapper[4909]: I0202 13:04:15.244756 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/85b0c415-5158-4089-a4d6-a27ac9e840f9-host\") pod \"crc-debug-4l9lj\" (UID: \"85b0c415-5158-4089-a4d6-a27ac9e840f9\") " pod="openshift-must-gather-fhvg7/crc-debug-4l9lj" Feb 02 13:04:15 crc kubenswrapper[4909]: I0202 13:04:15.244943 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/85b0c415-5158-4089-a4d6-a27ac9e840f9-host\") pod \"crc-debug-4l9lj\" (UID: \"85b0c415-5158-4089-a4d6-a27ac9e840f9\") " pod="openshift-must-gather-fhvg7/crc-debug-4l9lj" Feb 02 13:04:15 crc kubenswrapper[4909]: I0202 13:04:15.244936 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg7qp\" (UniqueName: \"kubernetes.io/projected/85b0c415-5158-4089-a4d6-a27ac9e840f9-kube-api-access-xg7qp\") pod \"crc-debug-4l9lj\" (UID: \"85b0c415-5158-4089-a4d6-a27ac9e840f9\") " pod="openshift-must-gather-fhvg7/crc-debug-4l9lj" Feb 02 13:04:15 crc kubenswrapper[4909]: I0202 13:04:15.260127 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg7qp\" (UniqueName: \"kubernetes.io/projected/85b0c415-5158-4089-a4d6-a27ac9e840f9-kube-api-access-xg7qp\") pod \"crc-debug-4l9lj\" (UID: \"85b0c415-5158-4089-a4d6-a27ac9e840f9\") " pod="openshift-must-gather-fhvg7/crc-debug-4l9lj" Feb 02 13:04:15 crc kubenswrapper[4909]: I0202 13:04:15.381510 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fhvg7/crc-debug-4l9lj" Feb 02 13:04:15 crc kubenswrapper[4909]: W0202 13:04:15.834484 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85b0c415_5158_4089_a4d6_a27ac9e840f9.slice/crio-9ec464999657da85102b8e6c3c7429686f07a86e12210c878b56f90e08f0a3fa WatchSource:0}: Error finding container 9ec464999657da85102b8e6c3c7429686f07a86e12210c878b56f90e08f0a3fa: Status 404 returned error can't find the container with id 9ec464999657da85102b8e6c3c7429686f07a86e12210c878b56f90e08f0a3fa Feb 02 13:04:15 crc kubenswrapper[4909]: I0202 13:04:15.895274 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fhvg7/crc-debug-4l9lj" event={"ID":"85b0c415-5158-4089-a4d6-a27ac9e840f9","Type":"ContainerStarted","Data":"9ec464999657da85102b8e6c3c7429686f07a86e12210c878b56f90e08f0a3fa"} Feb 02 13:04:16 crc kubenswrapper[4909]: I0202 13:04:16.907711 4909 generic.go:334] "Generic (PLEG): container finished" podID="85b0c415-5158-4089-a4d6-a27ac9e840f9" containerID="d0aee9239c9340844531c42e9e33c5f513006baea35b10b906f9e1954907d686" exitCode=0 Feb 02 13:04:16 crc kubenswrapper[4909]: I0202 13:04:16.907862 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fhvg7/crc-debug-4l9lj" event={"ID":"85b0c415-5158-4089-a4d6-a27ac9e840f9","Type":"ContainerDied","Data":"d0aee9239c9340844531c42e9e33c5f513006baea35b10b906f9e1954907d686"} Feb 02 13:04:16 crc kubenswrapper[4909]: I0202 13:04:16.968124 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fhvg7/crc-debug-4l9lj"] Feb 02 13:04:16 crc kubenswrapper[4909]: I0202 13:04:16.988243 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fhvg7/crc-debug-4l9lj"] Feb 02 13:04:18 crc kubenswrapper[4909]: I0202 13:04:18.020026 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fhvg7/crc-debug-4l9lj" Feb 02 13:04:18 crc kubenswrapper[4909]: I0202 13:04:18.105096 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/85b0c415-5158-4089-a4d6-a27ac9e840f9-host\") pod \"85b0c415-5158-4089-a4d6-a27ac9e840f9\" (UID: \"85b0c415-5158-4089-a4d6-a27ac9e840f9\") " Feb 02 13:04:18 crc kubenswrapper[4909]: I0202 13:04:18.105247 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg7qp\" (UniqueName: \"kubernetes.io/projected/85b0c415-5158-4089-a4d6-a27ac9e840f9-kube-api-access-xg7qp\") pod \"85b0c415-5158-4089-a4d6-a27ac9e840f9\" (UID: \"85b0c415-5158-4089-a4d6-a27ac9e840f9\") " Feb 02 13:04:18 crc kubenswrapper[4909]: I0202 13:04:18.106178 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85b0c415-5158-4089-a4d6-a27ac9e840f9-host" (OuterVolumeSpecName: "host") pod "85b0c415-5158-4089-a4d6-a27ac9e840f9" (UID: "85b0c415-5158-4089-a4d6-a27ac9e840f9"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 02 13:04:18 crc kubenswrapper[4909]: I0202 13:04:18.113800 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85b0c415-5158-4089-a4d6-a27ac9e840f9-kube-api-access-xg7qp" (OuterVolumeSpecName: "kube-api-access-xg7qp") pod "85b0c415-5158-4089-a4d6-a27ac9e840f9" (UID: "85b0c415-5158-4089-a4d6-a27ac9e840f9"). InnerVolumeSpecName "kube-api-access-xg7qp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:04:18 crc kubenswrapper[4909]: I0202 13:04:18.207779 4909 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/85b0c415-5158-4089-a4d6-a27ac9e840f9-host\") on node \"crc\" DevicePath \"\"" Feb 02 13:04:18 crc kubenswrapper[4909]: I0202 13:04:18.207831 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg7qp\" (UniqueName: \"kubernetes.io/projected/85b0c415-5158-4089-a4d6-a27ac9e840f9-kube-api-access-xg7qp\") on node \"crc\" DevicePath \"\"" Feb 02 13:04:18 crc kubenswrapper[4909]: I0202 13:04:18.937221 4909 scope.go:117] "RemoveContainer" containerID="d0aee9239c9340844531c42e9e33c5f513006baea35b10b906f9e1954907d686" Feb 02 13:04:18 crc kubenswrapper[4909]: I0202 13:04:18.937372 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fhvg7/crc-debug-4l9lj" Feb 02 13:04:19 crc kubenswrapper[4909]: I0202 13:04:19.029100 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85b0c415-5158-4089-a4d6-a27ac9e840f9" path="/var/lib/kubelet/pods/85b0c415-5158-4089-a4d6-a27ac9e840f9/volumes" Feb 02 13:04:28 crc kubenswrapper[4909]: I0202 13:04:28.016730 4909 scope.go:117] "RemoveContainer" containerID="b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0" Feb 02 13:04:28 crc kubenswrapper[4909]: E0202 13:04:28.017662 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:04:41 crc kubenswrapper[4909]: I0202 13:04:41.022406 4909 scope.go:117] "RemoveContainer" 
containerID="b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0" Feb 02 13:04:41 crc kubenswrapper[4909]: E0202 13:04:41.023277 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:04:54 crc kubenswrapper[4909]: I0202 13:04:54.018093 4909 scope.go:117] "RemoveContainer" containerID="b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0" Feb 02 13:04:54 crc kubenswrapper[4909]: E0202 13:04:54.019516 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:05:06 crc kubenswrapper[4909]: I0202 13:05:06.016404 4909 scope.go:117] "RemoveContainer" containerID="b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0" Feb 02 13:05:06 crc kubenswrapper[4909]: E0202 13:05:06.017360 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:05:21 crc kubenswrapper[4909]: I0202 13:05:21.017045 4909 scope.go:117] 
"RemoveContainer" containerID="b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0" Feb 02 13:05:21 crc kubenswrapper[4909]: E0202 13:05:21.018303 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:05:36 crc kubenswrapper[4909]: I0202 13:05:36.016481 4909 scope.go:117] "RemoveContainer" containerID="b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0" Feb 02 13:05:36 crc kubenswrapper[4909]: E0202 13:05:36.017351 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:05:47 crc kubenswrapper[4909]: I0202 13:05:47.016254 4909 scope.go:117] "RemoveContainer" containerID="b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0" Feb 02 13:05:47 crc kubenswrapper[4909]: E0202 13:05:47.017004 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:05:59 crc kubenswrapper[4909]: I0202 13:05:59.017364 
4909 scope.go:117] "RemoveContainer" containerID="b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0" Feb 02 13:05:59 crc kubenswrapper[4909]: E0202 13:05:59.018046 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:06:06 crc kubenswrapper[4909]: I0202 13:06:06.305998 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w6rlw"] Feb 02 13:06:06 crc kubenswrapper[4909]: E0202 13:06:06.308273 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85b0c415-5158-4089-a4d6-a27ac9e840f9" containerName="container-00" Feb 02 13:06:06 crc kubenswrapper[4909]: I0202 13:06:06.316987 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="85b0c415-5158-4089-a4d6-a27ac9e840f9" containerName="container-00" Feb 02 13:06:06 crc kubenswrapper[4909]: I0202 13:06:06.317495 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="85b0c415-5158-4089-a4d6-a27ac9e840f9" containerName="container-00" Feb 02 13:06:06 crc kubenswrapper[4909]: I0202 13:06:06.319196 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w6rlw"] Feb 02 13:06:06 crc kubenswrapper[4909]: I0202 13:06:06.319391 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w6rlw" Feb 02 13:06:06 crc kubenswrapper[4909]: I0202 13:06:06.364852 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57mqw\" (UniqueName: \"kubernetes.io/projected/b3ac9eea-92cd-46e7-85d6-86c7ee81d82a-kube-api-access-57mqw\") pod \"certified-operators-w6rlw\" (UID: \"b3ac9eea-92cd-46e7-85d6-86c7ee81d82a\") " pod="openshift-marketplace/certified-operators-w6rlw" Feb 02 13:06:06 crc kubenswrapper[4909]: I0202 13:06:06.365265 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3ac9eea-92cd-46e7-85d6-86c7ee81d82a-utilities\") pod \"certified-operators-w6rlw\" (UID: \"b3ac9eea-92cd-46e7-85d6-86c7ee81d82a\") " pod="openshift-marketplace/certified-operators-w6rlw" Feb 02 13:06:06 crc kubenswrapper[4909]: I0202 13:06:06.365378 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3ac9eea-92cd-46e7-85d6-86c7ee81d82a-catalog-content\") pod \"certified-operators-w6rlw\" (UID: \"b3ac9eea-92cd-46e7-85d6-86c7ee81d82a\") " pod="openshift-marketplace/certified-operators-w6rlw" Feb 02 13:06:06 crc kubenswrapper[4909]: I0202 13:06:06.466736 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57mqw\" (UniqueName: \"kubernetes.io/projected/b3ac9eea-92cd-46e7-85d6-86c7ee81d82a-kube-api-access-57mqw\") pod \"certified-operators-w6rlw\" (UID: \"b3ac9eea-92cd-46e7-85d6-86c7ee81d82a\") " pod="openshift-marketplace/certified-operators-w6rlw" Feb 02 13:06:06 crc kubenswrapper[4909]: I0202 13:06:06.466983 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3ac9eea-92cd-46e7-85d6-86c7ee81d82a-utilities\") pod 
\"certified-operators-w6rlw\" (UID: \"b3ac9eea-92cd-46e7-85d6-86c7ee81d82a\") " pod="openshift-marketplace/certified-operators-w6rlw" Feb 02 13:06:06 crc kubenswrapper[4909]: I0202 13:06:06.467038 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3ac9eea-92cd-46e7-85d6-86c7ee81d82a-catalog-content\") pod \"certified-operators-w6rlw\" (UID: \"b3ac9eea-92cd-46e7-85d6-86c7ee81d82a\") " pod="openshift-marketplace/certified-operators-w6rlw" Feb 02 13:06:06 crc kubenswrapper[4909]: I0202 13:06:06.467544 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3ac9eea-92cd-46e7-85d6-86c7ee81d82a-catalog-content\") pod \"certified-operators-w6rlw\" (UID: \"b3ac9eea-92cd-46e7-85d6-86c7ee81d82a\") " pod="openshift-marketplace/certified-operators-w6rlw" Feb 02 13:06:06 crc kubenswrapper[4909]: I0202 13:06:06.467565 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3ac9eea-92cd-46e7-85d6-86c7ee81d82a-utilities\") pod \"certified-operators-w6rlw\" (UID: \"b3ac9eea-92cd-46e7-85d6-86c7ee81d82a\") " pod="openshift-marketplace/certified-operators-w6rlw" Feb 02 13:06:06 crc kubenswrapper[4909]: I0202 13:06:06.485758 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57mqw\" (UniqueName: \"kubernetes.io/projected/b3ac9eea-92cd-46e7-85d6-86c7ee81d82a-kube-api-access-57mqw\") pod \"certified-operators-w6rlw\" (UID: \"b3ac9eea-92cd-46e7-85d6-86c7ee81d82a\") " pod="openshift-marketplace/certified-operators-w6rlw" Feb 02 13:06:06 crc kubenswrapper[4909]: I0202 13:06:06.649901 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w6rlw" Feb 02 13:06:07 crc kubenswrapper[4909]: I0202 13:06:07.203245 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w6rlw"] Feb 02 13:06:07 crc kubenswrapper[4909]: W0202 13:06:07.205322 4909 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3ac9eea_92cd_46e7_85d6_86c7ee81d82a.slice/crio-8b6aa9ad75bc7caa95e155d7349dfb4fedb178e202730e56edbba1d9c607bba4 WatchSource:0}: Error finding container 8b6aa9ad75bc7caa95e155d7349dfb4fedb178e202730e56edbba1d9c607bba4: Status 404 returned error can't find the container with id 8b6aa9ad75bc7caa95e155d7349dfb4fedb178e202730e56edbba1d9c607bba4 Feb 02 13:06:07 crc kubenswrapper[4909]: I0202 13:06:07.355839 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6rlw" event={"ID":"b3ac9eea-92cd-46e7-85d6-86c7ee81d82a","Type":"ContainerStarted","Data":"8b6aa9ad75bc7caa95e155d7349dfb4fedb178e202730e56edbba1d9c607bba4"} Feb 02 13:06:08 crc kubenswrapper[4909]: I0202 13:06:08.366977 4909 generic.go:334] "Generic (PLEG): container finished" podID="b3ac9eea-92cd-46e7-85d6-86c7ee81d82a" containerID="8a8280a35d59206a13dff7eb93f70dff9a9dba4ea41d02a9f3303d7bb220d261" exitCode=0 Feb 02 13:06:08 crc kubenswrapper[4909]: I0202 13:06:08.367045 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6rlw" event={"ID":"b3ac9eea-92cd-46e7-85d6-86c7ee81d82a","Type":"ContainerDied","Data":"8a8280a35d59206a13dff7eb93f70dff9a9dba4ea41d02a9f3303d7bb220d261"} Feb 02 13:06:09 crc kubenswrapper[4909]: I0202 13:06:09.379562 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6rlw" 
event={"ID":"b3ac9eea-92cd-46e7-85d6-86c7ee81d82a","Type":"ContainerStarted","Data":"b32d991b7b7bc51982e1f25c0c6cca8e7b8732f91db6902d54e7357990ef7cd3"} Feb 02 13:06:10 crc kubenswrapper[4909]: I0202 13:06:10.390916 4909 generic.go:334] "Generic (PLEG): container finished" podID="b3ac9eea-92cd-46e7-85d6-86c7ee81d82a" containerID="b32d991b7b7bc51982e1f25c0c6cca8e7b8732f91db6902d54e7357990ef7cd3" exitCode=0 Feb 02 13:06:10 crc kubenswrapper[4909]: I0202 13:06:10.390989 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6rlw" event={"ID":"b3ac9eea-92cd-46e7-85d6-86c7ee81d82a","Type":"ContainerDied","Data":"b32d991b7b7bc51982e1f25c0c6cca8e7b8732f91db6902d54e7357990ef7cd3"} Feb 02 13:06:11 crc kubenswrapper[4909]: I0202 13:06:11.406428 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6rlw" event={"ID":"b3ac9eea-92cd-46e7-85d6-86c7ee81d82a","Type":"ContainerStarted","Data":"e3352267ae226247e28089f2df88488cb747354e71bdac56de20c2e3f184a9b0"} Feb 02 13:06:11 crc kubenswrapper[4909]: I0202 13:06:11.426781 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w6rlw" podStartSLOduration=2.808086479 podStartE2EDuration="5.426761222s" podCreationTimestamp="2026-02-02 13:06:06 +0000 UTC" firstStartedPulling="2026-02-02 13:06:08.369994602 +0000 UTC m=+9294.116095337" lastFinishedPulling="2026-02-02 13:06:10.988669345 +0000 UTC m=+9296.734770080" observedRunningTime="2026-02-02 13:06:11.422567503 +0000 UTC m=+9297.168668238" watchObservedRunningTime="2026-02-02 13:06:11.426761222 +0000 UTC m=+9297.172861957" Feb 02 13:06:13 crc kubenswrapper[4909]: I0202 13:06:13.016458 4909 scope.go:117] "RemoveContainer" containerID="b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0" Feb 02 13:06:13 crc kubenswrapper[4909]: E0202 13:06:13.017073 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:06:16 crc kubenswrapper[4909]: I0202 13:06:16.650929 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w6rlw" Feb 02 13:06:16 crc kubenswrapper[4909]: I0202 13:06:16.651563 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w6rlw" Feb 02 13:06:16 crc kubenswrapper[4909]: I0202 13:06:16.702234 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w6rlw" Feb 02 13:06:17 crc kubenswrapper[4909]: I0202 13:06:17.507910 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w6rlw" Feb 02 13:06:17 crc kubenswrapper[4909]: I0202 13:06:17.558830 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w6rlw"] Feb 02 13:06:19 crc kubenswrapper[4909]: I0202 13:06:19.479682 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w6rlw" podUID="b3ac9eea-92cd-46e7-85d6-86c7ee81d82a" containerName="registry-server" containerID="cri-o://e3352267ae226247e28089f2df88488cb747354e71bdac56de20c2e3f184a9b0" gracePeriod=2 Feb 02 13:06:20 crc kubenswrapper[4909]: I0202 13:06:20.491987 4909 generic.go:334] "Generic (PLEG): container finished" podID="b3ac9eea-92cd-46e7-85d6-86c7ee81d82a" containerID="e3352267ae226247e28089f2df88488cb747354e71bdac56de20c2e3f184a9b0" exitCode=0 Feb 02 13:06:20 crc kubenswrapper[4909]: I0202 13:06:20.492052 4909 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6rlw" event={"ID":"b3ac9eea-92cd-46e7-85d6-86c7ee81d82a","Type":"ContainerDied","Data":"e3352267ae226247e28089f2df88488cb747354e71bdac56de20c2e3f184a9b0"} Feb 02 13:06:20 crc kubenswrapper[4909]: I0202 13:06:20.492284 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w6rlw" event={"ID":"b3ac9eea-92cd-46e7-85d6-86c7ee81d82a","Type":"ContainerDied","Data":"8b6aa9ad75bc7caa95e155d7349dfb4fedb178e202730e56edbba1d9c607bba4"} Feb 02 13:06:20 crc kubenswrapper[4909]: I0202 13:06:20.492296 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b6aa9ad75bc7caa95e155d7349dfb4fedb178e202730e56edbba1d9c607bba4" Feb 02 13:06:20 crc kubenswrapper[4909]: I0202 13:06:20.514200 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w6rlw" Feb 02 13:06:20 crc kubenswrapper[4909]: I0202 13:06:20.626940 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57mqw\" (UniqueName: \"kubernetes.io/projected/b3ac9eea-92cd-46e7-85d6-86c7ee81d82a-kube-api-access-57mqw\") pod \"b3ac9eea-92cd-46e7-85d6-86c7ee81d82a\" (UID: \"b3ac9eea-92cd-46e7-85d6-86c7ee81d82a\") " Feb 02 13:06:20 crc kubenswrapper[4909]: I0202 13:06:20.627479 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3ac9eea-92cd-46e7-85d6-86c7ee81d82a-catalog-content\") pod \"b3ac9eea-92cd-46e7-85d6-86c7ee81d82a\" (UID: \"b3ac9eea-92cd-46e7-85d6-86c7ee81d82a\") " Feb 02 13:06:20 crc kubenswrapper[4909]: I0202 13:06:20.627673 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3ac9eea-92cd-46e7-85d6-86c7ee81d82a-utilities\") pod 
\"b3ac9eea-92cd-46e7-85d6-86c7ee81d82a\" (UID: \"b3ac9eea-92cd-46e7-85d6-86c7ee81d82a\") " Feb 02 13:06:20 crc kubenswrapper[4909]: I0202 13:06:20.628741 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3ac9eea-92cd-46e7-85d6-86c7ee81d82a-utilities" (OuterVolumeSpecName: "utilities") pod "b3ac9eea-92cd-46e7-85d6-86c7ee81d82a" (UID: "b3ac9eea-92cd-46e7-85d6-86c7ee81d82a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:06:20 crc kubenswrapper[4909]: I0202 13:06:20.632257 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3ac9eea-92cd-46e7-85d6-86c7ee81d82a-kube-api-access-57mqw" (OuterVolumeSpecName: "kube-api-access-57mqw") pod "b3ac9eea-92cd-46e7-85d6-86c7ee81d82a" (UID: "b3ac9eea-92cd-46e7-85d6-86c7ee81d82a"). InnerVolumeSpecName "kube-api-access-57mqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:06:20 crc kubenswrapper[4909]: I0202 13:06:20.686451 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3ac9eea-92cd-46e7-85d6-86c7ee81d82a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3ac9eea-92cd-46e7-85d6-86c7ee81d82a" (UID: "b3ac9eea-92cd-46e7-85d6-86c7ee81d82a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:06:20 crc kubenswrapper[4909]: I0202 13:06:20.730274 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3ac9eea-92cd-46e7-85d6-86c7ee81d82a-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:06:20 crc kubenswrapper[4909]: I0202 13:06:20.730318 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57mqw\" (UniqueName: \"kubernetes.io/projected/b3ac9eea-92cd-46e7-85d6-86c7ee81d82a-kube-api-access-57mqw\") on node \"crc\" DevicePath \"\"" Feb 02 13:06:20 crc kubenswrapper[4909]: I0202 13:06:20.730331 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3ac9eea-92cd-46e7-85d6-86c7ee81d82a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:06:21 crc kubenswrapper[4909]: I0202 13:06:21.500781 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w6rlw" Feb 02 13:06:21 crc kubenswrapper[4909]: I0202 13:06:21.530908 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w6rlw"] Feb 02 13:06:21 crc kubenswrapper[4909]: I0202 13:06:21.540599 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w6rlw"] Feb 02 13:06:23 crc kubenswrapper[4909]: I0202 13:06:23.031660 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3ac9eea-92cd-46e7-85d6-86c7ee81d82a" path="/var/lib/kubelet/pods/b3ac9eea-92cd-46e7-85d6-86c7ee81d82a/volumes" Feb 02 13:06:28 crc kubenswrapper[4909]: I0202 13:06:28.016831 4909 scope.go:117] "RemoveContainer" containerID="b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0" Feb 02 13:06:28 crc kubenswrapper[4909]: E0202 13:06:28.017783 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:06:43 crc kubenswrapper[4909]: I0202 13:06:43.017869 4909 scope.go:117] "RemoveContainer" containerID="b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0" Feb 02 13:06:43 crc kubenswrapper[4909]: E0202 13:06:43.018715 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:06:58 crc kubenswrapper[4909]: I0202 13:06:58.019576 4909 scope.go:117] "RemoveContainer" containerID="b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0" Feb 02 13:06:58 crc kubenswrapper[4909]: E0202 13:06:58.020796 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:07:13 crc kubenswrapper[4909]: I0202 13:07:13.017871 4909 scope.go:117] "RemoveContainer" containerID="b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0" Feb 02 13:07:13 crc kubenswrapper[4909]: E0202 13:07:13.019366 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:07:27 crc kubenswrapper[4909]: I0202 13:07:27.017996 4909 scope.go:117] "RemoveContainer" containerID="b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0" Feb 02 13:07:27 crc kubenswrapper[4909]: E0202 13:07:27.019041 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:07:42 crc kubenswrapper[4909]: I0202 13:07:42.016419 4909 scope.go:117] "RemoveContainer" containerID="b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0" Feb 02 13:07:42 crc kubenswrapper[4909]: E0202 13:07:42.017215 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:07:56 crc kubenswrapper[4909]: I0202 13:07:56.017624 4909 scope.go:117] "RemoveContainer" containerID="b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0" Feb 02 13:07:56 crc kubenswrapper[4909]: E0202 13:07:56.018703 4909 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:08:08 crc kubenswrapper[4909]: I0202 13:08:08.017239 4909 scope.go:117] "RemoveContainer" containerID="b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0" Feb 02 13:08:08 crc kubenswrapper[4909]: E0202 13:08:08.018008 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:08:19 crc kubenswrapper[4909]: I0202 13:08:19.016920 4909 scope.go:117] "RemoveContainer" containerID="b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0" Feb 02 13:08:19 crc kubenswrapper[4909]: E0202 13:08:19.017755 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:08:31 crc kubenswrapper[4909]: I0202 13:08:31.017009 4909 scope.go:117] "RemoveContainer" containerID="b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0" Feb 02 13:08:31 crc kubenswrapper[4909]: I0202 13:08:31.843011 4909 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"c201aa52e3cee733bb09e6bec73219acfe4039b0286d2a8136c5dd8d124911f4"} Feb 02 13:09:16 crc kubenswrapper[4909]: I0202 13:09:16.567197 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6pdrk"] Feb 02 13:09:16 crc kubenswrapper[4909]: E0202 13:09:16.568193 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3ac9eea-92cd-46e7-85d6-86c7ee81d82a" containerName="extract-utilities" Feb 02 13:09:16 crc kubenswrapper[4909]: I0202 13:09:16.568212 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3ac9eea-92cd-46e7-85d6-86c7ee81d82a" containerName="extract-utilities" Feb 02 13:09:16 crc kubenswrapper[4909]: E0202 13:09:16.568237 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3ac9eea-92cd-46e7-85d6-86c7ee81d82a" containerName="extract-content" Feb 02 13:09:16 crc kubenswrapper[4909]: I0202 13:09:16.568245 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3ac9eea-92cd-46e7-85d6-86c7ee81d82a" containerName="extract-content" Feb 02 13:09:16 crc kubenswrapper[4909]: E0202 13:09:16.568259 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3ac9eea-92cd-46e7-85d6-86c7ee81d82a" containerName="registry-server" Feb 02 13:09:16 crc kubenswrapper[4909]: I0202 13:09:16.568268 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3ac9eea-92cd-46e7-85d6-86c7ee81d82a" containerName="registry-server" Feb 02 13:09:16 crc kubenswrapper[4909]: I0202 13:09:16.568520 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3ac9eea-92cd-46e7-85d6-86c7ee81d82a" containerName="registry-server" Feb 02 13:09:16 crc kubenswrapper[4909]: I0202 13:09:16.570221 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pdrk" Feb 02 13:09:16 crc kubenswrapper[4909]: I0202 13:09:16.587392 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pdrk"] Feb 02 13:09:16 crc kubenswrapper[4909]: I0202 13:09:16.614848 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb8fz\" (UniqueName: \"kubernetes.io/projected/5c03e18a-591b-4823-a1c6-56a48c6a23e3-kube-api-access-mb8fz\") pod \"redhat-marketplace-6pdrk\" (UID: \"5c03e18a-591b-4823-a1c6-56a48c6a23e3\") " pod="openshift-marketplace/redhat-marketplace-6pdrk" Feb 02 13:09:16 crc kubenswrapper[4909]: I0202 13:09:16.614926 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c03e18a-591b-4823-a1c6-56a48c6a23e3-utilities\") pod \"redhat-marketplace-6pdrk\" (UID: \"5c03e18a-591b-4823-a1c6-56a48c6a23e3\") " pod="openshift-marketplace/redhat-marketplace-6pdrk" Feb 02 13:09:16 crc kubenswrapper[4909]: I0202 13:09:16.614973 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c03e18a-591b-4823-a1c6-56a48c6a23e3-catalog-content\") pod \"redhat-marketplace-6pdrk\" (UID: \"5c03e18a-591b-4823-a1c6-56a48c6a23e3\") " pod="openshift-marketplace/redhat-marketplace-6pdrk" Feb 02 13:09:16 crc kubenswrapper[4909]: I0202 13:09:16.717126 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb8fz\" (UniqueName: \"kubernetes.io/projected/5c03e18a-591b-4823-a1c6-56a48c6a23e3-kube-api-access-mb8fz\") pod \"redhat-marketplace-6pdrk\" (UID: \"5c03e18a-591b-4823-a1c6-56a48c6a23e3\") " pod="openshift-marketplace/redhat-marketplace-6pdrk" Feb 02 13:09:16 crc kubenswrapper[4909]: I0202 13:09:16.717203 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c03e18a-591b-4823-a1c6-56a48c6a23e3-utilities\") pod \"redhat-marketplace-6pdrk\" (UID: \"5c03e18a-591b-4823-a1c6-56a48c6a23e3\") " pod="openshift-marketplace/redhat-marketplace-6pdrk" Feb 02 13:09:16 crc kubenswrapper[4909]: I0202 13:09:16.717247 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c03e18a-591b-4823-a1c6-56a48c6a23e3-catalog-content\") pod \"redhat-marketplace-6pdrk\" (UID: \"5c03e18a-591b-4823-a1c6-56a48c6a23e3\") " pod="openshift-marketplace/redhat-marketplace-6pdrk" Feb 02 13:09:16 crc kubenswrapper[4909]: I0202 13:09:16.717846 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c03e18a-591b-4823-a1c6-56a48c6a23e3-utilities\") pod \"redhat-marketplace-6pdrk\" (UID: \"5c03e18a-591b-4823-a1c6-56a48c6a23e3\") " pod="openshift-marketplace/redhat-marketplace-6pdrk" Feb 02 13:09:16 crc kubenswrapper[4909]: I0202 13:09:16.719293 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c03e18a-591b-4823-a1c6-56a48c6a23e3-catalog-content\") pod \"redhat-marketplace-6pdrk\" (UID: \"5c03e18a-591b-4823-a1c6-56a48c6a23e3\") " pod="openshift-marketplace/redhat-marketplace-6pdrk" Feb 02 13:09:16 crc kubenswrapper[4909]: I0202 13:09:16.739829 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb8fz\" (UniqueName: \"kubernetes.io/projected/5c03e18a-591b-4823-a1c6-56a48c6a23e3-kube-api-access-mb8fz\") pod \"redhat-marketplace-6pdrk\" (UID: \"5c03e18a-591b-4823-a1c6-56a48c6a23e3\") " pod="openshift-marketplace/redhat-marketplace-6pdrk" Feb 02 13:09:16 crc kubenswrapper[4909]: I0202 13:09:16.931199 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pdrk" Feb 02 13:09:17 crc kubenswrapper[4909]: I0202 13:09:17.454325 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pdrk"] Feb 02 13:09:18 crc kubenswrapper[4909]: I0202 13:09:18.283328 4909 generic.go:334] "Generic (PLEG): container finished" podID="5c03e18a-591b-4823-a1c6-56a48c6a23e3" containerID="a4cc1b9c61461c8605be767152222d34086c6f82c10bbee41e9988428ae747ce" exitCode=0 Feb 02 13:09:18 crc kubenswrapper[4909]: I0202 13:09:18.283674 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pdrk" event={"ID":"5c03e18a-591b-4823-a1c6-56a48c6a23e3","Type":"ContainerDied","Data":"a4cc1b9c61461c8605be767152222d34086c6f82c10bbee41e9988428ae747ce"} Feb 02 13:09:18 crc kubenswrapper[4909]: I0202 13:09:18.283704 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pdrk" event={"ID":"5c03e18a-591b-4823-a1c6-56a48c6a23e3","Type":"ContainerStarted","Data":"5410698cf1b0ad1a6a1f949c7c899c2c52dbfa49f6b7813737b7889eb236e151"} Feb 02 13:09:18 crc kubenswrapper[4909]: I0202 13:09:18.286617 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 13:09:20 crc kubenswrapper[4909]: I0202 13:09:20.304980 4909 generic.go:334] "Generic (PLEG): container finished" podID="5c03e18a-591b-4823-a1c6-56a48c6a23e3" containerID="b87d847aa0c23d0557698e9c6672a16bc53212a9ec67469f7aa255ad84bac273" exitCode=0 Feb 02 13:09:20 crc kubenswrapper[4909]: I0202 13:09:20.305215 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pdrk" event={"ID":"5c03e18a-591b-4823-a1c6-56a48c6a23e3","Type":"ContainerDied","Data":"b87d847aa0c23d0557698e9c6672a16bc53212a9ec67469f7aa255ad84bac273"} Feb 02 13:09:21 crc kubenswrapper[4909]: I0202 13:09:21.316147 4909 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-6pdrk" event={"ID":"5c03e18a-591b-4823-a1c6-56a48c6a23e3","Type":"ContainerStarted","Data":"404ca1d36941be7d484a9c53cdc7310b863a04a8bc3a5f7331937cfdf0717b06"} Feb 02 13:09:21 crc kubenswrapper[4909]: I0202 13:09:21.346564 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6pdrk" podStartSLOduration=2.716213034 podStartE2EDuration="5.346543207s" podCreationTimestamp="2026-02-02 13:09:16 +0000 UTC" firstStartedPulling="2026-02-02 13:09:18.286307648 +0000 UTC m=+9484.032408383" lastFinishedPulling="2026-02-02 13:09:20.916637831 +0000 UTC m=+9486.662738556" observedRunningTime="2026-02-02 13:09:21.335309298 +0000 UTC m=+9487.081410043" watchObservedRunningTime="2026-02-02 13:09:21.346543207 +0000 UTC m=+9487.092643942" Feb 02 13:09:26 crc kubenswrapper[4909]: I0202 13:09:26.931358 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6pdrk" Feb 02 13:09:26 crc kubenswrapper[4909]: I0202 13:09:26.932130 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6pdrk" Feb 02 13:09:26 crc kubenswrapper[4909]: I0202 13:09:26.978550 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6pdrk" Feb 02 13:09:27 crc kubenswrapper[4909]: I0202 13:09:27.426285 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6pdrk" Feb 02 13:09:27 crc kubenswrapper[4909]: I0202 13:09:27.474306 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pdrk"] Feb 02 13:09:29 crc kubenswrapper[4909]: I0202 13:09:29.397246 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6pdrk" 
podUID="5c03e18a-591b-4823-a1c6-56a48c6a23e3" containerName="registry-server" containerID="cri-o://404ca1d36941be7d484a9c53cdc7310b863a04a8bc3a5f7331937cfdf0717b06" gracePeriod=2 Feb 02 13:09:29 crc kubenswrapper[4909]: I0202 13:09:29.871486 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pdrk" Feb 02 13:09:29 crc kubenswrapper[4909]: I0202 13:09:29.991484 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c03e18a-591b-4823-a1c6-56a48c6a23e3-catalog-content\") pod \"5c03e18a-591b-4823-a1c6-56a48c6a23e3\" (UID: \"5c03e18a-591b-4823-a1c6-56a48c6a23e3\") " Feb 02 13:09:29 crc kubenswrapper[4909]: I0202 13:09:29.991723 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c03e18a-591b-4823-a1c6-56a48c6a23e3-utilities\") pod \"5c03e18a-591b-4823-a1c6-56a48c6a23e3\" (UID: \"5c03e18a-591b-4823-a1c6-56a48c6a23e3\") " Feb 02 13:09:29 crc kubenswrapper[4909]: I0202 13:09:29.991837 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mb8fz\" (UniqueName: \"kubernetes.io/projected/5c03e18a-591b-4823-a1c6-56a48c6a23e3-kube-api-access-mb8fz\") pod \"5c03e18a-591b-4823-a1c6-56a48c6a23e3\" (UID: \"5c03e18a-591b-4823-a1c6-56a48c6a23e3\") " Feb 02 13:09:29 crc kubenswrapper[4909]: I0202 13:09:29.994097 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c03e18a-591b-4823-a1c6-56a48c6a23e3-utilities" (OuterVolumeSpecName: "utilities") pod "5c03e18a-591b-4823-a1c6-56a48c6a23e3" (UID: "5c03e18a-591b-4823-a1c6-56a48c6a23e3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:09:29 crc kubenswrapper[4909]: I0202 13:09:29.999846 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c03e18a-591b-4823-a1c6-56a48c6a23e3-kube-api-access-mb8fz" (OuterVolumeSpecName: "kube-api-access-mb8fz") pod "5c03e18a-591b-4823-a1c6-56a48c6a23e3" (UID: "5c03e18a-591b-4823-a1c6-56a48c6a23e3"). InnerVolumeSpecName "kube-api-access-mb8fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:09:30 crc kubenswrapper[4909]: I0202 13:09:30.014215 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c03e18a-591b-4823-a1c6-56a48c6a23e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c03e18a-591b-4823-a1c6-56a48c6a23e3" (UID: "5c03e18a-591b-4823-a1c6-56a48c6a23e3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:09:30 crc kubenswrapper[4909]: I0202 13:09:30.094257 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c03e18a-591b-4823-a1c6-56a48c6a23e3-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:09:30 crc kubenswrapper[4909]: I0202 13:09:30.094296 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mb8fz\" (UniqueName: \"kubernetes.io/projected/5c03e18a-591b-4823-a1c6-56a48c6a23e3-kube-api-access-mb8fz\") on node \"crc\" DevicePath \"\"" Feb 02 13:09:30 crc kubenswrapper[4909]: I0202 13:09:30.094308 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c03e18a-591b-4823-a1c6-56a48c6a23e3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:09:30 crc kubenswrapper[4909]: I0202 13:09:30.410124 4909 generic.go:334] "Generic (PLEG): container finished" podID="5c03e18a-591b-4823-a1c6-56a48c6a23e3" 
containerID="404ca1d36941be7d484a9c53cdc7310b863a04a8bc3a5f7331937cfdf0717b06" exitCode=0 Feb 02 13:09:30 crc kubenswrapper[4909]: I0202 13:09:30.410173 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pdrk" event={"ID":"5c03e18a-591b-4823-a1c6-56a48c6a23e3","Type":"ContainerDied","Data":"404ca1d36941be7d484a9c53cdc7310b863a04a8bc3a5f7331937cfdf0717b06"} Feb 02 13:09:30 crc kubenswrapper[4909]: I0202 13:09:30.410183 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pdrk" Feb 02 13:09:30 crc kubenswrapper[4909]: I0202 13:09:30.410200 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pdrk" event={"ID":"5c03e18a-591b-4823-a1c6-56a48c6a23e3","Type":"ContainerDied","Data":"5410698cf1b0ad1a6a1f949c7c899c2c52dbfa49f6b7813737b7889eb236e151"} Feb 02 13:09:30 crc kubenswrapper[4909]: I0202 13:09:30.410217 4909 scope.go:117] "RemoveContainer" containerID="404ca1d36941be7d484a9c53cdc7310b863a04a8bc3a5f7331937cfdf0717b06" Feb 02 13:09:30 crc kubenswrapper[4909]: I0202 13:09:30.429541 4909 scope.go:117] "RemoveContainer" containerID="b87d847aa0c23d0557698e9c6672a16bc53212a9ec67469f7aa255ad84bac273" Feb 02 13:09:30 crc kubenswrapper[4909]: I0202 13:09:30.450946 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pdrk"] Feb 02 13:09:30 crc kubenswrapper[4909]: I0202 13:09:30.458170 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pdrk"] Feb 02 13:09:30 crc kubenswrapper[4909]: I0202 13:09:30.471612 4909 scope.go:117] "RemoveContainer" containerID="a4cc1b9c61461c8605be767152222d34086c6f82c10bbee41e9988428ae747ce" Feb 02 13:09:30 crc kubenswrapper[4909]: I0202 13:09:30.505332 4909 scope.go:117] "RemoveContainer" containerID="404ca1d36941be7d484a9c53cdc7310b863a04a8bc3a5f7331937cfdf0717b06" Feb 02 
13:09:30 crc kubenswrapper[4909]: E0202 13:09:30.505931 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"404ca1d36941be7d484a9c53cdc7310b863a04a8bc3a5f7331937cfdf0717b06\": container with ID starting with 404ca1d36941be7d484a9c53cdc7310b863a04a8bc3a5f7331937cfdf0717b06 not found: ID does not exist" containerID="404ca1d36941be7d484a9c53cdc7310b863a04a8bc3a5f7331937cfdf0717b06" Feb 02 13:09:30 crc kubenswrapper[4909]: I0202 13:09:30.505974 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"404ca1d36941be7d484a9c53cdc7310b863a04a8bc3a5f7331937cfdf0717b06"} err="failed to get container status \"404ca1d36941be7d484a9c53cdc7310b863a04a8bc3a5f7331937cfdf0717b06\": rpc error: code = NotFound desc = could not find container \"404ca1d36941be7d484a9c53cdc7310b863a04a8bc3a5f7331937cfdf0717b06\": container with ID starting with 404ca1d36941be7d484a9c53cdc7310b863a04a8bc3a5f7331937cfdf0717b06 not found: ID does not exist" Feb 02 13:09:30 crc kubenswrapper[4909]: I0202 13:09:30.506034 4909 scope.go:117] "RemoveContainer" containerID="b87d847aa0c23d0557698e9c6672a16bc53212a9ec67469f7aa255ad84bac273" Feb 02 13:09:30 crc kubenswrapper[4909]: E0202 13:09:30.507400 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b87d847aa0c23d0557698e9c6672a16bc53212a9ec67469f7aa255ad84bac273\": container with ID starting with b87d847aa0c23d0557698e9c6672a16bc53212a9ec67469f7aa255ad84bac273 not found: ID does not exist" containerID="b87d847aa0c23d0557698e9c6672a16bc53212a9ec67469f7aa255ad84bac273" Feb 02 13:09:30 crc kubenswrapper[4909]: I0202 13:09:30.507445 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b87d847aa0c23d0557698e9c6672a16bc53212a9ec67469f7aa255ad84bac273"} err="failed to get container status 
\"b87d847aa0c23d0557698e9c6672a16bc53212a9ec67469f7aa255ad84bac273\": rpc error: code = NotFound desc = could not find container \"b87d847aa0c23d0557698e9c6672a16bc53212a9ec67469f7aa255ad84bac273\": container with ID starting with b87d847aa0c23d0557698e9c6672a16bc53212a9ec67469f7aa255ad84bac273 not found: ID does not exist" Feb 02 13:09:30 crc kubenswrapper[4909]: I0202 13:09:30.507475 4909 scope.go:117] "RemoveContainer" containerID="a4cc1b9c61461c8605be767152222d34086c6f82c10bbee41e9988428ae747ce" Feb 02 13:09:30 crc kubenswrapper[4909]: E0202 13:09:30.507961 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4cc1b9c61461c8605be767152222d34086c6f82c10bbee41e9988428ae747ce\": container with ID starting with a4cc1b9c61461c8605be767152222d34086c6f82c10bbee41e9988428ae747ce not found: ID does not exist" containerID="a4cc1b9c61461c8605be767152222d34086c6f82c10bbee41e9988428ae747ce" Feb 02 13:09:30 crc kubenswrapper[4909]: I0202 13:09:30.507989 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4cc1b9c61461c8605be767152222d34086c6f82c10bbee41e9988428ae747ce"} err="failed to get container status \"a4cc1b9c61461c8605be767152222d34086c6f82c10bbee41e9988428ae747ce\": rpc error: code = NotFound desc = could not find container \"a4cc1b9c61461c8605be767152222d34086c6f82c10bbee41e9988428ae747ce\": container with ID starting with a4cc1b9c61461c8605be767152222d34086c6f82c10bbee41e9988428ae747ce not found: ID does not exist" Feb 02 13:09:31 crc kubenswrapper[4909]: I0202 13:09:31.026988 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c03e18a-591b-4823-a1c6-56a48c6a23e3" path="/var/lib/kubelet/pods/5c03e18a-591b-4823-a1c6-56a48c6a23e3/volumes" Feb 02 13:09:32 crc kubenswrapper[4909]: I0202 13:09:32.879249 4909 scope.go:117] "RemoveContainer" containerID="31797127a8c10e945e8a5c622a0b732618c08855d7bf0dc5b391e5495f529d0b" Feb 02 
13:09:49 crc kubenswrapper[4909]: I0202 13:09:49.179951 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vssdx"] Feb 02 13:09:49 crc kubenswrapper[4909]: E0202 13:09:49.180869 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c03e18a-591b-4823-a1c6-56a48c6a23e3" containerName="extract-content" Feb 02 13:09:49 crc kubenswrapper[4909]: I0202 13:09:49.180884 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c03e18a-591b-4823-a1c6-56a48c6a23e3" containerName="extract-content" Feb 02 13:09:49 crc kubenswrapper[4909]: E0202 13:09:49.180910 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c03e18a-591b-4823-a1c6-56a48c6a23e3" containerName="registry-server" Feb 02 13:09:49 crc kubenswrapper[4909]: I0202 13:09:49.180916 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c03e18a-591b-4823-a1c6-56a48c6a23e3" containerName="registry-server" Feb 02 13:09:49 crc kubenswrapper[4909]: E0202 13:09:49.180941 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c03e18a-591b-4823-a1c6-56a48c6a23e3" containerName="extract-utilities" Feb 02 13:09:49 crc kubenswrapper[4909]: I0202 13:09:49.180949 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c03e18a-591b-4823-a1c6-56a48c6a23e3" containerName="extract-utilities" Feb 02 13:09:49 crc kubenswrapper[4909]: I0202 13:09:49.181151 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c03e18a-591b-4823-a1c6-56a48c6a23e3" containerName="registry-server" Feb 02 13:09:49 crc kubenswrapper[4909]: I0202 13:09:49.182771 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vssdx" Feb 02 13:09:49 crc kubenswrapper[4909]: I0202 13:09:49.211592 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vssdx"] Feb 02 13:09:49 crc kubenswrapper[4909]: I0202 13:09:49.288575 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98172ed8-beb8-4f76-93fd-79b5565b1cb3-utilities\") pod \"community-operators-vssdx\" (UID: \"98172ed8-beb8-4f76-93fd-79b5565b1cb3\") " pod="openshift-marketplace/community-operators-vssdx" Feb 02 13:09:49 crc kubenswrapper[4909]: I0202 13:09:49.288674 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmlfp\" (UniqueName: \"kubernetes.io/projected/98172ed8-beb8-4f76-93fd-79b5565b1cb3-kube-api-access-dmlfp\") pod \"community-operators-vssdx\" (UID: \"98172ed8-beb8-4f76-93fd-79b5565b1cb3\") " pod="openshift-marketplace/community-operators-vssdx" Feb 02 13:09:49 crc kubenswrapper[4909]: I0202 13:09:49.288892 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98172ed8-beb8-4f76-93fd-79b5565b1cb3-catalog-content\") pod \"community-operators-vssdx\" (UID: \"98172ed8-beb8-4f76-93fd-79b5565b1cb3\") " pod="openshift-marketplace/community-operators-vssdx" Feb 02 13:09:49 crc kubenswrapper[4909]: I0202 13:09:49.391562 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98172ed8-beb8-4f76-93fd-79b5565b1cb3-catalog-content\") pod \"community-operators-vssdx\" (UID: \"98172ed8-beb8-4f76-93fd-79b5565b1cb3\") " pod="openshift-marketplace/community-operators-vssdx" Feb 02 13:09:49 crc kubenswrapper[4909]: I0202 13:09:49.391710 4909 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98172ed8-beb8-4f76-93fd-79b5565b1cb3-utilities\") pod \"community-operators-vssdx\" (UID: \"98172ed8-beb8-4f76-93fd-79b5565b1cb3\") " pod="openshift-marketplace/community-operators-vssdx" Feb 02 13:09:49 crc kubenswrapper[4909]: I0202 13:09:49.391772 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmlfp\" (UniqueName: \"kubernetes.io/projected/98172ed8-beb8-4f76-93fd-79b5565b1cb3-kube-api-access-dmlfp\") pod \"community-operators-vssdx\" (UID: \"98172ed8-beb8-4f76-93fd-79b5565b1cb3\") " pod="openshift-marketplace/community-operators-vssdx" Feb 02 13:09:49 crc kubenswrapper[4909]: I0202 13:09:49.392069 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98172ed8-beb8-4f76-93fd-79b5565b1cb3-catalog-content\") pod \"community-operators-vssdx\" (UID: \"98172ed8-beb8-4f76-93fd-79b5565b1cb3\") " pod="openshift-marketplace/community-operators-vssdx" Feb 02 13:09:49 crc kubenswrapper[4909]: I0202 13:09:49.392309 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98172ed8-beb8-4f76-93fd-79b5565b1cb3-utilities\") pod \"community-operators-vssdx\" (UID: \"98172ed8-beb8-4f76-93fd-79b5565b1cb3\") " pod="openshift-marketplace/community-operators-vssdx" Feb 02 13:09:49 crc kubenswrapper[4909]: I0202 13:09:49.411699 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmlfp\" (UniqueName: \"kubernetes.io/projected/98172ed8-beb8-4f76-93fd-79b5565b1cb3-kube-api-access-dmlfp\") pod \"community-operators-vssdx\" (UID: \"98172ed8-beb8-4f76-93fd-79b5565b1cb3\") " pod="openshift-marketplace/community-operators-vssdx" Feb 02 13:09:49 crc kubenswrapper[4909]: I0202 13:09:49.504006 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vssdx" Feb 02 13:09:50 crc kubenswrapper[4909]: I0202 13:09:50.083028 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vssdx"] Feb 02 13:09:50 crc kubenswrapper[4909]: I0202 13:09:50.618594 4909 generic.go:334] "Generic (PLEG): container finished" podID="98172ed8-beb8-4f76-93fd-79b5565b1cb3" containerID="c15f972a3eaddadbb02a06f336983ebba648d53595c07cf4c542aaea06bb337a" exitCode=0 Feb 02 13:09:50 crc kubenswrapper[4909]: I0202 13:09:50.618641 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vssdx" event={"ID":"98172ed8-beb8-4f76-93fd-79b5565b1cb3","Type":"ContainerDied","Data":"c15f972a3eaddadbb02a06f336983ebba648d53595c07cf4c542aaea06bb337a"} Feb 02 13:09:50 crc kubenswrapper[4909]: I0202 13:09:50.618836 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vssdx" event={"ID":"98172ed8-beb8-4f76-93fd-79b5565b1cb3","Type":"ContainerStarted","Data":"b96ad837d8dd9debb2d5b49a6b8dba9421ed88e4c1deb24cec71efa68ff15664"} Feb 02 13:09:51 crc kubenswrapper[4909]: I0202 13:09:51.630593 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vssdx" event={"ID":"98172ed8-beb8-4f76-93fd-79b5565b1cb3","Type":"ContainerStarted","Data":"ece68f67cc5b03c2c16076795a79a2e160ac974f2fbb44819d4db955ebf167b5"} Feb 02 13:09:53 crc kubenswrapper[4909]: I0202 13:09:53.653530 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vssdx" event={"ID":"98172ed8-beb8-4f76-93fd-79b5565b1cb3","Type":"ContainerDied","Data":"ece68f67cc5b03c2c16076795a79a2e160ac974f2fbb44819d4db955ebf167b5"} Feb 02 13:09:53 crc kubenswrapper[4909]: I0202 13:09:53.653468 4909 generic.go:334] "Generic (PLEG): container finished" podID="98172ed8-beb8-4f76-93fd-79b5565b1cb3" 
containerID="ece68f67cc5b03c2c16076795a79a2e160ac974f2fbb44819d4db955ebf167b5" exitCode=0 Feb 02 13:09:55 crc kubenswrapper[4909]: I0202 13:09:55.672768 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vssdx" event={"ID":"98172ed8-beb8-4f76-93fd-79b5565b1cb3","Type":"ContainerStarted","Data":"a57c67c5f62dd9977e71b0a56c0101f2475228427b97058a75c775a63bff2677"} Feb 02 13:09:55 crc kubenswrapper[4909]: I0202 13:09:55.692513 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vssdx" podStartSLOduration=3.255083299 podStartE2EDuration="6.692490097s" podCreationTimestamp="2026-02-02 13:09:49 +0000 UTC" firstStartedPulling="2026-02-02 13:09:50.620471371 +0000 UTC m=+9516.366572106" lastFinishedPulling="2026-02-02 13:09:54.057878169 +0000 UTC m=+9519.803978904" observedRunningTime="2026-02-02 13:09:55.688634217 +0000 UTC m=+9521.434734952" watchObservedRunningTime="2026-02-02 13:09:55.692490097 +0000 UTC m=+9521.438590832" Feb 02 13:09:59 crc kubenswrapper[4909]: I0202 13:09:59.505829 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vssdx" Feb 02 13:09:59 crc kubenswrapper[4909]: I0202 13:09:59.506403 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vssdx" Feb 02 13:09:59 crc kubenswrapper[4909]: I0202 13:09:59.554655 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vssdx" Feb 02 13:09:59 crc kubenswrapper[4909]: I0202 13:09:59.781096 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vssdx" Feb 02 13:09:59 crc kubenswrapper[4909]: I0202 13:09:59.832583 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vssdx"] Feb 02 13:10:01 
crc kubenswrapper[4909]: I0202 13:10:01.755416 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vssdx" podUID="98172ed8-beb8-4f76-93fd-79b5565b1cb3" containerName="registry-server" containerID="cri-o://a57c67c5f62dd9977e71b0a56c0101f2475228427b97058a75c775a63bff2677" gracePeriod=2 Feb 02 13:10:02 crc kubenswrapper[4909]: I0202 13:10:02.294264 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vssdx" Feb 02 13:10:02 crc kubenswrapper[4909]: I0202 13:10:02.398588 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmlfp\" (UniqueName: \"kubernetes.io/projected/98172ed8-beb8-4f76-93fd-79b5565b1cb3-kube-api-access-dmlfp\") pod \"98172ed8-beb8-4f76-93fd-79b5565b1cb3\" (UID: \"98172ed8-beb8-4f76-93fd-79b5565b1cb3\") " Feb 02 13:10:02 crc kubenswrapper[4909]: I0202 13:10:02.398770 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98172ed8-beb8-4f76-93fd-79b5565b1cb3-catalog-content\") pod \"98172ed8-beb8-4f76-93fd-79b5565b1cb3\" (UID: \"98172ed8-beb8-4f76-93fd-79b5565b1cb3\") " Feb 02 13:10:02 crc kubenswrapper[4909]: I0202 13:10:02.398829 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98172ed8-beb8-4f76-93fd-79b5565b1cb3-utilities\") pod \"98172ed8-beb8-4f76-93fd-79b5565b1cb3\" (UID: \"98172ed8-beb8-4f76-93fd-79b5565b1cb3\") " Feb 02 13:10:02 crc kubenswrapper[4909]: I0202 13:10:02.400160 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98172ed8-beb8-4f76-93fd-79b5565b1cb3-utilities" (OuterVolumeSpecName: "utilities") pod "98172ed8-beb8-4f76-93fd-79b5565b1cb3" (UID: "98172ed8-beb8-4f76-93fd-79b5565b1cb3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:10:02 crc kubenswrapper[4909]: I0202 13:10:02.406775 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98172ed8-beb8-4f76-93fd-79b5565b1cb3-kube-api-access-dmlfp" (OuterVolumeSpecName: "kube-api-access-dmlfp") pod "98172ed8-beb8-4f76-93fd-79b5565b1cb3" (UID: "98172ed8-beb8-4f76-93fd-79b5565b1cb3"). InnerVolumeSpecName "kube-api-access-dmlfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:10:02 crc kubenswrapper[4909]: I0202 13:10:02.463017 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98172ed8-beb8-4f76-93fd-79b5565b1cb3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "98172ed8-beb8-4f76-93fd-79b5565b1cb3" (UID: "98172ed8-beb8-4f76-93fd-79b5565b1cb3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:10:02 crc kubenswrapper[4909]: I0202 13:10:02.501036 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmlfp\" (UniqueName: \"kubernetes.io/projected/98172ed8-beb8-4f76-93fd-79b5565b1cb3-kube-api-access-dmlfp\") on node \"crc\" DevicePath \"\"" Feb 02 13:10:02 crc kubenswrapper[4909]: I0202 13:10:02.501067 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98172ed8-beb8-4f76-93fd-79b5565b1cb3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:10:02 crc kubenswrapper[4909]: I0202 13:10:02.501080 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98172ed8-beb8-4f76-93fd-79b5565b1cb3-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:10:02 crc kubenswrapper[4909]: I0202 13:10:02.767434 4909 generic.go:334] "Generic (PLEG): container finished" podID="98172ed8-beb8-4f76-93fd-79b5565b1cb3" 
containerID="a57c67c5f62dd9977e71b0a56c0101f2475228427b97058a75c775a63bff2677" exitCode=0 Feb 02 13:10:02 crc kubenswrapper[4909]: I0202 13:10:02.767503 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vssdx" event={"ID":"98172ed8-beb8-4f76-93fd-79b5565b1cb3","Type":"ContainerDied","Data":"a57c67c5f62dd9977e71b0a56c0101f2475228427b97058a75c775a63bff2677"} Feb 02 13:10:02 crc kubenswrapper[4909]: I0202 13:10:02.767535 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vssdx" event={"ID":"98172ed8-beb8-4f76-93fd-79b5565b1cb3","Type":"ContainerDied","Data":"b96ad837d8dd9debb2d5b49a6b8dba9421ed88e4c1deb24cec71efa68ff15664"} Feb 02 13:10:02 crc kubenswrapper[4909]: I0202 13:10:02.767560 4909 scope.go:117] "RemoveContainer" containerID="a57c67c5f62dd9977e71b0a56c0101f2475228427b97058a75c775a63bff2677" Feb 02 13:10:02 crc kubenswrapper[4909]: I0202 13:10:02.768917 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vssdx" Feb 02 13:10:02 crc kubenswrapper[4909]: I0202 13:10:02.788260 4909 scope.go:117] "RemoveContainer" containerID="ece68f67cc5b03c2c16076795a79a2e160ac974f2fbb44819d4db955ebf167b5" Feb 02 13:10:02 crc kubenswrapper[4909]: I0202 13:10:02.805576 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vssdx"] Feb 02 13:10:02 crc kubenswrapper[4909]: I0202 13:10:02.818077 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vssdx"] Feb 02 13:10:02 crc kubenswrapper[4909]: I0202 13:10:02.826344 4909 scope.go:117] "RemoveContainer" containerID="c15f972a3eaddadbb02a06f336983ebba648d53595c07cf4c542aaea06bb337a" Feb 02 13:10:02 crc kubenswrapper[4909]: I0202 13:10:02.874917 4909 scope.go:117] "RemoveContainer" containerID="a57c67c5f62dd9977e71b0a56c0101f2475228427b97058a75c775a63bff2677" Feb 02 13:10:02 crc kubenswrapper[4909]: E0202 13:10:02.875381 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a57c67c5f62dd9977e71b0a56c0101f2475228427b97058a75c775a63bff2677\": container with ID starting with a57c67c5f62dd9977e71b0a56c0101f2475228427b97058a75c775a63bff2677 not found: ID does not exist" containerID="a57c67c5f62dd9977e71b0a56c0101f2475228427b97058a75c775a63bff2677" Feb 02 13:10:02 crc kubenswrapper[4909]: I0202 13:10:02.875407 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a57c67c5f62dd9977e71b0a56c0101f2475228427b97058a75c775a63bff2677"} err="failed to get container status \"a57c67c5f62dd9977e71b0a56c0101f2475228427b97058a75c775a63bff2677\": rpc error: code = NotFound desc = could not find container \"a57c67c5f62dd9977e71b0a56c0101f2475228427b97058a75c775a63bff2677\": container with ID starting with a57c67c5f62dd9977e71b0a56c0101f2475228427b97058a75c775a63bff2677 not 
found: ID does not exist" Feb 02 13:10:02 crc kubenswrapper[4909]: I0202 13:10:02.875427 4909 scope.go:117] "RemoveContainer" containerID="ece68f67cc5b03c2c16076795a79a2e160ac974f2fbb44819d4db955ebf167b5" Feb 02 13:10:02 crc kubenswrapper[4909]: E0202 13:10:02.875652 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ece68f67cc5b03c2c16076795a79a2e160ac974f2fbb44819d4db955ebf167b5\": container with ID starting with ece68f67cc5b03c2c16076795a79a2e160ac974f2fbb44819d4db955ebf167b5 not found: ID does not exist" containerID="ece68f67cc5b03c2c16076795a79a2e160ac974f2fbb44819d4db955ebf167b5" Feb 02 13:10:02 crc kubenswrapper[4909]: I0202 13:10:02.875675 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ece68f67cc5b03c2c16076795a79a2e160ac974f2fbb44819d4db955ebf167b5"} err="failed to get container status \"ece68f67cc5b03c2c16076795a79a2e160ac974f2fbb44819d4db955ebf167b5\": rpc error: code = NotFound desc = could not find container \"ece68f67cc5b03c2c16076795a79a2e160ac974f2fbb44819d4db955ebf167b5\": container with ID starting with ece68f67cc5b03c2c16076795a79a2e160ac974f2fbb44819d4db955ebf167b5 not found: ID does not exist" Feb 02 13:10:02 crc kubenswrapper[4909]: I0202 13:10:02.875688 4909 scope.go:117] "RemoveContainer" containerID="c15f972a3eaddadbb02a06f336983ebba648d53595c07cf4c542aaea06bb337a" Feb 02 13:10:02 crc kubenswrapper[4909]: E0202 13:10:02.875914 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c15f972a3eaddadbb02a06f336983ebba648d53595c07cf4c542aaea06bb337a\": container with ID starting with c15f972a3eaddadbb02a06f336983ebba648d53595c07cf4c542aaea06bb337a not found: ID does not exist" containerID="c15f972a3eaddadbb02a06f336983ebba648d53595c07cf4c542aaea06bb337a" Feb 02 13:10:02 crc kubenswrapper[4909]: I0202 13:10:02.875936 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c15f972a3eaddadbb02a06f336983ebba648d53595c07cf4c542aaea06bb337a"} err="failed to get container status \"c15f972a3eaddadbb02a06f336983ebba648d53595c07cf4c542aaea06bb337a\": rpc error: code = NotFound desc = could not find container \"c15f972a3eaddadbb02a06f336983ebba648d53595c07cf4c542aaea06bb337a\": container with ID starting with c15f972a3eaddadbb02a06f336983ebba648d53595c07cf4c542aaea06bb337a not found: ID does not exist" Feb 02 13:10:03 crc kubenswrapper[4909]: I0202 13:10:03.041180 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98172ed8-beb8-4f76-93fd-79b5565b1cb3" path="/var/lib/kubelet/pods/98172ed8-beb8-4f76-93fd-79b5565b1cb3/volumes" Feb 02 13:10:32 crc kubenswrapper[4909]: I0202 13:10:32.956505 4909 scope.go:117] "RemoveContainer" containerID="059da121c59100c7f026831849cdd84a4b34563273b90e26ee1833d6539e417a" Feb 02 13:10:49 crc kubenswrapper[4909]: I0202 13:10:49.510866 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:10:49 crc kubenswrapper[4909]: I0202 13:10:49.511456 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:11:19 crc kubenswrapper[4909]: I0202 13:11:19.511495 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:11:19 crc kubenswrapper[4909]: I0202 13:11:19.512076 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:11:49 crc kubenswrapper[4909]: I0202 13:11:49.510734 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:11:49 crc kubenswrapper[4909]: I0202 13:11:49.511395 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:11:49 crc kubenswrapper[4909]: I0202 13:11:49.511446 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 13:11:49 crc kubenswrapper[4909]: I0202 13:11:49.512207 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c201aa52e3cee733bb09e6bec73219acfe4039b0286d2a8136c5dd8d124911f4"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 13:11:49 crc kubenswrapper[4909]: I0202 13:11:49.512260 4909 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://c201aa52e3cee733bb09e6bec73219acfe4039b0286d2a8136c5dd8d124911f4" gracePeriod=600 Feb 02 13:11:49 crc kubenswrapper[4909]: I0202 13:11:49.977640 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="c201aa52e3cee733bb09e6bec73219acfe4039b0286d2a8136c5dd8d124911f4" exitCode=0 Feb 02 13:11:49 crc kubenswrapper[4909]: I0202 13:11:49.977676 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"c201aa52e3cee733bb09e6bec73219acfe4039b0286d2a8136c5dd8d124911f4"} Feb 02 13:11:49 crc kubenswrapper[4909]: I0202 13:11:49.978013 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5"} Feb 02 13:11:49 crc kubenswrapper[4909]: I0202 13:11:49.978038 4909 scope.go:117] "RemoveContainer" containerID="b231127a804850d29928a973b6fce0e46d622b71e24b005c448911612fcfe3f0" Feb 02 13:12:14 crc kubenswrapper[4909]: I0202 13:12:14.232482 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_01a92576-40e1-46c8-b08b-13f40c8c4892/init-config-reloader/0.log" Feb 02 13:12:14 crc kubenswrapper[4909]: I0202 13:12:14.501368 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_01a92576-40e1-46c8-b08b-13f40c8c4892/init-config-reloader/0.log" Feb 02 13:12:14 crc kubenswrapper[4909]: I0202 13:12:14.520388 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_alertmanager-metric-storage-0_01a92576-40e1-46c8-b08b-13f40c8c4892/alertmanager/0.log" Feb 02 13:12:14 crc kubenswrapper[4909]: I0202 13:12:14.543337 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_01a92576-40e1-46c8-b08b-13f40c8c4892/config-reloader/0.log" Feb 02 13:12:15 crc kubenswrapper[4909]: I0202 13:12:15.280106 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_18e799c0-b0ec-4362-8d6a-0d490bb38aab/aodh-api/0.log" Feb 02 13:12:15 crc kubenswrapper[4909]: I0202 13:12:15.330016 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_18e799c0-b0ec-4362-8d6a-0d490bb38aab/aodh-evaluator/0.log" Feb 02 13:12:15 crc kubenswrapper[4909]: I0202 13:12:15.330343 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_18e799c0-b0ec-4362-8d6a-0d490bb38aab/aodh-listener/0.log" Feb 02 13:12:15 crc kubenswrapper[4909]: I0202 13:12:15.492530 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_18e799c0-b0ec-4362-8d6a-0d490bb38aab/aodh-notifier/0.log" Feb 02 13:12:15 crc kubenswrapper[4909]: I0202 13:12:15.554078 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-75f58b9976-z4nmm_c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b/barbican-api/0.log" Feb 02 13:12:15 crc kubenswrapper[4909]: I0202 13:12:15.572567 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-75f58b9976-z4nmm_c4059ec4-44bb-46f0-b9ba-d2006cbf4b3b/barbican-api-log/0.log" Feb 02 13:12:15 crc kubenswrapper[4909]: I0202 13:12:15.976140 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-bcfc899fd-2xfv2_b584395d-e8d0-4fab-8059-e9cc92550805/barbican-keystone-listener/0.log" Feb 02 13:12:16 crc kubenswrapper[4909]: I0202 13:12:16.043331 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-bcfc899fd-2xfv2_b584395d-e8d0-4fab-8059-e9cc92550805/barbican-keystone-listener-log/0.log" Feb 02 13:12:16 crc kubenswrapper[4909]: I0202 13:12:16.180109 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6ccd96d447-czqpq_adddacaf-04be-4f50-9300-76a900e90318/barbican-worker/0.log" Feb 02 13:12:16 crc kubenswrapper[4909]: I0202 13:12:16.257288 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6ccd96d447-czqpq_adddacaf-04be-4f50-9300-76a900e90318/barbican-worker-log/0.log" Feb 02 13:12:16 crc kubenswrapper[4909]: I0202 13:12:16.351259 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-bcjvg_5aaf8cb2-7efd-4f8b-8348-ee983dbe284d/bootstrap-openstack-openstack-cell1/0.log" Feb 02 13:12:16 crc kubenswrapper[4909]: I0202 13:12:16.503013 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4650b583-6dc6-4707-8b7a-8742a9275288/ceilometer-central-agent/0.log" Feb 02 13:12:16 crc kubenswrapper[4909]: I0202 13:12:16.565183 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4650b583-6dc6-4707-8b7a-8742a9275288/ceilometer-notification-agent/0.log" Feb 02 13:12:16 crc kubenswrapper[4909]: I0202 13:12:16.598265 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4650b583-6dc6-4707-8b7a-8742a9275288/proxy-httpd/0.log" Feb 02 13:12:16 crc kubenswrapper[4909]: I0202 13:12:16.680445 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_4650b583-6dc6-4707-8b7a-8742a9275288/sg-core/0.log" Feb 02 13:12:16 crc kubenswrapper[4909]: I0202 13:12:16.794734 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_eea5f68d-4b9b-45cf-b584-075ad7298647/cinder-api-log/0.log" Feb 02 13:12:16 crc kubenswrapper[4909]: I0202 13:12:16.906535 4909 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_eea5f68d-4b9b-45cf-b584-075ad7298647/cinder-api/0.log" Feb 02 13:12:17 crc kubenswrapper[4909]: I0202 13:12:17.019164 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_695e2382-8687-40cf-be18-868d6746b9b9/probe/0.log" Feb 02 13:12:17 crc kubenswrapper[4909]: I0202 13:12:17.060927 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_695e2382-8687-40cf-be18-868d6746b9b9/cinder-scheduler/0.log" Feb 02 13:12:17 crc kubenswrapper[4909]: I0202 13:12:17.232044 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-n2j66_e174bd5f-7fab-4440-aeb4-a5bcf55273b1/configure-network-openstack-openstack-cell1/0.log" Feb 02 13:12:17 crc kubenswrapper[4909]: I0202 13:12:17.326767 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-2mxpm_d090f64e-e6da-45b6-9f4a-3ee8106a132c/configure-os-openstack-openstack-cell1/0.log" Feb 02 13:12:17 crc kubenswrapper[4909]: I0202 13:12:17.511175 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8455c59dc-cb5zc_ef32a0af-7943-4b2f-bf8b-a71421a53d26/init/0.log" Feb 02 13:12:17 crc kubenswrapper[4909]: I0202 13:12:17.679606 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8455c59dc-cb5zc_ef32a0af-7943-4b2f-bf8b-a71421a53d26/init/0.log" Feb 02 13:12:17 crc kubenswrapper[4909]: I0202 13:12:17.738485 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8455c59dc-cb5zc_ef32a0af-7943-4b2f-bf8b-a71421a53d26/dnsmasq-dns/0.log" Feb 02 13:12:17 crc kubenswrapper[4909]: I0202 13:12:17.765359 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-7pdh7_44aa8cf1-5580-47f1-b457-c1afd10ffa00/download-cache-openstack-openstack-cell1/0.log" Feb 02 13:12:17 crc kubenswrapper[4909]: I0202 13:12:17.906496 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_b99d8fda-2d28-4e7b-9df5-e5bb8750e52a/glance-httpd/0.log" Feb 02 13:12:17 crc kubenswrapper[4909]: I0202 13:12:17.952521 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_b99d8fda-2d28-4e7b-9df5-e5bb8750e52a/glance-log/0.log" Feb 02 13:12:18 crc kubenswrapper[4909]: I0202 13:12:18.090781 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_74e16c15-dd31-4471-9b1d-80d20044c41a/glance-httpd/0.log" Feb 02 13:12:18 crc kubenswrapper[4909]: I0202 13:12:18.111946 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_74e16c15-dd31-4471-9b1d-80d20044c41a/glance-log/0.log" Feb 02 13:12:18 crc kubenswrapper[4909]: I0202 13:12:18.549945 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-9478bc769-cxwt5_2f53270c-2995-4557-8010-3315762c06c8/heat-engine/0.log" Feb 02 13:12:18 crc kubenswrapper[4909]: I0202 13:12:18.788728 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6d78c46cb4-52qcm_343841e4-be3f-444a-a51e-89d5aeb87fa0/horizon/0.log" Feb 02 13:12:18 crc kubenswrapper[4909]: I0202 13:12:18.793274 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-5bd5f6b94d-f6rqx_fae5779d-25ee-4282-9474-8306081d28b5/heat-api/0.log" Feb 02 13:12:18 crc kubenswrapper[4909]: I0202 13:12:18.932592 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-5476649989-mfkg8_48c2c982-be37-412b-83e4-7e572a2a2422/heat-cfnapi/0.log" Feb 02 13:12:19 crc kubenswrapper[4909]: I0202 13:12:19.040498 4909 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-rdn67_46ce9be6-c203-4da8-a82c-e81e6e48ef99/install-certs-openstack-openstack-cell1/0.log" Feb 02 13:12:19 crc kubenswrapper[4909]: I0202 13:12:19.185432 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-2cwfg_7057aa50-a591-411f-8c66-1abd54d955b6/install-os-openstack-openstack-cell1/0.log" Feb 02 13:12:19 crc kubenswrapper[4909]: I0202 13:12:19.344363 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6d78c46cb4-52qcm_343841e4-be3f-444a-a51e-89d5aeb87fa0/horizon-log/0.log" Feb 02 13:12:19 crc kubenswrapper[4909]: I0202 13:12:19.459303 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-55fd6bf9fc-5zh8j_4fb66e6a-8ffd-4afa-9764-42bfb3cef442/keystone-api/0.log" Feb 02 13:12:19 crc kubenswrapper[4909]: I0202 13:12:19.459727 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29500561-sck9k_966a7f05-d7b9-4141-bba9-76b00ac9f4a5/keystone-cron/0.log" Feb 02 13:12:19 crc kubenswrapper[4909]: I0202 13:12:19.582375 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29500621-ptltd_5bfd2650-fa06-4608-8a79-60211d353a92/keystone-cron/0.log" Feb 02 13:12:19 crc kubenswrapper[4909]: I0202 13:12:19.675568 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_5ae91886-34b7-4b28-99ac-6f5e7bced3c7/kube-state-metrics/0.log" Feb 02 13:12:19 crc kubenswrapper[4909]: I0202 13:12:19.826059 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-hpv94_fda9b4fc-6fb1-46c0-8bc4-90b443735e5c/libvirt-openstack-openstack-cell1/0.log" Feb 02 13:12:20 crc kubenswrapper[4909]: I0202 13:12:20.156219 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-864bddcb8f-jdfcj_544c4791-365d-409b-9d1c-7c6408c865ca/neutron-api/0.log" Feb 02 13:12:20 crc kubenswrapper[4909]: I0202 13:12:20.176979 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-864bddcb8f-jdfcj_544c4791-365d-409b-9d1c-7c6408c865ca/neutron-httpd/0.log" Feb 02 13:12:20 crc kubenswrapper[4909]: I0202 13:12:20.310798 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-xn5dg_5ee147f3-fc77-4226-a4c8-50ecc12fe936/neutron-dhcp-openstack-openstack-cell1/0.log" Feb 02 13:12:20 crc kubenswrapper[4909]: I0202 13:12:20.486230 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-cmt4b_5588b6ad-43e1-489d-8157-b9ed3a7da9de/neutron-metadata-openstack-openstack-cell1/0.log" Feb 02 13:12:20 crc kubenswrapper[4909]: I0202 13:12:20.560810 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-h4k8f_223a1c81-ef51-4efa-ae1d-e511e66719b4/neutron-sriov-openstack-openstack-cell1/0.log" Feb 02 13:12:20 crc kubenswrapper[4909]: I0202 13:12:20.875540 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b22e2928-59ab-47d5-84f8-4ad233ffd449/nova-api-log/0.log" Feb 02 13:12:20 crc kubenswrapper[4909]: I0202 13:12:20.943096 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b22e2928-59ab-47d5-84f8-4ad233ffd449/nova-api-api/0.log" Feb 02 13:12:21 crc kubenswrapper[4909]: I0202 13:12:21.149174 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_17a7f0b1-11b1-405e-97ba-0bacc251ef8e/nova-cell0-conductor-conductor/0.log" Feb 02 13:12:21 crc kubenswrapper[4909]: I0202 13:12:21.238763 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-conductor-0_fd610f58-bafd-4ba2-bc8e-cfc79d94cad5/nova-cell1-conductor-conductor/0.log" Feb 02 13:12:21 crc kubenswrapper[4909]: I0202 13:12:21.500833 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellqphfm_ffd93f1f-f6f8-440a-91eb-8bc6f1ca7926/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Feb 02 13:12:21 crc kubenswrapper[4909]: I0202 13:12:21.546096 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_e34df5c5-803e-42a9-9cba-6562cc33f0d1/nova-cell1-novncproxy-novncproxy/0.log" Feb 02 13:12:21 crc kubenswrapper[4909]: I0202 13:12:21.722453 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-sjz6r_900e031b-9fe3-4b77-b7e1-91bfde689d44/nova-cell1-openstack-openstack-cell1/0.log" Feb 02 13:12:21 crc kubenswrapper[4909]: I0202 13:12:21.946879 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2e59f904-7475-4136-bb41-a907e4855430/nova-metadata-log/0.log" Feb 02 13:12:22 crc kubenswrapper[4909]: I0202 13:12:22.250664 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_588767b4-05f7-468e-991c-eb0a131fff90/nova-scheduler-scheduler/0.log" Feb 02 13:12:22 crc kubenswrapper[4909]: I0202 13:12:22.302757 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-6744779749-884lr_3d43a55d-45af-4682-9319-ef98614343d4/init/0.log" Feb 02 13:12:22 crc kubenswrapper[4909]: I0202 13:12:22.442568 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_2e59f904-7475-4136-bb41-a907e4855430/nova-metadata-metadata/0.log" Feb 02 13:12:22 crc kubenswrapper[4909]: I0202 13:12:22.492550 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-api-6744779749-884lr_3d43a55d-45af-4682-9319-ef98614343d4/init/0.log" Feb 02 13:12:22 crc kubenswrapper[4909]: I0202 13:12:22.576878 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-6744779749-884lr_3d43a55d-45af-4682-9319-ef98614343d4/octavia-api-provider-agent/0.log" Feb 02 13:12:22 crc kubenswrapper[4909]: I0202 13:12:22.691611 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-94tp4_3674af27-feb0-492c-9195-5557c3d392c1/init/0.log" Feb 02 13:12:22 crc kubenswrapper[4909]: I0202 13:12:22.807126 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-6744779749-884lr_3d43a55d-45af-4682-9319-ef98614343d4/octavia-api/0.log" Feb 02 13:12:22 crc kubenswrapper[4909]: I0202 13:12:22.914911 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-94tp4_3674af27-feb0-492c-9195-5557c3d392c1/init/0.log" Feb 02 13:12:23 crc kubenswrapper[4909]: I0202 13:12:23.064159 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-94tp4_3674af27-feb0-492c-9195-5557c3d392c1/octavia-healthmanager/0.log" Feb 02 13:12:23 crc kubenswrapper[4909]: I0202 13:12:23.081943 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-snk8x_e6d66230-f723-4042-86c4-6f19c99ae749/init/0.log" Feb 02 13:12:23 crc kubenswrapper[4909]: I0202 13:12:23.296368 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-snk8x_e6d66230-f723-4042-86c4-6f19c99ae749/octavia-housekeeping/0.log" Feb 02 13:12:23 crc kubenswrapper[4909]: I0202 13:12:23.340947 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-snk8x_e6d66230-f723-4042-86c4-6f19c99ae749/init/0.log" Feb 02 13:12:23 crc kubenswrapper[4909]: I0202 13:12:23.358996 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-rsyslog-fjmbh_2a570f17-3d51-4c5f-a813-e3f233631ef7/init/0.log" Feb 02 13:12:23 crc kubenswrapper[4909]: I0202 13:12:23.589365 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-fjmbh_2a570f17-3d51-4c5f-a813-e3f233631ef7/octavia-rsyslog/0.log" Feb 02 13:12:23 crc kubenswrapper[4909]: I0202 13:12:23.630159 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-fjmbh_2a570f17-3d51-4c5f-a813-e3f233631ef7/init/0.log" Feb 02 13:12:23 crc kubenswrapper[4909]: I0202 13:12:23.669951 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-8ml42_ebae09da-3646-40dd-98f7-d49907beacd1/init/0.log" Feb 02 13:12:23 crc kubenswrapper[4909]: I0202 13:12:23.873167 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-8ml42_ebae09da-3646-40dd-98f7-d49907beacd1/init/0.log" Feb 02 13:12:24 crc kubenswrapper[4909]: I0202 13:12:24.006593 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_80206191-1878-4ffb-a98e-5d62e577218d/mysql-bootstrap/0.log" Feb 02 13:12:24 crc kubenswrapper[4909]: I0202 13:12:24.025627 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-8ml42_ebae09da-3646-40dd-98f7-d49907beacd1/octavia-worker/0.log" Feb 02 13:12:24 crc kubenswrapper[4909]: I0202 13:12:24.188576 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_80206191-1878-4ffb-a98e-5d62e577218d/mysql-bootstrap/0.log" Feb 02 13:12:24 crc kubenswrapper[4909]: I0202 13:12:24.275766 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_80206191-1878-4ffb-a98e-5d62e577218d/galera/0.log" Feb 02 13:12:24 crc kubenswrapper[4909]: I0202 13:12:24.302127 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_4de2bb2a-bd6a-4a2d-b885-dacdc62949d9/mysql-bootstrap/0.log" Feb 02 13:12:24 crc kubenswrapper[4909]: I0202 13:12:24.530395 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_f2e9096b-893f-4741-92f0-70a241bcd035/openstackclient/0.log" Feb 02 13:12:24 crc kubenswrapper[4909]: I0202 13:12:24.532222 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4de2bb2a-bd6a-4a2d-b885-dacdc62949d9/galera/0.log" Feb 02 13:12:24 crc kubenswrapper[4909]: I0202 13:12:24.606735 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_4de2bb2a-bd6a-4a2d-b885-dacdc62949d9/mysql-bootstrap/0.log" Feb 02 13:12:24 crc kubenswrapper[4909]: I0202 13:12:24.771589 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-2vh4d_0f30be1f-7fe1-40f2-89d3-23cfca972041/ovn-controller/0.log" Feb 02 13:12:25 crc kubenswrapper[4909]: I0202 13:12:25.266681 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-zkzwg_01ae3729-32df-4aac-bc6a-e401e0cb9aa2/openstack-network-exporter/0.log" Feb 02 13:12:25 crc kubenswrapper[4909]: I0202 13:12:25.291730 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jgdxf_bcbf23b5-3226-43c6-b8d3-6ba72b955eda/ovsdb-server-init/0.log" Feb 02 13:12:25 crc kubenswrapper[4909]: I0202 13:12:25.514007 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jgdxf_bcbf23b5-3226-43c6-b8d3-6ba72b955eda/ovsdb-server-init/0.log" Feb 02 13:12:25 crc kubenswrapper[4909]: I0202 13:12:25.561781 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jgdxf_bcbf23b5-3226-43c6-b8d3-6ba72b955eda/ovs-vswitchd/0.log" Feb 02 13:12:25 crc kubenswrapper[4909]: I0202 13:12:25.639387 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-jgdxf_bcbf23b5-3226-43c6-b8d3-6ba72b955eda/ovsdb-server/0.log" Feb 02 13:12:25 crc kubenswrapper[4909]: I0202 13:12:25.723387 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2/openstack-network-exporter/0.log" Feb 02 13:12:25 crc kubenswrapper[4909]: I0202 13:12:25.791447 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_2b1a7f7b-d7aa-4a2f-b868-00e64797a3f2/ovn-northd/0.log" Feb 02 13:12:25 crc kubenswrapper[4909]: I0202 13:12:25.946899 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-pc7fn_f2dd83e0-68d6-4e8a-aa23-0a8252bc28b5/ovn-openstack-openstack-cell1/0.log" Feb 02 13:12:26 crc kubenswrapper[4909]: I0202 13:12:26.013935 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_5aa0410b-2229-496d-a4df-7769afab71c3/openstack-network-exporter/0.log" Feb 02 13:12:26 crc kubenswrapper[4909]: I0202 13:12:26.175162 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_5aa0410b-2229-496d-a4df-7769afab71c3/ovsdbserver-nb/0.log" Feb 02 13:12:26 crc kubenswrapper[4909]: I0202 13:12:26.270289 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_36b9c7ae-e055-434d-b016-2dcca5daf712/openstack-network-exporter/0.log" Feb 02 13:12:26 crc kubenswrapper[4909]: I0202 13:12:26.349430 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_36b9c7ae-e055-434d-b016-2dcca5daf712/ovsdbserver-nb/0.log" Feb 02 13:12:26 crc kubenswrapper[4909]: I0202 13:12:26.504967 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_0067f53b-b646-4cfb-82c6-71cc14a45dcb/openstack-network-exporter/0.log" Feb 02 13:12:26 crc kubenswrapper[4909]: I0202 13:12:26.546758 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-2_0067f53b-b646-4cfb-82c6-71cc14a45dcb/ovsdbserver-nb/0.log" Feb 02 13:12:26 crc kubenswrapper[4909]: I0202 13:12:26.805229 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4b94aba5-26fe-4f14-9268-e8666aac57ec/openstack-network-exporter/0.log" Feb 02 13:12:26 crc kubenswrapper[4909]: I0202 13:12:26.911275 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4b94aba5-26fe-4f14-9268-e8666aac57ec/ovsdbserver-sb/0.log" Feb 02 13:12:27 crc kubenswrapper[4909]: I0202 13:12:27.012169 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_51928bd6-d607-4438-ae21-b49df832df5c/openstack-network-exporter/0.log" Feb 02 13:12:27 crc kubenswrapper[4909]: I0202 13:12:27.064794 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_51928bd6-d607-4438-ae21-b49df832df5c/ovsdbserver-sb/0.log" Feb 02 13:12:27 crc kubenswrapper[4909]: I0202 13:12:27.185841 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_30354f86-0667-4f76-a9c7-0e54ab2c5d83/openstack-network-exporter/0.log" Feb 02 13:12:27 crc kubenswrapper[4909]: I0202 13:12:27.236649 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_30354f86-0667-4f76-a9c7-0e54ab2c5d83/ovsdbserver-sb/0.log" Feb 02 13:12:27 crc kubenswrapper[4909]: I0202 13:12:27.465841 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6bc5c4cc7d-29h64_74164b48-4c93-4c9c-98c1-6997dc72ec87/placement-api/0.log" Feb 02 13:12:27 crc kubenswrapper[4909]: I0202 13:12:27.512713 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6bc5c4cc7d-29h64_74164b48-4c93-4c9c-98c1-6997dc72ec87/placement-log/0.log" Feb 02 13:12:27 crc kubenswrapper[4909]: I0202 13:12:27.606337 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-c8n998_568f0ced-c6cd-4f8d-8b47-f8e093042a2f/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Feb 02 13:12:27 crc kubenswrapper[4909]: I0202 13:12:27.788182 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d11c3571-1885-455a-bd2f-e6acfdae15ba/init-config-reloader/0.log" Feb 02 13:12:27 crc kubenswrapper[4909]: I0202 13:12:27.970995 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d11c3571-1885-455a-bd2f-e6acfdae15ba/config-reloader/0.log" Feb 02 13:12:27 crc kubenswrapper[4909]: I0202 13:12:27.979190 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d11c3571-1885-455a-bd2f-e6acfdae15ba/init-config-reloader/0.log" Feb 02 13:12:27 crc kubenswrapper[4909]: I0202 13:12:27.984851 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d11c3571-1885-455a-bd2f-e6acfdae15ba/prometheus/0.log" Feb 02 13:12:28 crc kubenswrapper[4909]: I0202 13:12:28.034027 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d11c3571-1885-455a-bd2f-e6acfdae15ba/thanos-sidecar/0.log" Feb 02 13:12:28 crc kubenswrapper[4909]: I0202 13:12:28.230222 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a578b144-a50a-4b91-9410-493990a51e5a/setup-container/0.log" Feb 02 13:12:28 crc kubenswrapper[4909]: I0202 13:12:28.422137 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a578b144-a50a-4b91-9410-493990a51e5a/setup-container/0.log" Feb 02 13:12:28 crc kubenswrapper[4909]: I0202 13:12:28.476671 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_64589d75-4c87-4648-ad24-bfbd620e9f2d/setup-container/0.log" Feb 02 
13:12:28 crc kubenswrapper[4909]: I0202 13:12:28.509104 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a578b144-a50a-4b91-9410-493990a51e5a/rabbitmq/0.log" Feb 02 13:12:28 crc kubenswrapper[4909]: I0202 13:12:28.724833 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_64589d75-4c87-4648-ad24-bfbd620e9f2d/setup-container/0.log" Feb 02 13:12:28 crc kubenswrapper[4909]: I0202 13:12:28.786414 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-vks94_609b4bdb-129e-4a27-841e-a83e453dfd79/reboot-os-openstack-openstack-cell1/0.log" Feb 02 13:12:28 crc kubenswrapper[4909]: I0202 13:12:28.817872 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_64589d75-4c87-4648-ad24-bfbd620e9f2d/rabbitmq/0.log" Feb 02 13:12:29 crc kubenswrapper[4909]: I0202 13:12:29.074581 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-dxxzz_f1fe7ee2-fd88-40ef-ae68-ad17581f5d5c/run-os-openstack-openstack-cell1/0.log" Feb 02 13:12:29 crc kubenswrapper[4909]: I0202 13:12:29.101463 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-mlt78_a6987177-0c44-4124-bb94-c6883fa3ed07/ssh-known-hosts-openstack/0.log" Feb 02 13:12:29 crc kubenswrapper[4909]: I0202 13:12:29.313844 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-74ddb64c4d-d2jbp_bc004db8-ba73-439f-b849-14910682e8c8/proxy-server/0.log" Feb 02 13:12:29 crc kubenswrapper[4909]: I0202 13:12:29.502090 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-9ps2d_bf9311a4-5cfc-4c64-ad6b-198cbe0509bd/swift-ring-rebalance/0.log" Feb 02 13:12:29 crc kubenswrapper[4909]: I0202 13:12:29.524975 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-proxy-74ddb64c4d-d2jbp_bc004db8-ba73-439f-b849-14910682e8c8/proxy-httpd/0.log" Feb 02 13:12:29 crc kubenswrapper[4909]: I0202 13:12:29.753334 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-mnd88_ff72a01e-3127-4bd2-bcc8-abddc9be70fc/telemetry-openstack-openstack-cell1/0.log" Feb 02 13:12:29 crc kubenswrapper[4909]: I0202 13:12:29.833299 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-z5fst_f8db189d-64bd-4a95-93de-3ddcb680c6b0/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Feb 02 13:12:29 crc kubenswrapper[4909]: I0202 13:12:29.999640 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-pnnsn_ad64480d-8473-47ba-9879-94a19f802dbf/validate-network-openstack-openstack-cell1/0.log" Feb 02 13:12:31 crc kubenswrapper[4909]: I0202 13:12:31.295706 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_290858d2-96ee-4354-81b5-426df3bcfba5/memcached/0.log" Feb 02 13:12:33 crc kubenswrapper[4909]: I0202 13:12:33.072626 4909 scope.go:117] "RemoveContainer" containerID="8a8280a35d59206a13dff7eb93f70dff9a9dba4ea41d02a9f3303d7bb220d261" Feb 02 13:12:33 crc kubenswrapper[4909]: I0202 13:12:33.093874 4909 scope.go:117] "RemoveContainer" containerID="e3352267ae226247e28089f2df88488cb747354e71bdac56de20c2e3f184a9b0" Feb 02 13:12:33 crc kubenswrapper[4909]: I0202 13:12:33.165359 4909 scope.go:117] "RemoveContainer" containerID="b32d991b7b7bc51982e1f25c0c6cca8e7b8732f91db6902d54e7357990ef7cd3" Feb 02 13:12:57 crc kubenswrapper[4909]: I0202 13:12:57.475755 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4_aab415f2-2dde-4804-a951-2a0df278fd86/util/0.log" Feb 02 13:12:57 crc kubenswrapper[4909]: I0202 13:12:57.690217 4909 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4_aab415f2-2dde-4804-a951-2a0df278fd86/pull/0.log" Feb 02 13:12:57 crc kubenswrapper[4909]: I0202 13:12:57.693346 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4_aab415f2-2dde-4804-a951-2a0df278fd86/util/0.log" Feb 02 13:12:57 crc kubenswrapper[4909]: I0202 13:12:57.710148 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4_aab415f2-2dde-4804-a951-2a0df278fd86/pull/0.log" Feb 02 13:12:57 crc kubenswrapper[4909]: I0202 13:12:57.879058 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4_aab415f2-2dde-4804-a951-2a0df278fd86/pull/0.log" Feb 02 13:12:57 crc kubenswrapper[4909]: I0202 13:12:57.880736 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4_aab415f2-2dde-4804-a951-2a0df278fd86/util/0.log" Feb 02 13:12:57 crc kubenswrapper[4909]: I0202 13:12:57.913849 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_805de4b7130efd4fbe2290d0c51d78e6295b83d77efc3ff2d1b014ffe7kx9n4_aab415f2-2dde-4804-a951-2a0df278fd86/extract/0.log" Feb 02 13:12:58 crc kubenswrapper[4909]: I0202 13:12:58.227828 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7b6c4d8c5f-dhq2f_b3448119-ce0b-44b6-8491-2d2bc7a1352b/manager/0.log" Feb 02 13:12:58 crc kubenswrapper[4909]: I0202 13:12:58.248829 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d874c8fc-q85jz_fd689c48-8bd5-4200-9602-2a9c82503585/manager/0.log" 
Feb 02 13:12:58 crc kubenswrapper[4909]: I0202 13:12:58.368930 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d9697b7f4-ztcq9_f5680c7a-ca86-40e8-b724-de63f5a24da2/manager/0.log" Feb 02 13:12:58 crc kubenswrapper[4909]: I0202 13:12:58.571580 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-8886f4c47-prbwc_beeaa5f8-617f-4486-86a0-122ab355e4de/manager/0.log" Feb 02 13:12:58 crc kubenswrapper[4909]: I0202 13:12:58.690003 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69d6db494d-kqmfw_b2ab0041-b77e-4974-9ca4-7100b40c06e8/manager/0.log" Feb 02 13:12:58 crc kubenswrapper[4909]: I0202 13:12:58.744462 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5fb775575f-h65nm_113ec314-f58c-41b3-bf30-8925e5555c77/manager/0.log" Feb 02 13:12:58 crc kubenswrapper[4909]: I0202 13:12:58.957592 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5f4b8bd54d-zfzjz_939074c8-53f8-4574-8868-34d99851993d/manager/0.log" Feb 02 13:12:59 crc kubenswrapper[4909]: I0202 13:12:59.279341 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7dd968899f-k6628_eceab652-e90e-4c17-a629-91fbd88492e6/manager/0.log" Feb 02 13:12:59 crc kubenswrapper[4909]: I0202 13:12:59.313482 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-84f48565d4-x9bhz_3a9ea22c-1031-4060-a4c1-7b65710bcb49/manager/0.log" Feb 02 13:12:59 crc kubenswrapper[4909]: I0202 13:12:59.550010 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79955696d6-2rl4b_2b2f22c4-e784-4742-86fa-aef6d4e31970/manager/0.log" Feb 02 13:12:59 crc kubenswrapper[4909]: I0202 13:12:59.610619 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67bf948998-ppnjd_9002663c-c720-4fde-be9c-47008dfd15a6/manager/0.log" Feb 02 13:12:59 crc kubenswrapper[4909]: I0202 13:12:59.694389 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-585dbc889-5kc7k_dc7852b6-2562-4a40-8ba7-01764a270e45/manager/0.log" Feb 02 13:13:00 crc kubenswrapper[4909]: I0202 13:13:00.019437 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6687f8d877-7bxgj_f79c4c2a-6eb9-4883-a6bf-3955b44fad05/manager/0.log" Feb 02 13:13:00 crc kubenswrapper[4909]: I0202 13:13:00.047024 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-55bff696bd-vcrsp_f4fc8502-eb04-4488-aeef-02f233cac870/manager/0.log" Feb 02 13:13:00 crc kubenswrapper[4909]: I0202 13:13:00.142152 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5ff45dfdbf9vhcp_12c45a89-12fb-4ce4-aa9a-35931b51407a/manager/0.log" Feb 02 13:13:00 crc kubenswrapper[4909]: I0202 13:13:00.345123 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6bf6665fd-mnrdq_5718c49e-365b-4d07-8a3f-69cdd9012758/operator/0.log" Feb 02 13:13:00 crc kubenswrapper[4909]: I0202 13:13:00.791277 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-27vfj_f0df9d40-bcc7-4f3a-8424-a8404d325e7b/registry-server/0.log" Feb 02 13:13:00 crc kubenswrapper[4909]: I0202 13:13:00.956780 4909 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-788c46999f-845mj_4b7813c2-b207-47b4-a1da-099645dc5e7c/manager/0.log" Feb 02 13:13:01 crc kubenswrapper[4909]: I0202 13:13:01.078834 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b964cf4cd-9v7l6_74b20727-6371-480f-aece-9d33cfc2075a/manager/0.log" Feb 02 13:13:01 crc kubenswrapper[4909]: I0202 13:13:01.279247 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-mnk6q_5d71df35-50ab-41fe-ad0a-cfbc9b06cc71/operator/0.log" Feb 02 13:13:01 crc kubenswrapper[4909]: I0202 13:13:01.489390 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68fc8c869-r9p5r_186cb127-7e1c-4edf-befe-5e8c89d5d819/manager/0.log" Feb 02 13:13:01 crc kubenswrapper[4909]: I0202 13:13:01.909057 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-56f8bfcd9f-zj4v5_20dbebe4-193c-4657-a36f-45c3ec2c435c/manager/0.log" Feb 02 13:13:01 crc kubenswrapper[4909]: I0202 13:13:01.955759 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-64b5b76f97-28qg2_621ec5cc-d8f5-4be2-9f9d-3fc7420d7e9c/manager/0.log" Feb 02 13:13:02 crc kubenswrapper[4909]: I0202 13:13:02.164273 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-564965969-7z4k5_2ef2b6a8-5046-4a0c-82fd-b85d0e460c03/manager/0.log" Feb 02 13:13:02 crc kubenswrapper[4909]: I0202 13:13:02.963665 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-646f757d77-z4bz9_124911b8-e918-4360-8206-e9d72be4448f/manager/0.log" Feb 02 13:13:22 crc kubenswrapper[4909]: I0202 13:13:22.702092 4909 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-ptpkp_6c932fa7-7181-4aac-bf6c-8a6d56f92ece/control-plane-machine-set-operator/0.log" Feb 02 13:13:22 crc kubenswrapper[4909]: I0202 13:13:22.830050 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ghmnw_e650bcf6-f84c-4322-86ac-6df17841176d/kube-rbac-proxy/0.log" Feb 02 13:13:22 crc kubenswrapper[4909]: I0202 13:13:22.905984 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ghmnw_e650bcf6-f84c-4322-86ac-6df17841176d/machine-api-operator/0.log" Feb 02 13:13:38 crc kubenswrapper[4909]: I0202 13:13:38.037724 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-rdsl8_cf5987aa-9823-41a1-ac90-a4d310f6fecb/cert-manager-controller/0.log" Feb 02 13:13:38 crc kubenswrapper[4909]: I0202 13:13:38.277754 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-v24jd_48108612-44a9-47ff-8d12-a52482a1c24c/cert-manager-webhook/0.log" Feb 02 13:13:38 crc kubenswrapper[4909]: I0202 13:13:38.279100 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-sb4rv_3a475b79-512c-42c3-ae33-9c4208d55edd/cert-manager-cainjector/0.log" Feb 02 13:13:49 crc kubenswrapper[4909]: I0202 13:13:49.511502 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:13:49 crc kubenswrapper[4909]: I0202 13:13:49.512291 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" 
podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:13:50 crc kubenswrapper[4909]: I0202 13:13:50.332366 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-lmkj4_397b3083-61cc-455c-aadb-80b21941b774/nmstate-console-plugin/0.log" Feb 02 13:13:50 crc kubenswrapper[4909]: I0202 13:13:50.538152 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-bc74h_9240a23a-f4b9-48ca-a3a6-42dd07d0e461/nmstate-handler/0.log" Feb 02 13:13:50 crc kubenswrapper[4909]: I0202 13:13:50.611391 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-vgpns_c888a2e4-c3b1-4cf7-8058-c724fbf2cc74/nmstate-metrics/0.log" Feb 02 13:13:50 crc kubenswrapper[4909]: I0202 13:13:50.642424 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-vgpns_c888a2e4-c3b1-4cf7-8058-c724fbf2cc74/kube-rbac-proxy/0.log" Feb 02 13:13:50 crc kubenswrapper[4909]: I0202 13:13:50.796168 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-vvhvt_7a287d80-b25f-49de-a25b-5cc2ab9b3096/nmstate-operator/0.log" Feb 02 13:13:50 crc kubenswrapper[4909]: I0202 13:13:50.884052 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-hjgm5_96651658-986f-45ff-a87b-c84f9d98848b/nmstate-webhook/0.log" Feb 02 13:14:05 crc kubenswrapper[4909]: I0202 13:14:05.037548 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-2fsps_c38385f8-d673-4443-91ff-2e2bb10686bf/prometheus-operator/0.log" Feb 02 13:14:05 crc kubenswrapper[4909]: I0202 13:14:05.213287 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-848d587754-bgkpz_0c1c9acf-33b3-4b87-ba96-0def73b7e9f9/prometheus-operator-admission-webhook/0.log" Feb 02 13:14:05 crc kubenswrapper[4909]: I0202 13:14:05.248052 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-848d587754-hlmxx_6dd6d59f-1261-4704-a129-6361cb00de58/prometheus-operator-admission-webhook/0.log" Feb 02 13:14:05 crc kubenswrapper[4909]: I0202 13:14:05.456415 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-zqtcd_da7697a0-be27-4dac-b2bb-6af40732994e/operator/0.log" Feb 02 13:14:05 crc kubenswrapper[4909]: I0202 13:14:05.494824 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-tnrzd_2de24d60-d18d-44ad-ac44-c89d52fdd86a/perses-operator/0.log" Feb 02 13:14:19 crc kubenswrapper[4909]: I0202 13:14:19.510779 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:14:19 crc kubenswrapper[4909]: I0202 13:14:19.511296 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:14:21 crc kubenswrapper[4909]: I0202 13:14:21.408707 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-mjpfz_7aa1b8bf-83b9-40e4-81e7-30087a626c01/kube-rbac-proxy/0.log" Feb 02 13:14:21 crc kubenswrapper[4909]: I0202 13:14:21.648887 4909 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-fhcdh_1b71e887-8283-4799-b19a-333c87d6fcaa/frr-k8s-webhook-server/0.log" Feb 02 13:14:21 crc kubenswrapper[4909]: I0202 13:14:21.744695 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-mjpfz_7aa1b8bf-83b9-40e4-81e7-30087a626c01/controller/0.log" Feb 02 13:14:21 crc kubenswrapper[4909]: I0202 13:14:21.933042 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x252g_f609075c-566c-46ad-bdab-f01502b06571/cp-frr-files/0.log" Feb 02 13:14:22 crc kubenswrapper[4909]: I0202 13:14:22.071764 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x252g_f609075c-566c-46ad-bdab-f01502b06571/cp-frr-files/0.log" Feb 02 13:14:22 crc kubenswrapper[4909]: I0202 13:14:22.081772 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x252g_f609075c-566c-46ad-bdab-f01502b06571/cp-reloader/0.log" Feb 02 13:14:22 crc kubenswrapper[4909]: I0202 13:14:22.112424 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x252g_f609075c-566c-46ad-bdab-f01502b06571/cp-metrics/0.log" Feb 02 13:14:22 crc kubenswrapper[4909]: I0202 13:14:22.138981 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x252g_f609075c-566c-46ad-bdab-f01502b06571/cp-reloader/0.log" Feb 02 13:14:22 crc kubenswrapper[4909]: I0202 13:14:22.362133 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x252g_f609075c-566c-46ad-bdab-f01502b06571/cp-reloader/0.log" Feb 02 13:14:22 crc kubenswrapper[4909]: I0202 13:14:22.394444 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x252g_f609075c-566c-46ad-bdab-f01502b06571/cp-frr-files/0.log" Feb 02 13:14:22 crc kubenswrapper[4909]: I0202 13:14:22.437461 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-x252g_f609075c-566c-46ad-bdab-f01502b06571/cp-metrics/0.log" Feb 02 13:14:22 crc kubenswrapper[4909]: I0202 13:14:22.456352 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x252g_f609075c-566c-46ad-bdab-f01502b06571/cp-metrics/0.log" Feb 02 13:14:22 crc kubenswrapper[4909]: I0202 13:14:22.638617 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x252g_f609075c-566c-46ad-bdab-f01502b06571/cp-metrics/0.log" Feb 02 13:14:22 crc kubenswrapper[4909]: I0202 13:14:22.648479 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x252g_f609075c-566c-46ad-bdab-f01502b06571/cp-reloader/0.log" Feb 02 13:14:22 crc kubenswrapper[4909]: I0202 13:14:22.660362 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x252g_f609075c-566c-46ad-bdab-f01502b06571/controller/0.log" Feb 02 13:14:22 crc kubenswrapper[4909]: I0202 13:14:22.660398 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x252g_f609075c-566c-46ad-bdab-f01502b06571/cp-frr-files/0.log" Feb 02 13:14:22 crc kubenswrapper[4909]: I0202 13:14:22.827998 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x252g_f609075c-566c-46ad-bdab-f01502b06571/frr-metrics/0.log" Feb 02 13:14:22 crc kubenswrapper[4909]: I0202 13:14:22.879051 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x252g_f609075c-566c-46ad-bdab-f01502b06571/kube-rbac-proxy/0.log" Feb 02 13:14:22 crc kubenswrapper[4909]: I0202 13:14:22.886814 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x252g_f609075c-566c-46ad-bdab-f01502b06571/kube-rbac-proxy-frr/0.log" Feb 02 13:14:23 crc kubenswrapper[4909]: I0202 13:14:23.697438 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-x252g_f609075c-566c-46ad-bdab-f01502b06571/reloader/0.log" Feb 02 13:14:23 crc kubenswrapper[4909]: I0202 13:14:23.886883 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-86855cd4c5-5mr2k_f5a1b826-7555-4329-bad8-41387595bcdd/manager/0.log" Feb 02 13:14:24 crc kubenswrapper[4909]: I0202 13:14:24.107400 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7cf86474db-m5gkg_bb7a1e8d-3cb1-4269-aed3-822874a6b8e6/webhook-server/0.log" Feb 02 13:14:24 crc kubenswrapper[4909]: I0202 13:14:24.198580 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-6w72v_a41d250d-abb4-43e0-b2e9-73d610ea3ced/kube-rbac-proxy/0.log" Feb 02 13:14:25 crc kubenswrapper[4909]: I0202 13:14:25.283750 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-6w72v_a41d250d-abb4-43e0-b2e9-73d610ea3ced/speaker/0.log" Feb 02 13:14:26 crc kubenswrapper[4909]: I0202 13:14:26.288381 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-x252g_f609075c-566c-46ad-bdab-f01502b06571/frr/0.log" Feb 02 13:14:27 crc kubenswrapper[4909]: I0202 13:14:27.248724 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lgcs9"] Feb 02 13:14:27 crc kubenswrapper[4909]: E0202 13:14:27.249547 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98172ed8-beb8-4f76-93fd-79b5565b1cb3" containerName="extract-content" Feb 02 13:14:27 crc kubenswrapper[4909]: I0202 13:14:27.249571 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="98172ed8-beb8-4f76-93fd-79b5565b1cb3" containerName="extract-content" Feb 02 13:14:27 crc kubenswrapper[4909]: E0202 13:14:27.249583 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98172ed8-beb8-4f76-93fd-79b5565b1cb3" containerName="extract-utilities" 
Feb 02 13:14:27 crc kubenswrapper[4909]: I0202 13:14:27.249589 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="98172ed8-beb8-4f76-93fd-79b5565b1cb3" containerName="extract-utilities" Feb 02 13:14:27 crc kubenswrapper[4909]: E0202 13:14:27.249608 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98172ed8-beb8-4f76-93fd-79b5565b1cb3" containerName="registry-server" Feb 02 13:14:27 crc kubenswrapper[4909]: I0202 13:14:27.249615 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="98172ed8-beb8-4f76-93fd-79b5565b1cb3" containerName="registry-server" Feb 02 13:14:27 crc kubenswrapper[4909]: I0202 13:14:27.249864 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="98172ed8-beb8-4f76-93fd-79b5565b1cb3" containerName="registry-server" Feb 02 13:14:27 crc kubenswrapper[4909]: I0202 13:14:27.251767 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lgcs9" Feb 02 13:14:27 crc kubenswrapper[4909]: I0202 13:14:27.264948 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lgcs9"] Feb 02 13:14:27 crc kubenswrapper[4909]: I0202 13:14:27.310969 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct5kt\" (UniqueName: \"kubernetes.io/projected/6eb673dd-fa20-4c44-85b4-0717c94083a5-kube-api-access-ct5kt\") pod \"redhat-operators-lgcs9\" (UID: \"6eb673dd-fa20-4c44-85b4-0717c94083a5\") " pod="openshift-marketplace/redhat-operators-lgcs9" Feb 02 13:14:27 crc kubenswrapper[4909]: I0202 13:14:27.311023 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eb673dd-fa20-4c44-85b4-0717c94083a5-utilities\") pod \"redhat-operators-lgcs9\" (UID: \"6eb673dd-fa20-4c44-85b4-0717c94083a5\") " pod="openshift-marketplace/redhat-operators-lgcs9" Feb 02 13:14:27 crc 
kubenswrapper[4909]: I0202 13:14:27.311092 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eb673dd-fa20-4c44-85b4-0717c94083a5-catalog-content\") pod \"redhat-operators-lgcs9\" (UID: \"6eb673dd-fa20-4c44-85b4-0717c94083a5\") " pod="openshift-marketplace/redhat-operators-lgcs9" Feb 02 13:14:27 crc kubenswrapper[4909]: I0202 13:14:27.413067 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct5kt\" (UniqueName: \"kubernetes.io/projected/6eb673dd-fa20-4c44-85b4-0717c94083a5-kube-api-access-ct5kt\") pod \"redhat-operators-lgcs9\" (UID: \"6eb673dd-fa20-4c44-85b4-0717c94083a5\") " pod="openshift-marketplace/redhat-operators-lgcs9" Feb 02 13:14:27 crc kubenswrapper[4909]: I0202 13:14:27.413137 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eb673dd-fa20-4c44-85b4-0717c94083a5-utilities\") pod \"redhat-operators-lgcs9\" (UID: \"6eb673dd-fa20-4c44-85b4-0717c94083a5\") " pod="openshift-marketplace/redhat-operators-lgcs9" Feb 02 13:14:27 crc kubenswrapper[4909]: I0202 13:14:27.413238 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eb673dd-fa20-4c44-85b4-0717c94083a5-catalog-content\") pod \"redhat-operators-lgcs9\" (UID: \"6eb673dd-fa20-4c44-85b4-0717c94083a5\") " pod="openshift-marketplace/redhat-operators-lgcs9" Feb 02 13:14:27 crc kubenswrapper[4909]: I0202 13:14:27.413978 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eb673dd-fa20-4c44-85b4-0717c94083a5-catalog-content\") pod \"redhat-operators-lgcs9\" (UID: \"6eb673dd-fa20-4c44-85b4-0717c94083a5\") " pod="openshift-marketplace/redhat-operators-lgcs9" Feb 02 13:14:27 crc kubenswrapper[4909]: I0202 
13:14:27.413987 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eb673dd-fa20-4c44-85b4-0717c94083a5-utilities\") pod \"redhat-operators-lgcs9\" (UID: \"6eb673dd-fa20-4c44-85b4-0717c94083a5\") " pod="openshift-marketplace/redhat-operators-lgcs9" Feb 02 13:14:27 crc kubenswrapper[4909]: I0202 13:14:27.433502 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct5kt\" (UniqueName: \"kubernetes.io/projected/6eb673dd-fa20-4c44-85b4-0717c94083a5-kube-api-access-ct5kt\") pod \"redhat-operators-lgcs9\" (UID: \"6eb673dd-fa20-4c44-85b4-0717c94083a5\") " pod="openshift-marketplace/redhat-operators-lgcs9" Feb 02 13:14:27 crc kubenswrapper[4909]: I0202 13:14:27.571856 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lgcs9" Feb 02 13:14:28 crc kubenswrapper[4909]: I0202 13:14:28.093046 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lgcs9"] Feb 02 13:14:28 crc kubenswrapper[4909]: I0202 13:14:28.508838 4909 generic.go:334] "Generic (PLEG): container finished" podID="6eb673dd-fa20-4c44-85b4-0717c94083a5" containerID="239b1ce88704f6af664290795591c9e5f100786e8f63dd8e99e16b72c2c638e8" exitCode=0 Feb 02 13:14:28 crc kubenswrapper[4909]: I0202 13:14:28.508903 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lgcs9" event={"ID":"6eb673dd-fa20-4c44-85b4-0717c94083a5","Type":"ContainerDied","Data":"239b1ce88704f6af664290795591c9e5f100786e8f63dd8e99e16b72c2c638e8"} Feb 02 13:14:28 crc kubenswrapper[4909]: I0202 13:14:28.509568 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lgcs9" event={"ID":"6eb673dd-fa20-4c44-85b4-0717c94083a5","Type":"ContainerStarted","Data":"f936f1837bd99648845c20ffb94e99363ac396df0f18794f101b96fa83c03228"} Feb 02 13:14:28 crc 
kubenswrapper[4909]: I0202 13:14:28.511210 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 13:14:29 crc kubenswrapper[4909]: I0202 13:14:29.519603 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lgcs9" event={"ID":"6eb673dd-fa20-4c44-85b4-0717c94083a5","Type":"ContainerStarted","Data":"e106ed37066dc9dcf498d8eeab17f419df64f69b6e60e9533b5995558490d56d"} Feb 02 13:14:34 crc kubenswrapper[4909]: I0202 13:14:34.566220 4909 generic.go:334] "Generic (PLEG): container finished" podID="6eb673dd-fa20-4c44-85b4-0717c94083a5" containerID="e106ed37066dc9dcf498d8eeab17f419df64f69b6e60e9533b5995558490d56d" exitCode=0 Feb 02 13:14:34 crc kubenswrapper[4909]: I0202 13:14:34.566294 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lgcs9" event={"ID":"6eb673dd-fa20-4c44-85b4-0717c94083a5","Type":"ContainerDied","Data":"e106ed37066dc9dcf498d8eeab17f419df64f69b6e60e9533b5995558490d56d"} Feb 02 13:14:35 crc kubenswrapper[4909]: I0202 13:14:35.583361 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lgcs9" event={"ID":"6eb673dd-fa20-4c44-85b4-0717c94083a5","Type":"ContainerStarted","Data":"a6bb4f57737e8ccc2ed740e1995fedea02d77c17c939b5cc01f63b3c2f1c2095"} Feb 02 13:14:35 crc kubenswrapper[4909]: I0202 13:14:35.611956 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lgcs9" podStartSLOduration=2.130205143 podStartE2EDuration="8.611934768s" podCreationTimestamp="2026-02-02 13:14:27 +0000 UTC" firstStartedPulling="2026-02-02 13:14:28.51099167 +0000 UTC m=+9794.257092405" lastFinishedPulling="2026-02-02 13:14:34.992721295 +0000 UTC m=+9800.738822030" observedRunningTime="2026-02-02 13:14:35.606257167 +0000 UTC m=+9801.352357912" watchObservedRunningTime="2026-02-02 13:14:35.611934768 +0000 UTC m=+9801.358035503" Feb 02 
13:14:37 crc kubenswrapper[4909]: I0202 13:14:37.571982 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lgcs9" Feb 02 13:14:37 crc kubenswrapper[4909]: I0202 13:14:37.572275 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lgcs9" Feb 02 13:14:38 crc kubenswrapper[4909]: I0202 13:14:38.627135 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lgcs9" podUID="6eb673dd-fa20-4c44-85b4-0717c94083a5" containerName="registry-server" probeResult="failure" output=< Feb 02 13:14:38 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Feb 02 13:14:38 crc kubenswrapper[4909]: > Feb 02 13:14:41 crc kubenswrapper[4909]: I0202 13:14:41.364237 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q_4c0bd62a-0449-491c-aa8e-41bf967bd421/util/0.log" Feb 02 13:14:41 crc kubenswrapper[4909]: I0202 13:14:41.639965 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q_4c0bd62a-0449-491c-aa8e-41bf967bd421/pull/0.log" Feb 02 13:14:41 crc kubenswrapper[4909]: I0202 13:14:41.670254 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q_4c0bd62a-0449-491c-aa8e-41bf967bd421/util/0.log" Feb 02 13:14:41 crc kubenswrapper[4909]: I0202 13:14:41.700915 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q_4c0bd62a-0449-491c-aa8e-41bf967bd421/pull/0.log" Feb 02 13:14:41 crc kubenswrapper[4909]: I0202 13:14:41.896860 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q_4c0bd62a-0449-491c-aa8e-41bf967bd421/pull/0.log" Feb 02 13:14:41 crc kubenswrapper[4909]: I0202 13:14:41.947958 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q_4c0bd62a-0449-491c-aa8e-41bf967bd421/util/0.log" Feb 02 13:14:41 crc kubenswrapper[4909]: I0202 13:14:41.961092 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc4457q_4c0bd62a-0449-491c-aa8e-41bf967bd421/extract/0.log" Feb 02 13:14:42 crc kubenswrapper[4909]: I0202 13:14:42.151996 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f_a163eaee-108d-441f-b814-93c208605cd2/util/0.log" Feb 02 13:14:42 crc kubenswrapper[4909]: I0202 13:14:42.356391 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f_a163eaee-108d-441f-b814-93c208605cd2/pull/0.log" Feb 02 13:14:42 crc kubenswrapper[4909]: I0202 13:14:42.432717 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f_a163eaee-108d-441f-b814-93c208605cd2/util/0.log" Feb 02 13:14:42 crc kubenswrapper[4909]: I0202 13:14:42.472000 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f_a163eaee-108d-441f-b814-93c208605cd2/pull/0.log" Feb 02 13:14:42 crc kubenswrapper[4909]: I0202 13:14:42.671749 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f_a163eaee-108d-441f-b814-93c208605cd2/extract/0.log" Feb 
02 13:14:42 crc kubenswrapper[4909]: I0202 13:14:42.708555 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f_a163eaee-108d-441f-b814-93c208605cd2/util/0.log" Feb 02 13:14:42 crc kubenswrapper[4909]: I0202 13:14:42.725910 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec71356v7f_a163eaee-108d-441f-b814-93c208605cd2/pull/0.log" Feb 02 13:14:42 crc kubenswrapper[4909]: I0202 13:14:42.914181 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8_a54838f1-b5b8-4650-a518-8dbb1753d5c9/util/0.log" Feb 02 13:14:43 crc kubenswrapper[4909]: I0202 13:14:43.229727 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8_a54838f1-b5b8-4650-a518-8dbb1753d5c9/pull/0.log" Feb 02 13:14:43 crc kubenswrapper[4909]: I0202 13:14:43.275865 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8_a54838f1-b5b8-4650-a518-8dbb1753d5c9/pull/0.log" Feb 02 13:14:43 crc kubenswrapper[4909]: I0202 13:14:43.320610 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8_a54838f1-b5b8-4650-a518-8dbb1753d5c9/util/0.log" Feb 02 13:14:43 crc kubenswrapper[4909]: I0202 13:14:43.415913 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8_a54838f1-b5b8-4650-a518-8dbb1753d5c9/util/0.log" Feb 02 13:14:43 crc kubenswrapper[4909]: I0202 13:14:43.470075 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8_a54838f1-b5b8-4650-a518-8dbb1753d5c9/pull/0.log" Feb 02 13:14:43 crc kubenswrapper[4909]: I0202 13:14:43.500576 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e55vvb8_a54838f1-b5b8-4650-a518-8dbb1753d5c9/extract/0.log" Feb 02 13:14:43 crc kubenswrapper[4909]: I0202 13:14:43.629917 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb_08a88ffb-9ba5-45a7-91de-aae63dfd8b5c/util/0.log" Feb 02 13:14:44 crc kubenswrapper[4909]: I0202 13:14:44.007452 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb_08a88ffb-9ba5-45a7-91de-aae63dfd8b5c/util/0.log" Feb 02 13:14:44 crc kubenswrapper[4909]: I0202 13:14:44.053944 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb_08a88ffb-9ba5-45a7-91de-aae63dfd8b5c/pull/0.log" Feb 02 13:14:44 crc kubenswrapper[4909]: I0202 13:14:44.131732 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb_08a88ffb-9ba5-45a7-91de-aae63dfd8b5c/pull/0.log" Feb 02 13:14:44 crc kubenswrapper[4909]: I0202 13:14:44.294043 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb_08a88ffb-9ba5-45a7-91de-aae63dfd8b5c/util/0.log" Feb 02 13:14:44 crc kubenswrapper[4909]: I0202 13:14:44.374662 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb_08a88ffb-9ba5-45a7-91de-aae63dfd8b5c/extract/0.log" Feb 
02 13:14:44 crc kubenswrapper[4909]: I0202 13:14:44.399732 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08zn8xb_08a88ffb-9ba5-45a7-91de-aae63dfd8b5c/pull/0.log" Feb 02 13:14:44 crc kubenswrapper[4909]: I0202 13:14:44.512394 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zgmhk_417649dc-6430-472e-9f33-2eb65290602c/extract-utilities/0.log" Feb 02 13:14:45 crc kubenswrapper[4909]: I0202 13:14:45.339482 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zgmhk_417649dc-6430-472e-9f33-2eb65290602c/extract-content/0.log" Feb 02 13:14:45 crc kubenswrapper[4909]: I0202 13:14:45.340095 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zgmhk_417649dc-6430-472e-9f33-2eb65290602c/extract-utilities/0.log" Feb 02 13:14:45 crc kubenswrapper[4909]: I0202 13:14:45.345002 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zgmhk_417649dc-6430-472e-9f33-2eb65290602c/extract-content/0.log" Feb 02 13:14:45 crc kubenswrapper[4909]: I0202 13:14:45.495527 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zgmhk_417649dc-6430-472e-9f33-2eb65290602c/extract-utilities/0.log" Feb 02 13:14:45 crc kubenswrapper[4909]: I0202 13:14:45.504245 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zgmhk_417649dc-6430-472e-9f33-2eb65290602c/extract-content/0.log" Feb 02 13:14:45 crc kubenswrapper[4909]: I0202 13:14:45.852722 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-254vv_15f674f6-58e6-4a73-8044-12919a852001/extract-utilities/0.log" Feb 02 13:14:46 crc kubenswrapper[4909]: I0202 13:14:46.058417 4909 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-254vv_15f674f6-58e6-4a73-8044-12919a852001/extract-content/0.log" Feb 02 13:14:46 crc kubenswrapper[4909]: I0202 13:14:46.113648 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-254vv_15f674f6-58e6-4a73-8044-12919a852001/extract-utilities/0.log" Feb 02 13:14:46 crc kubenswrapper[4909]: I0202 13:14:46.126644 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-254vv_15f674f6-58e6-4a73-8044-12919a852001/extract-content/0.log" Feb 02 13:14:46 crc kubenswrapper[4909]: I0202 13:14:46.355289 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-254vv_15f674f6-58e6-4a73-8044-12919a852001/extract-utilities/0.log" Feb 02 13:14:46 crc kubenswrapper[4909]: I0202 13:14:46.413649 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-254vv_15f674f6-58e6-4a73-8044-12919a852001/extract-content/0.log" Feb 02 13:14:46 crc kubenswrapper[4909]: I0202 13:14:46.625085 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-9df28_0a36d99a-11d4-4311-bc30-3852c1580fc1/marketplace-operator/0.log" Feb 02 13:14:47 crc kubenswrapper[4909]: I0202 13:14:47.224844 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zgmhk_417649dc-6430-472e-9f33-2eb65290602c/registry-server/0.log" Feb 02 13:14:47 crc kubenswrapper[4909]: I0202 13:14:47.251444 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mn5bb_f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f/extract-utilities/0.log" Feb 02 13:14:47 crc kubenswrapper[4909]: I0202 13:14:47.383774 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-mn5bb_f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f/extract-utilities/0.log" Feb 02 13:14:47 crc kubenswrapper[4909]: I0202 13:14:47.430033 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mn5bb_f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f/extract-content/0.log" Feb 02 13:14:47 crc kubenswrapper[4909]: I0202 13:14:47.455997 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mn5bb_f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f/extract-content/0.log" Feb 02 13:14:47 crc kubenswrapper[4909]: I0202 13:14:47.676570 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mn5bb_f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f/extract-utilities/0.log" Feb 02 13:14:47 crc kubenswrapper[4909]: I0202 13:14:47.787388 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mn5bb_f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f/extract-content/0.log" Feb 02 13:14:47 crc kubenswrapper[4909]: I0202 13:14:47.867936 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lgcs9_6eb673dd-fa20-4c44-85b4-0717c94083a5/extract-utilities/0.log" Feb 02 13:14:48 crc kubenswrapper[4909]: I0202 13:14:48.101613 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-254vv_15f674f6-58e6-4a73-8044-12919a852001/registry-server/0.log" Feb 02 13:14:48 crc kubenswrapper[4909]: I0202 13:14:48.125591 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mn5bb_f3fb7d28-d86e-4ea5-b2d9-dd72495b9d1f/registry-server/0.log" Feb 02 13:14:48 crc kubenswrapper[4909]: I0202 13:14:48.206413 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-lgcs9_6eb673dd-fa20-4c44-85b4-0717c94083a5/extract-utilities/0.log" Feb 02 13:14:48 crc kubenswrapper[4909]: I0202 13:14:48.244047 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lgcs9_6eb673dd-fa20-4c44-85b4-0717c94083a5/extract-content/0.log" Feb 02 13:14:48 crc kubenswrapper[4909]: I0202 13:14:48.274239 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lgcs9_6eb673dd-fa20-4c44-85b4-0717c94083a5/extract-content/0.log" Feb 02 13:14:48 crc kubenswrapper[4909]: I0202 13:14:48.440659 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lgcs9_6eb673dd-fa20-4c44-85b4-0717c94083a5/registry-server/0.log" Feb 02 13:14:48 crc kubenswrapper[4909]: I0202 13:14:48.460072 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lgcs9_6eb673dd-fa20-4c44-85b4-0717c94083a5/extract-utilities/0.log" Feb 02 13:14:48 crc kubenswrapper[4909]: I0202 13:14:48.462606 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lgcs9_6eb673dd-fa20-4c44-85b4-0717c94083a5/extract-content/0.log" Feb 02 13:14:48 crc kubenswrapper[4909]: I0202 13:14:48.515926 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q8sjs_53984a35-198b-4d3e-bfc6-948f14ab1a39/extract-utilities/0.log" Feb 02 13:14:48 crc kubenswrapper[4909]: I0202 13:14:48.627082 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lgcs9" podUID="6eb673dd-fa20-4c44-85b4-0717c94083a5" containerName="registry-server" probeResult="failure" output=< Feb 02 13:14:48 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Feb 02 13:14:48 crc kubenswrapper[4909]: > Feb 02 13:14:48 crc kubenswrapper[4909]: I0202 
13:14:48.707549 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q8sjs_53984a35-198b-4d3e-bfc6-948f14ab1a39/extract-content/0.log" Feb 02 13:14:48 crc kubenswrapper[4909]: I0202 13:14:48.708459 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q8sjs_53984a35-198b-4d3e-bfc6-948f14ab1a39/extract-utilities/0.log" Feb 02 13:14:48 crc kubenswrapper[4909]: I0202 13:14:48.720791 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q8sjs_53984a35-198b-4d3e-bfc6-948f14ab1a39/extract-content/0.log" Feb 02 13:14:48 crc kubenswrapper[4909]: I0202 13:14:48.939505 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q8sjs_53984a35-198b-4d3e-bfc6-948f14ab1a39/extract-content/0.log" Feb 02 13:14:48 crc kubenswrapper[4909]: I0202 13:14:48.959872 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q8sjs_53984a35-198b-4d3e-bfc6-948f14ab1a39/extract-utilities/0.log" Feb 02 13:14:49 crc kubenswrapper[4909]: I0202 13:14:49.510491 4909 patch_prober.go:28] interesting pod/machine-config-daemon-ftn2z container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 02 13:14:49 crc kubenswrapper[4909]: I0202 13:14:49.510550 4909 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 02 13:14:49 crc kubenswrapper[4909]: I0202 13:14:49.510605 4909 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" Feb 02 13:14:49 crc kubenswrapper[4909]: I0202 13:14:49.511399 4909 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5"} pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 02 13:14:49 crc kubenswrapper[4909]: I0202 13:14:49.511460 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerName="machine-config-daemon" containerID="cri-o://73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5" gracePeriod=600 Feb 02 13:14:49 crc kubenswrapper[4909]: E0202 13:14:49.647139 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:14:49 crc kubenswrapper[4909]: I0202 13:14:49.743737 4909 generic.go:334] "Generic (PLEG): container finished" podID="7de68b6c-f308-498c-95a3-27c9caf44f4f" containerID="73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5" exitCode=0 Feb 02 13:14:49 crc kubenswrapper[4909]: I0202 13:14:49.743776 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerDied","Data":"73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5"} 
Feb 02 13:14:49 crc kubenswrapper[4909]: I0202 13:14:49.743825 4909 scope.go:117] "RemoveContainer" containerID="c201aa52e3cee733bb09e6bec73219acfe4039b0286d2a8136c5dd8d124911f4" Feb 02 13:14:49 crc kubenswrapper[4909]: I0202 13:14:49.744388 4909 scope.go:117] "RemoveContainer" containerID="73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5" Feb 02 13:14:49 crc kubenswrapper[4909]: E0202 13:14:49.744688 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:14:50 crc kubenswrapper[4909]: I0202 13:14:50.232664 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q8sjs_53984a35-198b-4d3e-bfc6-948f14ab1a39/registry-server/0.log" Feb 02 13:14:58 crc kubenswrapper[4909]: I0202 13:14:58.628293 4909 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lgcs9" podUID="6eb673dd-fa20-4c44-85b4-0717c94083a5" containerName="registry-server" probeResult="failure" output=< Feb 02 13:14:58 crc kubenswrapper[4909]: timeout: failed to connect service ":50051" within 1s Feb 02 13:14:58 crc kubenswrapper[4909]: > Feb 02 13:15:00 crc kubenswrapper[4909]: I0202 13:15:00.157975 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500635-lltsb"] Feb 02 13:15:00 crc kubenswrapper[4909]: I0202 13:15:00.160663 4909 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-lltsb" Feb 02 13:15:00 crc kubenswrapper[4909]: I0202 13:15:00.165680 4909 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 02 13:15:00 crc kubenswrapper[4909]: I0202 13:15:00.167584 4909 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 02 13:15:00 crc kubenswrapper[4909]: I0202 13:15:00.182206 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500635-lltsb"] Feb 02 13:15:00 crc kubenswrapper[4909]: I0202 13:15:00.318378 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5f9b907-6966-4dbe-96ec-ef5d724574c8-config-volume\") pod \"collect-profiles-29500635-lltsb\" (UID: \"d5f9b907-6966-4dbe-96ec-ef5d724574c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-lltsb" Feb 02 13:15:00 crc kubenswrapper[4909]: I0202 13:15:00.318999 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5pl7\" (UniqueName: \"kubernetes.io/projected/d5f9b907-6966-4dbe-96ec-ef5d724574c8-kube-api-access-v5pl7\") pod \"collect-profiles-29500635-lltsb\" (UID: \"d5f9b907-6966-4dbe-96ec-ef5d724574c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-lltsb" Feb 02 13:15:00 crc kubenswrapper[4909]: I0202 13:15:00.319029 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5f9b907-6966-4dbe-96ec-ef5d724574c8-secret-volume\") pod \"collect-profiles-29500635-lltsb\" (UID: \"d5f9b907-6966-4dbe-96ec-ef5d724574c8\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-lltsb" Feb 02 13:15:00 crc kubenswrapper[4909]: I0202 13:15:00.420936 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5pl7\" (UniqueName: \"kubernetes.io/projected/d5f9b907-6966-4dbe-96ec-ef5d724574c8-kube-api-access-v5pl7\") pod \"collect-profiles-29500635-lltsb\" (UID: \"d5f9b907-6966-4dbe-96ec-ef5d724574c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-lltsb" Feb 02 13:15:00 crc kubenswrapper[4909]: I0202 13:15:00.421061 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5f9b907-6966-4dbe-96ec-ef5d724574c8-secret-volume\") pod \"collect-profiles-29500635-lltsb\" (UID: \"d5f9b907-6966-4dbe-96ec-ef5d724574c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-lltsb" Feb 02 13:15:00 crc kubenswrapper[4909]: I0202 13:15:00.421107 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5f9b907-6966-4dbe-96ec-ef5d724574c8-config-volume\") pod \"collect-profiles-29500635-lltsb\" (UID: \"d5f9b907-6966-4dbe-96ec-ef5d724574c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-lltsb" Feb 02 13:15:00 crc kubenswrapper[4909]: I0202 13:15:00.422164 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5f9b907-6966-4dbe-96ec-ef5d724574c8-config-volume\") pod \"collect-profiles-29500635-lltsb\" (UID: \"d5f9b907-6966-4dbe-96ec-ef5d724574c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-lltsb" Feb 02 13:15:00 crc kubenswrapper[4909]: I0202 13:15:00.430161 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/d5f9b907-6966-4dbe-96ec-ef5d724574c8-secret-volume\") pod \"collect-profiles-29500635-lltsb\" (UID: \"d5f9b907-6966-4dbe-96ec-ef5d724574c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-lltsb" Feb 02 13:15:00 crc kubenswrapper[4909]: I0202 13:15:00.445335 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5pl7\" (UniqueName: \"kubernetes.io/projected/d5f9b907-6966-4dbe-96ec-ef5d724574c8-kube-api-access-v5pl7\") pod \"collect-profiles-29500635-lltsb\" (UID: \"d5f9b907-6966-4dbe-96ec-ef5d724574c8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-lltsb" Feb 02 13:15:00 crc kubenswrapper[4909]: I0202 13:15:00.491889 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-lltsb" Feb 02 13:15:01 crc kubenswrapper[4909]: I0202 13:15:01.029800 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500635-lltsb"] Feb 02 13:15:01 crc kubenswrapper[4909]: I0202 13:15:01.890519 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-lltsb" event={"ID":"d5f9b907-6966-4dbe-96ec-ef5d724574c8","Type":"ContainerStarted","Data":"e7cbb5364fc2ab9ae29f455860fa4554e687f1ec88e40750a33781b334e3876a"} Feb 02 13:15:01 crc kubenswrapper[4909]: I0202 13:15:01.891152 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-lltsb" event={"ID":"d5f9b907-6966-4dbe-96ec-ef5d724574c8","Type":"ContainerStarted","Data":"9d55432125a7e99b6417721e5350d7a99d54dd67cb7af317cc102a7e6c7cdab7"} Feb 02 13:15:01 crc kubenswrapper[4909]: I0202 13:15:01.923361 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-lltsb" 
podStartSLOduration=1.923337784 podStartE2EDuration="1.923337784s" podCreationTimestamp="2026-02-02 13:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-02 13:15:01.907587858 +0000 UTC m=+9827.653688593" watchObservedRunningTime="2026-02-02 13:15:01.923337784 +0000 UTC m=+9827.669438519" Feb 02 13:15:02 crc kubenswrapper[4909]: I0202 13:15:02.017071 4909 scope.go:117] "RemoveContainer" containerID="73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5" Feb 02 13:15:02 crc kubenswrapper[4909]: E0202 13:15:02.017442 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:15:02 crc kubenswrapper[4909]: I0202 13:15:02.901082 4909 generic.go:334] "Generic (PLEG): container finished" podID="d5f9b907-6966-4dbe-96ec-ef5d724574c8" containerID="e7cbb5364fc2ab9ae29f455860fa4554e687f1ec88e40750a33781b334e3876a" exitCode=0 Feb 02 13:15:02 crc kubenswrapper[4909]: I0202 13:15:02.901123 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-lltsb" event={"ID":"d5f9b907-6966-4dbe-96ec-ef5d724574c8","Type":"ContainerDied","Data":"e7cbb5364fc2ab9ae29f455860fa4554e687f1ec88e40750a33781b334e3876a"} Feb 02 13:15:04 crc kubenswrapper[4909]: I0202 13:15:04.346252 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-lltsb" Feb 02 13:15:04 crc kubenswrapper[4909]: I0202 13:15:04.507388 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5f9b907-6966-4dbe-96ec-ef5d724574c8-secret-volume\") pod \"d5f9b907-6966-4dbe-96ec-ef5d724574c8\" (UID: \"d5f9b907-6966-4dbe-96ec-ef5d724574c8\") " Feb 02 13:15:04 crc kubenswrapper[4909]: I0202 13:15:04.507471 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5pl7\" (UniqueName: \"kubernetes.io/projected/d5f9b907-6966-4dbe-96ec-ef5d724574c8-kube-api-access-v5pl7\") pod \"d5f9b907-6966-4dbe-96ec-ef5d724574c8\" (UID: \"d5f9b907-6966-4dbe-96ec-ef5d724574c8\") " Feb 02 13:15:04 crc kubenswrapper[4909]: I0202 13:15:04.507535 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5f9b907-6966-4dbe-96ec-ef5d724574c8-config-volume\") pod \"d5f9b907-6966-4dbe-96ec-ef5d724574c8\" (UID: \"d5f9b907-6966-4dbe-96ec-ef5d724574c8\") " Feb 02 13:15:04 crc kubenswrapper[4909]: I0202 13:15:04.508652 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5f9b907-6966-4dbe-96ec-ef5d724574c8-config-volume" (OuterVolumeSpecName: "config-volume") pod "d5f9b907-6966-4dbe-96ec-ef5d724574c8" (UID: "d5f9b907-6966-4dbe-96ec-ef5d724574c8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 02 13:15:04 crc kubenswrapper[4909]: I0202 13:15:04.513761 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5f9b907-6966-4dbe-96ec-ef5d724574c8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d5f9b907-6966-4dbe-96ec-ef5d724574c8" (UID: "d5f9b907-6966-4dbe-96ec-ef5d724574c8"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 02 13:15:04 crc kubenswrapper[4909]: I0202 13:15:04.514233 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5f9b907-6966-4dbe-96ec-ef5d724574c8-kube-api-access-v5pl7" (OuterVolumeSpecName: "kube-api-access-v5pl7") pod "d5f9b907-6966-4dbe-96ec-ef5d724574c8" (UID: "d5f9b907-6966-4dbe-96ec-ef5d724574c8"). InnerVolumeSpecName "kube-api-access-v5pl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:15:04 crc kubenswrapper[4909]: I0202 13:15:04.610995 4909 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d5f9b907-6966-4dbe-96ec-ef5d724574c8-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 02 13:15:04 crc kubenswrapper[4909]: I0202 13:15:04.611597 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5pl7\" (UniqueName: \"kubernetes.io/projected/d5f9b907-6966-4dbe-96ec-ef5d724574c8-kube-api-access-v5pl7\") on node \"crc\" DevicePath \"\"" Feb 02 13:15:04 crc kubenswrapper[4909]: I0202 13:15:04.611693 4909 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5f9b907-6966-4dbe-96ec-ef5d724574c8-config-volume\") on node \"crc\" DevicePath \"\"" Feb 02 13:15:04 crc kubenswrapper[4909]: I0202 13:15:04.642835 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-848d587754-hlmxx_6dd6d59f-1261-4704-a129-6361cb00de58/prometheus-operator-admission-webhook/0.log" Feb 02 13:15:04 crc kubenswrapper[4909]: I0202 13:15:04.671965 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-2fsps_c38385f8-d673-4443-91ff-2e2bb10686bf/prometheus-operator/0.log" Feb 02 13:15:04 crc kubenswrapper[4909]: I0202 13:15:04.702640 4909 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-848d587754-bgkpz_0c1c9acf-33b3-4b87-ba96-0def73b7e9f9/prometheus-operator-admission-webhook/0.log" Feb 02 13:15:04 crc kubenswrapper[4909]: I0202 13:15:04.922009 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-lltsb" event={"ID":"d5f9b907-6966-4dbe-96ec-ef5d724574c8","Type":"ContainerDied","Data":"9d55432125a7e99b6417721e5350d7a99d54dd67cb7af317cc102a7e6c7cdab7"} Feb 02 13:15:04 crc kubenswrapper[4909]: I0202 13:15:04.922254 4909 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d55432125a7e99b6417721e5350d7a99d54dd67cb7af317cc102a7e6c7cdab7" Feb 02 13:15:04 crc kubenswrapper[4909]: I0202 13:15:04.922071 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29500635-lltsb" Feb 02 13:15:04 crc kubenswrapper[4909]: I0202 13:15:04.923917 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-zqtcd_da7697a0-be27-4dac-b2bb-6af40732994e/operator/0.log" Feb 02 13:15:04 crc kubenswrapper[4909]: I0202 13:15:04.973197 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-tnrzd_2de24d60-d18d-44ad-ac44-c89d52fdd86a/perses-operator/0.log" Feb 02 13:15:05 crc kubenswrapper[4909]: I0202 13:15:05.001997 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500590-76fp5"] Feb 02 13:15:05 crc kubenswrapper[4909]: I0202 13:15:05.014769 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29500590-76fp5"] Feb 02 13:15:05 crc kubenswrapper[4909]: I0202 13:15:05.035698 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7eeeb7b6-cf1b-4786-acba-8aa056a0c195" path="/var/lib/kubelet/pods/7eeeb7b6-cf1b-4786-acba-8aa056a0c195/volumes" Feb 02 13:15:07 crc kubenswrapper[4909]: I0202 13:15:07.628153 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lgcs9" Feb 02 13:15:07 crc kubenswrapper[4909]: I0202 13:15:07.694198 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lgcs9" Feb 02 13:15:07 crc kubenswrapper[4909]: I0202 13:15:07.867714 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lgcs9"] Feb 02 13:15:08 crc kubenswrapper[4909]: I0202 13:15:08.963029 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lgcs9" podUID="6eb673dd-fa20-4c44-85b4-0717c94083a5" containerName="registry-server" containerID="cri-o://a6bb4f57737e8ccc2ed740e1995fedea02d77c17c939b5cc01f63b3c2f1c2095" gracePeriod=2 Feb 02 13:15:09 crc kubenswrapper[4909]: I0202 13:15:09.515740 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lgcs9" Feb 02 13:15:09 crc kubenswrapper[4909]: I0202 13:15:09.622962 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct5kt\" (UniqueName: \"kubernetes.io/projected/6eb673dd-fa20-4c44-85b4-0717c94083a5-kube-api-access-ct5kt\") pod \"6eb673dd-fa20-4c44-85b4-0717c94083a5\" (UID: \"6eb673dd-fa20-4c44-85b4-0717c94083a5\") " Feb 02 13:15:09 crc kubenswrapper[4909]: I0202 13:15:09.623143 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eb673dd-fa20-4c44-85b4-0717c94083a5-utilities\") pod \"6eb673dd-fa20-4c44-85b4-0717c94083a5\" (UID: \"6eb673dd-fa20-4c44-85b4-0717c94083a5\") " Feb 02 13:15:09 crc kubenswrapper[4909]: I0202 13:15:09.623212 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eb673dd-fa20-4c44-85b4-0717c94083a5-catalog-content\") pod \"6eb673dd-fa20-4c44-85b4-0717c94083a5\" (UID: \"6eb673dd-fa20-4c44-85b4-0717c94083a5\") " Feb 02 13:15:09 crc kubenswrapper[4909]: I0202 13:15:09.624198 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6eb673dd-fa20-4c44-85b4-0717c94083a5-utilities" (OuterVolumeSpecName: "utilities") pod "6eb673dd-fa20-4c44-85b4-0717c94083a5" (UID: "6eb673dd-fa20-4c44-85b4-0717c94083a5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:15:09 crc kubenswrapper[4909]: I0202 13:15:09.629235 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eb673dd-fa20-4c44-85b4-0717c94083a5-kube-api-access-ct5kt" (OuterVolumeSpecName: "kube-api-access-ct5kt") pod "6eb673dd-fa20-4c44-85b4-0717c94083a5" (UID: "6eb673dd-fa20-4c44-85b4-0717c94083a5"). InnerVolumeSpecName "kube-api-access-ct5kt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:15:09 crc kubenswrapper[4909]: I0202 13:15:09.726005 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eb673dd-fa20-4c44-85b4-0717c94083a5-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:15:09 crc kubenswrapper[4909]: I0202 13:15:09.726280 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ct5kt\" (UniqueName: \"kubernetes.io/projected/6eb673dd-fa20-4c44-85b4-0717c94083a5-kube-api-access-ct5kt\") on node \"crc\" DevicePath \"\"" Feb 02 13:15:09 crc kubenswrapper[4909]: I0202 13:15:09.729706 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6eb673dd-fa20-4c44-85b4-0717c94083a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6eb673dd-fa20-4c44-85b4-0717c94083a5" (UID: "6eb673dd-fa20-4c44-85b4-0717c94083a5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:15:09 crc kubenswrapper[4909]: I0202 13:15:09.828651 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eb673dd-fa20-4c44-85b4-0717c94083a5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:15:09 crc kubenswrapper[4909]: I0202 13:15:09.979467 4909 generic.go:334] "Generic (PLEG): container finished" podID="6eb673dd-fa20-4c44-85b4-0717c94083a5" containerID="a6bb4f57737e8ccc2ed740e1995fedea02d77c17c939b5cc01f63b3c2f1c2095" exitCode=0 Feb 02 13:15:09 crc kubenswrapper[4909]: I0202 13:15:09.979526 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lgcs9" event={"ID":"6eb673dd-fa20-4c44-85b4-0717c94083a5","Type":"ContainerDied","Data":"a6bb4f57737e8ccc2ed740e1995fedea02d77c17c939b5cc01f63b3c2f1c2095"} Feb 02 13:15:09 crc kubenswrapper[4909]: I0202 13:15:09.979552 4909 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lgcs9" Feb 02 13:15:09 crc kubenswrapper[4909]: I0202 13:15:09.979572 4909 scope.go:117] "RemoveContainer" containerID="a6bb4f57737e8ccc2ed740e1995fedea02d77c17c939b5cc01f63b3c2f1c2095" Feb 02 13:15:09 crc kubenswrapper[4909]: I0202 13:15:09.979561 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lgcs9" event={"ID":"6eb673dd-fa20-4c44-85b4-0717c94083a5","Type":"ContainerDied","Data":"f936f1837bd99648845c20ffb94e99363ac396df0f18794f101b96fa83c03228"} Feb 02 13:15:10 crc kubenswrapper[4909]: I0202 13:15:10.004641 4909 scope.go:117] "RemoveContainer" containerID="e106ed37066dc9dcf498d8eeab17f419df64f69b6e60e9533b5995558490d56d" Feb 02 13:15:10 crc kubenswrapper[4909]: I0202 13:15:10.039700 4909 scope.go:117] "RemoveContainer" containerID="239b1ce88704f6af664290795591c9e5f100786e8f63dd8e99e16b72c2c638e8" Feb 02 13:15:10 crc kubenswrapper[4909]: I0202 13:15:10.127494 4909 scope.go:117] "RemoveContainer" containerID="a6bb4f57737e8ccc2ed740e1995fedea02d77c17c939b5cc01f63b3c2f1c2095" Feb 02 13:15:10 crc kubenswrapper[4909]: E0202 13:15:10.133417 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6bb4f57737e8ccc2ed740e1995fedea02d77c17c939b5cc01f63b3c2f1c2095\": container with ID starting with a6bb4f57737e8ccc2ed740e1995fedea02d77c17c939b5cc01f63b3c2f1c2095 not found: ID does not exist" containerID="a6bb4f57737e8ccc2ed740e1995fedea02d77c17c939b5cc01f63b3c2f1c2095" Feb 02 13:15:10 crc kubenswrapper[4909]: I0202 13:15:10.133475 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6bb4f57737e8ccc2ed740e1995fedea02d77c17c939b5cc01f63b3c2f1c2095"} err="failed to get container status \"a6bb4f57737e8ccc2ed740e1995fedea02d77c17c939b5cc01f63b3c2f1c2095\": rpc error: code = NotFound desc = could not find container 
\"a6bb4f57737e8ccc2ed740e1995fedea02d77c17c939b5cc01f63b3c2f1c2095\": container with ID starting with a6bb4f57737e8ccc2ed740e1995fedea02d77c17c939b5cc01f63b3c2f1c2095 not found: ID does not exist" Feb 02 13:15:10 crc kubenswrapper[4909]: I0202 13:15:10.133532 4909 scope.go:117] "RemoveContainer" containerID="e106ed37066dc9dcf498d8eeab17f419df64f69b6e60e9533b5995558490d56d" Feb 02 13:15:10 crc kubenswrapper[4909]: I0202 13:15:10.133635 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lgcs9"] Feb 02 13:15:10 crc kubenswrapper[4909]: E0202 13:15:10.137869 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e106ed37066dc9dcf498d8eeab17f419df64f69b6e60e9533b5995558490d56d\": container with ID starting with e106ed37066dc9dcf498d8eeab17f419df64f69b6e60e9533b5995558490d56d not found: ID does not exist" containerID="e106ed37066dc9dcf498d8eeab17f419df64f69b6e60e9533b5995558490d56d" Feb 02 13:15:10 crc kubenswrapper[4909]: I0202 13:15:10.138522 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e106ed37066dc9dcf498d8eeab17f419df64f69b6e60e9533b5995558490d56d"} err="failed to get container status \"e106ed37066dc9dcf498d8eeab17f419df64f69b6e60e9533b5995558490d56d\": rpc error: code = NotFound desc = could not find container \"e106ed37066dc9dcf498d8eeab17f419df64f69b6e60e9533b5995558490d56d\": container with ID starting with e106ed37066dc9dcf498d8eeab17f419df64f69b6e60e9533b5995558490d56d not found: ID does not exist" Feb 02 13:15:10 crc kubenswrapper[4909]: I0202 13:15:10.138654 4909 scope.go:117] "RemoveContainer" containerID="239b1ce88704f6af664290795591c9e5f100786e8f63dd8e99e16b72c2c638e8" Feb 02 13:15:10 crc kubenswrapper[4909]: E0202 13:15:10.145259 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"239b1ce88704f6af664290795591c9e5f100786e8f63dd8e99e16b72c2c638e8\": container with ID starting with 239b1ce88704f6af664290795591c9e5f100786e8f63dd8e99e16b72c2c638e8 not found: ID does not exist" containerID="239b1ce88704f6af664290795591c9e5f100786e8f63dd8e99e16b72c2c638e8" Feb 02 13:15:10 crc kubenswrapper[4909]: I0202 13:15:10.145465 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"239b1ce88704f6af664290795591c9e5f100786e8f63dd8e99e16b72c2c638e8"} err="failed to get container status \"239b1ce88704f6af664290795591c9e5f100786e8f63dd8e99e16b72c2c638e8\": rpc error: code = NotFound desc = could not find container \"239b1ce88704f6af664290795591c9e5f100786e8f63dd8e99e16b72c2c638e8\": container with ID starting with 239b1ce88704f6af664290795591c9e5f100786e8f63dd8e99e16b72c2c638e8 not found: ID does not exist" Feb 02 13:15:10 crc kubenswrapper[4909]: I0202 13:15:10.169747 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lgcs9"] Feb 02 13:15:11 crc kubenswrapper[4909]: I0202 13:15:11.032391 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eb673dd-fa20-4c44-85b4-0717c94083a5" path="/var/lib/kubelet/pods/6eb673dd-fa20-4c44-85b4-0717c94083a5/volumes" Feb 02 13:15:13 crc kubenswrapper[4909]: I0202 13:15:13.016722 4909 scope.go:117] "RemoveContainer" containerID="73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5" Feb 02 13:15:13 crc kubenswrapper[4909]: E0202 13:15:13.017088 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:15:28 crc 
kubenswrapper[4909]: I0202 13:15:28.017310 4909 scope.go:117] "RemoveContainer" containerID="73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5" Feb 02 13:15:28 crc kubenswrapper[4909]: E0202 13:15:28.018079 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:15:33 crc kubenswrapper[4909]: I0202 13:15:33.253065 4909 scope.go:117] "RemoveContainer" containerID="8134a912773b43482bb43d8388b015c9559f5f59c3aadd8a86108c743d0011b5" Feb 02 13:15:40 crc kubenswrapper[4909]: I0202 13:15:40.016368 4909 scope.go:117] "RemoveContainer" containerID="73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5" Feb 02 13:15:40 crc kubenswrapper[4909]: E0202 13:15:40.017051 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:15:52 crc kubenswrapper[4909]: I0202 13:15:52.017021 4909 scope.go:117] "RemoveContainer" containerID="73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5" Feb 02 13:15:52 crc kubenswrapper[4909]: E0202 13:15:52.017800 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:16:06 crc kubenswrapper[4909]: I0202 13:16:06.017619 4909 scope.go:117] "RemoveContainer" containerID="73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5" Feb 02 13:16:06 crc kubenswrapper[4909]: E0202 13:16:06.018988 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:16:17 crc kubenswrapper[4909]: I0202 13:16:17.017234 4909 scope.go:117] "RemoveContainer" containerID="73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5" Feb 02 13:16:17 crc kubenswrapper[4909]: E0202 13:16:17.018107 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:16:30 crc kubenswrapper[4909]: I0202 13:16:30.016998 4909 scope.go:117] "RemoveContainer" containerID="73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5" Feb 02 13:16:30 crc kubenswrapper[4909]: E0202 13:16:30.017907 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:16:42 crc kubenswrapper[4909]: I0202 13:16:42.018002 4909 scope.go:117] "RemoveContainer" containerID="73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5" Feb 02 13:16:42 crc kubenswrapper[4909]: E0202 13:16:42.018722 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:16:55 crc kubenswrapper[4909]: I0202 13:16:55.030603 4909 scope.go:117] "RemoveContainer" containerID="73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5" Feb 02 13:16:55 crc kubenswrapper[4909]: E0202 13:16:55.031713 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:17:07 crc kubenswrapper[4909]: I0202 13:17:07.020908 4909 scope.go:117] "RemoveContainer" containerID="73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5" Feb 02 13:17:07 crc kubenswrapper[4909]: E0202 13:17:07.021586 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:17:20 crc kubenswrapper[4909]: I0202 13:17:20.017156 4909 scope.go:117] "RemoveContainer" containerID="73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5" Feb 02 13:17:20 crc kubenswrapper[4909]: E0202 13:17:20.018115 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:17:24 crc kubenswrapper[4909]: I0202 13:17:24.826181 4909 generic.go:334] "Generic (PLEG): container finished" podID="98d90e35-6535-4632-a1ac-594af1cae16e" containerID="35bda5a09429f91f245b82f4294fc01560eaadfd0b8321433ce89ce81b3a3706" exitCode=0 Feb 02 13:17:24 crc kubenswrapper[4909]: I0202 13:17:24.826240 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fhvg7/must-gather-r8h2t" event={"ID":"98d90e35-6535-4632-a1ac-594af1cae16e","Type":"ContainerDied","Data":"35bda5a09429f91f245b82f4294fc01560eaadfd0b8321433ce89ce81b3a3706"} Feb 02 13:17:24 crc kubenswrapper[4909]: I0202 13:17:24.827672 4909 scope.go:117] "RemoveContainer" containerID="35bda5a09429f91f245b82f4294fc01560eaadfd0b8321433ce89ce81b3a3706" Feb 02 13:17:25 crc kubenswrapper[4909]: I0202 13:17:25.789002 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fhvg7_must-gather-r8h2t_98d90e35-6535-4632-a1ac-594af1cae16e/gather/0.log" Feb 02 13:17:33 crc kubenswrapper[4909]: I0202 13:17:33.669143 4909 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fhvg7/must-gather-r8h2t"] Feb 02 13:17:33 crc kubenswrapper[4909]: I0202 13:17:33.669885 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-fhvg7/must-gather-r8h2t" podUID="98d90e35-6535-4632-a1ac-594af1cae16e" containerName="copy" containerID="cri-o://184a0998b6c032682180ea2cff00141c58c145f82b956a39952817df1216ef9b" gracePeriod=2 Feb 02 13:17:33 crc kubenswrapper[4909]: I0202 13:17:33.683659 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fhvg7/must-gather-r8h2t"] Feb 02 13:17:33 crc kubenswrapper[4909]: I0202 13:17:33.929098 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fhvg7_must-gather-r8h2t_98d90e35-6535-4632-a1ac-594af1cae16e/copy/0.log" Feb 02 13:17:33 crc kubenswrapper[4909]: I0202 13:17:33.929713 4909 generic.go:334] "Generic (PLEG): container finished" podID="98d90e35-6535-4632-a1ac-594af1cae16e" containerID="184a0998b6c032682180ea2cff00141c58c145f82b956a39952817df1216ef9b" exitCode=143 Feb 02 13:17:34 crc kubenswrapper[4909]: I0202 13:17:34.134566 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fhvg7_must-gather-r8h2t_98d90e35-6535-4632-a1ac-594af1cae16e/copy/0.log" Feb 02 13:17:34 crc kubenswrapper[4909]: I0202 13:17:34.134926 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fhvg7/must-gather-r8h2t" Feb 02 13:17:34 crc kubenswrapper[4909]: I0202 13:17:34.317326 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/98d90e35-6535-4632-a1ac-594af1cae16e-must-gather-output\") pod \"98d90e35-6535-4632-a1ac-594af1cae16e\" (UID: \"98d90e35-6535-4632-a1ac-594af1cae16e\") " Feb 02 13:17:34 crc kubenswrapper[4909]: I0202 13:17:34.317534 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqvfv\" (UniqueName: \"kubernetes.io/projected/98d90e35-6535-4632-a1ac-594af1cae16e-kube-api-access-cqvfv\") pod \"98d90e35-6535-4632-a1ac-594af1cae16e\" (UID: \"98d90e35-6535-4632-a1ac-594af1cae16e\") " Feb 02 13:17:34 crc kubenswrapper[4909]: I0202 13:17:34.326281 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98d90e35-6535-4632-a1ac-594af1cae16e-kube-api-access-cqvfv" (OuterVolumeSpecName: "kube-api-access-cqvfv") pod "98d90e35-6535-4632-a1ac-594af1cae16e" (UID: "98d90e35-6535-4632-a1ac-594af1cae16e"). InnerVolumeSpecName "kube-api-access-cqvfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:17:34 crc kubenswrapper[4909]: I0202 13:17:34.420135 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqvfv\" (UniqueName: \"kubernetes.io/projected/98d90e35-6535-4632-a1ac-594af1cae16e-kube-api-access-cqvfv\") on node \"crc\" DevicePath \"\"" Feb 02 13:17:34 crc kubenswrapper[4909]: I0202 13:17:34.506168 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98d90e35-6535-4632-a1ac-594af1cae16e-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "98d90e35-6535-4632-a1ac-594af1cae16e" (UID: "98d90e35-6535-4632-a1ac-594af1cae16e"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:17:34 crc kubenswrapper[4909]: I0202 13:17:34.523282 4909 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/98d90e35-6535-4632-a1ac-594af1cae16e-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 02 13:17:34 crc kubenswrapper[4909]: I0202 13:17:34.957222 4909 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fhvg7_must-gather-r8h2t_98d90e35-6535-4632-a1ac-594af1cae16e/copy/0.log" Feb 02 13:17:34 crc kubenswrapper[4909]: I0202 13:17:34.957686 4909 scope.go:117] "RemoveContainer" containerID="184a0998b6c032682180ea2cff00141c58c145f82b956a39952817df1216ef9b" Feb 02 13:17:34 crc kubenswrapper[4909]: I0202 13:17:34.957862 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fhvg7/must-gather-r8h2t" Feb 02 13:17:34 crc kubenswrapper[4909]: I0202 13:17:34.992396 4909 scope.go:117] "RemoveContainer" containerID="35bda5a09429f91f245b82f4294fc01560eaadfd0b8321433ce89ce81b3a3706" Feb 02 13:17:35 crc kubenswrapper[4909]: I0202 13:17:35.103158 4909 scope.go:117] "RemoveContainer" containerID="73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5" Feb 02 13:17:35 crc kubenswrapper[4909]: E0202 13:17:35.104030 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:17:35 crc kubenswrapper[4909]: I0202 13:17:35.104038 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98d90e35-6535-4632-a1ac-594af1cae16e" 
path="/var/lib/kubelet/pods/98d90e35-6535-4632-a1ac-594af1cae16e/volumes" Feb 02 13:17:49 crc kubenswrapper[4909]: I0202 13:17:49.017336 4909 scope.go:117] "RemoveContainer" containerID="73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5" Feb 02 13:17:49 crc kubenswrapper[4909]: E0202 13:17:49.019051 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:18:02 crc kubenswrapper[4909]: I0202 13:18:02.016709 4909 scope.go:117] "RemoveContainer" containerID="73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5" Feb 02 13:18:02 crc kubenswrapper[4909]: E0202 13:18:02.017627 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:18:14 crc kubenswrapper[4909]: I0202 13:18:14.017361 4909 scope.go:117] "RemoveContainer" containerID="73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5" Feb 02 13:18:14 crc kubenswrapper[4909]: E0202 13:18:14.018469 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:18:26 crc kubenswrapper[4909]: I0202 13:18:26.016245 4909 scope.go:117] "RemoveContainer" containerID="73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5" Feb 02 13:18:26 crc kubenswrapper[4909]: E0202 13:18:26.017148 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:18:37 crc kubenswrapper[4909]: I0202 13:18:37.017668 4909 scope.go:117] "RemoveContainer" containerID="73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5" Feb 02 13:18:37 crc kubenswrapper[4909]: E0202 13:18:37.023541 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:18:50 crc kubenswrapper[4909]: I0202 13:18:50.016353 4909 scope.go:117] "RemoveContainer" containerID="73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5" Feb 02 13:18:50 crc kubenswrapper[4909]: E0202 13:18:50.017189 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:19:02 crc kubenswrapper[4909]: I0202 13:19:02.017091 4909 scope.go:117] "RemoveContainer" containerID="73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5" Feb 02 13:19:02 crc kubenswrapper[4909]: E0202 13:19:02.017988 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:19:15 crc kubenswrapper[4909]: I0202 13:19:15.027789 4909 scope.go:117] "RemoveContainer" containerID="73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5" Feb 02 13:19:15 crc kubenswrapper[4909]: E0202 13:19:15.028822 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:19:29 crc kubenswrapper[4909]: I0202 13:19:29.016918 4909 scope.go:117] "RemoveContainer" containerID="73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5" Feb 02 13:19:29 crc kubenswrapper[4909]: E0202 13:19:29.017730 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:19:42 crc kubenswrapper[4909]: I0202 13:19:42.018523 4909 scope.go:117] "RemoveContainer" containerID="73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5" Feb 02 13:19:42 crc kubenswrapper[4909]: E0202 13:19:42.019634 4909 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ftn2z_openshift-machine-config-operator(7de68b6c-f308-498c-95a3-27c9caf44f4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" podUID="7de68b6c-f308-498c-95a3-27c9caf44f4f" Feb 02 13:19:51 crc kubenswrapper[4909]: I0202 13:19:51.259635 4909 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-64rwh"] Feb 02 13:19:51 crc kubenswrapper[4909]: E0202 13:19:51.260719 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98d90e35-6535-4632-a1ac-594af1cae16e" containerName="gather" Feb 02 13:19:51 crc kubenswrapper[4909]: I0202 13:19:51.260736 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="98d90e35-6535-4632-a1ac-594af1cae16e" containerName="gather" Feb 02 13:19:51 crc kubenswrapper[4909]: E0202 13:19:51.260766 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eb673dd-fa20-4c44-85b4-0717c94083a5" containerName="registry-server" Feb 02 13:19:51 crc kubenswrapper[4909]: I0202 13:19:51.260773 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eb673dd-fa20-4c44-85b4-0717c94083a5" containerName="registry-server" Feb 02 13:19:51 crc kubenswrapper[4909]: E0202 13:19:51.260793 4909 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="98d90e35-6535-4632-a1ac-594af1cae16e" containerName="copy" Feb 02 13:19:51 crc kubenswrapper[4909]: I0202 13:19:51.260801 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="98d90e35-6535-4632-a1ac-594af1cae16e" containerName="copy" Feb 02 13:19:51 crc kubenswrapper[4909]: E0202 13:19:51.260838 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eb673dd-fa20-4c44-85b4-0717c94083a5" containerName="extract-utilities" Feb 02 13:19:51 crc kubenswrapper[4909]: I0202 13:19:51.260846 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eb673dd-fa20-4c44-85b4-0717c94083a5" containerName="extract-utilities" Feb 02 13:19:51 crc kubenswrapper[4909]: E0202 13:19:51.260862 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5f9b907-6966-4dbe-96ec-ef5d724574c8" containerName="collect-profiles" Feb 02 13:19:51 crc kubenswrapper[4909]: I0202 13:19:51.260871 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f9b907-6966-4dbe-96ec-ef5d724574c8" containerName="collect-profiles" Feb 02 13:19:51 crc kubenswrapper[4909]: E0202 13:19:51.260880 4909 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eb673dd-fa20-4c44-85b4-0717c94083a5" containerName="extract-content" Feb 02 13:19:51 crc kubenswrapper[4909]: I0202 13:19:51.260887 4909 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eb673dd-fa20-4c44-85b4-0717c94083a5" containerName="extract-content" Feb 02 13:19:51 crc kubenswrapper[4909]: I0202 13:19:51.261148 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eb673dd-fa20-4c44-85b4-0717c94083a5" containerName="registry-server" Feb 02 13:19:51 crc kubenswrapper[4909]: I0202 13:19:51.261179 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5f9b907-6966-4dbe-96ec-ef5d724574c8" containerName="collect-profiles" Feb 02 13:19:51 crc kubenswrapper[4909]: I0202 13:19:51.261188 4909 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="98d90e35-6535-4632-a1ac-594af1cae16e" containerName="gather" Feb 02 13:19:51 crc kubenswrapper[4909]: I0202 13:19:51.261211 4909 memory_manager.go:354] "RemoveStaleState removing state" podUID="98d90e35-6535-4632-a1ac-594af1cae16e" containerName="copy" Feb 02 13:19:51 crc kubenswrapper[4909]: I0202 13:19:51.262988 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64rwh" Feb 02 13:19:51 crc kubenswrapper[4909]: I0202 13:19:51.271304 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-64rwh"] Feb 02 13:19:51 crc kubenswrapper[4909]: I0202 13:19:51.417480 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92cf67cb-f6a0-41a3-919c-f50d95aac50f-utilities\") pod \"redhat-marketplace-64rwh\" (UID: \"92cf67cb-f6a0-41a3-919c-f50d95aac50f\") " pod="openshift-marketplace/redhat-marketplace-64rwh" Feb 02 13:19:51 crc kubenswrapper[4909]: I0202 13:19:51.417651 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86wqj\" (UniqueName: \"kubernetes.io/projected/92cf67cb-f6a0-41a3-919c-f50d95aac50f-kube-api-access-86wqj\") pod \"redhat-marketplace-64rwh\" (UID: \"92cf67cb-f6a0-41a3-919c-f50d95aac50f\") " pod="openshift-marketplace/redhat-marketplace-64rwh" Feb 02 13:19:51 crc kubenswrapper[4909]: I0202 13:19:51.417679 4909 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92cf67cb-f6a0-41a3-919c-f50d95aac50f-catalog-content\") pod \"redhat-marketplace-64rwh\" (UID: \"92cf67cb-f6a0-41a3-919c-f50d95aac50f\") " pod="openshift-marketplace/redhat-marketplace-64rwh" Feb 02 13:19:51 crc kubenswrapper[4909]: I0202 13:19:51.519841 4909 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92cf67cb-f6a0-41a3-919c-f50d95aac50f-utilities\") pod \"redhat-marketplace-64rwh\" (UID: \"92cf67cb-f6a0-41a3-919c-f50d95aac50f\") " pod="openshift-marketplace/redhat-marketplace-64rwh" Feb 02 13:19:51 crc kubenswrapper[4909]: I0202 13:19:51.519970 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86wqj\" (UniqueName: \"kubernetes.io/projected/92cf67cb-f6a0-41a3-919c-f50d95aac50f-kube-api-access-86wqj\") pod \"redhat-marketplace-64rwh\" (UID: \"92cf67cb-f6a0-41a3-919c-f50d95aac50f\") " pod="openshift-marketplace/redhat-marketplace-64rwh" Feb 02 13:19:51 crc kubenswrapper[4909]: I0202 13:19:51.520002 4909 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92cf67cb-f6a0-41a3-919c-f50d95aac50f-catalog-content\") pod \"redhat-marketplace-64rwh\" (UID: \"92cf67cb-f6a0-41a3-919c-f50d95aac50f\") " pod="openshift-marketplace/redhat-marketplace-64rwh" Feb 02 13:19:51 crc kubenswrapper[4909]: I0202 13:19:51.520576 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92cf67cb-f6a0-41a3-919c-f50d95aac50f-catalog-content\") pod \"redhat-marketplace-64rwh\" (UID: \"92cf67cb-f6a0-41a3-919c-f50d95aac50f\") " pod="openshift-marketplace/redhat-marketplace-64rwh" Feb 02 13:19:51 crc kubenswrapper[4909]: I0202 13:19:51.520578 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92cf67cb-f6a0-41a3-919c-f50d95aac50f-utilities\") pod \"redhat-marketplace-64rwh\" (UID: \"92cf67cb-f6a0-41a3-919c-f50d95aac50f\") " pod="openshift-marketplace/redhat-marketplace-64rwh" Feb 02 13:19:51 crc kubenswrapper[4909]: I0202 13:19:51.540332 4909 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86wqj\" (UniqueName: 
\"kubernetes.io/projected/92cf67cb-f6a0-41a3-919c-f50d95aac50f-kube-api-access-86wqj\") pod \"redhat-marketplace-64rwh\" (UID: \"92cf67cb-f6a0-41a3-919c-f50d95aac50f\") " pod="openshift-marketplace/redhat-marketplace-64rwh" Feb 02 13:19:51 crc kubenswrapper[4909]: I0202 13:19:51.582243 4909 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64rwh" Feb 02 13:19:52 crc kubenswrapper[4909]: I0202 13:19:52.125316 4909 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-64rwh"] Feb 02 13:19:52 crc kubenswrapper[4909]: I0202 13:19:52.454069 4909 generic.go:334] "Generic (PLEG): container finished" podID="92cf67cb-f6a0-41a3-919c-f50d95aac50f" containerID="c01d8b11784558eb398dcd5438f0a3601cfb84d97f692f0b0b20dca27260bb78" exitCode=0 Feb 02 13:19:52 crc kubenswrapper[4909]: I0202 13:19:52.454123 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64rwh" event={"ID":"92cf67cb-f6a0-41a3-919c-f50d95aac50f","Type":"ContainerDied","Data":"c01d8b11784558eb398dcd5438f0a3601cfb84d97f692f0b0b20dca27260bb78"} Feb 02 13:19:52 crc kubenswrapper[4909]: I0202 13:19:52.454193 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64rwh" event={"ID":"92cf67cb-f6a0-41a3-919c-f50d95aac50f","Type":"ContainerStarted","Data":"c34e39e2ce5a2c10ebf01689f98d8b4608ec8a0de8a6f09fcb21d28023ea6f4b"} Feb 02 13:19:52 crc kubenswrapper[4909]: I0202 13:19:52.456513 4909 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 02 13:19:53 crc kubenswrapper[4909]: I0202 13:19:53.465299 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64rwh" event={"ID":"92cf67cb-f6a0-41a3-919c-f50d95aac50f","Type":"ContainerStarted","Data":"8452d0b8ac5ae87f8ca5bb924055ee87c77973cfd95731d612d76994c39bad28"} Feb 02 13:19:54 crc 
kubenswrapper[4909]: I0202 13:19:54.474781 4909 generic.go:334] "Generic (PLEG): container finished" podID="92cf67cb-f6a0-41a3-919c-f50d95aac50f" containerID="8452d0b8ac5ae87f8ca5bb924055ee87c77973cfd95731d612d76994c39bad28" exitCode=0 Feb 02 13:19:54 crc kubenswrapper[4909]: I0202 13:19:54.475228 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64rwh" event={"ID":"92cf67cb-f6a0-41a3-919c-f50d95aac50f","Type":"ContainerDied","Data":"8452d0b8ac5ae87f8ca5bb924055ee87c77973cfd95731d612d76994c39bad28"} Feb 02 13:19:56 crc kubenswrapper[4909]: I0202 13:19:56.499084 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64rwh" event={"ID":"92cf67cb-f6a0-41a3-919c-f50d95aac50f","Type":"ContainerStarted","Data":"e7806bf40d0c19454ba0978368a7341b55cb91e7681add46ce2fed904640f758"} Feb 02 13:19:56 crc kubenswrapper[4909]: I0202 13:19:56.526804 4909 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-64rwh" podStartSLOduration=2.451630538 podStartE2EDuration="5.526785998s" podCreationTimestamp="2026-02-02 13:19:51 +0000 UTC" firstStartedPulling="2026-02-02 13:19:52.456266676 +0000 UTC m=+10118.202367411" lastFinishedPulling="2026-02-02 13:19:55.531422136 +0000 UTC m=+10121.277522871" observedRunningTime="2026-02-02 13:19:56.520311775 +0000 UTC m=+10122.266412510" watchObservedRunningTime="2026-02-02 13:19:56.526785998 +0000 UTC m=+10122.272886733" Feb 02 13:19:57 crc kubenswrapper[4909]: I0202 13:19:57.017423 4909 scope.go:117] "RemoveContainer" containerID="73b9ddfc2c379fb6c4aaf96a74a63fbe265d4e4d9e513b3516f33a7dcf4b67f5" Feb 02 13:19:57 crc kubenswrapper[4909]: I0202 13:19:57.511514 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ftn2z" 
event={"ID":"7de68b6c-f308-498c-95a3-27c9caf44f4f","Type":"ContainerStarted","Data":"10d22cd105794a5543e79a5a4547c97327bf6dd06fffdd5592f581b0774c4a47"} Feb 02 13:20:01 crc kubenswrapper[4909]: I0202 13:20:01.582457 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-64rwh" Feb 02 13:20:01 crc kubenswrapper[4909]: I0202 13:20:01.582981 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-64rwh" Feb 02 13:20:01 crc kubenswrapper[4909]: I0202 13:20:01.630778 4909 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-64rwh" Feb 02 13:20:02 crc kubenswrapper[4909]: I0202 13:20:02.605914 4909 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-64rwh" Feb 02 13:20:02 crc kubenswrapper[4909]: I0202 13:20:02.668188 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-64rwh"] Feb 02 13:20:04 crc kubenswrapper[4909]: I0202 13:20:04.570058 4909 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-64rwh" podUID="92cf67cb-f6a0-41a3-919c-f50d95aac50f" containerName="registry-server" containerID="cri-o://e7806bf40d0c19454ba0978368a7341b55cb91e7681add46ce2fed904640f758" gracePeriod=2 Feb 02 13:20:05 crc kubenswrapper[4909]: I0202 13:20:05.099297 4909 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64rwh" Feb 02 13:20:05 crc kubenswrapper[4909]: I0202 13:20:05.263552 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86wqj\" (UniqueName: \"kubernetes.io/projected/92cf67cb-f6a0-41a3-919c-f50d95aac50f-kube-api-access-86wqj\") pod \"92cf67cb-f6a0-41a3-919c-f50d95aac50f\" (UID: \"92cf67cb-f6a0-41a3-919c-f50d95aac50f\") " Feb 02 13:20:05 crc kubenswrapper[4909]: I0202 13:20:05.263773 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92cf67cb-f6a0-41a3-919c-f50d95aac50f-catalog-content\") pod \"92cf67cb-f6a0-41a3-919c-f50d95aac50f\" (UID: \"92cf67cb-f6a0-41a3-919c-f50d95aac50f\") " Feb 02 13:20:05 crc kubenswrapper[4909]: I0202 13:20:05.263857 4909 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92cf67cb-f6a0-41a3-919c-f50d95aac50f-utilities\") pod \"92cf67cb-f6a0-41a3-919c-f50d95aac50f\" (UID: \"92cf67cb-f6a0-41a3-919c-f50d95aac50f\") " Feb 02 13:20:05 crc kubenswrapper[4909]: I0202 13:20:05.264942 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92cf67cb-f6a0-41a3-919c-f50d95aac50f-utilities" (OuterVolumeSpecName: "utilities") pod "92cf67cb-f6a0-41a3-919c-f50d95aac50f" (UID: "92cf67cb-f6a0-41a3-919c-f50d95aac50f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:20:05 crc kubenswrapper[4909]: I0202 13:20:05.270305 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92cf67cb-f6a0-41a3-919c-f50d95aac50f-kube-api-access-86wqj" (OuterVolumeSpecName: "kube-api-access-86wqj") pod "92cf67cb-f6a0-41a3-919c-f50d95aac50f" (UID: "92cf67cb-f6a0-41a3-919c-f50d95aac50f"). InnerVolumeSpecName "kube-api-access-86wqj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 02 13:20:05 crc kubenswrapper[4909]: I0202 13:20:05.290105 4909 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92cf67cb-f6a0-41a3-919c-f50d95aac50f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92cf67cb-f6a0-41a3-919c-f50d95aac50f" (UID: "92cf67cb-f6a0-41a3-919c-f50d95aac50f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 02 13:20:05 crc kubenswrapper[4909]: I0202 13:20:05.366942 4909 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86wqj\" (UniqueName: \"kubernetes.io/projected/92cf67cb-f6a0-41a3-919c-f50d95aac50f-kube-api-access-86wqj\") on node \"crc\" DevicePath \"\"" Feb 02 13:20:05 crc kubenswrapper[4909]: I0202 13:20:05.366979 4909 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92cf67cb-f6a0-41a3-919c-f50d95aac50f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 02 13:20:05 crc kubenswrapper[4909]: I0202 13:20:05.366993 4909 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92cf67cb-f6a0-41a3-919c-f50d95aac50f-utilities\") on node \"crc\" DevicePath \"\"" Feb 02 13:20:05 crc kubenswrapper[4909]: I0202 13:20:05.582394 4909 generic.go:334] "Generic (PLEG): container finished" podID="92cf67cb-f6a0-41a3-919c-f50d95aac50f" containerID="e7806bf40d0c19454ba0978368a7341b55cb91e7681add46ce2fed904640f758" exitCode=0 Feb 02 13:20:05 crc kubenswrapper[4909]: I0202 13:20:05.582462 4909 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64rwh" event={"ID":"92cf67cb-f6a0-41a3-919c-f50d95aac50f","Type":"ContainerDied","Data":"e7806bf40d0c19454ba0978368a7341b55cb91e7681add46ce2fed904640f758"} Feb 02 13:20:05 crc kubenswrapper[4909]: I0202 13:20:05.582510 4909 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-64rwh" event={"ID":"92cf67cb-f6a0-41a3-919c-f50d95aac50f","Type":"ContainerDied","Data":"c34e39e2ce5a2c10ebf01689f98d8b4608ec8a0de8a6f09fcb21d28023ea6f4b"} Feb 02 13:20:05 crc kubenswrapper[4909]: I0202 13:20:05.582534 4909 scope.go:117] "RemoveContainer" containerID="e7806bf40d0c19454ba0978368a7341b55cb91e7681add46ce2fed904640f758" Feb 02 13:20:05 crc kubenswrapper[4909]: I0202 13:20:05.582561 4909 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64rwh" Feb 02 13:20:05 crc kubenswrapper[4909]: I0202 13:20:05.609254 4909 scope.go:117] "RemoveContainer" containerID="8452d0b8ac5ae87f8ca5bb924055ee87c77973cfd95731d612d76994c39bad28" Feb 02 13:20:05 crc kubenswrapper[4909]: I0202 13:20:05.631665 4909 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-64rwh"] Feb 02 13:20:05 crc kubenswrapper[4909]: I0202 13:20:05.644339 4909 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-64rwh"] Feb 02 13:20:05 crc kubenswrapper[4909]: I0202 13:20:05.653997 4909 scope.go:117] "RemoveContainer" containerID="c01d8b11784558eb398dcd5438f0a3601cfb84d97f692f0b0b20dca27260bb78" Feb 02 13:20:05 crc kubenswrapper[4909]: I0202 13:20:05.701903 4909 scope.go:117] "RemoveContainer" containerID="e7806bf40d0c19454ba0978368a7341b55cb91e7681add46ce2fed904640f758" Feb 02 13:20:05 crc kubenswrapper[4909]: E0202 13:20:05.702570 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7806bf40d0c19454ba0978368a7341b55cb91e7681add46ce2fed904640f758\": container with ID starting with e7806bf40d0c19454ba0978368a7341b55cb91e7681add46ce2fed904640f758 not found: ID does not exist" containerID="e7806bf40d0c19454ba0978368a7341b55cb91e7681add46ce2fed904640f758" Feb 02 13:20:05 crc kubenswrapper[4909]: I0202 13:20:05.702633 4909 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7806bf40d0c19454ba0978368a7341b55cb91e7681add46ce2fed904640f758"} err="failed to get container status \"e7806bf40d0c19454ba0978368a7341b55cb91e7681add46ce2fed904640f758\": rpc error: code = NotFound desc = could not find container \"e7806bf40d0c19454ba0978368a7341b55cb91e7681add46ce2fed904640f758\": container with ID starting with e7806bf40d0c19454ba0978368a7341b55cb91e7681add46ce2fed904640f758 not found: ID does not exist" Feb 02 13:20:05 crc kubenswrapper[4909]: I0202 13:20:05.702662 4909 scope.go:117] "RemoveContainer" containerID="8452d0b8ac5ae87f8ca5bb924055ee87c77973cfd95731d612d76994c39bad28" Feb 02 13:20:05 crc kubenswrapper[4909]: E0202 13:20:05.703233 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8452d0b8ac5ae87f8ca5bb924055ee87c77973cfd95731d612d76994c39bad28\": container with ID starting with 8452d0b8ac5ae87f8ca5bb924055ee87c77973cfd95731d612d76994c39bad28 not found: ID does not exist" containerID="8452d0b8ac5ae87f8ca5bb924055ee87c77973cfd95731d612d76994c39bad28" Feb 02 13:20:05 crc kubenswrapper[4909]: I0202 13:20:05.703271 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8452d0b8ac5ae87f8ca5bb924055ee87c77973cfd95731d612d76994c39bad28"} err="failed to get container status \"8452d0b8ac5ae87f8ca5bb924055ee87c77973cfd95731d612d76994c39bad28\": rpc error: code = NotFound desc = could not find container \"8452d0b8ac5ae87f8ca5bb924055ee87c77973cfd95731d612d76994c39bad28\": container with ID starting with 8452d0b8ac5ae87f8ca5bb924055ee87c77973cfd95731d612d76994c39bad28 not found: ID does not exist" Feb 02 13:20:05 crc kubenswrapper[4909]: I0202 13:20:05.703284 4909 scope.go:117] "RemoveContainer" containerID="c01d8b11784558eb398dcd5438f0a3601cfb84d97f692f0b0b20dca27260bb78" Feb 02 13:20:05 crc kubenswrapper[4909]: E0202 
13:20:05.703665 4909 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c01d8b11784558eb398dcd5438f0a3601cfb84d97f692f0b0b20dca27260bb78\": container with ID starting with c01d8b11784558eb398dcd5438f0a3601cfb84d97f692f0b0b20dca27260bb78 not found: ID does not exist" containerID="c01d8b11784558eb398dcd5438f0a3601cfb84d97f692f0b0b20dca27260bb78" Feb 02 13:20:05 crc kubenswrapper[4909]: I0202 13:20:05.703706 4909 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c01d8b11784558eb398dcd5438f0a3601cfb84d97f692f0b0b20dca27260bb78"} err="failed to get container status \"c01d8b11784558eb398dcd5438f0a3601cfb84d97f692f0b0b20dca27260bb78\": rpc error: code = NotFound desc = could not find container \"c01d8b11784558eb398dcd5438f0a3601cfb84d97f692f0b0b20dca27260bb78\": container with ID starting with c01d8b11784558eb398dcd5438f0a3601cfb84d97f692f0b0b20dca27260bb78 not found: ID does not exist" Feb 02 13:20:07 crc kubenswrapper[4909]: I0202 13:20:07.032565 4909 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92cf67cb-f6a0-41a3-919c-f50d95aac50f" path="/var/lib/kubelet/pods/92cf67cb-f6a0-41a3-919c-f50d95aac50f/volumes"